Embracing Human Confusion: The Hidden Power of AI in Operational Environments
In the fast-paced world of live-event operations, clarity and precision often take a backseat to the chaos of human emotion and confusion. As a product manager working in live-event operations technology, I’ve seen firsthand how support tickets rarely arrive as straightforward bug reports. Instead, they come as vague, emotionally loaded messages like, “Hey, I think something’s off with the order data for last night. Can you check what happened?” No order ID, no suite number—just a trail of uncertainty that requires digging through multiple tools and spreadsheets for answers.
This confusion is not just a nuisance; it represents a significant opportunity for innovation, particularly with the advent of Generative AI (GenAI). Last month, I embarked on a journey to create a GenAI prototype aimed at answering these messy questions in 30 seconds instead of the usual 30 minutes. What I discovered was not just the capability of AI to provide insights, but the profound importance of embracing human confusion rather than fighting it.
The Motivation Behind a “Useless” Prototype
Operational teams at large venues juggle massive amounts of data—point of sale (POS), orders, refunds—during events. Extracting insights from this data traditionally involves a laborious process: downloading multiple Excel reports, manually cross-referencing order IDs, and constructing pivot tables, often with little certainty of finding the root cause of issues. My goal was to replace this friction with something fast, conversational, and helpful. I envisioned a system where users could type natural questions like “How many refunds happened in suites last night?” and receive immediate, actionable answers presented as charts, structured tables, and suggested next steps.
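To make that interaction concrete, here is a minimal sketch (not the prototype itself) of how one such question could be answered with pandas once a report has been uploaded. The file name and every column name below are illustrative assumptions, not the real report schema.

```python
# Minimal sketch, assuming an illustrative orders export: answer
# "How many refunds happened in suites last night?" from an uploaded file.
# Column names (order_type, status, created_at, suite_number, amount) are
# placeholders, not the actual schema.
import pandas as pd

def refunds_in_suites(path: str, event_date: str) -> pd.DataFrame:
    """Summarize suite refunds for one event date from an orders export."""
    orders = pd.read_excel(path)  # the drag-and-drop upload arrives as a file

    mask = (
        (orders["order_type"] == "suite")
        & (orders["status"] == "refunded")
        & (pd.to_datetime(orders["created_at"]).dt.date
           == pd.to_datetime(event_date).date())
    )
    refunds = orders[mask]

    # Return a small structured answer: count and refunded total per suite.
    return (
        refunds.groupby("suite_number")["amount"]
        .agg(refund_count="count", refunded_total="sum")
        .reset_index()
    )

# Example usage (hypothetical file): refunds_in_suites("orders_export.xlsx", "2024-05-18")
```

The point of the sketch is the shape of the answer: a small, structured table the assistant can present directly, rather than a raw export the user has to pivot by hand.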
The aim wasn’t to showcase AI prowess but to reduce the mental load on operations teams who are constantly multitasking and reacting in real-time.
Strategic Constraints: A Blessing in Disguise
This was not a well-funded AI moonshot. I had two weeks and no budget, which led to three strategic decisions that unexpectedly enhanced the prototype:
1. Excel Uploads Over Database Integration: Starting with drag-and-drop file uploads allowed for speed and flexibility. This constraint forced me to focus on the core interaction: transforming messy human questions into clear AI insights.
2. Keywords Before LLM Magic: By using pattern matching for phrases like “refunds” and “payment failures,” I could validate user intent without incurring high API costs (see the sketch after this list).
3. Five Questions Over Fifty Features: I concentrated on the top five questions frequently asked by ops teams. Solving these few problems perfectly proved more effective than addressing fifty issues poorly.
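Here is a minimal sketch of that keyword-first approach, assuming two illustrative intents; the intent names and phrase patterns are placeholders rather than the prototype’s actual configuration.

```python
# Minimal sketch of keyword-based intent matching. Intent names and patterns
# are illustrative assumptions, not the prototype's real configuration.
import re

INTENT_PATTERNS = {
    "refund_summary": re.compile(r"\brefund", re.IGNORECASE),
    "payment_failures": re.compile(r"payment\s+fail|declin", re.IGNORECASE),
}

def match_intent(question: str) -> str | None:
    """Return the first intent whose keyword pattern appears in the question."""
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(question):
            return intent
    return None  # no match: ask a clarifying question instead of calling an LLM

# Example: match_intent("How many refunds happened in suites last night?") -> "refund_summary"
```

The value is less the matching itself than the gate it creates: only questions the system genuinely understands trigger downstream work, and everything else gets a cheap clarifying prompt.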
Building for Evolution: A Modular Approach
I designed the system using six modular blocks, each with a clear upgrade path:
– Excel Upload → Real-time API connections
– Data Validation → Auto-healing for common errors
– Vector Embeddings → Advanced semantic search
– Query Processing → Full LLM reasoning
– Response Generation → Multi-modal outputs
– Chat Interface → Workflow automation
This modular architecture allowed for simplicity in the beginning, with the ability to layer in sophistication over time. Each block could be independently upgraded, avoiding the need to rebuild the entire system.
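As a rough illustration of that upgrade-path idea (not the production design), each block can be expressed as a small interface with a simple default behind it, so a more capable version can be swapped in later without touching the rest. All class and method names below are assumptions.

```python
# Rough sketch of the modular idea: the assistant depends only on small
# interfaces, so any block can be upgraded independently. Names are illustrative.
from typing import Protocol

class DataSource(Protocol):
    def load(self) -> list[dict]: ...          # Excel upload today, real-time API later

class QueryProcessor(Protocol):
    def interpret(self, question: str) -> dict: ...   # keyword matching today, full LLM reasoning later

class ResponseGenerator(Protocol):
    def render(self, intent: dict, rows: list[dict]) -> str: ...  # plain text today, multi-modal later

class Assistant:
    """Composes the blocks; upgrading one block does not require a rebuild."""
    def __init__(self, source: DataSource, processor: QueryProcessor, generator: ResponseGenerator):
        self.source = source
        self.processor = processor
        self.generator = generator

    def answer(self, question: str) -> str:
        rows = self.source.load()
        intent = self.processor.interpret(question)
        return self.generator.render(intent, rows)
```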
The Breakthrough Moment
The true value of the prototype became apparent during a conversation with a concession vendor who expressed a simple need: “I just want to know what I’m going to run out of tonight before I do.” This was the moment I realized I was solving the right problem. Traditional dashboards are inadequate for forecasting, static reports are too slow, and manual analysis is too time-consuming during live events. A GenAI assistant could transform reactive firefighting into proactive management by surfacing insights like low-stock trends or missed reorder windows in real time.
Lessons Learned: Beyond the Prototype
The real value of this exercise wasn’t the prototype itself but the lessons learned:
Technical Lessons
1. Define Precise Requirements: Vague specifications lead to rework. Documenting the exact Excel structure would have saved significant debugging time.
2. Test Incrementally: Add one capability at a time and test thoroughly to avoid compounding bugs.
3. Build Feedback Loops Early: Incorporate user ratings and query success tracking from the start (a minimal sketch follows this list).
4. AI Code Generation Needs Oversight: Generated code is difficult to audit; maintain step-by-step logs for clarity.
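One hedged way to combine lessons 3 and 4 is to append a structured record for every query, capturing the steps the system took and the rating the user gave. The file name and fields below are illustrative, not what the prototype actually logged.

```python
# Minimal sketch of a feedback loop plus step-by-step audit trail.
# File name and field names are illustrative assumptions.
import json
import time

LOG_PATH = "query_log.jsonl"

def log_query(question: str, intent: str, answer: str, steps: list[str], rating: int | None = None) -> None:
    """Append one record per query: what was asked, what the system did, how it was rated."""
    record = {
        "timestamp": time.time(),
        "question": question,
        "intent": intent,
        "steps": steps,      # e.g. ["parsed upload", "matched intent", "filtered 42 rows"]
        "answer": answer,
        "rating": rating,    # simple 1-5 rating captured in the chat UI
    }
    with open(LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```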
Product Lessons
1. Categorize Performance Expectations: Design systems to handle various response times based on query complexity.
2. Start Where Users Are Confused: Focus on real user pain points rather than perfect data structures.
3. Conversational Interfaces Change Behavior: Even a basic chat UI encourages users to ask follow-up questions, highlighting the need for context handling.
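For illustration, context handling can be as simple as carrying the running transcript into each new request, so a follow-up like “what about last weekend?” resolves against the earlier question. The sketch below assumes an OpenAI-style chat-completions client and model name, which is an assumption rather than the prototype’s actual stack.

```python
# Minimal sketch of conversational context handling, assuming an OpenAI-style
# chat-completions client. Model name and client are illustrative assumptions.
history: list[dict] = [
    {"role": "system", "content": "You answer operations questions about event order data."}
]

def ask(client, question: str) -> str:
    history.append({"role": "user", "content": question})
    response = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer
```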
Business Lessons
1. Budget for Iteration: GenAI development costs escalate with usage; account for this in your budget.
2. Start in Dev Environments: Serious prototyping is faster outside of no-code platforms.
3. Constraints Force Clarity: Limited resources drive focus on core user value over flashy features.
The Uncomfortable Truth About GenAI Implementations
Most enterprise GenAI implementations focus on choosing the right language model or achieving perfect data queries. However, the real opportunity lies in building systems that embrace human messiness. Teams that prioritize understanding where users feel lost and build solutions around these pain points will gain a significant advantage.
Conclusion: Start with Confusion, Not Strategy
The most valuable outcome of my prototype wasn’t the technology but the insight into the gap between user confusion and data capabilities. This understanding emerged not from planning but from building. If you’re working in operations-heavy environments, don’t wait for perfect conditions—start prototyping. Ask where users feel lost, not what your AI strategy should be. The insights you need will come from the process of creation, not from the drawing board.