AI Adoption Problem-Solving Scenarios - Patterns for Solving Real Problems
Quote automation, new hire onboarding, marketing performance analysis — follow three real-world scenarios from problem definition to AI implementation, step by step.
We talked about defining problems and exploring the right tools. Now let me show you how that actually works in practice.
Enough theory. Let's get real.
In the previous two posts, we covered two key ideas.
First, before asking AI for answers, you need to craft the right questions. Second, tools aren't limited to software — you need to explore quickly and validate lightly.
But out in the field, we still hear this:
"Okay, but what does that actually look like in practice?"
Fair point. Frameworks only matter when they work in the real world. So in this post, we'll walk through three realistic scenarios from start to finish — from defining the problem, to exploring tools, to applying AI/AX.
Scenario 1: "It takes a full day to create a quote"
The typical approach
The sales team is in pain. Creating quotes takes way too long. The usual response? Jump straight to searching for "quote automation solutions."
Start with the problem definition
"Who is affected, in what situation, because of what, and what's the outcome?" After receiving a customer request, sales reps have to pull product specs, unit prices, discount rates, and delivery timelines from the ERP, spreadsheets, and old emails separately, then manually transfer everything into a quote template. The result: it takes an average of 6 hours to send a quote, and 40% of urgent requests don't get a same-day response.
Now the problem is clear. It's not that "creating quotes is slow." It's that information is scattered and manual aggregation is the bottleneck.
Tool exploration: four questions
① Can we eliminate it? We can't get rid of quotes entirely. But if we separate the information customers actually look at from the items included out of internal habit, we might be able to cut the template in half.
② Is the core issue judgment or repetitive execution? Setting the discount rate requires judgment. But gathering product specs, unit prices, and delivery timelines is pure repetitive work. 80% of the problem lives on the repetitive execution side.
③ What can we use right now? If the ERP has an API, connecting it to a spreadsheet might be possible without any custom development. And if past quote data has been accumulating, simply searching for similar cases could save time.
④ What's the cost of validation? The fastest experiment is this — analyze the patterns in the 50 most recent quotes, then test a prompt where you feed a customer request email into a generative AI and have it draft a quote. It takes one day.
Applying AI/AX
Day 1: Train a generative AI (Claude, GPT, etc.) on your past quote templates and product catalog, then build a workflow where inputting a customer request email generates a draft quote. The rep reviews the draft and only needs to decide on the discount rate.
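The Day 1 workflow can be sketched in a few lines. This is a minimal, illustrative version: the catalog fields, product-code format, and prompt wording are all hypothetical stand-ins for whatever your ERP and quote template actually contain, and the returned string is what you would hand to a generative AI as the drafting prompt.

```python
import re

# Hypothetical product catalog; in practice this comes from the ERP
# or a spreadsheet export.
CATALOG = {
    "PRD-100": {"name": "Industrial Pump A", "unit_price": 1200, "lead_time_days": 14},
    "PRD-200": {"name": "Control Valve B", "unit_price": 450, "lead_time_days": 7},
}

def build_quote_prompt(customer_email: str) -> str:
    """Pull product codes out of the request email and assemble a
    drafting prompt for a generative AI (Claude, GPT, etc.)."""
    codes = re.findall(r"PRD-\d+", customer_email)
    catalog_lines = [
        f"- {c}: {CATALOG[c]['name']}, unit price {CATALOG[c]['unit_price']}, "
        f"lead time {CATALOG[c]['lead_time_days']} days"
        for c in codes if c in CATALOG
    ]
    return (
        "You draft sales quotes. Using the catalog data below, write a draft "
        "quote responding to the customer email. Leave the discount rate "
        "blank for the sales rep to decide.\n\n"
        "Catalog data:\n" + "\n".join(catalog_lines) +
        "\n\nCustomer email:\n" + customer_email
    )

prompt = build_quote_prompt(
    "Hi, we need 3 units of PRD-100 and 10 of PRD-200 by next month."
)
```

Note what the code deliberately does not do: it never sets the discount rate. That stays in the judgment zone, exactly as the scenario prescribes.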
Week 1: Add ERP integration for automatic data retrieval. Enter a product code, and the latest unit price and delivery timeline auto-populate.
Month 1: Use the accumulated data to analyze patterns like "this discount rate increases the win probability for this customer." AI starts assisting even in the judgment zone.
Result: Quote creation time drops from 6 hours to 40 minutes. The people didn't change. The process changed, and AI was layered on top of it.
Scenario 2: "It takes a month to onboard a new hire"
The typical approach
HR wants to build an AI chatbot — one that automatically answers new hires' questions. Sounds reasonable.
Start with the problem definition
During their first month, new hires have to individually ask an average of 12 colleagues to learn how to use internal systems, company policies, and team-specific processes. This drains 40 hours per month from existing team members for each new hire, and it takes new employees an average of 4 weeks before they can do any real work.
The core problem is visible now: knowledge lives only in people's heads and isn't systematically documented.
Tool exploration
① Can we eliminate it? We can't eliminate onboarding itself, but we can eliminate the "asking 12 different people" pattern. If the same questions keep coming up, the answers just need to live in one place.
② Judgment vs. repetition? Most of what new hires ask are factual lookups. "Where do I submit a vacation request?" "What's the code review process?" These aren't judgment calls — they're information retrieval.
③ What can we use right now? The fastest tool isn't AI. It's asking people who recently joined or are about to leave to write down the 20 things they asked most during their first month. That list becomes the foundation for everything that follows.
④ What's the cost of validation? Two days to compile those 20 questions and answers into a single document. One month to validate whether it actually helps the next new hire.
Applying AI/AX
Immediately: Build the list of frequently asked questions and consolidate team-specific processes into a single document. This is human work, not AI — it's a process tool.
Week 2: Build an internal AI assistant on top of that consolidated documentation. Layer a RAG-based (Retrieval-Augmented Generation) chatbot over Notion, Confluence, or your internal wiki. New hires ask questions in plain language and get answers pulled from the relevant documents.
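The retrieval half of that RAG setup can be illustrated with a toy sketch. Production systems use embeddings and a vector store over your actual wiki; here, keyword overlap stands in for semantic search, and the two documents are invented examples. The output is the grounded prompt the chatbot's LLM would answer from.

```python
# Invented stand-ins for consolidated onboarding docs (Notion,
# Confluence, internal wiki pages).
DOCS = {
    "vacation": "Submit vacation requests in the HR portal under Time Off.",
    "code_review": "Code reviews: open a PR, request two reviewers, merge after approval.",
}

def retrieve(question: str, k: int = 1) -> list[str]:
    """Rank documents by naive keyword overlap with the question.
    Real RAG systems rank by embedding similarity instead."""
    q_words = set(question.lower().split())
    ranked = sorted(
        DOCS.values(),
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def build_rag_prompt(question: str) -> str:
    """Assemble the prompt an LLM would answer: retrieved context first,
    then the new hire's question."""
    context = "\n".join(retrieve(question))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

rag_prompt = build_rag_prompt("Where do I submit a vacation request?")
```

The design point survives even in the toy version: the model answers from retrieved documents, so the quality ceiling is set by the documentation you consolidated in the "Immediately" step, not by the model.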
Month 1: Collect the log of questions the chatbot couldn't answer. That log becomes a list of "undocumented institutional knowledge." Document those, and you end up overhauling the organization's entire knowledge system.
Result: Onboarding drops from 4 weeks to 2, and the time drain on existing team members falls from 40 hours to 10. But the real win is something else entirely. The knowledge base built through this process becomes valuable not just for new hires, but for existing employees too. You set out to solve an onboarding problem, and you ended up improving information access across the entire organization.
Scenario 3: "We have no idea how marketing is performing"
The typical approach
The marketing team wants to adopt an AI analytics dashboard — a sleek screen that shows all the data at a glance. They schedule a vendor meeting.
Start with the problem definition
The marketing team runs campaigns across 5+ channels (search ads, social media, email, content, offline), but performance data is scattered across different platforms. It takes 3 days each month to compile a unified report, and by the time it's done, the decision-making window has already closed — insights never make it into the next campaign.
The core issue isn't "we need better analysis." It's that slow data consolidation means insights expire before they're used.
Tool exploration
① Can we eliminate it? Are all 5 channels actually necessary? If you cut underperforming channels, you reduce the amount of data that needs consolidating in the first place. Trimming down to 3 channels alone could cut report creation time by 40%.
② Judgment vs. repetition? Downloading data from each platform, reformatting it, and merging it together is pure repetition. But interpreting "why search ad conversion rates dropped this month" is judgment. Automate the repetition, and people can focus on the judgment.
③ What can we use right now? Google Ads, Meta, GA4 — most ad platforms already offer APIs. Just connecting them to Looker Studio or a spreadsheet eliminates manual downloads. This isn't AI — it's using existing tools properly.
④ What's the cost of validation? One day to connect APIs to a spreadsheet. One week to compare it against the old manual report.
Applying AI/AX
Day 1: Set up automatic data collection from each ad platform into a single spreadsheet or data warehouse. This is data engineering, not AI. But if you skip this step and layer AI on top, you end up with AI sitting on a mess.
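The unglamorous core of that Day 1 step is schema normalization: every platform exports the same concepts under different column names and units. A minimal sketch, with entirely hypothetical field names (Google Ads really does report cost in micros, but the rest of the mapping is illustrative):

```python
def normalize(platform: str, row: dict) -> dict:
    """Map one platform's export row into a shared schema so downstream
    tools (and later, AI) see one consistent table."""
    # Hypothetical per-platform column mappings: source field -> shared field.
    mapping = {
        "google_ads": {"cost_micros": "spend", "clicks": "clicks"},
        "meta": {"spend": "spend", "link_clicks": "clicks"},
    }
    out = {"platform": platform}
    for src, dst in mapping[platform].items():
        value = row[src]
        if src == "cost_micros":  # Google Ads reports cost in micros of the currency
            value = value / 1_000_000
        out[dst] = value
    return out

rows = [
    normalize("google_ads", {"cost_micros": 12_500_000, "clicks": 340}),
    normalize("meta", {"spend": 9.8, "link_clicks": 210}),
]
total_spend = sum(r["spend"] for r in rows)
```

Once every channel lands in this one shape, "layer AI on top" becomes a realistic next step instead of AI sitting on a mess.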
Week 1: Connect generative AI to the consolidated data. Natural language queries become possible: "Summarize this week's search ad performance," "Show me the top 3 creatives by conversion rate." The time spent creating reports disappears — only the time spent reading them remains.
Month 1: AI starts detecting anomalies automatically. "CPC rose 35% yesterday compared to the average. Likely causes: ① increased competitor bidding, ② keyword quality score decline." AI flags issues before anyone even asks.
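The detection half of that alert doesn't even require a model. A deliberately simple rule, flagging any day that deviates from the trailing average by more than a threshold, catches the "CPC rose 35%" case; the cause hypotheses are where the AI layer adds value. The function and threshold here are illustrative assumptions, not a production monitoring design (real baselines would account for seasonality).

```python
def flag_anomaly(history: list[float], today: float, threshold: float = 0.3):
    """Return an alert string when today's metric deviates from the
    trailing average by more than `threshold` (30% by default),
    else None. A toy rule for illustration."""
    baseline = sum(history) / len(history)
    change = (today - baseline) / baseline
    if abs(change) > threshold:
        direction = "rose" if change > 0 else "fell"
        return f"CPC {direction} {abs(change):.0%} vs the {len(history)}-day average"
    return None

# Yesterday's CPC of 1.35 against a flat 1.00 baseline is a 35% jump.
alert = flag_anomaly([1.0, 1.0, 1.0, 1.0], 1.35)
quiet = flag_anomaly([1.0, 1.0, 1.0, 1.0], 1.10)  # within threshold, no alert
```

Wire a check like this to run daily over the consolidated data, and the issue gets flagged before anyone thinks to ask.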
Result: Report creation goes from 3 days to real-time auto-generation, and decision-making shifts from once a month to at least once a week. The marketing team's role transforms from "people who organize data" to "people who make decisions based on insights."
The common pattern across all three scenarios
Looking back, all three cases follow the same structure.
Step 1 — Define the problem concretely. Go beyond "it's slow," "we don't know," or "it's hard" — break it down into who, situation, cause, and outcome.
Step 2 — Find the non-AI tools first. Process cleanup, eliminating unnecessary steps, better use of existing systems — this alone often solves a significant chunk of the problem.
Step 3 — Layer AI on top of the reduced problem. AI works best on clean data and clear processes. Stack AI on top of chaos, and all you get is faster chaos.
Step 4 — Start small and let results drive the next step. A one-day experiment → a one-week pilot → a one-month validation. Stick to this rhythm, and failures stay cheap while successes give you the confidence to scale.
AI transformation isn't a technology project
One message runs through all three of these posts.
AI transformation (AX) isn't about adopting technology — it's about changing how you solve problems. The ability to define problems well, the habit of exploring tools with an open mind, and the discipline to validate quickly through small experiments — organizations that have these three things will adapt no matter what technology comes along. AI is just the accelerator that sits on top.
Technology keeps changing. But the ability to define good problems, find the right tools, and execute fast — that's timeless.