91% of mid-market companies say they’re using AI. Only 11% have anything in production. Which number are you?
TL;DR: There are four stages of AI adoption: Experimenting, Piloting, Shipping, and Compounding. Most companies are stuck in the first two. This assessment helps you figure out where you are—and what it takes to move forward. The gap between “using AI” and “shipping AI” is where value lives.
The Four Stages
AI adoption isn’t binary. It’s not “using” or “not using.” It’s a spectrum.
Experimenting → Piloting → Shipping → Compounding
Most companies are in the first stage. They say they’re “using AI” because someone has a ChatGPT subscription. That’s not adoption. That’s experimentation.
Stage 1: Experimenting
You’re here if:
- People use AI tools individually, for ad-hoc tasks
- No shared processes or standards
- Results vary wildly by person
- No integration with existing workflows
- AI is a curiosity, not a capability
What it looks like: The marketing person uses ChatGPT to draft emails. The developer uses Copilot for autocomplete. The CEO uses Claude to summarize documents. Everyone’s exploring. Nobody’s coordinating.
Moving to Piloting requires: Picking one use case. Assigning an owner. Defining what success looks like.
Stage 2: Piloting
You’re here if:
- You have 1-3 AI projects in active development
- There’s a defined scope and owner
- Results are promising but not proven
- The system works when the builder is present
- It hasn’t been through production stress
What it looks like: The team built something. It’s impressive in demos. It handles the happy path. But it needs the person who built it to run it. Edge cases cause problems. There’s no monitoring. There’s no documentation.
You have a demo, not a system.
Moving to Shipping requires: Working through the production checklist of reliability, observability, security, maintainability, and independence. If you can’t tick most of those boxes, you’re not ready to ship.
Stage 3: Shipping
You’re here if:
- You have AI systems running in production
- They handle real workloads without constant supervision
- They’re monitored and maintained
- Your team can operate them without vendor support
- They deliver measurable value
What it looks like: The AI system runs. It processes real data. It makes real decisions (or assists real decisions). When it fails, alerts fire. When it needs updates, there’s a process. The value is measurable—time saved, errors reduced, throughput increased.
This is where most companies want to be. Few get here.
Moving to Compounding requires: Systematizing what you learned. Codifying patterns. Building reusable components. Making each project easier than the last.
Stage 4: Compounding
You’re here if:
- AI is embedded across multiple workflows
- New AI projects ship faster than previous ones
- Knowledge is documented and shared
- Non-specialists can contribute to AI systems
- Each implementation makes the next one easier
What it looks like: The team has shipped multiple AI systems. Each one was easier than the last. There are templates. There are patterns. There are documented approaches. When someone new joins, they’re productive quickly. When a new use case emerges, the team knows how to evaluate it.
This is organizational capability, not just technology.
Very few companies reach this stage. The ones that do hold a massive advantage.
The Assessment
Score yourself honestly. For each question, pick the answer that best describes your situation.
Question 1: How is AI being used?
- (1) Individuals experimenting on their own
- (2) One or two coordinated projects in progress
- (3) AI systems running in production
- (4) AI embedded across multiple workflows
Question 2: Who can operate your AI systems?
- (1) Only the person who built it
- (2) The team that built it, with some documentation
- (3) Anyone on the tech team with access
- (4) Non-technical staff can use and maintain them
Question 3: What happens when AI fails?
- (1) We find out when someone complains
- (2) The builder gets a call
- (3) Alerts fire, runbooks exist
- (4) Automatic fallbacks handle most failures
Question 4: How long to ship a new AI use case?
- (1) We haven’t shipped one yet
- (2) Months of work
- (3) Weeks of work
- (4) Days to weeks, depending on complexity
Question 5: What’s documented?
- (1) Nothing formal
- (2) Some notes the builder made
- (3) Full documentation for production systems
- (4) Playbooks, patterns, and reusable components
Question 6: How is learning shared?
- (1) It isn’t
- (2) Informal conversations
- (3) Post-mortems and documentation
- (4) Codified into prompts, templates, and processes
Question 7: What value has AI delivered?
- (1) Hard to say—mostly exploration
- (2) Promising results in pilots
- (3) Measurable value in production (time, cost, quality)
- (4) AI is core to how we operate
Scoring
Add up your answers across the seven questions (a quick scoring sketch follows the stage bands below).
7-10: Experimenting
You’re exploring. That’s fine. But you’re not getting value yet. Pick one use case. Go deeper. Stop spreading attention across experiments.
11-17: Piloting
You’re building. You have promising projects. But they’re not production-ready. Focus on the checklist: reliability, observability, security, maintainability, independence.
18-24: Shipping
You’re delivering value. But each project is standalone. Start systematizing. Document what works. Build reusable components. Make the next project easier.
25-28: Compounding
You’re ahead of 99% of companies. Keep codifying. Keep sharing. The advantage compounds.
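To make the arithmetic concrete, here’s a minimal sketch in Python that sums the seven answers and maps the total to the bands above. The names (stage_for, STAGE_BANDS) are illustrative only; they’re not part of the assessment.

```python
# A minimal sketch of the scoring above, assuming seven answers each scored 1-4.

STAGE_BANDS = [
    (range(7, 11), "Experimenting"),   # 7-10
    (range(11, 18), "Piloting"),       # 11-17
    (range(18, 25), "Shipping"),       # 18-24
    (range(25, 29), "Compounding"),    # 25-28
]

def stage_for(answers):
    """Sum seven answers (each 1-4) and map the total to a stage."""
    if len(answers) != 7 or any(a not in (1, 2, 3, 4) for a in answers):
        raise ValueError("expected seven answers, each scored 1 to 4")
    total = sum(answers)
    for band, stage in STAGE_BANDS:
        if total in band:
            return total, stage

# Example: mostly coordinated pilots, one production-grade answer.
print(stage_for([2, 2, 2, 3, 2, 2, 2]))  # (15, 'Piloting')
```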
Moving Forward
Wherever you scored, the path forward is the same: one stage at a time.
From Experimenting to Piloting:
- Pick one use case with clear value
- Assign an owner
- Define success criteria
- Set a timeline
From Piloting to Shipping:
- Run through the production-ready checklist
- Build monitoring before you need it (see the sketch after this list)
- Document while you build, not after
- Plan for the builder to step back
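On “build monitoring before you need it”: below is a minimal sketch of what that can mean for a pilot. It wraps a model call with timing, error logging, and a fallback. `call_model` is a placeholder for whatever your system actually invokes, and the 10-second threshold is an assumption to tune, not a recommendation.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ai_pipeline")

SLOW_CALL_SECONDS = 10  # assumed threshold; tune to your own workload

def call_model(prompt: str) -> str:
    """Placeholder for whatever your pilot actually calls (API, local model, etc.)."""
    raise NotImplementedError

def monitored_call(prompt: str, fallback: str = "") -> str:
    """Wrap the model call with timing, error logging, and a safe fallback."""
    start = time.monotonic()
    try:
        result = call_model(prompt)
    except Exception:
        log.exception("model call failed; returning fallback")
        return fallback
    elapsed = time.monotonic() - start
    log.info("model call took %.1fs", elapsed)
    if elapsed > SLOW_CALL_SECONDS:
        log.warning("model call exceeded %ss threshold", SLOW_CALL_SECONDS)
    return result
```

The specifics will differ by stack; the point is that failures surface through logs and alerts rather than through complaints (Question 3).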
From Shipping to Compounding:
- Document what worked (and what didn’t)
- Create templates for common patterns
- Share knowledge across teams
- Make the next project easier
The 91% who are “using AI” are mostly experimenting. The 11% who are shipping have crossed a real threshold. The companies that are compounding are building a lasting advantage.
The Bottom Line
91% using AI. 11% shipping. The gap isn’t about technology. It’s about discipline. Experimenting is easy. Piloting is harder. Shipping is where most fail. Compounding is rare. Which stage are you in? And what would it take to move to the next one? That’s the question that matters.
Need help moving to the next stage? We close the gap between experimenting and shipping. Let’s talk.

