
Part of the AI Transformation series

AI & Technology · 6 min read · January 30, 2026

AI and Data / IoT Strategy

Most AI strategies fail because they start with technology instead of business problems. Here is the order of operations that works.

Key Takeaways

  • Industry studies put AI project failure rates at 70-85%; the common thread is starting with the technology instead of the business problem.
  • The correct order: business problem first, data audit second, model selection third, implementation fourth.
  • Before investing in AI, answer: "What decision are we trying to make better, and what data do we need to make it?"
  • Start with one high-value use case that has clean data. Prove ROI before scaling.

Industry studies report 70-85% of AI projects fail when they start with technology instead of a business problem. I've seen the same pattern in enterprise and growth-stage work since 2008. The correct order: define the business problem with a measurable outcome, audit your data for quality and completeness, select the simplest approach that works, and implement with outcome measurement. A first AI use case should ship in 4-8 weeks.

A CEO told me he needed an "AI strategy." I asked what business problem he was trying to solve. He paused. "I do not know yet. But our competitors are all investing in AI, and our board is asking about it."

What Is AI and Data / IoT Strategy?

An AI and data strategy for IoT is the sequencing of data pipelines, governance, and models so connected products produce decisions, not just telemetry. You decide what to instrument, what to centralize, and where automation earns margin before you fund science projects. Mid-market teams win when they tie each initiative to a revenue or cost outcome.

That conversation captures the pattern: most companies start with the technology and work backwards toward a problem. The successful ones do the opposite.

The Right Order of Operations

Step 1: Define the Business Problem

Not "we need AI." Instead: "Our sales team spends 40% of their time on leads that never convert. If we could predict which leads will convert before they enter the funnel, we could redirect that time to high-probability deals."

That is a business problem with a clear outcome: reduce wasted sales time by 40%, increase revenue per rep, shorten the sales cycle. This is also one of the highest-ROI AI applications for revenue teams.

Step 2: Audit Your Data

AI is only as good as the data it learns from. Before building anything, audit what you have:

  • Is the data clean? (Accurate, consistent, complete)
  • Is there enough of it? (Most ML models need thousands of examples)
  • Is it accessible? (In a database, not in spreadsheets or people's heads)
  • Is it labeled? (For supervised learning, you need historical outcomes attached to the data)
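A first-pass audit like this can be scripted before anyone touches a model. A minimal sketch, assuming lead records arrive as dictionaries; the field names ("email", "converted") and the volume threshold are hypothetical placeholders:

```python
# Minimal data-audit sketch: completeness, volume, and label coverage.
# Field names and MIN_EXAMPLES are illustrative, not prescriptive.

REQUIRED_FIELDS = ["email", "company_size", "industry"]
MIN_EXAMPLES = 1000  # classical ML models typically want thousands of rows

def audit(records, label_field="converted"):
    """Return audit metrics for a list of record dicts."""
    n = len(records)
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in REQUIRED_FIELDS)
    )
    labeled = sum(1 for r in records if r.get(label_field) is not None)
    return {
        "rows": n,
        "complete_pct": round(100 * complete / n, 1) if n else 0.0,
        "labeled_pct": round(100 * labeled / n, 1) if n else 0.0,
        "enough_volume": n >= MIN_EXAMPLES,
    }

sample = [
    {"email": "a@x.com", "company_size": 50, "industry": "retail", "converted": True},
    {"email": "", "company_size": 10, "industry": "fintech", "converted": None},
]
print(audit(sample))
```

If "labeled_pct" or "enough_volume" comes back low, that is your timeline warning before the project starts, not after.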

I worked with a company that wanted to build a churn prediction model. When we audited their data, we found that churn reasons had not been tracked for the first two years. They had usage data but no outcome labels. We had to build the labeling system first, collect 6 months of data, and then build the model. Had they audited the data first, they would have known the timeline was 9 months, not 3.
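The labeling system that company had to build first does not need to be elaborate. A sketch, assuming churn is defined as 90 days of inactivity; the window and the way activity is tracked are illustrative choices, not the client's actual definition:

```python
from datetime import date, timedelta

CHURN_WINDOW = timedelta(days=90)  # illustrative churn definition

def label_churn(last_active: date, as_of: date) -> bool:
    """Attach an outcome label: churned if inactive for the full window."""
    return (as_of - last_active) >= CHURN_WINDOW

today = date(2026, 1, 30)
print(label_churn(date(2025, 9, 1), today))   # inactive ~5 months -> True
print(label_churn(date(2026, 1, 10), today))  # active 20 days ago -> False
```

Running a rule like this nightly over usage data is what turns raw telemetry into the historical outcomes a supervised model needs.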

Step 3: Select the Approach

Based on the problem and data, choose the simplest approach that works:

  • Rule-based logic: If the data patterns are obvious and the decisions are binary, you do not need ML. An if/then rule set is cheaper, faster, and easier to maintain.
  • Classical ML: If you have structured data with clear features and labels, standard models (regression, decision trees, random forests) work well and are interpretable.
  • Deep learning / LLMs: If you have unstructured data (text, images, conversations) or very complex pattern recognition needs, more sophisticated models are warranted.

The mistake is jumping to the most complex option. Start simple. Graduate to complex when simple is not enough. For one practical example of this progression, see how AI personalization engines move from rules to ML to real-time adaptation.
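To make "start simple" concrete, here is what a rule-based lead scorer might look like before any ML is involved. The fields, thresholds, and industries are hypothetical:

```python
def score_lead(lead: dict) -> str:
    """Triage with explicit if/then rules -- cheap, fast, and auditable."""
    if lead.get("company_size", 0) >= 100 and lead.get("budget_confirmed"):
        return "high"
    if lead.get("industry") in {"healthcare", "fintech"}:
        return "medium"
    return "low"

print(score_lead({"company_size": 250, "budget_confirmed": True}))  # high
print(score_lead({"industry": "retail"}))                           # low
```

When the rules stop capturing the patterns, you already have labeled triage decisions to train a classical model on.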

Step 4: Implement with Measurement

Build the first version in 4-8 weeks. Measure the business outcome, not the model accuracy. An ML model that is 85% accurate but moves no revenue metric is a failed project. A rule-based system that is 70% accurate and saves 20 hours of sales time per week is a success.
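The outcome metric itself can be scripted into the weekly review. A sketch with hypothetical numbers, measuring reclaimed sales hours rather than model accuracy:

```python
def weekly_hours_saved(leads_routed_away: int, minutes_per_lead: float) -> float:
    """Business metric: sales hours reclaimed by filtering low-probability leads."""
    return leads_routed_away * minutes_per_lead / 60

# Hypothetical week: 120 low-probability leads filtered at 10 minutes each.
print(weekly_hours_saved(120, 10))  # 20.0 hours -- clears the success bar above
```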

IoT Strategy: Same Principle

IoT projects follow the same pattern. The question is not "how many sensors can we deploy?" It is "what decision do we need to make better with real-time data?"

A manufacturing client wanted to deploy IoT sensors across their fleet. I asked: "What will you do differently with the sensor data?" After some discussion, the answer was: "predict maintenance needs before equipment fails." That is a business problem. From there, we could scope the data requirements, sensor placement, and decision model. The same pattern applies in telecom and connected infrastructure, where IAM orchestration for IoT adds a critical security layer.
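Even "predict maintenance needs" can start as a telemetry rule rather than a deep model. A sketch flagging a machine when its latest reading drifts from its recent baseline; the sensor values and the three-sigma threshold are illustrative:

```python
from statistics import mean, stdev

def needs_maintenance(readings: list[float], sigma: float = 3.0) -> bool:
    """Flag a machine when its latest reading drifts beyond `sigma` standard
    deviations of its recent baseline -- a rule-based precursor to an ML model."""
    baseline, latest = readings[:-1], readings[-1]
    mu, sd = mean(baseline), stdev(baseline)
    return abs(latest - mu) > sigma * sd

vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 0.95]  # last reading spikes
print(needs_maintenance(vibration))  # True
```

A rule like this answers the scoping questions directly: which sensors matter, how much history you need for a baseline, and what decision fires when the flag trips.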

Get the Growth Diagnostic Framework

The same diagnostic I run in the first 14 days of every engagement. Three biggest revenue gaps, prioritized with dollar impact.

Book a diagnostic

Your First Step

Write one sentence: "The business decision we want AI to improve is [X], and the metric we will use to measure success is [Y]." If you cannot complete that sentence with specifics, you are not ready for an AI project. You are ready for a problem definition workshop. The AI transformation guide walks through the full readiness framework.

If you want help defining your AI or IoT strategy, book a diagnostic.

Frequently Asked Questions

How long does it take to see results?

Most teams see the first measurable movement within 4-6 weeks once KPI ownership and the weekly cadence are in place. The bigger shifts usually show up within two quarters.

What metrics should I track first?

Start with the one metric closest to revenue and the one metric closest to leakage. If you cannot connect a metric to a P&L outcome, it is not a first-week metric.

What is the most common reason AI and Data / IoT Strategy fails?

Lack of ownership. The work gets discussed, but no one owns the KPI, the meeting, and the follow-up. When the cadence breaks, execution drifts.

If you want help applying this to your AI and Data / IoT strategy, book a diagnostic.

Use The KPI Tree Framework to connect action to a P&L outcome, then course-correct weekly.



Dhaval Shah

Fractional Leader

26+ years in product and revenue operations. $50M+ revenue influenced across healthcare, fintech, retail, and telecom.

Connect on LinkedIn

AI strategy that connects to revenue?

I focus on the 2-3 AI applications with the fastest path to ROI. No science projects. 30-minute call to identify the highest-impact AI investment for your business.

Start with proof in case studies, then review engagement models.

Book a diagnostic