AI Isn't Failing Revenue Teams. It's Exposing the Gaps Between Them.
Stephen Ko, GM for APAC at Jeen.ai, breaks down why revenue teams, not IT, must own the business logic behind enterprise AI.

Key Points
When marketing, sales, customer success, and finance each deploy their own AI tools with their own definitions of success, the revenue chain fragments, and executives lose confidence in the insights AI produces.
Stephen Ko, General Manager for APAC at Jeen.ai, identifies siloed AI capabilities and misaligned data definitions as the root cause of enterprise AI stagnation, not the technology itself.
He recommends consolidating AI tools under a shared semantic layer that encodes unified business definitions across functions, enabling faster ROI without multi-year data overhauls.
Executive frustration with enterprise AI often lands on the technology itself. But for revenue teams, the deeper issue is usually structural. When marketing, sales, customer success, and finance each deploy their own AI tools and define their own success metrics, the revenue chain fragments. A quality lead in marketing may not register as a meaningful engagement in sales, and neither definition maps cleanly to how finance recognizes revenue. The tools are doing what they were built to do. The problem is that no one agreed on what to build them around.
Stephen Ko serves as General Manager for APAC at Jeen.ai, a unified enterprise AI platform that helps organizations accelerate adoption by bringing every model, AI agent, workflow, and team together in one place, with knowledge systems and built-in governance and without silos, security gaps, or vendor lock-in. His background includes managing operations teams at Atos International and holding APAC leadership roles at Amazon Web Services and Hitachi Vantara. That cross-functional experience informs a pattern he sees repeatedly: companies adopt AI in pockets, and the resulting fragmentation is what erodes executive confidence. "The primary factor driving adoption isn't distrust itself, but siloed AI capabilities. When different teams are using different definitions of what the data looks like, the misalignment causes information to not be well perceived by the executive," Ko says.
The magic wand myth: When organizations recognize the need for AI alignment, the project often lands on IT's desk by default. "The responsibility quite often gets pushed to IT. It feels like IT has the magic wand to set this up. Unfortunately, IT is just an enabler, the executor. They don't know how to define the business logic," Ko says. The instinct is understandable. IT controls the infrastructure, manages vendor relationships, and has the technical fluency to deploy new tools. But defining what a quality opportunity looks like, how churn risk should be measured, or when revenue can be recognized are decisions that live with the people closest to the deal cycle. That planning work belongs to revenue operations and go-to-market leaders. Without their input, IT builds systems around assumptions rather than actual business rules, and the resulting outputs are the ones executives later dismiss as unreliable.
No delegation without direction: Even when revenue teams take ownership of the planning, the process stalls without executive backing. "It shouldn't just be, 'Revenue operations or the CRO, you do the job, and IT will implement, and things will run smoothly.' That's not going to happen. We need the executive to reinforce how to do it," Ko says. Executives can't delegate alignment and walk away. They have to enforce that new data follows the same definitions as existing information, and that governance is what keeps the system consistent as it scales. Some leaders hesitate because they assume they lack a dedicated AI task force, but most companies already have the people they need. What's missing is a shared framework for how data translates across teams.
That framework is the semantic layer. In practice, it functions as a unified translation mechanism that encodes how the business defines success so AI systems apply rules consistently across every function. When marketing defines a quality opportunity, the semantic layer maps that definition to how sales measures engagement, how customer success tracks retention risk, and how finance recognizes revenue. Instead of each department interpreting data through its own lens and producing conflicting reports, the semantic layer establishes a single authoritative reference point. The AI still does the analysis, but it operates from shared rules rather than siloed assumptions.
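To make the idea concrete, here is a minimal sketch of what a semantic layer does, assuming a hypothetical in-memory registry with illustrative metric names and thresholds (production semantic layers such as dbt's MetricFlow or Cube are far richer). The point it demonstrates is the one above: each metric has a single authoritative definition that every function resolves through, rather than each team redefining it locally.

```python
# Hypothetical semantic layer: one shared definition per business metric.
# Metric names, fields, and thresholds are illustrative assumptions,
# not any vendor's actual schema.

SEMANTIC_LAYER = {
    "qualified_opportunity": {
        # Marketing, sales, and finance all resolve this same rule.
        "definition": lambda lead: (lead["engagement_score"] >= 70
                                    and lead["budget_confirmed"]),
        "owners": ["marketing", "sales", "finance"],
    },
    "churn_risk": {
        "definition": lambda account: (account["days_since_login"] > 30
                                       or account["support_escalations"] >= 2),
        "owners": ["customer_success", "finance"],
    },
}

def evaluate(metric: str, record: dict) -> bool:
    """Resolve a metric through the shared layer, so every team
    applies the same rule to the same record."""
    return SEMANTIC_LAYER[metric]["definition"](record)

lead = {"engagement_score": 82, "budget_confirmed": True}
print(evaluate("qualified_opportunity", lead))  # True for every team
```

The design choice worth noting is that the definitions live in one governed registry, so changing what "qualified" means is a single, auditable edit rather than four divergent dashboard formulas.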
Lost in translation: "I certainly believe AI will be able to learn more, but eventually, it's up to humans to explain what they mean. What something means in one organization doesn't apply to all organizations. The semantic layer is basically the guidebook," Ko says. AI can process and pattern-match at scale, but it cannot independently determine that one company's definition of churn means something entirely different from another's. That institutional knowledge lives with the people already on the revenue team. When customers express concern about a lack of dedicated AI roles, Ko reminds them that the talent is already in place. "You already have your CRO, your finance team, and your go-to-market teams. Eventually, it's about how we're going to manage revenue holistically," he adds. The gap isn't personnel. It's the absence of a shared language that connects what each of those teams already knows.
Don't boil the ocean: Messy data is one of the most common friction points in enterprise AI. Companies with multiple systems and inconsistent records often feel they need to scrub every historical file before they can move forward, but that effort quickly becomes an endless loop. "By the time they clean up all the data, there's already a new set of data coming in. It will never end. It's a vicious cycle. What we suggest is starting to clean up the garbage, the really big troublemaker data first," Ko says. Rather than pursuing a full historical cleanse, the more tactical approach is to prioritize the worst offenders and then establish a semantic layer so new information is governed as it enters the system. Once those shared definitions are in place, incoming data follows consistent rules by default, which gradually reduces the burden of cleaning what came before.
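The triage approach Ko describes can be sketched as a small scoring loop, under the assumption of a hypothetical record schema and violation rules chosen here purely for illustration: score each record by how badly it violates the shared definitions, surface only the worst offenders for cleanup, and gate incoming data so it enters clean.

```python
# Illustrative sketch: prioritize the "big troublemaker" records and
# gate new data against shared definitions. Field names and rules are
# assumptions for the example, not a real schema.

REQUIRED_FIELDS = {"account_id", "stage", "amount"}

def violation_score(record: dict) -> int:
    """Count definition violations; a higher score means a bigger troublemaker."""
    score = len(REQUIRED_FIELDS - record.keys())  # missing required fields
    if record.get("amount", 0) < 0:               # impossible values
        score += 1
    return score

def triage(records: list[dict], top_n: int = 2) -> list[dict]:
    """Surface only the worst offenders for cleanup, not the full history."""
    return sorted(records, key=violation_score, reverse=True)[:top_n]

def admit(record: dict) -> bool:
    """Governance gate: new data must satisfy the shared definitions."""
    return violation_score(record) == 0

history = [
    {"account_id": "a1", "stage": "closed", "amount": 1200},  # clean
    {"stage": "open"},                                        # missing fields
    {"account_id": "a3", "amount": -50},                      # missing + negative
]
worst = triage(history)  # the two broken records, not the clean one
```

Historical cleanup then works down the triage list a few records at a time, while `admit` keeps the backlog from growing, which matches the article's point that governed intake gradually shrinks the legacy mess.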
Tick-tock tech: Once a basic level of readiness is in place, the next step is auditing the current AI landscape. If different functions have already deployed several siloed products, consolidation is the priority. With a shared semantic layer defining what data means, a unified platform aligns each team's requirements and relieves the pressure to endlessly scrub historical records. But that consolidation has to happen fast. Leadership teams are asking for ROI in months, not years, and the pace of AI advancement makes multi-year IT overhauls a losing proposition. Disciplined, phased execution that focuses on two or three priorities at a time keeps deployments moving fast enough to show value while remaining grounded in clear definitions. "If we talk about eighteen to twenty-four months, that's too long nowadays. Six to nine months down the road, AI will advance a lot. AI will keep advancing by itself, but it still needs to fit into the corresponding organizational structures to gain the optimal benefit," Ko says.
Balancing speed and structure starts with getting cross-functional teams in the same room to align on how they define core metrics. Marketing, sales, customer success, and finance each contribute their own definitions of things like quality opportunities, churn, and recognized revenue, and a semantic layer encodes those shared definitions so AI outputs reflect the same meanings for every function. In that kind of environment, marketing and sales connect quality opportunities to engagement metrics. Customer success reads the same signals about churn risk that finance uses to forecast revenue. Instead of departments operating from separate interpretations, the whole revenue chain works from a consistent view of what the data means.
But operational alignment is only part of the picture. Governance extends well beyond internal data management. As the semantic layer becomes a standard component of modern data stacks and AI capabilities continue to accelerate, the question shifts from whether organizations can make these tools work to whether they can deploy them responsibly. "Apart from business achievement, AI governance and ethics can't be an afterthought; they must be built in by design. At Jeen.ai, we embed governance, compliance, and ethics into every layer: auditability, FinOps, and explainability. Because AI will keep advancing, we need the right guardrails to ensure it advances responsibly, not just for organizations but for society," Ko says.




