What Three AI Startup CEOs Got Right, Got Wrong, and Won't Stop Thinking About

by Rebecca Blouin | April 7, 2026

Not long ago, the AI demo was enough. Investors leaned in. Customers got excited. Everyone wanted a piece of it.

That phase is over.

The excitement has given way to something harder—the work of making AI production-ready. At our recent roundtable, Architect It Once or Rebuild It Twice, we brought together three CEOs of AI-native startups who are living this in real time: Alex Noe of AnySoft, Asa Whillock of Euphonic AI, and Akash Sureka of TheNoah.ai, joined by CData's Field CTO Rahul Pahuja. The conversation was candid, specific, and occasionally humbling.

Here's what they shared.

What they got right—and what they'd do differently

Every one of these founders has shipped AI features to real customers. And every one of them has something they'd go back and change.

Alex Noe (AnySoft) said they got integrations right from the start—one shared database across all the apps they build, with the best integrations in the world behind it. The thing he'd go back and change? Analytics. "We didn't know exactly what was going on with our users and where they were clicking." When you're building fast, instrumentation gets deprioritized. It shouldn't.

Asa Whillock (Euphonic AI) called out integrations as his biggest do-it-earlier regret. They invested in integrations early. It still wasn't early enough. "Working with integrations at scale? Everything has hair on it. That is the major gate for driving everything forward."


Akash Sureka (TheNoah.ai) was unambiguous about the importance of getting the data layer right. "In the AI world, system of record is all that data. That's the most foundational layer moving forward. That's where it's going to make or break."

The data integration problem is bigger than it looks

All three platforms depend heavily on customer data. And all three will tell you the same thing: it's messier than it looks.


Asa described working across five to 15 go-to-market stacks in a single customer environment where "not a one of them agreed with each other." The problem, he was quick to say, isn't really a data engineering problem. It's about connecting to those systems and asking a harder question: what was actually meant to be the ground truth here? What was the process that was supposed to be in place?

That's what makes data context so difficult—and so important. Asa put it plainly: models are the easy part, but "without context, they really don't know anything at all." The frontier models are capable. What they lack, in most deployments, is the understanding of why the data looks the way it does. That's what breaks AI features in production.

Don't try to build connectivity yourself


When asked whether they'd tried to build connectivity in-house, the answers were: briefly, and never again.

Asa was direct: "The perception is that APIs abound—AI should be able to figure that out quickly for you, plug into the data and get going. What's to know? That could not be farther from the truth." Even well-documented APIs have undocumented components, mismatched performance semantics, and variations that stack before you ever reach the semantic layer—which, by the way, is custom in every single account. His advice: "Why are you going to reinvent the wheel if there's capability available to you so you can specialize in what you actually want to be great at?"

Connectivity isn't enough. Accuracy is

Connectivity gets your AI features off the ground. But as you move toward agentic workflows—where there's no human in the loop double-checking outputs—accuracy becomes the real problem. A wrong answer at step one compounds at step two, step three, step four.
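The compounding effect is easy to underestimate. As a rough illustration (an assumption for the sake of the math, not a figure from the roundtable): if each step in an agentic workflow is independently correct with probability p, the chance the whole n-step chain is correct is p to the power n.

```python
# Illustrative sketch: per-step accuracy compounds multiplicatively
# across an agentic workflow with no human checking intermediate steps.
# The 0.95 figure is a hypothetical, not from the study discussed here.
def workflow_accuracy(per_step_accuracy: float, steps: int) -> float:
    """Probability every step succeeds, assuming independent steps."""
    return per_step_accuracy ** steps

for n in (1, 2, 4, 8):
    print(f"{n} steps: {workflow_accuracy(0.95, n):.1%}")
# A 95%-accurate step looks fine in isolation; chained eight deep,
# the workflow lands around 66%.
```

The independence assumption is generous—in practice a wrong answer at step one often makes later steps worse, not just independently risky.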

CData's Field CTO Rahul Pahuja walked through findings from a study evaluating five MCP providers across four enterprise systems and 378 standardized test runs. CData came in more than 25 percentage points more accurate than the next closest provider. The reasons came down to four things: relative date logic, multi-filter queries, semantic intelligence, and write operations. In each case, the issue wasn't the model—it was what the connectivity layer handed the model to work with.


Asa put the business case plainly: "I've been an analytical leader for twenty years. I can tell you how many customers have asked for a system where, if you ask the same question two or three times, it comes up with a different answer. It's zero. Not one of them wants to rely on a system like that. Accuracy is key."

What they're building toward


Akash is focused on two things over the next six months: large datasets and relationships. As datasets scale into the tens of millions of records, everything changes—context windows, memory layers, latency. And data relationships, which vary company to company and department to department, are the real key to making everything else work. "Understanding business relationships and data relationships—that's the central pillar."

Asa zoomed out. He's thinking about what he calls putting down the connective tissue—moving enterprises beyond what he describes as "the amoeba stage, where one cell doesn't talk to another very well." The fundamentals being laid now are what that evolution runs on.

Alex sees the future of applications as hyper-personalized, assembled in the moment, with agents handling much of the work autonomously. The goal isn't to predict exactly when—it's to put one foot in front of the other to get customers there today.

One piece of advice

We ended with rapid fire. One early decision that paid off. One thing to start sooner.


Alex: Start compliance certifications as soon as you start the company. "It has paid off like crazy." AnySoft is now the most certified, most compliant platform in their competitive space—and it started on day one.

Akash: Integrations first. Governance second. Both from the start.


Asa's advice is one every founder building fast should hear: "Have a plan for what you're going to allow AI to code—and a different plan for what you're really going to support and scale. You don't want to end up in a state where you're trying to support something that you asked Claude to put together six months ago."

That's not an argument against using AI to build. It's an argument for being deliberate about which parts of your foundation you're trusting to a quick-turn vibe-coded sprint.

Build it right the first time. Or you'll rebuild it twice.

Want more? Watch the full roundtable discussion here.

See how CData powers AI features in production

Connect AI Embed gives AI-native software companies the data connectivity layer their features actually need.

Get the demo