“Ambition needs to be funnelled to build great things,” explained Sachin Duggal, founder, Builder AI, in 2016. “Ambition cannot go in every single direction.” He built an AI-assisted platform that enabled businesses to develop apps without coding. The company raised more than $450 million from global investors, including Microsoft, Qatar Investment Authority and Insight Partners.
Internally, Builder AI scaled rapidly from about 40 employees in 2018 to more than 1,100 employees by 2024, reflecting aggressive global expansion. The platform was designed to automate a large part of the software development process through reusable components and AI-assisted workflows.
The Indian start-up ecosystem has come of age: 2025 marked a maturity inflexion – not because ambition declined, but because the market began rewarding unglamorous disciplines that compound over time – governance, unit economics, retention and repeatable growth. After a decade in which scale was often measured by headline valuations and funding rounds, the ecosystem has now shifted decisively towards durability.
Looking back, three structural changes stand out most clearly. First, liquidity returned with consequences. India saw an uptick in IPOs, OFS activity and secondary transactions, converting paper wealth into realised outcomes. When liquidity becomes real, storytelling gives way to scrutiny. Balance sheets matter. Governance frameworks move from optional to essential.
Second, capital became earned. Funding did not vanish; it became selective. While overall start-up funding volumes were significantly lower than the 2021 peak, companies with strong distribution moats, predictable delivery and operating leverage continued to raise capital. Growth narratives gave way to metrics, such as cohort retention and margin visibility.
Finally, wealth creation shifted from valuation inflation to conversion events. New start-up billionaires were no longer defined by notional valuations but by exits, partial liquidity and public-market validation. “The year 2025 generated a feeling that the ecosystem has moved from fast fashion to tailoring: fewer flashy jackets, better stitching, better fit,” as Duggal put it succinctly.
Then there was the curious case of Builder AI, which set out to solve a real and persistent problem: reducing the friction between ideas and software execution. The promise was compelling, making bespoke software predictable, reusable and accessible, without requiring founders to write code or deeply understand engineering.
Enthusiastic response
The market responded enthusiastically. Builder AI scaled rapidly, adding headcount at speed, onboarding customers across geographies and raising capital that pushed its valuation to about $2 billion. At its peak, the company represented the archetype of a modern, globally ambitious SaaS plus AI start-up.
But speed, while intoxicating, can create fragility. Hyper-growth often disguises risk. When teams expand rapidly, sometimes by double digits every quarter, and customer acquisition accelerates into the thousands, momentum itself begins to feel like validation. Yet, Builder AI’s trajectory shows how velocity without corresponding internal controls can magnify small cracks into systemic failures.
“The idea was sound,” Duggal now acknowledges. “The ambition was real. But I built too much, too fast.” Growth was genuine, but internal verification mechanisms failed to keep pace. Trust replaced audits. Reports replaced scrutiny. Assumptions replaced checks. In hindsight, none of this feels dramatic. It feels familiar to anyone who has watched a start-up outgrow its own operating rhythm.
“There were moments I asked for slowdowns – negotiated pauses meant to let verification catch up,” Duggal concedes. “Too often, those pauses were shortened or ignored, and the machine kept accelerating. I should have pushed harder. That’s on me.”
A familiar piece of founder dogma played out: do not interfere, do not micro-manage, empower teams. In theory, this is correct. In practice, at scale, it can become dangerous. Layers formed between leadership, the board and operational truth. What began as delegation gradually turned into distance.
In Builder AI’s case, that distance became structural – a layer formed between founder and board – and a vacuum emerged where two versions of ‘truth’ could quietly coexist. “I should never have let that layer develop,” Duggal admits. “It allowed two versions of reality to brew – controlled by the layer or person in between.” But the pattern is universal: the moment operational truth becomes mediated, everything starts steering by narrative instead of reality.
“When anything broke, I thought it was my responsibility to fix it,” Duggal owns up, describing a classic problem for any founder deeply obsessed with their product and customers, “which meant taking away the ability for others to solve, even though my intention was to be in the trenches to help.” Ownership, when unchecked, can become a bottleneck rather than a safeguard. Eventually, a company once valued in the billions unravelled within months.
Lessons from the collapse
Builder AI’s story is uncomfortable precisely because it is not about fraud or fantasy growth. It is about structural imbalance. Verification is not interference. Asking questions is not micro-management.
One more uncomfortable lesson emerges from Duggal’s reflection: proximity creates an illusion of trust. “When you speak to someone every day, solve every problem together, you assume you share the same definition of responsibility,” he observes. “You do not always. Trust needs measurement – not vibes – because, in fast-moving companies, when incentives shift, memories can shift with them.”
What compounded the collapse was the speed at which narrative replaced reality – not just inside the company, but in public. The ‘copy-and-paste economy’ of shallow rewrites with little right of reply can be devastating for founders, who are already fighting an uphill battle.
The domino effects were severe – board trust frayed, question marks multiplied, and a parallel storyline took on a life of its own. “Fake narratives on AI, round-tripping and beyond spread faster than facts,” Duggal says.
Subsequent reviews surfaced co-ordination patterns – pay-to-publish items, sock-puppet personas and content laundering. Multiple outlets later removed or corrected stories; bloggers issued apologies; a court issued interim takedown orders. An independent forensic review found that transactions characterised as ‘round-tripping’ were bona fide, arm’s-length dealings supported by delivery artefacts on both sides. But, by then, the damage was structural: trust had been withdrawn faster than evidence could restore it.
Duggal is candid about the internal fractures, too. A lender executed a cash sweep after a covenant breach; the quantum and settlement, he says, were handled by a newly appointed leader who knew in January but informed the lender months later. An internal investigation conducted under that same leadership, Duggal argues, “felt like a mission to fit a narrative, not a mission to find the truth.” A separate forensic review flagged what it called selective sourcing, inference chains without support and omitted context. “Titles are not governance,” he says. “Verification is.”
None of this absolves the operational failings. But it complicates the clean narrative of a founder who simply lost control. Builder AI’s collapse was, in Duggal’s telling, a compound fracture: internal governance gaps exploited by external misinformation, with each reinforcing the other until the structure gave way. For an ecosystem learning to distinguish signal from noise, that compounding effect deserves scrutiny in its own right.
Strong companies institutionalise clarity, accountability and transparency early, especially when speed accelerates. In an ecosystem that celebrates velocity, Builder AI serves as a reminder that scale must be accompanied by governance bandwidth. Growth that outpaces internal checks does not merely stretch systems. It breaks them.
The uncomfortable truth for founders is this: momentum can hide weaknesses, but markets eventually expose them. For Duggal, the aftermath has been neither silence nor spectacle – but something closer to a disciplined reckoning. “Every success belonged to teams of brilliant people,” he says. “The failures are mine.”
That candour matters, because what followed Builder AI’s collapse was not a retreat. Duggal is building again, and he frames the shift without theatrics: “This is the part that feels like rehab: naming what I got wrong and rebuilding with different instincts. The next chapter is slower where it matters – clearer contracts, unfiltered operational truth, measurable trust and governance that scales ahead of ambition.”
He has codified the scars into a set of operating principles he now applies daily: no filters between the CEO and the board; verification cadences that are sacred and cannot be renegotiated by momentum; and trust that is measured through incentives, accountability, and audit trails – never by proximity alone. “Do not take my word for anything,” he says. “Look for external signals. Do customers re-buy and refer? Are financials independently audited – and boring on purpose? Can outside researchers reproduce the system’s behaviours?” It is the language of someone who has learned that credibility is rebuilt in public, not narrated in private.
The future of AI and Indian start-ups
India’s AI opportunity remains enormous, but future winners will not be defined by output alone. They will be defined by dependability.
As AI moves into high-stakes domains such as finance, healthcare, compliance and public services, tolerance for error drops sharply. The next wave of AI start-ups will focus on embedding intelligence into real workflows, including BFSI risk and compliance systems, claims processing, logistics optimisation, healthcare administration and SME automation.
As data-centre investments scale and compute becomes infrastructure, AI begins to resemble industrial capability rather than pure software economics. “Everyone loves bigger engines, but if the braking system does not scale with speed, you haven’t built progress,” Duggal observes. “You’ve built a faster accident.” India’s AI champions will be those who build dashboards, brakes and rules of the road, not just engines.
Duggal’s story is not a redemption arc – he is the first to say so. It is a pattern: mistakes owned without theatrics, lessons extracted without bitterness and a founder back at the workbench with harder-won instincts. “I am not asking for a redemption arc,” he adds. “I am asking to be held to a higher standard – and to be judged by what I build next.” As one of his prior American lead investors put it: “You ran fast, learned a lot and are on to the next adventure.”
Speed still matters. But durability matters more. If you are building in India’s start-up moment, build the engine – and build the brakes, the dashboard and the wiring that keeps reality unedited. Because a comeback is not a headline. It is a pattern.