News and Insights
Building the Infrastructure to Make AI a Reality in the Enterprise
March 18, 2025
Behind the popularity of AI, a lot goes into building the infrastructure for what many view as the most disruptive technology since the Internet. In addition to the energy required to power data centers, models themselves must be properly trained with data to perform specific tasks, prevent hallucinations and build trust. Pricing models are likely to evolve too; business-as-usual SaaS paradigms don't work in the AI world.
For a geeky enterprise software nerd like me, sitting in a San Francisco conference room at The Information's AI Agenda and listening to the innovators, builders and backers making it all happen gave me hope and excitement to be in the PR epicenter of the evolution. Khosla Ventures founder Vinod Khosla, Anthropic co-founder Jared Kaplan and Glean founder and CEO Arvind Jain were just a few of the brilliant minds on stage.
Building the Foundation: Steel, Concrete or a Composite of Both?
In keeping with my railroad track theme, Vinod Khosla kicked things off with a prediction that AI model building will no longer be limited to the top three or four hyperscalers, imagining a composite world powered by complementary purpose-built enterprise-scale models akin to an AI Lego block network.
Khosla, known for taking an early $3M bet on OpenAI and turning that into a 25x windfall, made the case for this type of model creation, touting his latest investment in Symbolica AI – a stealthy startup using structured outputs to help builders achieve greater model accuracy with fewer resources.
They aren’t the only ones. My eye is on Guide Labs, a yet-to-be-launched startup creating a new class of interpretable foundation models that can reliably explain their reasoning to humans and domain experts. Since no conversation about AI models would be complete without a question about DeepSeek, Khosla nonchalantly summed the model up as:
- one-third stolen
- one-third basic old-fashioned programming
- only one-third real innovation
Where We Are: IRL Enterprise AI Adoption
Questions about job displacement and real-world use cases also made the agenda. The general consensus was that we need to focus less on occupations and more on functions. Knowledge workers, jobs requiring large amounts of data analysis (researchers, biotech, finance) and service-related fields (like legal) stand to benefit the most from the speed and efficiency AI brings to their work. Likewise, fast-growing segments like coding agents and agentic AI are expected to explode. I would also be remiss not to mention conversations about the importance of keeping humans in the loop: critical human-reliant AI work like Reinforcement Learning from Human Feedback (RLHF) depends on people just as much as professions like nursing do.
As it relates to broader, large-scale AI adoption, there is still work to be done:
- One of my clients, Invisible Technologies, often reminds us that most (85%) AI projects fail due to poor data quality, inadequate data availability, and a limited understanding of AI’s capabilities.
- Ritika Gunnar, GM of Data & AI at IBM, echoed that sentiment, pointing to Big Blue’s own “enterprise-ready” AI models targeted at developer efficiency and precision. Models like Granite were designed to integrate AI with proprietary data (what I’ve been referring to as “behind the firewall” data), adopting a ‘smaller is better’ approach.
- Then, there’s the subject of cost. Guru Chahal, Partner at Lightspeed Venture Partners, explained the differences between AI and SaaS business models, pointing to the need for pricing measured at the business level – a far cry from the software licensing days.
Got Power?
After an invigorating talk about models, a panel of energy and power experts took the stage, including Chris James, Founder of investment firm Engine No. 1; Chase Lochmiller, CEO and co-founder of Crusoe; and Aaron Ginn, Founder & CEO of Hydra Host – the so-called ‘GPU Whisperer’ for data centers. James and Lochmiller had recently travelled to Northern Virginia (Stargate Project), and Ginn talked up his Texas headquarters, where wind and land are abundantly available to power AI.
All of the panelists agreed that, in a world where 70-100 gigawatts of power will be needed to run AI by 2030, nothing is off the table: nuclear, wind, solar and even sourcing power from countries like Albania, which can provide 100% renewable energy, were all seen as viable options.
What’s Next?
For those of us in the industry, it feels like AI has been around much longer than it has, yet we’re still in the early days. The work being done behind the scenes to lay the foundation is inspiring, and there’s a collective sense of urgency to build quickly, efficiently and responsibly. The global impact of AI across borders, age groups, income levels and modalities deserves this kind of attention, and after hearing about that work firsthand, I’m optimistic about the preparation underway to bring the value of AI to life in the enterprise.
Frances Bigley is a Senior Partner in the global technology practice who has decades of experience putting emerging technology clients on the map.