In a recent episode of Y Combinator’s Lightcone Podcast, Aaron Levie—co-founder and CEO of Box—joined hosts Garry Tan, Jared Friedman, and Diana Hu for a wide-ranging conversation on how Fortune 500 companies and large enterprises are adopting AI. Although the discussion was billed as “How AI Is Changing Enterprise,” it quickly ventured far beyond that, touching on the commoditization of intelligence, the inevitability of cheaper and more capable models, and why the true value of AI in business is less about wrapping large language models in a new UI and more about delivering real workflow outcomes.
Early chatter around AI often dismissed new startups as mere “GPT wrappers”—front-end veneers on top of large language models. Levie pushed back on this notion, comparing it to the early days of cloud computing. At the start of the cloud era, skeptics might have scoffed at a simple storage interface as “just a wrapper for AWS,” ignoring the massive engineering effort and specialized software that actually made solutions like Box or Salesforce valuable.
In the same way, AI-enabled applications require significant workflow design, integrations, compliance controls, and proprietary logic to deliver enterprise-grade solutions. The underlying large language model is only one component. What enterprise customers truly want is a specific business outcome—whether that’s automated document handling or more efficient customer support—and it takes a sophisticated software stack to achieve it.
According to Levie, most Fortune 500 executives don’t fixate on which particular model—GPT-4, Claude, or a newly open-sourced system—powers their tools. Instead, they care about the business outcome the software actually delivers.
As Levie noted, “An Enterprise doesn’t want a model—it wants an outcome.” While technical teams may geek out over parameter counts or the nuances of prompt chaining, the broader organization just needs a tool that shortens its time-to-value.
One of the most striking points of the conversation was the idea that “intelligence” is rapidly moving toward a commodity. Just as the cost of cloud storage dropped precipitously over the years, we can expect the same trend with AI compute. Competition among big tech players and open-source communities will drive down the unit price of AI operations.
In practice, that means startups that rely purely on “selling tokens”—charging for raw language model usage—may struggle unless they build significant software, integrations, or high-value services on top. Even major model providers focus on more than just tokens: OpenAI layers ChatGPT Enterprise with compliance features, Anthropic offers Claude with dedicated support, Google wraps its PaLM models in Vertex AI on GCP, and Meta open-sources Llama for broader ecosystem adoption. The big winners will be those who treat AI as a foundational resource, weaving it into a product suite that directly tackles mission-critical workflows.
When asked which AI components enterprises tend to develop internally versus purchase from vendors, Levie highlighted Geoffrey Moore’s concept of “core” (unique IP or value proposition) vs. “context” (necessary but not differentiating). Most big companies don’t want to build general-purpose HR or CRM software themselves; that’s context. They’re happy to pay a trusted AI vendor for these capabilities.
By contrast, if a bank’s competitive edge lies in a highly specialized approach to client onboarding or wealth management, that “core” software is more likely to be built in-house—often leveraging externally sourced dev tools and cloud-based AI services. Understanding this distinction helps both startups and enterprises invest their AI budgets in the right places.
Levie and the hosts drew a parallel between the early struggles of cloud adoption and the current surge in AI enthusiasm. Fifteen years ago, large organizations were wary of a “bookstore” (Amazon) providing infrastructure services. Today, no one questions the legitimacy of cloud providers. The shift to AI, however, might happen faster: open-source models, user familiarity with ChatGPT-like tools, and new AI-native college graduates all accelerate enterprise acceptance.
This newfound momentum is often spurred by competition. If one bank automates critical workflows with AI—improving customer experiences or accelerating product development—others will have to adopt it simply to keep up. The result is a wave of AI-based transformation that’s broader and deeper than the cloud transition ever was.
Unlike a zero-sum scenario that merely replaces existing labor with AI-driven processes, Levie believes AI will drive a massive expansion in total software spending (and overall economic activity). Because AI can handle tasks that businesses previously never budgeted for—like rapid multi-language translations or large-scale contract analysis—new opportunities and services arise that didn’t exist before.
Moreover, once a company sees gains from AI-led automation, it can reinvest those savings in further growth. As a competitive market compels more investment, the total spending on AI-infused software may multiply as companies race to incorporate next-generation capabilities.
In closing, Levie offered a hopeful, abundance-oriented vision of AI. While fears of mass automation persist, he sees an evolutionary path where new tools raise the bar for everyone. By lowering the cost of sophisticated tasks—writing, coding, translating, analyzing—AI frees talent to focus on more creative, strategic endeavors.
He points out that no major wave of technology adoption has left the world with fewer opportunities. Instead, each one expands what humans can achieve. Provided we maintain competitive markets and robust regulatory frameworks, AI can accelerate innovation, improve quality of life, and broaden access to critical services—creating a future that is more inclusive and prosperous.
From Levie’s vantage point, enterprises might be just 10% into their AI journey—still grappling with data governance, pilot projects, and compliance policies. Yet the direction seems clear: AI-driven automation and decision-making will soon be as ubiquitous as cloud computing. Ultimately, the question is less about which large language model is used and more about which software layers and services can turn AI’s raw power into real, transformative business outcomes.