NextFin

Sam Altman: "Buy It From Us On a Meter" — AI Will Be a Utility

Summarized by NextFin AI
  • OpenAI CEO Sam Altman emphasized a future where intelligence is treated as a utility, similar to electricity, suggesting that people will purchase it on a metered basis.
  • To prevent high prices and limited access, Altman advocates for aggressively expanding AI infrastructure capacity to meet rising demand.
  • OpenAI's strategy involves significant upfront investments in infrastructure, driven by the need to prepare for future demand and avoid capacity constraints.
  • Altman predicts that AI will increasingly handle operational burdens in leadership roles, requiring executives to supervise AI systems rather than perform all tasks themselves.

Recorded for Mint on March 12, 2026, the interview features OpenAI CEO Sam Altman in conversation with a Mint interviewer. The discussion came roughly two weeks after OpenAI announced a major $110 billion funding round led by Amazon, Nvidia and SoftBank, a development Altman referenced while explaining his company’s strategy and spending priorities. (apnews.com)

The following presents Altman’s core statements from the interview in his own words and as recorded in the transcript.

AI as a utility: "buy it from us on a meter"

Altman repeatedly framed the long‑term aim in simple, infrastructural terms, describing a future where intelligence is treated as an everyday commodity bought on a meter, like electricity or water. He returned to the same metaphor later in the conversation, framing tokenized usage and metered access as the natural commercial model for AI.

"We see a future where intelligence is a utility like electricity or water and people buy it from us um on a meter and use it for whatever they want to use it for. The demand that we see for that seems like it's going to continue to just go like this."

Demand, capacity and the need to "flood the market"

Altman tied the utility metaphor to a strategic conclusion: if supply of compute and inference capacity is constrained, prices spike and access becomes limited. To avoid priced‑out access or heavy central planning, he said the correct approach is to expand capacity aggressively. "The best thing to me throughout all the history of capitalism, innovation, whatever you want, is to just flood the market."

He described how capacity constraints shape business behavior and warned that insufficient supply could lead to limited access or undesirable allocation choices: "If we don't have enough, we either can't sell it or the price gets really high and it kind of goes to rich people or society makes a bunch of sort of central planning decisions that I think almost always go badly."

Why OpenAI is spending heavily on infrastructure

Explaining the company’s recent commitments and the rationale for large up‑front capital outlays, Altman said the infrastructure for advanced AI is unusually expensive and must be planned far in advance. He described OpenAI’s behavior — building capacity ahead of revenue and experimenting with varied business models — as a necessary response to rapidly rising demand. "There's many hard parts of this business, but one of the hardest ones is the infrastructure is so expensive. You need so much of it and you have to commit so far in advance."

Altman reiterated that OpenAI prefers to avoid remaining capacity constrained and cited the company’s guiding principle of abundance: "We want to flood the world with intelligence. We want people to just use it for everything."

Compute, tokens and the revenue model

On how the company thinks about revenue, Altman described a tokenized usage model: different models and usage modes cost different amounts, and customers will choose trade‑offs between continuous background assistance and pay‑as‑needed services. He explained: "Fundamentally, our business and I think the business of every other model provider is going to look like selling tokens."

He outlined tiers of cost and value: smaller or less reasoning‑intensive models are cheaper, services that run constantly will cost more, and exceptionally hard single problems may justify enormous investment. In that framework, meter‑style billing becomes a natural mechanism to match supply and demand.

How AI changes leadership and executive work

Altman addressed what AI means for senior roles: while human judgment remains essential, many of the operational and informational burdens will be handled by AI. He asked rhetorically when a CEO, a head of state, or a top scientist could no longer do their job without heavy AI use, and answered that the threshold for such reliance may arrive sooner than many expect. "You still do need a person to stand behind decisions and kind of exercise human judgment... But the actual parts of my role that I will increasingly have to rely on an AI to do because no human can."

He described the evolving role as supervising and providing oversight of AI systems: deciding how to trust outputs, how to provide guidance, and where human responsibility must remain.

How Altman uses AI in his own work

Altman was candid about his own habits: when he has a new business idea, strategy shift, or product thought, the first place he turns is OpenAI’s tools. He emphasized that as models gain access to fuller context — internal documents, code, customer data — their usefulness increases. "If I have like a new idea for a business model... the very first thing I do before I even bounce it off somebody else is to ask our tools."

On long‑term strategy and public commitments

Referencing the company’s recent, large fundraising commitments, Altman explained why strategic partners and infrastructure deals make sense for enabling broad access to intelligence. He reiterated that OpenAI pursues unusual behaviors — investing early, experimenting with business models — because of a belief in making intelligence abundant and broadly available.

Altman framed those choices as intended to reduce scarcity and avoid an outcome in which only a few can afford frontier capabilities.

Selected direct lines from the interview

"The demand that we see for that seems like it's going to continue to just go like this."

"You have to do some pretty unusual things. OpenAI does a lot of things that look weird. We spend a ton of money on infrastructure in advance of revenue."

"No human CEO can talk to every employee at a company, every customer, be in every meeting, be an expert in every field. And so more and more I think of these jobs will be supervising a bunch of AI."

References

Video interview (Mint): "'Buy It From Us On A Meter…', Sam Altman Says AI Will Be a Utility, People Should Buy It Like Water" (Mint video).

Reporting on OpenAI's funding announcement: AP News — OpenAI gets $110 billion in funding. (apnews.com)

Related coverage and context: LiveMint — OpenAI's $110 billion funding pushes its valuation. (livemint.com)

AI summit and public appearances providing wider context: LiveMint — AI Summit 2026 highlights. (livemint.com)


