NextFin

Jensen Huang: Building the AI Factory — From Reinventing Computing to Putting AI in the Loop

Summarized by NextFin AI
  • Jensen Huang, CEO of NVIDIA, highlighted a shift from explicit programming to implicit intelligence, where users express intent and AI systems autonomously solve problems.
  • He described AI as a new industrial stack requiring integration of processing hardware, storage, networking, and security to manufacture intelligence at scale.
  • Huang advised enterprises to experiment widely with AI applications and curate the most promising solutions, emphasizing that AI should enhance domain expertise rather than replace existing tools.
  • He proposed that organizations should embed AI within their operations to capture institutional intelligence and continuously improve business capabilities.

Jensen Huang, founder and CEO of NVIDIA, spoke with Chuck Robbins, Chair and CEO of Cisco, in a fireside-style conversation titled "The AI Factory: Infrastructure for Intelligence" at Cisco's AI Summit, a virtual event held on February 3, 2026. The session was scheduled for 7:30–8:00 p.m. Pacific Time and brought together industry leaders to discuss the technical and organizational shifts required to scale AI across enterprises. (ciscoaisummit.com)

Reinventing computing: from explicit programming to implicit intelligence

Huang opened by describing a fundamental shift in how computing operates. He contrasted the era of explicit programming, in which engineers wrote deterministic programs in languages such as Fortran, C++, or COBOL, with a new era of implicit programming, in which users express intent and models figure out how to solve problems. In his words, intelligence is no longer primarily about calculation: it is partly about knowing "what you don't know," partly about reasoning "how to solve a problem you've never seen before" and being able to "break it down into elements that you know how to solve."

"We're reinventing computing for the first time in 60 years... You now tell the computer what your intent is, and it goes off and figures out how to solve your problem."
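The explicit-versus-implicit contrast can be made concrete with a small sketch. Everything below is illustrative, not a real API: `complete` is a hypothetical stand-in for any LLM client call, and only the explicit path is deterministic.

```python
# Explicit programming: the engineer hand-codes every step of the solution.
def movie_pick_explicit(ratings):
    """Return the highest-rated title via deterministic, hand-written logic."""
    return max(ratings, key=ratings.get)

# Implicit programming: the user states intent; the model decides how to solve it.
# `complete` is a hypothetical callback standing in for an LLM client (not a real API).
def movie_pick_implicit(ratings, complete):
    prompt = (
        "Pick the best movie for a rainy evening from these ratings, "
        f"and explain briefly: {ratings}"
    )
    return complete(prompt)  # the model plans the steps itself

ratings = {"Heat": 8.3, "Arrival": 7.9, "Clue": 7.2}
print(movie_pick_explicit(ratings))  # deterministic: same answer every run
```

The explicit function encodes the full procedure; the implicit one hands over only the goal, which is the shift Huang describes.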

What an AI factory means: the full industrial stack

Huang argued that AI is becoming a new industrial stack that requires not only new processing hardware but reinvention across storage, networking and security. He framed the "AI factory" as the combination of models, chips, software and systems that together "manufacture intelligence at planetary scale." He emphasized that while the processing layer (where NVIDIA sits) is crucial, the enterprise-grade value comes from integrating AI performance with controllability, security and manageability provided by partners such as Cisco.

"There's computing, there's the processing, but there's storage, networking and security. All that is being reinvented as we speak."

From chatbots to problem-solving intelligence

Huang differentiated early generative systems that primarily regurgitated memorized patterns from the next generation of AI that can plan, use tools, retrieve grounded knowledge and ask for help. He described retrieval-augmented generation, tool use and planning as central capabilities that move AI from curiosity to real usefulness.

"Until now, chat bots where you give it a prompt... is interesting and curious, but not useful... Intelligence is about solving problems."
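The capabilities Huang names, retrieval-grounded answers and tool use, can be sketched in miniature. The snippet below is a toy illustration under stated assumptions, not any vendor's design: retrieval is naive keyword overlap standing in for embedding search, and the "tool" is a sandboxed calculator the model would call instead of guessing at arithmetic.

```python
# Toy retrieval-augmented flow plus tool use. All names are illustrative
# stand-ins; real systems use embedding search and an actual LLM.

DOCS = [
    "NVIDIA GPUs accelerate training and inference workloads.",
    "Cisco provides enterprise networking and security.",
    "RAG grounds model answers in retrieved documents.",
]

def tokens(text):
    """Lowercase and strip simple punctuation for keyword matching."""
    return set(text.lower().replace("?", "").replace(".", "").split())

def retrieve(query, docs):
    """Return the document with the largest keyword overlap with the query."""
    q = tokens(query)
    return max(docs, key=lambda d: len(q & tokens(d)))

def calculator(expression):
    """A 'tool' the model delegates to for exact computation."""
    return eval(expression, {"__builtins__": {}})  # demo only: never eval untrusted input

# Retrieval grounds the answer in a document; the tool supplies exact arithmetic.
context = retrieve("How does RAG ground answers?", DOCS)
print(context)
print(calculator("6*7"))
```

Grounding answers in retrieved text and delegating computation to tools is what moves a system from pattern recall toward the problem-solving Huang describes.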

Practical steps for enterprises: experiment widely, then curate

When asked what enterprises should do first, Huang recommended against expecting immediate spreadsheet-ready ROI. Instead he urged leaders to identify the single most important work of their organization and apply AI to revolutionize it. His operational advice was to allow broad experimentation — "let a thousand flowers bloom" — and later use judgment to curate and invest behind the most promising platforms.

"Let a thousand flowers bloom. Let people experiment, let the people experiment safely... At some point you have to start curating to find what's the best approach."

AI sensibility: assume abundance and zero gravity

Huang urged companies to adopt an "AI sensibility" that assumes compute and data abundance: think in terms of massive scale, real-time response and unconstrained graph analytics. He said this sensibility should drive teams to tackle the hardest, highest-impact problems because AI can dramatically compress timelines and costs — "what used to take a year could take an hour."

"Think about what AI does. It reduces the cost of intelligence or creates the abundance of intelligence by orders of magnitude."

Tools, applications and the enduring value of domain expertise

Huang warned against the notion that AI will erase the need for software tools. Instead, he argued that advanced AIs will use existing tools rather than re-inventing them. The critical opportunity, he said, is in applications: companies that apply AI to their domain expertise will gain enormous leverage because domain knowledge and intent remain the most valuable assets.

"It is the most illogical thing in the world [to think] AI will replace tools... The most important part of AI is applications. Apply the technology."

Building vs renting infrastructure: know how systems work and protect your questions

Huang recommended that organizations develop tactile understanding of AI systems: "lift the hood, change the oil, build something." He also argued that some AI workloads and questions are too sensitive to be entrusted entirely to third-party clouds. For NVIDIA, protecting the privacy of internal questions and workflows motivated on-prem investments. "My questions are the most valuable IP to me," he said, explaining why certain systems should remain under a company's direct control.

"You must have some tactile understanding of it... I am not confident... putting all of Nvidia's conversations in the cloud... My questions are the most valuable IP to me."

AI in the loop: the future company as a repository of institutional intelligence

Concluding his remarks, Huang reversed the familiar "human-in-the-loop" framing and proposed that every company should have "AI in the loop." He predicted that AI agents embedded across an organization will capture experience, raise collective capability, and become core intellectual property that continuously improves the business.

"Every company should have AI in the loop... Every single employee in the future will have AI... those AIs will become the company's intellectual property."

References: Cisco AI Summit — The builders of the AI economy (event page). (ciscoaisummit.com)

Further reading: Cisco Newsroom — ICYMI: Cisco AI Summit, and NVIDIA announcement on LinkedIn. (newsroom.cisco.com)


