Intro
In January 2026, Anthropic CEO Dario Amodei said this at Davos: “The bottleneck is no longer intelligence — it’s chips and factories.”
That same month, Palantir CEO Alex Karp fired back at short sellers on CNBC: “Shorting chips and ontology is bats— crazy.”
These statements came from different stages, but they point in the same direction. The real competition in the AI era is not about model performance — it is about the foundation on which models stand.
That foundation is ontology: the core asset Palantir has been building for 20 years, with the CIA and the Department of Defense as its customers. In this series, we have already dissected Palantir's financial structure, FDE model, and philosophical foundations. This installment tackles the centerpiece of the puzzle: why ontology is destined to become the operating system of the post-AGI era.
The bottom line: whether AGI arrives in one year or ten, the company that owns the context wins. And the most systematic way to own context is ontology.
Background: What Is an Ontology?
Palantir Q4 2025 Key Metrics
- Q4 Revenue: $1.41B (+70%)
- US Commercial Revenue: $507M (+137%)
- Net Dollar Retention (NDR): 139%
- Rule of 40: 127%
Data Lakes vs Ontology
The first thing most companies do when adopting AI is aggregate data. Build a data lake, move it to the cloud, connect an LLM API.
The problem surfaces at the next step. The AI reads the data but cannot deliver meaningful answers. Data exists, but the AI does not understand what the data means.
An analogy: imagine a massive library containing hundreds of thousands of books — but with no classification system, no librarian, no catalog. No matter how many books are on the shelves, the information you need is unfindable. That is the state of a data lake.
An ontology gives that library a classification system, a librarian, and an index — all at once. In Palantir’s formal definition: “A semantic and dynamic layer that connects an enterprise’s logic, assets, and processes to their real-world counterparts.”
A concrete example makes the difference clear. Suppose a logistics company has data on “Truck 1234.” A data lake stores this as a single number. An ontology transforms that truck into a living object — linking its current location, payload, next delivery destination, the driver’s cumulative driving hours, and its relationship to other trucks on the same route.
When an AI agent receives the query “Which truck can deliver to Seoul Station by tomorrow morning?”, a data lake requires manual queries. On top of an ontology, the agent reasons and acts immediately.
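The truck example above can be sketched in code. A data lake holds bare rows; an ontology holds typed objects with explicit links that an agent can traverse to answer a question. This is a toy illustration only: the class names, fields, and the driving-hours threshold are invented for the sketch, not Palantir's actual object model.

```python
from dataclasses import dataclass, field

# A minimal, illustrative ontology: typed objects plus explicit links,
# rather than bare rows in a data lake. All names here are hypothetical.

@dataclass
class Driver:
    name: str
    cumulative_hours: float  # hours driven this period

@dataclass
class Truck:
    truck_id: str
    location: str
    payload_kg: float
    next_stop: str
    driver: Driver                                    # link: truck -> driver
    route_peers: list = field(default_factory=list)   # links: trucks on the same route

def available_for(trucks, destination, max_driver_hours=10.0):
    """Agent-style query: which trucks can take a delivery to `destination`?
    The answer depends on relationships (driver fatigue, planned stops),
    not on any single raw value."""
    return [
        t for t in trucks
        if t.next_stop == destination and t.driver.cumulative_hours < max_driver_hours
    ]

kim = Driver("Kim", cumulative_hours=6.5)
lee = Driver("Lee", cumulative_hours=11.0)
fleet = [
    Truck("1234", "Daejeon", 800, "Seoul Station", kim),
    Truck("5678", "Busan", 500, "Seoul Station", lee),  # driver over the hours limit
]
print([t.truck_id for t in available_for(fleet, "Seoul Station")])  # ['1234']
```

The point of the sketch: once the links exist, the "which truck by tomorrow morning" question becomes a traversal, not a hand-written SQL query.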
Why Ontology Cannot Be Auto-Generated
Here is a critical fact: ontologies are not auto-generated. A human must understand the enterprise’s business logic and design which objects connect to which relationships.
This is precisely why Palantir’s FDE (Forward Deployed Engineer) model exists. FDEs embed at client sites 3–4 days per week, translating on-the-ground business logic into ontology. Former Palantir FDE Nabeel Qureshi put it precisely: “Context is that which is scarce.”
As of 2025, AI models themselves are no longer scarce. OpenAI, Google, Anthropic, and Meta are competing, and open-source models abound. But understanding what problems are unfolding at a specific enterprise’s shop floor, and what business context gives that data meaning, is an entirely different matter.
This is where Palantir’s ontology thesis crystallizes: as models commoditize, the value of the context on which models operate grows. This is the structural basis for Karp’s claim that “all value will flow to chips and what we call ontology.”
Palantir’s AGI-Era Positioning
Q4 2025 — “From Experimentation to Massive Deployment”
The numbers alone signal that Palantir has entered a different phase. Q4 2025 results, reported February 1, 2026:
| Metric | Value | YoY |
|---|---|---|
| Q4 2025 Total Revenue | $1.407B | +70% |
| US Commercial Revenue | $507M | +137% |
| FY2025 Total Revenue Growth | — | +56% |
| FY2026 Guidance | $7.19B | +61% |
| US Commercial 2026 Guidance | — | +115% |
| Rule of 40 Score | 127% | — |
| Adjusted Operating Margin | 57% | — |
| US Commercial Customers | 571 | +49% |
Rule of 40 is a SaaS industry health metric: revenue growth rate + operating margin. Anything above 40% is considered “excellent.” Palantir’s 127% is more than triple that threshold.
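The arithmetic behind that headline score can be checked directly against the table above:

```python
def rule_of_40(revenue_growth_pct, operating_margin_pct):
    """SaaS health check: growth rate plus margin; above 40 is considered excellent."""
    return revenue_growth_pct + operating_margin_pct

# Palantir Q4 2025: 70% revenue growth + 57% adjusted operating margin
score = rule_of_40(70, 57)
print(score)  # 127
```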
A February 26 FinancialContent headline captured the shift best: “Enterprise AI Shift Moves from Experimentation to Massive Deployment.”
The engine behind this growth is AIP (AI Platform) — Palantir’s AI operating layer that connects LLMs to the ontology. In simple terms, AIP is the socket that plugs an AI brain into the ontology where all of an enterprise’s data and business logic live.
AIP’s 2026 Evolution: Agent Orchestration
AIP’s character has shifted in 2026. It has moved beyond simply connecting LLMs to enterprise data, becoming an orchestration layer where multiple AI agents collaborate to autonomously execute complex tasks.
Palantir’s early 2026 feature announcements make the direction clear:
- AI Hivemind: Multiple AI agents sharing an ontology and collaborating in a distributed architecture
- Edge Ontology: A lightweight ontology running on mobile devices, enabling AI reasoning on the battlefield or in offline environments
- AIP Document Intelligence (GA, 2026-02-04): Low-code document extraction workflows that automatically pull information from PDFs, contracts, and reports into the ontology
The common thread: all of these expand from the ontology as their center. Hivemind, Edge, Document Intelligence — none of them works without it. The more AI capabilities Palantir adds, the more the ontology’s importance compounds.
Model Agnostic — Why This Is the Key Strategy
Palantir’s most underappreciated strategic choice is its model-agnostic philosophy.
Competitors try to lock customers into proprietary AI models. Microsoft has Azure OpenAI, Google has Gemini, Amazon has Bedrock. Using their model strengthens their platform.
Palantir went the opposite direction. GPT, Claude, Llama — any LLM can be plugged into the ontology. LLMs are replaceable components. The ontology is the machine itself.
This matters strategically for two reasons. First, customers lose the fear of vendor lock-in: when a better model emerges, they swap it in and the platform persists. Second, regardless of which company wins the AGI race, Palantir remains neutral infrastructure. Whether the frontier model is GPT-5 or Gemini Ultra, the ontology layer stays Palantir's.
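The "LLM as replaceable component" idea can be sketched as a thin interface: the ontology layer holds the context and stays fixed, while any model plugs in behind it. This is an illustration of the design pattern only; the class and method names are invented for the sketch, not Palantir's AIP API.

```python
from typing import Protocol

class LLM(Protocol):
    """Any model that can answer a prompt: GPT, Claude, Llama, etc."""
    def complete(self, prompt: str) -> str: ...

class OntologyLayer:
    """The durable asset: enterprise objects and their relationships.
    The model behind it is a swappable component."""
    def __init__(self, model: LLM):
        self.model = model

    def swap_model(self, model: LLM):
        # Replace the LLM without touching the ontology or its data links.
        self.model = model

    def ask(self, question: str, context: dict) -> str:
        # The ontology supplies the context; the model supplies the reasoning.
        prompt = f"Context: {context}\nQuestion: {question}"
        return self.model.complete(prompt)

class EchoModel:
    """Stand-in model so the sketch runs without any API."""
    def complete(self, prompt: str) -> str:
        return f"answered: {prompt.splitlines()[-1]}"

layer = OntologyLayer(EchoModel())
print(layer.ask("Which truck is free?", {"truck_1234": "Seoul route"}))
```

`swap_model` is the whole strategy in one line: the model changes, the context layer (and the customer relationship built on it) does not.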
The industry assessment that “LLMs have been waiting 20 years for Palantir’s ontology” stems from this dynamic. The ontology existed first. AI models were mounted on top of it later.
Ontology vs Model Dependency
| Palantir (Ontology) | Model-Dependent SI (Legacy) |
|---|---|
| Model-swap freedom | Model lock-in risk |
| Data context ownership | Data fragmentation |
| NDR 139% lock-in effect | Project-based, one-off |
What NDR 139% Tells Us
The strongest evidence that Palantir’s business model works is its 139% NDR (Net Dollar Retention).
NDR 139% means: even without acquiring a single new customer, existing customers are spending 39% more than they did last year. In SaaS, NDR above 120% is considered “excellent.” Palantir’s 139% is on par with Snowflake and Datadog. Even more remarkable: this figure rose 5 percentage points from the prior quarter.
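NDR compares revenue from the same customer cohort across two years, excluding any new customers won in between. A sketch of the arithmetic, with hypothetical figures:

```python
def net_dollar_retention(cohort_revenue_last_year, cohort_revenue_this_year):
    """Revenue from the SAME customers, year over year, as a percentage.
    New-customer revenue is excluded by construction."""
    return cohort_revenue_this_year * 100 / cohort_revenue_last_year

# Hypothetical cohort: $100M last year. A 139% NDR implies $139M this year
# even with zero new-customer revenue.
print(net_dollar_retention(100, 139))  # 139.0
```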
Why are customers voluntarily spending more? Because of the ontology’s expansion characteristics. When you build an ontology for one use case, the second use case simply adds a new layer on top of the same ontology — faster and cheaper to implement. Tampa General Hospital started with 1 use case and expanded to 12 over four years, saving 700+ lives in the process. As switching costs rise and use cases stack, customers have no reason to leave.
The Enterprise SI Reality and Opportunity
“Following Palantir” — Samsung SDS’s Admission
In early 2026, a revealing headline appeared from The Elec (Korean tech publication): “Samsung SDS to follow Palantir… considering ontology integration into collaboration software.”
The largest IT services company in Korea publicly admitted it would "follow" an American data platform company's methodology rather than position itself as a competitor. This is the most candid indicator of where the traditional IT services industry stands.
As of 2026, 85% of enterprises in Korea plan to adopt generative AI (CIO.com survey). 79.3% plan to increase budgets. But when asked about the biggest barrier to adoption, 49.8% cite “lack of technical talent and expertise” and 32.0% cite “difficulty securing proper data infrastructure.”
They have the tool (LLM) but lack the blueprint (ontology). Companies subscribe to ChatGPT but cannot build the context layer connecting it to their enterprise data.
What Happens When You Deploy AI Without Ontology
The recurring pattern of failed public-sector AI deployments becomes clear through the ontology lens. Governments have attempted “national Palantir” initiatives. Progress remains slow. Two reasons: data regulation and inter-agency silos.
The essence of ontology is connecting data — and those connections are blocked by regulatory and administrative barriers. Linking personnel data across agencies can take months just for approval. No matter how powerful the AI model, if data remains isolated, the LLM stays at the level of a clueless parrot.
The private sector is no different. Major IT service companies generate trillions in revenue, but their profit structure remains a project-based labor model. They deploy engineers to client sites; when the project ends, the team withdraws. Domain expertise accumulated over years on-site almost never converts into an internal ontology.
The result: gross margins stuck at 20–30%, against Palantir's 82.4%. These firms cannot escape a model in which revenue scales only if headcount scales.
The Fork in the Road for Enterprise SI
To be fair, there is potential. Samsung SDS’s decision to explore ontology is a directional signal. LG CNS’s DAP (Data Analytics Platform) and SK C&C’s AI transformation efforts point the same way.
But the decisive gap remains. Palantir built ontologies on-site through its FDE model over 20 years. Traditional SI companies have yet to spin the flywheel that converts field experience into internal products. The Nebraska Medicine bootcamp that generated an $88M contract in 5 days, the Tampa General pattern that scaled from 1 to 12 use cases over 4 years — these patterns require ontology. Without it, they are simply unreplicable.
INSIGHT
Whether AGI arrives or not, the structure where context scarcity determines competitive advantage has already begun. Companies with blueprints (ontology), not just tools, will survive.
ACTION
If you’re planning AI adoption, the first question should be ‘do we have a context map for our organization,’ not ‘which model should we use.’ Without structuring data relationships and business processes first, any AI model will stay at surface-level automation.
Conclusion
When Amodei said at Davos that “the bottleneck is not intelligence but chips and factories,” Palantir had already been filling the other side of that bottleneck for 20 years. If NVIDIA handles chips and cloud companies handle factories, ontology handles the “context” of enterprise operations.
Karp’s thesis in summary: “All value will flow to chips and what we call ontology.” As AI models commoditize, this statement becomes more accurate. No matter which company builds GPT-6 or ships Claude 4, the ontology containing all the context of enterprise operations belongs to the company that built it first.
If AGI arrives, this dynamic intensifies. AGI can learn and reason on its own, but a specific enterprise’s business logic — the parts procurement sequence at this factory, the patient triage rules at this hospital, the credit decision framework at this bank — lives in the ontology. The company that owns the interface for handing that context to AGI is the one that can fully harness AGI’s capability.
For enterprises, this is both a warning and an opportunity. At a moment when 85% plan to adopt AI, the question “which LLM to choose” matters less than “who builds the ontology” — and that question will determine competitive positioning five years from now. What Palantir has proven over 20 years is ultimately this: it is not intelligence itself but the structure on which intelligence stands that determines value.
Whoever owns the context rules the AGI era.
Bottom line. AI models become commodities, but the ontology containing an enterprise’s unique context becomes an irreplaceable asset. Karp’s “chips and ontology capture all the value” is not prophecy — it is structural logic.
Takeaway for professionals. One thing you can do right now: identify the points in your work where you repeatedly make judgment calls, and map where the data required for those judgments is scattered. This is the starting point for building a personal “ontology.” The ability to draw the blueprint of your own work matters more than knowing how to use an AI tool.
Recommended Reading
- The Day After AGI — Two AI Titans at Davos Reveal Diverging Paths
- Why Palantir Spent 20 Years Building an Ontology (feat. AI Agents, Maduro Arrest, East India Company)
- The FDE Model: Proving Value in 5 Days (feat. Palantir, AIP Bootcamp, Service→Product)
- Thiel vs Karp: Opposite Sides, Same Conclusion (feat. Palantir, Habermas, Carl Schmitt)
Sources
- Palantir Q4 2025 Earnings Press Release (Palantir IR, 2026-02-01)
- Palantir Q4 2025 CNBC Coverage (CNBC, 2026-02-02)
- Enterprise AI Shift — Massive Deployment (FinancialContent, 2026-02-26)
- The AI Operating System — Path to $150 (FinancialContent, 2026-02-27)
- Alex Karp WEF Davos 2026 Session (WEF, 2026-01-21)
- Alex Karp — Chips and Ontology Quote (CNBC)
- Palantir Ontology Official Page (Palantir)
- Palantir AIP Overview (Palantir)
- AI Infrastructure & Ontology (Palantir Blog)
- Palantir February 2026 Announcements (Palantir, 2026-02)
- Samsung SDS Exploring Ontology (The Elec)
- 85% of Korean Enterprises to Adopt GenAI in 2026 (CIO.com)
- 2026 IT Strategy Survey (CIO.com)
- Korean Government’s Palantir Initiative (News THE AI)
- Reflections on Palantir — Nabeel S. Qureshi (Substack, 2025)
- Alex Karp — Ontology Strategy (IndexBox, 2025)
- The Day After AGI — TheByteDive (TheByteDive, 2026-02-28)
- Palantir Ontology Analysis — TheByteDive (TheByteDive, 2026-02-25)
- FDE Model Analysis — TheByteDive (TheByteDive, 2026-02-27)
Disclaimer: This article is for informational purposes only and does not constitute investment advice. All data cited is sourced from publicly available reports and filings.
