Interview with Nebo Djurdjevic: AI investments, market dynamics & the coming realignment

Artificial intelligence is reshaping global technology at a pace unmatched by any previous era. Investment cycles are compressing, infrastructure spending is exploding, valuations are soaring, and competitive dynamics are shifting monthly. And yet, beneath the momentum, serious structural questions remain about where value will ultimately settle, and how sustainable this acceleration really is.
I sat down with Nebo Djurdjevic, Executive Partner & Chief Strategy & Innovation Officer at Vega IT, to explore these dynamics, the risks emerging beneath the surface, and what companies must do to stay relevant as the market realigns.
The full-stack race
Sasa: AI has become a full-stack race, stretching from chips to applications. Who are the real power players today, and why?
Nebo: The AI value chain naturally clusters into four layers: compute, frontier models, platforms, and applications. The players that dominate today are those who control structural bottlenecks.
- NVIDIA owns the compute bottleneck. Demand for training and inference is still far ahead of supply.
- OpenAI, Anthropic, and Google DeepMind operate at the frontier of model capability, with talent, IP, and capital intensity that few can replicate.
- Microsoft, AWS, and Google Cloud own enterprise distribution. They decide where AI becomes real in production systems.
Their strength isn’t only technical. It’s ecosystem gravity: talent concentration, capital reserves, distribution power, and the ability to set the pace.
But it’s important to acknowledge something new: OpenAI’s dominance is no longer absolute.
The recent Microsoft + NVIDIA investment in Anthropic, a lab historically aligned with Amazon and Google, tells us that even OpenAI’s closest allies want optionality. And Google’s Gemini 3, trained on Google’s own chips, potentially gives Google a cost advantage that could reshape the economics of the model layer.
We are shifting from a world that looked like "winner-takes-all" toward one where multiple frontier labs compete intensely, which is healthy but makes the economics more volatile.
Where value will concentrate
Sasa: Once the hype clears, which layers will hold durable value, and which are overexposed?
Nebo: Value will concentrate in three places:
- Compute is still the hardest bottleneck.
- Frontier models define the capability ceiling for everyone else.
- Enterprise orchestration is the layer that turns AI from demos into reliable systems with ROI.
The most exposed area is the application layer. We’re seeing thousands of thin wrappers around the same APIs – no proprietary data, no complex workflow integration, no defensible moat. These companies will either consolidate, pivot, or disappear.
Another dynamic is that competition is compressing margins, not expanding them.
Labs are burning extraordinary amounts of cash to keep prices low for users. That’s fantastic for adoption but brutal for business models.
The more competition there is, the harder it becomes for model labs to generate the revenues required to justify their enormous capital intensity.
The infrastructure question
Sasa: Technology booms often come with surges of overinvestment. Where do you see the risks in today’s AI cycle?
Nebo: Every major technology cycle has an infrastructure overbuild phase. In the late ’90s, telcos invested heavily in fibre-optic networks, expecting immediate demand. ROI was disappointing, but the infrastructure later became the backbone of the modern internet. Society benefited enormously, even if the original investors didn’t.
AI is following a similar pattern, but with a twist: the scale of spending is unprecedented. OpenAI alone has committed around $1.4 trillion over five years for computing power. That number has raised legitimate concerns in financial markets about borrowers of uncertain creditworthiness funding supercomputers with no guaranteed monetisation.
What worries analysts today is not only the scale but the circularity of the ecosystem.
You have AI labs buying compute from cloud providers, who in turn buy equity in those labs, all running on chips from vendors who also buy equity. Capital and infrastructure flow in a closed loop.
It looks elegant on paper. It becomes risky when revenue doesn’t keep up.
That does not mean AI won’t deliver value. It absolutely will. The question is whether the companies funding this infrastructure will see an ROI within any reasonable timeframe.
Remember, the people laying the tracks seldom end up owning the train stations.
Early signs of overheating
Sasa: Some analysts argue the AI market already shows bubble characteristics. Do you agree?
Nebo: We’re seeing several early red flags:
- Compressed time horizons – valuations assuming dominance before revenue exists.
- Fear-driven investment – companies investing because they’re afraid of being left behind.
- Tools maturing faster than business cases – a classic danger signal.
- Narratives outrunning fundamentals – always a sign of overheating.
One telling data point is how public markets reacted: Microsoft’s stock fell after the Anthropic arrangement because investors worried that model labs are chasing growth at unsustainable cost levels.
Meanwhile, private-market valuations for frontier labs – $350B for Anthropic, $500B for OpenAI, $230B for xAI – are stretching well ahead of their monetisation capacity.
We’re at the stage where capital is trying to buy optionality. Historically, that’s when expectations are highest and risk is least visible.
Cross-layer hedging
Sasa: Many major players are investing across the value chain. Chip-makers are investing in model labs, and model labs are investing in infrastructure. How do you interpret this strategy?
Nebo: This is one of the most important shifts in the market. We’re seeing strategic cross-layer hedging, not linear value-chain competition.
Consider two recent deals:
- NVIDIA and Microsoft are investing $15 billion into Anthropic, which will, in turn, spend $30 billion on Azure, underpinned by NVIDIA chips.
- NVIDIA’s planned $100 billion investment in OpenAI, while OpenAI commits to spending hundreds of billions on NVIDIA-powered compute.
These loops aren’t exuberance – they’re insurance. None of these players trusts that the profit pool will remain fixed in their layer.
Compute might commoditise.
Model margins might collapse.
Platforms might absorb the economics of both.
So instead of betting on one layer, they’re essentially buying exposure across the stack to avoid being stranded in the wrong one. The ecosystem is becoming circular because the uncertainty is so high.
It’s not confidence – it’s precaution.
What will define long-term winners
Sasa: Looking ahead five years, what separates companies that create lasting value from those that fade?
Nebo: Three capabilities stand out:
- Operationalisation – moving from experiments to production systems.
- Ecosystem leverage – controlling workflows others depend on.
- Proprietary data + domain depth – still the only defensible moats.
Innovation speed matters. But reliability, governance, and integration depth will ultimately decide who wins. The market is shifting from “show me a demo” to “show me uptime, compliance, and ROI.”
Where transformation partners fit in
Sasa: As AI reshapes the stack, where do you see the biggest opportunities for digital transformation partners like Vega IT?
Nebo: Most enterprises don’t need to build models. They need partners who can make AI actually work – inside their systems, on their data, in their workflows, and under their regulatory constraints.
This creates opportunities in:
- Modernising legacy estates – reworking the architectures, APIs, data foundations and security layers that sit underneath the business so AI can plug into real workflows.
- Embedding AI into core processes instead of edge use cases.
- Building domain-specific accelerators on top of foundation models.
- Turning experiments into operational workflows with measurable ROI.
Model labs will keep pushing the frontier. But the partners who can turn capability into outcomes – that’s where the next decade of value gets created.



