Anthropic's Daniela Amodei on spending less than competitors, keeping AI safe and a possible IPO

SAN FRANCISCO — Inside Anthropic headquarters, President and co-founder Daniela Amodei keeps coming back to a phrase that’s become a sort of governing principle for the artificial intelligence startup’s entire strategy: Do more with less.

It’s a direct challenge to the prevailing mood across Silicon Valley, where the biggest labs and their backers are treating scale as destiny.

Firms are raising record sums, locking up chips years in advance, and pouring concrete across the American heartland for data centers in the belief that the company that builds the largest intelligence factory will win.

OpenAI has become the clearest example of that approach.

The company has made roughly $1.4 trillion in headline compute and infrastructure commitments as it works with partners to stand up massive data center campuses and secure next-generation chips at a pace the industry has never seen.

Anthropic’s pitch is that there’s another way through the race, one where disciplined spending, algorithmic efficiency, and smarter deployment can keep you at the frontier without trying to outbuild everyone else.

“I think what we have always aimed to do at Anthropic is be as judicious with the resources that we have while still operating in this space where it’s just a lot of compute,” Amodei told CNBC. “Anthropic has always had a fraction of what our competitors have had in terms of compute and capital, and yet, pretty consistently, we’ve had the most powerful, most performant models for the majority of the past several years.”

Anthropic bets efficiency can beat brute-force scale in the AI arms race

Daniela Amodei and her brother, Dario Amodei, Anthropic’s CEO and an alumnus of Baidu and Google, helped build the very worldview they’re now betting against.

Dario Amodei was among the researchers who helped popularize the scaling paradigm that has guided the modern model race: the idea that increasing compute, data, and model size tends to improve a model’s capabilities in a predictable way.

That pattern has effectively become the financial bedrock of the AI arms race.

It underwrites hyperscaler capital spending, justifies towering chip valuations, and keeps private markets willing to assign enormous prices to companies that are still spending heavily to reach profitability.

But even as Anthropic has benefited from that logic, the company is trying to prove that the next phase of competition won’t be decided only by who can afford the largest pre-training runs.

Its strategy leans into higher-quality training data, post-training techniques that improve reasoning, and product choices designed to make models cheaper to run and easier to adopt at scale — the part of the AI business where the compute bill never stops.

To be clear, Anthropic isn’t operating on a shoestring. The company has roughly $100 billion in compute commitments, and expects those requirements to keep rising if it wants to stay at the frontier.

“The compute requirements for the future are very large,” Daniela Amodei said. “So our expectation is, yes, we will need more…