
In July 2024, Meta CEO Mark Zuckerberg said that “selling access” to Meta’s openly available Llama AI models “isn’t Meta’s business model.” But according to a newly unredacted court filing, Meta does earn money from Llama through revenue-sharing agreements.

The filing was submitted by attorneys for the plaintiffs in the copyright lawsuit Kadrey v. Meta, in which Meta stands accused of training its Llama models on hundreds of terabytes of pirated ebooks. It reveals that Meta “shares a percentage of the revenue” that companies hosting its Llama models generate from users of those models.

The filing doesn’t indicate which specific hosts pay Meta. But Meta lists several Llama host partners in its blog posts, including AWS, Nvidia, Databricks, Groq, Dell, Azure, Google Cloud, and Snowflake.

Developers aren’t required to use a Llama model through a host partner. The models can be downloaded, fine-tuned, and run on a range of hardware. However, many hosts provide additional services and tooling that make it simpler to get Llama models up and running.

Zuckerberg floated the possibility of licensing access to Llama models during an earnings call last April, when he also mentioned other ways Meta could generate revenue from Llama, like business messaging services and ads in “AI interactions.” However, he didn’t outline specifics.

According to Zuckerberg, “If you’re someone like Microsoft or Amazon or Google and you’re going to be reselling these services basically, that’s something we think we should get some portion of the revenue for.” He added, “Those are the deals we intend to make, and we’ve started doing that a little bit.”

Recently, Zuckerberg asserted that most of the value Meta derives from Llama comes in the form of improvements to the models from the AI research community. Meta uses Llama models to power several products across its platforms and properties, including Meta’s AI assistant, Meta AI.

“I think it’s good business for us to do this openly,” Zuckerberg said during Meta’s Q3 2024 earnings call. “It makes our products better than if we were just on an island building a model that no one was standardizing in the industry.”

The revelation that Meta earns revenue from Llama in a fairly direct way is significant because the plaintiffs in Kadrey v. Meta claim that Meta not only used pirated works to develop Llama but also facilitated infringement by “seeding,” or uploading, those works. The plaintiffs allege that Meta surreptitiously torrented ebooks for training and, because of the way torrenting works, shared the ebooks with other torrenters in the process.

Meta plans to significantly increase its capital expenditures this year, largely thanks to its increasing investments in AI. In January, the company said it would spend $60 billion-$80 billion on CapEx in 2025, roughly double Meta’s CapEx in 2024, primarily on data centers and growing the company’s AI development teams.

Meta is also reportedly considering launching a subscription service for Meta AI that would add unspecified capabilities to the assistant, which could offset a portion of those costs.
