Meta is working with the US government to use Llama AI

Meta is “working with the public sector to adopt Llama across the US government,” according to CEO Mark Zuckerberg.

The comment, made during his opening remarks for Meta’s Q3 earnings call on Wednesday, raises a lot of important questions: Exactly which parts of the government will use Meta’s AI models? What will the AI be used for? Will there be any kind of military-specific applications of Llama? Is Meta getting paid for any of this?

Meta’s AI rivals have also been cozying up to the government. OpenAI and Anthropic recently said they would share their models with the US AI Safety Institute ahead of time for safety screening. Google’s on-and-off-again relationship as an AI vendor for the Pentagon is well documented. In a recent blog post, OpenAI said its models were being used by DARPA, the US Agency for International Development, and the Los Alamos National Laboratory.

While we wait to learn about Meta’s AI work with the government, Zuckerberg teased a bit more about the next Llama model on the Q3 earnings call. He said version four is training on “a cluster bigger than I’ve seen reported for anything else others are doing” and that he expects “new modalities,” “stronger reasoning,” and “much faster” performance when it debuts next year.

He acknowledged that Meta plans to continue spending more on AI in 2025, which is “maybe not what investors want to hear in the near term.” But he sees the upside as being worth it.

“I’m pretty amped about all the work we’re doing right now,” he said. “This may be the most dynamic moment I’ve seen in our industry, and I’m focused on making sure that we build some awesome things and make the most of the opportunities ahead.”

As a business, Meta continues to grow. The company reported revenue of $40.5 billion for Q3, a 19 percent increase from a year ago, and $17.3 billion in profit. It also claims that 3.29 billion people use at least one of its apps each day, an increase of 5 percent from a year ago.