
Llama 4 by Meta – a New Era of AI (?)

By Anja Prosch

3 Min

April 9, 2025

This week, Meta announced their newest collection of AI models, Llama 4. These models will be integrated into several Meta products, such as WhatsApp, Messenger, and Instagram. 

In the company press release, Meta proclaimed Llama 4 “a new era of natively multimodal AI innovation” and made a few ambitious claims.

More powerful than competitors (?)

In its announcement, Meta firmly stated that the Llama 4 models have several advantages over their competitors.

Llama 4 Scout has a 10-million-token context window, which serves as an AI model's working memory. At the same time, Meta says it "fits a single Nvidia H100 GPU". This gives it an advantage over Google's Gemma 3 and Gemini 2.0, as well as Mistral 3.1.

Another benefit of Llama 4 is the switch to a "mixture of experts" (MoE) architecture. This approach conserves compute by activating only the parts of the model that are needed for a given input, as illustrated in the sketch below.
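
To make the idea concrete, here is a minimal, illustrative sketch of MoE routing in Python. It is not Meta's implementation; the sizes, the gating network, and the top-k routing rule are assumptions chosen only to show how most of a model's parameters can stay idle for any given token.

```python
# Minimal mixture-of-experts routing sketch (illustrative only; not Meta's code).
# A small gating network scores the experts and only the top-k are run per token,
# so most of the model's parameters stay idle for any given input.
import numpy as np

rng = np.random.default_rng(0)

n_experts, d_model, top_k = 8, 16, 2            # hypothetical sizes
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
gate = rng.normal(size=(d_model, n_experts))    # router weights

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token vector through the top-k experts and mix their outputs."""
    logits = x @ gate
    chosen = np.argsort(logits)[-top_k:]        # indices of the k best-scoring experts
    scores = np.exp(logits[chosen])
    weights = scores / scores.sum()             # softmax over the chosen experts only
    # Only the selected experts do any work; the rest are skipped entirely.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

token = rng.normal(size=d_model)
print(moe_layer(token).shape)                   # -> (16,)
```

In a real MoE model this routing happens inside each MoE layer of the network, which is why a model can have a very large total parameter count while using far fewer "active" parameters per token.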

Similar claims were made about Meta's larger model, Llama 4 Maverick. According to Meta, it outperforms OpenAI's GPT-4o and Google's Gemini 2.0 Flash, and its results in coding and reasoning tasks are said to be comparable to DeepSeek-V3's while using "less than half the active parameters."

A third model, Llama 4 Behemoth, has not been released yet. It has 288 billion active parameters out of roughly 2 trillion parameters in total. Meta states that Behemoth can outperform competitors such as GPT-4.5 and Claude Sonnet 3.7.

Meta's new Llama 4 AI models

Meta representatives will discuss further plans for its AI models and products at the LlamaCon conference, which will take place on April 29.

Users’ reactions

Since the models were released, users have been able to try them out and share feedback on social media. Overall, the reaction has been mixed.

Despite Meta’s bold statements regarding Llama 4, many users noticed flaws in the model’s performance. Some challenged the benchmark results, pointing out inconsistencies when replicating tests. 

There were also issues with long-context capabilities. Although Meta claimed the model could handle up to 10 million tokens, some users reported poor-quality output on long inputs.

Llama 4 and the European market

Unfortunately, European companies and individuals are excluded from the license for the Llama 4 models. The Llama 4 Community License Agreement states that the granted rights do not apply to EU residents or companies. This appears to be driven by the transparency and compliance obligations introduced by the EU's AI Act, which came into effect in August 2024 and sets stringent regulations on AI systems within the European market.

However, end users in the EU can still access services powered by Llama 4 models, as long as those services are provided from outside the Union.
