Google releases pint-size Gemma open AI model

In recent years, major technology firms have focused on developing increasingly large AI models, using vast arrays of costly GPUs to deliver generative AI services through the cloud. Smaller AI models matter, too, however. Google has introduced a miniature version of its Gemma open model, designed specifically to run on local devices. The newly launched Gemma 3 270M is engineered to be easy to fine-tune while still delivering impressive performance for its size.

Earlier this year, Google released its first Gemma 3 models, which ranged from 1 billion to 27 billion parameters. In generative AI, parameters are the learned variables that determine how the model maps inputs to output tokens; generally, a model's performance improves as the parameter count grows. At just 270 million parameters, Gemma 3 270M can run on devices such as smartphones, or even entirely within a web browser.

Running an AI model locally offers numerous advantages, including improved privacy and reduced latency, both essential considerations for many users. Gemma 3 270M was built with these applications in mind. In tests on a Pixel 9 Pro, the model handled 25 conversations on the device's Tensor G4 chip while consuming just 0.75 percent of its battery, making it the most energy-efficient model in the Gemma lineup.

Developers should not expect the performance of multi-billion-parameter models, but Gemma 3 270M holds considerable potential for narrowly scoped tasks. On the IFEval benchmark, which evaluates a model's ability to follow instructions, Google showed that the model punches above its weight: it scored 51.2 percent, outperforming other lightweight models with higher parameter counts. Although it does not match models with over a billion parameters, such as Llama 3.2, it comes surprisingly close given its limited size.

Source: Ars Technica

Published on: Aug 14, 2025, 20:05
