Hugging Face

Introducing The World’s Largest Open Multilingual Language Model: BLOOM


What Happened

Hugging Face released BLOOM, a 176B-parameter open multilingual language model built by the BigScience collaboration, with downloadable weights.

Fordel's Take

Hugging Face just dropped BLOOM: 176B parameters, trained on 46 natural languages (plus 13 programming languages), released under the BigScience RAIL license, with weights you can download today. English-centric models like GPT-3.5 cover <30% of global users; BLOOM hits 46 languages with one checkpoint.

Most teams burn $5k+/month on GPT-4 for non-English prompts that BLOOM handles at 0.3× the cost on a single A100. Running Opus for multilingual support is just burning money when BLOOM’s 46-language vocab already exists.
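The cost claim above can be sanity-checked with back-of-envelope arithmetic. This sketch assumes GPT-4 at $0.03 per 1K input tokens, one 80 GB A100 rented at $2/hr, and ~60 tokens/s of unbatched int8 generation; all three numbers are illustrative assumptions, not benchmarks, and batching would push the self-hosted figure far lower.

```python
# Back-of-envelope cost model: API pricing vs. an hourly-rented GPU.
# All constants below are illustrative assumptions, not measured figures.

GPT4_PER_1K_INPUT = 0.03      # assumed API price, $/1K input tokens
A100_PER_HOUR = 2.00          # assumed cloud rate for one 80 GB A100, $/hr
BLOOM_TOKENS_PER_SEC = 60     # assumed unbatched int8 throughput, tokens/s

def self_hosted_cost_per_1k(gpu_per_hour: float, tokens_per_sec: float) -> float:
    """Amortized $ per 1K tokens when renting the GPU by the hour."""
    tokens_per_hour = tokens_per_sec * 3600
    return gpu_per_hour / tokens_per_hour * 1000

bloom_cost = self_hosted_cost_per_1k(A100_PER_HOUR, BLOOM_TOKENS_PER_SEC)
print(f"BLOOM self-hosted: ${bloom_cost:.5f}/1K tokens")
print(f"GPT-4 API:         ${GPT4_PER_1K_INPUT:.5f}/1K tokens")
print(f"ratio: {bloom_cost / GPT4_PER_1K_INPUT:.2f}x")
```

Under these assumptions the ratio lands near the 0.3× figure quoted above; any batching of concurrent requests drops it further.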

If your product serves the EU, LATAM, or India, swap GPT-4 for a quantized BLOOM-7B today; if you're an English-only SaaS, stay on Haiku and ignore this.
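The routing decision above can be sketched as a one-function gate. Region codes, language codes, and backend names here are illustrative assumptions, not a real deployment config.

```python
# Minimal routing sketch: send multilingual traffic to self-hosted BLOOM,
# keep English-only traffic on the incumbent API. All names are assumed.

BLOOM_REGIONS = {"EU", "LATAM", "IN"}  # markets where multilingual wins

def pick_model(region: str, language: str) -> str:
    """Choose a backend per request; fall back to the English-tuned API."""
    if region in BLOOM_REGIONS or language != "en":
        return "bloom-7b1-int8"    # self-hosted, quantized
    return "claude-haiku"          # assumed incumbent for English-only SaaS

print(pick_model("EU", "de"))   # multilingual market -> BLOOM
print(pick_model("US", "en"))   # English-only traffic -> incumbent
```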

What To Do

Deploy BLOOM-7B in int8 on one 80 GB A100 instead of GPT-4-Turbo, because roughly $0.0015 per 1K input tokens beats $0.03 per 1K and covers 3× more languages
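Why int8 fits comfortably on one card: one byte per parameter instead of two. A minimal sketch of the memory math, plus absmax quantization in miniature, the rounding trick behind most int8 loaders. The ~7.07B parameter count is taken from the public BLOOM-7B1 model card; everything else is arithmetic.

```python
# Memory footprint: bytes per parameter drop from 2 (fp16) to 1 (int8),
# plus a small per-row scale. Parameter count assumed from the model card.

PARAMS = 7.07e9

def weights_gb(params: float, bytes_per_param: float) -> float:
    return params * bytes_per_param / 1e9

fp16 = weights_gb(PARAMS, 2)   # ~14.1 GB
int8 = weights_gb(PARAMS, 1)   # ~7.1 GB, ample headroom on an 80 GB A100
print(f"fp16: {fp16:.1f} GB, int8: {int8:.1f} GB")

# Absmax int8 quantization in miniature: scale a row so its largest
# magnitude maps to 127, round, and keep one float scale to dequantize.
def quantize(row):
    scale = max(abs(x) for x in row) / 127
    return [round(x / scale) for x in row], scale

def dequantize(q, scale):
    return [x * scale for x in q]

row = [0.4, -1.27, 0.06]
q, s = quantize(row)
print(q, [round(x, 2) for x in dequantize(q, s)])
```

Real int8 loaders add outlier handling on top of this, but the storage win is the same: the weights alone shrink to roughly half their fp16 size.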
