Meta: Llama 3.2 3B Instruct (free)
by meta-llama
About
Llama 3.2 3B is a 3-billion-parameter multilingual large language model optimized for advanced natural language processing tasks such as dialogue generation, reasoning, and summarization. Built on an optimized transformer architecture, it officially supports eight languages, including English, Spanish, and Hindi, and can be adapted for additional languages. Trained on 9 trillion tokens, Llama 3.2 3B performs well at instruction following, complex reasoning, and tool use. Its balance of accuracy and efficiency makes it well suited to multilingual text-generation applications. See the [original model card](https://github.com/meta-llama/llama-models/blob/main/models/llama3_2/MODEL_CARD.md) for details. Usage of this model is subject to [Meta's Acceptable Use Policy](https://www.llama.com/llama3/use-policy/).
Specifications
Error Rate
Based on feedback reported by Free LLM Router users. The error rate reflects the percentage of requests that encountered issues such as rate limiting, unavailability, or other errors.
Availability
Available 41 of 41 tracked days. Daily snapshots show whether this model was accessible as a free model on OpenRouter.
Daily availability calendar: Feb 18 – Mar 19 (30 days tracked for Meta: Llama 3.2 3B Instruct (free)); per-day snapshot cells did not survive export.
Frequently Asked Questions
Is Meta: Llama 3.2 3B Instruct (free) free to use?
Yes, Meta: Llama 3.2 3B Instruct (free) is completely free to use through OpenRouter. You can access it via the Free LLM Router API at no cost.
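As a rough sketch of what such an API call might look like, the snippet below targets OpenRouter's OpenAI-compatible chat-completions endpoint with the free model slug. The endpoint URL, model slug, and the `OPENROUTER_API_KEY` environment variable are assumptions based on OpenRouter's usual conventions; check the router's own documentation for the exact values.

```python
import json
import os
import urllib.request

# Assumed OpenRouter endpoint and free-tier model slug (verify against
# the router's documentation before relying on them).
API_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL = "meta-llama/llama-3.2-3b-instruct:free"

def build_request(prompt: str) -> dict:
    """Assemble the JSON body for a single-turn chat completion."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Summarize the benefits of small language models.")

api_key = os.environ.get("OPENROUTER_API_KEY")  # assumed env var name
if api_key:  # only send the request when a key is configured
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
        print(body["choices"][0]["message"]["content"])
```

No key, no request: the guard lets you inspect the payload locally before spending a real call.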
What is the context window for Meta: Llama 3.2 3B Instruct (free)?
Meta: Llama 3.2 3B Instruct (free) supports a context window of 131,072 tokens (131K).
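A quick pre-flight check against that 131,072-token window can be sketched with the common ~4 characters-per-token heuristic for English text. This heuristic is an assumption; an exact count requires the model's own tokenizer.

```python
# Context window from the model listing above.
CONTEXT_WINDOW = 131_072

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token (heuristic)."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, reserved_for_output: int = 1024) -> bool:
    """Check that the prompt leaves room for the requested completion."""
    return estimate_tokens(prompt) + reserved_for_output <= CONTEXT_WINDOW
```

Reserving a slice of the window for the completion matters: a prompt that exactly fills the context leaves the model no room to respond.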