
Meta: Llama 3.3 70B Instruct (free)

by meta-llama


About

The Meta Llama 3.3 multilingual large language model (LLM) is a pretrained and instruction-tuned generative model with 70B parameters (text in/text out). The Llama 3.3 instruction-tuned, text-only model is optimized for multilingual dialogue use cases and outperforms many available open-source and closed chat models on common industry benchmarks. Supported languages: English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai. [Model Card](https://github.com/meta-llama/llama-models/blob/main/models/llama3_3/MODEL_CARD.md)

Specifications

Context Length: 66K (65,536 tokens)
Max Output Tokens: -
Modality: text -> text
Input: text
Output: text
Supported Parameters: frequency_penalty, max_tokens, presence_penalty, stop, temperature, tool_choice, tools, top_k, top_p
Content Moderation: No
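
The supported parameters map directly onto an OpenAI-style chat-completions request body. A minimal sketch using them all, assuming the model slug `meta-llama/llama-3.3-70b-instruct:free` (OpenRouter's usual naming convention; verify before use):

```python
# Sketch of an OpenAI-style chat-completions payload using only the
# parameters listed above. The model slug is an assumption based on
# OpenRouter's naming convention for free variants.
payload = {
    "model": "meta-llama/llama-3.3-70b-instruct:free",
    "messages": [
        {"role": "user", "content": "Summarize Llama 3.3 in one sentence."},
    ],
    "max_tokens": 256,         # cap on generated tokens
    "temperature": 0.7,        # sampling temperature
    "top_p": 0.9,              # nucleus sampling cutoff
    "top_k": 40,               # top-k sampling cutoff
    "frequency_penalty": 0.1,  # discourage verbatim repetition
    "presence_penalty": 0.0,   # discourage reusing topics
    "stop": ["\n\n"],          # stop sequences
}
```

This payload would be POSTed to the provider's chat-completions endpoint with an API key; parameters not in the supported list are typically ignored or rejected.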

Error Rate

Based on feedback reported by Free LLM Router users, the error rate reflects the percentage of requests that encountered issues such as rate limiting, unavailability, or other errors.


Availability

Available 41 of 41 tracked days. Daily snapshots show whether this model was accessible as a free model on OpenRouter.

[Daily availability chart for Meta: Llama 3.3 70B Instruct (free), Feb 18 to Mar 19]


Frequently Asked Questions

Is Meta: Llama 3.3 70B Instruct (free) free to use?

Yes, Meta: Llama 3.3 70B Instruct (free) is completely free to use through OpenRouter. You can access it via the Free LLM Router API at no cost.

What is the context window for Meta: Llama 3.3 70B Instruct (free)?

Meta: Llama 3.3 70B Instruct (free) supports a context window of 66K tokens (65,536 tokens).
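
The advertised 66K is the binary round number 64 × 1,024 = 65,536 tokens, and a prompt plus its requested completion must fit inside that budget together. A small budgeting sketch:

```python
# The context window covers prompt tokens AND generated tokens combined.
CONTEXT_WINDOW = 64 * 1024  # 65,536 tokens, the advertised 66K

def fits_in_context(prompt_tokens: int, max_new_tokens: int) -> bool:
    """Return True if the prompt plus the requested completion fit."""
    return prompt_tokens + max_new_tokens <= CONTEXT_WINDOW
```

For example, a 60,000-token prompt leaves at most 5,536 tokens for the completion before requests start failing or being truncated.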

Does Meta: Llama 3.3 70B Instruct (free) support function calling (tools)?

Yes, Meta: Llama 3.3 70B Instruct (free) supports function calling / tool use, enabling it to interact with external APIs and services.
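
Tool use here follows the OpenAI-compatible `tools` / `tool_choice` schema listed under Supported Parameters. A hedged sketch of a request with one tool attached (the `get_weather` function and the model slug are illustrative assumptions, not part of this page):

```python
# One hypothetical tool definition in the OpenAI-compatible schema.
# The model can return a tool_call naming this function; executing it
# and feeding the result back is the caller's responsibility.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example function
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

request = {
    "model": "meta-llama/llama-3.3-70b-instruct:free",  # assumed slug
    "messages": [{"role": "user", "content": "Weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call a tool
}
```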