Meta: Llama 4 Scout

Provided by OpenRouter

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion of its 109 billion total parameters per token. It supports native multimodal input (text and images).

Specifications

Context Length: 327,680 tokens
Input Price: $0.080 per million tokens
Output Price: $0.300 per million tokens
Vision Support: Yes
Capabilities: Text, Vision
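The listed per-token rates make request costs easy to estimate. A minimal sketch, using the prices above (the function name and example token counts are illustrative, not part of the listing):

```python
# Estimated USD cost of one request to Llama 4 Scout at the listed rates:
# $0.080 per million input tokens, $0.300 per million output tokens.
INPUT_PRICE_PER_M = 0.080
OUTPUT_PRICE_PER_M = 0.300

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for a single request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# e.g. a 50,000-token prompt producing a 2,000-token reply
cost = estimate_cost(50_000, 2_000)  # 0.004 + 0.0006 = 0.0046 USD
```

Even a prompt that fills most of the 327,680-token context stays well under ten cents of input cost at these rates.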

About Meta: Llama 4 Scout

Strengths

  • Multimodal understanding - can process text and images
  • Large context window (328k tokens) for long conversations

Use Cases

  • Image and document understanding
  • Content creation and writing assistance
  • General conversations and Q&A

Limitations

Performance may vary based on query complexity, context length, and task type. Consider using higher-tier models for production-critical applications.

Sample Prompts

Try these prompts to explore Meta: Llama 4 Scout's capabilities:

  • Analyze this image and describe what you see in detail
  • Extract the key information from this screenshot
  • Compare the two images and explain the differences

Tip: Customize these prompts to fit your specific needs and use cases.
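To send a prompt like the first one above together with an image, a request body can be built in the OpenAI-compatible chat-completions format that OpenRouter accepts. This is a sketch only: the model slug "meta-llama/llama-4-scout" and the image URL are assumptions, so check the model page for the exact identifier.

```python
import json

# Assumed model slug for this listing; verify against the model page.
MODEL = "meta-llama/llama-4-scout"

def build_image_prompt(prompt: str, image_url: str) -> dict:
    """Build a multimodal chat-completions request body with one text part
    and one image part in the user message."""
    return {
        "model": MODEL,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {"type": "image_url", "image_url": {"url": image_url}},
                ],
            }
        ],
    }

body = build_image_prompt(
    "Analyze this image and describe what you see in detail",
    "https://example.com/photo.jpg",  # placeholder image URL
)
payload = json.dumps(body)  # POST to the chat completions endpoint
# with an Authorization: Bearer <your API key> header.
```

Any HTTP client can send `payload`; the text/image content-part structure is the standard way vision-capable models receive mixed input through this kind of API.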