diff --git a/docs/concepts/llms.mdx b/docs/concepts/llms.mdx
index 249a2c7e50..3946fabbac 100644
--- a/docs/concepts/llms.mdx
+++ b/docs/concepts/llms.mdx
@@ -151,6 +151,39 @@ In this section, you'll find detailed examples that help you select, configure,
| o1 | 200,000 tokens | Fast reasoning, complex reasoning |
+
+ Meta's Llama API provides hosted access to the Llama family of large language models,
+ available through the [Meta Llama API](https://llama.developer.meta.com?utm_source=partner-crewai&utm_medium=website) developer portal.
+ Set the following environment variable in your `.env` file:
+
+ ```toml Code
+ # Meta Llama API Key Configuration
+ LLAMA_API_KEY=LLM|your_api_key_here
+ ```
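If you prefer to fail fast when the key is missing rather than hitting an authentication error mid-run, a minimal startup check can help. This is a sketch using only the standard library; `get_llama_api_key` is a hypothetical helper, not part of CrewAI:

```python
import os

def get_llama_api_key() -> str:
    """Read LLAMA_API_KEY from the environment, raising early if it is unset."""
    key = os.environ.get("LLAMA_API_KEY", "")
    if not key:
        raise RuntimeError("LLAMA_API_KEY is not set; add it to your .env file")
    return key

# Placeholder value for demonstration only -- put your real key in .env.
os.environ["LLAMA_API_KEY"] = "LLM|your_api_key_here"
print(get_llama_api_key())
```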
+
+ Example usage in your CrewAI project:
+ ```python Code
+ from crewai import LLM
+
+ # Initialize Meta Llama LLM
+ llm = LLM(
+ model="meta_llama/Llama-4-Scout-17B-16E-Instruct-FP8",
+ temperature=0.8,
+ stop=["END"],
+ seed=42
+ )
+ ```
+
+ All models listed in the [Meta Llama API model docs](https://llama.developer.meta.com/docs/models/) are supported.
+
+ | Model ID | Input context length | Output context length | Input modalities | Output modalities |
+ | --- | --- | --- | --- | --- |
+ | `meta_llama/Llama-4-Scout-17B-16E-Instruct-FP8` | 128k | 4028 | Text, Image | Text |
+ | `meta_llama/Llama-4-Maverick-17B-128E-Instruct-FP8` | 128k | 4028 | Text, Image | Text |
+ | `meta_llama/Llama-3.3-70B-Instruct` | 128k | 4028 | Text | Text |
+ | `meta_llama/Llama-3.3-8B-Instruct` | 128k | 4028 | Text | Text |
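For programmatic model selection, the table above can be mirrored in a small lookup. This is a sketch; `LLAMA_MODELS` and `supports_images` are hypothetical helpers, with the values copied from the table:

```python
# Context limits and modalities from the supported-models table above.
LLAMA_MODELS = {
    "meta_llama/Llama-4-Scout-17B-16E-Instruct-FP8": {
        "input_tokens": 128_000, "output_tokens": 4028, "image_input": True,
    },
    "meta_llama/Llama-4-Maverick-17B-128E-Instruct-FP8": {
        "input_tokens": 128_000, "output_tokens": 4028, "image_input": True,
    },
    "meta_llama/Llama-3.3-70B-Instruct": {
        "input_tokens": 128_000, "output_tokens": 4028, "image_input": False,
    },
    "meta_llama/Llama-3.3-8B-Instruct": {
        "input_tokens": 128_000, "output_tokens": 4028, "image_input": False,
    },
}

def supports_images(model_id: str) -> bool:
    """Return True if the model accepts image input, per the table."""
    return LLAMA_MODELS[model_id]["image_input"]
```

Only the Llama 4 models accept image input; the Llama 3.3 models are text-only.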
+
```toml Code
# Required