Llama 2 7B Chat Hf is an openly released, chat-tuned 7B-parameter language model from meta-llama, distributed under the llama2 license. Key figures: required VRAM 13.5 GB, context length 4K, HF Score 50.7, LLM Explorer Score 0.44, ELO 1052. Benchmark scores: ARC 52.9, HellaSwag 78.6, MMLU 48.3, TruthfulQA 45.6, WinoGrande 71.7, GSM8K 7.4.
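As a quick orientation, the sketch below shows one common way to run this checkpoint with the Hugging Face transformers library. It assumes you have accepted the llama2 license on Hugging Face; the prompt string follows the Llama 2 chat convention (`[INST] ... [/INST]`), and the generation settings are illustrative, not prescriptive.

```python
# Minimal sketch: loading meta-llama/Llama-2-7b-chat-hf with transformers.
# Assumes license access on Hugging Face and enough GPU memory
# (~13.5 GB in float16, per the details table below).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the listed Torch Data Type
    device_map="auto",
)

# Llama 2 chat prompt format: [INST] ... [/INST]
prompt = "[INST] Explain what a context window is in one sentence. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```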
| Attribute | Value |
|---|---|
| LLM Name | Llama 2 7B Chat Hf |
| Repository 🤗 | https://huggingface.co/meta-llama/Llama-2-7b-chat-hf |
| Model Size | 7b |
| Required VRAM | 13.5 GB |
| Updated | 2025-09-23 |
| Maintainer | meta-llama |
| Model Type | llama |
| Model Files | |
| Supported Languages | en |
| Model Architecture | LlamaForCausalLM |
| License | llama2 |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.32.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Beginning of Sentence Token | <s> |
| End of Sentence Token | </s> |
| Unk Token | <unk> |
| Vocabulary Size | 32000 |
| Torch Data Type | float16 |
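The tokenizer and context-length values listed above can also be read programmatically instead of hard-coded; the sketch below (assuming access to the gated meta-llama repo) pulls them from the model config and tokenizer.

```python
# Sketch: reading the attributes listed above from the repo's config and
# tokenizer (assumes access to the gated meta-llama repo).
from transformers import AutoConfig, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.architectures)            # ['LlamaForCausalLM']
print(config.max_position_embeddings)  # 4096 (context length)
print(config.vocab_size)               # 32000
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)  # <s> </s> <unk>
print(tokenizer.model_max_length)      # 4096
```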
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Llama 2 7B Chat GGUF | 492 | 72367 | 2 GB |
| Llama 2 7B Chat GPTQ | 265 | 14315 | 3 GB |
| Llama 2 7B Chat GGML | 872 | 355 | 2 GB |
| Llama 2 7B Chat AWQ | 23 | 5739 | 3 GB |
| Llama 2 7B Chat Hf GGUF | 0 | 37 | 2 GB |
| OpenHathi 7B Hi V0.1 Base Gptq | 3 | 8 | 4 GB |
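For the GGUF variants above, one common route is the llama-cpp-python package, which runs the quantized file on CPU and can optionally offload layers to a GPU build. The file name below is a placeholder for whichever quantization level you actually download.

```python
# Sketch: running a GGUF quantization of this model with llama-cpp-python.
# "llama-2-7b-chat.Q4_K_M.gguf" is a placeholder file name; substitute the
# file you downloaded from one of the repos listed above.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-7b-chat.Q4_K_M.gguf",
    n_ctx=4096,       # matches the model's 4K context length
    n_gpu_layers=0,   # raise to offload layers when built with GPU support
)

out = llm(
    "[INST] Summarize the Llama 2 chat prompt format in one sentence. [/INST]",
    max_tokens=128,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```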
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| A6 L | 1024K / 16.1 GB | 201 | 0 |
| A3.4 | 1024K / 16.1 GB | 13 | 0 |
| A5.4 | 1024K / 16.1 GB | 12 | 0 |
| A2.4 | 1024K / 16.1 GB | 12 | 0 |
| M | 1024K / 16.1 GB | 127 | 0 |
| 157 | 1024K / 16.1 GB | 101 | 0 |
| 124 | 1024K / 16.1 GB | 93 | 0 |
| 162 | 1024K / 16.1 GB | 60 | 0 |
| 2 Very Sci Fi | 1024K / 16.1 GB | 317 | 0 |
| 118 | 1024K / 16.1 GB | 15 | 0 |