Zephyr 7B Gemma DPO Avg is an open-source language model maintained by ZHLiu627. Key specifications: 7B parameters, 17.1 GB required VRAM, 8K context length, apache-2.0 license, LLM Explorer Score 0.22.
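As a rough sanity check on the VRAM figure: bfloat16 stores two bytes per parameter, and Gemma-architecture 7B models carry roughly 8.5B parameters once the embeddings for the 256,000-token vocabulary are counted, so 8.5B × 2 bytes ≈ 17 GB, consistent with the 17.1 GB listed below.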
| Attribute | Value |
|---|---|
| LLM Name | Zephyr 7B Gemma DPO Avg |
| Repository 🤗 | https://huggingface.co/ZHLiu627/zephyr-7b-gemma-dpo-avg |
| Model Size | 7b |
| Required VRAM | 17.1 GB |
| Updated | 2025-03-05 |
| Maintainer | ZHLiu627 |
| Model Type | gemma |
| Model Architecture | GemmaForCausalLM |
| License | apache-2.0 |
| Context Length | 8192 |
| Model Max Length | 8192 |
| Transformers Version | 4.43.3 |
| Tokenizer Class | GemmaTokenizer |
| Padding Token | <pad> |
| Vocabulary Size | 256000 |
| Torch Data Type | bfloat16 |
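For readers who want to try the model, here is a minimal sketch of loading it with the Hugging Face transformers library, based on the specs above (GemmaForCausalLM, bfloat16 weights, 8192-token context). The prompt and generation settings are illustrative only, not part of the model card.

```python
# Minimal sketch: loading Zephyr 7B Gemma DPO Avg with Hugging Face
# transformers (the card lists version 4.43.3). Assumes a GPU with
# roughly 17.1 GB of free VRAM for the bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ZHLiu627/zephyr-7b-gemma-dpo-avg"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to GemmaTokenizer
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch data type
    device_map="auto",           # place weights on the available GPU(s)
)

# Generate within the model's 8192-token context window.
inputs = tokenizer("Explain DPO in one sentence.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that `device_map="auto"` requires the accelerate package, and the 17.1 GB figure covers the weights alone, before activation and KV-cache memory during generation.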
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Kaggle Math Model Gemma V1 | 12K / 17.1 GB | 5 | 0 |
| Gemma 1.1 7B It | 8K / 17.1 GB | 20065 | 275 |
| Gemma 1.1 7B It | 8K / 17.1 GB | 28 | 4 |
| Gemma 7B It | 8K / 17.1 GB | 1029 | 10 |
| SeaLLM 7B V2.5 | 8K / 17.1 GB | 12410 | 50 |
| Zephyr 7B Gemma Rpo Avg | 8K / 17.1 GB | 7 | 0 |
| ... Codegemma 2 7B It Alpaca V1.3 | 8K / 17.1 GB | 9 | 1 |
| Zephyr 7B Gemma V0.1 | 8K / 17.1 GB | 302 | 124 |
| ... 7B Finetuned Sft Navarasa 2.0 | 8K / 34 GB | 314 | 23 |
| Codegemma 7B It | 8K / 17.1 GB | 3605 | 251 |
🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟