Mamba GPT 7B V1 is an open-source language model by CobraMamba: a 7B-parameter LLM requiring 14.4 GB of VRAM, with a 32K context window and an Apache-2.0 license. Benchmark scores: HF Score 58.6, LLM Explorer Score 0.14, ARC 61.3, HellaSwag 84.1, MMLU 63.5, TruthfulQA 46.3, WinoGrande 79.2, GSM8K 17.4.
| Detail | Value |
|---|---|
| LLM Name | Mamba GPT 7B V1 |
| Repository 🤗 | https://huggingface.co/CobraMamba/mamba-gpt-7b-v1 |
| Model Size | 7b |
| Required VRAM | 14.4 GB |
| Updated | 2026-03-11 |
| Maintainer | CobraMamba |
| Model Type | mistral |
| Model Files | |
| Supported Languages | en |
| Model Architecture | MistralForCausalLM |
| License | apache-2.0 |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.35.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Vocabulary Size | 32000 |
| Torch Data Type | float16 |
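
Since the listed architecture is the standard MistralForCausalLM, the model should load through the stock Hugging Face transformers API. The following is a minimal sketch, not an official example from the maintainer; the model id, float16 dtype, and tokenizer class come from the table above, and `device_map="auto"` assumes the `accelerate` package is installed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CobraMamba/mamba-gpt-7b-v1"

# AutoTokenizer resolves to the LlamaTokenizer class listed above.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# float16 matches the listed torch dtype (~14.4 GB of VRAM for the 7B weights).
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Explain the difference between RAM and VRAM in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```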
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...Nemo Instruct 2407 Abliterated | 1000K / 24.5 GB | 254 | 20 |
| MegaBeam Mistral 7B 512K | 512K / 14.4 GB | 8908 | 50 |
| SpydazWeb AI HumanAI RP | 512K / 14.4 GB | 9 | 1 |
| SpydazWeb AI HumanAI 002 | 512K / 14.4 GB | 18 | 1 |
| ...daz Web AI ChatML 512K Project | 512K / 14.5 GB | 12 | 0 |
| MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 3779 | 16 |
| MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 8082 | 16 |
| Hebrew Mistral 7B 200K | 256K / 30 GB | 1316 | 15 |
| Astral 256K 7B V2 | 250K / 14.4 GB | 5 | 0 |
| Astral 256K 7B | 250K / 14.4 GB | 5 | 0 |