Snorkel Mistral PairRM DPO is an open-source language model from Snorkel AI (maintainer: snorkelai). Key features: LLM, 14.4 GB VRAM, 32K context, apache-2.0 license, HF Score 66.2, LLM Explorer Score 0.15. Benchmark results: ARC 66, HellaSwag 85.6, MMLU 60.9, TruthfulQA 70.9, WinoGrande 77.6, GSM8K 36.2.
| Property | Value |
|---|---|
| LLM Name | Snorkel Mistral PairRM DPO |
| Repository 🤗 | https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO |
| Required VRAM | 14.4 GB |
| Updated | 2026-03-30 |
| Maintainer | snorkelai |
| Model Type | mistral |
| Model Files | |
| Model Architecture | MistralForCausalLM |
| License | apache-2.0 |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.34.0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | </s> |
| Vocabulary Size | 32000 |
| Torch Data Type | bfloat16 |
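
The details above (repository ID, MistralForCausalLM architecture, bfloat16 weights, 32768-token context) map directly onto a standard Hugging Face transformers workflow. Below is a minimal sketch of loading and prompting the model; it assumes a recent transformers install, roughly 14.4 GB of GPU memory for the bfloat16 weights, and that the repository ships a chat template — check the model card for the exact prompt format expected by the DPO-tuned checkpoint.

```python
# Minimal sketch: load Snorkel-Mistral-PairRM-DPO and generate a reply.
# Assumptions: transformers >= 4.34 with chat-template support, a GPU with
# ~14.4 GB free memory, and a chat template defined in the repo's tokenizer.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "snorkelai/Snorkel-Mistral-PairRM-DPO"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch data type
    device_map="auto",
)

messages = [{"role": "user", "content": "Summarize what DPO training does in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Context length is 32768 tokens; keep prompt + generated tokens within that window.
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```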
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| ...dle Snorkel Mistral PairRM DPO | 0 | 129 | 14 GB |
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Krutrim 2 Instruct | 1000K / 49.3 GB | 147 | 36 |
| Ft V1 Violet | 1000K / 24.5 GB | 5 | 0 |
| Mistral Large Instruct 2407 | 128K / 226.7 GB | 7491 | 859 |
| Tiny Random MistralForCausalLM | 128K / 0 GB | 3252 | 1 |
| Winterreise M7 | 32K / 14.4 GB | 0 | 0 |
| Frostwind V2.1 M7 | 32K / 14.4 GB | 0 | 0 |
| MistralLite | 32K / 14.4 GB | 11345 | 435 |
| K2S3 V0.1 | 32K / 28.7 GB | 6 | 0 |
| MistralLite | 32K / 14.4 GB | 61777 | 430 |
| ...ydaz Web AI Reasoner BaseModel | 32K / 14.4 GB | 0 | 1 |