| Training Details | |
|---|---|
| LLM Name | Philosophy Mistral |
| Repository 🤗 | https://huggingface.co/Heralax/philosophy-mistral |
| Base Model(s) | |
| Model Size | 7.2B |
| Required VRAM | 7.7 GB |
| Updated | 2025-09-23 |
| Maintainer | Heralax |
| Model Type | mistral |
| Model Files | |
| GGML Quantization | Yes |
| GGUF Quantization | Yes |
| Quantization Type | ggml \| q8 \| gguf |
| Model Architecture | MistralForCausalLM |
| License | apache-2.0 |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.45.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | `<\|end_of_text\|>` |
| Vocabulary Size | 32001 |
| Torch Data Type | bfloat16 |
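
The settings listed above map directly onto a standard Transformers loading call. The following is a minimal sketch, assuming the repository is publicly accessible, `transformers` (4.45 or newer, matching the version in the table) and `accelerate` are installed, and roughly 8 GB of GPU memory is free for the bfloat16 weights; the prompt is a hypothetical example.

```python
# Minimal sketch: load Heralax/philosophy-mistral with the settings listed above.
# Assumes the repo is public, transformers >= 4.45 and accelerate are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Heralax/philosophy-mistral"

# LlamaTokenizer with a 32001-token vocabulary, per the table above.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# bfloat16 weights (~7.7 GB VRAM per the table); device_map="auto" places them
# on a GPU when one is available.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Hypothetical prompt; the context window is 32768 tokens, so a short
# generation like this stays well within the limit.
prompt = "What is the difference between knowledge and belief?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```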
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...illHave30DolarInMyAzureAccount | 32K / 14.4 GB | 12 | 0 |
| Mistral Kimiko CSFT | 32K / 14.4 GB | 58 | 2 |
| Spydaz Web AGI DeepThink | 128K / 14.4 GB | 17 | 1 |
| ...daz Web AI Pre Train Align 004 | 128K / 14.4 GB | 28 | 0 |
| Zion Alpha | 32K / 14.4 GB | 1338 | 4 |
| LCARS AI 001 | 32K / 14.4 GB | 14 | 4 |
| Mixtral Quantized | 32K / 14.4 GB | 27 | 0 |
| Mayonnaise 4in1 02 | 32K / 14.4 GB | 35 | 0 |
| Zsql En Postgres | 32K / 14.4 GB | 64 | 5 |
| MHENNlitv3 | 32K / 28.9 GB | 6 | 0 |