Magnum V2 123B by anthracite-org



Magnum V2 123B Benchmarks

Benchmark scores (nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Magnum V2 123B Parameters and Internals

Model Type 
text generation
Additional Notes 
This is the sixth in a series of models designed to replicate the prose quality of the Claude 3 models, specifically Sonnet and Opus.
Supported Languages 
en, fr, de, es, it, pt, ru, zh, ja
Training Details 
Data Sources:
anthracite-org/Stheno-Data-Filtered, anthracite-org/kalo-opus-instruct-22k-no-refusal, anthracite-org/nopm_claude_writing_fixed
Methodology:
Instruct tuning with Mistral formatting (see the sketch after this block)
Training Epochs:
1.5
Hardware Used:
8x AMD Instinct™ MI300X Accelerators
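As a rough illustration of what "instruct tuning with Mistral formatting" means in practice, the sketch below flattens a chat transcript into a single [INST] ... [/INST] training string. This is an assumption about the preprocessing, not anthracite-org's published pipeline; the to_mistral_format helper is hypothetical.

```python
# Hedged sketch: flatten a chat transcript into the Mistral instruct
# format used for training. Illustrative only; the actual anthracite-org
# preprocessing code is not documented on this page.

def to_mistral_format(turns: list[dict]) -> str:
    """Flatten [{'role': ..., 'content': ...}] turns into
    <s>[INST] user[/INST] assistant</s>[INST] ...[/INST] form."""
    text = "<s>"
    for turn in turns:
        if turn["role"] == "user":
            text += f"[INST] {turn['content']}[/INST]"
        else:  # assistant turn closes with the EOS token
            text += f" {turn['content']}</s>"
    return text

sample = [
    {"role": "user", "content": "Write two lines about rain."},
    {"role": "assistant", "content": "Rain taps the window.\nThe street hums."},
]
print(to_mistral_format(sample))
```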
Input Output 
Input Format:
Mistral formatting: [INST] USER MESSAGE[/INST]
Accepted Modalities:
text
Performance Tips:
The Mistral preset bundled with SillyTavern is misconfigured by default; replace it with the provided Context and Instruct presets. A minimal inference sketch follows.
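Outside SillyTavern, the simplest way to exercise the Mistral input format is to let the tokenizer's chat template apply it. The sketch below shows typical transformers usage under stated assumptions (enough GPU memory across devices for the bfloat16 weights, and a chat template present in the repo's tokenizer); it is not an official recipe.

```python
# Minimal inference sketch with Hugging Face transformers. Assumes enough
# GPU memory across devices for the bf16 weights; not an official recipe.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "anthracite-org/magnum-v2-123b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the shipped checkpoint dtype
    device_map="auto",           # shard layers across available GPUs
)

# The tokenizer's chat template produces the [INST] ... [/INST] format.
messages = [{"role": "user", "content": "USER MESSAGE"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```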
LLM Name: Magnum V2 123B
Repository 🤗: https://huggingface.co/anthracite-org/magnum-v2-123b
Base Model(s): Mistral Large Instruct 2407 (mistralai/Mistral-Large-Instruct-2407)
Model Size: 123b
Required VRAM: 207.6 GB
Updated: 2025-09-10
Maintainer: anthracite-org
Model Type: mistral
Instruction-Based: Yes
Model Files: 51 sharded safetensors files, ~4.8–4.9 GB per shard
Supported Languages: en fr de es it pt ru zh ja
Model Architecture: MistralForCausalLM
License: other
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.43.4
Vocabulary Size: 32768
Torch Data Type: bfloat16
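Most of the details above can be cross-checked against the repository's config.json without downloading the 200+ GB of weight shards. A small sketch using transformers' AutoConfig; the commented values are the ones expected from the table above:

```python
# Cross-check the spec sheet against the repo config; this fetches only
# a few KB of metadata, not the weight shards.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("anthracite-org/magnum-v2-123b")
print(cfg.architectures)            # expected: ['MistralForCausalLM']
print(cfg.max_position_embeddings)  # expected: 131072
print(cfg.vocab_size)               # expected: 32768
print(cfg.torch_dtype)              # expected: torch.bfloat16
```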

Best Alternatives to Magnum V2 123B

Best Alternatives              Context / RAM      Downloads  Likes
Behemoth R1 123B V2            128K / 226.4 GB    327        20
Behemoth X 123B V2             128K / 217.2 GB    83         13
ML2 123B Magnum Diamond        128K / 212.4 GB    21         8
Gigaberg Mistral Large 123B    128K / 222 GB      5          2
Behemoth 123B V2               128K / 221.6 GB    59         10
Cakrawala 123B                 128K / 222 GB      4          3
Behemoth 123B V1               128K / 221.6 GB    95         44
Magnum V4 123B                 128K / 244.2 GB    58         27
Lumikabra 123B V0.4            128K / 216.7 GB    7          12
Magstral 123B                  128K / 221.6 GB    7          1

Rank the Magnum V2 123B Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124