Skadi Mixtral V1 by Sao10K


Skadi Mixtral V1 is an open-source language model by Sao10K. Key specs: 46.7B parameters, 93.5 GB required VRAM, 32K context, cc-by-nc-4.0 license, HF Score 72.7, LLM Explorer Score 0.26. Benchmarks: ARC 70.1, HellaSwag 87.7, MMLU 72.2, TruthfulQA 60.4, WinoGrande 81.3, GSM8K 64.8.
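As a sanity check, the HF Score reported above is consistent with the plain average of the six listed benchmark scores (the Open LLM Leaderboard-style mean); a minimal sketch:

```python
# Average the six benchmark scores listed for Skadi Mixtral V1.
scores = {
    "ARC": 70.1,
    "HellaSwag": 87.7,
    "MMLU": 72.2,
    "TruthfulQA": 60.4,
    "WinoGrande": 81.3,
    "GSM8K": 64.8,
}
hf_score = sum(scores.values()) / len(scores)
print(f"{hf_score:.2f}")  # 72.75 — reported on this page as 72.7
```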

Tags: Autotrain compatible, En, Endpoints compatible, License: cc-by-nc-4.0, Merge, Mixtral, Region: us, Safetensors, Sharded, Tensorflow
Model Card on HF 🤗: https://huggingface.co/Sao10K/Skadi-Mixtral-v1

Skadi Mixtral V1 Benchmarks

Skadi Mixtral V1 (Sao10K/Skadi-Mixtral-v1)

Skadi Mixtral V1 Parameters and Internals

LLM Name: Skadi Mixtral V1
Repository 🤗: https://huggingface.co/Sao10K/Skadi-Mixtral-v1
Model Size: 46.7b
Required VRAM: 93.5 GB
Updated: 2024-07-04
Maintainer: Sao10K
Model Type: mixtral
Model Files: 10.0 GB (1-of-10), 10.0 GB (2-of-10), 10.0 GB (3-of-10), 10.0 GB (4-of-10), 10.0 GB (5-of-10), 9.9 GB (6-of-10), 9.9 GB (7-of-10), 10.0 GB (8-of-10), 9.9 GB (9-of-10), 3.8 GB (10-of-10)
Supported Languages: en
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.39.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: bfloat16

Best Alternatives to Skadi Mixtral V1

Best Alternatives                  Context / RAM     Downloads / Likes
Mixtral 8x7B Instruct V0.1         32K / 93.6 GB     5027464659
Nous Hermes 2 Mixtral 8x7B DPO     32K / 93.6 GB     8997452
Mixtral 8x7B V0.1                  32K / 93.6 GB     1173651801
Sensualize Mixtral Bf16            32K / 93.6 GB     0 / 0
Franziska Mixtral V1               32K / 93.5 GB     0 / 0
Typhon Mixtral V1                  32K / 93.4 GB     0 / 0
GritLM 8x7B KTO                    32K / 93.6 GB     81423
Smaug Mixtral V0.1                 32K / 187.7 GB    854812
XLAM 8x7b R                        32K / 93.6 GB     5094415
Mixtral 8x7B Instruct V0.1 FP8     32K / 47.1 GB     35020



Original data from HuggingFace, OpenCompass and various public git repos.