MixtralOrochi8x7B by smelborp


Tags: Autotrain compatible, En, Endpoints compatible, High-intelligence, Mixtral, Region:us, Safetensors, Sharded, Tensorflow, Uncensored

MixtralOrochi8x7B Benchmarks

MixtralOrochi8x7B (smelborp/MixtralOrochi8x7B)

MixtralOrochi8x7B Parameters and Internals

Model Type 
high-intelligence, uncensored
Use Cases 
Primary Use Cases:
Uncensored Content, High Intelligence
Considerations:
Due to its uncensored nature, Orochi is best utilized in environments where intelligent, unrestricted dialogue is necessary.
Additional Notes 
Orochi is a language model based on the Mixtral architecture developed by Mistral AI. It is a merge of several prominent models, designed to provide highly intelligent responses without content restrictions.
Supported Languages 
English (en)
Training Details 
Methodology:
mergekit with the DARE merge method (see the sketch below)
Model Architecture:
Mixtral, a Mixture of Experts model
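
The card states only that the merge was produced with mergekit's DARE method. As a rough illustration of what DARE (Drop And REscale) does, here is a minimal sketch in Python with PyTorch. This is not mergekit's implementation: the drop rate, the plain state-dict interface, and the uniform summation of deltas are illustrative assumptions.

import torch

def dare_merge(base, finetuned, drop_rate=0.9):
    # base / finetuned: state dicts mapping parameter names to tensors.
    # DARE sparsifies each fine-tuned model's delta from the base by
    # randomly dropping entries, rescales the survivors by 1/(1 - p)
    # to preserve the delta's expected value, then adds it to the base.
    merged = {name: w.clone() for name, w in base.items()}
    for ft in finetuned:
        for name, w in ft.items():
            delta = w - base[name]
            keep = torch.rand_like(delta) >= drop_rate
            merged[name] += delta * keep / (1.0 - drop_rate)
    return merged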
Responsible AI Considerations 
Mitigation Strategies:
Users are encouraged to implement their own content moderation or alignment strategies.
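
Since alignment is left to the deployer, one simple pattern is a post-hoc filter wrapped around generation. A minimal sketch; the blocklist policy and all names here are placeholders, not part of the model or any specific library.

BLOCKLIST = ("example banned phrase",)  # placeholder policy terms

def is_allowed(text: str) -> bool:
    # Swap in a real classifier or moderation API as needed.
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKLIST)

def moderated_generate(generate_fn, prompt: str) -> str:
    # generate_fn: any callable returning model text for a prompt,
    # e.g. a wrapper around model.generate from the example below.
    text = generate_fn(prompt)
    return text if is_allowed(text) else "[withheld by local content policy]"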
LLM Name: MixtralOrochi8x7B
Repository: https://huggingface.co/smelborp/MixtralOrochi8x7B
Model Size: 46.7B
Required VRAM: 93.5 GB
Updated: 2025-09-16
Maintainer: smelborp
Model Type: mixtral
Model Files: 10 safetensors shards (8 × 10.0 GB, 1 × 9.9 GB, 1 × 3.6 GB)
Supported Languages: en
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: bfloat16
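
Given the specs above (MixtralForCausalLM, bfloat16, 32K context, sharded safetensors), loading should follow the standard transformers pattern. A sketch, assuming the accelerate package is installed for device_map="auto" and that enough combined GPU/CPU memory is available: 46.7B parameters at 2 bytes each is about 93.4 GB of weights, matching the 93.5 GB figure above.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "smelborp/MixtralOrochi8x7B"

tokenizer = AutoTokenizer.from_pretrained(repo)  # LlamaTokenizer, vocab 32000
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,   # native dtype per the card
    device_map="auto",            # spread the 10 shards across devices
)

prompt = "Explain mixture-of-experts routing in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))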

Quantized Models of the MixtralOrochi8x7B

Model | Likes | Downloads | VRAM
MixtralOrochi8x7B GGUF | 15 | 70 | 15 GB
MixtralOrochi8x7B GPTQ | 7 | 7 | 23 GB
MixtralOrochi8x7B AWQ | 2 | 3 | 24 GB
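
For single-GPU or CPU hosts, the quantized builds above are the practical route; GGUF files are typically run through llama.cpp or its Python bindings. A sketch assuming llama-cpp-python; the file name and quantization level are placeholders, since the GGUF conversions are published separately by third parties.

from llama_cpp import Llama

llm = Llama(
    model_path="./mixtralorochi8x7b.Q2_K.gguf",  # placeholder path
    n_ctx=32768,       # the model supports a 32K context
    n_gpu_layers=-1,   # offload all layers to GPU if memory allows
)

out = llm("Q: What is a Mixture of Experts model?\nA:", max_tokens=64)
print(out["choices"][0]["text"])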

Best Alternatives to MixtralOrochi8x7B

Best Alternatives | Context / RAM | Downloads | Likes
Mixtral 8x7B Instruct V0.1 | 32K / 93.6 GB | 282661 | 4551
Nous Hermes 2 Mixtral 8x7B DPO | 32K / 93.6 GB | 11490 | 449
Mixtral 8x7B V0.1 | 32K / 93.6 GB | 54750 | 1753
GritLM 8x7B KTO | 32K / 93.6 GB | 9642 | 3
Sensualize Mixtral Bf16 | 32K / 93.6 GB | 0 | 0
Skadi Mixtral V1 | 32K / 93.5 GB | 0 | 0
Franziska Mixtral V1 | 32K / 93.5 GB | 0 | 0
Typhon Mixtral V1 | 32K / 93.4 GB | 0 | 0
Smaug Mixtral V0.1 | 32K / 187.7 GB | 9625 | 12
NatureLM 8x7B | 32K / 0.3 GB | 102 | 15


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124