Calm2 7B Chat 7B MoE by aixsatoshi


Tags: Autotrain compatible, Endpoints compatible, Mixtral, Region: us, Safetensors, Sharded, Tensorflow


Calm2 7B Chat 7B MoE Parameters and Internals

Model Type: general-purpose, conversational
Use Cases:
Areas: research, development
Limitations: experimental model; not fine-tuned after composition
Considerations: Users should perform their own tuning and optimization (a LoRA sketch follows this section).
Additional Notes: This model combines the general-purpose capabilities of the calm2-7b model with the conversational abilities of the calm2-7b-chat model.
Supported Languages: Japanese (native), English (native)
Training Details:
Context Length: 32768
Model Architecture: Mixture of Experts (MoE)
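
Because the experts were not fine-tuned after composition, downstream tuning is left to the user. Below is a minimal sketch of one common option, LoRA adaptation with the `peft` library; the target module names, rank, and other hyperparameters are illustrative assumptions, not values from this card.

```python
# Minimal LoRA setup sketch for post-composition tuning (assumptions noted inline).
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model_id = "aixsatoshi/calm2-7b-chat-7b-moe"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card lists bfloat16 weights
    device_map="auto",           # requires the `accelerate` package
)

# Attention projections are a common LoRA target for Mixtral-style models;
# these module names are an assumption -- confirm with model.named_modules().
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```
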
LLM Name: Calm2 7B Chat 7B MoE
Repository 🤗: https://huggingface.co/aixsatoshi/calm2-7b-chat-7b-moe
Model Size: 7b
Required VRAM: 22.7 GB
Updated: 2025-09-23
Maintainer: aixsatoshi
Model Type: mixtral
Model Files: 10.0 GB (1 of 3), 10.0 GB (2 of 3), 2.7 GB (3 of 3)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 65024
Torch Data Type: bfloat16
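
The details above map onto a standard transformers loading path. A minimal usage sketch follows; the USER:/ASSISTANT: prompt template is assumed from the upstream cyberagent/calm2-7b-chat model card and should be confirmed against this repository.

```python
# Minimal loading and generation sketch based on the card details above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aixsatoshi/calm2-7b-chat-7b-moe"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # GPTNeoXTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(        # resolves to MixtralForCausalLM
    model_id,
    torch_dtype=torch.bfloat16,  # card lists bfloat16 weights, ~22.7 GB VRAM
    device_map="auto",
)

# Sanity-check the composition and context window reported above.
print(model.config.num_local_experts)        # likely 2 for this two-model composition
print(model.config.num_experts_per_tok)
print(model.config.max_position_embeddings)  # expect 32768

# Assumed chat template; prompt means "Please introduce one sightseeing spot in Japan."
prompt = "USER: 日本の観光地を一つ紹介してください。\nASSISTANT: "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```
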

Best Alternatives to Calm2 7B Chat 7B MoE

Best Alternatives              Context / RAM     Downloads   Likes
Multimaster 7B V6              32K / 142.5 GB    9715        1
Multilingual Mistral           32K / 93.5 GB     1735        2
MultiverseBuddy 15B MoE        32K / 25.8 GB     9           0
Mini Mixtral V0.2              32K / 25.8 GB     6           4
Lumina 2                       32K / 37.1 GB     5           0
OpenMistral MoE                32K / 48.3 GB     1218        0
Laserxtral                     32K / 48.3 GB     810         78
Mixtral 7B 8expert             32K / 93.6 GB     1059        264
Merged Model MoE               32K / 53.3 GB     4           1
RogerWizard 12B MoE            32K / 25.8 GB     4           1



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124