Tiny Llama Llama Dolphin Laser 1B MoE by jtatman


Tags: Autotrain compatible · Endpoints compatible · FrankenMoE · LazyMergekit · Merge · Mergekit · Mixtral · MoE · Region: US · Safetensors · Sharded · TensorFlow (base-model tags are listed in full under Base Model(s) below)

Tiny Llama Llama Dolphin Laser 1B MoE Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Tiny Llama Llama Dolphin Laser 1B MoE Parameters and Internals

Model Type: text generation
LLM Name: Tiny Llama Llama Dolphin Laser 1B MoE
Repository: https://huggingface.co/jtatman/Tiny-Llama-Llama-Dolphin-laser-1b-moe
Base Model(s): TinyLlama/TinyLlama-1.1B-intermediate-step-715k-1.5T, cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser, cognitivecomputations/TinyDolphin-2.8.1-1.1b, TinyLlama/TinyLlama-1.1B-Chat-v1.0
Model Size: 1b
Required VRAM: 6.8 GB
Updated: 2025-09-23
Maintainer: jtatman
Model Type: mixtral
Model Files: 2.0 GB (1-of-4), 2.0 GB (2-of-4), 2.0 GB (3-of-4), 0.8 GB (4-of-4)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.38.1
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32002
Torch Data Type: bfloat16
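
The listing above maps directly onto a standard Hugging Face Transformers loading call: the four safetensors shards (2.0 + 2.0 + 2.0 + 0.8 GB) account for the listed 6.8 GB of bfloat16 weights, and the model loads as a MixtralForCausalLM with a 4096-token context window. Below is a minimal sketch of loading and prompting it; the repository id, dtype, padding token, and context limit come from the listing, while the prompt and generation settings are illustrative assumptions.

# Minimal sketch: load and prompt jtatman/Tiny-Llama-Llama-Dolphin-laser-1b-moe.
# Repo id, dtype (bfloat16), padding token (<s>), and the 4096-token context limit
# come from the listing above; the prompt and generation settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "jtatman/Tiny-Llama-Llama-Dolphin-laser-1b-moe"
device = "cuda" if torch.cuda.is_available() else "cpu"   # ~6.8 GB of weights in bf16

tokenizer = AutoTokenizer.from_pretrained(repo_id)         # LlamaTokenizer, vocab 32002
model = AutoModelForCausalLM.from_pretrained(              # MixtralForCausalLM
    repo_id, torch_dtype=torch.bfloat16
).to(device)

prompt = "Explain what a mixture-of-experts language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# Keep prompt length plus max_new_tokens under the 4096-token context window.
output = model.generate(
    **inputs,
    max_new_tokens=128,
    pad_token_id=tokenizer.pad_token_id,  # listed padding token is <s>
)
print(tokenizer.decode(output[0], skip_special_tokens=True))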

Best Alternatives to Tiny Llama Llama Dolphin Laser 1B MoE

Best Alternatives                        Context / RAM    Downloads  Likes
TinyMistral 6x248M Instruct              32K / 4 GB       646        10
TinyMistral 6x248M                       32K / 4 GB       645        14
...l DPO 8x7b V0.2 3.5bpw H6 EXL2        32K / 20.7 GB    4          2
...l DPO 8x7b V0.2 6.0bpw H6 EXL2        32K / 35.3 GB    5          1
Bagel 8x7b V0.2 3.5bpw H6 EXL2           32K / 20.7 GB    4          2
Note: a green Score (e.g. "73.2") means that the model is better than jtatman/Tiny-Llama-Llama-Dolphin-laser-1b-moe.

Rank the Tiny Llama Llama Dolphin Laser 1B MoE Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124