Jambatypus V0.1 by mlabonne


Tags: 4-bit, Axolotl, Base model: ai21labs/jamba-v0.1, Base model (quantized): ai21labs/..., Bitsandbytes, Conversational, Custom code, En, Endpoints compatible, Jamba, Lora, Region: us, Safetensors

Jambatypus V0.1 Benchmarks

Jambatypus V0.1 (mlabonne/Jambatypus-v0.1)

Jambatypus V0.1 Parameters and Internals

Model Type: QLoRA fine-tuned model
Supported Languages: en (High)
Training Details:
- Data source: chargoddard/Open-Platypus-Chat
- Context length: 4096
- Hardware: 2x A100 80 GB
Input / Output:
- Accepted modalities: text
- Performance tip: use the ChatML template for optimal performance.
Release Notes:
- Version: 0.1
- Notes: A QLoRA fine-tune of ai21labs/Jamba-v0.1 on the chargoddard/Open-Platypus-Chat dataset.
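The card recommends the ChatML template. A minimal sketch of that format, assuming the standard ChatML special tokens (`<|im_start|>`/`<|im_end|>`); the example conversation is illustrative, not taken from the card:

```python
# Minimal ChatML formatter. <|im_start|> and <|im_end|> are the standard
# ChatML delimiters; the example message below is illustrative only.
def format_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    prompt = ""
    for msg in messages:
        prompt += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
    # Open an assistant turn so the model continues from here.
    prompt += "<|im_start|>assistant\n"
    return prompt

prompt = format_chatml([
    {"role": "user", "content": "Summarize what Jamba's SSM/attention hybrid design is."},
])
```

With `transformers`, the same result is usually obtained via the tokenizer's built-in chat template (`tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`), provided the repo ships a ChatML template.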
LLM Name: Jambatypus V0.1
Repository: https://huggingface.co/mlabonne/Jambatypus-v0.1
Base Model(s): ai21labs/Jamba-v0.1
Model Size: 51.6B
Required VRAM: 0.5 GB
Updated: 2026-03-05
Maintainer: mlabonne
Model Files: 0.5 GB, 0.0 GB
Supported Languages: en
Model Architecture: AutoModelForCausalLM
License: apache-2.0
Is Biased: none
Tokenizer Class: LlamaTokenizer
Padding Token: <|pad|>
PEFT Type: LORA
LoRA Model: Yes
PEFT Target Modules: q_proj|v_proj|dt_proj|out_proj|in_proj|k_proj|x_proj|o_proj|gate_proj|down_proj|up_proj|router
LoRA Alpha: 32
LoRA Dropout: 0.05
R Param: 16
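The PEFT fields above can be collected into one adapter config. A sketch, with key names following peft's `adapter_config.json` convention (an assumption; only the values come from the card):

```python
# Adapter hyperparameters as listed on the card. The key names follow
# peft's adapter_config.json convention (assumption, not from the card).
lora_config = {
    "peft_type": "LORA",
    "r": 16,                # "R Param": LoRA rank
    "lora_alpha": 32,       # scaling applied is lora_alpha / r = 2
    "lora_dropout": 0.05,
    "target_modules": (
        "q_proj|v_proj|dt_proj|out_proj|in_proj|k_proj|x_proj|o_proj|"
        "gate_proj|down_proj|up_proj|router"
    ).split("|"),
}
```

With `peft` installed, these values could in principle be passed to `peft.LoraConfig(...)` when attaching an adapter to the base model; note the target list covers both attention projections and Jamba's Mamba/MoE modules (`dt_proj`, `x_proj`, `router`).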

Best Alternatives to Jambatypus V0.1

Best Alternatives | Context / RAM | Downloads / Likes
Asdf | 0K / 0.1 GB | 140
MS32 | 30K / 1.5 GB | 60
MS32 | 20K / 0.7 GB | 50
Jp Testing | 0K / 14.4 GB | 190
Tinyllama Cpt | 0K / 0.5 GB | 60
Fine Tune Sentimental Llama | 0K / 0 GB | 50
VLM2Vec LoRA | 0K / 0 GB | 3111
QuietStar Project | 0K / GB | 82
Finetuned Llava Lora | 0K / 0.1 GB | 50
Qwen7B Haiguitang | 0K / 15.3 GB | 100

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124