C4ai Command R V01 4bit by prince-canuma


Tags: 4-bit, 4bit, Ar, Autotrain compatible, Bitsandbytes, Cohere, Conversational, Custom code, De, En, Endpoints compatible, Es, Fr, It, Ja, Ko, Pt, Quantized, Region:us, Safetensors, Sharded, Tensorflow, Zh

C4ai Command R V01 4bit Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
C4ai Command R V01 4bit (prince-canuma/c4ai-command-r-v01-4bit)

C4ai Command R V01 4bit Parameters and Internals

Model Type 
generative
Additional Notes 
Optimized for multilingual generation (evaluated in 10 languages) and for highly performant retrieval-augmented generation (RAG).
Supported Languages 
Covered languages: English, French, Spanish, Italian, German, Brazilian Portuguese, Japanese, Korean, Simplified Chinese, and Arabic.
Additional languages: Russian, Polish, Turkish, Vietnamese, Dutch, Czech, Indonesian, Ukrainian, Romanian, Greek, Hindi, Hebrew, Persian.
Training Details 
Methodology:
This model uses supervised fine-tuning (SFT) and preference training to align model behavior to human preferences for helpfulness and safety.
Context Length:
128000
Model Architecture:
This is an auto-regressive language model that uses an optimized transformer architecture.
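To make "auto-regressive" concrete, here is a minimal, illustrative sketch of the decoding loop such a model runs: at each step it scores every vocabulary token given the tokens so far and appends one. The scoring function below is a toy stand-in for the real transformer forward pass, not anything from this model.

```python
def toy_next_token_scores(context):
    """Hypothetical stand-in for a transformer forward pass: favors the
    token one greater than the last one, within a 5-token toy vocabulary."""
    last = context[-1]
    vocab_size = 5
    return [1.0 if t == (last + 1) % vocab_size else 0.0 for t in range(vocab_size)]

def generate_greedy(prompt_tokens, max_new_tokens):
    """Greedy auto-regressive loop: append the argmax-scored token each step."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        scores = toy_next_token_scores(tokens)
        next_token = max(range(len(scores)), key=scores.__getitem__)
        tokens.append(next_token)
    return tokens

print(generate_greedy([0], 4))  # [0, 1, 2, 3, 4]
```

The real model does exactly this loop, only with a 256,000-token vocabulary and a transformer computing the scores.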
Input Output 
Input Format:
Text input for generative capabilities.
Accepted Modalities:
text
Output Format:
Generated text output.
Performance Tips:
For better results on code generation tasks, use a low temperature or even greedy decoding.
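Why a low temperature helps here: sampling temperature divides the logits before the softmax, so lowering it concentrates probability on the highest-scoring token, approaching greedy (argmax) decoding in the limit. A small self-contained sketch of that effect:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature before softmax; lower temperature
    concentrates probability mass on the highest-scoring token."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
hot = softmax_with_temperature(logits, 1.0)
cold = softmax_with_temperature(logits, 0.2)
# The top token's probability grows as temperature drops.
print(round(hot[0], 3), round(cold[0], 3))  # 0.629 0.993
```

For deterministic tasks like code generation this reduces the chance of sampling a low-probability (often wrong) token.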
LLM Name: C4ai Command R V01 4bit
Repository 🤗: https://huggingface.co/prince-canuma/c4ai-command-r-v01-4bit
Model Size: 19.1b
Required VRAM: 22.8 GB
Updated: 2025-09-23
Maintainer: prince-canuma
Model Type: cohere
Model Files: 4.9 GB (1-of-5), 5.0 GB (2-of-5), 5.0 GB (3-of-5), 5.0 GB (4-of-5), 2.9 GB (5-of-5)
Supported Languages: en, fr, de, es, it, pt, ja, ko, zh, ar
Quantization Type: 4bit
Model Architecture: CohereForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.39.0.dev0
Tokenizer Class: CohereTokenizer
Padding Token: <PAD>
Vocabulary Size: 256000
Torch Data Type: float16
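As a back-of-envelope check on the table above: weight storage is roughly parameter count times bits per parameter, divided by 8. For 19.1B parameters, idealized pure 4-bit weights would be about 9.6 GB versus about 38 GB at float16. The listed 22.8 GB requirement is higher than the pure-4-bit figure, presumably (this is an assumption, not stated in the card) because some tensors remain in higher precision, quantization stores extra scale metadata, and runtime needs room for activations and the KV cache.

```python
def weight_memory_gb(n_params, bits_per_param):
    """Back-of-envelope weight storage: params * bits / 8 bytes, in GB (10^9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

params = 19.1e9  # model size from the table above

fp16 = weight_memory_gb(params, 16)  # unquantized half-precision baseline
q4 = weight_memory_gb(params, 4)     # idealized pure 4-bit storage

print(f"fp16 weights: {fp16:.1f} GB, 4-bit weights: {q4:.1f} GB")
# fp16 weights: 38.2 GB, 4-bit weights: 9.6 GB
```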


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124