Mambarim 110M by dominguesm

Tags: Autotrain compatible · Dataset: nicholaskluge/pt-corpu... · Endpoints compatible · Instruct · Mamba · Portuguese · Pt · Pytorch · Region: us · Safetensors

Mambarim 110M Benchmarks

Benchmark scores express how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4"). No score is currently listed for Mambarim 110M (dominguesm/mambarim-110m).

Mambarim 110M Parameters and Internals

Model Type: text-generation
Additional Notes: Mambarim-110M is not a transformer; it is built on Mamba, a state-space model (SSM) architecture (see the toy sketch below this list).
Supported Languages: Portuguese (native)
Training Details:
  Data Sources: Pt-Corpus Instruct
  Data Volume: 6.2B tokens
  Methodology: pre-trained via causal language modeling
  Context Length: 2048 tokens
  Model Architecture: Mamba
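
For intuition on what "not a transformer" means in practice, the toy Python sketch below runs a plain linear state-space recurrence over a sequence. It is illustrative only, with made-up shapes: the real Mamba layer makes its parameters input-dependent ("selective") and computes the scan with a fused, hardware-aware kernel rather than a Python loop.

    import torch

    # Toy linear state-space scan -- NOT the actual Mamba kernel.
    # All sizes below are hypothetical, chosen only for illustration.
    d_state, d_model, seq_len = 16, 8, 32
    A = 0.1 * torch.randn(d_state, d_state)   # state transition
    B = torch.randn(d_state, d_model)         # input -> state projection
    C = torch.randn(d_model, d_state)         # state -> output projection

    x = torch.randn(seq_len, d_model)         # input token embeddings
    h = torch.zeros(d_state)                  # fixed-size state carried across time
    ys = []
    for t in range(seq_len):                  # O(L) scan, vs O(L^2) for attention
        h = A @ h + B @ x[t]                  # fold the new token into the state
        ys.append(C @ h)                      # read the output from the state
    y = torch.stack(ys)                       # (seq_len, d_model) outputs

Because generation only needs to carry the fixed-size state h, inference memory stays constant in sequence length, which is the main practical appeal of Mamba-style models at this scale.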
LLM Name: Mambarim 110M
Repository: https://huggingface.co/dominguesm/mambarim-110m
Model Size: 110M parameters
Required VRAM: 0.3 GB
Updated: 2025-08-22
Maintainer: dominguesm
Model Type: mamba
Instruction-Based: Yes
Model Files: 0.3 GB
Supported Languages: pt
Model Architecture: MambaForCausalLM
License: cc-by-4.0
Transformers Version: 4.39.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <pad>
Vocabulary Size: 32000
Torch Data Type: float32
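
Given the architecture (MambaForCausalLM) and Transformers version (4.39, where Mamba support landed) listed above, the checkpoint should load through the standard Auto classes. A minimal sketch; the Portuguese prompt is an arbitrary example, not from the model card:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "dominguesm/mambarim-110m"
    tokenizer = AutoTokenizer.from_pretrained(repo)      # LlamaTokenizer, vocab 32000
    model = AutoModelForCausalLM.from_pretrained(repo)   # resolves to MambaForCausalLM

    # Arbitrary Portuguese prompt, chosen for illustration.
    inputs = tokenizer("A capital do Brasil é", return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=40, do_sample=True, temperature=0.7)
    print(tokenizer.decode(out[0], skip_special_tokens=True))

At float32 the weights fit in the roughly 0.3 GB listed above, so the model runs comfortably on CPU or any small GPU.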

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124