Phibode 3 Mini 4K Ultraalpaca by recogna-nlp


Tags: Autotrain compatible · Conversational · Custom code · Endpoints compatible · Instruct · Phi3 · Region: us · Safetensors · Sharded · Tensorflow

Phibode 3 Mini 4K Ultraalpaca Benchmarks

Score (%) — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Phibode 3 Mini 4K Ultraalpaca (recogna-nlp/phibode-3-mini-4k-ultraalpaca)

Phibode 3 Mini 4K Ultraalpaca Parameters and Internals

Model Type 
text generation
Use Cases 
Limitations:
The model is a work in progress and still exhibits issues when generating text in Portuguese.
Additional Notes 
Designed for users with limited computational resources.
Supported Languages 
Portuguese (refined for the Portuguese language)
Training Details 
Data Sources:
UltraAlpaca dataset translated to Portuguese
Methodology:
LoRA fine-tuning
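LoRA fine-tuning freezes the base weights and learns a low-rank update: the effective weight becomes W + (α/r)·B·A, where only the small factors A and B are trained. A minimal NumPy sketch of that arithmetic (the dimensions and rank below are illustrative, not the model's actual configuration):

```python
import numpy as np

# Illustrative dimensions only; real Phi-3 projection matrices are far larger.
d_out, d_in, r, alpha = 64, 64, 8, 16

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))     # frozen base weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))               # initialized to zero, so the
                                       # update starts as a no-op

delta = (alpha / r) * B @ A            # low-rank update (rank <= r)
W_eff = W + delta                      # effective weight at inference

# With B = 0, the adapted model initially matches the base model exactly.
assert np.allclose(W_eff, W)
```

Because only A and B (2·r·d values instead of d²) receive gradients, this is why the method suits users with limited computational resources, as noted above.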
Input Output 
Input Format:
Expected input roles: system, user
Accepted Modalities:
text
Output Format:
Generated text
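The system/user roles above can be assembled into a prompt string. A hedged sketch, assuming the commonly documented Phi-3 chat markup (`<|system|>`, `<|user|>`, `<|assistant|>`, `<|end|>`); the authoritative template ships with the model's tokenizer, so `tokenizer.apply_chat_template` should be preferred in practice:

```python
def build_phi3_prompt(system: str, user: str) -> str:
    """Assemble a Phi-3-style prompt from a system and a user turn.

    The role markers follow the widely documented Phi-3 chat format;
    the model's own tokenizer config remains the source of truth.
    """
    return (
        f"<|system|>\n{system}<|end|>\n"
        f"<|user|>\n{user}<|end|>\n"
        f"<|assistant|>\n"
    )

prompt = build_phi3_prompt(
    "Você é um assistente prestativo.",    # system turn (Portuguese)
    "Explique o que é LoRA em uma frase.", # user turn
)
```

The trailing `<|assistant|>` marker cues the model to generate its reply as the next turn.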
LLM Name: Phibode 3 Mini 4K Ultraalpaca
Repository: 🤗 https://huggingface.co/recogna-nlp/phibode-3-mini-4k-ultraalpaca
Model Size: 3.8b
Required VRAM: 15.4 GB
Updated: 2025-09-10
Maintainer: recogna-nlp
Model Type: phi3
Instruction-Based: Yes
Model Files: 5.0 GB (1-of-4), 5.0 GB (2-of-4), 5.0 GB (3-of-4), 0.4 GB (4-of-4)
Model Architecture: Phi3ForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.40.1
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32064
Torch Data Type: float32
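The ~15.4 GB VRAM figure follows directly from the float32 data type: 3.8B parameters × 4 bytes ≈ 15.2 GB, with the remainder going to buffers and metadata. A quick sanity check (the helper name is illustrative):

```python
params = 3.8e9  # parameter count from the table above

def checkpoint_gb(n_params: float, bytes_per_param: int) -> float:
    """Approximate weight-storage size in decimal GB."""
    return n_params * bytes_per_param / 1e9

fp32 = checkpoint_gb(params, 4)  # float32, as shipped: ~15.2 GB
fp16 = checkpoint_gb(params, 2)  # half precision would halve it: ~7.6 GB
```

Loading in float16/bfloat16 (e.g. via `torch_dtype` in `transformers`) would roughly halve the memory footprint, at some numerical cost.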

Best Alternatives to Phibode 3 Mini 4K Ultraalpaca

Best Alternatives               Context / RAM    Downloads / Likes
Phi 4 Mini Instruct             128K / 7.7 GB    246233592
Phi 3 Mini 128K Instruct        128K / 7.7 GB    10598821669
Phi 3.5 Mini Instruct           128K / 7.7 GB    208573907
MediPhi Instruct                128K / 7.7 GB    454544
NuExtract 1.5                   128K / 7.7 GB    70228237
NuExtract V1.5                  128K / 7.7 GB    10851189
MediPhi Clinical                128K / 7.7 GB    8999
Phi 4 Mini Instruct             128K / 7.7 GB    549120
Phi 3.5 Mini TitanFusion 0.1    128K / 7.7 GB    50
MediPhi MedCode                 128K / 7.7 GB    6003
Note: a green score (e.g. "73.2") means that the model performs better than recogna-nlp/phibode-3-mini-4k-ultraalpaca.

Rank the Phibode 3 Mini 4K Ultraalpaca Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124