Phi3 Mini 4K Sft DPO Quant by matteosz


Phi3 Mini 4K Sft DPO Quant is an open-source language model by matteosz. Features: 2.1B-parameter LLM, VRAM: 2.3 GB, Context: 4K, License: apache-2.0, LLM Explorer Score: 0.14.

  4-bit   Autotrain compatible   Bitsandbytes   Conversational   Custom code   En   Endpoints compatible   Phi-3   Phi3   Quantization   Region:us   Safetensors

Phi3 Mini 4K Sft DPO Quant Benchmarks

Scores indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Phi3 Mini 4K Sft DPO Quant (matteosz/phi3-mini-4k-sft-dpo-quant)

Phi3 Mini 4K Sft DPO Quant Parameters and Internals

Model Type: fine-tuned model, quantized model, STEM-focused model
Use Cases:
- Areas: education
- Applications: AI university tutor
- Primary Use Cases: serving as an AI university tutor
Additional Notes: this model has been quantized to 4 bits for efficiency using a specific BitsAndBytesConfig.
Supported Languages: en (proficient)
Training Details:
- Data Sources: STEM domains
- Methodology: fine-tuned with SFT and DPO
LLM Name: Phi3 Mini 4K Sft DPO Quant
Repository 🤗: https://huggingface.co/matteosz/phi3-mini-4k-sft-dpo-quant
Model Size: 2.1b
Required VRAM: 2.3 GB
Updated: 2025-09-13
Maintainer: matteosz
Model Type: phi3
Model Files: 2.3 GB
Supported Languages: en
Model Architecture: Phi3ForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.41.1
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32011
Torch Data Type: float16
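Since the architecture above is Phi3ForCausalLM, prompts follow the Phi-3 instruct chat format with `<|user|>`, `<|assistant|>`, and `<|end|>` tags. In practice `tokenizer.apply_chat_template` handles this; the pure-string sketch below mirrors the standard Phi-3 layout, on the assumption that this repository does not customize the template.

```python
def build_phi3_prompt(messages):
    """Render chat messages in the standard Phi-3 instruct format (assumed
    to apply to this checkpoint; tokenizer.apply_chat_template is the
    canonical way to do this)."""
    parts = [f"<|{m['role']}|>\n{m['content']}<|end|>\n" for m in messages]
    parts.append("<|assistant|>\n")  # generation continues after this tag
    return "".join(parts)

prompt = build_phi3_prompt([
    {"role": "user", "content": "Explain eigenvalues in one sentence."}
])
```

Keeping the prompt within the 4096-token context length above is the caller's responsibility.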

Best Alternatives to Phi3 Mini 4K Sft DPO Quant

Best Alternatives                  | Context | RAM    | Downloads | Likes
...xZSsgOijsAiwZBXCZaCeJGxXyjzVTt  | 128K    | 2.3 GB | 6         | 0
...guGNWodemktssFvRBlcJZUwVLDzmSa  | 128K    | 2.3 GB | 6         | 0
...uDtCRwSjaCHvWLsRNvYmzIBDUAuGnq  | 128K    | 2.3 GB | 6         | 0
Tiny Random Phi3ForCausalLM        | 4K      | 0 GB   | 5167      | 0
NuExtract Finetuned                | 4K      | 2.8 GB | 6         | 0
Orpo Phi                           | 4K      | 7.7 GB | 6         | 0
Quantized Phi 3 Mini 4K            | 4K      | 2.3 GB | 7         | 1
... 3 Mini 128K Instruct Bnb 4bit  | 128K    | 2.3 GB | 16        | 1
...3 Mini 4K Instruct 4bit 64rank  | 4K      | 2.4 GB | 6         | 0
...hi 3 Mini 4K Instruct Bnb 4bit  | 4K      | 2.3 GB | 8         | 0

Note: a green score (e.g. "73.2") means that the model is better than matteosz/phi3-mini-4k-sft-dpo-quant.

Rank the Phi3 Mini 4K Sft DPO Quant Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.