Training Details

| Field | Value |
|---|---|
| LLM Name | Phi 3 Mini Code Finetune 128K Instruct V1 |
| Repository 🤗 | https://huggingface.co/RDson/Phi-3-mini-code-finetune-128k-instruct-v1 |
| Model Size | 3.8B |
| Required VRAM | 15.4 GB |
| Updated | 2025-08-18 |
| Maintainer | RDson |
| Model Type | phi3 |
| Instruction-Based | Yes |
| Model Files | |
| Model Architecture | Phi3ForCausalLM |
| License | other |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.41.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | <\|endoftext\|> |
| Vocabulary Size | 32040 |
| Torch Data Type | float32 |
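The details above describe a Phi3ForCausalLM checkpoint with a 131,072-token context window and float32 weights, which is where the ~15.4 GB VRAM figure comes from. A minimal loading sketch using the Hugging Face transformers library is shown below; the bfloat16 cast and the `trust_remote_code=True` flag are assumptions (casting from float32 roughly halves memory, and older transformers releases needed remote code for Phi-3's 128K long-context configuration), not something stated on this page.

```python
# Minimal sketch: load RDson/Phi-3-mini-code-finetune-128k-instruct-v1 with transformers.
# Assumptions: bfloat16 instead of the stored float32 to reduce VRAM, and
# trust_remote_code=True, which older transformers versions required for Phi-3's 128K config.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RDson/Phi-3-mini-code-finetune-128k-instruct-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer, vocab size 32040
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# Instruction-tuned model: build the prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```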
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Phi 4 Mini Instruct | 128K / 7.7 GB | 199709 | 579 |
| Phi 3 Mini 128K Instruct | 128K / 7.7 GB | 1083572 | 1661 |
| Phi 3.5 Mini Instruct | 128K / 7.7 GB | 229679 | 902 |
| MediPhi Instruct | 128K / 7.7 GB | 5252 | 42 |
| NuExtract 1.5 | 128K / 7.7 GB | 123197 | 236 |
| NuExtract V1.5 | 128K / 7.7 GB | 108511 | 89 |
| Phi 4 Mini Instruct | 128K / 7.7 GB | 7049 | 20 |
| MediPhi Clinical | 128K / 7.7 GB | 662 | 9 |
| Phi 3.5 Mini TitanFusion 0.1 | 128K / 7.7 GB | 5 | 0 |
| MediPhi PubMed | 128K / 7.7 GB | 504 | 6 |