| LLM Name | Phi 1 5 Dolly Instruction Polish |
|---|---|
| Repository 🤗 | https://huggingface.co/s3nh/phi-1_5_dolly_instruction_polish |
| Model Size | 1.4B |
| Required VRAM | 2.8 GB |
| Updated | 2025-10-23 |
| Maintainer | s3nh |
| Model Type | phi-msft |
| Instruction-Based | Yes |
| Model Files | |
| Supported Languages | pl, en |
| Model Architecture | PhiForCausalLM |
| License | openrail |
| Model Max Length | 2048 |
| Transformers Version | 4.37.0.dev0 |
| Tokenizer Class | CodeGenTokenizer |
| Vocabulary Size | 51200 |
| Torch Data Type | float16 |
| Activation Function | gelu_new |
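
The metadata above maps directly onto a standard `transformers` loading call. Below is a minimal sketch, assuming the usual `AutoModelForCausalLM`/`AutoTokenizer` API; since the model type is the custom `phi-msft`, `trust_remote_code=True` is likely required on the listed transformers version, and loading in `float16` matches the stated ~2.8 GB VRAM footprint. The Polish prompt string is a hypothetical example, as the card does not document a prompt template.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "s3nh/phi-1_5_dolly_instruction_polish"

# phi-msft is a custom architecture, so remote code must be trusted.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the listed dtype / ~2.8 GB VRAM
    trust_remote_code=True,
)

# Hypothetical Polish instruction prompt (translation: "Explain what photosynthesis is.").
prompt = "Instrukcja: Wyjaśnij, czym jest fotosynteza.\nOdpowiedź:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,  # keep prompt + output under the 2048-token max length
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that prompt plus generated tokens must stay within the 2048-token model max length listed above.
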
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Phi 1 5 Instruct V0.1 | 2K / 2.8 GB | 691 | 1 |
| ...hi Science Generalist Instruct | 0K / 2.8 GB | 16 | 1 |
| ...hi Science Generalist Instruct | 0K / 2.8 GB | 4 | 1 |