TinyLlama 1.1B 1T OpenOrca by jeff31415


Tags: Autotrain compatible · Dataset: bigcode/starcoderdata · Dataset: cerebras/slimpajama-62... · Dataset: open-orca/openorca · En · Endpoints compatible · Llama · PyTorch · Region: us · Safetensors

TinyLlama 1.1B 1T OpenOrca Benchmarks


TinyLlama 1.1B 1T OpenOrca Parameters and Internals

Additional Notes
Approximate training cost for this fine-tuning: not stated.
Supported Languages 
en (proficient)
Training Details 
Data Sources:
Open-Orca/OpenOrca, bigcode/starcoderdata, cerebras/SlimPajama-627B
Methodology:
Fine-tuned on the OpenOrca GPT-4 subset for 1 epoch, using the ChatML format.
Training Time:
~16 hours for 1 epoch.
Hardware Used:
1× RTX A5000 (rented via autodl.com)
LLM Name: TinyLlama 1.1B 1T OpenOrca
Repository 🤗: https://huggingface.co/jeff31415/TinyLlama-1.1B-1T-OpenOrca
Model Size: 1.1b
Required VRAM: 2.2 GB
Updated: 2025-06-09
Maintainer: jeff31415
Model Type: llama
Model Files: 2.2 GB, 2.2 GB
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.31.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float32
TinyLlama 1.1B 1T OpenOrca (jeff31415/TinyLlama-1.1B-1T-OpenOrca)
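Since the card states the model was fine-tuned using the ChatML format, with a LlamaTokenizer and a 2048-token context, a minimal usage sketch with Hugging Face `transformers` might look like the following. The prompt-building helper and the generation settings are illustrative assumptions, not part of the card:

```python
from typing import Optional

def build_chatml_prompt(user_message: str, system_message: Optional[str] = None) -> str:
    """Assemble a ChatML-style prompt (assumed layout: <|im_start|>role ...
    <|im_end|> blocks, ending with an open assistant turn)."""
    parts = []
    if system_message:
        parts.append(f"<|im_start|>system\n{system_message}<|im_end|>")
    parts.append(f"<|im_start|>user\n{user_message}<|im_end|>")
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

def generate_reply(prompt: str) -> str:
    """Generation sketch; not executed here because it downloads ~2.2 GB of weights."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tok = AutoTokenizer.from_pretrained("jeff31415/TinyLlama-1.1B-1T-OpenOrca")
    model = AutoModelForCausalLM.from_pretrained("jeff31415/TinyLlama-1.1B-1T-OpenOrca")
    ids = tok(prompt, return_tensors="pt").input_ids  # short prompts stay well under the 2048 context limit
    out = model.generate(ids, max_new_tokens=128)
    return tok.decode(out[0][ids.shape[1]:], skip_special_tokens=True)

prompt = build_chatml_prompt("What is the capital of France?",
                             system_message="You are a helpful assistant.")
```

Note the model ships in float32; casting to float16/bfloat16 at load time would roughly halve memory use.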

Quantized Models of the TinyLlama 1.1B 1T OpenOrca

Model                              | Likes | Downloads | VRAM
...inyLlama 1.1B 1T OpenOrca GGUF  | 17    | 1263      | 0 GB
...inyLlama 1.1B 1T OpenOrca GPTQ  | 2     | 41        | 0 GB
TinyLlama 1.1B 1T OpenOrca AWQ     | 2     | 13        | 0 GB
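The memory figures on this page can be sanity-checked with back-of-envelope arithmetic: weight memory is roughly parameter count × bits per weight ÷ 8, ignoring activation, KV-cache, and quantization-metadata overhead. A sketch, assuming the 1.1B parameter count implied by the model name:

```python
def est_weight_gb(n_params: float, bits_per_weight: int) -> float:
    """Rough weight-memory estimate in GB: params * bits / 8.
    Ignores activations, KV cache, and quantization metadata."""
    return n_params * bits_per_weight / 8 / 1e9

N = 1.1e9  # parameter count implied by the model name (assumption)

print(round(est_weight_gb(N, 16), 2))  # 16-bit: 2.2 GB, matching the 2.2 GB model files
print(round(est_weight_gb(N, 4), 2))   # 4-bit GPTQ/AWQ: 0.55 GB, in the ballpark of the 0.8 GB listed below
```

The gap between the 0.55 GB estimate and the 0.8 GB listed for the GPTQ/AWQ variants is plausibly explained by unquantized layers (e.g. embeddings) and quantization scales/zero-points.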

Best Alternatives to TinyLlama 1.1B 1T OpenOrca

Best Alternatives                  | Context / RAM | Downloads | Likes
TinyLlama V1.1                     | 2K / 4.4 GB   | 138009    | 97
CroissantLLMBase                   | 2K / 5.4 GB   | 1110      | 32
MicroLlama                         | 2K / 1.2 GB   | 1732      | 49
TinyLlama V1.1 Math Code           | 2K / 4.4 GB   | 3746      | 11
TinyLlama V1.1 Chinese             | 2K / 4.4 GB   | 1510      | 10
...Llama 1.1B 1.5T OpenOrca Alpha  | 2K / 2.2 GB   | 26        | 4
...inyLlama 1.1B 1T OpenOrca GPTQ  | 2K / 0.8 GB   | 41        | 2
TinyLlama 1.1B 1T OpenOrca AWQ     | 2K / 0.8 GB   | 13        | 2



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124