Cpt St Vicuna V1.3 1.5B Ppl by nota-ai


  Arxiv:2402.02834   Autotrain compatible   Endpoints compatible   Llama   Pytorch   Region:us   Safetensors

Cpt St Vicuna V1.3 1.5B Ppl Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Cpt St Vicuna V1.3 1.5B Ppl Parameters and Internals

Model Type: text generation
Additional Notes: The pruning method identifies unimportant Transformer blocks and removes them in a single one-shot pass.
Training Details:
- Data Sources: SlimPajama-627B
- Methodology: Continued Pretraining (CPT) vs. LoRA-based tuning
- Training Time: varies by model size (6 to 12 days)
- Hardware Used: 8 NVIDIA H100 GPUs
- Model Architecture: depth-pruned Transformer
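The one-shot depth pruning described above can be sketched as follows. This is a toy illustration only: the importance scores and `keep_ratio` threshold are made-up assumptions, not the authors' actual criterion (the paper, arXiv:2402.02834, measures block importance e.g. via its impact on perplexity).

```python
# Toy sketch of one-shot depth (block) pruning: score each Transformer
# block by an importance proxy, then drop the least important blocks in
# a single pass (no iterative prune/retrain loop).

def one_shot_depth_prune(blocks, importance, keep_ratio=0.75):
    """Keep the highest-scoring fraction of blocks, preserving order."""
    n_keep = max(1, int(len(blocks) * keep_ratio))
    # Rank block indices by importance, highest first.
    ranked = sorted(range(len(blocks)), key=lambda i: importance[i], reverse=True)
    kept = sorted(ranked[:n_keep])  # restore original layer order
    return [blocks[i] for i in kept]

# Example: 8 blocks with made-up importance scores.
blocks = [f"block_{i}" for i in range(8)]
importance = [0.9, 0.1, 0.8, 0.2, 0.7, 0.3, 0.95, 0.05]
pruned = one_shot_depth_prune(blocks, importance, keep_ratio=0.5)
print(pruned)  # the four highest-scoring blocks, in original order
```

After pruning, the shallower model is healed with continued pretraining (CPT) on SlimPajama-627B, per the training details above.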
LLM Name: Cpt St Vicuna V1.3 1.5B Ppl
Repository 🤗: https://huggingface.co/nota-ai/cpt_st-vicuna-v1.3-1.5b-ppl
Model Size: 1.5b
Required VRAM: 3 GB
Updated: 2025-06-09
Maintainer: nota-ai
Model Type: llama
Model Files: 3.0 GB
Model Architecture: LlamaForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.31.0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
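The listed 3 GB requirement follows directly from the parameter count and data type: a float16 weight takes 2 bytes, so ~1.5B parameters need roughly 2.8 GiB for weights alone. A back-of-the-envelope estimate (ignoring activations and KV-cache overhead):

```python
# Rough weight-memory estimate: parameters × bytes per parameter.
# float16 stores each weight in 2 bytes.
def weight_memory_gib(n_params, bytes_per_param=2):
    return n_params * bytes_per_param / 1024**3

# ~1.5B parameters in float16 ≈ 2.8 GiB, consistent with the ~3 GB
# file size above (the exact figure depends on the true parameter
# count and file metadata).
print(f"{weight_memory_gib(1.5e9):.2f} GiB")
```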
Cpt St Vicuna V1.3 1.5B Ppl (nota-ai/cpt_st-vicuna-v1.3-1.5b-ppl)
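Since this is a pruned Vicuna v1.3 checkpoint with the standard `LlamaForCausalLM` architecture, it can presumably be loaded via the usual `transformers` API. The prompt template below follows the common Vicuna v1.3 convention ("USER: ... ASSISTANT:"); the exact system prompt is an assumption, so verify it against the model card before relying on it:

```python
# Build a Vicuna-v1.3-style prompt. The template is the common Vicuna
# convention; the default system prompt here is an assumption.
def vicuna_prompt(user_msg, system=("A chat between a curious user and an "
                                    "artificial intelligence assistant.")):
    return f"{system} USER: {user_msg} ASSISTANT:"

prompt = vicuna_prompt("Explain depth pruning in one sentence.")
print(prompt)

# Loading sketch (requires `transformers`, torch, and ~3 GB of download):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("nota-ai/cpt_st-vicuna-v1.3-1.5b-ppl")
# model = AutoModelForCausalLM.from_pretrained(
#     "nota-ai/cpt_st-vicuna-v1.3-1.5b-ppl", torch_dtype="float16")
# out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=64)
# print(tok.decode(out[0], skip_special_tokens=True))
```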

Best Alternatives to Cpt St Vicuna V1.3 1.5B Ppl

Best Alternatives          | Context / RAM | Downloads | Likes
Yi Coder 1.5B Chat         | 128K / 3 GB   | 527       | 36
Yi Coder 1.5B              | 128K / 3 GB   | 477       | 18
CursorCore Yi 1.5B SR      | 128K / 3 GB   | 16        | 0
CursorCore Yi 1.5B         | 128K / 3 GB   | 18        | 1
CursorCore Yi 1.5B LC      | 128K / 3 GB   | 20        | 0
TinyLlama3 EmptyModel      | 32K / 6 GB    | 21        | 0
TinyLlaMa3 DifferentTest   | 32K / 6 GB    | 7         | 0
OpenCoder 1.5B Instruct    | 4K / 3.8 GB   | 1618      | 39
OpenCoder 1.5B Base        | 4K / 3.8 GB   | 920       | 22
MobileLLM 1.5B             | 4K / 3.1 GB   | 68        | 11
Note: a green score (e.g. "73.2") means that the model is better than nota-ai/cpt_st-vicuna-v1.3-1.5b-ppl.

Rank the Cpt St Vicuna V1.3 1.5B Ppl Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124