Norocetacean 20B 10K by ddh0


Autotrain compatible | Custom code | Endpoints compatible | Llama | Region: us | Safetensors | Sharded | Tensorflow

Norocetacean 20B 10K Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Norocetacean 20B 10K (ddh0/Norocetacean-20b-10k)

Norocetacean 20B 10K Parameters and Internals

Model Type: text generation
Additional Notes: The context length is extended to 10240 tokens via YaRN, from an original 4096 tokens. The model combines Psyonic-Cetacean-20B with the no_robots-alpaca LoRA.
Input Format: Alpaca prompt format
Performance Tips: For best results, customize the system prompt.
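
The card specifies the Alpaca prompt format and recommends customizing the system prompt. Below is a minimal sketch of assembling such a prompt in Python; the default system text and the sample instruction are illustrative placeholders, not taken from the model card.

```python
# Minimal sketch: single-turn Alpaca-style prompt with a customizable system prompt.
# The default system text and the example instruction are placeholders.
def build_alpaca_prompt(
    instruction: str,
    system: str = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request."
    ),
) -> str:
    """Assemble an Alpaca-format prompt string."""
    return f"{system}\n\n### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_alpaca_prompt("Summarize the plot of Moby-Dick in two sentences.")
print(prompt)
```
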
LLM Name: Norocetacean 20B 10K
Repository: https://huggingface.co/ddh0/Norocetacean-20b-10k
Model Size: 20b
Required VRAM: 39.9 GB
Updated: 2025-08-22
Maintainer: ddh0
Model Type: llama
Model Files: 9.9 GB (1-of-5), 9.9 GB (2-of-5), 9.9 GB (3-of-5), 9.9 GB (4-of-5), 0.3 GB (5-of-5)
Model Architecture: LlamaForCausalLM
License: other
Context Length: 10240
Model Max Length: 10240
Transformers Version: 4.36.1
Tokenizer Class: LlamaTokenizer
Padding Token: [PAD]
Vocabulary Size: 32000
Torch Data Type: float16
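
Given the specs above (LlamaForCausalLM, float16 weights, a 10240-token context, roughly 40 GB of VRAM), the following is a minimal loading sketch with Hugging Face transformers. The generation settings are illustrative, and trust_remote_code=True is an assumption based on the "Custom code" tag rather than something the card states.

```python
# Minimal sketch of loading ddh0/Norocetacean-20b-10k with transformers.
# Assumes enough GPU memory for float16 weights (~40 GB per the card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "ddh0/Norocetacean-20b-10k"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,   # matches the listed Torch Data Type
    device_map="auto",           # requires the accelerate package
    trust_remote_code=True,      # assumption, based on the "Custom code" tag
)

# Illustrative Alpaca-style prompt (see the format sketch above).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nName three uses of a 10K-token context window.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```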

Quantized Models of the Norocetacean 20B 10K

Model | Likes | Downloads | VRAM
Norocetacean 20B 10K GGUF | 6 | 136 | 8 GB
Norocetacean 20B 10K GPTQ | 2 | 6 | 10 GB
Norocetacean 20B 10K AWQ | 2 | 6 | 10 GB
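
For machines that cannot fit the float16 weights, the quantized releases above are the usual route. Below is a minimal sketch of running a GGUF file with llama-cpp-python; the file name, quantization level, and GPU-offload setting are assumptions, so substitute whatever file you actually download from the GGUF repository.

```python
# Minimal sketch: run a GGUF quantization of Norocetacean 20B 10K locally.
# The model_path is a placeholder; point it at a downloaded GGUF file.
from llama_cpp import Llama

llm = Llama(
    model_path="./norocetacean-20b-10k.Q4_K_M.gguf",  # placeholder file name
    n_ctx=10240,      # match the model's extended context length
    n_gpu_layers=-1,  # offload all layers to GPU if memory allows
)

out = llm(
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nExplain what YaRN context extension does.\n\n"
    "### Response:\n",
    max_tokens=200,
)
print(out["choices"][0]["text"])
```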

Best Alternatives to Norocetacean 20B 10K

Best Alternatives | Context / RAM | Downloads | Likes
Internlm2 5 20B Llamafied | 256K / 39.9 GB | 1009 | 5
Internlm2 20B Llama | 32K / 39.6 GB | 1659 | 20
Stellaris Internlm2 20B R512 | 32K / 39.8 GB | 5 | 3
Internlm2 Chat 20B Llama Old | 32K / 39.6 GB | 9 | 3
Internlm2 Base 20B Llama | 32K / 39.6 GB | 5 | 3
Internlm2 Base 20B Llama | 32K / 39.6 GB | 7 | 0
Deita 20B | 32K / 39.8 GB | 5 | 1
Bagel 20B V04 Llama | 32K / 39.6 GB | 19 | 7
Bagel DPO 20B V04 Llama | 32K / 39.6 GB | 14 | 3
Internlm2 Limarp Chat 20B | 32K / 39.6 GB | 6 | 3

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124