Norocetacean 20B 10K GPTQ by TheBloke


Tags: 4-bit · Autotrain compatible · Base model: ddh0/norocetacean-2... · Base model: quantized: ddh0/noro... · Custom code · GPTQ · Llama · Quantized · Region: us · Safetensors

Norocetacean 20B 10K GPTQ Benchmarks

Norocetacean 20B 10K GPTQ (TheBloke/Norocetacean-20B-10k-GPTQ)

Norocetacean 20B 10K GPTQ Parameters and Internals

Model Type: llama

Additional Notes: This model is a merge of Psyonic-Cetacean-20B with a no_robots-alpaca LoRA, extended to a 10240-token context length using YaRN. The overall goal was to blend unique aspects of the two underlying datasets while enhancing capability for long-context interactions.
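As a back-of-the-envelope check on the context extension, YaRN scales the rotary embeddings by roughly the ratio of the target window to the original one. A minimal sketch, assuming the standard Llama-2 window of 4096 tokens for the base model (the card does not state it):

```python
# Rough check of the YaRN context extension described above.
# ORIGINAL_CONTEXT is an assumption (standard Llama-2 window),
# EXTENDED_CONTEXT is the value stated on this card.

ORIGINAL_CONTEXT = 4096    # assumed base Llama-2 window
EXTENDED_CONTEXT = 10240   # context length from the card

# YaRN's rope scaling factor is target / original.
yarn_factor = EXTENDED_CONTEXT / ORIGINAL_CONTEXT
print(yarn_factor)  # → 2.5
```

A factor of 2.5 is a modest extension, which is consistent with the card advertising 10K rather than a much longer window.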
Input Output

Input Format:

    Below is an instruction that describes a task. Write a response that appropriately completes the request.

    ### Instruction:
    {prompt}

    ### Response:
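The template above is the standard Alpaca format. A small helper that fills it in; the template text comes from the card, while the function name is ours:

```python
# Fill the Alpaca-style prompt template used by this model.
# PROMPT_TEMPLATE is copied from the card; build_prompt is illustrative.

PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Return the full prompt string for a single instruction."""
    return PROMPT_TEMPLATE.format(prompt=instruction)

print(build_prompt("Summarize the plot of Moby-Dick in one sentence."))
```

The model's reply is expected to follow the final `### Response:` marker, so generation output can be split on that marker when post-processing.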
LLM Name: Norocetacean 20B 10K GPTQ
Repository: 🤗 https://huggingface.co/TheBloke/Norocetacean-20B-10k-GPTQ
Model Name: Norocetacean 20B 10K
Model Creator: ddh0
Base Model(s): Norocetacean 20B 10K (ddh0/Norocetacean-20b-10k)
Model Size: 20b
Required VRAM: 10.5 GB
Updated: 2025-08-22
Maintainer: TheBloke
Model Type: llama
Model Files: 10.5 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: LlamaForCausalLM
License: other
Context Length: 10240
Model Max Length: 10240
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: [PAD]
Vocabulary Size: 32000
Torch Data Type: float16
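Given the details above, the quantized weights can be loaded with Hugging Face Transformers. A minimal sketch, assuming `transformers` >= 4.37 plus the `optimum` and `auto-gptq` packages are installed; the model ID and constants come from the card, everything else is illustrative:

```python
# Sketch: loading the GPTQ weights with Hugging Face Transformers.
# MODEL_ID, MAX_CONTEXT, and REQUIRED_VRAM_GB are taken from the card.

MODEL_ID = "TheBloke/Norocetacean-20B-10k-GPTQ"
MAX_CONTEXT = 10240        # model max length from the card
REQUIRED_VRAM_GB = 10.5    # approximate, per the card

def load_model():
    # Imports live inside the function so the sketch can be read
    # (and the constants reused) without the heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" places the ~10.5 GB of GPTQ weights on available GPUs.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    return tokenizer, model
```

Downloading and loading the weights requires roughly 10.5 GB of VRAM, so a single 12 GB-class GPU should suffice for inference at modest context lengths.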

Best Alternatives to Norocetacean 20B 10K GPTQ

Best Alternatives              Context / RAM     Downloads   Likes
DaringMaid 20B GPTQ            4K / 10.5 GB      24          5
Rose Kimiko 20B GPTQ           4K / 10.5 GB      0           1
Nethena 20B Glued GPTQ         4K / 10.5 GB      6           1
Iambe Storyteller 20B GPTQ     4K / 10.5 GB      15          2
Rose 20B GPTQ                  4K / 10.5 GB      75          3
Iambe 20B DARE GPTQ            4K / 10.5 GB      6           1
Noromaid 20B V0.1.1 GPTQ       4K / 10.5 GB      8           8
Nethena 20B GPTQ               4K / 10.5 GB      18          7
MXLewd L2 20B GPTQ             4K / 10.5 GB      11          9
InternLM2 Chat 20B ToxicRP     32K / 39.9 GB     5           0



Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v20241124