WhiteRabbitNeo 33B V1 by WhiteRabbitNeo

WhiteRabbitNeo 33B V1 is an open-source large language model by WhiteRabbitNeo, focused on code generation for cybersecurity use cases. Key features: 33B parameters, roughly 67 GB of VRAM required, 16K context window, license: other. Aggregate scores: HF Score 46.9, LLM Explorer Score 0.16; individual benchmark results are listed below.

Tags: Codegen, Deploy:azure, Endpoints compatible, Llama, PyTorch, Region:us, Sharded

WhiteRabbitNeo 33B V1 Benchmarks

ARC: 44.4
HellaSwag: 60.2
MMLU: 40.6
TruthfulQA: 41.7
WinoGrande: 61
GSM8K: 33.7

WhiteRabbitNeo 33B V1 Parameters and Internals

Model Type: cybersecurity, text generation
Use Cases:
  Areas: cybersecurity
  Applications: offensive cybersecurity, defensive cybersecurity
Additional Notes: The model includes a "Prompt Enhancement" feature for improved performance. It is intended for both offensive and defensive cybersecurity work and is being released as a public preview to assess its societal impact. The model uses a cognitive reasoning process it calls "Tree of Thoughts" to provide comprehensive answers.
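As a rough illustration of how the model might be prompted for a defensive-cybersecurity question, the sketch below builds a single-turn prompt string. The SYSTEM/USER/ASSISTANT layout, the system text, and the example question are illustrative assumptions rather than a verified template; the repository's model card documents the exact prompt format.

```python
# Prompt-construction sketch. The template is an assumption for illustration;
# verify the exact format against the model card at
# https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-33B-v1
SYSTEM_PROMPT = (
    "You are WhiteRabbitNeo, a cybersecurity assistant. "
    "Reason step by step and give a complete, actionable answer."
)

def build_prompt(user_message: str) -> str:
    # Single-turn layout; a multi-turn chat would append earlier USER/ASSISTANT
    # pairs before the final "ASSISTANT:" marker.
    return f"SYSTEM: {SYSTEM_PROMPT}\nUSER: {user_message}\nASSISTANT:"

print(build_prompt("How do I harden an internet-facing SSH server?"))
```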
LLM Name: WhiteRabbitNeo 33B V1
Repository: https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-33B-v1
Model Size: 33B
Required VRAM: 67 GB
Updated: 2026-04-01
Maintainer: WhiteRabbitNeo
Model Type: llama
Model Files: 14 shards totaling 67 GB (4.9 GB for 1-of-14, 4.8 GB each for 2-of-14 through 13-of-14, 4.5 GB for 14-of-14)
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|end▁of▁sentence|>
Vocabulary Size: 32256
Torch Data Type: bfloat16
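Given the specifications above (LlamaForCausalLM architecture, bfloat16 weights sharded across 14 files totaling about 67 GB, 16,384-token context), a minimal loading sketch with the Hugging Face transformers library might look like the following. The generation settings and the prompt are illustrative assumptions; device_map="auto" assumes accelerate is installed and enough GPU memory is available, otherwise quantization or CPU offload would be needed.

```python
# Minimal loading sketch based on the spec list above:
# LlamaForCausalLM, bfloat16, 14 weight shards (~67 GB), 16,384-token context.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WhiteRabbitNeo/WhiteRabbitNeo-33B-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the listed Torch data type
    device_map="auto",            # spreads the 14 shards across available GPUs
)

prompt = (
    "SYSTEM: You are a helpful cybersecurity assistant.\n"
    "USER: Explain what a reverse shell is and how defenders detect one.\n"
    "ASSISTANT:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```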

Best Alternatives to WhiteRabbitNeo 33B V1

Best Alternatives                  Context / RAM      Downloads / Likes
ReflectionCoder DS 33B             16K / 67 GB        97424
Deepseek Coder 33B Instruct        16K / 66.5 GB      10247566
Deepseek Wizard 33B Slerp          16K / 35.3 GB      100
ValidateAI 3 33B Ties              16K / 66.5 GB      80
ValidateAI 2 33B AT                16K / 66.5 GB      50
Everyone Coder 33B Base            16K / 66.5 GB      11721
Fortran2Cpp                        16K / 67.3 GB      44
Deepseek Coder 33B Base            16K / 66.5 GB      322675
F2C Translator                     16K / 67.3 GB      01
Llm4decompile 33B                  16K / 66.5 GB      18

Original data from HuggingFace, OpenCompass and various public git repos.