Everyone Coder 33B Base by rombodawg


Tags: Autotrain compatible, Codegen, Conversational, Endpoints compatible, Llama, Merge, Model-index, Region:us, Safetensors, Sharded, Tensorflow


Everyone Coder 33B Base Parameters and Internals

Model Type: text-generation
Use Cases
Areas: coding
Additional Notes
The model has trouble emitting its end token reliably, which prompted the custom prompt template with an explicit '<|EOT|>' terminator.
Input Output
Input Format: custom prompt template, with each turn terminated by the '<|EOT|>' end token.
Accepted Modalities: text
Output Format: text, terminated by the '<|EOT|>' end token.
Performance Tips: set '<|EOT|>' as the stop string in your text-generation interface (see the sketch below).
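
A minimal generation sketch using Hugging Face transformers, assuming a plain coding prompt is acceptable input (the exact prompt template is not reproduced on this page) and registering '<|EOT|>' as the end-of-generation token:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "rombodawg/Everyone-Coder-33b-Base"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype="auto",  # float16 weights, ~66.5 GB across 7 shards
        device_map="auto",
    )

    prompt = "Write a Python function that reverses a string."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Stop generation at the model's end token, '<|EOT|>'.
    eot_id = tokenizer.convert_tokens_to_ids("<|EOT|>")
    output = model.generate(**inputs, max_new_tokens=256, eos_token_id=eot_id)
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))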
LLM Name: Everyone Coder 33B Base
Repository 🤗: https://huggingface.co/rombodawg/Everyone-Coder-33b-Base
Model Size: 33b
Required VRAM: 66.5 GB
Updated: 2025-06-09
Maintainer: rombodawg
Model Type: llama
Model Files: 9.7 GB (1-of-7), 9.8 GB (2-of-7), 9.8 GB (3-of-7), 9.8 GB (4-of-7), 9.9 GB (5-of-7), 9.9 GB (6-of-7), 7.6 GB (7-of-7)
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|EOT|>
Vocabulary Size: 32256
Torch Data Type: float16
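
The architecture and tokenizer details listed above can be checked against the repository without downloading the weights; a small sketch (the values in the comments restate the figures on this page):

    from transformers import AutoConfig, AutoTokenizer

    repo = "rombodawg/Everyone-Coder-33b-Base"
    cfg = AutoConfig.from_pretrained(repo)
    tok = AutoTokenizer.from_pretrained(repo)

    print(cfg.architectures)            # ['LlamaForCausalLM']
    print(cfg.vocab_size)               # 32256
    print(cfg.max_position_embeddings)  # 16384
    print(cfg.torch_dtype)              # torch.float16
    print(tok.bos_token, tok.eos_token) # <|begin▁of▁sentence|> <|EOT|>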

Quantized Models of Everyone Coder 33B Base

Model | Likes | Downloads | VRAM
Everyone Coder 33B Base AWQ | 2 | 415 | 18 GB
Everyone Coder 33B Base GGUF | 12 | 318 | 12 GB
Everyone Coder 33B Base GPTQ | 3 | 62 | 17 GB
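
For a single consumer GPU, one of the quantized variants above is the practical choice. A sketch with llama-cpp-python and a GGUF file (the filename below is hypothetical; pick whichever quantization level fits your VRAM budget):

    from llama_cpp import Llama

    # Hypothetical local path to a downloaded GGUF quant of this model.
    llm = Llama(
        model_path="everyone-coder-33b-base.Q4_K_M.gguf",
        n_ctx=16384,      # matches the model's 16K context length
        n_gpu_layers=-1,  # offload all layers to the GPU if they fit
    )

    out = llm(
        "Write a Python function that reverses a string.",
        max_tokens=256,
        stop=["<|EOT|>"],  # the model's end token, used as the stop string
    )
    print(out["choices"][0]["text"])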

Best Alternatives to Everyone Coder 33B Base

Best Alternatives | Context / RAM | Downloads | Likes
ReflectionCoder DS 33B | 16K / 67 GB | 6270 | 4
Deepseek Wizard 33B Slerp | 16K / 35.3 GB | 12 | 0
Deepseek Coder 33B Instruct | 16K / 66.5 GB | 5920 | 521
ValidateAI 3 33B Ties | 16K / 66.5 GB | 13 | 0
ValidateAI 2 33B AT | 16K / 66.5 GB | 15 | 0
WhiteRabbitNeo 33B V1 | 16K / 67 GB | 539 | 87
Fortran2Cpp | 16K / 67.3 GB | 24 | 4
F2C Translator | 16K / 67.3 GB | 11 | 1
Llm4decompile 33B | 16K / 66.5 GB | 18 | 8
Deepseek Coder 33B Base | 16K / 66.5 GB | 1633 | 71


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124