Phind CodeLlama 34B V1 by Phind


Tags: Autotrain compatible, Code llama, Codegen, Endpoints compatible, Llama, Model-index, Pytorch, Region: us, Sharded

Phind CodeLlama 34B V1 Benchmarks

Phind CodeLlama 34B V1 (Phind/Phind-CodeLlama-34B-v1)

Phind CodeLlama 34B V1 Parameters and Internals

Model Type 
text-generation
Additional Notes 
For better performance, Phind-CodeLlama-34B-v2 is recommended instead; it achieves 73.8% pass@1 on HumanEval.
Training Details 
Data Sources:
Phind's proprietary dataset
Data Volume:
~80k high-quality programming problems and solutions
Methodology:
Native fine-tune (no LoRA) on instruction-answer pairs that are structurally different from HumanEval.
Context Length:
4096
Training Time:
90 GPU-hours
Hardware Used:
32x A100-80GB GPUs
Input Output 
Input Format:
Instruction prompts with '\n' separator
Accepted Modalities:
text
Output Format:
Generated text response
Performance Tips:
Do not use the Llama chat markup; simply state the task and append '\n: ' at the end (see the sketch below).
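
The prompt format above maps directly onto a standard transformers call. The following is a minimal sketch, assuming the transformers and torch packages are installed, enough GPU memory is available for the full-precision weights, and a made-up example task; the only formatting applied is the plain task text plus the '\n: ' suffix.

```python
# Minimal prompting sketch (assumptions: transformers + torch + accelerate
# installed, sufficient GPU memory; the task string is a made-up example).
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Phind/Phind-CodeLlama-34B-v1",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# No Llama chat markup: state the task and terminate it with "\n: ".
prompt = "Write a Python function that reverses a singly linked list.\n: "

result = generator(prompt, max_new_tokens=256, do_sample=False, return_full_text=False)
print(result[0]["generated_text"])
```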
LLM Name: Phind CodeLlama 34B V1
Repository: https://huggingface.co/Phind/Phind-CodeLlama-34B-v1
Model Size: 34b
Required VRAM: 67.5 GB
Updated: 2025-09-10
Maintainer: Phind
Model Type: llama
Model Files: 9.8 GB (1-of-7), 9.7 GB (2-of-7), 9.7 GB (3-of-7), 9.7 GB (4-of-7), 9.7 GB (5-of-7), 9.7 GB (6-of-7), 9.2 GB (7-of-7)
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.33.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: bfloat16
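
The specification above (seven shards totalling roughly 67.5 GB, LlamaForCausalLM architecture, LlamaTokenizer, bfloat16 weights, 16384-token max length) corresponds to a standard Hugging Face load. A minimal sketch, assuming enough GPU memory for the unquantized checkpoint and the accelerate package for device mapping:

```python
# Loading sketch for the full bfloat16 checkpoint (~67.5 GB across 7 shards).
# Assumes transformers >= 4.33 and accelerate are installed; adjust device_map
# to your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Phind/Phind-CodeLlama-34B-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer, 32000-token vocab
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the listed torch data type
    device_map="auto",            # spreads the shards across available GPUs
)

print(model.config.max_position_embeddings)  # 16384, per the table above
```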

Quantized Models of the Phind CodeLlama 34B V1

Model                           Likes   Downloads   VRAM
Phind CodeLlama 34B V1 GGUF     15      385         14 GB
Phind CodeLlama 34B V1 AWQ      1       12          18 GB
Phind CodeLlama 34B V1 GPTQ     11      8           17 GB
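
For hardware that cannot hold the ~67.5 GB bfloat16 checkpoint, the quantized variants above are the usual route. A minimal sketch for the GGUF variant using llama-cpp-python; the repository id and file pattern are assumptions and should be replaced with the actual entries from the quantized repo's file list:

```python
# Sketch: running a GGUF quantization with llama-cpp-python instead of the
# full checkpoint. Repo id and file pattern below are assumptions; take the
# real ones from the quantized repository's file list.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="TheBloke/Phind-CodeLlama-34B-v1-GGUF",  # assumed quantizer repo
    filename="*Q4_K_M.gguf",                         # assumed quant level (~14 GB class)
    n_ctx=4096,
)

out = llm("Write a Python function that computes Fibonacci numbers.\n: ", max_tokens=256)
print(out["choices"][0]["text"])
```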

Best Alternatives to Phind CodeLlama 34B V1

Best Alternatives                        Context / RAM     Downloads   Likes
...gpt 32K Codellama 34B Instruct        32K / 67.5 GB     107         2
CodeLlama 34B Instruct Hf                16K / 67.5 GB     9466        297
ReflectionCoder CL 34B                   16K / 67.6 GB     9736        0
CodeLlama 34B Hf                         16K / 67.5 GB     162800      173
Phind CodeLlama 34B V2                   16K / 67.5 GB     3337        832
Speechless Codellama 34B V2.0            16K / 67.5 GB     1706        17
Phind CodeLlama 34B Python V1            16K / 67.5 GB     1831        252
CodeLlama 34B Python Hf                  16K / 67.5 GB     1886        98
CodeLlama 34B Hf                         16K / 67.5 GB     2462        2
Tora Code 34B V1.0                       16K / 67.5 GB     1717        14


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124