Phi 2 GPTQ by TheBloke


Phi 2 GPTQ is an open-source language model quantized by TheBloke. Features: 2.8B-parameter LLM, required VRAM: 1.8 GB, license: other, quantized, LLM Explorer score: 0.12.

4-bit · Base model: microsoft/phi-2 · Base model (quantized): microsoft... · Code · Custom code · En · GPTQ · Phi-msft · Quantized · Region: US · Safetensors
Model Card on HF 🤗: https://huggingface.co/TheBloke/phi-2-GPTQ
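The listed 1.8 GB VRAM figure is consistent with packing 2.8B parameters at 4 bits plus quantization overhead. A back-of-envelope check (the overhead term below is an illustrative assumption, not a measured value):

```python
# Rough VRAM estimate for a 4-bit GPTQ quantization of a 2.8B model.
params = 2.8e9              # parameter count listed on this page
bits_per_weight = 4         # GPTQ 4-bit

weight_bytes = params * bits_per_weight / 8     # packed weights: ~1.4e9 bytes
overhead_bytes = 0.3e9      # assumed: group scales/zeros, norms, embeddings

total_gb = (weight_bytes + overhead_bytes) / 1e9
print(f"~{total_gb:.1f} GB")
```

The result lands near the listed 1.8 GB; actual usage also depends on context length and the KV cache, which this sketch ignores.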

Phi 2 GPTQ Benchmarks

[Benchmark chart: Phi 2 GPTQ (TheBloke/phi-2-GPTQ) compared against the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4"); the percentage scores were not captured in this extract.]

Phi 2 GPTQ Parameters and Internals

Model Type 
text-generation, nlp, code
Use Cases 
Areas:
research purposes
Applications:
QA format, chat format, code format
Primary Use Cases:
research, exploration of safety challenges
Limitations:
potential for generating inaccurate code and facts
code scope limited mostly to common packages
unreliable adherence to instructions
language understanding limited mostly to standard English
potential societal biases
Additional Notes 
Intended for research purposes. Not tested for production-level applications.
Supported Languages 
en (standard English)
Training Details 
Data Sources:
AOAI GPT-3.5, Falcon RefinedWeb, SlimPajama
Data Volume:
1.4T tokens
Methodology:
next-word prediction
Context Length:
2048
Training Time:
14 days
Hardware Used:
96xA100-80G
Model Architecture:
Transformer-based model
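The training figures above (1.4T tokens, 14 days, 96×A100-80G) can be sanity-checked with the standard FLOPs ≈ 6·N·D estimate for transformer training; the A100 peak throughput used below is an assumed reference figure:

```python
# Sanity check of the training figures using FLOPs ≈ 6 * params * tokens.
params = 2.8e9                       # 2.8B parameters
tokens = 1.4e12                      # 1.4T training tokens
flops = 6 * params * tokens          # ~2.35e22 total training FLOPs

gpus = 96                            # 96 x A100-80G
seconds = 14 * 24 * 3600             # 14 days of training
per_gpu = flops / (gpus * seconds)   # achieved FLOP/s per GPU

a100_peak = 312e12                   # assumed A100 BF16 peak throughput
utilization = per_gpu / a100_peak
print(f"implied utilization: {utilization:.0%}")
```

The implied utilization comes out in the plausible range for large-scale training, so the listed token count, duration, and hardware are mutually consistent.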
Input Output 
Accepted Modalities:
text
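The QA, chat, and code formats listed under Use Cases correspond to the prompt styles described on the Phi-2 model card; a minimal sketch of building each (the helper names are illustrative, not part of any library):

```python
# Minimal prompt builders for Phi-2's three prompt styles; the helper
# names are illustrative and not part of any library.
def qa_prompt(question: str) -> str:
    # "Instruct/Output" QA format described on the Phi-2 model card
    return f"Instruct: {question}\nOutput:"

def chat_prompt(turns, next_speaker: str) -> str:
    # Plain "Speaker: text" chat transcript format
    lines = [f"{name}: {text}" for name, text in turns]
    return "\n".join(lines) + f"\n{next_speaker}:"

def code_prompt(signature: str, docstring: str) -> str:
    # Code format: a partial function the model is asked to complete
    return f'{signature}\n    """{docstring}"""\n'

print(qa_prompt("Why is the sky blue?"))
```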
LLM Name: Phi 2 GPTQ
Repository 🤗: https://huggingface.co/TheBloke/phi-2-GPTQ
Model Name: Phi 2
Model Creator: Microsoft
Base Model(s): Phi 2 (microsoft/phi-2)
Model Size: 2.8b
Required VRAM: 1.8 GB
Updated: 2026-03-31
Maintainer: TheBloke
Model Type: phi-msft
Model Files: 1.8 GB
Supported Languages: en
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: PhiForCausalLM
License: other
Model Max Length: 2048
Transformers Version: 4.37.0.dev0
Tokenizer Class: CodeGenTokenizer
Vocabulary Size: 51200
Torch Data Type: float16
Activation Function: gelu_new
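Given the details above (PhiForCausalLM, custom code, float16, GPTQ), the repository can typically be loaded with Hugging Face Transformers plus a GPTQ backend. A sketch under those assumptions, not a tested recipe for this exact revision:

```python
# Sketch: loading TheBloke/phi-2-GPTQ with Hugging Face Transformers.
# Assumes a GPTQ backend (e.g. the auto-gptq package) is installed and a
# CUDA GPU is available; untested here.
MODEL_ID = "TheBloke/phi-2-GPTQ"

def load():
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)   # CodeGenTokenizer
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",        # place the 4-bit weights on the GPU
        trust_remote_code=True,   # repo ships custom phi-msft model code
    )
    return tokenizer, model

# Usage (downloads ~1.8 GB of weights):
#   tokenizer, model = load()
#   ids = tokenizer("Instruct: Write a haiku.\nOutput:", return_tensors="pt")
#   out = model.generate(**ids.to(model.device), max_new_tokens=64)
#   print(tokenizer.decode(out[0], skip_special_tokens=True))
```

`trust_remote_code=True` is needed because the phi-msft model type predates native Phi support in Transformers and executes code shipped with the repository.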

Best Alternatives to Phi 2 GPTQ

Best Alternatives                     Context / RAM     Downloads / Likes
Phi 2 Quantize Gptq                   2K / 1.8 GB       72
... 2 Electrical Engineering GPTQ     2K / 1.8 GB       77
Phi2 OSST GPTQ                        2K / 1.8 GB       31
Phi 2 DPO GPTQ                        0K / 1.8 GB       145
Bnb DPO 8bit                          2K / 3 GB         70
Bnb DPO 8bit                          2K / 3 GB         70
Phi 2 4bit 64rank                     2K / 5.6 GB       230
Phi 2 Nf4 Fp16 Upscaled               2K / 5.6 GB       90
MFANN3bv0.24                          128K / 11.1 GB    50
MFANN3b                               128K / 11.1 GB    240
Note: green Score (e.g. "73.2") means that the model is better than TheBloke/phi-2-GPTQ.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a