SynthIA 70B V1.5 GPTQ by TheBloke


SynthIA 70B V1.5 GPTQ is an open-source language model by TheBloke. Features: 70b LLM, VRAM: 35.3GB, Context: 2K, License: llama2, Quantized, LLM Explorer Score: 0.1.
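A GPTQ checkpoint like this one is normally loaded through the Hugging Face transformers library (with a GPTQ-capable backend such as AutoGPTQ installed). The sketch below is a minimal, hedged example assuming the repository id shown on this page; the heavy model download is kept behind the main guard. Since the page lists the input format as a bare "{prompt}" template, the prompt-builder simply passes text through unchanged.

```python
# Hypothetical loading sketch for TheBloke/SynthIA-70B-v1.5-GPTQ.
# Assumes transformers plus a GPTQ backend (e.g. auto-gptq) are installed
# and that ~35.3 GB of VRAM is available, per the listing above.
MODEL_ID = "TheBloke/SynthIA-70B-v1.5-GPTQ"

def build_prompt(user_text: str) -> str:
    # The page documents the input format as "{prompt}", i.e. the raw
    # text is passed to the model without any chat template.
    return user_text

if __name__ == "__main__":
    # Heavy imports and the multi-GB download stay out of module import.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(
        build_prompt("Explain GPTQ quantization in one sentence."),
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The `device_map="auto"` argument lets accelerate shard the weights across available GPUs, which matters for a 70B model even at 4-bit.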

Tags: 4-bit · Base model: migtissera/synthia-... · Base model (quantized): migtisser... · GPTQ · Llama · Quantized · Region: us · Safetensors

SynthIA 70B V1.5 GPTQ Benchmarks

Benchmark scores are shown as percentages relative to reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
SynthIA 70B V1.5 GPTQ (TheBloke/SynthIA-70B-v1.5-GPTQ)

SynthIA 70B V1.5 GPTQ Parameters and Internals

Model Type: llama
Input Format: {prompt}
LLM Name: SynthIA 70B V1.5 GPTQ
Repository: https://huggingface.co/TheBloke/SynthIA-70B-v1.5-GPTQ
Model Name: Synthia 70B v1.5
Model Creator: Migel Tissera
Base Model(s): migtissera/SynthIA-70B-v1.5
Model Size: 70b
Required VRAM: 35.3 GB
Updated: 2026-03-08
Maintainer: TheBloke
Model Files: 35.3 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.34.0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
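The listed 35.3 GB VRAM figure can be sanity-checked with back-of-the-envelope arithmetic: 70 billion parameters at roughly 4 bits per weight is about 35 GB, and GPTQ adds a small overhead for per-group scales and zero-points, which accounts for the slightly larger real size. A minimal sketch of that estimate:

```python
# Rough size estimate for a quantized checkpoint: parameters times
# bits-per-weight, converted to bytes, then to decimal gigabytes.
# GPTQ stores extra scales/zero-points per quantization group, so the
# actual file (35.3 GB here) is a little above this bare estimate.
def gptq_size_gb(n_params: float, bits_per_weight: float = 4.0) -> float:
    return n_params * bits_per_weight / 8 / 1e9

print(round(gptq_size_gb(70e9), 1))  # -> 35.0, close to the listed 35.3 GB
```

The same formula explains why the fp16 original needs roughly four times as much memory (70e9 × 2 bytes ≈ 140 GB).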

Best Alternatives to SynthIA 70B V1.5 GPTQ

Best Alternatives | Context / RAM | Downloads / Likes
...B Instruct AutoRound GPTQ 4bit | 128K / 39.9 GB | 3246
...B Instruct AutoRound GPTQ 4bit | 128K / 39.9 GB | 250
...ama 3.1 70B Instruct Gptq 4bit | 128K / 39.9 GB | 244
Opus V1.2 70B Marlin | 32K / 36.4 GB | 50
MoMo 70B Lora 1.8.6 DPO GPTQ | 32K / 41.3 GB | 111
MoMo 70B Lora 1.8.4 DPO GPTQ | 32K / 41.3 GB | 21
Tess 70B V1.6 Marlin | 31K / 36.3 GB | 71
Miqu 1 70B Sf GPTQ | 31K / 36.7 GB | 46510
Midnight Miqu 70B V1.5 GPTQ32G | 31K / 40.7 GB | 274
...Midnight Miqu 70B V1.0 GPTQ32G | 31K / 40.7 GB | 92
Note: green Score (e.g. "73.2") means that the model is better than TheBloke/SynthIA-70B-v1.5-GPTQ.

Rank the SynthIA 70B V1.5 GPTQ Capabilities

Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a