DiscoLM German 7B V1 by DiscoResearch



DiscoLM German 7b V1 Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
DiscoLM German 7B V1 (DiscoResearch/DiscoLM_German_7b_v1)

DiscoLM German 7B V1 Parameters and Internals

Model Type 
ChatML, DPO, German-focused
Use Cases 
Areas:
Research, Commercial applications
Applications:
Chatbot, Translation, Text generation
Primary Use Cases:
German-focused applications
Limitations:
Complex reasoning, Math, Coding tasks
Considerations:
Model can produce factually incorrect and offensive outputs.
Additional Notes 
Optimized for German text with proficiency in English.
Supported Languages 
de (Proficient), en (Fluent)
Training Details 
Data Sources:
Public datasets, multi-turn chats, retrieval instructions, and synthetically generated instructions
Methodology:
SFT finetuning and DPO reinforcement learning
Model Architecture:
Mistral-based
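The SFT-then-DPO methodology above first fine-tunes on instruction data, then optimizes directly on preference pairs. As an illustration of the second stage, here is a minimal sketch of the DPO loss for a single chosen/rejected pair; this is a generic textbook formulation, not DiscoResearch's training code, and the argument names are hypothetical:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Each argument is the summed log-probability of a completion under
    the trainable policy or the frozen reference (SFT) model. beta
    controls how far the policy may drift from the reference.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(logits)), written stably as log1p(exp(-logits))
    return math.log1p(math.exp(-logits))
```

When the policy matches the reference the loss sits at log(2); it falls as the policy assigns relatively more probability to the chosen completion than to the rejected one.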
Responsible Ai Considerations 
Mitigation Strategies:
Implement a safety/moderation layer before deployment.
Input Output 
Input Format:
ChatML
Accepted Modalities:
Text
Output Format:
Varied, includes structured outputs and retrieval format.
Performance Tips:
Use special retrieval format to improve steerability and reduce hallucinations.
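Since the input format is ChatML, prompts must wrap each turn in the `<|im_start|>role … <|im_end|>` markers that ChatML defines. A minimal prompt builder, shown as a sketch (in practice `tokenizer.apply_chat_template` from `transformers` produces the same layout from the model's bundled template):

```python
def build_chatml_prompt(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts as a ChatML string.

    Appending an open assistant turn at the end cues the model
    to generate the next reply.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    if add_generation_prompt:
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

# German-focused usage example (contents are illustrative)
prompt = build_chatml_prompt([
    {"role": "system", "content": "Du bist ein hilfreicher Assistent."},
    {"role": "user", "content": "Was ist die Hauptstadt von Deutschland?"},
])
```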
LLM Name: DiscoLM German 7b V1
Repository: 🤗 https://huggingface.co/DiscoResearch/DiscoLM_German_7b_v1
Base Model(s): Leo Mistral Hessianai 7B (LeoLM/leo-mistral-hessianai-7b)
Model Size: 7b
Required VRAM: 14.4 GB
Updated: 2025-09-22
Maintainer: DiscoResearch
Model Type: mistral
Model Files: 4.9 GB (1-of-3), 5.0 GB (2-of-3), 4.5 GB (3-of-3)
Supported Languages: de, en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32002
Torch Data Type: bfloat16
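The shard sizes, dtype, and VRAM figures above are mutually consistent, which a quick arithmetic check makes explicit (plain Python, no model download):

```python
# Sizes of the three safetensors shards listed above, in GB.
shards_gb = [4.9, 5.0, 4.5]          # 1-of-3, 2-of-3, 3-of-3
total_gb = sum(shards_gb)            # matches "Required VRAM: 14.4 GB"

# bfloat16 stores 2 bytes per parameter, so 14.4 GB of weights
# corresponds to roughly 7.2B parameters -- consistent with a
# "7b"-class Mistral model.
approx_params_billion = total_gb / 2
```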

Quantized Models of the DiscoLM German 7B V1

Model                      | Likes | Downloads | VRAM
DiscoLM German 7b V1 GGUF  |    31 |      1064 | 2 GB
DiscoLM German 7b V1 AWQ   |     4 |        43 | 4 GB
DiscoLM German 7b V1 GPTQ  |     2 |        18 | 4 GB
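The VRAM column follows a simple rule of thumb: weight memory is roughly parameters × bits-per-weight / 8, plus some runtime overhead. A sketch of that estimate (the 10% default overhead factor is my assumption for KV cache and higher-precision layers, not a published figure):

```python
def quant_vram_gb(n_params_billion, bits_per_weight, overhead=1.1):
    """Rough VRAM estimate for a quantized model's weights.

    Weights take n_params * bits/8 bytes; 'overhead' is an assumed
    fudge factor for embeddings kept at higher precision, the KV
    cache, and runtime buffers.
    """
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9
```

A 7B model at 4 bits (AWQ/GPTQ) lands near the ~4 GB shown above, while 16-bit bfloat16 reproduces the ~14.4 GB full-precision figure from the details table.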

Best Alternatives to DiscoLM German 7B V1

Best Alternatives                 | Context | RAM     | Downloads | Likes
...Nemo Instruct 2407 Abliterated | 1000K   | 24.5 GB |       145 |    18
MegaBeam Mistral 7B 512K          | 512K    | 14.4 GB |      8904 |    50
SpydazWeb AI HumanAI RP           | 512K    | 14.4 GB |        16 |     1
SpydazWeb AI HumanAI 002          | 512K    | 14.4 GB |        18 |     1
...daz Web AI ChatML 512K Project | 512K    | 14.5 GB |        12 |     0
MegaBeam Mistral 7B 300K          | 282K    | 14.4 GB |      3779 |    16
MegaBeam Mistral 7B 300K          | 282K    | 14.4 GB |      8094 |    16
Hebrew Mistral 7B 200K            | 256K    | 30 GB   |      1314 |    15
Astral 256K 7B V2                 | 250K    | 14.4 GB |         5 |     0
Astral 256K 7B                    | 250K    | 14.4 GB |         5 |     0
Note: a green score (e.g. "73.2") means that the model outperforms DiscoResearch/DiscoLM_German_7b_v1.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124