NorMistral-7b-warm is an open-source language model by norallm. Features: 7B parameters, VRAM: ~14.5 GB (fp16), Context: 2K tokens, License: apache-2.0, quantized variants available, LLM Explorer Score: 0.15.
The model is intended primarily for research purposes and is pretrained on open Norwegian data. As a base model, it can generate harmful completions if prompted inappropriately.
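The listed ~14.5 GB VRAM figure can be sanity-checked from first principles: a 7B-class model stored in fp16/bf16 needs 2 bytes per parameter for the weights alone. A minimal sketch, assuming roughly 7.24e9 parameters (the Mistral-7B scale this model is warm-started from):

```python
# Rough estimate of weight memory for a 7B-class model in half precision.
PARAMS = 7.24e9          # assumed parameter count (Mistral-7B scale)
BYTES_PER_PARAM = 2      # fp16 / bf16: 2 bytes per parameter

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"fp16 weights: ~{weights_gb:.1f} GB")  # → fp16 weights: ~14.5 GB
```

Actual usage is higher once activations and the KV cache are added, which is why quantized variants exist for smaller GPUs.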
Supported Languages
no (Norwegian, high), nb (Bokmål, high), nn (Nynorsk, high)
Training Details
Data Sources:
Norwegian Colossal Corpus (NCC), HPLT corpus, CulturaX, Starcoder
Data Volume:
260 billion tokens
Methodology:
Warm-started from the weights of Mistral-7b-v0.1, then continued pretraining on Norwegian data.
Context Length:
2,000 tokens
Training Time:
December 2023 to January 2024
Hardware Used:
LUMI cluster in Finland
Model Architecture:
Optimized decoder-only transformer based on the Mistral/Llama architecture.
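Because the context length above caps prompt plus completion combined, callers need to budget generation length against the prompt size. A minimal sketch; the constant and helper name are mine, and the exact limit should be verified against the tokenizer/model config:

```python
CONTEXT_LIMIT = 2000  # listed context length; confirm against the model config

def max_new_tokens(prompt_tokens: int, limit: int = CONTEXT_LIMIT) -> int:
    """Largest completion length that still fits in the context window."""
    # Prompt and completion share the same window; never go negative.
    return max(limit - prompt_tokens, 0)

print(max_new_tokens(1500))  # → 500
print(max_new_tokens(2300))  # → 0 (prompt alone already overflows)
```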
Input Output
Input Format:
Textual input
Accepted Modalities:
text
Output Format:
Generated text
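The text-in/text-out interface above maps onto a standard Hugging Face causal-LM loop. A minimal sketch, assuming the model is published under the Hugging Face id `norallm/normistral-7b-warm` and that `transformers` and `torch` are installed; the heavy imports are deferred into the function so it is cheap to define:

```python
MODEL_ID = "norallm/normistral-7b-warm"  # assumed Hugging Face model id

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Text in, text out: tokenize the prompt, sample a completion, decode it."""
    # Lazy imports: loading the model downloads ~14.5 GB of weights.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=True)
    # Slice off the echoed prompt so only the new completion is returned.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Since this is a base (not instruction-finetuned) model, plain text continuation prompts, e.g. `generate("Hovedstaden i Norge er")`, work better than chat-style instructions.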
Release Notes
Version:
Initial
Date:
January 2024
Notes:
Pretrained language models released; instruction-finetuned models to follow.