MiniCPM 2B Sft Fp32 Safetensors is an open-source language model published by Isaak-Carter. Features: 2B-parameter LLM, VRAM: 10.9 GB, Context: 4K, LLM Explorer Score: 0.13.
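As a quick sanity check on the VRAM figure (assuming roughly 2.7 billion total parameters, including embeddings, which is an estimate rather than a number from this page): 2.7e9 parameters × 4 bytes per fp32 weight ≈ 10.9 GB, so the listed figure corresponds to the raw fp32 weight size before activations or KV cache.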
MiniCPM 2B Sft Fp32 Safetensors Parameters and Internals
Model Type
End-side Large Language Model, Multimodal Model
Use Cases
Areas:
Research, Commercial applications
Applications:
Multimodal Models
Limitations:
Hallucination issues due to the small model size, identity information resembling GPT because of ShareGPT training data, prompt sensitivity affecting output consistency, inaccurate knowledge recall
Considerations:
No identity-specific training conducted
Additional Notes
Streaming output is faster than human speaking speed, and the model can be deployed on smartphones; a minimal loading and streaming sketch follows below.
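For illustration, here is a minimal sketch of loading the fp32 safetensors weights and streaming generation with Hugging Face transformers. The repo id is inferred from this page and may differ from the actual Hugging Face path, and the prompt and generation settings are assumptions, not values from the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

# Repo id inferred from this listing; verify the actual Hugging Face path.
repo_id = "Goekdeniz-Guelmez/MiniCPM-2B-sft-fp32-safetensors"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float32,  # fp32 weights, ~10.9 GB as listed above
    trust_remote_code=True,     # MiniCPM ships custom modeling code
)

# TextStreamer prints tokens to stdout as they are generated,
# demonstrating the streaming output mentioned in the notes.
streamer = TextStreamer(tokenizer, skip_prompt=True)
inputs = tokenizer("What is an end-side LLM?", return_tensors="pt")
model.generate(**inputs, max_new_tokens=128, streamer=streamer)
```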
Supported Languages
Chinese (High proficiency), English (Moderate proficiency)
Training Details
Data Sources:
Open-source corpora, including ShareGPT
Methodology:
SFT (Supervised Fine-Tuning) followed by DPO (Direct Preference Optimization; see the loss sketch after this section)
Hardware Used:
A single 1080/2080 GPU for parameter-efficient fine-tuning, a 3090/4090 GPU for full-parameter fine-tuning
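Here, DPO refers to Direct Preference Optimization (Rafailov et al., 2023). The sketch below shows the technique in general, not MiniCPM's actual training code: it contrasts log-probabilities of preferred vs. dispreferred completions under the tuned policy and a frozen reference (SFT) model. The function name and the beta value are illustrative.

```python
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """Direct Preference Optimization loss over per-sequence log-probs.

    Each tensor holds the summed log-probability of a whole completion
    under either the policy being tuned or the frozen reference model.
    """
    chosen_margin = policy_chosen_logps - ref_chosen_logps
    rejected_margin = policy_rejected_logps - ref_rejected_logps
    # Push the policy to rank preferred completions above rejected ones.
    return -F.logsigmoid(beta * (chosen_margin - rejected_margin)).mean()
```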