Metharme 1.3B by PygmalionAI


Tags: autotrain-compatible, en, endpoints-compatible, gpt_neox, pytorch, region:us, safetensors

Metharme 1.3B Benchmarks

Metharme 1.3B (PygmalionAI/metharme-1.3b)

Metharme 1.3B Parameters and Internals

LLM Name: Metharme 1.3B
Repository: https://huggingface.co/PygmalionAI/metharme-1.3b
Model Size: 1.3b
Required VRAM: 2.9 GB
Updated: 2025-08-23
Maintainer: PygmalionAI
Model Type: gpt_neox
Model Files: 2.9 GB, 2.9 GB
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.30.0.dev0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50304
Torch Data Type: bfloat16
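The parameters above can be sanity-checked with a short sketch. The prompt helper below follows the `<|system|>`/`<|user|>`/`<|model|>` token format documented on the Metharme model card; the memory estimate simply assumes 2 bytes per parameter for bfloat16 weights. The ~1.45B parameter figure is an assumption inferred from the 2.9 GB file size, not a number stated on this page.

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a Metharme-style prompt from the model card's special tokens."""
    return f"<|system|>{system}<|user|>{user}<|model|>"

def bf16_weight_gb(n_params: float) -> float:
    """Rough weight size in decimal GB: 2 bytes per bfloat16 parameter.

    Ignores optimizer state, activations, and KV cache, so actual VRAM
    use during inference will be somewhat higher.
    """
    return n_params * 2 / 1e9

# The nominal 1.3B count gives ~2.6 GB; the reported 2.9 GB file size
# suggests roughly 1.45B parameters once embeddings are included (assumption).
print(round(bf16_weight_gb(1.45e9), 1))  # -> 2.9
```

This also explains why the listing's "Required VRAM" matches the weight file size: bfloat16 weights dominate the memory footprint for a model this small.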

Best Alternatives to Metharme 1.3B

Best Alternatives | Context | RAM | Downloads | Likes
SGPT 1.3B Insurance Epoch10 | 2K | 5.4 GB | 1836 | 1
...Ko Empathy Message Friend 1.3B | 2K | 5.4 GB | 5 | 0
Pgl Mtm1b 3 | 2K | 1 GB | 5 | 0
Pgl Mtm1b | 2K | 1.1 GB | 5 | 0
...olyglot Ko 1.3B Pretrained Asd | 2K | 5.4 GB | 25 | 0
KIT 1.3B | 2K | 5.4 GB | 3 | 2
...glot Ko 1.3B Ao Instruct V0.91 | 2K | 5.4 GB | 12 | 0
Pygmalion Free | 2K | 2.9 GB | 7 | 0
My Consulting Ai Model | 2K | 5.4 GB | 9 | 0
...Ko 1.3B Slim Orca 10000 Epoch2 | 2K | 2.7 GB | 5 | 0
Note: green Score (e.g. "73.2") means that the model is better than PygmalionAI/metharme-1.3b.

Rank the Metharme 1.3B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124