Granite 4.0 H Tiny 3bit MLX by mlx-community


Granite 4.0 H Tiny 3bit MLX is an open-source language model by mlx-community. Features: 868.7M-parameter LLM, VRAM: 3 GB, Context: 128K, License: apache-2.0, Quantized (3-bit).

Tags: 3-bit, 3bit, apple-silicon, base_model:ibm-granite/granite..., base_model:quantized:ibm-grani..., conversational, en, granite, granitemoehybrid, hybrid, ibm, instruct, long-context, mamba2, mlx, moe, quantized, region:us, safetensors

Granite 4.0 H Tiny 3bit MLX Parameters and Internals

LLM Name: Granite 4.0 H Tiny 3bit MLX
Repository: https://huggingface.co/mlx-community/granite-4.0-h-tiny-3bit-MLX
Base Model(s): ibm-granite/granite-4.0-h-tiny
Model Size: 868.7m
Required VRAM: 3 GB
Updated: 2026-03-27
Maintainer: mlx-community
Model Type: granitemoehybrid
Model Files: 3.0 GB
Supported Languages: en
Quantization Type: 3bit
Model Architecture: GraniteMoeHybridForCausalLM
License: apache-2.0
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.56.0
Tokenizer Class: GPT2Tokenizer
Padding Token: <|pad|>
Vocabulary Size: 100352
Torch Data Type: bfloat16
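Since this is an MLX quantization, it can presumably be run on Apple silicon with the `mlx-lm` package. The snippet below is a minimal sketch: the repository id comes from the listing above, while the prompt and generation settings are illustrative assumptions.

```python
# Minimal sketch of running this quantization with mlx-lm on Apple silicon.
# The repo id is taken from the listing; prompt and max_tokens are illustrative.
REPO_ID = "mlx-community/granite-4.0-h-tiny-3bit-MLX"

def run(prompt: str = "Explain mixture-of-experts models in one sentence.",
        max_tokens: int = 128) -> str:
    # Imported lazily so this module still loads where mlx-lm is unavailable.
    from mlx_lm import load, generate

    model, tokenizer = load(REPO_ID)  # downloads ~3 GB of weights on first use
    return generate(model, tokenizer, prompt=prompt, max_tokens=max_tokens)

if __name__ == "__main__":
    print(run())
```

With roughly 3 GB of weight files, the model should fit comfortably in the stated 3 GB VRAM budget on recent Apple-silicon machines.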

Rank the Granite 4.0 H Tiny 3bit MLX Capabilities

Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Which open-source LLMs or SLMs are you looking for? 51,648 models in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Check out Ag3ntum, our secure, self-hosted AI agent for server management.
Release v20260327b