PULI GPT 3SX by NYTK

Tags: Autotrain compatible · Endpoints compatible · GPT-NeoX · hu (Hungarian) · PULI · PyTorch · Region: US · Safetensors · Sharded · TensorFlow
Model Card on HF 🤗: https://huggingface.co/NYTK/PULI-GPT-3SX

PULI GPT 3SX Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
PULI GPT 3SX (NYTK/PULI-GPT-3SX)

PULI GPT 3SX Parameters and Internals

Model Type: text generation

Use Cases
  Areas: research, commercial applications
  Limitations: maximum sequence length of 2048 tokens

Additional Notes: The model is trained specifically for the Hungarian language.

Supported Languages: Hungarian (native)

Training Details
  Data Sources: Hungarian language sources
  Data Volume: 36.3 billion words
  Methodology: trained with EleutherAI's GPT-NeoX framework
  Context Length: 2048
  Model Architecture: GPT-NeoX

Input / Output
  Input Format: token IDs
  Accepted Modalities: text
  Output Format: generated text
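
The input/output description above is the standard causal language modeling loop: Hungarian text is tokenized into token IDs, passed through the GPT-NeoX model, and decoded back into generated text. A minimal sketch of that loop with the Hugging Face transformers library follows; the prompt and sampling settings are illustrative, not taken from the model card.

```python
# Minimal sketch (not from the model card): load NYTK/PULI-GPT-3SX with
# transformers and generate Hungarian text.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NYTK/PULI-GPT-3SX"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the published weights are float16 (see the table below)
    device_map="auto",          # requires accelerate; alternatively .to("cuda") or .to("cpu")
)

prompt = "Magyarország fővárosa"  # "The capital of Hungary"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt length plus max_new_tokens within the 2048-token context window.
output_ids = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
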
LLM Name: PULI GPT 3SX
Repository 🤗: https://huggingface.co/NYTK/PULI-GPT-3SX
Model Size: 6.9b
Required VRAM: 13.8 GB
Updated: 2025-08-23
Maintainer: NYTK
Model Type: gpt_neox
Model Files: 5.0 GB (1-of-3), 5.0 GB (2-of-3), 3.8 GB (3-of-3), 9.9 GB (1-of-2), 3.9 GB (2-of-2)
Supported Languages: hu
Model Architecture: GPTNeoXForCausalLM
License: cc-by-nc-4.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.37.0
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: <|endoftext|>
End of Sentence Token: <|endoftext|>
Unk Token: <|endoftext|>
Vocabulary Size: 50048
Torch Data Type: float16
Errors: replace
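
Most of the values in the table above can be verified directly from the published configuration and tokenizer files. A small sketch, assuming only the transformers library is installed:

```python
# Minimal sketch: print the config/tokenizer values that should match the listing above.
from transformers import AutoConfig, AutoTokenizer

model_id = "NYTK/PULI-GPT-3SX"

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.model_type)               # gpt_neox
print(config.max_position_embeddings)  # 2048 (context length / model max length)
print(config.vocab_size)               # 50048
print(type(tokenizer).__name__)        # GPT2Tokenizer (or its fast variant)
print(tokenizer.model_max_length)      # 2048
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)  # all <|endoftext|>
```
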

Quantized Models of the PULI GPT 3SX

Model | Likes / Downloads | VRAM
PULI GPT 3SX GPTQ | 417 | 4 GB
PULI GPT 3SX GGML | 53 | 3 GB
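
A GPTQ checkpoint can be loaded through transformers itself (with optimum and auto-gptq or gptqmodel installed), since the quantization configuration ships with the repository. The repository id below is a placeholder, as the table does not give exact repo names; a rough sketch:

```python
# Minimal sketch: load a GPTQ-quantized PULI GPT 3SX checkpoint.
# "someuser/PULI-GPT-3SX-GPTQ" is a hypothetical placeholder id; substitute the
# actual GPTQ repository. Requires optimum plus auto-gptq or gptqmodel.
from transformers import AutoModelForCausalLM, AutoTokenizer

gptq_repo = "someuser/PULI-GPT-3SX-GPTQ"  # placeholder, not a confirmed repo

tokenizer = AutoTokenizer.from_pretrained(gptq_repo)
model = AutoModelForCausalLM.from_pretrained(
    gptq_repo,
    device_map="auto",  # the GPTQ quantization config is read from the repo's config
)

inputs = tokenizer("Magyarország fővárosa", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
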

Best Alternatives to PULI GPT 3SX

Best Alternatives | Context / RAM | Downloads / Likes
Pythia 6.9B Deduped 8K | 8K / 13.8 GB | 261
Ppo Tldr 6.9B | 2K / 13.8 GB | 50
...I Pythia 6.9B Deduped Sft Tldr | 2K / 27.4 GB | 80
Pythia 6.9B Deduped | 2K / 13.8 GB | 138258
...I Pythia 6.9B Deduped Sft Tldr | 2K / 13.8 GB | 2620
Pythia 6.9B | 2K / 13.8 GB | 2257556
Oasst Pythia 6.9B 4000 Steps | 2K / 13.8 GB | 19220
Pythia 6.9B HC3 | 2K / 27.7 GB | 192
Open Instruct Pythia 6.9B Tulu | 2K / 27.6 GB | 9995
...an Large Pythia 6.9B Dev Phase | 2K / 27.6 GB | 133

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124