Lm 1.3B Select 30B Tokens By Facts And Trivia Sample With Temperature2.0 by princeton-nlp


Lm 1.3B Select 30B Tokens By Facts And Trivia Sample With Temperature2.0 is an open-source language model by princeton-nlp. Despite the "30B" in the name, this is a 1.3B-parameter Llama-architecture model; the name describes its training data: 30B tokens selected from a larger pool by a "facts and trivia" quality rating, sampled with temperature 2.0. Features: 1.3B-parameter LLM, VRAM: 5.4 GB, Context: 4K, LLM Explorer Score: 0.12.
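
As a rough illustration of the selection recipe the name describes, the sketch below shows temperature-based sampling of documents without replacement via the Gumbel top-k trick: add i.i.d. Gumbel noise to rating/temperature logits and take documents in descending perturbed order until a token budget is met. This is a minimal sketch, not the maintainer's actual pipeline; the function name, ratings, and lengths are hypothetical.

```python
import numpy as np

def select_tokens_by_rating(ratings, doc_lengths, token_budget,
                            temperature=2.0, seed=0):
    """Pick documents without replacement with probability proportional to
    exp(rating / temperature), stopping once `token_budget` tokens are
    selected. Illustrative only -- not the authors' implementation."""
    rng = np.random.default_rng(seed)
    logits = np.asarray(ratings, dtype=np.float64) / temperature
    # Gumbel top-k trick: sorting perturbed logits yields a sample
    # without replacement from softmax(logits).
    keys = logits + rng.gumbel(size=logits.shape)
    order = np.argsort(-keys)

    selected, total = [], 0
    for i in order:
        selected.append(int(i))
        total += doc_lengths[i]
        if total >= token_budget:  # e.g. 30B tokens in the real recipe
            break
    return selected

# Toy usage: 5 documents with quality ratings and lengths (in tokens).
docs = select_tokens_by_rating(
    ratings=[0.1, 2.3, -0.5, 1.8, 0.9],
    doc_lengths=[1000, 4000, 2500, 3000, 1500],
    token_budget=6000,
)
```

A higher temperature flattens the sampling distribution, trading some selection quality for more diverse training data; temperature 2.0 sits between greedy top-k selection and uniform sampling.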

Tags: Autotrain compatible, Endpoints compatible, Llama, Region: us, Safetensors, Sharded, TensorFlow

Lm 1.3B Select 30B Tokens By Facts And Trivia Sample With Temperature2.0 Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Lm 1.3B Select 30B Tokens By Facts And Trivia Sample With Temperature2.0 Parameters and Internals

LLM Name: Lm 1.3B Select 30B Tokens By Facts And Trivia Sample With Temperature2.0
Repository: https://huggingface.co/princeton-nlp/lm-1.3B-select_30B_tokens_by-facts_and_trivia-sample_with_temperature2.0
Model Size: 1.3b (the "30B" in the name refers to training tokens, not parameters)
Required VRAM: 5.4 GB
Updated: 2025-09-12
Maintainer: princeton-nlp
Model Type: llama
Model Files: 5.0 GB (part 1 of 2), 0.4 GB (part 2 of 2)
Model Architecture: LlamaForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.35.0
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float32
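
The 5.4 GB footprint is consistent with roughly 1.3B parameters stored in float32: 1.3e9 × 4 bytes ≈ 5.2 GB, matching the two weight shards (5.0 GB + 0.4 GB). Given the metadata above (LlamaForCausalLM, LlamaTokenizer, 4096-token context), a standard transformers loading snippet along these lines should work; this is a generic sketch, not an official usage example from the maintainer.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = ("princeton-nlp/"
        "lm-1.3B-select_30B_tokens_by-facts_and_trivia-sample_with_temperature2.0")

# The Auto* classes resolve to LlamaTokenizer / LlamaForCausalLM
# per the metadata above.
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.float32)

prompt = "Q: What is the capital of Australia?\nA:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Passing torch_dtype=torch.float16 instead would roughly halve the weight memory to about 2.7 GB, at some cost in numerical precision.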

Best Alternatives to Lm 1.3B Select 30B Tokens By Facts And Trivia Sample With Temperature2.0

Best Alternatives | Context / RAM | Downloads | Likes
TildeOpen 30B | 64K / 60.9 GB | 3168 | 158
Flash Llama 30M 20001 | 32K / 0.1 GB | 333 | 0
Smaug Slerp 30B V0.1 | 32K / 60.4 GB | 5 | 0
Tenebra 30B Alpha01 | 16K / 65 GB | 152 | 14
Llama33b 16K | 16K / 65.2 GB | 6 | 1
Yayi2 30B Llama | 4K / 121.2 GB | 922 | 22
... Tokens By Perplexity Bottom K | 4K / 5.4 GB | 5 | 0
...lue Sample With Temperature2.0 | 4K / 5.4 GB | 8 | 0
... Tokens By Writing Style Top K | 4K / 5.4 GB | 5 | 0
Yayi2 30B Llama | 4K / 121.2 GB | 25 | 22
Note: a green score (e.g., "73.2") marks an alternative that outperforms princeton-nlp/lm-1.3B-select_30B_tokens_by-facts_and_trivia-sample_with_temperature2.0.


Original data from HuggingFace, OpenCompass and various public git repos.