Vicuna 7B 1.1 by eachadea


Tags: Autotrain compatible · Endpoints compatible · Llama · PyTorch · Region: us · Sharded
Model Card on HF 🤗: https://huggingface.co/eachadea/vicuna-7b-1.1
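To pull this model card programmatically instead of through the browser, the huggingface_hub client can load it by repo id. A minimal sketch (assumes huggingface_hub is installed; the ModelCard API shown is the library's standard interface):

```python
# Minimal sketch: fetch the model card for eachadea/vicuna-7b-1.1
# via huggingface_hub (pip install huggingface_hub).
from huggingface_hub import ModelCard

card = ModelCard.load("eachadea/vicuna-7b-1.1")
print(card.data)        # structured metadata: license, tags, etc.
print(card.text[:500])  # beginning of the card body
```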

Vicuna 7B 1.1 Benchmarks

Vicuna 7B 1.1 (eachadea/vicuna-7b-1.1)

Vicuna 7B 1.1 Parameters and Internals

Model Type: auto-regressive language model, transformer architecture
Use Cases:
  Areas: research on large language models and chatbots
Training Details:
  Data Sources: ShareGPT.com
  Data Volume: 70K conversations
  Training Time: March 2023 to April 2023
  Model Architecture: transformer
Release Notes:
  Version: v1.1
  Notes: Refactored the tokenization and separator. In Vicuna v1.1, the separator was changed from "###" to the EOS token "</s>". This change makes it easier to determine the generation stop criterion and enables better compatibility with other libraries. The supervised fine-tuning loss computation was also fixed, improving model quality.
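Because v1.1 separates turns with the EOS token, generation can stop on the tokenizer's eos_token_id instead of matching a custom "###" stop string. Below is a minimal sketch of a v1.1-style prompt builder; the system prompt and USER/ASSISTANT role labels follow FastChat's vicuna_v1.1 conversation template, and their exact wording should be treated as an assumption:

```python
# Sketch of a Vicuna v1.1-style prompt. Turns are separated by spaces,
# and each completed assistant turn is closed with the EOS token "</s>".
SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_prompt(turns):
    """turns: list of (user_msg, assistant_msg_or_None) pairs."""
    prompt = SYSTEM + " "
    for user_msg, assistant_msg in turns:
        prompt += f"USER: {user_msg} ASSISTANT:"
        if assistant_msg is None:
            return prompt                  # model completes the open turn
        prompt += f" {assistant_msg}</s>"  # EOS marks the end of the turn
    return prompt

print(build_prompt([("What is a transformer?", None)]))
```

With this format, generation halts on the EOS token by default, which is the compatibility benefit the release notes describe.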
LLM Name: Vicuna 7B 1.1
Repository 🤗: https://huggingface.co/eachadea/vicuna-7b-1.1
Model Size: 7b
Required VRAM: 13.5 GB
Updated: 2025-09-23
Maintainer: eachadea
Model Type: llama
Model Files: 10.0 GB (1-of-2), 3.5 GB (2-of-2)
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.29.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
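Given the metadata above (LlamaForCausalLM, LlamaTokenizer, two float16 shards totaling about 13.5 GB, 2048-token context), a minimal loading sketch with Hugging Face transformers might look like this; the device placement and generation settings are assumptions rather than part of the card:

```python
# Minimal sketch: load and query eachadea/vicuna-7b-1.1 with transformers.
# Expect roughly 13.5 GB of VRAM in float16, per the card above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "eachadea/vicuna-7b-1.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to LlamaTokenizer
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
    device_map="auto",          # assumption: accelerate is installed
)

prompt = "USER: What is a transformer? ASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,                   # keep total length within the 2048-token context
    eos_token_id=tokenizer.eos_token_id,  # "</s>" ends the turn in v1.1
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

The two weight shards (10.0 GB and 3.5 GB) are resolved automatically through the repository's checkpoint index, so no shard-specific handling is needed.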

Best Alternatives to Vicuna 7B 1.1

Best Alternatives | Context / RAM | Downloads | Likes
A6 L | 1024K / 16.1 GB | 201 | 0
A3.4 | 1024K / 16.1 GB | 13 | 0
A5.4 | 1024K / 16.1 GB | 12 | 0
A2.4 | 1024K / 16.1 GB | 12 | 0
M | 1024K / 16.1 GB | 127 | 0
157 | 1024K / 16.1 GB | 101 | 0
124 | 1024K / 16.1 GB | 93 | 0
162 | 1024K / 16.1 GB | 60 | 0
2 Very Sci Fi | 1024K / 16.1 GB | 317 | 0
118 | 1024K / 16.1 GB | 15 | 0
Note: green Score (e.g. "73.2") means that the model is better than eachadea/vicuna-7b-1.1.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124