Jackalope 7B by openaccess-ai-collective


Jackalope 7B is an open-source language model from openaccess-ai-collective. Key specs: 7B parameters, VRAM required: 14.4 GB, context: 32K, license: apache-2.0, HF score: 61.2, LLM Explorer score: 0.13. Benchmarks: ARC: 63.4, HellaSwag: 83.3, MMLU: 63.5, TruthfulQA: 50.1, WinoGrande: 78.1, GSM8K: 28.7.

Tags: Arxiv:2301.13688, Arxiv:2306.02707, Conversational, Dataset:ldjnr/lesswrong-amplif..., Dataset:ldjnr/pure-dove, Dataset:ldjnr/verified-camel, Dataset:meta-math/metamathqa, Dataset:open-orca/openorca, Dataset:pygmalionai/pippa, Dataset:riddle sense, En, Endpoints compatible, Mistral, Pytorch, Region:us, Sharded

Jackalope 7B Benchmarks

Jackalope 7B (openaccess-ai-collective/jackalope-7b)

Jackalope 7B Parameters and Internals

Model Type: text generation
Use Cases:
  Areas: research, multimodal applications
  Applications: chatbots, automated reasoning
  Primary Use Cases: multi-turn chat, text generation
Supported Languages: en (English)
Training Details:
  Data Sources: Open-Orca/OpenOrca, LDJnr/LessWrong-Amplify-Instruct, LDJnr/Pure-Dove, LDJnr/Verified-Camel, PygmalionAI/PIPPA, meta-math/MetaMathQA, riddle_sense
  Methodology: fine-tuned on the SlimOrca dataset; multi-turn chat improved with OpenChat packing; trained with Axolotl
  Training Time: 96 hours on 8x A6000 GPUs
  Hardware Used: 8x A6000 GPUs
Input/Output:
  Input Format: OpenAI's ChatML format with special tokens
  Accepted Modalities: text
  Performance Tips: pass tokenize=True and return_tensors="pt" when applying the chat template, so the tokenizer returns model-ready tensors (see the sketch after this list)
Release Notes: this release highlights training efficiency and improved multi-turn chat capabilities.
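
Given the ChatML input format and the performance tips above, a minimal inference sketch with the transformers library (>= 4.34, matching the version listed below) might look like this. It assumes the repository's tokenizer defines a ChatML chat template; the model ID comes from this page, while the prompts and generation settings are illustrative:

```
# Minimal sketch: multi-turn ChatML inference with transformers.
# Assumes the repo tokenizer ships a ChatML chat template, as the
# "Input Format" field above suggests; requires transformers >= 4.34.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openaccess-ai-collective/jackalope-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's bfloat16 weights (~14.4 GB VRAM)
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are Jackalope, a helpful assistant."},
    {"role": "user", "content": "Summarize OpenChat packing in one sentence."},
]

# Per the performance tips: tokenize=True and return_tensors="pt" make
# apply_chat_template return model-ready input IDs directly.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```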
LLM Name: Jackalope 7B
Repository: https://huggingface.co/openaccess-ai-collective/jackalope-7b
Model Size: 7B
Required VRAM: 14.4 GB
Updated: 2026-03-29
Maintainer: openaccess-ai-collective
Model Type: mistral
Model Files: 9.9 GB (part 1 of 2), 4.5 GB (part 2 of 2)
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.34.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32002
Torch Data Type: bfloat16
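
The metadata above can be sanity-checked directly against the repository by loading only the config and tokenizer. A small sketch, assuming Hub access; the comment about which two tokens extend Mistral's base vocabulary is an inference, not stated on this page:

```
# Sanity-check sketch: confirm the catalog metadata straight from the repo.
from transformers import AutoConfig, AutoTokenizer

model_id = "openaccess-ai-collective/jackalope-7b"
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.architectures)            # expect ["MistralForCausalLM"]
print(config.max_position_embeddings)  # expect 32768 (context length)
print(config.torch_dtype)              # expect torch.bfloat16
print(len(tokenizer))                  # expect 32002; the 2 tokens beyond Mistral's
                                       # base 32000 are presumably the ChatML
                                       # <|im_start|>/<|im_end|> markers (assumption)
```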

Quantized Models of the Jackalope 7B

Model | Likes | Downloads | VRAM
Jackalope 7B GGUF | 7 | 238 | 3 GB
Jackalope 7B GPTQ | 0 | 17 | 4 GB
Jackalope 7B AWQ | 1 | 6 | 4 GB
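
The ~3 GB GGUF quant will run on CPU or with GPU offload via llama-cpp-python. A minimal sketch; the quant filename is hypothetical (choose an actual file from the GGUF repository), and the ChatML prompt is built by hand per the input format stated above:

```
# Minimal sketch: run a GGUF quant of Jackalope 7B with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="jackalope-7b.Q3_K_M.gguf",  # hypothetical file name; ~3 GB suggests a low-bit quant
    n_ctx=32768,      # full context window from the card; lower it to save RAM
    n_gpu_layers=-1,  # offload all layers if llama.cpp was built with GPU support
)

# Hand-built ChatML prompt, matching the "Input Format" field above.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nName one dataset Jackalope 7B was trained on.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
out = llm(prompt, max_tokens=128, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```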

Best Alternatives to Jackalope 7B

Best Alternatives | Context / RAM | Downloads | Likes
...Nemo Instruct 2407 Abliterated | 1000K / 24.5 GB | 2542 | 0
MegaBeam Mistral 7B 512K | 512K / 14.4 GB | 8444 | 53
SpydazWeb AI HumanAI RP | 512K / 14.4 GB | 14 | 1
SpydazWeb AI HumanAI 002 | 512K / 14.4 GB | 18 | 1
...daz Web AI ChatML 512K Project | 512K / 14.5 GB | 12 | 0
MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 3779 | 16
MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 8082 | 16
Hebrew Mistral 7B 200K | 256K / 30 GB | 1251 | 15
Astral 256K 7B V2 | 250K / 14.4 GB | 5 | 0
Astral 256K 7B | 250K / 14.4 GB | 5 | 0
Note: a green score (e.g. "73.2") indicates that the model outperforms openaccess-ai-collective/jackalope-7b.


Original data from HuggingFace, OpenCompass and various public git repos.