Law Chat is an open-source large language model from AdaptLLM. Key specifications: 6.7B parameters, 27 GB VRAM required, 4K context window, llama2 license, instruction-tuned. Benchmark scores: HF Score 52.9, LLM Explorer Score 0.16, ARC 53.4, HellaSwag 76.2, MMLU 50.2, TruthfulQA 43.5, WinoGrande 75.5, GSM8K 18.5.
| Property | Value |
|---|---|
| LLM Name | Law Chat |
| Repository 🤗 | https://huggingface.co/AdaptLLM/law-chat |
| Model Size | 6.7b |
| Required VRAM | 27 GB |
| Updated | 2026-04-10 |
| Maintainer | AdaptLLM |
| Model Type | llama |
| Instruction-Based | Yes |
| Model Files | |
| Supported Languages | en |
| Model Architecture | LlamaForCausalLM |
| License | llama2 |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.31.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Beginning of Sentence Token | `<s>` |
| End of Sentence Token | `</s>` |
| Padding Token | `<pad>` |
| Unk Token | `<unk>` |
| Vocabulary Size | 32001 |
| Torch Data Type | float16 |
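Given the LlamaForCausalLM architecture, LlamaTokenizer, and float16 weights listed above, the model can be loaded with the Hugging Face `transformers` library. Below is a minimal sketch; the `[INST]`-style prompt is an assumption based on the model being a Llama-2 chat derivative, so check the repository's model card for the exact template.

```python
# Minimal loading sketch for AdaptLLM/law-chat with Hugging Face transformers.
# The [INST] prompt template is an assumption (Llama-2-chat style);
# consult the model card for the canonical format.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AdaptLLM/law-chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # card lists float16 weights
    device_map="auto",          # ~27 GB VRAM required per the card
)

prompt = "[INST] What is the difference between a tort and a crime? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```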
Quantized versions of Law Chat:

| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Law Chat GGUF | 25 | 571 | 2 GB |
| Law Chat GPTQ | 4 | 5 | 3 GB |
| Law Chat AWQ | 4 | 5 | 3 GB |
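The GGUF build can run on CPU or a modest GPU via llama.cpp bindings. A sketch using the `llama-cpp-python` package follows; the local filename is hypothetical, since the exact quantization files depend on what the GGUF repository publishes.

```python
# Hypothetical sketch: running the GGUF quantization with llama-cpp-python.
# "law-chat.Q4_K_M.gguf" is an assumed filename; use the actual file
# downloaded from the Law Chat GGUF repository.
from llama_cpp import Llama

llm = Llama(
    model_path="law-chat.Q4_K_M.gguf",
    n_ctx=4096,       # matches the model's 4K context length
    n_gpu_layers=-1,  # offload all layers to GPU if available; 0 for CPU-only
)

out = llm(
    "[INST] Explain the concept of consideration in contract law. [/INST]",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```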
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Speechless Coder Ds 6.7B | 16K / 13.5 GB | 1110 | 7 |
| Magicoder S DS 6.7B | 16K / 27.1 GB | 4743 | 205 |
| ...s Coder6.7b Reflct Adamw Iter1 | 16K / 13.5 GB | 475 | 0 |
| ...Coder6.7b Reflct Rmsprop Iter1 | 16K / 13.5 GB | 95 | 0 |
| ...Coder6.7b Reflct Rmsprop Iter1 | 16K / 13.5 GB | 110 | 0 |
| ...r6.7b Pos Reflct Rmsprop Iter1 | 16K / 13.5 GB | 87 | 0 |
| ...ir4 Ds Coder6.7b Rmsprop Iter1 | 16K / 13.5 GB | 43 | 0 |
| ...r6.7b Pos Reflct Rmsprop Iter1 | 16K / 13.5 GB | 90 | 0 |
| ...Coder6.7b Reflct Rmsprop Iter1 | 16K / 13.5 GB | 62 | 0 |
| Ds Coder6.7b Rmsprop Iter1 | 16K / 13.5 GB | 67 | 0 |