Clinical Understanding Model V2.1 is an open-source language model published by CodCodingCode. Key figures: 8.2B parameters, 16.4 GB of required VRAM, 128K context window, LLM Explorer Score 0.2.
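The VRAM figure follows directly from the parameter count and data type: 8.2B parameters at 2 bytes each (bfloat16) come to roughly 16.4 GB for the weights alone, before KV cache and activations. A minimal sanity check of that arithmetic:

```python
# Rough VRAM estimate for the model weights alone.
# Excludes KV cache, activations, and framework overhead.
def weight_vram_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """bytes_per_param=2 corresponds to bfloat16/float16."""
    return num_params * bytes_per_param / 1e9

estimate = weight_vram_gb(8.2e9)  # 8.2B parameters in bfloat16
print(f"{estimate:.1f} GB")       # -> 16.4 GB, matching the card
```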
| Property | Value |
|---|---|
| LLM Name | Clinical Understanding Model V2.1 |
| Repository 🤗 | https://huggingface.co/CodCodingCode/clinical_understanding_model-V2.1 |
| Model Size | 8.2B |
| Required VRAM | 16.4 GB |
| Updated | 2025-09-19 |
| Maintainer | CodCodingCode |
| Model Type | qwen3 |
| Model Architecture | Qwen3ForCausalLM |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.53.1 |
| Tokenizer Class | LlamaTokenizerFast |
| Padding Token | <|end▁of▁sentence|> |
| Vocabulary Size | 151936 |
| Torch Data Type | bfloat16 |
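Given the architecture (Qwen3ForCausalLM), dtype (bfloat16), and Transformers version (4.53.1) listed above, loading the model should follow the standard Hugging Face `from_pretrained` pattern. A minimal sketch, assuming `torch` and `transformers` are installed and you have ~16.4 GB of VRAM available:

```python
MODEL_ID = "CodCodingCode/clinical_understanding_model-V2.1"  # repository from the card

def load_model():
    """Load tokenizer and model in bfloat16; needs roughly 16.4 GB of VRAM."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer  # transformers >= 4.53 assumed

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the card's listed dtype
        device_map="auto",           # places weights on available GPU(s)
    )
    return tokenizer, model
```

Calling `load_model()` will download the weights on first use; generation then works through the usual `model.generate(...)` API.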
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...en 3 Panda Agi 3.3 DPO 8epochs | 80K / 16.4 GB | 71 | 0 |
| Diallm Qwen Sft Aus | 40K / 16.4 GB | 71 | 0 |
| Diallm Qwen Sft Brit | 40K / 16.4 GB | 63 | 0 |
| S V3 1ep | 40K / 16.4 GB | 51 | 0 |
| Diallm Qwen Sft All | 40K / 16.4 GB | 32 | 0 |
| Diallm Qwen Sft Ind | 40K / 16.4 GB | 23 | 0 |
| ... Hook Layer 9 Step 1000 Merged | 40K / 16.4 GB | 3758 | 0 |
| ...wen Hook Layer 9 Posneg Merged | 40K / 16.4 GB | 42 | 0 |
| ...b5 48d5 412d 9d91 A04b992fc59f | 32K / 32.7 GB | 15 | 0 |
| ...5d D08d 4249 868c A5c8617cf874 | 32K / 32.7 GB | 6 | 0 |