Medguanaco Lora 65B GPTQ is an open-source language model by nmitchko. Features: 65B LLM, VRAM: 0.3GB, License: cc, Quantized, LLM Explorer Score: 0.08.
Limitations:
May not perform effectively outside the medical domain
Primarily targets the knowledge level of medical students
Not tested in real-world applications
Not a substitute for a doctor's opinion
Considerations:
Must be treated as a research tool only
Additional Notes
The training data is still under development, with approximately 70% of question-answer pairs believed to be factually correct.
Supported Languages
en (English)
Training Details
Data Sources:
Anki flashcards, Wikidoc, StackExchange, ChatDoctor
Data Volume:
Multiple datasets with varying numbers of question-answer pairs
Methodology:
Fine-tuned with LoRA (low-rank adaptation) and released with GPTQ quantization (see the loading sketch below)
Model Architecture:
65B parameters, fine-tuned for medical domain tasks
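The snippet below is a minimal loading sketch, assuming the repository hosts a standard PEFT-style LoRA adapter on top of a GPTQ-quantized Guanaco 65B base. The base checkpoint name is a placeholder (verify the actual base model before use), and loading a GPTQ checkpoint through transformers requires the auto-gptq/optimum backends to be installed.

# Minimal sketch: load a GPTQ-quantized base model and attach the medical LoRA adapter.
# Assumptions: base_id is hypothetical; the adapter repo follows the standard PEFT layout.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "TheBloke/guanaco-65B-GPTQ"             # placeholder base checkpoint, not confirmed by the card
adapter_id = "nmitchko/medguanaco-lora-65b-GPTQ"  # LoRA adapter described on this page

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)

prompt = "What are the first-line treatments for community-acquired pneumonia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Any generated answer should be treated as research output only, consistent with the limitations listed above, and not as medical advice.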