| Model Type | multilingual, medical domain |
| Additional Notes | Foundation model without instruction fine-tuning. |
| Supported Languages | en (English), zh (Chinese), ja (Japanese), fr (French), ru (Russian), es (Spanish) |

| Training Details | |
| Data Sources | |
| Data Volume | |
| Methodology | Further pretraining on MMedC (see the sketch below) |
| Context Length | |
| Model Architecture | |

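The methodology row only names further pretraining on MMedC. A minimal sketch of what that could look like with Hugging Face `transformers` is shown below, assuming the corpus is available as plain-text files; the base checkpoint name, the `mmedc/*.txt` path, and all hyperparameters are placeholders, not values stated in this card.

```python
# Hedged sketch of further pretraining with a causal-LM objective.
# "your-base-causal-lm" and "mmedc/*.txt" are placeholders; the
# hyperparameters are illustrative, not taken from the card.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base_ckpt = "your-base-causal-lm"  # placeholder base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_ckpt, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(base_ckpt, trust_remote_code=True)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for padding in the collator

# Load raw corpus text and tokenize it for next-token prediction.
raw = load_dataset("text", data_files={"train": "mmedc/*.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

train_ds = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

# mlm=False -> labels are the input ids, i.e. standard causal-LM loss.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="mmedlm-further-pretrain",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    num_train_epochs=1,
    learning_rate=2e-5,
    bf16=True,
    logging_steps=50,
    save_steps=1000,
)
Trainer(model=model, args=args, train_dataset=train_ds, data_collator=collator).train()
```
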
| Input/Output | |
| Input Format | |
| Accepted Modalities | |
| Output Format | |
| Performance Tips | Use transformers version 4.28.1 to avoid errors (loading sketch below). |

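A minimal loading sketch illustrating the version pin above. The Hugging Face repo id `Henrychur/MMedLM` is an assumption (it is not stated in this section); since the model is a foundation model without instruction fine-tuning, the example prompts with plain-text continuation rather than a chat template.

```python
# pip install transformers==4.28.1   <- version pin recommended above
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Henrychur/MMedLM"  # assumed checkpoint id; substitute the one you use
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    trust_remote_code=True,
)

# Plain-text continuation: no chat template, as the model is not instruction-tuned.
inputs = tokenizer("Atrial fibrillation is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
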
| Release Notes | |
| Version | |
| Date | |
| Notes | Initial release of MMedLM. |
| Version | |
| Date | |
| Notes | Release of MMedBench, a multilingual medical multiple-choice question-answering benchmark. |