
Commit 83e602e

Authored by KshitizGIT, Prikshit7766, and rads-b

MKT-361: Update MedLLM Docs (#1818)

* add new Medical LLM models in documentation
* Update Medical LLM documentation to correct the max sequence length for Medical-LLM-7B
* Update Release_notes.md: added link to blog for the VLM model
* Update Release_notes.md: typo fix

Co-authored-by: Prikshit7766 <prikshitsharma8024@gmail.com>
Co-authored-by: rads-b <52103218+rads-b@users.noreply.github.com>
1 parent 3836cdc commit 83e602e

File tree

3 files changed: +28 additions, −7 deletions


docs/en/LLMs/medical_llm.md (4 additions, 2 deletions)

@@ -5,7 +5,7 @@ seotitle: Medical LLMs| John Snow Labs
 title: Medical LLMs
 permalink: /docs/en/LLMs/medical_llm
 key: docs-medical-llm
-modify_date: "2025-03-31"
+modify_date: "2025-05-19"
 show_nav: true
 sidebar:
     nav: medical-llm
@@ -20,14 +20,16 @@ Our models are designed to deliver best-in-class performance across a wide range

 | **Model Name** | **Parameters** | **Recommended GPU Memory** | **Max Sequence Length** | **Model Size** | **Max KV-Cache** | **Tensor Parallel Sizes** |
 |----------------------------|------------|--------------|---------------------|------------|--------------|----------------------|
-| Medical-LLM-7B | 7B | ~25 GB | 16K | 14 GB | 11 GB | 1, 2, 4 |
+| Medical-LLM-7B | 7B | ~25 GB | 32K | 14 GB | 11 GB | 1, 2, 4 |
 | Medical-LLM-10B | 10B | ~35 GB | 32K | 19 GB | 15 GB | 1, 2, 4 |
 | Medical-LLM-14B | 14B | ~40 GB | 16K | 27 GB | 13 GB | 1, 2 |
 | Medical-LLM-24B | 24B | ~69 GB | 32K | 44 GB | 25 GB | 1, 2, 4, 8 |
 | Medical-LLM-Small | 14B | ~58 GB | 32K | 28 GB | 30 GB | 1, 2, 4, 8 |
 | Medical-LLM-Medium | 70B | ~452 GB | 128K | 131 GB | 320 GB | 4, 8 |
 | Medical-Reasoning-LLM-14B | 14B | ~58 GB | 32K | 28 GB | 30 GB | 1, 2, 4, 8 |
 | Medical-Reasoning-LLM-32B | 32B | ~222 GB | 128K | 61 GB | 160 GB | 2, 4, 8 |
+| Medical-VLM-24B | 24B | ~145 GB | 128K | 45 GB | 100 GB | 2, 4, 8 |
+| Spanish-Medical-LLM-24B | 24B | ~145 GB | 128K | 45 GB | 100 GB | 2, 4, 8 |

docs/en/LLMs/on_prem_deployment.md (4 additions, 2 deletions)

@@ -5,7 +5,7 @@ seotitle: Medical LLMs | John Snow Labs
 title: On-premise Deployment
 permalink: /docs/en/LLMs/on_prem_deploy
 key: docs-medical-llm
-modify_date: "2024-03-31"
+modify_date: "2025-05-19"
 show_nav: true
 sidebar:
     nav: medical-llm
@@ -52,14 +52,16 @@ The following models are currently available for on-premise deployments:

 | **Model Name** | **Parameters** | **Recommended GPU Memory** | **Max Sequence Length** | **Model Size** | **Max KV-Cache** | **Tensor Parallel Sizes** |
 |----------------------------|------------|--------------|---------------------|------------|--------------|----------------------|
-| Medical-LLM-7B | 7B | ~25 GB | 16K | 14 GB | 11 GB | 1, 2, 4 |
+| Medical-LLM-7B | 7B | ~25 GB | 32K | 14 GB | 11 GB | 1, 2, 4 |
 | Medical-LLM-10B | 10B | ~35 GB | 32K | 19 GB | 15 GB | 1, 2, 4 |
 | Medical-LLM-14B | 14B | ~40 GB | 16K | 27 GB | 13 GB | 1, 2 |
 | Medical-LLM-24B | 24B | ~69 GB | 32K | 44 GB | 25 GB | 1, 2, 4, 8 |
 | Medical-LLM-Small | 14B | ~58 GB | 32K | 28 GB | 30 GB | 1, 2, 4, 8 |
 | Medical-LLM-Medium | 70B | ~452 GB | 128K | 131 GB | 320 GB | 4, 8 |
 | Medical-Reasoning-LLM-14B | 14B | ~58 GB | 32K | 28 GB | 30 GB | 1, 2, 4, 8 |
 | Medical-Reasoning-LLM-32B | 32B | ~222 GB | 128K | 61 GB | 160 GB | 2, 4, 8 |
+| Medical-VLM-24B | 24B | ~145 GB | 128K | 45 GB | 100 GB | 2, 4, 8 |
+| Spanish-Medical-LLM-24B | 24B | ~145 GB | 128K | 45 GB | 100 GB | 2, 4, 8 |

 *Note: All memory calculations are based on half-precision (fp16/bf16) weights. Recommended GPU Memory considers the model size and the maximum key-value cache at the model's maximum sequence length. These calculations follow the guidelines from [DJL's LMI Deployment Guide.](https://docs.djl.ai/master/docs/serving/serving/docs/lmi/deployment_guide/instance-type-selection.html)*
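The sizing rule stated in the note (recommended GPU memory is roughly the fp16/bf16 model weights plus the maximum KV-cache at the model's maximum sequence length) can be sanity-checked against the table above. The following is a minimal illustrative sketch of that arithmetic, not an official sizing tool; model names and figures are taken from the table, and values carrying a "~" are rounded (e.g. Medical-LLM-Medium: 131 + 320 = 451, listed as ~452 GB).

```python
# Sketch: reproduce the "Recommended GPU Memory" column from the table above
# using the simplified rule from the note: weights + max KV-cache, both in GB.
# Figures are copied from the documentation table; "~" values are rounded.

MODELS = {
    # name: (model_size_gb, max_kv_cache_gb)
    "Medical-LLM-7B": (14, 11),
    "Medical-LLM-24B": (44, 25),
    "Medical-LLM-Medium": (131, 320),  # table lists ~452 GB (rounded)
    "Medical-VLM-24B": (45, 100),
}

def recommended_gpu_memory_gb(model_size_gb: float, kv_cache_gb: float) -> float:
    """Recommended GPU memory ~= fp16/bf16 weights + max KV-cache (GB)."""
    return model_size_gb + kv_cache_gb

for name, (size, kv) in MODELS.items():
    print(f"{name}: ~{recommended_gpu_memory_gb(size, kv)} GB")
```

This approximation only covers weights and KV-cache; real deployments also need headroom for activations and the serving runtime, which is why the table's figures are prefixed with "~".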

docs/en/LLMs/releases/Release_notes.md (20 additions, 3 deletions)

@@ -5,25 +5,42 @@ seotitle: Medical LLMs | John Snow Labs
 title: Release Notes
 permalink: /docs/en/LLMs/releases/release_notes
 key: docs-medical-llm
-modify_date: "2025-03-31"
+modify_date: "2025-05-19"
 show_nav: true
 sidebar:
     nav: medical-llm
 ---

 <div class="h3-box" markdown="1">

+## 05-19-2025
+
+We are excited to announce the addition of two powerful new models to our Medical LLM lineup.
+
+- **Medical-VLM-24B**: A 24B parameter vision-language model that combines medical expertise with visual comprehension capabilities. This model excels at processing both medical images (X-rays, MRIs, pathology slides) and text, enabling comprehensive analysis of visual and textual medical data.
+
+  Get more information about the **Medical-VLM-24B** model in this [blog post](https://www.johnsnowlabs.com/introducing-medical-vlm-24b-our-first-medical-vision-language-model/).
+
+- **Spanish-Medical-LLM-24B**: A specialized 24B parameter model designed for Spanish-speaking healthcare environments, offering native processing of Spanish medical terminology and clinical documentation. The model maintains high precision in Spanish medical language understanding without requiring translation.
+
+#### Specifications
+
+| **Model Name** | **Parameters** | **Recommended GPU Memory** | **Max Sequence Length** | **Model Size** | **Max KV-Cache** | **Tensor Parallel Sizes** |
+|---------------------------|--------------|------------------|-------------------|-------------|----------------|------------------------|
+| Medical-VLM-24B | 24B | ~145 GB | 128K | 45 GB | 100 GB | 2, 4, 8 |
+| Spanish-Medical-LLM-24B | 24B | ~145 GB | 128K | 45 GB | 100 GB | 2, 4, 8 |
+
 ## 04-06-2025

 **Welcome to the John Snow Labs Medical LLMs on-premise deployments Documentation and Updates Hub!**

-We are excited to announce the launch of the on premise deplyment of our Medical LLM models page, a centralized repository for all the latest features, enhancements, and resolutions of known issues within the. This dedicated space is designed to keep users informed of the most recent developments, enabling seamless testing and facilitating the provision of valuable feedback. Our commitment is to ensure that users have immediate access to the latest information, empowering them to leverage the full capabilities of out Medical LLM models effectively. Stay updated with us as we continue to improve and expand the functionalities of our Medical LLMsto meet and exceed your expectations.
+We are excited to announce the launch of the on-premise deployment page for our Medical LLM models, a centralized repository for all the latest features, enhancements, and resolutions of known issues. This dedicated space is designed to keep users informed of the most recent developments, enabling seamless testing and facilitating valuable feedback. Our commitment is to ensure that users have immediate access to the latest information, empowering them to leverage the full capabilities of our Medical LLM models effectively. Stay updated with us as we continue to improve and expand the functionalities of our Medical LLMs to meet and exceed your expectations.

 ### Supported Medical LLM Models

 | **Model Name** | **Parameters** | **Recommended GPU Memory** | **Max Sequence Length** | **Model Size** | **Max KV-Cache** | **Tensor Parallel Sizes** |
 |----------------------------|------------|--------------|---------------------|------------|--------------|----------------------|
-| Medical-LLM-7B | 7B | ~25 GB | 16K | 14 GB | 11 GB | 1, 2, 4 |
+| Medical-LLM-7B | 7B | ~25 GB | 32K | 14 GB | 11 GB | 1, 2, 4 |
 | Medical-LLM-10B | 10B | ~35 GB | 32K | 19 GB | 15 GB | 1, 2, 4 |
 | Medical-LLM-14B | 14B | ~40 GB | 16K | 27 GB | 13 GB | 1, 2 |
 | Medical-LLM-24B | 24B | ~69 GB | 32K | 44 GB | 25 GB | 1, 2, 4, 8 |
