
Commit d96f13b

[HWORKS-1113] Add docs for attaching model evaluation images to a model (#362)

javierdlrm authored and SirOibaf committed
1 parent 93129fa

4 files changed: +88 −0 lines changed

docs/user_guides/mlops/registry/input_example.md

+4

```diff
@@ -1,3 +1,7 @@
+---
+description: Documentation on how to attach an input example to a model.
+---
+
 # How To Attach An Input Example
 
 ## Introduction
```
docs/user_guides/mlops/registry/model_evaluation_images.md (new file)

+79

---
description: Documentation on how to attach model evaluation images to a model.
---

# How To Save Model Evaluation Images

## Introduction

In this guide, you will learn how to attach ==model evaluation images== to a model. Model evaluation images visually describe a model's performance metrics; common examples include **confusion matrices**, **ROC curves**, **model bias tests**, and **training loss curves**. By attaching model evaluation images to your versioned model, other users can better understand its performance and evaluation metrics.
## Code

### Step 1: Connect to Hopsworks

```python
import hopsworks

project = hopsworks.login()

# Get the Hopsworks Model Registry handle
mr = project.get_model_registry()
```
### Step 2: Generate model evaluation images

Generate an image that visualizes model performance and evaluation metrics.

```python
import pandas as pd
import seaborn
from sklearn.metrics import confusion_matrix

# Predict the training data using the trained model
y_pred_train = model.predict(X_train)

# Predict the test data using the trained model
y_pred_test = model.predict(X_test)

# Compute the confusion matrix for the test predictions
results = confusion_matrix(y_test, y_pred_test)

# Create a DataFrame for the confusion matrix results
df_confusion_matrix = pd.DataFrame(
    results,
    ['True Normal', 'True Fraud'],
    ['Pred Normal', 'Pred Fraud'],
)

# Create a heatmap using seaborn with annotations
heatmap = seaborn.heatmap(df_confusion_matrix, annot=True)

# Get the figure and display it
fig = heatmap.get_figure()
fig.show()
```
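The same pattern works for any figure object. As a minimal sketch, assuming the trained `model` is a binary classifier that exposes `predict_proba` (an assumption not shown in this guide), an ROC curve can be produced with scikit-learn and matplotlib, then saved the same way in the next step:

```python
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc

# Scores for the positive class on the test set
# (predict_proba is an assumption about the model used here)
y_scores = model.predict_proba(X_test)[:, 1]

# Compute false/true positive rates and the area under the curve
fpr, tpr, _ = roc_curve(y_test, y_scores)
roc_auc = auc(fpr, tpr)

# Plot the ROC curve alongside the random-guess diagonal
roc_fig, ax = plt.subplots()
ax.plot(fpr, tpr, label=f"ROC curve (AUC = {roc_auc:.2f})")
ax.plot([0, 1], [0, 1], linestyle="--", label="Random guess")
ax.set_xlabel("False positive rate")
ax.set_ylabel("True positive rate")
ax.legend()
```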
### Step 3: Save the figure to a file inside the model directory

Save the figure to a file with a common image extension (for example, `.png` or `.jpeg`) and place it in a directory called `images`, a subdirectory of the model directory that is registered to Hopsworks.

```python
import os

# Specify the directory name for saving the model and related artifacts
model_dir = "./model"

# Create a subdirectory of model_dir called 'images' for the model evaluation images
model_images_dir = model_dir + "/images"
if not os.path.exists(model_images_dir):
    os.makedirs(model_images_dir)

# Save the figure to an image file in the images directory
fig.savefig(model_images_dir + "/confusion_matrix.png")

# Register the model
py_model = mr.python.create_model(name="py_model")
py_model.save(model_dir)
```
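To check that the images were attached, you can fetch the model back from the registry and download its artifacts. A minimal sketch, assuming the model registered above was created as version 1:

```python
# Fetch the registered model from the model registry
retrieved_model = mr.get_model("py_model", version=1)

# Download the model artifacts to a local directory; the evaluation
# images are included under the "images" subdirectory
downloaded_dir = retrieved_model.download()
print(os.listdir(downloaded_dir + "/images"))
```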
## Conclusion

In this guide, you learned how to attach model evaluation images to a model, visually communicating model performance and evaluation metrics in the model registry.

docs/user_guides/mlops/registry/model_schema.md

+4

```diff
@@ -1,3 +1,7 @@
+---
+description: Documentation on how to attach a model schema to a model.
+---
+
 # How To Attach A Model Schema
 
 ## Introduction
```

mkdocs.yml

+1

```diff
@@ -182,6 +182,7 @@ nav:
       - Python: user_guides/mlops/registry/frameworks/python.md
       - Model Schema: user_guides/mlops/registry/model_schema.md
       - Input Example: user_guides/mlops/registry/input_example.md
+      - Model Evaluation Images: user_guides/mlops/registry/model_evaluation_images.md
   - Model Serving:
       - user_guides/mlops/serving/index.md
       - Deployment:
```
