---
title: RunnerBase
version: EN
---

### RunnerBase

```python
RunnerBase()
```
Base class for model registration.

This base class defines five static methods:
- `predict`: Make a prediction with the given data and model. This method must be overridden.
  The data comes from the result of `preprocess_data`, and the return value of this method
  is passed to `postprocess_data` before serving.
- `save_model`: Save the model into a file. The return value of this method is given to the
  `load_model` method on model loading. If this method is overridden, `load_model` must be
  overridden as well.
- `load_model`: Load the model from a file.
- `preprocess_data`: Preprocess the data before prediction. It converts the API input data to
  the model input data.
- `postprocess_data`: Postprocess the data after prediction. It converts the model output data
  to the API output data.

Check each method's docstring for more information.
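Concretely, a custom runner is a subclass that overrides these hooks. The sketch below is illustrative and self-contained: the JSON-weights "model", the method bodies, and the local `RunnerBase` stub (standing in for `vessl.RunnerBase`) are assumptions for demonstration, not vessl's actual behavior:

```python
import json
import os
import tempfile

# Stub standing in for vessl's RunnerBase; in real code you would
# subclass `vessl.RunnerBase` instead.
class RunnerBase:
    pass

class MyRunner(RunnerBase):
    @staticmethod
    def save_model(model):
        # Persist the model; the returned dict (string keys and values)
        # is handed back to `load_model` as `props`.
        path = os.path.join(tempfile.mkdtemp(), "weights.json")
        with open(path, "w") as f:
            json.dump(model, f)
        return {"path": path}

    @staticmethod
    def load_model(props, artifacts):
        with open(props["path"]) as f:
            return json.load(f)

    @staticmethod
    def preprocess_data(data):
        # API input -> model input: parse a comma-separated string of floats.
        return [float(x) for x in data.split(",")]

    @staticmethod
    def predict(model, data):
        # Toy prediction: a weighted sum.
        return sum(w * x for w, x in zip(model["weights"], data))

    @staticmethod
    def postprocess_data(data):
        # Model output -> API output.
        return {"score": data}

# The serving flow chains the hooks in this order:
props = MyRunner.save_model({"weights": [0.5, 0.25]})
loaded = MyRunner.load_model(props, artifacts={})
result = MyRunner.postprocess_data(
    MyRunner.predict(loaded, MyRunner.preprocess_data("2.0,4.0"))
)
print(result)  # {'score': 2.0}
```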


**Methods:**

## .load_model
```python
vessl.load_model(
   props: Union[Dict[str, str], None], artifacts: Dict[str, str]
)
```
Load the model instance from a file.

`props` is the return value of `save_model`, and `artifacts` is
given by the `register_model` function.

If `save_model` is not overridden, `props` will be None.

**Args**
* `props` (dict | None) : Data that was returned by `save_model`. If `save_model` is
    not overridden, this will be None.
* `artifacts` (dict) : Data that is given by the `register_model` function.

**Returns**
Model instance.
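As a hedged sketch, a `load_model` override might read a JSON model from an uploaded artifact. The `"weights.json"` key is hypothetical, and we assume here that the dict values resolve to readable file paths at serving time:

```python
import json
import os
import tempfile

def load_model(props, artifacts):
    # Assumption: a hypothetical "weights.json" entry was uploaded via
    # `register_model`, and its value is a readable path at serving time.
    with open(artifacts["weights.json"]) as f:
        return json.load(f)

# Simulate the uploaded artifact for a local dry run:
path = os.path.join(tempfile.mkdtemp(), "weights.json")
with open(path, "w") as f:
    json.dump({"weights": [1.0, 2.0]}, f)

# `props` is None because save_model was not overridden in this sketch.
model = load_model(props=None, artifacts={"weights.json": path})
print(model)  # {'weights': [1.0, 2.0]}
```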
## .preprocess_data
```python
vessl.preprocess_data(
   data: InputDataType
)
```
Preprocess the given data.

The data processed by this method will be given to the model.

**Args**
* `data`  : Data to be preprocessed.

**Returns**
Preprocessed data that will be given to the model.
## .predict
```python
vessl.predict(
   model: ModelType, data: ModelInputDataType
)
```
Make a prediction with the given data and model.

**Args**
* `model` (model_instance) : Model instance.
* `data`  : Data to be predicted.

**Returns**
Prediction result.
## .postprocess_data
```python
vessl.postprocess_data(
   data: ModelOutputDataType
)
```
Postprocess the given data.

The data processed by this method will be given to the user.

**Args**
* `data`  : Data to be postprocessed.

**Returns**
Postprocessed data that will be given to the user.
## .save_model
```python
vessl.save_model(
   model: ModelType
)
```
Save the given model instance into a file.

The return value of this method is passed as the first argument of `load_model` on model loading.

**Args**
* `model` (model_instance) : Model instance to save.

**Returns**
(dict) Data that will be passed to `load_model` on model loading.
    Must be a dictionary with string keys and values.

----

## register_model
```python
vessl.register_model(
   repository_name: str, model_number: Union[int, None], runner_cls: RunnerBase,
   model_instance: Union[ModelType, None] = None, requirements: List[str] = None,
   artifacts: Dict[str, str] = None, **kwargs
)
```
Register the given model for serving. To override the
default organization, pass `organization_name` via `**kwargs`.

**Args**
* `repository_name` (str) : Model repository name.
* `model_number` (int | None) : Model number. If None, a new model will be
    created; in that case, `model_instance` must be given.
* `runner_cls` (RunnerBase) : Runner class that includes code for serving.
* `model_instance` (ModelType | None) : Model instance. If None, `runner_cls`
    must override the `load_model` method. Defaults to None.
* `requirements` (List[str]) : Python requirements for the model. Defaults to
    [].
* `artifacts` (Dict[str, str]) : Artifacts to be uploaded. Each key is the path to an
    artifact in the local filesystem, and each value is the path in the model
    volume. Only a trailing asterisk (`*`) is allowed as a glob pattern.
    Defaults to {}.

**Example**
```python
register_model(
    repository_name="my-model",
    model_number=1,
    runner_cls=MyRunner,
    model_instance=model_instance,
    requirements=["torch", "torchvision"],
    artifacts={"model.pt": "model.pt", "checkpoints/*": "checkpoints/*"},
)
```

----

## register_torch_model
```python
vessl.register_torch_model(
   repository_name: str, model_number: Union[int, None], model_instance: ModelType,
   preprocess_data = None, postprocess_data = None, requirements: List[str] = None,
   **kwargs
)
```
Register the given torch model instance for model serving. To
override the default organization, pass `organization_name` via
`**kwargs`.

**Args**
* `repository_name` (str) : Model repository name.
* `model_number` (int | None) : Model number. If None, a new model will be
    created.
* `model_instance` (model_instance) : Torch model instance.
* `preprocess_data` (callable) : Function that will preprocess data.
    Defaults to identity function.
* `postprocess_data` (callable) : Function that will postprocess data.
    Defaults to identity function.
* `requirements` (list) : List of requirements. Defaults to [].

**Example**
```python
vessl.register_torch_model(
    repository_name="my-model",
    model_number=1,
    model_instance=model_instance,
)
```
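The `preprocess_data` and `postprocess_data` arguments are plain callables. The sketch below is illustrative: the hook bodies and the input/output shapes (flat float list in, classification label out) are assumptions, not part of the vessl API:

```python
# Hypothetical hooks; the data shapes are illustrative assumptions.
def preprocess_data(data):
    # API input (a flat list of floats) -> a batch of size 1 for the model.
    return [data]

def postprocess_data(data):
    # Model output (a list of logits) -> index of the highest-scoring class.
    return {"label": max(range(len(data)), key=lambda i: data[i])}

# The hooks would then be passed alongside the model instance, e.g.:
# vessl.register_torch_model(
#     repository_name="my-model",
#     model_number=None,               # None creates a new model
#     model_instance=model_instance,
#     preprocess_data=preprocess_data,
#     postprocess_data=postprocess_data,
#     requirements=["torch"],
# )
```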