If you need to handle many large files in your model registry, consider using a [Custom Transfer Agent](docs/docs_custom_transfer_agent.md) to replace the storage backend with AWS S3 or a similar service.
### 4. Construct a rack to start the ML cellar

Creating a rack in `ml-cellar` is like installing a rack in a wine cellar.
If ML models are like wine, then racks are the shelves where you store the wine.
Just as wine is labeled with its vintage year, we attach versions to AI models.
The rack consists of a directory tree like the one below:

```yaml
- model_registry_repository/
  - vit-l/
    - 0.1/
      - config.yaml
      - checkpoint.pth
      - result.json
      - log.log
    - 0.2/
      - ...
    - 1.0/
      - ...
    - 1.1/
      - ...
  - vit-m/
    - ...
```

In this directory, we refer to an ML model as `vit-l 1.1`, short for "Vision Transformer, large size, version 1.1".
It is like a vintage wine being called "Dom Pérignon Vintage 2010".
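The `{model} {version}` naming maps directly onto the rack's directory layout. As an illustration only (`ml-cellar` itself may resolve paths for you; the `model_dir` helper and the repository name are hypothetical), a rack path can be derived like this:

```shell
# Hypothetical helper: map "model version" to its rack directory,
# assuming the layout shown above. Not part of ml-cellar itself.
model_dir() {
  # $1 = model name (e.g. vit-l), $2 = version (e.g. 1.1)
  echo "model_registry_repository/$1/$2"
}

model_dir vit-l 1.1   # -> model_registry_repository/vit-l/1.1
```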
- Make a rack

```sh
ml-cellar rack {path}
```

- If you run `ml-cellar rack my_algorithm`, you will see the result below.
- If you want to allow any kind of file, set `"*"` for the optional files.
```toml
[artifact]
required_files = ["config.yaml", "logs/*"]
optional_files = ["*"]
```
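To make the glob semantics concrete, here is a sketch of how files might be checked against those patterns before a commit. The `matches_any` helper is illustrative and not part of `ml-cellar`; it uses plain shell `case` pattern matching with the globs from the config above hard-coded:

```shell
# Illustrative check of a file name against a list of glob patterns,
# mimicking the [artifact] required_files/optional_files semantics.
matches_any() {
  # $1 = file name, remaining args = glob patterns
  f=$1; shift
  for pat in "$@"; do
    case $f in
      $pat) return 0 ;;   # unquoted $pat is treated as a pattern
    esac
  done
  return 1
}

matches_any config.yaml "config.yaml" "logs/*" && echo "required: ok"
matches_any notes.txt "*" && echo "optional: ok"
```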
### 5. (Optional) Set up projects for MLOps

- Start fine-tuning projects from a base model

If you use a fixed dataset and focus on algorithm development, as in a Kaggle competition or research, the simple ML cellar works well for sharing various experimental results in a single rack.
However, if you are an engineer with many projects you want to fine-tune for, I recommend using a "project-based model registry".
Please see [the document for project-based model registry](docs/project_based_model_registry.md).

- Set up template.md and result.json for MLOps

If you want to set up template.md and result.json for MLOps, please see [the document for template.md](docs/docs_template.md).
### 6. Commit ML model

Let's actually stock AI models on the racks.

- Make a branch and switch to it
```sh
git switch -c feat/my_algorithm/0.1
```
- Make a version directory and put the files you want to commit under `{your_ml_registry}/{my_algorithm}/0.1/*`.
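The steps above can be sketched end to end. This is a minimal illustration, assuming the branch naming and directory pattern used in this README; `demo_rack`, the identity settings, and the file names are all placeholders:

```shell
# Illustrative commit flow for model "my_algorithm", version 0.1.
git init -q demo_rack
cd demo_rack
git config user.email "you@example.com"
git config user.name "you"

# Branch name encodes the model and version.
git switch -c feat/my_algorithm/0.1

# Version directory holding the files to commit.
mkdir -p my_algorithm/0.1
touch my_algorithm/0.1/config.yaml my_algorithm/0.1/checkpoint.pth

git add my_algorithm/0.1
git commit -q -m "Add my_algorithm 0.1"
```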