get basic onnx export working #30

Open
wants to merge 24 commits into base: master
24 commits
c2b424d
Create notebooks
ChristopherGS Nov 2, 2019
52c727e
Unit Testing Production ML Code - Code Overview
ChristopherGS Nov 16, 2019
4babbec
Unit Testing Production ML Code - Test Preprocessing
ChristopherGS Nov 23, 2019
c2bca1e
Unit Testing Production ML Code - Test Config
ChristopherGS Nov 24, 2019
dc9bd0e
Unit Testing Production ML Code - Test Input Validation
ChristopherGS Nov 24, 2019
dcf9fe8
Unit Testing Production ML Code - Test Model Quality
ChristopherGS Nov 24, 2019
96f6372
Unit Testing Production ML Code - Add Tooling
ChristopherGS Nov 24, 2019
2238c3e
Integration Testing Production ML Code - Initial Setup
ChristopherGS Dec 1, 2019
be41c76
Integration Testing Production ML Code - Create Integration Tests
ChristopherGS Dec 1, 2019
4869936
Advanced Testing Production ML Code - Create Differential Tests
ChristopherGS Dec 7, 2019
17ab34e
Advanced Testing Production ML Code - Create Differential Tests In Do…
ChristopherGS Dec 8, 2019
ac8c3ab
Shadow Mode ML Code - Initial Setup
ChristopherGS Dec 15, 2019
29e4201
Shadow Mode ML Code - Implementation and Tests
ChristopherGS Dec 21, 2019
5e4a506
Shadow Mode ML Code - Asynchronous Implementation
ChristopherGS Dec 21, 2019
850ba34
Shadow Mode ML Code - Populate DB Script
ChristopherGS Dec 28, 2019
85a4884
Shadow Mode ML Code - Analyse Results (#18)
ChristopherGS Jan 4, 2020
a61cbb6
Monitoring with Prometheus - Basic Setup
ChristopherGS Jan 4, 2020
02e47f5
Monitoring with Prometheus - Add Simple Metrics
ChristopherGS Jan 4, 2020
cf92bf6
Monitoring with Prometheus - Setup Grafana
ChristopherGS Jan 4, 2020
b696eb9
Monitoring with Prometheus - Basic Infrastructure Metrics
ChristopherGS Jan 5, 2020
39d64d5
Monitoring with Prometheus - Instrument Project API
ChristopherGS Jan 5, 2020
21639fe
Monitoring with Prometheus - Build Grafana Dashboards for Model
ChristopherGS Jan 12, 2020
312340f
Monitoring Logs with the Elastic Stack - Basic ELK Setup
ChristopherGS Jan 12, 2020
0f38bd1
get basic onnx export working
ChristopherGS Jan 30, 2020
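The headline commit adds a first pass at ONNX export for the trained model, but the diff below only shows the supporting project files. For orientation, here is a minimal sketch of what exporting a fitted scikit-learn model to ONNX can look like; skl2onnx, onnxruntime, the file name and the input signature are illustrative assumptions, not details taken from this PR.

# Hypothetical sketch only: export a fitted sklearn model/pipeline to ONNX
# and sanity-check the exported graph with onnxruntime.
import numpy as np
import onnxruntime as rt
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType


def export_to_onnx(model, n_features, path="model.onnx"):
    # Declare the input signature: a float tensor with a dynamic batch dimension.
    initial_types = [("input", FloatTensorType([None, n_features]))]
    onnx_model = convert_sklearn(model, initial_types=initial_types)
    with open(path, "wb") as f:
        f.write(onnx_model.SerializeToString())


def predict_with_onnx(path, X):
    # Load the exported graph and return the first output (the predictions).
    session = rt.InferenceSession(path, providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name
    return session.run(None, {input_name: X.astype(np.float32)})[0]

A quick parity check between model.predict(X) and predict_with_onnx(path, X) is usually the first test worth writing for an export path like this.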
18 changes: 18 additions & 0 deletions .dockerignore
@@ -0,0 +1,18 @@
exercise_notebooks/*
*/env*
*/venv*
.circleci*
packages/gradient_boosting_model
*.env
*.log
.git
.gitignore
.dockerignore
*.mypy_cache
*.pytest_cache

### Python ###

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
23 changes: 23 additions & 0 deletions .gitignore
@@ -89,6 +89,7 @@ venv/
ENV/
env.bak/
venv.bak/
.tox/

# Spyder project settings
.spyderproject
@@ -102,3 +103,25 @@ venv.bak/

# mypy
.mypy_cache/

# pycharm
.idea/

# OSX
.DS_Store

# all logs
logs/

# training data
packages/gradient_boosting_model/gradient_boosting_model/datasets/*.csv
packages/gradient_boosting_model/gradient_boosting_model/datasets/*.txt
packages/gradient_boosting_model/gradient_boosting_model/datasets/*.zip

# trained models
packages/gradient_boosting_model/gradient_boosting_model/trained_models/*.pkl
*.h5

# differential test artifacts
packages/ml_api/differential_tests/expected_results/
packages/ml_api/differential_tests/actual_results/
Empty file added exercise_notebooks/.gitkeep
262 changes: 262 additions & 0 deletions exercise_notebooks/assessing_model_results.ipynb
@@ -0,0 +1,262 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Setup\n",
"\n",
"Make sure your virtualenv (use the same one as ml_api) is active, and all the imports below are installed\n",
"\n",
"Make sure your database docker container is running."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"import sys\n",
"from sqlalchemy import create_engine"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [],
"source": [
"# Add the ptdraft folder path to the sys.path list\n",
"# WINDOWS USERS: You will need to change the backslashes (TODO: check if this is true)\n",
"sys.path.append('../../')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A reminder that SQLAlchemy DB URIs look like this:\n",
"`postgres+psycop2://myuser:[email protected]:5432/mydatabase`"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"['/Users/christophersamiullah/repos/testing-and-monitoring-ml-deployments/exercise_notebooks', '/Users/christophersamiullah/repos/advanced_ml_deployment_draft/packages/ml_api/differential_tests', '/Users/christophersamiullah/repos/advanced_ml_deployment_draft/packages/ml_api', '/usr/local/bin/python3.7', '/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python37.zip', '/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7', '/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7/lib-dynload', '', '/Users/christophersamiullah/repos/testing-and-monitoring-ml-deployments/packages/ml_api/env/lib/python3.7/site-packages', '/Users/christophersamiullah/repos/testing-and-monitoring-ml-deployments/packages/ml_api/env/lib/python3.7/site-packages/IPython/extensions', '/Users/christophersamiullah/.ipython', '../', '..', '../../']\n"
]
}
],
"source": [
"print(sys.path)"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [],
"source": [
"from packages.ml_api.api.config import DevelopmentConfig"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"postgresql+psycopg2://user:[email protected]:6609/ml_api_dev\n"
]
}
],
"source": [
"db_uri = DevelopmentConfig.SQLALCHEMY_DATABASE_URI\n",
"print(db_uri)"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [],
"source": [
"engine = create_engine(db_uri)"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {},
"outputs": [],
"source": [
"lasso_model_df = pd.read_sql_table(\"regression_model_predictions\", con=engine)"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [],
"source": [
"gradient_model_df = pd.read_sql_table(\"gradient_boosting_model_predictions\", con=engine)"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [],
"source": [
"combined_df = lasso_model_df.merge(gradient_model_df, how='outer')"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>id</th>\n",
" <th>user_id</th>\n",
" <th>datetime_captured</th>\n",
" <th>model_version</th>\n",
" <th>inputs</th>\n",
" <th>outputs</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>1</td>\n",
" <td>007</td>\n",
" <td>2019-12-21 10:15:22.064246+00:00</td>\n",
" <td>0.0.4444</td>\n",
" <td>[{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":...</td>\n",
" <td>[105437.16948684008]</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>34</td>\n",
" <td>007</td>\n",
" <td>2019-12-21 10:23:00.036615+00:00</td>\n",
" <td>0.0.4444</td>\n",
" <td>[{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":...</td>\n",
" <td>[105437.16948684008]</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>35</td>\n",
" <td>007</td>\n",
" <td>2019-12-21 11:07:31.731324+00:00</td>\n",
" <td>0.0.4444</td>\n",
" <td>[{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":...</td>\n",
" <td>[105437.16948684008]</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>36</td>\n",
" <td>007</td>\n",
" <td>2019-12-21 11:10:33.730662+00:00</td>\n",
" <td>0.0.4444</td>\n",
" <td>[{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":...</td>\n",
" <td>[105437.16948684008]</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>37</td>\n",
" <td>007</td>\n",
" <td>2019-12-21 11:10:34.408394+00:00</td>\n",
" <td>0.0.4444</td>\n",
" <td>[{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":...</td>\n",
" <td>[105437.16948684008]</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" id user_id datetime_captured model_version \\\n",
"0 1 007 2019-12-21 10:15:22.064246+00:00 0.0.4444 \n",
"1 34 007 2019-12-21 10:23:00.036615+00:00 0.0.4444 \n",
"2 35 007 2019-12-21 11:07:31.731324+00:00 0.0.4444 \n",
"3 36 007 2019-12-21 11:10:33.730662+00:00 0.0.4444 \n",
"4 37 007 2019-12-21 11:10:34.408394+00:00 0.0.4444 \n",
"\n",
" inputs outputs \n",
"0 [{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":... [105437.16948684008] \n",
"1 [{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":... [105437.16948684008] \n",
"2 [{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":... [105437.16948684008] \n",
"3 [{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":... [105437.16948684008] \n",
"4 [{\"1stFlrSF\": 896, \"2ndFlrSF\": 0, \"3SsnPorch\":... [105437.16948684008] "
]
},
"execution_count": 27,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"combined_df.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
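The notebook stops at inspecting the merged dataframe. A possible continuation, not part of this commit, is to unpack the JSON-encoded outputs column shown above and compare the two models' prediction distributions; how that column comes back from read_sql_table is an assumption here, so the helper handles both a JSON string and an already-parsed list.

import json

import pandas as pd


def extract_prediction(outputs):
    # The outputs column may arrive as a JSON string or an already-parsed list.
    values = json.loads(outputs) if isinstance(outputs, str) else outputs
    return values[0]


lasso_preds = lasso_model_df["outputs"].apply(extract_prediction)
gradient_preds = gradient_model_df["outputs"].apply(extract_prediction)

summary = pd.DataFrame({
    "lasso": lasso_preds.describe(),
    "gradient_boosting": gradient_preds.describe(),
})
print(summary)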
26 changes: 26 additions & 0 deletions exercise_notebooks/docker_exercise/Dockerfile
@@ -0,0 +1,26 @@
FROM python:3.7-alpine
WORKDIR /code

# Set env vars required by Flask
ENV FLASK_APP app.py
ENV FLASK_RUN_HOST 0.0.0.0

# Install gcc so Python packages such as MarkupSafe
# and SQLAlchemy can compile speedups.
RUN apk add --no-cache gcc musl-dev linux-headers

# copy local requirements.txt into container
# doing this separately from the main copy
# operation makes more efficient use of docker
# layer caching.
COPY requirements.txt requirements.txt

# install requirements inside the container
RUN pip install -r requirements.txt

# Copy the current project directory (.) on the host
# into the working directory (.) of the image
COPY . .

# Set the default command for the container to flask run
CMD ["flask", "run"]
25 changes: 25 additions & 0 deletions exercise_notebooks/docker_exercise/app.py
@@ -0,0 +1,25 @@
import time

import redis
from flask import Flask

app = Flask(__name__)
cache = redis.Redis(host='redis', port=6379)


def get_hit_count():
    retries = 5
    while True:
        try:
            return cache.incr('hits')
        except redis.exceptions.ConnectionError as exc:
            if retries == 0:
                raise exc
            retries -= 1
            time.sleep(0.5)


@app.route('/')
def hello():
    count = get_hit_count()
    return f'Hello World! I have been seen {count} times.\n'
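The retry loop above exists because the redis container may not be accepting connections yet when the web container starts. A minimal unit-test sketch for that behaviour, using pytest's monkeypatch and a stubbed cache (this test file is not part of the exercise, just an illustration), could look like this:

# Hypothetical test sketch: stub the redis client so get_hit_count()
# can be exercised without a running redis container.
import redis

import app  # the Flask module defined above


class FlakyCache:
    """Fails once with a ConnectionError, then succeeds."""

    def __init__(self):
        self.calls = 0

    def incr(self, key):
        self.calls += 1
        if self.calls == 1:
            raise redis.exceptions.ConnectionError("redis not ready yet")
        return 42


def test_get_hit_count_retries_then_succeeds(monkeypatch):
    monkeypatch.setattr(app, "cache", FlakyCache())
    assert app.get_hit_count() == 42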
8 changes: 8 additions & 0 deletions exercise_notebooks/docker_exercise/docker-compose.yml
@@ -0,0 +1,8 @@
version: '3'
services:
  web:
    build: .
    ports:
      - "5000:5000"
  redis:
    image: "redis:alpine"
2 changes: 2 additions & 0 deletions exercise_notebooks/docker_exercise/requirements.txt
@@ -0,0 +1,2 @@
flask>=1.1.1,<1.2.0
redis>=3.3.11,<3.4