feat: custom job add feature #187
Conversation
feat: changed endpoint parameter name from experiment_run_id -> model…
feat: developed function for adding custom jobs with institution and model validation
src/webapp/routers/data.py
Outdated
```python
id=job_run_id,
triggered_at=triggered_timestamp,
created_by=str_to_uuid(current_user.user_id),
batch_name="No batch name (manual custom school job)",
```
The batch_name will show up on the frontend, right? I think having no batch name like this will create a null pointer, or will it just show "No batch name"? Do you know?
```python
batch_name="No batch name (manual custom school job)",
output_filename=f"{job_run_id}/inference_output.csv",
model_id=query_result[0][0].id,
output_valid=False,
```
When is this set back to True?
vishpillai123 left a comment
The diff is a bit hard to read here because of the migration of all of the functions, so just a heads up to split this into two PRs next time!
Changes
NEW ENDPOINT for adding custom school jobs to the JobTable in Cloud SQL. It leverages existing endpoint functions (read_model and read_inst) to validate that the institution and model exist before writing. This same function is used on Databricks, but it can also be called directly from the FastAPI interface.
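The validate-then-write flow described above can be sketched as plain Python. This is a minimal illustration only: the in-memory `INSTITUTIONS`/`MODELS` lookups and the `add_custom_job` helper name are hypothetical stand-ins for the real `read_inst`/`read_model` queries against Cloud SQL, and the field names mirror the diff snippets from this PR.

```python
import uuid
from datetime import datetime, timezone

# Hypothetical in-memory stand-ins for the read_inst / read_model lookups;
# the real endpoint validates against Cloud SQL before writing to JobTable.
INSTITUTIONS = {"inst-1"}
MODELS = {"model-a": uuid.uuid4()}

def add_custom_job(inst_id: str, model_name: str, user_id: uuid.UUID) -> dict:
    """Validate that the institution and model exist, then build the job row."""
    if inst_id not in INSTITUTIONS:
        raise ValueError(f"Unknown institution: {inst_id}")
    if model_name not in MODELS:
        raise ValueError(f"Unknown model: {model_name}")
    job_run_id = uuid.uuid4()
    return {
        "id": job_run_id,
        "triggered_at": datetime.now(timezone.utc),
        "created_by": user_id,
        "batch_name": "No batch name (manual custom school job)",
        "output_filename": f"{job_run_id}/inference_output.csv",
        "model_id": MODELS[model_name],
        # Written as False on creation; presumably flipped once output exists.
        "output_valid": False,
    }
```

Failing fast on either lookup means no partial row is ever written to the JobTable when the institution or model name is wrong.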
Minor fixes: I also moved the front-end table endpoints into a new file to make debugging and testing clearer.