Description
Is your feature request related to a problem?
As described in the excellent documentation, executing inferences requires a destination table.
Not requiring a destination table would make queries more interactive and simpler for end users.
I could not find an explanation in the design documentation of why this is not offered as an option.
```sql
FROM table_references
[WHERE where_condition]
[LIMIT row_count]
TO PREDICT result_table_reference
[WITH
  attr_expr [, attr_expr ...]]
USING model_table_reference;
```
Describe the solution you'd like
Make the `TO PREDICT` clause optional:

```sql
TO PREDICT result_table_reference
--> [TO PREDICT result_table_reference]
```

so that the following query is valid:

```sql
FROM table_references
[WHERE where_condition]
[LIMIT row_count]
[WITH
  attr_expr [, attr_expr ...]]
USING model_table_reference;
```
and returns:

```
predicted_label  column1  column2 ...
```

where `predicted_label` comes from the model used.
If a user wants to save the predictions, they would add the `TO PREDICT` clause as today.
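For illustration, here is a sketch of how the two forms would compare under this proposal (the `iris.test` table and `sqlflow_models.my_dnn_model` model names are hypothetical):

```sql
-- Proposed: no destination table; predictions are returned to the client.
SELECT * FROM iris.test
USING sqlflow_models.my_dnn_model;

-- Today: a destination table is required via TO PREDICT.
SELECT * FROM iris.test
TO PREDICT iris.predict.class
USING sqlflow_models.my_dnn_model;
```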
Describe alternatives
E.g., BigQuery ML does not require a destination table; a regular `SELECT *` works: https://cloud.google.com/bigquery-ml/docs/reference/standard-sql/bigqueryml-syntax-predict
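For comparison, a BigQuery ML prediction reads from `ML.PREDICT` directly, with no destination table required (the dataset, table, and model names below are hypothetical):

```sql
SELECT *
FROM ML.PREDICT(MODEL `mydataset.mymodel`,
                (SELECT * FROM `mydataset.input_table`));
```

The predicted columns are returned alongside the input columns, so the result can be consumed interactively or piped into a subsequent query.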