Commit 4f4a1bc

Authored by kalib-brayer, LukaFontanilla, waziers, flaviojdz, and dbarrbc
(perf) Filter construction improvements - Make It Smarter (#131)
* Feat make smarter (#91)
    * fix: pass the parameters in to the BigQuery generate model
    * make it smarter
    * merge
    * prevent duplicate triggering of generation
    * move logic into helper utils
    * fix state
    * fix the callbacks
    * use params instead of a string
    * add back sorting
    * fix link generation
    * remove some console logs
    * check the filters twice; make sure we're returning a simpler data structure
    * fix race condition on page load
    * separate calls for filter and visualization generation
    * fix TypeScript errors
    * add Jest tests
    * handle edge cases where custom URLs and custom dates appear in examples
    * fix package.json
    * more fixes post merge
    * fix the store
    * fix side panel
    * fix the link to the explore
    * improve speed by parallelizing calls to Gemini
    * fix summarization
    * scroll into view
    * add filter helper and tests
    * add tests
    * run CI on every commit
    * only use Node 16 and 18
    * use a filter validator
    * remove unused console logs
    * post-merge fixes
    * fix tests
    * improve scrolling

    Co-authored-by: Jawad Laraqui <[email protected]>
    Co-authored-by: Flavio Di Berardino <[email protected]>
* Update useSendVertexMessage.ts
* Fixed issue with parsing cleaned URLs: added a case statement to handle both cleaned and uncleaned URLs
* Origin/feature generate examples (#1)
    * Added generate-examples script and trusted dashboard table
    * Error handling
    * Terraform bug fix
    * Handle the URL queryPrompt parameter
    * generate_examples bug fix
    * Added number-filter documentation
* work in progress: append examples
* working and tested new concat function
* Update looker_filter_doc.md: add more context to the filter doc on dates
* Add files via upload: context file on timeframes and intervals
* Update useSendVertexMessage.ts: add import and reference for the context file
* Update ExploreFilterHelper.ts
* mostly working with new configs
* working settings!
* refactor to use LookML queries
* working with the cloud function and new LookML model
* Update useSendVertexMessage.ts: revert/remove inline date- and number-filter documentation
* make settings admin-only and hide them from regular users
* commit timeframe filter logic
* secure fetchProxy added
* working with BigQuery!
* fix problem with variability
* remove indeterminacy; fix filter bug
* more context with restored filters call
* assorted bug fixes
* add back in filter mods
* handle NOT NULL better
* merge fixes
* Update looker_filter_doc.md to help "last x" timeframe filters: add context to distinguish "last x days" from "x days ago" filters
* remove trusted dashboards
* work in progress on the backend installer
* test the Cloud Shell script and Cloud Console run (many iterative "test" and "testing" commits)
* README edits and updates (several iterations)
* Fixed READMEs
* restore "query object format" text
* security setup and updates for security testing
* error handling fixes and example-script updates
* readme updates
* Revert "Merge branch 'marketplace_deploy' into make-it-smarter-bytecode" (reverts commit 954b73b, reversing changes made to cd7ee7e)
* Remove "check the response"

Co-authored-by: Luka Fontanilla <[email protected]>
Co-authored-by: Jawad Laraqui <[email protected]>
Co-authored-by: Flavio Di Berardino <[email protected]>
Co-authored-by: dbarrbc <[email protected]>
Co-authored-by: Colin Roy-Ehri <[email protected]>
Co-authored-by: colin-roy-ehri <[email protected]>
Co-authored-by: carolbethperkins <[email protected]>
1 parent 27539e2 · commit 4f4a1bc

39 files changed: +11799 −3837 lines

.DS_Store (8 KB, binary file not shown)

.github/workflows/extension-tests.yml (new file, +29 lines)

```yaml
name: Explore Assistant Extension CI

on:
  push:
    branches:
      - '**' # This will run on pushes to any branch

jobs:
  test:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [16.x, 18.x]

    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}
      - name: Change to project directory
        run: cd explore-assistant-extension
      - name: Install dependencies
        working-directory: ./explore-assistant-extension
        run: npm ci
      - name: Run Unit tests
        working-directory: ./explore-assistant-extension
        run: npm test tests/unit_tests
```
explore-assistant-examples/.env (+3 −3)

```diff
 ##Update the variables in this environment file to automate the bash scripts for loading & updating the examples

-PROJECT_ID="PROJECT_ID" ##Required. The Google Cloud project ID where your BigQuery dataset resides.
-DATASET_ID="DATASET_ID" ##The ID of the BigQuery dataset. Defaults to explore_assistant.
-EXPLORE_ID="MODEL:EXPLORE_ID" ##Required. A unique identifier for the dataset rows related to a specific use case or query (used in deletion and insertion).
+PROJECT_ID=seraphic-ripsaw-360618
+DATASET_ID=explore_assistant
+EXPLORE_ID=nabc:spins_nlp
```
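The file is plain KEY=VALUE lines with `##` comments; the repo's bash scripts simply `source` it. Purely to document the format, a minimal Python reader (`parse_env` is a hypothetical helper, not part of the repo) could look like:

```python
def parse_env(text):
    """Parse simple KEY=VALUE lines, ignoring blank lines and ## comments."""
    values = {}
    for line in text.splitlines():
        line = line.split('##', 1)[0].strip()  # drop trailing ## comments
        if not line:
            continue
        key, _, value = line.partition('=')
        values[key.strip()] = value.strip().strip('"')
    return values

# The three values from the diff above
sample = (
    'PROJECT_ID=seraphic-ripsaw-360618\n'
    'DATASET_ID=explore_assistant\n'
    'EXPLORE_ID=nabc:spins_nlp\n'
)
print(parse_env(sample))
```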
New Jupyter notebook (filename not captured in this page, +56 lines), containing a single Python code cell:

```python
# Convert CSV to JSON
import csv
import json


def csv_to_json(csv_file, json_file):
    """Converts a CSV file to a JSON file.

    Args:
        csv_file: The path to the CSV file.
        json_file: The path to the output JSON file.
    """
    data = []
    with open(csv_file, 'r') as csvfile:
        csvreader = csv.DictReader(csvfile)
        for row in csvreader:
            data.append(dict(row))

    with open(json_file, 'w') as jsonfile:
        json.dump(data, jsonfile, indent=4)


# Example usage
csv_file = 'DMi EA Prompts - Explore Assistant Order Details - Cleansed.csv'
json_file = 'dmi_examples.json'
csv_to_json(csv_file, json_file)
print(f"CSV converted to JSON: {json_file}")
```
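The cell's logic is easy to smoke-test on a throwaway CSV. The sketch below restates the notebook's function and runs it on made-up data (the column names are illustrative, not the real examples schema):

```python
import csv
import json
import os
import tempfile

def csv_to_json(csv_file, json_file):
    # Same logic as the notebook cell: read CSV rows as dicts, dump as JSON.
    data = []
    with open(csv_file, 'r') as csvfile:
        for row in csv.DictReader(csvfile):
            data.append(dict(row))
    with open(json_file, 'w') as jsonfile:
        json.dump(data, jsonfile, indent=4)

# Throwaway input file
tmpdir = tempfile.mkdtemp()
csv_path = os.path.join(tmpdir, 'examples.csv')
json_path = os.path.join(tmpdir, 'examples.json')
with open(csv_path, 'w') as f:
    f.write('input,output\nhow many users,"user_count"\n')

csv_to_json(csv_path, json_path)
with open(json_path) as f:
    rows = json.load(f)
print(rows)  # [{'input': 'how many users', 'output': 'user_count'}]
```

Note that `csv.DictReader` strips the quoting around `"user_count"`, so values round-trip cleanly into JSON.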
A second new Jupyter notebook (filename also not captured, +56 lines) is identical to the one above except that its example-usage lines point at different paths:

```python
csv_file = '/Users/kalib/Downloads/NABC Examples - examples_cleansed.csv'
json_file = 'nabc_examples.json'
```

explore-assistant-examples/load_examples.sh (mode changed 100644 → 100755, +1 −1)

```diff
 source .env
 TABLE_ID="explore_assistant_examples" ##The ID of the BigQuery table where the data will be inserted. Set to explore_assistant_examples.
-JSON_FILE="examples.json" ##The path to the JSON file containing the data to be loaded. Set to examples.json.
+JSON_FILE="nabc_examples.json" ##The path to the JSON file containing the data to be loaded. Set to examples.json.

 python load_examples.py \
   --project_id $PROJECT_ID \
```
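The mode change in this file's header (100644 → 100755) makes the script executable; the permission bits are the last three octal digits. A quick Python check on a scratch file (not part of the repo) replays that change:

```python
import os
import stat
import tempfile

# Create a scratch file and replay the commit's 100644 -> 100755 mode change.
path = tempfile.NamedTemporaryFile(delete=False).name
os.chmod(path, 0o644)
mode_before = stat.S_IMODE(os.stat(path).st_mode)

os.chmod(path, 0o755)  # add the execute bits, as the commit does
mode_after = stat.S_IMODE(os.stat(path).st_mode)

print(oct(mode_before), oct(mode_after))  # 0o644 0o755
os.unlink(path)
```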
