scripts/policy_tag_extractor/README.md (new file, 23 additions)
# BigQuery Policy Tag Extractor

## Introduction
This directory contains [policy_tag_export.sh](policy_tag_export.sh), a bash script that extracts BigQuery policy tag information from a given dataset. The script iterates through up to 10,000 tables in the dataset and, for every column with a policy tag, outputs the table name, column name, and policy tag ID in CSV format.
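For reference, the output CSV has one row per tagged column in the shape `table,column,full policy tag ID`. The rows below are invented examples of that shape, not real data:

```shell
# Write a couple of invented example rows in the same shape the script produces
cat > policy_tags.csv <<'EOF'
orders,credit_card_number,projects/my-project/locations/us/taxonomies/1234/policyTags/5678
customers,ssn,projects/my-project/locations/us/taxonomies/1234/policyTags/9012
EOF

# One line per tagged column
wc -l < policy_tags.csv
```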

## Instructions for use
The simplest way to execute this script is to run it directly in Cloud Shell, but if needed it can be executed as part of a larger CI/CD pipeline or process.

Before running, either export the `DATASET` environment variable with the dataset that needs to be reviewed, or enter the dataset name when the script prompts for it.
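The dataset selection follows a common shell pattern: prompt only when the variable is unset. A minimal sketch of the same pattern, with a placeholder dataset name:

```shell
# Placeholder dataset name; in real use, export DATASET before running the
# script, or leave it unset to be prompted interactively
DATASET="my_dataset"

# Prompt only if DATASET is empty or unset
if [ -z "$DATASET" ]; then
  read -p "Enter the BigQuery dataset name: " DATASET
fi

echo "Using dataset: $DATASET"
```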

To execute in Cloud Shell:
1. Start a new session in the GCP project where your BigQuery data resides
2. Open Cloud Shell
3. Upload `policy_tag_export.sh` to the Cloud Shell environment
4. Execute the script by running `bash policy_tag_export.sh`
5. List the resources in Cloud Shell (`ls`) and verify that a file called `policy_tags.csv` was created
6. Download the file

## Considerations
* Ensure that either you or the service account executing the bash script has the `bigquery.metadataViewer` role, which grants access to the required level of information.
* The script only handles simple column types; policy tags attached to nested fields inside RECORD columns are not extracted.
* The extractor can identify the specific policy tag on a column, but it is limited to the information exposed by the `bq` command-line tool. In its current state, that is the full policy tag identifier:

  `projects/<PROJECT_ID>/locations/<LOCATION>/taxonomies/<TAXONOMY_ID>/policyTags/<TAG_ID>`
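If only the trailing `TAG_ID` is needed downstream, it can be split off the full identifier with plain parameter expansion; a sketch using a made-up identifier:

```shell
# Made-up full policy tag identifier in the format shown above
FULL_TAG="projects/my-project/locations/us/taxonomies/1234/policyTags/5678"

# The tag ID is the last '/'-separated field; strip everything up to the last '/'
TAG_ID="${FULL_TAG##*/}"
echo "$TAG_ID"
```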
scripts/policy_tag_extractor/policy_tag_export.sh (new file, 38 additions)
#!/bin/bash

# Copyright 2024 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Prompt user for DATASET value if not set
if [ -z "$DATASET" ]; then
  read -p "Enter the BigQuery dataset name: " DATASET
fi

# Write the names of all tables in the dataset (up to 10,000) to a reference TXT file;
# awk keeps the first column, sed drops the two header lines of `bq ls` output
bq ls --max_results=10000 "${DATASET}" | awk '{ print $1 }' | sed '1,2d' > table_list.txt

# Loop through each table and export policy tags (if any) to a CSV
echo "Writing to CSV..."
while IFS= read -r TABLE; do
  # Count how many columns in the table's schema mention a policy tag
  TAG_COUNT="$(bq show --schema "${DATASET}.${TABLE}" | grep -c "policyTags")"

  if [ "${TAG_COUNT}" -ge 1 ]; then
    # Pull out the schema fields that carry at least one policy tag
    COLUMN_AND_TAG="$(bq show --format=prettyjson "${DATASET}.${TABLE}" | jq '.schema.fields[] | select(.policyTags | length >= 1)')"
> **Collaborator:** This doesn't handle RECORD type columns with nested policy tags. Can you either handle it in code or make an explicit callout in README that this script only handles simple column types.
>
> **Author:** @danieldeleo added a line to the Considerations section of the README calling this out. Will work on updating the code to handle nested tags in the future.

    COLUMN="$(echo "${COLUMN_AND_TAG}" | jq '.name')"
    TAG_ID="$(echo "${COLUMN_AND_TAG}" | jq '.policyTags.names[]')"
    # Note: if several columns in one table carry policy tags, the unquoted
    # echo below collapses their names and tag IDs onto a single CSV line
    echo ${TABLE},${COLUMN},${TAG_ID} | tr -d '"'
  fi
done < table_list.txt >> policy_tags.csv
echo "Done."
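The two `bq`-dependent stages of the script can be exercised without a GCP project by substituting canned output. Everything below (table names, schema JSON) is invented for illustration:

```shell
# Simulated `bq ls` output: two header lines followed by one table row each
BQ_LS_OUTPUT='  tableId   Type
 --------- -------
  orders    TABLE
  users    TABLE'

# Same pipeline as the script: keep the first column, drop the two header lines
echo "$BQ_LS_OUTPUT" | awk '{ print $1 }' | sed '1,2d' > table_list.txt
cat table_list.txt

# Simulated single-line `bq show --schema` JSON for a table with one tagged column
SCHEMA='[{"name":"ssn","type":"STRING","policyTags":{"names":["projects/p/locations/us/taxonomies/1/policyTags/2"]}}]'

# Count occurrences of policyTags, as the script does before calling jq
TAG_COUNT="$(echo "$SCHEMA" | grep -c "policyTags")"
echo "$TAG_COUNT"
```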