Welcome! This document is intended for developers wishing to contribute to the Dynatrace Snowflake Observability Agent (DSOA).
Table of Contents:
- Setting up Development Environment
- Development Workflow
- Testing
- Writing Plugins
- Semantic Conventions
- Source Code Overview
- Pull Request Checklist
This guide helps developers who want to contribute to DSOA. If you only want to install and use it, see the installation guide.
## Setting up Development Environment

Required tools:

- Python (3.9 or newer)
- Git
- Windows users: WSL2 is required.
- Recommended IDE: VS Code with the Snowflake extension.
Clone the repository:

```bash
git clone https://github.com/dynatrace-oss/dynatrace-snowflake-observability-agent.git
cd dynatrace-snowflake-observability-agent
```

Run the setup script. This helper script installs system dependencies and sets up the Python virtual environment:

```bash
./scripts/deploy/setup.sh
source .venv/bin/activate
```
Install dependencies manually (if needed):

```bash
pip install -r requirements.txt
```

For Ubuntu/Debian:

```bash
sudo apt-get update
sudo apt-get install -y pango cairo gdk-pixbuf libffi pandoc
npm install -g prettier
```

For macOS (using Homebrew):

```bash
brew install pango cairo gdk-pixbuf libffi pandoc prettier
```

## Development Workflow

The source code is split into Python, SQL templates, and configuration files. You must build the agent to combine these into deployable artifacts.
Before deploying changes or running integration tests, run the build script. This compiles the Python code into Snowpark-compatible formats and assembles the SQL templates.
```bash
./scripts/dev/build.sh
```

What this does:

- Compilation (`compile.sh`):
  - Creates `_version.py`
  - Pre-compiles `dtagent.otel.semantics.Semantics` to include the metric semantics dictionary
  - Creates single files for both main stored procedures (`_dtagent.py` and `_send_telemetry.py`)
  - The `##INSERT` directive controls assembly order
- Building and embedding (`build.sh`):
  - Creates the default configuration file (`build/config-default.yml`)
  - Copies SQL files from all `*.sql` folders
  - Embeds compiled Python files into procedure templates
- Prepares all files for deployment in the `build/` directory
Note: When Snowflake reports issues in stored procedures, line numbers correspond to the `_dtagent.py` and `_send_telemetry.py` files.
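The directive-driven assembly can be pictured with a small sketch. The file layout, directive parsing, and `assemble` helper below are assumptions for illustration only, not the actual `compile.sh` logic:

```python
from pathlib import Path


def assemble(template_text: str, parts_dir: Path) -> str:
    """Replace each '##INSERT <filename>' line with that file's contents.

    Simplified illustration of how a single-file stored procedure can be
    assembled from parts; the real build may differ in syntax and ordering.
    """
    out_lines = []
    for line in template_text.splitlines():
        if line.strip().startswith("##INSERT"):
            # Everything after the directive is treated as a file name.
            name = line.split("##INSERT", 1)[1].strip()
            out_lines.append((parts_dir / name).read_text().rstrip("\n"))
        else:
            out_lines.append(line)
    return "\n".join(out_lines)
```

Because the inserted parts replace the directive lines in place, the order of `##INSERT` directives in the template dictates the order of code in the compiled file.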
If you change `info.md`, configuration files, or `instruments-def.yml`, or add a new plugin, you must rebuild the documentation (PDFs and READMEs):

```bash
./scripts/dev/build_docs.sh
```

This command will:

- Rebuild the agent
- Refresh `README.md`
- Generate `Dynatrace-Snowflake-Observability-Agent-$VERSION.pdf`
Note: This requires pango, cairo, libffi, and prettier installed on your system. On macOS, you may need to set:

```bash
export WEASYPRINT_DLL_DIRECTORIES=/opt/homebrew/lib
```

To create a distributable zip file (containing SQL scripts and docs) for sharing with other users:

```bash
./scripts/dev/package.sh
```

This creates `dynatrace_snowflake_observability_agent-$VERSION.$BUILD.zip` containing everything necessary for distribution.
## Testing

We use pytest for Python tests and bats for Bash script tests.
- Core Tests (`test/core/`): Configuration, utilities, and view structure
- OTel Tests (`test/otel/`): OpenTelemetry integration
- Plugin Tests (`test/plugins/`): Individual plugin logic
- Bash Tests (`test/bash/`): Deployment and build scripts, including:
  - Custom object name replacement (`test_custom_object_names.bats`)
  - Optional object filtering (`test_optional_objects.bats`)
  - Configuration conversion (`test_convert_config_to_yaml.bats`)
  - Deployment script utilities (`test_list_options_to_exclude.bats`)
For detailed information about each test suite, see the respective test directories.
- Local Mode (Mocked):
  - Runs without `test/credentials.yml`
  - Does not connect to Snowflake/Dynatrace
  - Useful for rapid logic testing
  - Uses mocked APIs and example test data from `test/test_data/`
- Live Mode:
  - Runs if `test/credentials.yml` exists
  - Connects to real Snowflake/Dynatrace endpoints
  - Sends data to actual Dynatrace APIs
Run all Python tests:

```bash
pytest
```

Run all bash tests:

```bash
./test/bash/run_tests.sh
```

Run specific test suites:

```bash
# Core tests
pytest test/core/

# OTel tests
pytest test/otel/

# Plugin tests
pytest test/plugins/

# Individual bash test
./test/bash/run_tests.sh test_custom_object_names.bats
```

Run a specific plugin test (e.g., for the 'budgets' plugin):

```bash
./scripts/dev/test.sh test_budgets
```

Regenerate test data (pickles): if you modify a plugin's SQL logic, you may need to update the test data:

```bash
./scripts/dev/test.sh test_budgets -p
```

Tests use example test data from the `test/test_data` folder:
- Pickle (`*.pkl`) files are used for test execution
- ndJSON files are provided for reference only
- Test results are validated against expected data in `test_results`
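The pickle round-trip behind these fixtures can be sketched generically. The paths and row shapes below are hypothetical, not DSOA's actual fixture layout:

```python
import pickle
from pathlib import Path


def save_fixture(path: Path, rows) -> None:
    """Record example query results as a pickle fixture (what regeneration does)."""
    with path.open("wb") as f:
        pickle.dump(rows, f)


def load_fixture(path: Path):
    """Load pre-recorded results so tests can run without a live Snowflake."""
    with path.open("rb") as f:
        return pickle.load(f)
```

Tests can then feed the loaded rows through plugin logic and compare the output against the expected data, with no network access required.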
To run tests in live mode:

- Create a test deployment with configuration in `conf/config-test.yml`:

  ```yaml
  core:
    dynatrace_tenant_address: abc12345.live.dynatrace.com
    deployment_environment: TEST
    log_level: DEBUG
    tag: ""
    procedure_timeout: 3600
    snowflake:
      account_name: 'your_snowflake_account.us-east-1'
      host_name: 'your_snowflake_account.us-east-1.snowflakecomputing.com'
      resource_monitor:
        credit_quota: 1
  otel: {}
  plugins:
    disabled_by_default: true
  ```

- Create `test/credentials.yml` from the `test/credentials.template.yml` template.

- Generate `test/conf/config-download.yml` by running:

  ```bash
  PYTHONPATH="./src" pytest -s -v "test/core/test_config.py::TestConfig::test_init" --pickle_conf y
  ```
For local mode testing (mocked APIs), ensure `test/conf/config-download.yml` does NOT exist. A good practice is to temporarily disable these files by prefixing them with an underscore (e.g., `_config-download.yml` and `_credentials.yml`). The gitignore ensures files prefixed with an underscore are not tracked.
- Open the Test Explorer view (`Ctrl+Shift+P` → "Test: Focus on Test Explorer")
- Click "Run All Tests" to execute all Python and bash tests
- Individual test suites can be run by expanding the test tree
## Writing Plugins

For a comprehensive guide on how to write a plugin, including step-by-step instructions, complete examples, configuration details, and debugging tips, please refer to the Plugin Development Guide.
The Plugin Development Guide covers:
- Complete plugin structure and file organization
- Step-by-step plugin creation with working examples
- Python class implementation patterns
- SQL views, procedures, and task definitions
- Configuration and semantic dictionary setup
- Bill of Materials (BOM) documentation
- Comprehensive testing strategies
- Common patterns and advanced topics
- Troubleshooting and debugging tips
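As a generic illustration of the registration-style structure such guides typically describe, consider the sketch below. This is not the actual DSOA base class; the class names, the `process` signature, and the `V_BUDGETS` view are all invented for illustration:

```python
from abc import ABC, abstractmethod


class Plugin(ABC):
    """Generic plugin pattern: subclasses self-register under a name.

    Purely illustrative; the real DSOA plugin contract is defined in the
    Plugin Development Guide, not here.
    """

    registry = {}
    name = ""

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        Plugin.registry[cls.name] = cls

    @abstractmethod
    def process(self, session) -> int:
        """Query the plugin's source view and report its telemetry rows."""


class BudgetsPlugin(Plugin):
    name = "budgets"

    def process(self, session) -> int:
        # V_BUDGETS is a hypothetical view name used only for illustration.
        rows = session.sql("SELECT * FROM APP.V_BUDGETS")
        return len(list(rows))
```

The appeal of this pattern is that adding a plugin is purely additive: defining the subclass is enough to make it discoverable by name, with no central list to edit.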
## Semantic Conventions

IMPORTANT: Before contributing code, ensure you understand these naming rules.

- Case: ALWAYS use `snake_case`
- Prefix: Custom fields SHOULD start with `snowflake.`
- Units: AVOID measurement units in names (e.g., use `duration`, not `duration_ms`)
- Boolean: Must use the `is_` or `has_` prefix
- No Suffix: DO NOT use the `.count` suffix (it is implied for counters)
- Structure: Use dots (`.`) to denote object hierarchy (e.g., `snowflake.table.name`)
- No Objects: Avoid reporting raw OBJECT fields; expand them into specific metrics/attributes
- Existing Semantics: Use existing OpenTelemetry or Dynatrace semantics when they express the same field
- Dimensionality: DO NOT encode dimensionality information in field names
- Consistency: DO NOT produce different names for the same metric depending on the dimension set
- Extension Name: DO NOT include the extension name or technology as part of the field name
- Singular/Plural: Use singular and plural properly to reflect the field content
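Several of these rules are mechanical enough to lint automatically. The sketch below is a partial, hypothetical check covering only a few of the conventions above (case, unit suffixes, and the `.count` suffix):

```python
import re

# Unit suffixes that should not appear in field names (partial list).
UNIT_SUFFIXES = ("_ms", "_sec", "_seconds", "_bytes", "_kb", "_mb")


def check_field_name(name: str) -> list:
    """Return a list of naming-convention violations for a candidate field."""
    problems = []
    for part in name.split("."):
        if not re.fullmatch(r"[a-z][a-z0-9_]*", part):
            problems.append(f"'{part}' is not snake_case")
    if name.endswith(".count"):
        problems.append("'.count' suffix is implied for counters")
    if name.endswith(UNIT_SUFFIXES):
        problems.append("measurement units do not belong in field names")
    return problems
```

For example, `snowflake.table.name` passes, while `snowflake.query.duration_ms` is flagged for carrying a unit in its name.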
- Names: All Snowflake objects in `DTAGENT_DB` must be UPPERCASE (e.g., `DTAGENT_DB.APP.MY_VIEW`). Lowercase names will not be caught during custom tag deployment, causing objects to initialize in the default `DTAGENT_DB`.
- Permissions:
  - Tables/Views: Grant `SELECT` to `DTAGENT_VIEWER`
  - Procedures: Grant `USAGE` to `DTAGENT_VIEWER`
  - Tasks: Grant `OWNERSHIP` to `DTAGENT_VIEWER`
- Safety: Procedures should include `EXCEPTION` handling blocks
- Boolean Returns: Avoid returning boolean values from stored procedures; make return values more descriptive (e.g., by specifying affected tables)
- Table Initialization: Avoid using `CREATE OR REPLACE TABLE` in stored procedures; initialize tables before procedures and truncate them inside
- Log Messages: For views reported as logs, include a `_MESSAGE` column (automatically mapped to the `content` field)
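The UPPERCASE rule can be spot-checked mechanically before deployment. A rough sketch follows; the helper and its regex are hypothetical and deliberately simplified (they only look at fully qualified `DTAGENT_DB.<schema>.<object>` references):

```python
import re

# Matches DTAGENT_DB.<schema>.<object> references, case-insensitively.
_OBJ_REF = re.compile(
    r"\bDTAGENT_DB\.[A-Za-z0-9_]+\.[A-Za-z0-9_]+", re.IGNORECASE
)


def lowercase_object_refs(sql_text: str) -> list:
    """Return DTAGENT_DB object references that are not fully UPPERCASE."""
    return [ref for ref in _OBJ_REF.findall(sql_text) if ref != ref.upper()]
```

Running a check like this over the `*.sql` folders surfaces lowercase names before they silently land in the default `DTAGENT_DB`.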
OpenTelemetry defines 3 metric types:
- Counters: Values that accumulate (e.g., total requests)
- Gauges: Point-in-time measurements (e.g., current temperature)
- Histograms: Distributions for aggregation (e.g., request duration)
Important: Since the Dynatrace API only recognizes counters and gauges, and there are no examples of counter metrics in the current implementation, all metrics are currently sent as gauges until further improvements.
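The practical difference between the two behaviors can be made concrete in plain Python (illustrative only; this is not OpenTelemetry API code):

```python
class Counter:
    """Accumulates: each report adds to a running total (e.g., total requests)."""

    def __init__(self):
        self.value = 0

    def add(self, amount):
        self.value += amount


class Gauge:
    """Point-in-time: each report replaces the last value (e.g., running queries)."""

    def __init__(self):
        self.value = None

    def set(self, amount):
        self.value = amount
```

Reporting 3 and then 2 yields 5 for a counter but 2 for a gauge, which is why sending counter-like data as gauges changes how values must be aggregated downstream.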
## Source Code Overview

- `src/dtagent`: Python source code
- `src/dtagent.sql`: Core SQL init scripts (Roles, DBs, Warehouses)
- `src/dtagent.conf`: Default configuration and core semantics
- `src/dtagent/plugins`: Source code for all plugins (Python + SQL + Config)
- `src/dtagent/otel`: Telemetry API Python code
- `scripts/dev`: Tools for building, compiling, and testing
- `scripts/deploy`: Tools for deploying the agent to a Snowflake account
- `scripts/tools`: Utility scripts for config and dashboard format conversion
SQL files use three-digit prefixes to enforce execution order:
- `0xx` - Core initialization + plugin-specific procedures and views
- `70x` - Core procedures
- `80x` - Task definitions
- `90x` - Plugin-specific update procedures
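The resulting execution order amounts to a numeric sort on the prefix; a tiny sketch (the `NNN_name.sql` file-name pattern is an assumption for illustration):

```python
def execution_order(sql_files):
    """Sort SQL file names by their leading three-digit numeric prefix."""
    return sorted(sql_files, key=lambda f: int(f[:3]))
```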
SQL scripts support annotation blocks for conditional inclusion:
```sql
--%PLUGIN:plugin_name:
-- Code included only when plugin is enabled
--%:PLUGIN:plugin_name

--%OPTION:option_name:
-- Code included only when optional component is enabled
--%:OPTION:option_name
```

Supported options:

- `dtagent_admin` - Admin role code (excluded when `core.snowflake.roles.admin` is `"-"`)
- `resource_monitor` - Resource monitor code (excluded when `core.snowflake.resource_monitor.name` is `"-"`)
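The effect of these annotation blocks can be sketched as a line filter. The real deploy scripts implement this in Bash; the simplified Python version below handles only the `--%PLUGIN` markers:

```python
def filter_sql(text: str, enabled_plugins: set) -> str:
    """Drop --%PLUGIN blocks whose plugin is not in enabled_plugins."""
    out, skipping = [], None
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("--%PLUGIN:") and stripped.endswith(":"):
            name = stripped[len("--%PLUGIN:"):-1]
            if name not in enabled_plugins:
                skipping = name
            continue  # marker lines themselves are always dropped
        if stripped.startswith("--%:PLUGIN:"):
            if skipping == stripped[len("--%:PLUGIN:"):]:
                skipping = None
            continue
        if skipping is None:
            out.append(line)
    return "\n".join(out)
```

With `{"budgets"}` enabled, the body of a `--%PLUGIN:budgets:` block survives while a `--%PLUGIN:tasks:` block is stripped from the deployed SQL.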
## Pull Request Checklist

Before submitting a PR, please ensure:

- You have run `./scripts/dev/build.sh` successfully
- You have added tests for any new functionality
- All tests pass locally (`pytest` and `./test/bash/run_tests.sh`)
- Documentation (`README.md`, `PLUGIN_DEVELOPMENT.md`, etc.) is updated if needed
- If adding a plugin, `instruments-def.yml` is defined and valid
- Code follows the Semantic Conventions
- If changing SQL objects, all names are UPPERCASE
- If adding new semantic fields, they follow the naming rules
- Documentation has been rebuilt (`./scripts/dev/build_docs.sh`) if needed
