2 changes: 1 addition & 1 deletion .gitignore
@@ -8,7 +8,7 @@ snowflake.local.yml
output/**
log.*
*.log
credentials.json
*credentials*.json
*.pyc
build/*
package/*
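The widened `.gitignore` pattern now catches any credentials-file variant, not just `credentials.json`. A quick sanity check of the glob semantics (illustrative only; gitignore matching differs slightly from Python's `fnmatch` for nested paths):

```python
from fnmatch import fnmatchcase

# Names the broadened `*credentials*.json` pattern should now match
names = ["credentials.json", "my_credentials.json", "credentials_dev.json"]
matches = [fnmatchcase(n, "*credentials*.json") for n in names]
```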
3 changes: 3 additions & 0 deletions .vscode/settings.json
@@ -48,6 +48,7 @@
"doesn't",
"dsoa",
"DTAGENT",
"Dynatrace",
"ECAC",
"ensurepath",
"externalbrowser",
@@ -62,6 +63,7 @@
"jsonstrip",
"judgements",
"Kamatchi",
"kvlist",
"LDATA",
"libcairo",
"libgdk",
@@ -78,6 +80,7 @@
"multitenancy",
"mysnowflake",
"Neev",
"openpipeline",
"pandoc",
"pango",
"parseable",
14 changes: 9 additions & 5 deletions INSTALL.md
@@ -80,11 +80,14 @@ The `$file_prefix` parameter is optional and can take multiple values:

You should store the Access Token for your Dynatrace tenant (to which you want to send telemetry from your environment) as the environment variable `DTAGENT_TOKEN`. The token should have the following scopes enabled:

* `logs.ingest`
* `metrics.ingest`
* `events.ingest`
* `bizevents.ingest`
* `openTelemetryTrace.ingest`
| Scope ID | Scope Name | Comment |
|-----------------------------|------------------------------|------------------------------|
| `logs.ingest` | Ingest Logs | |
| `metrics.ingest` | Ingest Metrics | |
| `bizevents.ingest` | Ingest BizEvents | |
| `openpipeline.events` | OpenPipeline - Ingest Events | |
| `openTelemetryTrace.ingest` | Ingest OpenTelemetry Traces | |
| `events.ingest` | Ingest Events | Not required for version >= 0.9.1 |
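A token with these scopes is passed as an `Api-Token` authorization header on each ingest call. A minimal request-building sketch (the tenant address and token are placeholders; `/api/v2/logs/ingest` is the Dynatrace Logs Ingestion v2 path):

```python
import json
import urllib.request

def build_ingest_request(tenant: str, token: str, path: str, payload) -> urllib.request.Request:
    """Build a Dynatrace ingest request; caller supplies the tenant address and DTAGENT_TOKEN value."""
    return urllib.request.Request(
        f"https://{tenant}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Api-Token {token}",  # token taken from DTAGENT_TOKEN
            "Content-Type": "application/json; charset=utf-8",
        },
        method="POST",
    )

# Placeholder tenant and token, for illustration only
req = build_ingest_request(
    "abc12345.live.dynatrace.com", "dt0c01.EXAMPLE",
    "/api/v2/logs/ingest", [{"content": "hello from dsoa"}],
)
```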

We **strongly** recommend ensuring your token is not recorded in shell script history; below is an example of how to define the `DTAGENT_TOKEN` environment variable on Linux or WSL:

@@ -98,6 +101,7 @@ If you do not set the `DTAGENT_TOKEN` environment variable, or if it does not co

* The Dynatrace Snowflake Observability Agent deployment process **WILL NOT** send self-monitoring BizEvents to your Dynatrace tenant to mark the start and finish of the deployment process.
* The deployment process *will not be able* to set `DTAGENT_API_KEY` when deploying the complete configuration (`./deploy.sh $config_name`) or when updating just the API key (`./deploy.sh $config_name apikey`). In these cases, **YOU WILL** be prompted to provide the correct `DTAGENT_TOKEN` value during deployment.
* The deployment process *will not be able* to send BizEvents to your Dynatrace tenant to mark the start and finish of the deployment process.

No additional objects need to be provided for the deployment process on the Snowflake side. Dynatrace Snowflake Observability Agent will build a database to store its information - `DTAGENT_DB` by default or `DTAGENT_{TAG}_DB` if a tag is provided (see [Multitenancy](#multitenancy)).
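The naming convention can be sketched as a small helper (hypothetical function name; the actual deployment scripts derive the name themselves):

```python
from typing import Optional

def dtagent_db_name(tag: Optional[str] = None) -> str:
    """Derive the agent database name: DTAGENT_DB, or DTAGENT_{TAG}_DB when a tag is set."""
    return f"DTAGENT_{tag.upper()}_DB" if tag else "DTAGENT_DB"

default_name = dtagent_db_name()
tagged_name = dtagent_db_name("test")
```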

4 changes: 3 additions & 1 deletion compile.sh
@@ -44,9 +44,11 @@ process_files() {
local src_file=$1
local dest_file=$2

echo "# pylint: disable=W0404, W0105, C0302, C0412, C0413" > "$dest_file"

gawk 'match($0, /[#]{2}INSERT (.+)/, a) {system("sed -e \"1,/##endregion COMPILE_REMOVE/d\" "a[1]); next } 1' "$src_file" |
sed -e '/##region.* IMPORTS/,/##endregion COMPILE_REMOVE/d' |
grep -v '# COMPILE_REMOVE' >"$dest_file"
grep -v '# COMPILE_REMOVE' >> "$dest_file"

if [[ "$OSTYPE" == "darwin"* ]]; then
sed -i '' -e '/dtagent/!b' -e '/import/d' "$dest_file"
4 changes: 3 additions & 1 deletion pytest.ini
@@ -2,4 +2,6 @@
pythonpath = src
addopts = --ignore-glob=**/otel_*_test.py
log_cli = false
log_cli_level = DEBUG
log_cli_level = DEBUG
markers =
xdist_group: mark test to be executed in named group
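The new `xdist_group` marker lets related tests be pinned to one pytest-xdist worker when running with `--dist loadgroup`. A usage sketch (the group name is illustrative):

```python
import pytest

@pytest.mark.xdist_group("snowflake_session")
def test_uses_shared_session():
    # Tests in the same group run on the same worker under `pytest -n auto --dist loadgroup`
    assert True
```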
9 changes: 7 additions & 2 deletions requirements.txt
@@ -27,10 +27,12 @@ toml
tzlocal
#%DEV:
streamlit
snowflake-snowpark-python
snowflake-snowpark-python==1.40.0
snowflake-connector-python[secure-local-storage]
opentelemetry-api==1.26.0
opentelemetry-sdk==1.26.0
opentelemetry-exporter-otlp-proto-http==1.26.0
opentelemetry-proto==1.26.0
snowflake-core==1.5.1
uuid
jsonstrip
@@ -40,7 +42,10 @@ inflect
markdown2
weasyprint
pyyaml
# github tests
pylint
flake8
pylint
black
sqlfluff
yamllint
#%:DEV
10 changes: 9 additions & 1 deletion src/dtagent.conf/otel-config.json
@@ -1,6 +1,10 @@
{
"OTEL": {
"MAX_CONSECUTIVE_API_FAILS": 10,
"LOGS": {
"EXPORT_TIMEOUT_MILLIS": 10000,
"MAX_EXPORT_BATCH_SIZE": 100
},
"SPANS": {
"EXPORT_TIMEOUT_MILLIS": 10000,
"MAX_EXPORT_BATCH_SIZE": 50,
@@ -14,7 +18,11 @@
},
"EVENTS": {
"MAX_RETRIES": 5,
"RETRY_DELAY": 10000
"RETRY_DELAY_MS": 10000
},
"DAVIS_EVENTS": {
"MAX_RETRIES": 5,
"RETRY_DELAY_MS": 10000
},
"BIZ_EVENTS": {
"MAX_RETRIES": 5,
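The renamed `RETRY_DELAY_MS` key makes the unit explicit. A sketch of how such `MAX_RETRIES`/`RETRY_DELAY_MS` settings are typically consumed (structure assumed from the config above, not taken from the agent's code):

```python
import time

def send_with_retries(send, payload, max_retries: int = 5, retry_delay_ms: int = 10000) -> bool:
    """Call `send` up to `max_retries` times, sleeping retry_delay_ms between attempts."""
    for attempt in range(max_retries):
        if send(payload):
            return True
        if attempt < max_retries - 1:
            time.sleep(retry_delay_ms / 1000.0)
    return False

# Simulated endpoint that fails twice, then succeeds
calls = {"n": 0}
def flaky(_payload):
    calls["n"] += 1
    return calls["n"] >= 3

ok = send_with_retries(flaky, {"event": "x"}, max_retries=5, retry_delay_ms=1)
```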
28 changes: 18 additions & 10 deletions src/dtagent/__init__.py
@@ -35,8 +35,9 @@
from dtagent.otel.otel_manager import OtelManager
from dtagent.otel.spans import Spans
from dtagent.otel.metrics import Metrics
from dtagent.otel.events import Events
from dtagent.otel.bizevents import BizEvents
from dtagent.otel.events.generic import GenericEvents
from dtagent.otel.events.davis import DavisEvents
from dtagent.otel.events.bizevents import BizEvents
from dtagent.context import get_context_by_name
from dtagent.util import get_now_timestamp_formatted, is_regular_mode

@@ -103,7 +104,8 @@ def __init__(self, session: snowpark.Session) -> None:
self._spans = self._get_spans(resource)
self._metrics = self._get_metrics()
self._events = self._get_events()
self._bizevents = self._get_bizevents()
self._davis_events = self._get_davis_events()
self._biz_events = self._get_biz_events()
self._set_max_consecutive_fails()

def _get_spans(self, resource: Resource) -> Spans:
@@ -118,11 +120,15 @@ def _get_metrics(self) -> Metrics:
"""Returns new Metrics instance"""
return Metrics(self._instruments, self._configuration)

def _get_events(self) -> Events:
def _get_events(self) -> GenericEvents:
"""Returns new Events instance"""
return Events(self._configuration)
return GenericEvents(self._configuration)

def _get_bizevents(self) -> BizEvents:
def _get_davis_events(self) -> DavisEvents:
"""Returns new Events instance"""
return DavisEvents(self._configuration)

def _get_biz_events(self) -> BizEvents:
"""Returns new BizEvents instance"""
return BizEvents(self._configuration)

@@ -144,13 +150,15 @@ def report_execution_status(self, status: str, task_name: str, exec_id: str, det
"dsoa.task.exec.status": str(status),
}

self._bizevents.report_via_api(
context=get_context_by_name("self-monitoring"),
bizevents_sent = self._biz_events.report_via_api(
query_data=[data_dict | (details_dict or {})],
event_type="dsoa.task",
query_data=[data_dict if details_dict is None else data_dict | details_dict],
context=get_context_by_name("self-monitoring"),
is_data_structured=False,
)
self._bizevents.flush_events()
bizevents_sent += self._biz_events.flush_events()
if bizevents_sent == 0:
LOG.warning("Unable to report task execution status via BizEvents: %s", str(data_dict))

def _set_max_consecutive_fails(self):
OtelManager.set_max_fail_count(self._configuration.get("max_consecutive_api_fails", context="otel", default_value=10))
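The `MAX_CONSECUTIVE_API_FAILS` guard set via `OtelManager.set_max_fail_count` can be pictured as a counter that trips after N back-to-back failures and resets on any success (a sketch; the real `OtelManager` implementation is not shown in this diff):

```python
class ConsecutiveFailGate:
    """Trip after `max_fails` consecutive failures; any success resets the counter."""

    def __init__(self, max_fails: int = 10):
        self.max_fails = max_fails
        self.fails = 0

    def record(self, success: bool) -> bool:
        """Record one API call outcome; return True while the gate is still open."""
        self.fails = 0 if success else self.fails + 1
        return self.fails < self.max_fails

gate = ConsecutiveFailGate(max_fails=3)
# Two failures, one success (reset), then three failures in a row trip the gate
results = [gate.record(ok) for ok in (False, False, True, False, False, False)]
```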
10 changes: 6 additions & 4 deletions src/dtagent/agent.py
@@ -78,8 +78,10 @@
##INSERT src/dtagent/otel/spans.py
##INSERT src/dtagent/otel/metrics.py
##INSERT src/dtagent/otel/logs.py
##INSERT src/dtagent/otel/events.py
##INSERT src/dtagent/otel/bizevents.py
##INSERT src/dtagent/otel/events/__init__.py
##INSERT src/dtagent/otel/events/generic.py
##INSERT src/dtagent/otel/events/davis.py
##INSERT src/dtagent/otel/events/bizevents.py
##INSERT src/dtagent/plugins/*.py
##INSERT src/dtagent/__init__.py

@@ -109,7 +111,7 @@ def process(self, sources: List, run_proc: bool = True) -> Dict:
self.report_execution_status(status="STARTED", task_name=source, exec_id=exec_id)

if is_regular_mode(self._session):
self._session.query_tag = f"dsoa.version:{ str(VERSION) }.plugin:{ c_source.__name__ }.{ exec_id }"
self._session.query_tag = f"dsoa.version:{str(VERSION)}.plugin:{c_source.__name__}.{exec_id}"

if inspect.isclass(c_source):
#
@@ -123,7 +125,7 @@ def process(self, sources: List, run_proc: bool = True) -> Dict:
metrics=self._metrics,
configuration=self._configuration,
events=self._events,
bizevents=self._bizevents,
bizevents=self._biz_events,
).process(run_proc)
#
self.report_execution_status(status="FINISHED", task_name=source, exec_id=exec_id)
20 changes: 11 additions & 9 deletions src/dtagent/config.py
@@ -1,4 +1,4 @@
"""File contatning Configuration class and methods"""
"""File with Configuration class and methods"""

##region ------------------------------ IMPORTS -----------------------------------------
#
@@ -77,28 +77,29 @@ def __init__(self, session: snowpark.Session) -> Dict:
},
...
},
'dimesion_sets': {
'dimension_sets': {
'set1': [], ...
}
}
}
"""
from dtagent.otel.metrics import Metrics # COMPILE_REMOVE
from dtagent.otel.events import Events # COMPILE_REMOVE
from dtagent.otel.bizevents import BizEvents # COMPILE_REMOVE
from dtagent.otel.events.generic import GenericEvents # COMPILE_REMOVE
from dtagent.otel.events.davis import DavisEvents # COMPILE_REMOVE
from dtagent.otel.events.bizevents import BizEvents # COMPILE_REMOVE
from dtagent.otel.logs import Logs # COMPILE_REMOVE
from dtagent.otel.spans import Spans # COMPILE_REMOVE

def __rewrite_with_types(config_df: dict) -> dict:
"""
This function rewrites the pandas dataframe with config to a dict type and assigns desired types to fields.
List format in configuration table should be as follows to work properly:
List values must start with `[` and end with `]`, all fields must be seperated with `, `. Values within the list should be enclosed in double quotes (").
List values must start with `[` and end with `]`, all fields must be separated with `, `. Values within the list should be enclosed in double quotes (").
All items within the list must be the same type.
Args:
config_df (dict) - pandas dataframe with configuration table contents
Returns:
processed_dict (dict) - dictionary with refromatted field types
processed_dict (dict) - dictionary with reformatted field types
"""
import builtins

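The list format documented in the docstring above is JSON-compatible for flat lists of double-quoted values, so parsing it can be sketched as follows (hypothetical helper name; the agent's own parser may differ):

```python
import json

def parse_config_list(value: str) -> list:
    """Parse the documented format: starts with '[', ends with ']', double-quoted, same-typed items."""
    value = value.strip()
    if not (value.startswith("[") and value.endswith("]")):
        raise ValueError(f"not a list literal: {value!r}")
    items = json.loads(value)
    if len({type(i) for i in items}) > 1:
        raise ValueError("all items within the list must be the same type")
    return items

parsed = parse_config_list('["a", "b", "c"]')
```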
@@ -168,8 +169,9 @@ def __unpack_prefixed_keys(config: dict, prefix: Optional[str] = None) -> dict:
"logs.http": f"https://{config_dict['core.dynatrace_tenant_address']}{Logs.ENDPOINT_PATH}",
"spans.http": f"https://{config_dict['core.dynatrace_tenant_address']}{Spans.ENDPOINT_PATH}",
"metrics.http": f"https://{config_dict['core.dynatrace_tenant_address']}{Metrics.ENDPOINT_PATH}",
"events.http": f"https://{config_dict['core.dynatrace_tenant_address']}{Events.ENDPOINT_PATH}",
"bizevents.http": f"https://{config_dict['core.dynatrace_tenant_address']}{BizEvents.ENDPOINT_PATH}",
"events.http": f"https://{config_dict['core.dynatrace_tenant_address']}{GenericEvents.ENDPOINT_PATH}",
"davis_events.http": f"https://{config_dict['core.dynatrace_tenant_address']}{DavisEvents.ENDPOINT_PATH}",
"biz_events.http": f"https://{config_dict['core.dynatrace_tenant_address']}{BizEvents.ENDPOINT_PATH}",
"resource.attributes": Configuration.RESOURCE_ATTRIBUTES
| {
"service.name": _get_service_name(config_dict),
@@ -193,7 +195,7 @@ def get(
otel_module: Optional[str] = None,
default_value: Optional[Any] = None,
) -> any:
"""Returns configuraiton value for the given key in either given context or for the given plugin name
"""Returns configuration value for the given key in either given context or for the given plugin name

Args:
key (str): Configuration key for which to return value