CHANGELOG.md (4 additions, 0 deletions)
@@ -9,6 +9,10 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
[Please read through the Keep a Changelog (~5min)](https://keepachangelog.com/en/1.0.0/).

+## [v0.0.4] - 2023-10-09
+### Added
+- New option for Event Hub configuration: the `source_details` key `eventhub.accessKeySecretName`, used to construct `eh_shared_key_value` properly. Without this option, connections to the Event Hub service failed (see [issue #13 - java.lang.RuntimeException: non-nullable field authBytes was serialized as null](https://github.com/databrickslabs/dlt-meta/issues/13)).
+
## [v0.0.3] - 2023-06-07
### Fixed
- infer datatypes from sequence_by to __START_AT, __END_AT for apply changes API

5. Run the integration tests against cloudfiles, eventhub, or kafka using the options below:
5a. Run the command for cloudfiles ```python integration-tests/run-integration-test.py --cloud_provider_name=aws --dbr_version=11.3.x-scala2.12 --source=cloudfiles --dbfs_path=dbfs:/tmp/DLT-META/```
-5b. Run the command for eventhub ```python integration-tests/run-integration-test.py --cloud_provider_name=azure --dbr_version=11.3.x-scala2.12 --source=eventhub --dbfs_path=dbfs:/tmp/DLT-META/ --eventhub_name=iot --eventhub_secrets_scope_name=eventhubs_creds --eventhub_namespace=int_test-standard --eventhub_port=9093 --eventhub_producer_accesskey_name=producer --eventhub_consumer_accesskey_name=consumer```
+5b. Run the command for eventhub ```python integration-tests/run-integration-test.py --cloud_provider_name=azure --dbr_version=11.3.x-scala2.12 --source=eventhub --dbfs_path=dbfs:/tmp/DLT-META/ --eventhub_name=iot --eventhub_secrets_scope_name=eventhubs_creds --eventhub_namespace=int_test-standard --eventhub_port=9093 --eventhub_producer_accesskey_name=producer --eventhub_producer_accesskey_secret_name=producer --eventhub_consumer_accesskey_name=consumer --eventhub_consumer_accesskey_secret_name=consumer```
For eventhub integration tests, the following are the prerequisites:
4. Provide Databricks secret scope name : --eventhub_secrets_scope_name
5. Provide eventhub producer access key name : --eventhub_producer_accesskey_name
-6. Provide eventhub access key name : --eventhub_consumer_accesskey_name
+6. Provide eventhub consumer access key name : --eventhub_consumer_accesskey_name
+7. Provide eventhub producer access key secret name : --eventhub_producer_accesskey_secret_name
+8. Provide eventhub consumer access key secret name : --eventhub_consumer_accesskey_secret_name (see the sketch after this list)
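
A minimal sketch, assuming a Databricks context where `dbutils` is available, of how a test runner could resolve the two key pairs above. This is not the project's actual code: the fallback mirrors the "Optional if same as ..." wording of the CLI help, and the scope/key values echo the 5b example command.

```python
# Sketch only: resolve the Event Hubs shared access keys named by the flags
# above. The secret *name* defaults to the access key (policy) name when no
# separate --eventhub_*_accesskey_secret_name flag is supplied.
def resolve_access_key(scope_name, accesskey_name, accesskey_secret_name=None):
    secret_name = accesskey_secret_name or accesskey_name
    # dbutils.secrets.get is the standard Databricks secrets lookup.
    return dbutils.secrets.get(scope=scope_name, key=secret_name)

# Values taken from the 5b example command above.
producer_key = resolve_access_key("eventhubs_creds", "producer", "producer")
consumer_key = resolve_access_key("eventhubs_creds", "consumer", "consumer")
```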
5c. Run the command for kafka ```python3 integration-tests/run-integration-test.py --cloud_provider_name=aws --dbr_version=11.3.x-scala2.12 --source=kafka --dbfs_path=dbfs:/tmp/DLT-META/ --kafka_topic_name=dlt-meta-integration-test --kafka_broker=host:9092```

docs/content/getting_started/metadatapreperation.md (1 addition, 1 deletion)
@@ -17,7 +17,7 @@ draft: false
| data_flow_id | This is a unique identifier for the pipeline |
| data_flow_group | This is a group identifier for launching multiple pipelines under a single DLT pipeline |
| source_format | Source format e.g `cloudFiles`, `eventhub`, `kafka`, `delta`|
-| source_details | This map type captures all source details: for cloudfiles = `source_schema_path`, `source_path_{env}`, `source_database`; for eventhub = `source_schema_path`, `eventhub.accessKeyName`, `eventhub.name`, `eventhub.secretsScopeName`, `kafka.sasl.mechanism`, `kafka.security.protocol`, `eventhub.namespace`, `eventhub.port`. For the source schema file, Spark DDL schema format parsing is supported. <br> For a custom schema format, write a schema-parsing function `bronze_schema_mapper(schema_file_path, spark):Schema` and provide it at `OnboardDataflowspec` initialization, e.g. `onboardDataFlowSpecs = OnboardDataflowspec(spark, dict_obj, bronze_schema_mapper).onboardDataFlowSpecs()` |
+| source_details | This map type captures all source details: for cloudfiles = `source_schema_path`, `source_path_{env}`, `source_database`; for eventhub = `source_schema_path`, `eventhub.accessKeyName`, `eventhub.accessKeySecretName`, `eventhub.name`, `eventhub.secretsScopeName`, `kafka.sasl.mechanism`, `kafka.security.protocol`, `eventhub.namespace`, `eventhub.port`. For the source schema file, Spark DDL schema format parsing is supported. <br> For a custom schema format, write a schema-parsing function `bronze_schema_mapper(schema_file_path, spark):Schema` and provide it at `OnboardDataflowspec` initialization, e.g. `onboardDataFlowSpecs = OnboardDataflowspec(spark, dict_obj, bronze_schema_mapper).onboardDataFlowSpecs()` (see the sketch after this table) |
| bronze_database_{env} | Delta lake bronze database name. |
| bronze_table | Delta lake bronze table name |
| bronze_reader_options | Reader options which can be provided to the spark reader <br> e.g. multiline=true,header=true in json format |
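
To make the eventhub rows concrete, here is a hedged sketch of a `source_details` map and of how a reader could turn it into Spark Kafka options. The key names come from the table above; every value is an illustrative placeholder, `dbutils` is the Databricks secrets utility, and the connection-string construction follows the standard Event Hubs-over-Kafka pattern rather than this project's exact internals.

```python
# Illustrative source_details map; key names come from the table above,
# values are placeholders only.
source_details = {
    "source_schema_path": "dbfs:/tmp/DLT-META/schemas/iot.ddl",
    "eventhub.name": "iot",
    "eventhub.namespace": "int_test-standard",
    "eventhub.port": "9093",
    "eventhub.secretsScopeName": "eventhubs_creds",
    "eventhub.accessKeyName": "consumer",
    "eventhub.accessKeySecretName": "consumer",  # new in v0.0.4
    "kafka.sasl.mechanism": "PLAIN",
    "kafka.security.protocol": "SASL_SSL",
}

def eventhub_kafka_options(details):
    """Sketch: build Spark Kafka reader options for Azure Event Hubs."""
    # The secret *name* may differ from the shared-access-policy *name*;
    # that distinction is exactly what eventhub.accessKeySecretName adds.
    secret_name = details.get("eventhub.accessKeySecretName",
                              details["eventhub.accessKeyName"])
    eh_shared_key_value = dbutils.secrets.get(
        scope=details["eventhub.secretsScopeName"], key=secret_name)
    conn_str = (
        f"Endpoint=sb://{details['eventhub.namespace']}.servicebus.windows.net/;"
        f"SharedAccessKeyName={details['eventhub.accessKeyName']};"
        f"SharedAccessKey={eh_shared_key_value};"
        f"EntityPath={details['eventhub.name']}"
    )
    bootstrap = (f"{details['eventhub.namespace']}.servicebus.windows.net:"
                 f"{details['eventhub.port']}")
    return {
        "kafka.bootstrap.servers": bootstrap,
        "kafka.sasl.mechanism": details["kafka.sasl.mechanism"],
        "kafka.security.protocol": details["kafka.security.protocol"],
        # Databricks ships a shaded Kafka client, hence the class prefix.
        "kafka.sasl.jaas.config": (
            "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule "
            f'required username="$ConnectionString" password="{conn_str}";'
        ),
        "subscribe": details["eventhub.name"],
    }
```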
parser.add_argument("--eventhub_producer_accesskey_name", help="Provide access key that has write permission on the eventhub e.g --eventhub_producer_accesskey_name=iotProducerAccessKey")
parser.add_argument("--eventhub_consumer_accesskey_name", help="Provide access key that has read permission on the eventhub e.g --eventhub_consumer_accesskey_name=iotConsumerAccessKey")
+parser.add_argument("--eventhub_producer_accesskey_secret_name", help="Provide the name of the secret that stores the access key with write permission on the eventhub. Optional if same as `eventhub_producer_accesskey_name`, e.g. --eventhub_producer_accesskey_secret_name=iotProducerAccessKey")
+parser.add_argument("--eventhub_consumer_accesskey_secret_name", help="Provide the name of the secret that stores the access key with read permission on the eventhub. Optional if same as `eventhub_consumer_accesskey_name`, e.g. --eventhub_consumer_accesskey_secret_name=iotConsumerAccessKey")