Some example configurations are provided in `example/`.
## SIEM configuration
### Azure Log Analytics workspace
To send your Dashlane audit logs to a Log Analytics workspace on Azure, you can use the template provided in the dashlane-audit-logs repository. The template creates a container instance that automatically pulls and runs the Dashlane Docker image and sends the logs to a **ContainerInstanceLog_CL** table in the Log Analytics workspace of your choice. Before deploying the template, you will have to provide:
- The location where you want your container to run (ex: "West Europe")
- Your Dashlane credentials
- Your Log Analytics Workspace ID and Shared Key
> **Click on the button to start the deployment**
>
> [](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FDashlane%2Fdashlane-audit-logs%2Fmain%2FAzureTemplates%2FLog%20Analytics%20Workspace%2Fazuredeploy.json)
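
If you prefer the command line to the portal button, the same template can also be deployed with the Azure CLI. This is a minimal sketch, not part of the official instructions: the resource group name and location below are illustrative placeholders, and `az deployment group create` will prompt interactively for the template parameters listed above (Dashlane credentials, workspace ID, and shared key).

```shell
# Sketch: deploy the Log Analytics template with the Azure CLI.
# Assumes you are already logged in with `az login`.
# "dashlane-logs-rg" and "westeurope" are placeholder values.
az group create --name dashlane-logs-rg --location westeurope

# Deploy the template straight from the repository; missing
# parameters are prompted for interactively.
az deployment group create \
  --resource-group dashlane-logs-rg \
  --template-uri "https://raw.githubusercontent.com/Dashlane/dashlane-audit-logs/main/AzureTemplates/Log%20Analytics%20Workspace/azuredeploy.json"
```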
### Azure Blob Storage
If you want to send your logs to an Azure storage account, you can use the deployment template we provide in the dashlane-audit-logs repository, which will:
- Create a storage account and a file share to upload a custom FluentBit configuration file
- Create a container instance running the Docker image with your custom file
You will need:
- Your Dashlane credentials
- A custom FluentBit configuration file
> **Click on the button to start the deployment**
>
> [](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FDashlane%2Fdashlane-audit-logs%2Fmain%2FAzureTemplates%2FBlob%20storage%2Fazuredeploy.json)
Once your container is deployed, copy the following configuration into a file called `fluent-bit.conf`.
```
[INPUT]
    Name stdin
    Tag  dashlane

[OUTPUT]
    Name   stdout
    Match  *
    Format json_lines

[OUTPUT]
    name                  azure_blob
    match                 *
    account_name          ${STORAGE_ACCOUNT_NAME}
    shared_key            ${ACCESS_KEY}
    container_name        audit-logs
    auto_create_container on
    tls                   on
    blob_type             blockblob
```
Then upload the file to the storage account you just created. In the Azure Portal, go to **Storage accounts**, select the one you just created, go to **File shares**, select **fluentbit-configuration**, and upload your configuration file.
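
The upload can also be scripted with the Azure CLI. This is a hedged sketch rather than part of the official procedure: the share name matches the one created by the template, while `STORAGE_ACCOUNT` and `ACCESS_KEY` are placeholder environment variables you must set to your own storage account name and access key.

```shell
# Sketch: upload fluent-bit.conf to the "fluentbit-configuration" file share.
# STORAGE_ACCOUNT and ACCESS_KEY are placeholders for your own values.
az storage file upload \
  --account-name "$STORAGE_ACCOUNT" \
  --account-key "$ACCESS_KEY" \
  --share-name fluentbit-configuration \
  --source fluent-bit.conf
```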
> The `blob_type` setting creates a separate blob for every log entry in the storage account, which makes the logs easier to manipulate in any post-processing.
> The configuration provided above is meant to work out of the box, but it can be customized to suit your needs. Refer to FluentBit's documentation for all available options: https://docs.fluentbit.io/manual/pipeline/outputs/azure_blob
## Splunk
If you want to send your logs to Splunk, you need to create a HEC (HTTP Event Collector) on your Splunk instance. As an example, we will show here how to create one on a Splunk Cloud instance.
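
For reference, once you have an HEC token, a FluentBit output similar to the `azure_blob` one above can forward the logs to Splunk using FluentBit's `splunk` output plugin. This fragment is a sketch, not the documented setup: the host, port, and token values are placeholders, and you should check your own instance's HEC endpoint (Splunk Cloud typically exposes a dedicated HEC hostname).

```
[OUTPUT]
    Name         splunk
    Match        *
    Host         <your-hec-endpoint>
    Port         8088
    Splunk_Token <your-hec-token>
    TLS          On
```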