
Commit c710bfd

Merge pull request #176 from dbt-labs/release-0.2.5
New Release 0.2.5 with a few fixes
2 parents 6d72a6c + 6e7f8e1 commit c710bfd

6 files changed, +98 -88 lines changed


CHANGELOG.md (+22 -10)

@@ -2,7 +2,19 @@
 
 All notable changes to this project will be documented in this file.
 
-## [Unreleased](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.2.4...HEAD)
+## [Unreleased](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.2.5...HEAD)
+
+## [0.2.5](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.2.4...v0.2.5)
+
+## Fixes
+
+- [#172](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/172): Fix issue when changing the schedule of jobs from a list of hours to an interval in a [dbtcloud_job](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/resources/job)
+- [#175](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/175): Fix issue when modifying the `environment_id` of an existing [dbtcloud_job](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/resources/job)
+- [#154](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/154): Allow the creation of [Databricks connections](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/resources/connection) using Service Tokens when it was only possible with User Tokens before
+
+## Changes
+
+- Use the `v2/users/<id>` endpoint to get the groups of a user
 
 ## [0.2.4](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.2.3...v0.2.4)
 
@@ -12,9 +24,9 @@ All notable changes to this project will be documented in this file.
 
 ## Changes
 
-- [171](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/171) Add the ability to define which environment is the production one (to be used with cross project references in dbt Cloud)
-- Add guide on how to use the Hashicorp HTTP provider
-- [174](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/174) Add the ability to assign User groups to dbt Cloud users.
+- [#171](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/171) Add the ability to define which [environment](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/resources/environment) is the production one (to be used with cross project references in dbt Cloud)
+- Add [guide](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/guides/2_leveraging_http_provider) on how to use the Hashicorp HTTP provider
+- [#174](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/174) Add the ability to assign User groups to dbt Cloud users.
 
 ## [0.2.3](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.2.2...v0.2.3)
 
@@ -25,27 +37,27 @@ All notable changes to this project will be documented in this file.
 
 ## Changes
 
-- [164](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/164) Add the ability to define `priority` and `execution_project` for BigQuery connections
-- [168](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/168) Add the ability to set up email notifications (to internal users and external email addresses) based on jobs results
+- [164](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/164) Add the ability to define `priority` and `execution_project` for [BigQuery connections](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/resources/bigquery_connection)
+- [168](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/168) Add the ability to set up [email notifications](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/resources/notification) (to internal users and external email addresses) based on jobs results
 
 ## [0.2.2](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.2.1...v0.2.2)
 
 ## Fixes
 
-- [156](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/156) Fix the `dbtcloud_connection` for Databricks when updating the `http_path` or `catalog` + add integration test
-- [157](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/157) Fix updating an environment with credentials already set + add integration test
+- [#156](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/156) Fix the `dbtcloud_connection` for Databricks when updating the `http_path` or `catalog` + add integration test
+- [#157](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/157) Fix updating an environment with credentials already set + add integration test
 
 ## Changes
 
-- Add guide to get started with the provider
+- Add [guide](https://registry.terraform.io/providers/dbt-labs/dbtcloud/latest/docs/guides/1_getting_started) to get started with the provider
 - Add missing import and fix more docs
 - Update docs template to allow using Subcategories later
 
 ## [0.2.1](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.2.0...v0.2.1)
 
 ## Changes
 
-- Resources deleted from dbt Cloud won't crash the provider and we now consider the resource as deleted, removing it from the state. This is the expected behaviour of a provider.
+- Resources deleted from dbt Cloud won't crash the provider and we now consider the resource as deleted, removing it from the state. This is the expected behavior of a provider.
 - Add examples in the docs to resources that didn't have any so far
 
 ## [0.2.0](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.1.12...v0.2.0)

pkg/dbt_cloud/connection.go (+26 -18)

@@ -2,7 +2,6 @@ package dbt_cloud
 
 import (
 	"encoding/json"
-	"errors"
 	"fmt"
 	"net/http"
 	"strconv"
@@ -52,15 +51,16 @@ type ConnectionResponse struct {
 }
 
 type Adapter struct {
-	ID             *int            `json:"id,omitempty"`
-	AccountID      int             `json:"account_id"`
-	ProjectID      int             `json:"project_id"`
-	CreatedByID    int             `json:"created_by_id"`
-	Metadata       AdapterMetadata `json:"metadata_json"`
-	State          int             `json:"state"`
-	AdapterVersion string          `json:"adapter_version"`
-	CreatedAt      *string         `json:"created_at,omitempty"`
-	UpdatedAt      *string         `json:"updated_at,omitempty"`
+	ID                      *int            `json:"id,omitempty"`
+	AccountID               int             `json:"account_id"`
+	ProjectID               int             `json:"project_id"`
+	CreatedByID             *int            `json:"created_by_id,omitempty"`
+	CreatedByServiceTokenID *int            `json:"created_by_service_token_id,omitempty"`
+	Metadata                AdapterMetadata `json:"metadata_json"`
+	State                   int             `json:"state"`
+	AdapterVersion          string          `json:"adapter_version"`
+	CreatedAt               *string         `json:"created_at,omitempty"`
+	UpdatedAt               *string         `json:"updated_at,omitempty"`
 }
 
 type AdapterMetadata struct {
@@ -104,9 +104,6 @@ func (c *Client) CreateConnection(projectID int, name string, connectionType str
 	if connectionType == "adapter" {
 		adapterId, err := c.createDatabricksAdapter(projectID, state)
 		if err != nil {
-			if strings.Contains(err.Error(), "This endpoint cannot be accessed with a service token") {
-				return nil, errors.New("to create an adapter typed connection, you need to use a user token. A service token can not be used to create adapters")
-			}
 			return nil, err
 		}
 
@@ -211,17 +208,12 @@ func (c *Client) DeleteConnection(connectionID, projectID string) (string, error
 }
 
 func (c *Client) createDatabricksAdapter(projectID int, state int) (*int, error) {
-	currentUser, err := c.GetConnectedUser()
-	if err != nil {
-		return nil, err
-	}
 
 	newAdapter := Adapter{
 		ID:             nil,
 		AdapterVersion: "databricks_v0",
 		ProjectID:      projectID,
 		AccountID:      c.AccountID,
-		CreatedByID:    currentUser.ID,
 		State:          state,
 		Metadata: AdapterMetadata{
 			Title: "Databricks",
@@ -230,6 +222,22 @@ func (c *Client) createDatabricksAdapter(projectID int, state int) (*int, error)
 		},
 	}
 
+	currentUser, err := c.GetConnectedUser()
+	if err != nil {
+		// if GetConnectedUser returns the following specific error, it means that the user is using a service token
+		// as there is no way to get the current token ID, we always use 1
+		if strings.Contains(err.Error(), "This endpoint cannot be accessed with a service token") {
+			serviceTokenID := 1
+			newAdapter.CreatedByServiceTokenID = &serviceTokenID
+		} else {
+			// if the error is different, return it
+			return nil, err
+		}
+	} else {
+		// if there is no error, the user is using a user token
+		newAdapter.CreatedByID = &currentUser.ID
+	}
+
 	newAdapterData, err := json.Marshal(newAdapter)
 	if err != nil {
 		return nil, err
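
Both creator fields on `Adapter` are now pointers tagged with `omitempty`, so the adapter payload only carries whichever one was actually set: `created_by_id` when a user token is in use, `created_by_service_token_id` when a service token is in use (the hard-coded 1 comes from the diff's own comment: there is no way to look up the current service token ID, so the provider always sends 1). Below is a minimal, self-contained sketch of that serialization behavior; `adapterSketch` is a cut-down stand-in for the provider's `Adapter` type, and the IDs 42 and 123 are made up for illustration.

package main

import (
	"encoding/json"
	"fmt"
)

// adapterSketch is a simplified stand-in for the provider's Adapter type:
// only the creator fields matter for this illustration.
type adapterSketch struct {
	AccountID               int  `json:"account_id"`
	CreatedByID             *int `json:"created_by_id,omitempty"`
	CreatedByServiceTokenID *int `json:"created_by_service_token_id,omitempty"`
}

func main() {
	userID := 42        // hypothetical user ID, as returned by GetConnectedUser
	serviceTokenID := 1 // the provider always sends 1 for service tokens

	withUserToken := adapterSketch{AccountID: 123, CreatedByID: &userID}
	withServiceToken := adapterSketch{AccountID: 123, CreatedByServiceTokenID: &serviceTokenID}

	u, _ := json.Marshal(withUserToken)
	s, _ := json.Marshal(withServiceToken)

	// Only the creator field that was set survives serialization.
	fmt.Println(string(u)) // {"account_id":123,"created_by_id":42}
	fmt.Println(string(s)) // {"account_id":123,"created_by_service_token_id":1}
}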

pkg/dbt_cloud/user.go (+1 -1)

@@ -53,7 +53,7 @@ func (c *Client) GetUser(email string) (*User, error) {
 }
 
 func (c *Client) GetConnectedUser() (*User, error) {
-	req, err := http.NewRequest("GET", fmt.Sprintf("%s/v2/whoami", c.HostURL), nil)
+	req, err := http.NewRequest("GET", fmt.Sprintf("%s/v2/whoami/", c.HostURL), nil)
 	if err != nil {
 		return nil, err
 	}

pkg/dbt_cloud/user_groups.go (+4 -44)

@@ -38,41 +38,8 @@ type AssignUserGroupsResponse struct {
 	Status ResponseStatus `json:"status"`
 }
 
-// the 2 following types are used to parse groups from /users/
-// they could be removed once we can call /users/{user_id}/
-type UserWithGroups struct {
-	ID     int          `json:"id"`
-	Email  string       `json:"email"`
-	Groups []Permission `json:"permissions"`
-}
-
-type UserListResponseWithGroups struct {
-	Data   []UserWithGroups `json:"data"`
-	Status ResponseStatus   `json:"status"`
-}
-
 func (c *Client) GetUserGroups(userId int) (*UserGroupsCurrentAccount, error) {
-
-	// TODO: the current endpoint is not working for service tokens, we could use it when it is updated
-	// in the meantime we use the /users/ endpoint and parse the groups from there
-
-	// req, err := http.NewRequest("GET", fmt.Sprintf("%s/v2/accounts/%s/users/%s/", c.HostURL, strconv.Itoa(c.AccountID), strconv.Itoa(userId)), nil)
-	// if err != nil {
-	// 	return nil, err
-	// }
-
-	// body, err := c.doRequest(req)
-	// if err != nil {
-	// 	return nil, err
-	// }
-
-	// userGroupsResponse := UserGroupsResponse{}
-	// err = json.Unmarshal(body, &userGroupsResponse)
-	// if err != nil {
-	// 	return nil, err
-	// }
-
-	req, err := http.NewRequest("GET", fmt.Sprintf("%s/v3/accounts/%s/users/?limit=1000", c.HostURL, strconv.Itoa(c.AccountID)), nil)
+	req, err := http.NewRequest("GET", fmt.Sprintf("%s/v2/accounts/%s/users/%s/", c.HostURL, strconv.Itoa(c.AccountID), strconv.Itoa(userId)), nil)
 	if err != nil {
 		return nil, err
 	}
@@ -82,23 +49,16 @@ func (c *Client) GetUserGroups(userId int) (*UserGroupsCurrentAccount, error) {
 		return nil, err
 	}
 
-	userListResponse := UserListResponseWithGroups{}
-	err = json.Unmarshal(body, &userListResponse)
+	userGroupsResponse := UserGroupsResponse{}
+	err = json.Unmarshal(body, &userGroupsResponse)
 	if err != nil {
 		return nil, err
 	}
 
-	userGroupsPermissions := []Permission{}
-	for i, user := range userListResponse.Data {
-		if user.ID == userId {
-			userGroupsPermissions = userListResponse.Data[i].Groups
-		}
-	}
-
 	// the API returns the permissions for all accounts
 	// we just want to get the ones of the current account
 	userGroupsCurrentAccount := UserGroupsCurrentAccount{}
-	for _, permission := range userGroupsPermissions {
+	for _, permission := range userGroupsResponse.Data.Permissions {
 		if permission.AccountID == c.AccountID {
 			userGroupsCurrentAccount.Groups = append(userGroupsCurrentAccount.Groups, permission.Groups...)
 		}
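
GetUserGroups now calls `v2/accounts/<account_id>/users/<user_id>/` directly instead of listing every user through `v3/accounts/<account_id>/users/?limit=1000` and scanning for the matching ID, which is also why the `UserWithGroups` helper types could be deleted. The response still carries permissions for every account the user belongs to, so the account-ID filter remains. Here is a small sketch of just that filtering step; `permission`, `group`, and `keepCurrentAccountGroups` are simplified, illustrative stand-ins, not the provider's real types.

package main

import "fmt"

// Simplified stand-ins for the provider's Permission/Group types.
type group struct{ Name string }

type permission struct {
	AccountID int
	Groups    []group
}

// keepCurrentAccountGroups mirrors the filtering loop in GetUserGroups: the
// endpoint returns permissions for every account, and only those matching the
// client's account ID are kept.
func keepCurrentAccountGroups(perms []permission, accountID int) []group {
	var groups []group
	for _, p := range perms {
		if p.AccountID == accountID {
			groups = append(groups, p.Groups...)
		}
	}
	return groups
}

func main() {
	perms := []permission{
		{AccountID: 1, Groups: []group{{Name: "Everyone"}}},
		{AccountID: 2, Groups: []group{{Name: "Analysts"}, {Name: "Admins"}}},
	}
	// Hypothetical current account ID of 2: only its groups survive.
	fmt.Println(keepCurrentAccountGroups(perms, 2))
}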

pkg/resources/job.go (+30 -7)

@@ -310,17 +310,37 @@ func resourceJobUpdate(ctx context.Context, d *schema.ResourceData, m interface{
 	c := m.(*dbt_cloud.Client)
 	jobId := d.Id()
 
-	if d.HasChange("name") || d.HasChange("dbt_version") || d.HasChange("num_threads") ||
-		d.HasChange("target_name") || d.HasChange("execute_steps") || d.HasChange("run_generate_sources") ||
-		d.HasChange("generate_docs") || d.HasChange("triggers") || d.HasChange("schedule_type") ||
-		d.HasChange("schedule_interval") || d.HasChange("schedule_hours") || d.HasChange("schedule_days") ||
-		d.HasChange("schedule_cron") || d.HasChange("deferring_job_id") || d.HasChange("self_deferring") ||
+	if d.HasChange("project_id") ||
+		d.HasChange("environment_id") ||
+		d.HasChange("name") ||
+		d.HasChange("dbt_version") ||
+		d.HasChange("num_threads") ||
+		d.HasChange("target_name") ||
+		d.HasChange("execute_steps") ||
+		d.HasChange("run_generate_sources") ||
+		d.HasChange("generate_docs") ||
+		d.HasChange("triggers") ||
+		d.HasChange("schedule_type") ||
+		d.HasChange("schedule_interval") ||
+		d.HasChange("schedule_hours") ||
+		d.HasChange("schedule_days") ||
+		d.HasChange("schedule_cron") ||
+		d.HasChange("deferring_job_id") ||
+		d.HasChange("self_deferring") ||
 		d.HasChange("timeout_seconds") {
 		job, err := c.GetJob(jobId)
 		if err != nil {
 			return diag.FromErr(err)
 		}
 
+		if d.HasChange("project_id") {
+			projectID := d.Get("project_id").(int)
+			job.Project_Id = projectID
+		}
+		if d.HasChange("environment_id") {
+			envID := d.Get("environment_id").(int)
+			job.Environment_Id = envID
+		}
 		if d.HasChange("name") {
 			name := d.Get("name").(string)
 			job.Name = name
@@ -375,9 +395,12 @@
 		}
 		if len(d.Get("schedule_hours").([]interface{})) > 0 {
 			job.Schedule.Time.Hours = &scheduleHours
+			job.Schedule.Time.Type = "at_exact_hours"
+			job.Schedule.Time.Interval = 0
+		} else {
+			job.Schedule.Time.Hours = nil
+			job.Schedule.Time.Type = "every_hour"
 		}
-		job.Schedule.Time.Type = "at_exact_hours"
-		job.Schedule.Time.Interval = 0
 	}
 	if d.HasChange("schedule_days") {
 		scheduleDays := make([]int, len(d.Get("schedule_days").([]interface{})))
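
The schedule fix (#172) ties the schedule time type to the `schedule_hours` list: a non-empty list forces `at_exact_hours` and zeroes the interval, while an empty list clears the hours and reverts to `every_hour`. Previously the type was always forced to `at_exact_hours`, which broke switching a job back from a list of hours to an interval schedule. Below is a short, self-contained sketch of that decision; `scheduleTime` and `applyScheduleHours` are illustrative stand-ins, not the provider's real job schedule types.

package main

import "fmt"

// Simplified stand-in for the schedule "time" block on a job.
type scheduleTime struct {
	Type     string
	Interval int
	Hours    []int
}

// applyScheduleHours mirrors the fixed update logic: a non-empty hours list
// pins the schedule to exact hours and zeroes the interval; an empty list
// clears the hours and reverts to an every_hour schedule.
func applyScheduleHours(hours []int) scheduleTime {
	if len(hours) > 0 {
		return scheduleTime{Type: "at_exact_hours", Interval: 0, Hours: hours}
	}
	return scheduleTime{Type: "every_hour", Hours: nil}
}

func main() {
	fmt.Printf("%+v\n", applyScheduleHours([]int{9, 17})) // {Type:at_exact_hours Interval:0 Hours:[9 17]}
	fmt.Printf("%+v\n", applyScheduleHours(nil))          // {Type:every_hour Interval:0 Hours:[]}
}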

pkg/resources/job_acceptance_test.go (+15 -8)

@@ -122,10 +122,17 @@ resource "dbtcloud_environment" "test_job_environment" {
   type = "development"
 }
 
+resource "dbtcloud_environment" "test_job_environment_new" {
+  project_id = dbtcloud_project.test_job_project.id
+  name = "DEPL %s"
+  dbt_version = "%s"
+  type = "deployment"
+}
+
 resource "dbtcloud_job" "test_job" {
   name = "%s"
   project_id = dbtcloud_project.test_job_project.id
-  environment_id = dbtcloud_environment.test_job_environment.environment_id
+  environment_id = dbtcloud_environment.test_job_environment_new.environment_id
   dbt_version = "%s"
   execute_steps = [
     "dbt test"
@@ -145,7 +152,7 @@ resource "dbtcloud_job" "test_job" {
   schedule_hours = [9, 17]
   timeout_seconds = 180
 }
-`, projectName, environmentName, DBT_CLOUD_VERSION, jobName, DBT_CLOUD_VERSION)
+`, projectName, environmentName, DBT_CLOUD_VERSION, environmentName, DBT_CLOUD_VERSION, jobName, DBT_CLOUD_VERSION)
 }
 
 func testAccDbtCloudJobResourceDeferringJobConfig(jobName, jobName2, jobName3, projectName, environmentName string, selfDeferring bool) string {
@@ -158,17 +165,17 @@ resource "dbtcloud_project" "test_job_project" {
   name = "%s"
 }
 
-resource "dbtcloud_environment" "test_job_environment" {
+resource "dbtcloud_environment" "test_job_environment_new" {
   project_id = dbtcloud_project.test_job_project.id
-  name = "%s"
+  name = "DEPL %s"
   dbt_version = "%s"
-  type = "development"
+  type = "deployment"
 }
 
 resource "dbtcloud_job" "test_job" {
   name = "%s"
   project_id = dbtcloud_project.test_job_project.id
-  environment_id = dbtcloud_environment.test_job_environment.environment_id
+  environment_id = dbtcloud_environment.test_job_environment_new.environment_id
   dbt_version = "%s"
   execute_steps = [
     "dbt test"
@@ -191,7 +198,7 @@ resource "dbtcloud_job" "test_job" {
 resource "dbtcloud_job" "test_job_2" {
   name = "%s"
   project_id = dbtcloud_project.test_job_project.id
-  environment_id = dbtcloud_environment.test_job_environment.environment_id
+  environment_id = dbtcloud_environment.test_job_environment_new.environment_id
   execute_steps = [
     "dbt test"
   ]
@@ -207,7 +214,7 @@ resource "dbtcloud_job" "test_job_2" {
 resource "dbtcloud_job" "test_job_3" {
   name = "%s"
   project_id = dbtcloud_project.test_job_project.id
-  environment_id = dbtcloud_environment.test_job_environment.environment_id
+  environment_id = dbtcloud_environment.test_job_environment_new.environment_id
   execute_steps = [
     "dbt test"
   ]