-
Line 109 is when it calls PutObjectAsync:

```csharp
public static async Task WriteS3Async(DataSet ds, string destination, string fileName, CollectionConfig cfg)
{
    if (ds.Tables.Contains("DBADash") && ds.Tables["DBADash"]!.Columns.Contains("S3Path"))
    {
        ds.Tables["DBADash"].Rows[0]["S3Path"] = destination;
    }
    DataSetSerialization.SetDateTimeKind(ds); // Required to prevent timezone conversion
    // Parse the bucket and key prefix from the configured destination URL
    var uri = new Amazon.S3.Util.AmazonS3Uri(destination);
    using var s3Cli = await AWSTools.GetAWSClientAsync(cfg.AWSProfile, cfg.AccessKey, cfg.GetSecretKey(), uri);
    var r = new Amazon.S3.Model.PutObjectRequest()
    {
        BucketName = uri.Bucket,
        Key = (uri.Key + "/" + fileName).Replace("//", "/")
    };
    // Serialize the DataSet as XML (with schema) and upload it to the bucket
    using var ms = new MemoryStream();
    ds.WriteXml(ms, XmlWriteMode.WriteSchema);
    r.InputStream = ms;
    await s3Cli.PutObjectAsync(r);
}
```

I'm not sure why you are getting the error. Is the PowerShell script using the same access key/secret? DBA Dash would use the access/secret key in the service config if specified. If you have a credentials file instead, this would need to be created for the service account rather than for your own user. Using permissions assigned to the instance profile avoids the need to store any credentials. Double-check the credentials. Also, the bucket should be specified in this format:
It can also be a good idea to include the region. e.g.
We use the S3 buckets feature within Trimble so I know it works. I suspect it's not that widely used outside of Trimble, but it can be incredibly useful to collect data from instances that have no direct connectivity to your repository database. Hope this helps. If you get it working, let me know what the issue was.
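Since the collector parses the destination with Amazon.S3.Util.AmazonS3Uri (as in the method above), a quick way to sanity-check a destination string outside the service is a small standalone sketch like the one below. This is not DBA Dash code, and the bucket, region and folder in the URL are placeholders; the point is just to confirm the string parses to the bucket and key prefix you expect.

```csharp
// Standalone sketch (not DBA Dash code): verify a destination string parses as an S3 URI
// before putting it in the service configuration. The URL below is a placeholder.
using System;
using Amazon.S3.Util;

class CheckDestination
{
    static void Main()
    {
        var destination = "https://my-bucket.s3.eu-west-2.amazonaws.com/DBADash"; // placeholder

        if (AmazonS3Uri.TryParseAmazonS3Uri(destination, out var s3Uri))
        {
            Console.WriteLine($"Bucket: {s3Uri.Bucket}, Key prefix: {s3Uri.Key}, Region: {s3Uri.Region}");
        }
        else
        {
            Console.WriteLine("Not a valid S3 URI - check the bucket format.");
        }
    }
}
```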
-
Hi David,
I am new to AWS. It was the bucket format; I had it wrong:
https://bucket_name.s3.eu-west-2.amazonaws.com/DBADash
It is now uploading to the bucket.
I am setting it up for the reason you are describing: we need to collect
data from remote sites and have some insight into how they are performing
and configured.
Now I want to import it, but this should be easy now.
Thanks for your help and quick response,
Wesley
-
Hi David,
I am testing/exploring your tool, thank you for your effort.
I am trying to set up a secondary destination in S3.
I created a general-purpose bucket, created a user (with an access key/secret), and added permissions.
I tested it with PowerShell; I can connect and upload files.
Then I added the bucket and AWS credentials to the configuration.

I started the service and data is being collected and inserted into the database, but uploading to S3 is producing an error:
Any suggestions?
Wesley
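A minimal standalone upload with the AWS SDK for .NET, using the same access key/secret the service is configured with, can help isolate whether a failure like this is down to the credentials or to the destination format. This is only a sketch, not part of DBA Dash; the bucket name, region, key and environment variable names below are placeholders.

```csharp
// Standalone sketch (not DBA Dash code): upload a small test object with the same
// access key/secret the DBA Dash service is configured to use. If this succeeds but
// the service still fails, the problem is more likely the destination format.
// Bucket name, region, key and environment variable names are placeholders.
using System;
using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

class S3UploadTest
{
    static async Task Main()
    {
        var accessKey = Environment.GetEnvironmentVariable("TEST_AWS_ACCESS_KEY");
        var secretKey = Environment.GetEnvironmentVariable("TEST_AWS_SECRET_KEY");

        using var client = new AmazonS3Client(accessKey, secretKey, RegionEndpoint.EUWest2);

        var request = new PutObjectRequest
        {
            BucketName = "my-bucket",                  // placeholder
            Key = "DBADash/connectivity-test.txt",     // placeholder
            ContentBody = "DBA Dash S3 connectivity test"
        };

        var response = await client.PutObjectAsync(request);
        Console.WriteLine($"PutObject HTTP status: {response.HttpStatusCode}");
    }
}
```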