How to access a file in an S3 bucket from a Lambda function - C#

I have a file in my S3 bucket and I want to access it from a Lambda function.
When I pass the path of this file to one of the methods, I get the error:
Could not find a part of the path '/var/task/https:/s3.amazonaws.com/TestBucket/testuser/AWS_sFTP_Key.pem'.
For example:
TestMethod("https://s3.amazonaws.com/TestBucket/testuser/AWS_sFTP_Key.pem")
code:
public void FunctionHandler(S3Event s3Event, ILambdaContext lambdaContext)
{
    ConnectionInfo connectionInfo = new ConnectionInfo("xxx.xxx.xx.xxx", "testuser",
        new AuthenticationMethod[] {
            new PrivateKeyAuthenticationMethod("testuser", new PrivateKeyFile[] {
                new PrivateKeyFile("https://s3.amazonaws.com/TestBucket/testuser/AWS_sFTP_Key.pem")
            })
        });
    SftpClient sftpClient = new SftpClient(connectionInfo);
    sftpClient.Connect();
    lambdaContext.Logger.Log(sftpClient.WorkingDirectory);
    sftpClient.Disconnect();
}

You can use the AWS SDK to read the file from S3 as shown below; however, I would suggest using AWS Certificate Manager or IAM for storing and managing your certificates and keys.
PS: Make sure you assign the proper role to your Lambda function, or a bucket policy on your bucket, so that it is allowed to GetObject from S3:
RegionEndpoint bucketRegion = RegionEndpoint.USWest2; // region where you store your file
var client = new AmazonS3Client(bucketRegion);
GetObjectRequest request = new GetObjectRequest
{
    BucketName = BUCKET_NAME, // TestBucket
    Key = S3_KEY              // testuser/AWS_sFTP_Key.pem
};
using (GetObjectResponse response = client.GetObject(request))
using (var reader = new StreamReader(response.ResponseStream))
{
    string content = reader.ReadToEnd();
}
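Alternatively, if the goal is to feed the key straight into SSH.NET without writing it to the Lambda filesystem, here is a minimal sketch; it assumes SSH.NET's PrivateKeyFile(Stream) constructor overload, and the host and user values are the question's placeholders:
using (GetObjectResponse response = client.GetObject(request))
{
    // Hand the S3 object stream directly to SSH.NET instead of a local path.
    var keyFile = new PrivateKeyFile(response.ResponseStream);
    var connectionInfo = new ConnectionInfo("xxx.xxx.xx.xxx", "testuser",
        new PrivateKeyAuthenticationMethod("testuser", keyFile));
    using (var sftpClient = new SftpClient(connectionInfo))
    {
        sftpClient.Connect();
        // ... use sftpClient ...
        sftpClient.Disconnect();
    }
}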
More Help:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_server-certs.html
https://docs.aws.amazon.com/acm/latest/userguide/import-certificate.html
https://docs.aws.amazon.com/AmazonS3/latest/dev/RetrievingObjectUsingNetSDK.html

AccountName property of BlobClient in .net 6 returns empty string

I'm using the following code to create an Azure blob client in C# using credentials.
When I try to retrieve the account name from the BlobClient object, it returns an empty string, whereas if I use the connection string overload it returns the account name.
Any advice?
It is because of the dfs endpoint. If you use the blob endpoint instead, it will work:
var uri = new Uri("https://mystorageaccountname.blob.core.windows.net/");
var builder = new BlobUriBuilder(uri, true);
Console.WriteLine(builder.AccountName); // mystorageaccountname
var uri2 = new Uri("https://mystorageaccountname.dfs.core.windows.net/");
var builder2 = new BlobUriBuilder(uri2, true);
Console.WriteLine(builder2.AccountName); // null
If you think that is a bug, you can create a bug report on GitHub.
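If you do need to work against the dfs endpoint, one hedged alternative is DataLakeUriBuilder from the Azure.Storage.Files.DataLake package, which understands the Data Lake host format; a minimal sketch:
// DataLakeUriBuilder parses dfs endpoints the way BlobUriBuilder parses blob ones.
var dfsUri = new Uri("https://mystorageaccountname.dfs.core.windows.net/");
var dfsBuilder = new DataLakeUriBuilder(dfsUri);
Console.WriteLine(dfsBuilder.AccountName); // mystorageaccountname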

Read Parquet file from Azure blob with out downloading it locally c# .net

We have a Parquet file (500 MB) located in an Azure blob. How can I read the file directly from the blob into memory in C#, e.g. into a DataTable?
I am able to read a Parquet file that is physically located in a local folder using the code below.
public void ReadParquetFile()
{
    using (Stream fileStream = System.IO.File.OpenRead("D:/../userdata1.parquet"))
    {
        using (var parquetReader = new ParquetReader(fileStream))
        {
            DataField[] dataFields = parquetReader.Schema.GetDataFields();
            for (int i = 0; i < parquetReader.RowGroupCount; i++)
            {
                using (ParquetRowGroupReader groupReader = parquetReader.OpenRowGroupReader(i))
                {
                    DataColumn[] columns = dataFields.Select(groupReader.ReadColumn).ToArray();
                    DataColumn firstColumn = columns[0];
                    Array data = firstColumn.Data;
                    //int[] ids = (int[])data;
                }
            }
        }
    }
}
(I am able to read a CSV file directly from the blob using a source stream.) Please suggest the fastest method to read the Parquet file directly from the blob.
In my experience, the way to read the Parquet file directly from the blob is to first generate the blob URL with a SAS token, then get a stream from HttpClient for that URL, and finally read the HTTP response stream via ParquetReader.
First, please refer to the sample code below from the section Create a service SAS for a blob of the official document Create a service SAS for a container or blob with .NET, using the Azure Blob Storage SDK for .NET Core.
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
    string sasBlobToken;
    // Get a reference to a blob within the container.
    // Note that the blob may not exist yet, but a SAS can still be created for it.
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
    if (policyName == null)
    {
        // Create a new access policy and define its constraints.
        // Note that the SharedAccessBlobPolicy class is used both to define the parameters of an ad hoc SAS, and
        // to construct a shared access policy that is saved to the container's shared access policies.
        SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
        {
            // When the start time for the SAS is omitted, the start time is assumed to be the time when the storage service receives the request.
            // Omitting the start time for a SAS that is effective immediately helps to avoid clock skew.
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
        };
        // Generate the shared access signature on the blob, setting the constraints directly on the signature.
        sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
        Console.WriteLine("SAS for blob (ad hoc): {0}", sasBlobToken);
        Console.WriteLine();
    }
    else
    {
        // Generate the shared access signature on the blob. In this case, all of the constraints for the
        // shared access signature are specified on the container's stored access policy.
        sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
        Console.WriteLine("SAS for blob (stored access policy): {0}", sasBlobToken);
        Console.WriteLine();
    }
    // Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
Then get the HTTP response stream from HttpClient using the URL with the SAS token:
var blobUrlWithSAS = GetBlobSasUri(container, blobName);
var client = new HttpClient();
var stream = await client.GetStreamAsync(blobUrlWithSAS);
Finally, read it via ParquetReader; the code comes from the Reading Data section of the GitHub repo aloneguid/parquet-dotnet:
var options = new ParquetOptions { TreatByteArrayAsString = true };
var reader = new ParquetReader(stream, options);
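One caveat worth hedging: Parquet readers generally need a seekable stream, because the file metadata lives in the footer, and the stream returned by GetStreamAsync is not seekable. If ParquetReader complains, buffering the response into a MemoryStream first should work:
// Buffer the HTTP response into a seekable MemoryStream before parsing,
// since Parquet metadata is stored at the end of the file.
using (var httpStream = await client.GetStreamAsync(blobUrlWithSAS))
using (var ms = new MemoryStream())
{
    await httpStream.CopyToAsync(ms);
    ms.Position = 0;
    using (var parquetReader = new ParquetReader(ms, options))
    {
        DataField[] dataFields = parquetReader.Schema.GetDataFields();
        // ... read row groups as in the local-file example above ...
    }
}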

AWS SDK for .NET. The authorization mechanism you have provided is not supported

I received the following error while sending a ListObjectsRequest:
The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256
According to this answer, AmazonS3Config was updated in the following way:
var amazonS3Config = new AmazonS3Config
{
    SignatureVersion = "4",
    ServiceURL = bucketName,
    RegionEndpoint = RegionEndpoint.USEast1,
    SignatureMethod = SigningAlgorithm.HmacSHA256
};
var s3Client = new AmazonS3Client(accessKeyID, secretKey, amazonS3Config);
But I still receive this error. What have I missed here?
Thanks.
Try using the latest version of the Amazon S3 SDK. I think ServiceURL is not necessary when you know the RegionEndpoint; I have used it with a private-cloud S3, and when I do not know the region endpoint I can retrieve the information from Amazon using the following code:
var amazonS3Config = new AmazonS3Config();
// The region for Frankfurt is RegionEndpoint.EUCentral1,
// according to the Amazon S3 docs: http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
amazonS3Config.RegionEndpoint = RegionEndpoint.USEast1;
var s3Client = new AmazonS3Client("your access key", "your secret key", amazonS3Config);
S3DirectoryInfo dir = new S3DirectoryInfo(s3Client, "your bucket name", "your folder path without bucket name");
Console.WriteLine(dir.GetFiles().Count());
Using the following, I am able to work in the eu-west-2 region:
AmazonS3Config config = new AmazonS3Config();
config.SignatureVersion = "4";
config.RegionEndpoint = Amazon.RegionEndpoint.GetBySystemName("eu-west-2");
config.SignatureMethod = Amazon.Runtime.SigningAlgorithm.HmacSHA256;
The region should match the region where your bucket was created.
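For what it's worth, recent versions of AWSSDK.S3 default to Signature Version 4 for regions that support it, so a hedged simplification is to drop the signature settings entirely and just construct the client with the bucket's region:
// With a current AWSSDK.S3, SigV4 is the default; the region is usually
// all that needs to be specified.
var s3Client = new AmazonS3Client("your access key", "your secret key", RegionEndpoint.EUWest2);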

AWS S3 .Net copy object that key contains dots at end

I use the AWSSDK for .NET, and my code for copying a file is:
CopyObjectRequest request = new CopyObjectRequest()
{
    SourceBucket = _bucketName,
    SourceKey = sourceObjectKey,
    DestinationBucket = _bucketName,
    DestinationKey = targetObjectKey
};
CopyObjectResponse response = amazonS3Client.CopyObject(request);
The code works perfectly for normal files, but when I tried to copy a file with a name like 'mage...' I got the following error message:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
Is there any way to copy objects with that type of file name?
I used the following C# code to copy files between S3 folders.
AmazonS3Config cfg = new AmazonS3Config();
cfg.RegionEndpoint = Amazon.RegionEndpoint.EUCentral1;//my bucket has this Region
string bucketName = "your bucket";
AmazonS3Client s3Client = new AmazonS3Client("your access key", "your secret key", cfg);
S3FileInfo sourceFile = new S3FileInfo(s3Client, bucketName, "FolderNameUniTest179/Test.test.test.pdf");
S3DirectoryInfo targetDir = new S3DirectoryInfo(s3Client, bucketName, "Test");
sourceFile.CopyTo(targetDir);
S3FileInfo sourceFile2 = new S3FileInfo(s3Client, bucketName, "FolderNameUniTest179/Test...pdf");
sourceFile2.CopyTo(targetDir);
I am using AWSSDK.Core and AWSSDK.S3 version 3.1.0.0 for .NET 3.5. I hope this helps.
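As background (my hedged understanding, not part of the original answer): the signature mismatch for keys ending in dots is commonly attributed to System.Uri on older .NET Framework versions trimming trailing dots from path segments, so the path the SDK signs differs from the one S3 receives. A small illustration:
// On older .NET Framework versions, System.Uri trims trailing dots
// from path segments, which breaks the request signature for such keys.
var uri = new Uri("https://mybucket.s3.amazonaws.com/folder/mage...");
Console.WriteLine(uri.AbsolutePath); // may print /folder/mage instead of /folder/mage...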

AWS .NET PutObjectRequest using wrong host

I am using the .NET library for Amazon Web Services for an application that uploads images to an Amazon S3 bucket. It is used in an internal service of an ASP.NET 4.5 application. The NuGet package name is AWSSDK and its version is the latest (as of writing) stable: 2.3.54.2
When I attempt to use the PutObject method on the PutObjectRequest object (to upload the image blob), it throws an exception and complains that the hostname is wrong.
var accessKey = Config.GetValue("AWSAccessKey");
var secretKey = Config.GetValue("AWSSecretKey");
using (var client = new AmazonS3Client(accessKey, secretKey, config))
{
    var request = new PutObjectRequest();
    request.BucketName = Config.GetValue("PublicBucket");
    request.Key = newFileName;
    request.InputStream = resizedImage;
    request.AutoCloseStream = false;
    using (var uploadTaskResult = client.PutObject(request))
    {
        using (var uploadStream = uploadTaskResult.ResponseStream)
        {
            uploadStream.Seek(0, SeekOrigin.Begin);
            var resultStr = new StreamReader(uploadStream).ReadToEnd();
        }
    }
}
The exception details are as follows:
Fatal unhandled exception in Web API component: System.Net.WebException: The remote name could not be resolved: 'images.ourcompany.com.http'
at System.Net.HttpWebRequest.GetRequestStream(TransportContext& context)
at System.Net.HttpWebRequest.GetRequestStream()
at Amazon.S3.AmazonS3Client.getRequestStreamCallback[T](IAsyncResult result)
at Amazon.S3.AmazonS3Client.endOperation[T](IAsyncResult result)
at Amazon.S3.AmazonS3Client.EndPutObject(IAsyncResult asyncResult)
at Tracks.Application.Services.Bp.BpTemplateService.UploadImage(Byte[] image, String fileName) in ...
I have tried to debug this in VS by stepping through the code but AWSSDK doesn't come with debug symbols. It should be noted that the remote host name (or bucket name as I think Amazon calls them) is images.ourcompany.com (not our real company's name!). I have checked the value of Config.GetValue("PublicBucket") and it is indeed images.ourcompany.com. At this stage I have exhausted my limited knowledge about Amazon S3 and have no theories about what causes the exception to be thrown.
I think you have to add a region endpoint and/or set ServiceURL to establish the connection to Amazon S3; check the similar questions below:
Copying folder inside AmazonS3 Bucket (c#)
Upload images on Amazon S3 - source code:
AmazonS3Config cfg = new AmazonS3Config();
cfg.RegionEndpoint = Amazon.RegionEndpoint.SAEast1; // your region endpoint
string bucketName = "yourBucketName";
AmazonS3Client s3Client = new AmazonS3Client("your access key", "your secret key", cfg);
PutObjectRequest request = new PutObjectRequest()
{
    BucketName = bucketName,
    InputStream = stream,
    Key = fullName
};
s3Client.PutObject(request);
or
AmazonS3Config asConfig = new AmazonS3Config()
{
    ServiceURL = "http://irisdb.s3-ap-southeast-2.amazonaws.com/",
    RegionEndpoint = Amazon.RegionEndpoint.APSoutheast2
};
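The exception in the question ('images.ourcompany.com.http') suggests a full URL or bucket name ended up in the client config. As a hedged sketch (placeholder names taken from the question's code), ServiceURL/RegionEndpoint should identify only the S3 endpoint, with the bucket passed separately on each request:
// The config identifies the endpoint; the bucket goes on the request.
var config = new AmazonS3Config { RegionEndpoint = Amazon.RegionEndpoint.USEast1 };
using (var client = new AmazonS3Client(accessKey, secretKey, config))
{
    var request = new PutObjectRequest
    {
        BucketName = "images.ourcompany.com", // bucket name only, never a URL
        Key = newFileName,
        InputStream = resizedImage
    };
    client.PutObject(request);
}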
