I have a bucket on Amazon S3 and I have created an IAM user. Now I want to download a file from the private bucket using temporary credentials.
This is my bucket policy:
{
  "Id": "Policy1509026195925",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1509026179419",
      "Action": [
        "s3:GetObject"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::test-folder/*",
      "Principal": {
        "AWS": [
          "arn:aws:iam::461567291450:user/john"
        ]
      }
    }
  ]
}
This is my C# .NET code:
ServicePointManager.Expect100Continue = false;
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
try
{
    // In real applications, the following code is part of your trusted code. It has
    // the security credentials you use to obtain temporary security credentials.
    AmazonSecurityTokenServiceConfig config = new AmazonSecurityTokenServiceConfig();
    AmazonSecurityTokenServiceClient stsClient =
        new AmazonSecurityTokenServiceClient(config);

    GetFederationTokenRequest federationTokenRequest =
        new GetFederationTokenRequest();
    federationTokenRequest.Name = "testuser";
    // federationTokenRequest.Policy = "Policy1509026195925";
    federationTokenRequest.DurationSeconds = 7200;

    GetFederationTokenResponse federationTokenResponse =
        stsClient.GetFederationToken(federationTokenRequest);
    Credentials credentials = federationTokenResponse.Credentials;

    SessionAWSCredentials sessionCredentials =
        new SessionAWSCredentials(credentials.AccessKeyId,
                                  credentials.SecretAccessKey,
                                  credentials.SessionToken);

    // The following will be part of your less trusted code. You provide temporary security
    // credentials so it can send authenticated requests to Amazon S3.
    // Create an Amazon S3 client by passing in the session credentials object.
    AmazonS3Client s3Client = new AmazonS3Client(sessionCredentials, Amazon.RegionEndpoint.USEast1);

    // Test. For example, list object keys in the bucket.
    ListObjectsRequest listObjectRequest = new ListObjectsRequest();
    listObjectRequest.BucketName = bucketName;
    ListObjectsResponse response = s3Client.ListObjects(listObjectRequest);
}
catch (Exception ex)
{
    Console.WriteLine(ex.Message);
}
Every time I run the code, I get an Access Denied message. Why? And how do I download the bucket file using temporary credentials?
You can try something like:
var stsClient = new AmazonSecurityTokenServiceClient();
var assumeRoleRequest = new AssumeRoleRequest
{
    RoleArn = roleArn, // the ARN of a role that allows s3:GetObject on the bucket
    RoleSessionName = "testuser"
};
var assumeRoleResult = stsClient.AssumeRole(assumeRoleRequest).Credentials;
var tempCredentials = new SessionAWSCredentials(
    assumeRoleResult.AccessKeyId,
    assumeRoleResult.SecretAccessKey,
    assumeRoleResult.SessionToken);
var s3Client = new AmazonS3Client(tempCredentials, Amazon.RegionEndpoint.USEast1);
You need to call AssumeRole to get temporary security credentials, and then use those credentials to make the call to Amazon S3; see Switching to an IAM Role (API).
Refer: Using Temporary Security Credentials with the AWS SDKs
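To download the file itself rather than just listing keys, here is a minimal sketch using those temporary credentials; the object key and local path are placeholders, and the bucket name is taken from the policy in the question:
// Hedged sketch: download one object with the temporary session credentials.
var getRequest = new GetObjectRequest
{
    BucketName = "test-folder", // bucket from the policy above
    Key = "file.txt"            // placeholder key
};
using (GetObjectResponse getResponse = s3Client.GetObject(getRequest))
{
    getResponse.WriteResponseStreamToFile(@"C:\temp\file.txt"); // placeholder local path
}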
I followed this article, but I modified the sample code a bit to use GetContextAsync instead of GetAzureADAppOnlyAuthenticatedContext, and I get the error "The remote server returned an error: (401) Unauthorized." every single time.
AuthenticationManager authManager = new AuthenticationManager(clientId, certPath, certPassword, tenantId);
using (ClientContext cc = await authManager.GetContextAsync(_siteUrl))
{
    cc.Load(cc.Web, p => p.Title);
    await cc.ExecuteQueryAsync();
    Console.WriteLine(cc.Web.Title);
}
The error is thrown at await cc.ExecuteQueryAsync();.
I have uploaded the self-signed certificate to the Azure portal
and granted the permission.
My app is a WinForms app using .NET Framework 4.7
and PnP.Framework 1.11.
P.S.: What do I enter for the tenantId param? At the moment I'm using the Directory (tenant) ID from the Overview page.
I have tried to reproduce the same in my environment.
Created a self-signed certificate.
It was then uploaded to my app registration.
Checked the manifest to confirm:
"keyCredentials": [
{
"customKeyIdentifier": "xxx",
"endDate": "2023-01-31T00:00:00Z",
"keyId": "xxx",
"startDate": "2023-01-23T00:00:00Z",
"type": "AsymmetricX509Cert",
"usage": "Verify",
"value": "xxx",
"displayName": "my new ssc"
}
Tested with Postman.
Code given:
using System;
using System.Threading.Tasks;
using PnP.Framework;

namespace spoapprepo
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var clientId = "xxx";
            var certPath = "C:\\xxx\\selfsigned.pfx";
            var certPassword = "xxx";
            var tenantId = "xxx";
            var siteUrl = "https://contoso.sharepoint.com";

            AuthenticationManager authManager = new AuthenticationManager(clientId, certPath, certPassword, tenantId);
            try
            {
                using (var cc = await authManager.GetContextAsync(siteUrl))
                {
                    cc.Load(cc.Web, p => p.Title);
                    await cc.ExecuteQueryAsync();
                    Console.WriteLine(cc.Web.Title);
                }
                Console.WriteLine("Hello World!");
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
}
Use if-else and try-catch blocks to catch the exact error.
GetContext uses the current user's credentials, but here we intend to use an app-only context.
A 401 Unauthorized usually occurs when the logged-in user lacks privileges to access SharePoint or when the wrong login details are entered.
If the user profile needs to be read, the app needs the User.Read permission.
But note the limitation described in Accessing SharePoint using an application context, also known as app-only | Microsoft Learn:
User Profile CSOM write operations do not work with an Azure AD application; only read operations work.
For write operations you need the user to log in, or use a SharePoint App-Only principal.
Reference: azure active directory - SharePoint PnP AuthenticationManager login with current user - Stack Overflow
Uploading some txt files from a local folder to a specific FTP address (I'm using this one: ftp://ftpint/sales/to_system/) is one of my daily routines. I've been using ZappySys to automate it, but my company doesn't want to use that anymore, so I think WinSCP could be a good option.
I've installed WinSCP 5.19 and the .NET assembly and followed the instructions from this link: https://winscp.net/eng/docs/library_ssis. But I think WinSCP can't recognize my FTP link. Here's my C# code, any suggestions? Thank you.
using System;
using WinSCP;

class Example
{
    public static int Main()
    {
        try
        {
            // Set up session options
            SessionOptions sessionOptions = new SessionOptions
            {
                Protocol = Protocol.Sftp,
                HostName = "xxx",
                UserName = "xxx",
                Password = "xxx",
                SshHostKeyFingerprint = "SHA-256 xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx"
            };

            using (Session session = new Session())
            {
                // Connect
                session.Open(sessionOptions);

                // Upload files
                TransferOptions transferOptions = new TransferOptions();
                transferOptions.TransferMode = TransferMode.Binary;

                TransferOperationResult transferResult =
                    session.PutFiles(@"C:\Users\Diomedas\test\*", "ftp://ftpint/sales/to_system/", false, transferOptions);

                // Throw on any error
                transferResult.Check();

                // Print results
                foreach (TransferEventArgs transfer in transferResult.Transfers)
                {
                    Console.WriteLine("Upload of {0} succeeded", transfer.FileName);
                }
            }
            return 0;
        }
        catch (Exception e)
        {
            Console.WriteLine("Error: {0}", e);
            return 1;
        }
    }
}
The remotePath argument of Session.PutFiles is a remote path, not a URL. So it should be:
session.PutFiles(@"C:\Users\Diomedas\test\*", "/sales/to_system/", false, transferOptions);
You have already specified the hostname in SessionOptions.HostName; there is no point repeating that information.
Your protocols also do not match: you specified Protocol.Sftp in SessionOptions.Protocol, while your URL has ftp://. Make sure you know the actual protocol of your server.
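If the server really is plain FTP (as the ftp:// URL suggests), the session options would look roughly like this; a sketch only, with placeholder credentials, and no SshHostKeyFingerprint since that applies to SFTP:
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp, // matches the ftp:// URL
    HostName = "ftpint",     // host from the URL
    UserName = "xxx",        // placeholder
    Password = "xxx"         // placeholder
};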
WinSCP GUI can generate full working code (including the upload part) for you.
I am trying to write an AWS Lambda function that runs FFmpeg. I have done this before at another workplace, so I know that it is possible.
The log shows that I create a temporary URL to read the file in, it processes with FFmpeg, I pipe the output to a byte[] which has data in it, and then I try to do an S3 PutObjectRequest, which always fails with this message:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
The S3 client uses the same credentials that I use in automation all the time to upload files to S3 from different servers at different locations of our company. I have tried a couple of different IAM identities to no effect.
I am not trying to do any sort of signature whatsoever. I am simply doing this:
var putRequest = new PutObjectRequest
{
    BucketName = m.Groups["bucket"].Value,
    Key = m.Groups["key"].Value,
    InputStream = new MemoryStream(Encoding.UTF8.GetBytes(data ?? "")),
    CannedACL = S3CannedACL.PublicRead,
    ContentType = MimeTypes.GetMimeType(Path.GetExtension(m.Groups["key"].Value)),
    DisablePayloadSigning = true,
};
putRequest.Headers.ContentLength = data.Length;

_context.Logger.LogLine($"Saving file to bucket '{putRequest.BucketName}' and key '{putRequest.Key}' and content type '{putRequest.ContentType}' and content length {putRequest.Headers.ContentLength}");

try
{
    await _s3Client.PutObjectAsync(putRequest);
}
catch (AmazonS3Exception s3x)
{
    _context.Logger.LogLine($"S3 Exception: {s3x.Message}");
}
I have checked the bucket and key and they are correct. data.Length is greater than 0. The content type is "audio/mpeg", which is correct for .mp3. The data is there to be written.
My Lambda runs under AWSLambda_Full_Access with the following additional rights granted:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:ListStorageLensConfigurations",
        "s3:ListAccessPointsForObjectLambda",
        "s3:GetAccessPoint",
        "s3:PutAccountPublicAccessBlock",
        "s3:GetAccountPublicAccessBlock",
        "s3:ListAllMyBuckets",
        "s3:*",
        "s3:ListAccessPoints",
        "s3:ListJobs",
        "s3:PutStorageLensConfiguration",
        "s3:ListMultiRegionAccessPoints",
        "s3:CreateJob"
      ],
      "Resource": "*"
    }
  ]
}
Does anyone have any ideas what else I could be missing? I have been stuck on this one problem for over 3 days now and have tried everything I can think of, so it must be something I'm not thinking of.
Thanks.
Error: Unexpected character encountered while parsing value: e. Path '', line 0, position 0.
I am using the Google .NET client library to access the Google Drive API v3, specifically the Google.Apis.Drive.v3 package. I am authorizing using a "Service Account" with C#.
Authorization with the p12 key is no problem. However, JSON is recommended, and the p12 format is maintained only for backward compatibility.
I downloaded the JSON file from the Google Developers Console and tried to perform the authorization with the following code:
public static Google.Apis.Drive.v3.DriveService AuthenticateServiceAccountJSON(string keyFilePath) {
    // Check that the key file exists
    if (!File.Exists(keyFilePath)) {
        Console.WriteLine("An error occurred - key file does not exist");
        return null;
    }

    string[] scopes = new string[] {
        DriveService.Scope.Drive,                 // view and manage your files and documents
        DriveService.Scope.DriveAppdata,          // view and manage its own configuration data
        DriveService.Scope.DriveFile,             // view and manage files created by this app
        DriveService.Scope.DriveMetadataReadonly, // view metadata for files
        DriveService.Scope.DriveReadonly,         // view files and documents on your drive
        DriveService.Scope.DriveScripts           // modify your app scripts
    };

    try {
        using (var stream = new FileStream(keyFilePath, FileMode.Open, FileAccess.Read)) {
            var credential = GoogleCredential.FromStream(stream);
            if (credential.IsCreateScopedRequired) {
                // CreateScoped returns a new credential, so the result must be assigned
                credential = credential.CreateScoped(scopes);
            }

            // Create the service.
            Google.Apis.Drive.v3.DriveService service = new Google.Apis.Drive.v3.DriveService(new BaseClientService.Initializer() {
                HttpClientInitializer = credential,
                ApplicationName = "MyDrive",
            });
            return service;
        }
    } catch (Exception ex) {
        Console.WriteLine(ex.InnerException);
        return null;
    }
}
I have looked at the JSON file in Notepad and it seems to be encrypted:
"ewogICJ0eXBlIjogInNlcnZpY2VfYWNjb3VudCIsCiAgInByb2plY3RfaWQiOiAicmFkaWFudC1tZXJjdXJ5LTEyMjkwNyIsCiAgIn.........."
Is it OK to continue using the p12?
This works for me using the JSON credentials file from the Google Developers Console. I am using the Analytics service, but just swap in the appropriate names for the Drive service:
private AnalyticsReportingService service;

public async Task GetAuthorizationByServiceAccount()
{
    string[] scopes = new string[] { AnalyticsReportingService.Scope.AnalyticsReadonly }; // Put your scopes here
    var keyFilePath = AppContext.BaseDirectory + @"KeyFile.json";
    //Console.WriteLine("Key File: " + keyFilePath);

    var stream = new FileStream(keyFilePath, FileMode.Open, FileAccess.Read);
    var credential = GoogleCredential.FromStream(stream);
    credential = credential.CreateScoped(scopes);

    service = new AnalyticsReportingService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = credential,
        ApplicationName = "<Your App Name here>",
    });
}
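Applied to the Drive service from the question, the same pattern would look roughly like this (a sketch; keyFilePath is the same path used in the question's code and the application name is a placeholder):
string[] scopes = new string[] { DriveService.Scope.Drive };
using (var stream = new FileStream(keyFilePath, FileMode.Open, FileAccess.Read))
{
    // CreateScoped returns a new credential instance, so assign the result
    var credential = GoogleCredential.FromStream(stream).CreateScoped(scopes);
    var service = new Google.Apis.Drive.v3.DriveService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = credential,
        ApplicationName = "MyDrive", // placeholder
    });
}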
Make sure that you are downloading the proper file.
GoogleCredential.FromStream(stream)
works with a JSON file. It should look something like this:
{
  "type": "service_account",
  "project_id": "",
  "private_key_id": "",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "",
  "client_id": "",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": ""
}
You can get this file at https://console.developers.google.com/apis/credentials by clicking the Download JSON button on the right side of the grid showing client IDs. Just make sure that the Type for the selected ID is "Service account client".
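Incidentally, the file contents shown in the question are not encrypted: a string starting with "ewog" is Base64 (it decodes to "{" followed by a newline), so that file is a Base64-encoded copy of exactly this JSON. A quick check with standard .NET APIs, as a sketch:
// Sketch: decode the Base64 file contents to confirm they are the service-account JSON.
// keyFilePath is the same path used in the question's code.
var raw = System.IO.File.ReadAllText(keyFilePath);
var json = System.Text.Encoding.UTF8.GetString(Convert.FromBase64String(raw));
Console.WriteLine(json); // should start with { "type": "service_account", ...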
Short version
Logged in as a Facebook user, I use my OAuth token to assume an IAM role on AWS. It returns what look to be valid credentials, e.g. an AccessKeyId and SecretAccessKey of similar length to our master keys.
When I try to use these credentials to access a DynamoDB table, I get one of two exceptions:
"The remote server returned an error: (400) Bad Request."; or
"The security token included in the request is invalid.".
I'm using the AWS C# SDK version 1.5.25.0
Long version
As I said above, I'm trying to access a DynamoDB table on AWS using credentials supplied by AmazonSecurityTokenServiceClient, authorized by Facebook identity as described in this AWS guide.
The policy for the IAM role that I've created is:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "dynamodb:BatchGetItem",
        "dynamodb:PutItem",
        "dynamodb:Query",
        "dynamodb:Scan",
        "dynamodb:UpdateItem"
      ],
      "Sid": "Stmt1372350145000",
      "Resource": [
        "*"
      ],
      "Effect": "Allow"
    }
  ]
}
How I get the credentials:
The user logs in to Facebook using OAuth.
Using the access token, I assume the IAM role via AmazonSecurityTokenServiceClient.AssumeRoleWithWebIdentity with a request.
This returns what look to be valid credentials, e.g. an AccessKeyId and SecretAccessKey of similar length to our master keys.
using (var tokenServiceClient = new AmazonSecurityTokenServiceClient(RegionEndpoint.USEast1))
{
    var request = new AssumeRoleWithWebIdentityRequest
    {
        DurationSeconds = (int)TimeSpan.FromHours(1).TotalSeconds,
        ProviderId = "graph.facebook.com",
        RoleArn = "arn:aws:iam::193557284355:role/Facebook-OAuth",
        RoleSessionName = result.id,
        WebIdentityToken = FBClient.AccessToken
    };

    var response = tokenServiceClient.AssumeRoleWithWebIdentity(request);
    AWSAssumedRoleUser = response.AssumeRoleWithWebIdentityResult.AssumedRoleUser;
    AWSCredentials = response.AssumeRoleWithWebIdentityResult.Credentials;
}
How I use these credentials:
Using the returned credentials, I then try to access an AWS DynamoDB resource.
using (var client = new AmazonDynamoDBClient(AWSCredentials.AccessKeyId, AWSCredentials.SecretAccessKey, AWSCredentials.SessionToken, RegionEndpoint.USEast1))
{
    var context = new DynamoDBContext(client);
    var data = context.Scan<SomeData>();
}
This returns "The remote server returned an error: (400) Bad Request." when trying to Scan the table.
This is where the variation in the exception message comes in; if I omit AWSCredentials.SessionToken from the above AmazonDynamoDBClient:
using (var client = new AmazonDynamoDBClient(AWSCredentials.AccessKeyId, AWSCredentials.SecretAccessKey, RegionEndpoint.USEast1))
{
    var context = new DynamoDBContext(client);
    var data = context.Scan<SomeData>();
}
This returns "The security token included in the request is invalid." when trying to Scan the table.
Question
At this point I cannot tell what is wrong: are the credentials invalid, or am I not passing through everything that AWS needs?
Can anyone offer any insight to what is wrong or how I could debug this further?
I cross-posted my question to the AWS forums and received an answer from an Amazon engineer:
https://forums.aws.amazon.com/message.jspa?messageID=465057
The DynamoDBContext object invokes DescribeTable on the target table (and caches this data, so for optimal performance you should keep the context object around for as long as possible; the call is then made only once per target table). Modify your policy as follows:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "dynamodb:BatchGetItem",
        "dynamodb:PutItem",
        "dynamodb:Query",
        "dynamodb:Scan",
        "dynamodb:UpdateItem",
        "dynamodb:DescribeTable"
      ],
      "Sid": "Stmt1372350145000",
      "Resource": [
        "*"
      ],
      "Effect": "Allow"
    }
  ]
}
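A minimal sketch of the caching advice above, reusing the client construction from the question: keep one DynamoDBContext instance around so DescribeTable is invoked only once per table.
// Sketch: create the client and context once and reuse them across calls.
var client = new AmazonDynamoDBClient(AWSCredentials.AccessKeyId, AWSCredentials.SecretAccessKey, AWSCredentials.SessionToken, RegionEndpoint.USEast1);
var context = new DynamoDBContext(client); // keep this instance around

var first = context.Scan<SomeData>();  // triggers DescribeTable once
var second = context.Scan<SomeData>(); // reuses the cached table description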