Uploading a zip file to S3 from C#

So I'm trying to upload a zip file to S3 for storage, but I keep getting 403 Forbidden back.
My code works when I upload an image file, but not when I upload a zip file.
My code:
internal static void UploadFiletoS3fromZip(Byte[] fileByteArray, string fileName, string bucketName, string filepath)
{
    try
    {
        CognitoAWSCredentials credentials = new CognitoAWSCredentials("###PVTCredentials###", Amazon.RegionEndpoint.EUWest1);
        client = new AmazonS3Client(credentials, Amazon.RegionEndpoint.EUWest1);
        using (MemoryStream fileToUpload = new MemoryStream(fileByteArray))
        {
            PutObjectRequest request = new PutObjectRequest()
            {
                BucketName = bucketName,
                Key = fileName,
                InputStream = fileToUpload,
                ContentType = "application/zip"
            };
            request.Timeout = TimeSpan.FromSeconds(60);
            PutObjectResponse response2 = client.PutObject(request);
        }
    }
    catch (AmazonS3Exception s3Exception)
    {
        s3Exception.ToExceptionless().Submit();
    }
    catch (Exception ex)
    {
        ex.ToExceptionless().Submit();
    }
}
Can anyone see what the problem is here? I get a 403 Forbidden in the s3Exception. The credentials I'm using do have write permission, and the code works perfectly when I upload a base64 image and change the ContentType to "image/jpeg".
OK SO I FOUND THE FIX...
Instead of using
CognitoAWSCredentials credentials = new CognitoAWSCredentials("###PVTCredentials###", Amazon.RegionEndpoint.EUWest1);
client = new AmazonS3Client(credentials, Amazon.RegionEndpoint.EUWest1);
I replaced it with
var client = new AmazonS3Client(AwsAccessKeyId, AwsSecretAccessKey, Amazon.RegionEndpoint.EUWest1);
In case anyone else is having this issue: replace the CognitoAWSCredentials with access key ID and secret key credentials.
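If you'd rather not hard-code the access key and secret in source, the SDK can also resolve them from the shared credentials file. A minimal sketch, assuming a profile named "default" exists in ~/.aws/credentials (the profile name is hypothetical):

// requires: using Amazon.Runtime; using Amazon.Runtime.CredentialManagement;
// Resolve credentials from the shared credentials file instead of embedding them
var chain = new CredentialProfileStoreChain();
if (chain.TryGetAWSCredentials("default", out AWSCredentials awsCredentials))
{
    var client = new AmazonS3Client(awsCredentials, Amazon.RegionEndpoint.EUWest1);
    // ... build and send the same PutObjectRequest as above
}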

using (var client = new AmazonS3Client(LlaveAcceso, LlaveAccesoSecreta, RegionEndpoint.USEast2))
{
    var putArchivo = new PutObjectRequest
    {
        BucketName = Buquet,
        Key = file.FileName,
        FilePath = ruta
    };
    // Blocks on the async call (see the note below)
    PutObjectResponse response = client.PutObjectAsync(putArchivo).Result;
    MessageBox.Show("File " + file.FileName + " uploaded successfully.", "AWS Loader", MessageBoxButtons.OK, MessageBoxIcon.Information);
    label2.Text = "";
}
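A note on the snippet above: blocking on client.PutObjectAsync(...).Result can deadlock the UI thread in WinForms. A hedged alternative, assuming the calling method can be made async (names reused from the snippet):

// Sketch: the same upload, but awaited instead of blocked on .Result
private async Task UploadAsync()
{
    using (var client = new AmazonS3Client(LlaveAcceso, LlaveAccesoSecreta, RegionEndpoint.USEast2))
    {
        var putArchivo = new PutObjectRequest
        {
            BucketName = Buquet,
            Key = file.FileName,
            FilePath = ruta
        };
        PutObjectResponse response = await client.PutObjectAsync(putArchivo);
    }
}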

How to upload files to Digital Ocean Space with AWS SDK for .NET?

I want to create a WinForms application to upload selected files to a Digital Ocean Space. You can imagine it working similarly to Cyberduck, written in C#. Thanks a lot
This worked for me to upload files to a Digital Ocean Space:
public static string UploadFile(HttpPostedFileBase file, string filepath)
{
    try
    {
        string accessKey = "xxxxxxxxxxxxxxxxx";
        string secretKey = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
        string bucketName = "your-space-name";

        // Point the S3 client at your Spaces endpoint
        AmazonS3Config config = new AmazonS3Config();
        config.ServiceURL = "https://abc1.digitaloceanspaces.com";
        AmazonS3Client s3Client = new AmazonS3Client(accessKey, secretKey, config);
        try
        {
            var fileTransferUtility = new TransferUtility(s3Client);
            var fileTransferUtilityRequest = new TransferUtilityUploadRequest
            {
                // filepath is your folder name in the Space; leave it empty if you have none
                BucketName = bucketName + @"/" + filepath,
                InputStream = file.InputStream,
                StorageClass = S3StorageClass.StandardInfrequentAccess,
                Key = file.FileName,
                CannedACL = S3CannedACL.PublicRead
            };
            fileTransferUtility.Upload(fileTransferUtilityRequest);
        }
        catch (AmazonS3Exception e)
        {
            var message = e.Message; // inspect or log as needed
        }
    }
    catch (Exception ex) { /* log as appropriate for your app */ }
    return file.FileName;
}
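Since the upload sets CannedACL = S3CannedACL.PublicRead, the object should afterwards be reachable at a predictable URL. A hypothetical helper, assuming "abc1" in the endpoint above stands for your Spaces region slug:

// Sketch: build the public URL for an object uploaded to a Space
public static string GetPublicUrl(string spaceName, string key)
{
    return $"https://{spaceName}.abc1.digitaloceanspaces.com/{key}";
}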

SharePointOnlineCredentials can't find account and

I'm trying to connect to my company's SharePoint, but the credentials I use (my email and password) are not recognized and an error is thrown on the ExecuteQuery line. Also, when I try to get the files using relative paths or URL paths, every single property throws a Microsoft.SharePoint.Client.PropertyOrFieldNotInitializedException.
private static readonly HttpClient client = new HttpClient();

[FunctionName("SendFiles")]
public static async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequestMessage req,
    ILogger log)
{
    ClientContext context = new ClientContext("https://abc.sharepoint.com/sites/a-b");
    SecureString secureString = new SecureString();
    foreach (char c in "dummypass")
    {
        secureString.AppendChar(c);
    }
    context.Credentials = new SharePointOnlineCredentials("a.b#faktion.com", secureString);
    Web site = context.Web;
    string jsonInput = req.Content.ReadAsStringAsync().Result;
    SendFilesInput input = JsonConvert.DeserializeObject<SendFilesInput>(jsonInput);
    string url = "https://a.b.com/gql/api/organisations/" + input.OrganisationId + "/projects/" + input.ProjectId + "/process";
    string response = null;
    bool succesfullRequest = false;
    MultipartFormDataContent formdata = new MultipartFormDataContent();
    try
    {
        foreach (var filePath in input.Files)
        {
            // create filestream content
            var fileurl = "https://abc.sharepoint.com/sites/a-b" + "/" + filePath;
            Microsoft.SharePoint.Client.File temp = site.GetFileByServerRelativeUrl(filePath);
            Microsoft.SharePoint.Client.File temp2 = site.GetFileByUrl(fileurl);
            ClientResult<Stream> crstream = temp.OpenBinaryStream();
            ClientResult<Stream> crstream2 = temp2.OpenBinaryStream();
            context.Load(temp);
            context.Load(temp2);
            context.ExecuteQuery(); // <-- the sign-in error is thrown here
            var tempfile = Path.GetTempFileName();
            FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read);
            if (crstream.Value != null)
            {
                crstream.Value.CopyTo(fs);
            }
            HttpContent content = new StreamContent(fs);
            string name = GetFileName(filePath);
            content.Headers.Add("Content-Type", GetFileType(name));
            formdata.Add(content, "files", name);
            System.IO.File.Decrypt(tempfile);
        }
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", input.BearerToken);
        // send content to the backend and parse result
        var resultPost = client.PostAsync(url, formdata).Result;
        response = resultPost.Content.ReadAsStringAsync().Result;
        succesfullRequest = resultPost.IsSuccessStatusCode;
    }
    // I absolutely want to catch every exception and pass these along to the workflow
    catch (Exception ex)
    {
        req.CreateErrorResponse(HttpStatusCode.BadRequest, ex);
    }
    // if something went wrong in the backend, throw an error
    if (!succesfullRequest)
    {
        req.CreateErrorResponse(HttpStatusCode.BadRequest, "Something went wrong during the upload process");
    }
    [...] // rest is not important
Error:
The sign-in name or password does not match one in the Microsoft account system.
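As for the PropertyOrFieldNotInitializedException part: CSOM only populates the properties you explicitly request, so every property you read must be named in a context.Load call before ExecuteQuery. A minimal sketch against the same context (the property list is illustrative):

// Request the specific properties up front; reading a property that was
// never loaded is what raises PropertyOrFieldNotInitializedException
Microsoft.SharePoint.Client.File f = site.GetFileByServerRelativeUrl(filePath);
context.Load(f, x => x.Name, x => x.ServerRelativeUrl, x => x.Length);
context.ExecuteQuery();
log.LogInformation(f.ServerRelativeUrl);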

Upload file Amazon S3 return The stream does not support concurrent IO read or write operations

I'm trying to upload a file to Amazon S3, but it always returns this message.
My code:
AmazonS3Config S3Config = new AmazonS3Config()
{
    ServiceURL = "s3.amazonaws.com",
    CommunicationProtocol = Protocol.HTTP,
    RegionEndpoint = RegionEndpoint.SAEast1
};
using (AmazonS3Client client = new AmazonS3Client(KEY_S3, PASSWORD, S3Config))
{
    string pathInS3 = folder + "/" + fileName;
    PutObjectRequest request = new PutObjectRequest();
    request.WithBucketName(BUCKET_NAME);
    request.WithKey(pathInS3);
    request.WithInputStream(memoryStreamFile);
    request.CannedACL = S3CannedACL.PublicReadWrite;
    client.PutObject(request);
}
I tried using a lock around the request and the client, but it did not resolve the problem.
I think the problem is the memoryStreamFile; please double-check by trying to read the content of your memoryStreamFile. Another way to upload files to Amazon S3 with C# is the following:
AmazonS3Config cfg = new AmazonS3Config();
cfg.RegionEndpoint = Amazon.RegionEndpoint.EUCentral1; // region endpoint
string bucketName = "your bucket";
AmazonS3Client s3Client = new AmazonS3Client("your access key", "your secret key", cfg);
string dataString = "your data";
MemoryStream data = new System.IO.MemoryStream(UTF8Encoding.ASCII.GetBytes(dataString));
TransferUtility t = new TransferUtility(s3Client);
t.Upload(data, bucketName, "testUploadFromTransferUtility.txt");
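If you stay with the original PutObject code, it may be enough to make sure nothing else can touch memoryStreamFile while the SDK reads it. A small sketch, inside the same using block as the question's code and assuming the caller has just finished writing the stream:

// Rewind and hand S3 its own private copy of the buffer, so no other
// thread can read or write the stream mid-upload
memoryStreamFile.Position = 0;
using (var uploadCopy = new MemoryStream(memoryStreamFile.ToArray()))
{
    request.WithInputStream(uploadCopy);
    client.PutObject(request);
}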

Getting file url after upload amazon s3

I need to get the file URL after uploading the file to the Amazon S3 server.
Here is my upload code.
How do I return the Amazon S3 path?
public static bool UploadToS3(string bucketName, string bucketFilePath, Byte[] localPath)
{
    var client = Amazon.AWSClientFactory.CreateAmazonS3Client(Config.EmailServer.AwsAccessKey, Config.EmailServer.AwsSecretKey, Amazon.RegionEndpoint.EUWest1);
    PutObjectRequest request = new PutObjectRequest()
    {
        BucketName = bucketName,
        Key = bucketFilePath,
        InputStream = new MemoryStream(localPath),
        AutoCloseStream = true,
        CannedACL = S3CannedACL.PublicRead,
        StorageClass = S3StorageClass.ReducedRedundancy
    };
    PutObjectResponse response = client.PutObject(request);
    return true;
}
You can simply generate an expiring download link after the upload has completed.
Example:
var expiryUrlRequest = new GetPreSignedUrlRequest()
    .WithBucketName(BucketName)
    .WithKey(Key)
    .WithExpires(DateTime.Now.AddDays(10));
string url = _amazonS3Client.GetPreSignedURL(expiryUrlRequest);
Try this method
GetPreSignedUrlRequest request = new GetPreSignedUrlRequest();
request.BucketName = "my-bucket-name";
request.Key = "secret_plans.txt";
request.Expires = DateTime.Now.AddHours(1);
request.Protocol = Protocol.HTTP;
string url = client.GetPreSignedURL(request);
Console.WriteLine(url);
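Since the question's upload sets CannedACL = S3CannedACL.PublicRead, the object's URL is also deterministic and no pre-signed link is strictly needed. A hypothetical helper, assuming the eu-west-1 region used in the question:

// Sketch: public-read objects are addressable directly by bucket and key
public static string GetPublicS3Url(string bucketName, string key)
{
    return $"https://{bucketName}.s3.eu-west-1.amazonaws.com/{key}";
}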

can't replace file in amazon s3 bucket

When I upload an image to an Amazon S3 bucket, it shows an error like the one below:
An item with the same key has already been added.
I have uploaded an image file and I want to replace that image when I need to, but it does not allow this.
How can I fix it?
I am using C#.
using (s3Client = Amazon.AWSClientFactory.CreateAmazonS3Client("key", "secret key", Amazon.RegionEndpoint.USWest2))
{
    var stream2 = new System.IO.MemoryStream();
    bitmap.Save(stream2, ImageFormat.Jpeg);
    stream2.Position = 0;
    PutObjectRequest request2 = new PutObjectRequest();
    request2.InputStream = stream2;
    request2.BucketName = "ezcimassets";
    request2.CannedACL = S3CannedACL.PublicRead;
    fileName = webpage + ".jpeg";
    //fileName = Guid.NewGuid() + webpage + ".jpeg";
    request2.Key = "WebThumbnails/" + fileName;
    Amazon.S3.Model.PutObjectResponse response = s3Client.PutObject(request2);
}
Thanks in advance
This line must be changed to:
request2.CannedACL = S3CannedACL.PublicReadWrite;
You can check if an object with that key already exists, and if so delete it:
public bool Exists(string fileKey, string bucketName)
{
    try
    {
        var response = _s3Client.GetObjectMetadata(new GetObjectMetadataRequest()
            .WithBucketName(bucketName)
            .WithKey(fileKey));
        return true;
    }
    catch (Amazon.S3.AmazonS3Exception ex)
    {
        if (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
            return false;

        // status wasn't not-found, so rethrow the exception
        throw;
    }
}

public void Delete(string fileKey, string bucketName)
{
    DeleteObjectRequest request = new DeleteObjectRequest();
    request.BucketName = bucketName;
    request.Key = fileKey;
    _s3Client.DeleteObject(request);
}
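Putting the two helpers together, a usage sketch for the question's upload code, assuming the helpers and the upload share the same configured client (names taken from the snippets above):

// Delete any existing object under the same key, then upload the new image
if (Exists(request2.Key, request2.BucketName))
{
    Delete(request2.Key, request2.BucketName);
}
Amazon.S3.Model.PutObjectResponse response = s3Client.PutObject(request2);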
