I have created a hierarchical structure for managing files in a bucket. I am planning to create a folder for each month, e.g. dec-2017, and there will be more than 10k PDF files in each folder.
I have written C# code for getting objects from the bucket. This code works fine for accessing files at the root of the bucket, but I am having issues with accessing files inside a folder, i.e. my-bucket/dec-2017/test.pdf cannot be accessed with the code below.
My bucket structure is shown here.
I am using the following code. Has anyone done this before?
if (_storageService == null)
{
    _storageService = CreateStorageClient();
}
ObjectsResource.GetRequest downloadRequest = null;
//string objectName = "dec-2017%2Ftest.pdf";
string objectName = "dec-2017/test.pdf";
downloadRequest = new ObjectsResource.GetRequest(_storageService, bucketName, objectName);
MemoryStream stream = new MemoryStream();
downloadRequest.Download(stream);
bytes = stream.ToArray();
Please check the sample code below:
using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;

// Create a local test file to upload
string localPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Test.csv");
File.WriteAllText(localPath, "test");

GoogleCredential credential = null;
BucketConnector bucketConnector = new BucketConnector();
credential = bucketConnector.ConnectStream();
var storageClient = StorageClient.Create(credential);

// Folder prefix, e.g. "dec-2017/" - the object name itself carries the "folder"
string folderPath = ConfigurationParameters.FOLDER_NAME_IN_BUCKET;
using (FileStream fileStream = File.OpenRead(localPath))
{
    string objectName = folderPath + Path.GetFileName(localPath);
    storageClient.UploadObject(bucketName, objectName, null, fileStream);
}
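In Cloud Storage the namespace inside a bucket is flat, so a "folder" like dec-2017 is simply a prefix in the object name, and downloading works the same way once you pass the full name. A minimal sketch with the same client library, reusing the bucketName and credential variables from the snippet above:
var storageClient = StorageClient.Create(credential);
byte[] bytes;
using (var stream = new MemoryStream())
{
    // The slash is part of the object name, not a separate folder entity
    storageClient.DownloadObject(bucketName, "dec-2017/test.pdf", stream);
    bytes = stream.ToArray();
}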
I am able to move an object in an S3 bucket from one directory to another directory using C#, but I am unable to copy the object's current permissions along with it.
For example, my object currently has public access permissions, but after moving it to another directory it loses the public read permission.
Here is the code I'm using to move objects:
public void MoveFile(string sourceBucket, string destinationFolder, string file)
{
    AmazonS3Client s3Client = new AmazonS3Client(ConfigurationHelper.AmazonS3AccessKey, ConfigurationHelper.AmazonS3SecretAccessKey, Amazon.RegionEndpoint.USEast1);
    S3FileInfo currentObject = new S3FileInfo(s3Client, sourceBucket, file);
    currentObject.MoveTo(sourceBucket + "/" + destinationFolder, file);
}
Here is the output after moving the file to another directory:
It lost the public "Read" permission.
I've figured out the issue myself: using CopyObject() and DeleteObject() instead of the built-in MoveTo method solves the problem.
Here is the code that really helped me:
CopyObjectRequest copyObjectRequest = new CopyObjectRequest
{
    SourceBucket = sourceBucket,
    DestinationBucket = sourceBucket + "/" + destinationFolder,
    SourceKey = file,
    DestinationKey = file,
    CannedACL = S3CannedACL.PublicRead,
    StorageClass = S3StorageClass.StandardInfrequentAccess,
};
CopyObjectResponse response1 = s3Client.CopyObject(copyObjectRequest);

var deleteObjectRequest = new DeleteObjectRequest
{
    BucketName = sourceBucket,
    Key = file
};
s3Client.DeleteObject(deleteObjectRequest);
I'm posting the answer so it can be helpful for someone else.
I have a Xamarin app where I need to read from a JSON file I have added to the Assets folder.
This is how the folder tree is in the solution explorer.
The code being used to read this json file is called from the User class in the Model folder.
This is how I have tried to access this file.
string path = Path.Combine(Directory.GetCurrentDirectory(), "..", "Assets", "data", "User.json");
var assembly = typeof(User).GetType().Assembly;
var stream = assembly.GetManifestResourceStream($"{assembly.GetName().Name}.{"User.json"}");
using (var reader = new StreamReader(stream))
{
    string jsonString = await reader.ReadToEndAsync();
    users = JsonConvert.DeserializeObject<List<User>>(jsonString);
}
The stream variable is null.
Please help out.
First, check that you have set the User.json file's Build Action to EmbeddedResource.
Then do it like below:
var assembly = IntrospectionExtensions.GetTypeInfo(typeof(User)).Assembly;
Stream stream = assembly.GetManifestResourceStream("TravulRecd.Assets.data.User.json");
string jsonString = "";
using (var reader = new System.IO.StreamReader(stream))
{
    jsonString = reader.ReadToEnd();
}
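From there, the deserialization step from your question works on jsonString; a short sketch, assuming the same User model and Json.NET (Newtonsoft.Json) as in the question:
// Turn the embedded JSON into the model objects used by the app
List<User> users = JsonConvert.DeserializeObject<List<User>>(jsonString);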
I am using the following code to upload an XML file to an Azure Blob Storage account, using the DotNetZip NuGet package.
XmlDocument doc = new XmlDocument();
doc.Load(path);
string xmlContent = doc.InnerXml;

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
var cloudBlobClient = storageAccount.CreateCloudBlobClient();
var cloudBlobContainer = cloudBlobClient.GetContainerReference(container);
cloudBlobContainer.CreateIfNotExists();

using (var fs = File.Create("test.zip"))
{
    using (var s = new ZipOutputStream(fs))
    {
        s.PutNextEntry("entry1.xml");
        byte[] buffer = Encoding.ASCII.GetBytes(xmlContent);
        s.Write(buffer, 0, buffer.Length);
        fs.Position = 0;

        //Get the blob ref
        var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
        blob.Properties.ContentEncoding = "zip";
        blob.Properties.ContentType = "text/plain";
        blob.Metadata["filename"] = "test.zip";
        blob.UploadFromStream(fs);
    }
}
This code creates a zip file in my container, but when I download it and try to open it, I get the following error:
"Windows cannot open the folder. The compressed (zipped) folder is invalid". However, the zip file saved in my application directory can be unzipped fine and contains my XML file.
What am I doing wrong?
I am able to reproduce the problem you're having. Essentially, the issue is that the content has not been completely written to the zip file when you initiate the upload. In my test, the zip file size on the local disk was 902 bytes, but at the time of uploading the size of the file stream was just 40 bytes, and that is what causes the problem.
What I did was split the code into two parts: the first part just creates the zip file, and the second reads from that file and uploads it to storage. Here's the code I used:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var cloudBlobClient = storageAccount.CreateCloudBlobClient();
var cloudBlobContainer = cloudBlobClient.GetContainerReference("test");
cloudBlobContainer.CreateIfNotExists();

// Step 1: create the zip file and let both streams close so the archive is finalized
using (var fs = File.Create("test.zip"))
{
    using (var s = new ZipOutputStream(fs))
    {
        s.PutNextEntry("entry1.xml");
        byte[] buffer = File.ReadAllBytes(@"Path\To\MyFile.txt");
        s.Write(buffer, 0, buffer.Length);
    }
}

// Step 2: re-open the finished zip and upload it to blob storage
using (var fs = File.OpenRead("test.zip"))
{
    var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
    blob.Properties.ContentEncoding = "zip";
    blob.Properties.ContentType = "text/plain";
    blob.Metadata["filename"] = "test.zip";
    fs.Position = 0;
    blob.UploadFromStream(fs);
}
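If you would rather not write a temporary file to disk at all, the same idea works with a MemoryStream, provided the archive is finished before uploading. This is only a sketch; it assumes the DotNetZip ZipOutputStream(Stream, bool leaveOpen) overload is available so the MemoryStream stays usable after the zip stream is disposed:
using (var ms = new MemoryStream())
{
    // leaveOpen: true (assumed overload) keeps ms open after the zip stream is disposed
    using (var s = new ZipOutputStream(ms, true))
    {
        s.PutNextEntry("entry1.xml");
        byte[] buffer = Encoding.ASCII.GetBytes(xmlContent);
        s.Write(buffer, 0, buffer.Length);
    }
    // The archive is complete only after the ZipOutputStream is disposed
    ms.Position = 0;
    var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
    blob.UploadFromStream(ms);
}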
I created a bucket on Google Cloud Storage and programmatically uploaded some files to it. When I try to download them, I get this exception:
The specified data could not be decrypted
My code looks like this:
GoogleCredential credential = null;
var jsonFileBytes = Properties.Resources.stitcherautoupdate_55bd51f48cf0;
var jsonFileString = Encoding.UTF8.GetString(jsonFileBytes, 0, jsonFileBytes.Length);
var json = Newtonsoft.Json.JsonConvert.DeserializeObject<System.Object>(jsonFileString);
var jsonString = json.ToString();
credential = GoogleCredential.FromJson(jsonString);
StorageClient = StorageClient.Create(credential);
StorageClient.DownloadObject(bucketName, fileName, fileStream);
My recommendation regarding your issue is to try following the upload and download methods shown in the documentation. Once you get those working, you can slowly change the code back so you can see which part is causing the issue.
This document describes how to first configure your Cloud Storage client library and set up authentication.
A sample code for uploading an object:
private void UploadFile(string bucketName, string localPath,
    string objectName = null)
{
    var storage = StorageClient.Create();
    using (var f = File.OpenRead(localPath))
    {
        objectName = objectName ?? Path.GetFileName(localPath);
        storage.UploadObject(bucketName, objectName, null, f);
        Console.WriteLine($"Uploaded {objectName}.");
    }
}
For downloading an object:
private void DownloadObject(string bucketName, string objectName,
    string localPath = null)
{
    var storage = StorageClient.Create();
    localPath = localPath ?? Path.GetFileName(objectName);
    using (var outputFile = File.OpenWrite(localPath))
    {
        storage.DownloadObject(bucketName, objectName, outputFile);
    }
    Console.WriteLine($"downloaded {objectName} to {localPath}.");
}
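These samples rely on Application Default Credentials via StorageClient.Create(). If you need an explicit service-account key, as in your snippet, a simpler sketch is to load the key file directly instead of round-tripping it through a JSON deserializer. This assumes GoogleCredential.FromFile is available in your version of Google.Apis.Auth; the file path is a placeholder, and bucketName, fileName and localPath are carried over from your code:
// Build the client from the service-account key file (path is a placeholder)
GoogleCredential credential = GoogleCredential.FromFile(@"path\to\service-account-key.json");
var storage = StorageClient.Create(credential);
using (var outputFile = File.OpenWrite(localPath))
{
    storage.DownloadObject(bucketName, fileName, outputFile);
}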
I am trying to upload files to an S3 bucket, but I don't want the file to be uploaded from my local machine. Instead, when someone uses the application and uploads a file, it should go directly to my S3 bucket. Is there a way to do this? (The code should be in .NET.)
What I am doing in my WriteIntoS3 method below creates a folder in the bucket, takes the file path from the local machine, and uploads that file to the bucket.
What I need is for the file not to be saved on the local machine at all, but instead to be sent directly from the application to the S3 bucket.
This is the WriteIntoS3 method:
string filekey = filePath.Substring(filePath.LastIndexOf('\\') + 1);
using (MemoryStream filebuffer = new MemoryStream(File.ReadAllBytes(filePath)))
{
    PutObjectRequest putRequest = new PutObjectRequest
    {
        BucketName = this.awsBucketName,
        Key = "GUARD1" + "/" + filekey,
        InputStream = filebuffer,
        ContentType = "application/pkcs8",
    };
    client.PutObject(putRequest);

    GetPreSignedUrlRequest expiryUrlRequest = new GetPreSignedUrlRequest();
    expiryUrlRequest.BucketName = this.awsBucketName;
    expiryUrlRequest.Key = filekey;
    expiryUrlRequest.Expires = DateTime.Now.AddDays(ExpiryDays);
    string url = client.GetPreSignedURL(expiryUrlRequest);
    return url;
}
If you don't want to use local files, you can use the TransferUtility class to upload a stream directly to S3.
For example:
using Amazon.S3.Transfer;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        var client = new Amazon.S3.AmazonS3Client();
        using (var ms = new MemoryStream()) // Load the data into the MemoryStream from a data source other than a file
        {
            using (var transferUtility = new TransferUtility(client))
            {
                transferUtility.Upload(ms, "bucket", "key");
            }
        }
    }
}
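Since the data in your case comes from users of the application, the same approach applies to whatever stream the web framework hands you, so nothing ever touches the local disk. This is only a sketch, assuming an ASP.NET Core controller action with an IFormFile parameter; awsBucketName and the "GUARD1" prefix are carried over from your code:
// Stream a user-uploaded file straight to S3 without touching the local file system
public async Task<string> UploadToS3(IFormFile uploadedFile)
{
    var client = new Amazon.S3.AmazonS3Client();
    using (var transferUtility = new Amazon.S3.Transfer.TransferUtility(client))
    using (var stream = uploadedFile.OpenReadStream())
    {
        string key = "GUARD1/" + uploadedFile.FileName;
        await transferUtility.UploadAsync(stream, this.awsBucketName, key);
        return key;
    }
}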