I want to download all the files in an Azure container as a zip file, and make the download path dynamic.
This is my current code:
string userName = System.Security.Principal.WindowsIdentity.GetCurrent().Name;
string[] arr = userName.Split('\\');
string path = $@"C:\Users\{arr[1]}\Downloads\";
CloudBlobContainer container = BlobClient.GetContainerReference(containerName);
var list = container.ListBlobs();
// Console.WriteLine(list.Count());
string[] filesName = new string[list.Count()];
int i = 0;
foreach (var blob in list)
{
    string[] name = blob.Uri.AbsolutePath.Split('/');
    filesName[i++] = name[2];
    // Console.WriteLine(name[2]);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(name[2]);
    System.IO.Directory.CreateDirectory($@"{path}ImagesPath");
    using (var fileStream = System.IO.File.OpenWrite($@"{path}ImagesPath\{name[2]}"))
    {
        blockBlob.DownloadToStream(fileStream);
    }
}
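As an aside, deriving the Downloads folder by splitting the Windows identity is fragile (it breaks for Azure AD accounts or when the profile folder name differs from the account name). A minimal sketch of a more robust dynamic path using `Environment.GetFolderPath` — the helper name and the "Downloads" sub-path are my own assumptions, not part of the original code:

```csharp
using System;
using System.IO;

class PathHelper
{
    // Builds a per-user download directory without parsing the Windows identity.
    public static string GetDownloadDirectory(string subFolder)
    {
        // UserProfile resolves to e.g. C:\Users\<name> on Windows, /home/<name> on Linux.
        string profile = Environment.GetFolderPath(Environment.SpecialFolder.UserProfile);
        string path = Path.Combine(profile, "Downloads", subFolder);
        Directory.CreateDirectory(path); // no-op if it already exists
        return path;
    }
}
```

Usage: `string path = PathHelper.GetDownloadDirectory("ImagesPath");` replaces the identity-splitting lines above.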
You can finish this job in 3 steps.
Step 1: download all the files to a folder. I suggest you create a folder under the Content folder of your web application.
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("storage connection string");
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
string containerName = "mycontainer";
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
var blobs = container.ListBlobs(string.Empty, true);
string currentDateTime = DateTime.Now.ToString("yyyyMMddhhmmss");
string directoryPath = Server.MapPath("~/Content/" + containerName + currentDateTime);
System.IO.Directory.CreateDirectory(directoryPath);
foreach (CloudBlockBlob blockBlob in blobs)
{
    string[] segments = blockBlob.Name.Split('/');
    string subFolderPath = directoryPath;
    for (int i = 0; i < segments.Length - 1; i++)
    {
        subFolderPath = subFolderPath + "\\" + segments[i];
        if (!System.IO.Directory.Exists(subFolderPath))
        {
            System.IO.Directory.CreateDirectory(subFolderPath);
        }
    }
    string filePath = directoryPath + "\\" + blockBlob.Name;
    blockBlob.DownloadToFile(filePath, System.IO.FileMode.CreateNew);
}
Console.WriteLine("Download files successful.");
Step 2: after downloading the files, we can compress the folder using the System.IO.Compression.ZipFile class. To use it, we need to add references to two assemblies: System.IO.Compression and System.IO.Compression.FileSystem.
System.IO.Compression.ZipFile.CreateFromDirectory(directoryPath, directoryPath + ".zip");
Console.WriteLine("Compressed the folder successfully.");
Step 3: since the zip file is generated in the Content folder, you can generate the URL for the download operation.
string url = "http://hostname:port/content/" + containerName + currentDateTime + ".zip";
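If you want to skip the intermediate folder entirely, the same idea works in memory: open a ZipArchive over a stream and copy each blob's content into an entry. A minimal local sketch — the in-memory source streams stand in for blob streams, which is an assumption for illustration:

```csharp
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

class ZipStreamBuilder
{
    // Copies each named source stream into an entry of a zip written to output.
    public static void BuildZip(Stream output, IDictionary<string, Stream> sources)
    {
        // leaveOpen: true so the caller can rewind and serve the output stream afterwards.
        using (var archive = new ZipArchive(output, ZipArchiveMode.Create, leaveOpen: true))
        {
            foreach (var pair in sources)
            {
                ZipArchiveEntry entry = archive.CreateEntry(pair.Key);
                using (Stream entryStream = entry.Open())
                {
                    // With the storage SDK you would instead call
                    // blockBlob.DownloadToStream(entryStream) here.
                    pair.Value.CopyTo(entryStream);
                }
            }
        }
    }
}
```

With the storage SDK, each blob's stream (e.g. from `blockBlob.OpenRead()`) would be fed in place of the sample streams, so no temporary files are written at all.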
I am trying to update a JSON file which is located in Azure Blob Storage. When the program does the PUT call, the saved file looks like this:
Zona de especial protecci\u00F3n
The accents and other characters are the problem, but that only happens when I download the file from the Azure UI; if I do a GET from Postman, it does not happen. This is my code:
SemanticDictionaryContent semanticDictionaryContent = new SemanticDictionaryContent()
{
Name = entity.Id + JSON_EXTENSION,
Content = BinaryData.FromObjectAsJson(entity)
};
Create a storage account in Azure.
Create a container in Azure.
Upload a JSON file to the container using the code below.
I have used the below approach for uploading / downloading / editing the JSON file.
public static bool Upload()
{
try
{
var containerName = "mycontainer";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionSting);
CloudBlobClient client = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference(containerName);
var isCreated = container.CreateIfNotExists();
container.SetPermissionsAsync(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
using (FileStream fileStream = File.Open(@"C:\Tools\local.settings.json", FileMode.Open))
{
using (MemoryStream memoryStream = new MemoryStream())
{
    fileStream.CopyTo(memoryStream);
    memoryStream.Seek(0, SeekOrigin.Begin);
    var fileName = "local.settings.json";
    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
    // Look up the MIME type from the registry (Windows only).
    string mimeType = "application/unknown";
    string ext = fileName.Contains(".") ? System.IO.Path.GetExtension(fileName).ToLower() : "." + fileName;
    Microsoft.Win32.RegistryKey regKey = Microsoft.Win32.Registry.ClassesRoot.OpenSubKey(ext);
    if (regKey != null && regKey.GetValue("Content Type") != null) mimeType = regKey.GetValue("Content Type").ToString();
    blob.Properties.ContentType = mimeType;
    blob.UploadFromStream(memoryStream);
}
}
return true;
}
catch
{
    throw;
}
}
Uploaded Json file
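Note that the registry-based MIME lookup above only works on Windows. A small dictionary lookup is a portable alternative — the table below is a minimal sample of my own, extend it with whatever types you serve:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class MimeHelper
{
    // Minimal extension-to-MIME map; add entries as needed.
    static readonly Dictionary<string, string> Map =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            [".json"] = "application/json",
            [".txt"] = "text/plain",
            [".pdf"] = "application/pdf",
            [".png"] = "image/png",
        };

    public static string GetMimeType(string fileName)
    {
        string ext = Path.GetExtension(fileName);
        // Fall back to the generic binary type for unknown extensions.
        return Map.TryGetValue(ext, out string mime) ? mime : "application/octet-stream";
    }
}
```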
Updated the JSON file in Azure and uploaded it using the code below.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionSting);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob jsonBlob = container.GetBlockBlobReference("local.settings.json");
string jsonText = jsonBlob.DownloadText();
dynamic jsonData = JsonConvert.DeserializeObject(jsonText);
jsonData.property1 = "Property1";
jsonData.property2 = "Property2";
jsonBlob.UploadText(JsonConvert.SerializeObject(jsonData));
And I downloaded the JSON file from Azure manually and did not find any special characters.
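For reference, the `\u00F3` sequences in the question are not corruption: System.Text.Json escapes non-ASCII characters by default, and both forms deserialize to the same string. If you want the raw accented characters in the stored file, you can relax the encoder. A sketch of the escaping behavior, independent of Blob Storage:

```csharp
using System.Text.Encodings.Web;
using System.Text.Json;

class EscapingDemo
{
    public static (string escaped, string raw) Serialize(string text)
    {
        // Default options escape "ó" as \u00F3.
        string escaped = JsonSerializer.Serialize(text);
        // UnsafeRelaxedJsonEscaping keeps the accented character as-is
        // (fine for UTF-8 storage like blobs).
        var relaxed = new JsonSerializerOptions
        {
            Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping
        };
        string raw = JsonSerializer.Serialize(text, relaxed);
        return (escaped, raw);
    }
}
```

Either form is valid JSON; the Azure portal simply shows you the bytes as stored, while clients like Postman decode the escape sequences when rendering.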
I am able to move an object in an S3 bucket from one directory to another using C#, but I am unable to copy the object's current permissions along with it.
For example, my current object has public access permissions, but after moving it to another directory it loses the public read permission.
Here is the code I'm using to move objects:
public void MoveFile(string sourceBucket, string destinationFolder, string file)
{
    AmazonS3Client s3Client = new AmazonS3Client(ConfigurationHelper.AmazonS3AccessKey, ConfigurationHelper.AmazonS3SecretAccessKey, Amazon.RegionEndpoint.USEast1);
    S3FileInfo currentObject = new S3FileInfo(s3Client, sourceBucket, file);
    currentObject.MoveTo(sourceBucket + "/" + destinationFolder, file);
}
Here is the output after moving the file to another directory:
It lost the public "Read" permission.
I figured out the issue myself by using CopyObject() and DeleteObject() instead of the built-in MoveTo() method, and that solved my issue.
Here is the code which really helped me:
CopyObjectRequest copyObjectRequest = new CopyObjectRequest
{
SourceBucket = sourceBucket,
DestinationBucket = sourceBucket + "/" + destinationFolder,
SourceKey = file,
DestinationKey = file,
CannedACL = S3CannedACL.PublicRead,
StorageClass = S3StorageClass.StandardInfrequentAccess,
};
CopyObjectResponse response1 = s3Client.CopyObject(copyObjectRequest);
var deleteObjectRequest = new DeleteObjectRequest
{
BucketName = sourceBucket,
Key = file
};
s3Client.DeleteObject(deleteObjectRequest);
I'm posting the answer so it can be helpful for someone!
How do I unzip a 40 GB file in Azure Blob Storage using C#? I tried SharpZipLib and Ionic.Zip, but I run into the error
Bad state (invalid block type)
Can anyone please help me out? Below is my code:
var storageAccount1 = CloudStorageAccount.Parse(connectionString1);
var blobClient = storageAccount1.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("testwwpoc");
var file = container.GetBlobReference("WAV_WW_5988.zip");
// Ensure that the file exists.
if (file.Exists())
{
    Console.WriteLine("Yes");
    using (ZipFile zipFile = ZipFile.Read(file.OpenRead()))
    {
        zipFile.CompressionLevel = Ionic.Zlib.CompressionLevel.Default;
        //zipFile.UseZip64WhenSaving = Zip64Option.Always;
        zipFile.Encryption = EncryptionAlgorithm.None;
        //zipFile.BufferSize = 65536 * 19000;
        zipFile.Password = "xyz";
        Console.WriteLine(zipFile.Entries.Count);
        //foreach (var entry in zipFile.Entries)
        for (var i = 1; i < zipFile.Count; i++)
        {
            var blob = container.GetBlockBlobReference("test/" + zipFile[i].FileName);
            Console.WriteLine(zipFile[i].FileName);
            Console.WriteLine(zipFile[i].UncompressedSize);
            try
            {
                blob.UploadFromStream(zipFile[i].OpenReader());
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex);
            }
        }
    }
}
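The same extract-and-stream pattern as the Ionic code above can be done with the built-in System.IO.Compression.ZipArchive, which reads entries from a seekable stream without buffering the whole archive — shown here against a local stream (the blob would supply `file.OpenRead()`; note that ZipArchive does not support password-protected archives, so dropping the password is an assumption of this sketch):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

class Unzipper
{
    // Reads every entry from a zip stream and hands its content stream to a callback,
    // e.g. blob.UploadFromStream for re-uploading extracted files.
    public static List<string> ExtractEntries(Stream zipStream, Action<string, Stream> handle)
    {
        var names = new List<string>();
        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Read))
        {
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                names.Add(entry.FullName);
                using (Stream content = entry.Open())
                {
                    handle(entry.FullName, content); // streamed, never fully buffered
                }
            }
        }
        return names;
    }
}
```

For a 40 GB archive the key point is that each entry is copied as a stream; nothing requires holding an entry, let alone the archive, in memory.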
When I click the download link it sends me to an error page. Debugging it, it's telling me that my given path's format is not supported.
In my controller class:
public async Task<ActionResult> DownloadBlob(string file, string extension)
{
    string downloadPath = await repo.DownloadBlobAsync(file, extension);
    return Json(downloadPath);
}
In my Blob Storage class:
public async Task<string> DownloadBlobAsync(string file, string fileExtension)
{
    _cloudBlobContainerx = _cloudBlobClientx.GetContainerReference(containerNamex);
    CloudBlockBlob blockBlob = _cloudBlobContainerx.GetBlockBlobReference(file + "." + fileExtension);
    var path = downloadPath + file + "." + fileExtension;
    using (var fileStream = System.IO.File.OpenWrite(path))
    {
        await blockBlob.DownloadToStreamAsync(fileStream);
        return path;
    }
}
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.NotSupportedException: The given path's format is not supported
The source of the error :
using (var fileStream = System.IO.File.OpenWrite(path))
According to your description, I suggest you set a breakpoint to check the value of the downloadPath or path field.
I have created a simple demo which downloads a blob file to a local folder path. It works fine. You could refer to it:
string containerNamex = "container002";
string file = "MyTest"; // a blob file named MyTest.txt
string fileExtension = "txt";
string downloadPath = @"D:\MyBlob\";
CloudBlobClient _cloudBlobClientx = storageAccount.CreateCloudBlobClient();
_cloudBlobContainerx = _cloudBlobClientx.GetContainerReference(containerNamex);
CloudBlockBlob blockBlob = _cloudBlobContainerx.GetBlockBlobReference(file + "." + fileExtension);
var path = downloadPath + file + "." + fileExtension;
using (var fileStream = System.IO.File.OpenWrite(path))
{
    blockBlob.DownloadToStream(fileStream);
    Console.WriteLine("success");
}
The result looks like this:
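The NotSupportedException typically means the composed path contains a character that is illegal in a local path, e.g. the second colon when downloadPath is actually a URL like `http://...` instead of a local folder. A small check you could run before OpenWrite — the helper name is my own:

```csharp
using System;
using System.IO;

static class PathCheck
{
    // Returns true when the string looks like a usable local file path:
    // no invalid path characters, not a URL, and rooted.
    public static bool IsLocalFilePath(string path)
    {
        if (string.IsNullOrWhiteSpace(path)) return false;
        if (path.IndexOfAny(Path.GetInvalidPathChars()) >= 0) return false;
        // URLs ("http://...") parse as absolute URIs but are not file paths.
        if (Uri.TryCreate(path, UriKind.Absolute, out Uri uri) && !uri.IsFile) return false;
        return Path.IsPathRooted(path);
    }
}
```

Logging the rejected value at this point usually makes it obvious which piece of configuration fed a URL (or a relative fragment) into the file-system call.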
I have created a hierarchical structure for managing files in a bucket, planning to create a folder for each month, i.e. dec-2017. There will be more than 10k PDF files in each folder.
I have written C# code for getting objects from the bucket. This code works fine for accessing files at the root of the bucket, but I am having issues with accessing files in a folder, i.e. my-bucket/dec-2017/test.pdf is not accessible using the code.
Refer to my bucket structure here.
I am using the following code. Has anyone done this before?
if (_storageService == null)
{
_storageService = CreateStorageClient();
}
ObjectsResource.GetRequest downloadRequest = null;
//string objectName = "dec-2017%2Ftest.pdf";
string objectName = "dec-2017/test.pdf";
downloadRequest = new ObjectsResource.GetRequest(_storageService, bucketName, objectName);
MemoryStream stream = new MemoryStream();
downloadRequest.Download(stream);
bytes = stream.ToArray();
Please check the sample code below:
using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;

string localPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Test.csv");
File.WriteAllText(localPath, "test");
GoogleCredential credential = null;
BucketConnector bucketConnector = new BucketConnector();
credential = bucketConnector.ConnectStream();
var storageClient = StorageClient.Create(credential);
string folderPath = ConfigurationParameters.FOLDER_NAME_IN_BUCKET;
using (FileStream fileStream = File.OpenRead(localPath))
{
    string objectName = folderPath + Path.GetFileName(localPath);
    storageClient.UploadObject(bucketName, objectName, null, fileStream);
}
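The commented-out `dec-2017%2Ftest.pdf` in the question hints at the underlying issue: GCS has no real folders, the `/` is simply part of the object name, and the client library percent-encodes the name for you — pre-encoding it yourself requests a name that does not exist. A small local illustration of the double-encoding problem (no GCS access needed):

```csharp
using System;

class ObjectNameDemo
{
    // The object name you store and request should contain the literal slash.
    public const string ObjectName = "dec-2017/test.pdf";

    // What the HTTP layer sends after the library encodes the name once.
    public static string EncodedOnce() => Uri.EscapeDataString(ObjectName);

    // What gets sent if you pass an already-encoded name: %2F becomes %252F.
    public static string EncodedTwice() => Uri.EscapeDataString(EncodedOnce());
}
```

So pass `"dec-2017/test.pdf"` exactly as written to the GetRequest, and let the library handle the escaping.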