I am using the following code to upload an XML file to an Azure Blob Storage account, using the DotNetZip NuGet package.
XmlDocument doc = new XmlDocument();
doc.Load(path);
string xmlContent = doc.InnerXml;

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
var cloudBlobClient = storageAccount.CreateCloudBlobClient();
var cloudBlobContainer = cloudBlobClient.GetContainerReference(container);
cloudBlobContainer.CreateIfNotExists();

using (var fs = File.Create("test.zip"))
{
    using (var s = new ZipOutputStream(fs))
    {
        s.PutNextEntry("entry1.xml");
        byte[] buffer = Encoding.ASCII.GetBytes(xmlContent);
        s.Write(buffer, 0, buffer.Length);
        fs.Position = 0;
        //Get the blob ref
        var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
        blob.Properties.ContentEncoding = "zip";
        blob.Properties.ContentType = "text/plain";
        blob.Metadata["filename"] = "test.zip";
        blob.UploadFromStream(fs);
    }
}
This code creates a zip file in my container, but when I download it and try to open it, I get the following error:
"Windows cannot open the folder. The compressed (zipped) folder is invalid". However, the zip file saved in my application directory can be unzipped fine and contains my XML file.
What am I doing wrong?
I am able to reproduce the problem you're having. Essentially, the issue is that the content is not completely written to the zip file when you initiate the upload. In my test, the zip file on the local disk was 902 bytes, but at the time of uploading the file stream was only 40 bytes, and that is what causes the problem.
What I did was split the logic into two parts, where the first part just creates the zip file and the second reads it from disk and uploads it to storage. Here's the code I used:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var cloudBlobClient = storageAccount.CreateCloudBlobClient();
var cloudBlobContainer = cloudBlobClient.GetContainerReference("test");
cloudBlobContainer.CreateIfNotExists();

// First: create the zip file on disk and let both streams close,
// so the zip is fully written.
using (var fs = File.Create("test.zip"))
{
    using (var s = new ZipOutputStream(fs))
    {
        s.PutNextEntry("entry1.xml");
        byte[] buffer = File.ReadAllBytes(@"Path\To\MyFile.txt");
        s.Write(buffer, 0, buffer.Length);
    }
}

// Second: re-open the finished zip file and upload it.
using (var fs = File.OpenRead("test.zip"))
{
    var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
    blob.Properties.ContentEncoding = "zip";
    blob.Properties.ContentType = "text/plain";
    blob.Metadata["filename"] = "test.zip";
    fs.Position = 0;
    blob.UploadFromStream(fs);
}
I am trying to update a JSON file which is located in Azure blob storage. When the program does the PUT call, the saved file looks like this:
Zona de especial protecci\u00F3n
The accents and other special characters are the issue, but that only happens when I download the file from the Azure portal UI; if I do a GET from Postman, it does not happen. This is my code:
SemanticDictionaryContent semanticDictionaryContent = new SemanticDictionaryContent()
{
    Name = entity.Id + JSON_EXTENSION,
    Content = BinaryData.FromObjectAsJson(entity)
};
I created a storage account in Azure.
I created a container in Azure.
I uploaded a JSON file to the container using the code below.
I have used the approach below for uploading, downloading, and editing the JSON file.
public static bool Upload()
{
    try
    {
        var containerName = "mycontainer";
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);
        CloudBlobClient client = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference(containerName);
        container.CreateIfNotExists();
        container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
        using (FileStream fileStream = File.Open(@"C:\Tools\local.settings.json", FileMode.Open))
        {
            using (MemoryStream memoryStream = new MemoryStream())
            {
                fileStream.CopyTo(memoryStream);
                var fileName = "local.settings.json";
                CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

                // Resolve the MIME type from the Windows registry.
                string mimeType = "application/unknown";
                string ext = fileName.Contains(".") ? System.IO.Path.GetExtension(fileName).ToLower() : "." + fileName;
                Microsoft.Win32.RegistryKey regKey = Microsoft.Win32.Registry.ClassesRoot.OpenSubKey(ext);
                if (regKey != null && regKey.GetValue("Content Type") != null)
                    mimeType = regKey.GetValue("Content Type").ToString();

                // Rewind the stream before uploading, or the blob will be empty.
                memoryStream.Seek(0, SeekOrigin.Begin);
                blob.Properties.ContentType = mimeType;
                blob.UploadFromStream(memoryStream);
            }
        }
        return true;
    }
    catch (Exception)
    {
        throw;
    }
}
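Note that the registry-based MIME lookup above only works on Windows. As a cross-platform alternative, here is a sketch using the same FileExtensionContentTypeProvider that appears in a later answer here, from the Microsoft.AspNetCore.StaticFiles package:

using Microsoft.AspNetCore.StaticFiles;

// Resolve the content type from the file extension, with a generic fallback.
var provider = new FileExtensionContentTypeProvider();
if (!provider.TryGetContentType(fileName, out string mimeType))
{
    mimeType = "application/octet-stream";
}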
With the JSON file uploaded, I then updated it in Azure and re-uploaded it using the code below.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob jsonBlob = container.GetBlockBlobReference("local.settings.json");

// Download the JSON, edit it, and upload it again.
string jsonText = jsonBlob.DownloadText();
dynamic jsonData = JsonConvert.DeserializeObject(jsonText);
jsonData.property1 = "Property1";
jsonData.property2 = "Property2";
jsonBlob.UploadText(JsonConvert.SerializeObject(jsonData));
I then downloaded the JSON file from Azure manually and did not find any escaped special characters.
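For what it's worth, the \u00F3 escaping in your original output is consistent with System.Text.Json's default encoder, which BinaryData.FromObjectAsJson uses; the JSON is still valid, and clients like Postman simply unescape it when displaying the value. If you want the accented characters stored literally, a hedged sketch (untested) is to pass relaxed serializer options:

using System.Text.Encodings.Web;
using System.Text.Json;

// Relax the default escaping so "protección" is written as-is.
var options = new JsonSerializerOptions
{
    Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping
};
var content = BinaryData.FromObjectAsJson(entity, options); // entity as in the question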
I am uploading an Excel file to an Azure storage container. When the file gets uploaded and I try to download it back from the portal and open it, the open fails because the format of the file and the extension do not match. Also, there is no size in the size column corresponding to the file. I cannot spot the error. The code is in ASP.NET Core 3.1 with C#.
Here is my code
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connectionString); // to the azure account
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName); // container in which the file will be uploaded
blob = cloudBlobContainer.GetBlockBlobReference(f); // f is file name
await blob.UploadFromStreamAsync(s); // this is a memory stream
blob.Properties.ContentType = fileType;
The problem is that your code assigns blob.Properties.ContentType after UploadFromStreamAsync has already completed, so the property is never sent to the service; it must be set before the upload (or persisted afterwards with SetPropertiesAsync). An empty size column usually means the stream was uploaded from its end position, so also make sure the memory stream is rewound to position 0 before uploading. According to my test, we can use the following code to upload an Excel file to Azure Blob Storage.
Install SDK
Install-Package Microsoft.Azure.Storage.Blob -Version 11.1.1
Install-Package Microsoft.AspNetCore.StaticFiles -Version 2.2.0
My Excel file (.xlsx)
Code
static async Task Main(string[] args)
{
    var filepath = @"D:\test.xlsx";
    var storageAccount = CloudStorageAccount.Parse("<connection string>");
    var cloudBlobClient = storageAccount.CreateCloudBlobClient();
    var cloudBlobContainer = cloudBlobClient.GetContainerReference("test1");
    var blob = cloudBlobContainer.GetBlockBlobReference(Path.GetFileName(filepath));

    // Set the content type before uploading so it is sent with the upload.
    blob.Properties.ContentType = Get(Path.GetFileName(filepath));
    using (var stream = File.OpenRead(filepath))
    {
        await blob.UploadFromStreamAsync(stream);
    }

    //download file
    filepath = @"D:\test\" + blob.Name;
    using (var stream = File.OpenWrite(filepath))
    {
        await blob.DownloadToStreamAsync(stream);
    }
}

// get the file content type
static string Get(string fileName)
{
    var provider = new FileExtensionContentTypeProvider();
    string contentType;
    if (!provider.TryGetContentType(fileName, out contentType))
    {
        contentType = "application/octet-stream";
    }
    return contentType;
}
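As an aside, if a blob has already been uploaded, assigning blob.Properties.ContentType alone changes nothing on the service; the change has to be pushed explicitly, for example:

// Persist a property change for an already-uploaded blob.
blob.Properties.ContentType = contentType; // e.g. the value from Get() above
await blob.SetPropertiesAsync();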
I'm using Azure Blob Storage to save some files. I'm having issues downloading this stream and am not sure why. I don't get any errors, just an empty stream. I've verified that the file exists in the container, and I even ran code to list all the files in the container. Any help would be greatly appreciated.
private async Task<MemoryStream> GetMemoryStreamAsync(string fileName)
{
    var storageAccountName = Environment.GetEnvironmentVariable("storage_account_name");
    var storageAccountKey = Environment.GetEnvironmentVariable("storage_access_key");
    var storageContainerName = Environment.GetEnvironmentVariable("storage_container_name");

    CloudStorageAccount storageAccount = new CloudStorageAccount(new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(storageAccountName, storageAccountKey), true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference(storageContainerName);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);

    MemoryStream stream = new MemoryStream();
    await blockBlob.DownloadToStreamAsync(stream);
    return stream;
}
You'll need to set the position to zero before returning the stream so that the consumer of the stream reads it from the beginning.
MemoryStream stream = new MemoryStream();
await blockBlob.DownloadToStreamAsync(stream);
stream.Position = 0;
return stream;
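For example, a hypothetical caller (file name invented for illustration): without the rewind, the CopyTo below would write zero bytes, because the position is already at the end of the stream after DownloadToStreamAsync.

using (var stream = await GetMemoryStreamAsync("report.pdf"))
using (var file = File.Create(@"C:\temp\report.pdf"))
{
    stream.CopyTo(file);
}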
I have created a hierarchical structure for managing files in a bucket, planning to create a folder for each month, i.e. dec-2017. There will be more than 10k PDF files in each folder.
I have written C# code for getting objects from the bucket. This code works fine for accessing files at the root of the bucket, but I am having issues accessing files inside a folder, i.e. my-bucket/dec-2017/test.pdf cannot be accessed using the code.
Refer to my bucket structure here.
I am using the following code. Has anyone done this before?
if (_storageService == null)
{
    _storageService = CreateStorageClient();
}

ObjectsResource.GetRequest downloadRequest = null;
//string objectName = "dec-2017%2Ftest.pdf";
string objectName = "dec-2017/test.pdf";
downloadRequest = new ObjectsResource.GetRequest(_storageService, bucketName, objectName);
MemoryStream stream = new MemoryStream();
downloadRequest.Download(stream);
bytes = stream.ToArray();
Please check the sample code below:

using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;

// Create a local file to upload.
string localPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Test.csv");
File.WriteAllText(localPath, "test");

GoogleCredential credential = null;
BucketConnector bucketConnector = new BucketConnector();
credential = bucketConnector.ConnectStream();
var storageClient = StorageClient.Create(credential);

// In GCS, "folders" are just prefixes in the object name.
string folderPath = ConfigurationParameters.FOLDER_NAME_IN_BUCKET;
using (FileStream file = File.OpenRead(localPath))
{
    string objectName = folderPath + Path.GetFileName(localPath);
    storageClient.UploadObject(bucketName, objectName, null, file);
}
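For the download side of your question, a minimal sketch reusing the storageClient and bucketName above: pass the object name un-encoded, slash included, because in GCS the folder is just a prefix of the object name.

using (var stream = new MemoryStream())
{
    storageClient.DownloadObject(bucketName, "dec-2017/test.pdf", stream);
    byte[] bytes = stream.ToArray();
}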
I have written a C# application to archive data from a SQL Server table into an Azure blob. The archiving is configured by a JSON file and the values retrieved from the JSON file dictate what data to retrieve and archive.
The data needs to be stored in a blob name in this format
year/month/day/hour/older-than-[query-date]
Where query-date is the current date minus a number of days specified in the JSON file.
The issue I am having is how to incorporate compression into the process.
We would like to compress the data being archived to save space.
Currently the JSON settings mean that any data older than 30 days should be archived, but this results in about 3.7 million rows of data, so sometimes I get out-of-memory exceptions.
Regardless, how can I GZip-compress the data as it is written to the Azure blob? Here is the existing code.
using (SqlDataAdapter adr = new SqlDataAdapter(comm))
{
    adr.Fill(data);
    data.TableName = config.TargetTableName;
}

CloudStorageAccount storageAccount = CloudStorageAccount.Parse("blank");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
blobClient.DefaultRequestOptions.ParallelOperationThreadCount = 20;
blobClient.DefaultRequestOptions.MaximumExecutionTime = TimeSpan.FromMinutes(20);
blobClient.DefaultRequestOptions.ServerTimeout = TimeSpan.FromMinutes(20);
CloudBlobContainer container = blobClient.GetContainerReference(config.AzureContainerName);
StringBuilder jsonData = new StringBuilder();

CloudBlockBlob blob = container.GetBlockBlobReference($"{config.TargetTableName}/{DateTime.Now.Year}/{DateTime.Now.Month}/{DateTime.Now.Day}/{DateTime.Now.Hour}/Older-Than-{queryParameter.Value}.log");
using (var writeStream = blob.OpenWrite())
{
    using (var writer = new StreamWriter(writeStream))
    {
        data.WriteXml(writer, XmlWriteMode.WriteSchema);
    }
}
I suggest you write your data to a MemoryStream first, then compress the memory stream and write it to the Azure Blob service. The code below is for your reference.
CloudBlockBlob blob = container.GetBlockBlobReference($"{config.TargetTableName}/{DateTime.Now.Year}/{DateTime.Now.Month}/{DateTime.Now.Day}/{DateTime.Now.Hour}/Older-Than-{queryParameter.Value}.log");
using (var writeStream = blob.OpenWrite())
using (var memoryStream = new MemoryStream())
{
    // leaveOpen: true is required; otherwise disposing the writer would also
    // dispose the MemoryStream before it can be compressed.
    using (var writer = new StreamWriter(memoryStream, Encoding.UTF8, 1024, leaveOpen: true))
    {
        data.WriteXml(writer, XmlWriteMode.WriteSchema);
    }
    using (GZipStream compressionStream = new GZipStream(writeStream, CompressionMode.Compress))
    {
        memoryStream.Position = 0;
        memoryStream.CopyTo(compressionStream);
    }
}
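To verify the archive, a hypothetical round-trip check that streams the blob back through a GZip decompressor:

using (var readStream = blob.OpenRead())
using (var gzip = new GZipStream(readStream, CompressionMode.Decompress))
using (var reader = new StreamReader(gzip))
{
    string xml = reader.ReadToEnd(); // the DataTable XML written above
}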