I am trying to upload files to Azure Data Box with C# and I am running into problems. Here is my code:
private void uploadBlob()
{
    string storageconn = ConfigurationManager.AppSettings.Get("StorageConnectionString");
    BlobServiceClient blobServiceClient = new BlobServiceClient(storageconn);
    BlobContainerClient container = blobServiceClient.GetBlobContainerClient("democontainer");
    Guid guid = Guid.NewGuid();
    string filename = guid.ToString();
    BlobClient blobClient = container.GetBlobClient(filename);
    byte[] memoryArray = new byte[3252224];
    var strm = new MemoryStream(memoryArray);
    blobClient.Upload(strm);
    strm.Close();
}
When the blobClient.Upload method executes, I get the following error:
The value for one of the HTTP headers is not in the correct format
Does anyone have any ideas on what the problem is?
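For context, one workaround sometimes suggested for this particular error (a hedged sketch, not verified against an actual Data Box device) is that the Data Box blob endpoint may only accept older REST API versions, so pinning the client to an earlier service version may avoid the rejected header:

```csharp
using Azure.Storage.Blobs;

// Sketch: pin the client to an older service version. The assumption here is
// that the Data Box endpoint rejects a header sent by newer API versions.
string storageconn = ConfigurationManager.AppSettings.Get("StorageConnectionString");
var options = new BlobClientOptions(BlobClientOptions.ServiceVersion.V2019_02_02);
var blobServiceClient = new BlobServiceClient(storageconn, options);
```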
Context: encrypting and decrypting an audio file (.wav) in Azure Storage.
Issue: the input stream must be seekable when encrypting with "await pgp.EncryptStreamAsync(sourceStream, outputStream);"
I'm not a C# developer :)
Thank you for your help. Here is the code I'm using:
static async Task Main()
{
    string connectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");
    // Create a unique name for the container
    string containerName = "audioinput";
    string filename = "abc.wav";
    // Create a BlobServiceClient object which will be used to create a container client
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient sourcecontainer = blobServiceClient.GetBlobContainerClient(containerName);
    BlobClient blobClient = sourcecontainer.GetBlobClient(filename);
    if (sourcecontainer.Exists())
    {
        var sourceStream = new MemoryStream();
        // Download blob to MemoryStream
        await blobClient.DownloadToAsync(sourceStream);
        sourceStream.Position = 0;
        // Output stream
        await using var outputStream = new MemoryStream();
        // Get encryption keys
        EncryptionKeys encryptionKeys;
        using (Stream publicKeyStream = new FileStream(@"...\public.asc", FileMode.Open))
            encryptionKeys = new EncryptionKeys(publicKeyStream);
        PGP pgp = new PGP(encryptionKeys);
        await pgp.EncryptStreamAsync(sourceStream, outputStream);
    }
    else
    {
        Console.WriteLine("container doesn't exist");
    }
}
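Regarding the seekability issue itself: PgpCore needs an input it can seek, and the code above already achieves that by downloading into a MemoryStream. As a general sketch (the helper name is mine, not part of any library), any non-seekable stream can be made seekable by buffering it first:

```csharp
using System.IO;
using System.Threading.Tasks;

// Hypothetical helper: copy an arbitrary stream into a seekable MemoryStream.
static async Task<MemoryStream> ToSeekableAsync(Stream source)
{
    var buffered = new MemoryStream();
    await source.CopyToAsync(buffered); // works even when source.CanSeek is false
    buffered.Position = 0;              // rewind so the consumer reads from the start
    return buffered;
}
```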
I am trying to upload a file that I have stored in a MemoryStream using the following code.
private static void SaveStream(MemoryStream stream, string fileName)
{
    var blobStorageService = new BlobStorageService();
    blobStorageService.UploadBlob(stream, fileName);
}
public void UploadBlob(MemoryStream fileStream, string fileName)
{
    var blobContainer = _blobServiceClient.GetBlobContainerClient(Environment
        .GetEnvironmentVariable("ContainerName"));
    var blobClient = blobContainer.GetBlobClient(fileName);
    blobClient.Upload(fileStream); // <--- Error Message
}
Error message: System.ArgumentException: 'content.Position must be less than content.Length. Please set content.Position to the start of the data to upload.'
This happens because the current position is at the end of the stream. Set the position back to the start of the stream before uploading:
var blobClient = blobContainer.GetBlobClient(fileName);
fileStream.Position = 0;
blobClient.Upload(fileStream);
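The stream state involved can be illustrated with a plain MemoryStream, independent of Azure:

```csharp
using System;
using System.IO;

var ms = new MemoryStream();
ms.Write(new byte[] { 1, 2, 3 }, 0, 3);
// After writing, Position equals Length, so an upload reading from the
// current position would see zero bytes; the client guards against this
// with the ArgumentException quoted above.
Console.WriteLine(ms.Position == ms.Length); // True
ms.Position = 0; // rewind before handing the stream to Upload
```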
I am using the following code to upload an XML file to an Azure Blob Storage account using the DotNetZip NuGet package.
XmlDocument doc = new XmlDocument();
doc.Load(path);
string xmlContent = doc.InnerXml;
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
var cloudBlobClient = storageAccount.CreateCloudBlobClient();
var cloudBlobContainer = cloudBlobClient.GetContainerReference(container);
cloudBlobContainer.CreateIfNotExists();
using (var fs = File.Create("test.zip"))
{
    using (var s = new ZipOutputStream(fs))
    {
        s.PutNextEntry("entry1.xml");
        byte[] buffer = Encoding.ASCII.GetBytes(xmlContent);
        s.Write(buffer, 0, buffer.Length);
        fs.Position = 0;
        // Get the blob ref
        var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
        blob.Properties.ContentEncoding = "zip";
        blob.Properties.ContentType = "text/plain";
        blob.Metadata["filename"] = "test.zip";
        blob.UploadFromStream(fs);
    }
}
This code creates a zip file in my container. But when I download it and try to open it, I get the following error:
"Windows cannot open the folder. The compressed (zipped) folder is invalid." But the saved zip file in my application directory can be unzipped fine and contains my XML file.
What am I doing wrong?
I am able to reproduce the problem you're having. Essentially, the issue is that the content has not been completely written to the zip file when you initiate the upload. In my test, the zip file on local disk was 902 bytes, but at the time of uploading the size of the file stream was just 40 bytes, and that's what causes the problem.
What I did was split the logic into two parts: the first creates the file, and the second reads the file back and uploads it to storage. Here's the code I used:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var cloudBlobClient = storageAccount.CreateCloudBlobClient();
var cloudBlobContainer = cloudBlobClient.GetContainerReference("test");
cloudBlobContainer.CreateIfNotExists();
using (var fs = File.Create("test.zip"))
{
    using (var s = new ZipOutputStream(fs))
    {
        s.PutNextEntry("entry1.xml");
        byte[] buffer = File.ReadAllBytes(@"Path\To\MyFile.txt");
        s.Write(buffer, 0, buffer.Length);
    }
}
using (var fs = File.OpenRead("test.zip"))
{
    // Get the blob ref
    var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
    blob.Properties.ContentEncoding = "zip";
    blob.Properties.ContentType = "text/plain";
    blob.Metadata["filename"] = "test.zip";
    fs.Position = 0;
    blob.UploadFromStream(fs);
}
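If writing a temporary file to disk is undesirable, the same fix can be done in memory. This sketch assumes DotNetZip's ZipOutputStream(Stream, bool leaveOpen) constructor, which leaves the underlying stream open when the writer is disposed:

```csharp
using System.IO;
using Ionic.Zip;

var zipStream = new MemoryStream();
// 'true' leaves zipStream open after the ZipOutputStream is disposed,
// so it can be rewound and uploaded afterwards.
using (var s = new ZipOutputStream(zipStream, true))
{
    s.PutNextEntry("entry1.xml");
    byte[] buffer = File.ReadAllBytes(@"Path\To\MyFile.txt");
    s.Write(buffer, 0, buffer.Length);
} // disposing flushes the entry and writes the zip central directory

var blob = cloudBlobContainer.GetBlockBlobReference("test.zip");
zipStream.Position = 0; // rewind before uploading
blob.UploadFromStream(zipStream);
```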
I created a bucket on Google Cloud Storage and programmatically uploaded some files to it. When I try to download them, I get this exception:
The specified data could not be decrypted
Here is the code I wrote:
GoogleCredential credential = null;
var jsonFileBytes = Properties.Resources.stitcherautoupdate_55bd51f48cf0;
var jsonFileString = Encoding.UTF8.GetString(jsonFileBytes, 0, jsonFileBytes.Length);
var json = Newtonsoft.Json.JsonConvert.DeserializeObject<System.Object>(jsonFileString);
var jsonString = json.ToString();
credential = GoogleCredential.FromJson(jsonString);
StorageClient = StorageClient.Create(credential);
StorageClient.DownloadObject(bucketName, fileName, fileStream);
My recommendation is to try following the upload and download methods shown in the documentation. Once you get those working, you can slowly change the code so you know which part is causing the issue.
The documentation describes how to configure your Cloud Storage client library and set up authentication.
A sample code for uploading an object:
private void UploadFile(string bucketName, string localPath,
    string objectName = null)
{
    var storage = StorageClient.Create();
    using (var f = File.OpenRead(localPath))
    {
        objectName = objectName ?? Path.GetFileName(localPath);
        storage.UploadObject(bucketName, objectName, null, f);
        Console.WriteLine($"Uploaded {objectName}.");
    }
}
For downloading an object:
private void DownloadObject(string bucketName, string objectName,
    string localPath = null)
{
    var storage = StorageClient.Create();
    localPath = localPath ?? Path.GetFileName(objectName);
    using (var outputFile = File.OpenWrite(localPath))
    {
        storage.DownloadObject(bucketName, objectName, outputFile);
    }
    Console.WriteLine($"downloaded {objectName} to {localPath}.");
}
I am uploading an Excel file to an Azure Storage container. When the file has been uploaded and I try to download it back from the portal and open it, the open fails because the format of the file and the extension do not match. Also, there is no value in the size column for the file. I cannot spot the error. The code is in ASP.NET Core 3.1 with C#.
Here is my code
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connectionString); // to the azure account
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName); // container in which the file will be uploaded
blob = cloudBlobContainer.GetBlockBlobReference(f); // f is file name
await blob.UploadFromStreamAsync(s); // this is a memory stream
blob.Properties.ContentType = fileType;
According to my test, we can use the following code to upload an Excel file to Azure Blob Storage.
Install the SDK:
Install-Package Microsoft.Azure.Storage.Blob -Version 11.1.1
Install-Package Microsoft.AspNetCore.StaticFiles -Version 2.2.0
My Excel file (.xlsx)
Code
static async Task Main(string[] args)
{
    var filepath = @"D:\test.xlsx";
    var storageAccount = CloudStorageAccount.Parse("<connection string>");
    var cloudBlobClient = storageAccount.CreateCloudBlobClient();
    var cloudBlobContainer = cloudBlobClient.GetContainerReference("test1");
    var blob = cloudBlobContainer.GetBlockBlobReference(Path.GetFileName(filepath));
    blob.Properties.ContentType = Get(Path.GetFileName(filepath));
    using (var stream = File.OpenRead(filepath))
    {
        await blob.UploadFromStreamAsync(stream);
    }
    // download file
    filepath = @"D:\test\" + blob.Name;
    using (var stream = File.OpenWrite(filepath))
    {
        await blob.DownloadToStreamAsync(stream);
    }
}
// get the file content type
static string Get(string fileName)
{
    var provider = new FileExtensionContentTypeProvider();
    string contentType;
    if (!provider.TryGetContentType(fileName, out contentType))
    {
        contentType = "application/octet-stream";
    }
    return contentType;
}