Save App Service logs into Azure Blob Storage using block blobs - C#

How do I save our application log information to an Azure Storage blob container as .csv (block blob type)?
My application is developed in ASP.NET Core 6.0.

Please check whether using BlobStream (from the legacy storage client SDK) helps to resolve the issue:
_container.CreateIfNotExist();
// Open the input blob from the input URI and the output blob from the output URI
CloudBlob inputBlob = _container.GetBlobReference(inputBlobUri);
CloudBlob outputBlob = _container.GetBlobReference(outputBlobUri);
using (BlobStream input = inputBlob.OpenRead())
using (BlobStream output = outputBlob.OpenWrite())
{
    ProcessImage(input, output);
    output.Commit();
    // Use the proper MIME type for CSV content
    outputBlob.Properties.ContentType = "text/csv";
    outputBlob.SetProperties();
    AppRepository<Post> postRepository = new AppRepository<Post>();
    Post post = postRepository.Find(partitionKey, rowkey);
    post.PostImage = outputBlobUri;
    post.State = true;
    postRepository.Update(post);
    postRepository.SubmitChange();
    _queue.DeleteMessage(message);
}
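Since the question targets ASP.NET Core 6.0, where the legacy BlobStream API is no longer available, here is a minimal sketch with the current Azure.Storage.Blobs package; the connection string, container name, blob name, and the sample log row are placeholders, not values from the question:

using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Placeholder connection string and container name - replace with your own values
var containerClient = new BlobContainerClient("<storage-connection-string>", "app-logs");
await containerClient.CreateIfNotExistsAsync();

// Build CSV content from the application's log entries (hypothetical data)
var csv = new StringBuilder();
csv.AppendLine("Timestamp,Level,Message");
csv.AppendLine($"{DateTime.UtcNow:O},Information,Application started");

// UploadAsync on a BlobClient creates a block blob; set the CSV MIME type via the upload options
BlobClient blobClient = containerClient.GetBlobClient($"log-{DateTime.UtcNow:yyyyMMdd}.csv");
using var stream = new MemoryStream(Encoding.UTF8.GetBytes(csv.ToString()));
await blobClient.UploadAsync(stream, new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders { ContentType = "text/csv" }
});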

Related

MS Visual Studio 2022, .NET 6 in a Razor Pages project: is this possible?

I have a question: is it possible to change the target framework of a C# Razor Pages project from .NET Framework 4.8 to .NET 6.0?
In my project I want to copy files from the local disk to Azure Blob Storage, and currently I can only do that with framework 6.0 or higher. I can't switch the framework from 4.8 to 6.0 because that option is not offered to me in this project. What can I do?
I hope I am not the only one with this problem. Does anyone have any ideas to solve it?
Many greetings and thanks in advance.
Code sample; this will work only with .NET 6.0 or higher:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
//Fill in the actual storage account name and key in the connection string
var blobServiceClient = new BlobServiceClient("DefaultEndpointsProtocol=https;" +
"AccountName=.....;AccountKey=.....;EndpointSuffix=core.windows.net");
//Create a name for the container (container names must be lowercase)
string containerName = "myfiles";
//Create the container and return a container client object
BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
//Create a local file in the directory for uploading and downloading
string localPath = "c:\\DATA";
Directory.CreateDirectory(localPath);
string fileName = "myFile" + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
//Write text to the file
await File.WriteAllTextAsync(localFilePath, "Hello, World!");
//Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
//Upload data from the local file
await blobClient.UploadAsync(localFilePath, true);
Console.WriteLine("Listing blobs...");
//List all blobs in the container
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
{
Console.WriteLine("\t" + blobItem.Name);
}
//Download the blob to a local file and append the string "DOWNLOADED" for compare
string downloadFilePath = localFilePath.Replace(".txt", "DOWNLOADED.txt");
Console.WriteLine("\nDownloading blob to\n\t{0}\n", downloadFilePath);
//Download the blob's contents and save it to a file
await blobClient.DownloadToAsync(downloadFilePath);
Console.WriteLine("Done");

How to download a file to local from Aspose.Cloud?

I am trying to convert doc to pdf using Aspose.Words Cloud, but the conversion API uploads the converted file to the cloud and I can't find a clear way to download it locally via code.
This is the code provided by Aspose.Cloud:
private static void downloadToLocal(Configuration config, string path)
{
    StorageApi storageApi = new StorageApi(config);
    GetDownloadRequest request = new GetDownloadRequest();
    request.Path = path;
    request.Storage = null;
    request.VersionId = null;
    var response = storageApi.GetDownload(request);
}
but it is not clear which library StorageApi and GetDownloadRequest are part of.
It seems you are using an old SDK version of Aspose.Words Cloud. Please note that since the 19.4 release, the Aspose.Words Cloud API has its own methods for storage operations. Please use the latest version of the Aspose.Words Cloud SDK for .NET from NuGet to download a file from cloud storage to a local drive as follows.
P.S: I am a developer evangelist at aspose.cloud.
public static async Task DownloadStorageFile_Word()
{
    Aspose.Words.Cloud.Sdk.Configuration config = new Aspose.Words.Cloud.Sdk.Configuration();
    config.ClientId = ClientId;
    config.ClientSecret = ClientSecret;
    Aspose.Words.Cloud.Sdk.WordsApi wordsApi = new Aspose.Words.Cloud.Sdk.WordsApi(config);

    // Download a file from cloud storage
    var downloadRequest = new Aspose.Words.Cloud.Sdk.Model.Requests.DownloadFileRequest("Output.pdf", null, null);
    var actual = await wordsApi.DownloadFile(downloadRequest);

    // Save the returned stream to a local file
    using (var fileStream = System.IO.File.Create(@"C:\Temp\Output.pdf"))
    {
        actual.CopyTo(fileStream);
    }
}

How to copy all files from a directory to a different directory in the same Azure blob container - C#

I am a newbie to C#, .NET, and Azure development. I want to copy all files from a directory to a different directory in the same Azure blob container, so that I can work on the copies while the original files stay safe. But I have no idea how to do so.
I have tested this in my environment. You can use the code below to copy blobs from one directory to another within the same container in Azure Blob Storage:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var connectionString = "StorageAccountConnectionString";
string containerName = "ContainerName";
string directory1 = "SourceDirectory";
string directory2 = "TargetDirectory";

var serviceClient = new BlobServiceClient(connectionString);
var containerClient = serviceClient.GetBlobContainerClient(containerName);
foreach (BlobItem blob in containerClient.GetBlobs())
{
    // Blob "directories" are just name prefixes, so match blobs under the source prefix
    if (blob.Name.StartsWith(directory1))
    {
        var sourceBlob = containerClient.GetBlobClient(blob.Name);
        var newBlob = containerClient.GetBlobClient(blob.Name.Replace(directory1, directory2));
        // Start a server-side copy to the new name under the target prefix
        newBlob.StartCopyFromUri(sourceBlob.Uri);
    }
}
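One caveat: StartCopyFromUri only starts a server-side copy. Within the same storage account it usually completes almost immediately, but if you need to be sure the copy has finished before working with the target blobs, you can use the async variant and poll the returned operation. A minimal sketch (the helper method and its parameters are illustrative, not from the original answer):

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Copies one blob and waits until the server-side copy has completed
static async Task CopyAndWaitAsync(BlobContainerClient container, string sourceName, string targetName)
{
    BlobClient source = container.GetBlobClient(sourceName);
    BlobClient target = container.GetBlobClient(targetName);

    // StartCopyFromUriAsync returns an operation that can be polled to completion
    CopyFromUriOperation operation = await target.StartCopyFromUriAsync(source.Uri);
    await operation.WaitForCompletionAsync();
}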

Uploading File to Azure Blob Storage results in size 0

I am able to successfully create a container and upload files to it, but the uploaded files (and any download of them) always have size zero. Here is my code, using C# in a .NET Core application.
// Get a reference to a container
BlobContainerClient container = new BlobContainerClient(connectionString: connectionString, blobContainerName: "test-container");
await container.CreateIfNotExistsAsync();
try
{
    // Get a reference to a blob
    BlobClient blob = container.GetBlobClient(uploadFiles.First().FileName);
    string filePath = Path.GetFullPath(uploadFiles.First().FileName);
    // Upload file data
    await blob.UploadAsync(filePath, true);
}
catch (Exception e)
{
    // exception is swallowed here; log it in real code
    _ = e;
}
A screenshot from the Azure portal storage account shows that the size there is ZERO?!
I am following Microsoft's documentation, and they also do not set any size explicitly when uploading a file.
The following code can work:
// The using declaration disposes the stream; no explicit Close() is needed
using FileStream uploadFileStream = File.OpenRead(filePath);
await blob.UploadAsync(uploadFileStream, true);
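Note that if uploadFiles holds IFormFile instances from a form post, the uploaded content only exists in the request, not at a path on the server's disk, which would explain the zero-byte blobs: Path.GetFullPath resolves the client's file name against the server's working directory, where the file is missing or empty. A sketch reading directly from the upload stream instead (uploadFiles and container are assumed to be the collection and container client from the question):

// Read the uploaded content straight from the request instead of from a server-side path
IFormFile uploadedFile = uploadFiles.First();
BlobClient blob = container.GetBlobClient(uploadedFile.FileName);
using Stream uploadStream = uploadedFile.OpenReadStream();
await blob.UploadAsync(uploadStream, overwrite: true);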

Upload a zip file in small chunks to Azure cloud blob storage

I want to upload a zip file in small chunks (less than 5 MB) to blob containers in Microsoft Azure Storage. I have already configured a 4 MB chunk limit in BlobRequestOptions, but when I run my code and check the memory usage in Azure Cloud, it is not uploading in chunks. I am using C# .NET Core. Because I want to zip files that are already located in Azure, I first download the individual files to a stream, add the stream to a zip archive, and then upload the zip back to the cloud. The following is my code:
if (CloudStorageAccount.TryParse(_Appsettings.GetSection("StorConf").GetSection("StorageConnection").Value, out CloudStorageAccount storageAccount))
{
    CloudBlobClient BlobClient = storageAccount.CreateCloudBlobClient();
    TimeSpan backOffPeriod = TimeSpan.FromSeconds(2);
    int retryCount = 1;
    BlobRequestOptions bro = new BlobRequestOptions()
    {
        SingleBlobUploadThresholdInBytes = 4096 * 1024, // 4MB
        ParallelOperationThreadCount = 1,
        RetryPolicy = new ExponentialRetry(backOffPeriod, retryCount),
        // new
        ServerTimeout = TimeSpan.MaxValue,
        MaximumExecutionTime = TimeSpan.FromHours(3),
        //EncryptionPolicy = policy
    };
    // set blob request options for the created blob client
    BlobClient.DefaultRequestOptions = bro;
    // using the specified container, whose name comes via a transaction id
    CloudBlobContainer container = BlobClient.GetContainerReference(transactionId);
    using (var zipArchiveMemoryStream = new MemoryStream())
    {
        using (var zipArchive = new ZipArchive(zipArchiveMemoryStream, ZipArchiveMode.Create, true)) // new
        {
            foreach (FilesListModel FileName in filesList)
            {
                if (await container.ExistsAsync())
                {
                    CloudBlob file = container.GetBlobReference(FileName.FileName);
                    if (await file.ExistsAsync())
                    {
                        // zip: get stream and add zip entry
                        var entry = zipArchive.CreateEntry(FileName.FileName, CompressionLevel.Fastest);
                        // approach 1
                        using (var entryStream = entry.Open())
                        {
                            await file.DownloadToStreamAsync(entryStream, null, bro, null);
                            await entryStream.FlushAsync();
                        }
                    }
                    else
                    {
                        downlReady = "false";
                    }
                }
                else
                {
                    // case: container does not exist
                    //return BadRequest("Container does not exist");
                }
            }
        }
        if (downlReady == "true")
        {
            string zipFileName = "sample.zip";
            CloudBlockBlob zipBlockBlob = container.GetBlockBlobReference(zipFileName);
            zipArchiveMemoryStream.Position = 0;
            //zipArchiveMemoryStream.Seek(0, SeekOrigin.Begin);
            // new
            zipBlockBlob.Properties.ContentType = "application/x-zip-compressed";
            await zipArchiveMemoryStream.FlushAsync();
            await zipBlockBlob.UploadFromStreamAsync(zipArchiveMemoryStream, zipArchiveMemoryStream.Length, null, bro, null);
        }
    }
}
The following is a snapshot of the memory usage (see private_memory) in the Azure Kudu process explorer:
[screenshot: memory usage]
Any suggestions would be really helpful. Thank you.
UPDATE 1:
To make it clearer: I have files which are already located in Azure Blob Storage. Now I want to read them from the container and create a ZIP that contains all of them. The major challenge is that my code is obviously loading every file into memory to create the zip. Is it possible, and if so how, to read files from a container and write the ZIP back into the same container in pieces, so that my Azure web app does NOT need to load the whole files into memory? Ideally I would read the files in pieces and start writing the zip at the same time, so that the app consumes less memory.
I have found the solution by referring to this Stack Overflow article:
How can I dynamically add files to a zip archive stored in Azure blob storage?
The way to do it is to write to the zip output stream while simultaneously reading/downloading the input files.
Below is my code snippet:
using (var zipStream = await zipBlockBlob.OpenWriteAsync(null, bro, null))
using (var zipArchive = new ZipArchive(zipStream, ZipArchiveMode.Create))
{
    foreach (FilesListModel FileName in filesList)
    {
        if (await container.ExistsAsync())
        {
            CloudBlob file = container.GetBlobReference(FileName.FileName);
            if (await file.ExistsAsync())
            {
                // zip: create an entry and stream the blob's content straight into it
                var entry = zipArchive.CreateEntry(FileName.FileName, CompressionLevel.Fastest);
                using (var entryStream = entry.Open())
                {
                    await file.DownloadToStreamAsync(entryStream, null, bro, null);
                }
            }
        }
    }
    // Disposing the ZipArchive writes the central directory and then closes the
    // underlying blob write stream, which commits the upload; do not close the
    // stream manually before that, or the archive will be left unfinished
}
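For what it's worth, the WindowsAzure.Storage package used above is deprecated; the same streaming approach works with the current Azure.Storage.Blobs SDK, where the chunk size of the upload can be set explicitly. A minimal sketch (the connection string, container and file names, and the 4 MB buffer are assumptions for illustration):

using System.IO.Compression;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;

// Hypothetical container client and file list
var container = new BlobContainerClient("<connection-string>", "files");
string[] fileNames = { "a.txt", "b.txt" };

BlockBlobClient zipBlob = container.GetBlockBlobClient("sample.zip");

// OpenWriteAsync uploads staged blocks as the stream fills; BufferSize controls the chunk size
var options = new BlockBlobOpenWriteOptions { BufferSize = 4 * 1024 * 1024 };
using (Stream zipStream = await zipBlob.OpenWriteAsync(overwrite: true, options))
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create))
{
    foreach (string name in fileNames)
    {
        ZipArchiveEntry entry = archive.CreateEntry(name, CompressionLevel.Fastest);
        using Stream entryStream = entry.Open();
        // Stream each source blob directly into its zip entry
        await container.GetBlobClient(name).DownloadToAsync(entryStream);
    }
}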
