PGP encrypt/decrypt in Azure Blob Storage - c#

I am in the process of moving applications to Azure using Azure Function Apps and Blob Storage. I'm trying to figure out a way to do encryption/decryption in Azure. When doing the same in our on-premises environment I used the PgpCore package, but that was with local files I could point to directly. I'm struggling to figure out how to do that with Azure Blob Storage. All the examples seem to use local files.
I see paid-for products like DidiSoft, but I'm trying to stick with a free option if possible.
Currently I'm working on the decrypt side of things, and the concept is that I will have a PGP file delivered to my Blob Storage. I have an Azure Function with a blob trigger looking for *.pgp files, and that will call another Function App specifically for PGP work. The idea is that the PGP Function App will, in this case, decrypt the file right back into the same blob storage, but this time as a .txt file. I then have a blob trigger looking for .txt files that runs the code needed to process that file. The .txt trigger is already done and working. I already have my PGP function app calling Key Vault to get any public/private keys needed, but what I'm not finding is a good way to do the actual encrypt/decrypt in Azure.
Anyone have any suggestions or examples to go look at?

We follow a similar process: download a PGP file (SFTP in our case) and load it to Blob Storage. Then we call an Azure Function that uses PgpCore to Decrypt the PGP files in Blob Storage.
Here is the Decrypt code in the Azure Function:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using PgpCore;
using Azure.Storage.Blobs;

namespace AF_Services
{
    public static class PGPDecryptBlob
    {
        [FunctionName("PGPDecryptBlob")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation($"C# HTTP trigger function {nameof(PGPDecryptBlob)} processed a request.");

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);

            try
            {
                // Create TMP directory
                Directory.CreateDirectory(IOHelper.TempAppDirectoryName);

                string sourceContainerName = data?.sourceContainerName;
                log.LogInformation($"Source container name: {sourceContainerName}");
                string sourceBlobName = data?.sourceBlobName;
                log.LogInformation($"Source blob name: {sourceBlobName}");
                string targetContainerName = data?.targetContainerName;
                log.LogInformation($"Target container name: {targetContainerName}");
                string targetBlobName = data?.targetBlobName;
                log.LogInformation($"Target blob name: {targetBlobName}");
                if (sourceContainerName == null)
                    throw new ArgumentNullException("Required parameter sourceContainerName not provided");
                if (sourceBlobName == null)
                    throw new ArgumentNullException("Required parameter sourceBlobName not provided");
                if (targetContainerName == null)
                    throw new ArgumentNullException("Required parameter targetContainerName not provided");
                if (targetBlobName == null)
                    throw new ArgumentNullException("Required parameter targetBlobName not provided");
                string keyBlobName = SettingsHelper.Get(Settings.PrivateKeyBlobName);
                string passPhrase = SettingsHelper.Get(Settings.PassPhrase);
                string blobAccountConnStr = SettingsHelper.Get(Settings.BlobConnectionString);

                // TMP file names
                var temp_sourceFileName = IOHelper.BuildTempFileName(BlobHelper.StripPath(sourceBlobName));
                var temp_targetFileName = IOHelper.BuildTempFileName(BlobHelper.StripPath(targetBlobName));
                var temp_keyFileName = IOHelper.BuildTempFileName(BlobHelper.StripPath(keyBlobName));

                // download Blob to TMP
                using (var sourceStream = new FileStream(temp_sourceFileName, FileMode.Create))
                {
                    var sourceBlobClient = new BlobClient(blobAccountConnStr, sourceContainerName, sourceBlobName);
                    await sourceBlobClient.DownloadToAsync(sourceStream);
                }

                // download key to TMP
                using (var keyStream = new FileStream(temp_keyFileName, FileMode.Create))
                {
                    var keyBlobClient = new BlobClient(blobAccountConnStr, sourceContainerName, keyBlobName);
                    await keyBlobClient.DownloadToAsync(keyStream);
                }

                // Decrypt
                using (var pgp = new PGP())
                {
                    using (FileStream inputFileStream = new FileStream(temp_sourceFileName, FileMode.Open))
                    {
                        using (Stream outputFileStream = File.Create(temp_targetFileName))
                        {
                            using (Stream privateKeyStream = new FileStream(temp_keyFileName, FileMode.Open))
                            {
                                await pgp.DecryptStreamAsync(inputFileStream, outputFileStream, privateKeyStream, passPhrase);
                            }
                        }
                    }
                }

                // write to target blob
                using (var decryptStream = new FileStream(temp_targetFileName, FileMode.Open))
                {
                    var targetBlobClient = new BlobClient(blobAccountConnStr, targetContainerName, targetBlobName);
                    await targetBlobClient.UploadAsync(decryptStream, true);
                    return new OkObjectResult(new { Status = "Decrypted", BlobInfo = targetBlobClient });
                }
            }
            catch (Exception ex)
            {
                return new BadRequestObjectResult(new { RequestBody = data, Exception = ex });
            }
            finally
            {
                Directory.Delete(IOHelper.TempAppDirectoryName, true);
            }
        }
    }
}
The Encrypt code is similar, just in the opposite direction.
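For reference, a minimal sketch of the encrypt direction (not the poster's actual code, just the same temp-file pattern with PgpCore's EncryptStreamAsync and a public key; the PublicKeyBlobName setting name is an assumption mirroring PrivateKeyBlobName):
// Sketch only: same download-to-temp / process / upload-to-blob pattern, but encrypting.
string publicKeyBlobName = SettingsHelper.Get(Settings.PublicKeyBlobName); // assumed setting name

using (var pgp = new PGP())
using (FileStream inputFileStream = new FileStream(temp_sourceFileName, FileMode.Open))
using (Stream outputFileStream = File.Create(temp_targetFileName))
using (Stream publicKeyStream = new FileStream(temp_keyFileName, FileMode.Open))
{
    // Encrypt the plaintext stream with the recipient's public key.
    await pgp.EncryptStreamAsync(inputFileStream, outputFileStream, publicKeyStream);
}
// The encrypted temp file is then uploaded to the target blob exactly as in the decrypt function.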
UPDATE:
The sourceContainer, sourceBlobName, targetContainer, and targetBlobName are passed in the JSON body of the Azure Function call.
The PrivateKeyBlobName, PassPhrase, and BlobConnectionString are stored in the Azure Function App's Settings.
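SettingsHelper isn't shown in the answer; since Function App settings surface as environment variables at runtime, a minimal stand-in (my assumption, not the poster's code) could be as simple as:
public enum Settings
{
    PrivateKeyBlobName,
    PassPhrase,
    BlobConnectionString
}

public static class SettingsHelper
{
    // App settings are exposed to the Function host as environment variables.
    public static string Get(Settings setting)
    {
        return Environment.GetEnvironmentVariable(setting.ToString());
    }
}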
The PGP work has to happen locally, so the IOHelper class (code below) is for working with the Azure Function's local file system.
public static class IOHelper
{
    private static string _TempAppDirectoryName;

    public static string TempAppDirectoryName
    {
        get
        {
            if (String.IsNullOrWhiteSpace(_TempAppDirectoryName))
            {
                var ts = DateTime.UtcNow.Ticks.ToString();
                _TempAppDirectoryName = Path.Combine(Path.GetTempPath(), "{YourLocalDirectoryName}", ts);
            }
            return _TempAppDirectoryName;
        }
    }

    public static string BuildTempFileName(string fileName)
    {
        return Path.Combine(TempAppDirectoryName, fileName);
    }
}
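BlobHelper.StripPath isn't shown either; presumably it just drops any virtual-directory prefix from the blob name so the temp file name is valid. A stand-in (my assumption) might be:
public static class BlobHelper
{
    // "incoming/files/data.pgp" -> "data.pgp"
    public static string StripPath(string blobName)
    {
        return blobName.Substring(blobName.LastIndexOf('/') + 1);
    }
}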

Related

Can I get a specific Block from an Azure Block Blob?

I want to get a specific block from an Azure block blob using its block ID. Is this even possible?
Something like this:
var blockBlob = new BlockBlobClient(connectionString, containerName, blobName);
var blocklist = await GetBlobBlockList(blobName, cancellationToken);
var firstBlock = blocklist.First();
var memStream = new MemoryStream();
await blockBlob.DownloadStreamingAsync(memStream, firstBlock.Name);
I want to get a specific block from an azure block blob with the blockId, is this even possible?
It should be possible to do so; however, it won't be as simple as your sample code suggests.
Here's what you would need to do:
Fetch the list of blocks. Each element in the list will have a block ID and the size of the block.
Assuming you want to get the data for block n, iterate over the list from block 0 to block n - 1 and add up the size of each block.
Next, call DownloadRangeToStreamAsync(Stream, Nullable<Int64>, Nullable<Int64>), where the offset value is the sum of the block sizes calculated in step 2 and the length value is the size of the block you wish to download. In the v12 SDK the equivalent is a ranged download, as shown in the sketch below.
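For reference, here is a minimal sketch of the same idea with the current Azure.Storage.Blobs (v12) SDK, which is what the question's sample uses. The property and option names (SizeLong, BlobDownloadOptions.Range) are from recent SDK versions and may differ slightly in older ones (older versions expose an int Size instead of SizeLong):
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Azure;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;

public static class BlockRangeDownloadSketch
{
    public static async Task<MemoryStream> DownloadBlockAsync(
        string connectionString, string containerName, string blobName, int n)
    {
        var blockBlob = new BlockBlobClient(connectionString, containerName, blobName);

        // 1. Fetch the committed block list (each entry has a block id and a size).
        BlockList blockList = await blockBlob.GetBlockListAsync(BlockListType.Committed);
        var blocks = blockList.CommittedBlocks.ToList();

        // 2. The offset of block n is the sum of the sizes of blocks 0..n-1.
        long offset = blocks.Take(n).Sum(b => b.SizeLong);
        long length = blocks[n].SizeLong;

        // 3. Download only that byte range.
        var response = await blockBlob.DownloadStreamingAsync(new BlobDownloadOptions
        {
            Range = new HttpRange(offset, length)
        });

        var memStream = new MemoryStream();
        await response.Value.Content.CopyToAsync(memStream);
        memStream.Position = 0;
        return memStream;
    }
}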
First, create a container under the storage account and upload a block blob to it; the block blob's details can be seen under its properties in the portal (screenshots omitted).
To get the block ID you can use the following code:
using System;
using System.Linq;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;

class Program
{
    static async Task Main(string[] args)
    {
        string connectionString = "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net";
        string containerName = "***";
        string blobName = "***.txt";

        BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
        // The block list API lives on BlockBlobClient, not BlobClient.
        BlockBlobClient blockBlobClient = containerClient.GetBlockBlobClient(blobName);

        BlockList blockList = await blockBlobClient.GetBlockListAsync(BlockListType.All);
        string firstBlockId = blockList.CommittedBlocks.First().Name;
        Console.WriteLine("First block ID: " + firstBlockId);
    }
}
After getting the block ID, you can get the data of that specific block using the code below; as described in the first answer, the block itself has to be fetched as a byte range computed from the block list:
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;

class Program
{
    static async Task Main(string[] args)
    {
        string connectionString = "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net";
        string containerName = "mycontainer---";
        string blobName = "myblob**";
        string blockId = "1234567**";

        BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
        BlockBlobClient blockBlobClient = containerClient.GetBlockBlobClient(blobName);

        // There is no API to download a block directly by its ID, so compute the
        // block's byte range from the committed block list and download that range.
        BlockList blockList = await blockBlobClient.GetBlockListAsync(BlockListType.Committed);
        var blocks = blockList.CommittedBlocks.ToList();
        int index = blocks.FindIndex(b => b.Name == blockId);
        long offset = blocks.Take(index).Sum(b => b.SizeLong);
        long length = blocks[index].SizeLong;

        var response = await blockBlobClient.DownloadStreamingAsync(new BlobDownloadOptions
        {
            Range = new HttpRange(offset, length)
        });

        MemoryStream blockData = new MemoryStream();
        await response.Value.Content.CopyToAsync(blockData);

        Console.WriteLine("Block data:");
        Console.WriteLine(Convert.ToBase64String(blockData.ToArray()));
    }
}
In the above code you need to replace the connection string, container name, blob name, and block ID with your own values.
By following the above procedure, I was able to retrieve the block successfully.

Upload files to blob storage via Azure Front Door

The app is set up on multiple on-premises services and regularly uploads some files to Azure Blob Storage located in East US. Now it's necessary to place an instance of the app in the Australian region, and as a result, upload time to the cloud has increased drastically.
I tested whether Azure Front Door could help and found that downloads from Blob Storage are about 5 times faster when I use the Azure Front Door link. Now I'm struggling to change the C# code to upload files via Azure Front Door. I tried using the suffix "azurefd.net" instead of "core.windows.net" in the connection string, but that does not help. Could somebody give me a hint on how to upload files to Azure Blob Storage via Azure Front Door in C#?
As the storage connection string only works with the storage endpoint (core.windows.net), we cannot use the Front Door endpoint (azurefd.net) in the connection string.
I integrated an Azure storage account with Front Door and I am able to access the blob files in the storage account through the Front Door URL.
We cannot upload files to Blob Storage via Azure Front Door using a connection string in C#, because the storage SDK only accepts connection strings that point to the storage endpoint.
Unfortunately, for upload, Azure Front Door does not provide any benefit. I used PUT requests for the test, as described here: https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob
PUT https://<entityName>.azurefd.net/<containerName>/<blobName>?<sharedAccessSignature>
x-ms-version: 2020-10-02
x-ms-blob-type: BlockBlob
< C:\Downloads\t1.txt
and compared times for the storage account endpoint and the Azure Front Door endpoint. There is no difference in upload speed.
The code I used for the test:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

namespace SandboxV2
{
    class Program
    {
        static async Task Main()
        {
            string frontDoorUrl = "https://<FRONT-DOOR>.azurefd.net";
            string storageUrl = "https://{STORAGE-ACCOUNT}.blob.core.windows.net";
            string sasString = "...";

            Console.Write("File Path: ");
            string filePath = Console.ReadLine();

            await RunUploadTestAsync(filePath, frontDoorUrl, sasString, "-fd");
            await RunUploadTestAsync(filePath, storageUrl, sasString, "-storage");
        }

        private static async Task RunUploadTestAsync(string filePath, string rootUrl, string sasString, string suffix)
        {
            string blobName = Path.GetFileNameWithoutExtension(filePath) + suffix + Path.GetExtension(filePath);
            Console.WriteLine(rootUrl);
            string containerName = "testaccess";

            var speeds = new List<double>();
            var times = new List<TimeSpan>();
            for (int i = 0; i < 5; i++)
            {
                var t1 = DateTime.UtcNow;
                var statusCode = await UploadAsSingleBlock(filePath, rootUrl, blobName, containerName, sasString);
                var t2 = DateTime.UtcNow;
                var time = t2 - t1;
                var speed = new FileInfo(filePath).Length / time.TotalSeconds / 1024 / 1024 * 8;
                speeds.Add(speed);
                times.Add(time);
                Console.WriteLine($"Code: {statusCode}. Time: {time}. Speed: {speed}");
            }
            Console.WriteLine($"Average time: {TimeSpan.FromTicks((long)times.Select(t => t.Ticks).Average())}. Average speed: {speeds.Average()}.");
        }

        private static async Task<HttpStatusCode> UploadAsSingleBlock(string filePath, string rootUrl, string blobName, string containerName, string sasString)
        {
            var request = new HttpRequestMessage(HttpMethod.Put, $"{rootUrl}/{containerName}/{blobName}?{sasString}");
            request.Headers.Add("x-ms-version", "2020-10-02");
            request.Headers.Add("x-ms-blob-type", "BlockBlob");

            HttpResponseMessage response;
            using (var fileStream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read))
            {
                request.Content = new StreamContent(fileStream);
                using (var client = new HttpClient())
                    response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
            }
            return response.StatusCode;
        }
    }
}
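For completeness, if you prefer the SDK to raw HTTP, a BlobClient can be constructed from a full blob URI plus a SAS token instead of a connection string, so the request goes to the Front Door host. This is only a sketch under the assumption that the SAS grants create/write permissions; as noted above, it will not make the upload any faster:
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class FrontDoorUploadSketch
{
    static async Task Main()
    {
        string frontDoorUrl = "https://<FRONT-DOOR>.azurefd.net";
        string containerName = "testaccess";
        string blobName = "t1-fd.txt";
        string sasString = "..."; // SAS query string, without the leading '?'

        // The SAS-authenticated blob URI can use the Front Door host instead of core.windows.net.
        var blobUri = new Uri($"{frontDoorUrl}/{containerName}/{blobName}?{sasString}");
        var blobClient = new BlobClient(blobUri);

        using (var fileStream = File.OpenRead(@"C:\Downloads\t1.txt"))
        {
            await blobClient.UploadAsync(fileStream, overwrite: true);
        }
    }
}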

Azure blob file got corrupted post uploading file using UploadFromStreamAsync

I tried the code below to upload a file to an Azure blob container, but the uploaded file ends up corrupted.
public async void UploadFile(Stream memoryStream, string fileName, string containerName)
{
    try
    {
        memoryStream.Position = 0;
        CloudBlockBlob file = GetBlockBlobContainer(containerName).GetBlockBlobReference(fileName);
        file.Metadata["FileType"] = Path.GetExtension(fileName);
        file.Metadata["Name"] = fileName;
        await file.UploadFromStreamAsync(memoryStream).ConfigureAwait(false);
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
How can I resolve this?
I am unable to open the Excel file that was uploaded to the blob using the code above (error screenshot omitted). Here is the code that builds the stream and calls the upload:
Stream streamData = ConvertDataSetToByteArray(sourceTable); // sourceTable is the DataTable
streamData.Position = 0;
UploadFile(streamData, "ABCD.xlsx", "sampleBlobContainer"); // calling logic to upload stream to blob
private Stream ConvertDataSetToByteArray(DataTable dataTable)
{
    StringBuilder sb = new StringBuilder();

    IEnumerable<string> columnNames = dataTable.Columns.Cast<DataColumn>()
        .Select(column => column.ColumnName);
    sb.AppendLine(string.Join(",", columnNames));

    foreach (DataRow row in dataTable.Rows)
    {
        IEnumerable<string> fields = row.ItemArray.Select(field => field.ToString());
        sb.AppendLine(string.Join(",", fields));
    }

    var myByteArray = System.Text.Encoding.UTF8.GetBytes(sb.ToString());
    var streamData = new MemoryStream(myByteArray);
    return streamData;
}
Your code above creates a .csv file, not an .xlsx file. You can easily test this by building a small comma-separated text file like the one your code produces and renaming it to .xlsx, which replicates what you do; Excel will then complain that the file format does not match the extension (screenshots omitted).
You have two solutions:
You either need to build an actual .xlsx file, which you can do with the https://github.com/JanKallman/EPPlus package for example (see the EPPlus sketch below),
or
you need to save your file as a .csv, because that's what it really is.
The fact that you upload it to Azure Blob Storage is completely irrelevant here: there's no issue with the upload.
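To illustrate the first option, here is a minimal sketch (not from the original answer) of building a real .xlsx in memory from a DataTable with EPPlus; the resulting stream could then be passed to the existing UploadFile method. The worksheet name is an arbitrary choice, and newer EPPlus versions (5+) additionally require setting ExcelPackage.LicenseContext before use.
using System.Data;
using System.IO;
using OfficeOpenXml; // EPPlus

public static class ExcelSketch
{
    public static MemoryStream ConvertDataTableToXlsx(DataTable dataTable)
    {
        using (var package = new ExcelPackage())
        {
            // "Sheet1" is just a placeholder worksheet name.
            var worksheet = package.Workbook.Worksheets.Add("Sheet1");

            // Write the DataTable starting at A1, including the header row.
            worksheet.Cells["A1"].LoadFromDataTable(dataTable, true);

            var stream = new MemoryStream(package.GetAsByteArray());
            stream.Position = 0;
            return stream;
        }
    }
}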
Since the stream is instantiated outside this method, I assume the file is handled there and added to the stream; however, here you are resetting the position of the stream to 0, thus invalidating the file.
First of all, are you sure the file got corrupted? Save both the MemoryStream contents and the blob to local files and compare them. You could also save the MemoryStream contents to a file and use UploadFromFileAsync.
To check for actual corruption you should calculate the content's MD5 hash in advance and compare it with the blob's hash after upload.
To calculate the stream's MD5 hash use ComputeHash.
var hasher = MD5.Create();
memoryStream.Position = 0;
var originalHash = Convert.ToBase64String(hasher.ComputeHash(memoryStream));
To get the client to calculate a blob hash, you need to set the BlobRequestOptions.StoreBlobContentMD5 option while uploading:
memoryStream.Position = 0;
var options = new BlobRequestOptions()
{
    StoreBlobContentMD5 = true
};
await file.UploadFromStreamAsync(memoryStream, null, options, null).ConfigureAwait(false);
To retrieve and check the uploaded hash, use FetchAttributes or FetchAttributesAsync and compare the BlobProperties.ContentMD5 value with the original:
file.FetchAttributes();
var blobHash = file.Properties.ContentMD5;
if (blobHash != originalHash)
{
    // Ouch! Retry perhaps?
}
It seems that your method doesn't have any fatal problems. I suspect the stream conversion is where things go wrong.
This is my code:
using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;

namespace ConsoleApp7
{
    class Program
    {
        public static class Util
        {
            public async static void UploadFile(Stream memoryStream, string fileName, string containerName)
            {
                memoryStream.Position = 0;
                var storageAccount = CloudStorageAccount.Parse("xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
                var blockBlob = storageAccount.CreateCloudBlobClient()
                    .GetContainerReference(containerName)
                    .GetBlockBlobReference(fileName);
                await blockBlob.UploadFromStreamAsync(memoryStream);
            }
        }

        static void Main(string[] args)
        {
            // Open the file
            FileStream fileStream = new FileStream("C:\\Users\\bowmanzh\\Desktop\\Book1.xlsx", FileMode.Open);
            // Read the byte[] of the file
            byte[] bytes = new byte[fileStream.Length];
            fileStream.Read(bytes, 0, bytes.Length);
            fileStream.Close();
            // Turn the byte[] into a Stream
            Stream stream = new MemoryStream(bytes);
            Util.UploadFile(stream, "Book2.xlsx", "test");
            Console.WriteLine("Hello World!");
            Console.ReadLine();
        }
    }
}

C# - Save object to JSON file

I'm writing a Windows Phone Silverlight app. I want to save an object to a JSON file. I've written the following piece of code.
string jsonFile = JsonConvert.SerializeObject(usr);
IsolatedStorageFile isoStore = IsolatedStorageFile.GetUserStoreForApplication();
IsolatedStorageFileStream isoStream = new IsolatedStorageFileStream("users.json", FileMode.Create, isoStore);
StreamWriter str = new StreamWriter(isoStream);
str.Write(jsonFile);
This is enough to create a JSON file but it is empty. Am I doing something wrong? Wasn't this supposed to write the object to the file?
The problem is that you're not closing the stream.
File I/O in Windows has buffers at the operating system level, and .NET may implement buffering at the API level as well, which means that unless you tell the class "now I'm done", it never knows when to ensure those buffers are propagated all the way down to the platter.
You should rewrite your code just slightly, like this:
using (StreamWriter str = new StreamWriter(isoStream))
{
str.Write(jsonFile);
}
using (...) { ... } will ensure that when the code leaves the block, the { ... } part, it will call IDisposable.Dispose on the object, which in this case will flush the buffers and close the underlying file.
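Putting it together with the code from the question, the whole save might look like this (only the stream handling changes; the serialization line is the question's own):
string jsonFile = JsonConvert.SerializeObject(usr);
IsolatedStorageFile isoStore = IsolatedStorageFile.GetUserStoreForApplication();
using (IsolatedStorageFileStream isoStream = new IsolatedStorageFileStream("users.json", FileMode.Create, isoStore))
using (StreamWriter str = new StreamWriter(isoStream))
{
    str.Write(jsonFile);
}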
I use these. Should work for you as well.
public async Task SaveFile(string fileName, string data)
{
    System.IO.IsolatedStorage.IsolatedStorageFile local =
        System.IO.IsolatedStorage.IsolatedStorageFile.GetUserStoreForApplication();

    if (!local.DirectoryExists("MyDirectory"))
        local.CreateDirectory("MyDirectory");

    using (var isoFileStream =
        new System.IO.IsolatedStorage.IsolatedStorageFileStream(
            string.Format("MyDirectory\\{0}.txt", fileName),
            System.IO.FileMode.Create, System.IO.FileAccess.ReadWrite, System.IO.FileShare.ReadWrite,
            local))
    {
        using (var isoFileWriter = new System.IO.StreamWriter(isoFileStream))
        {
            await isoFileWriter.WriteAsync(data);
        }
    }
}

public async Task<string> LoadFile(string fileName)
{
    string data;
    System.IO.IsolatedStorage.IsolatedStorageFile local =
        System.IO.IsolatedStorage.IsolatedStorageFile.GetUserStoreForApplication();

    using (var isoFileStream =
        new System.IO.IsolatedStorage.IsolatedStorageFileStream(
            string.Format("MyDirectory\\{0}.txt", fileName),
            System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.Read,
            local))
    {
        using (var isoFileReader = new System.IO.StreamReader(isoFileStream))
        {
            data = await isoFileReader.ReadToEndAsync();
        }
    }
    return data;
}
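Tying this back to the question, hypothetical usage could look like the following (the User type and file name are placeholders):
string jsonFile = JsonConvert.SerializeObject(usr);
await SaveFile("users", jsonFile);

string loaded = await LoadFile("users");
var restored = JsonConvert.DeserializeObject<User>(loaded); // assuming usr is of type User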

How to decompress zip with password in windows 8 metro program?

I have a password-protected zip file and I know the password.
I need to open this zip file in my Windows 8 Metro app code.
But System.IO.Compression.ZipArchive does not support decompressing password-protected zips in Windows 8 Metro app code.
I tried SharpZipLib and DotNetZip, but they don't support .NET 4.5, so I can't use them in my Metro app code.
I tried Ionic.Zip. It works in my code, but when I built packages to upload to the Windows Store they did not pass the Microsoft code review.
Is there another way?
Thanks a lot.
The System.IO.Compression.FileSystem assembly is not available for Windows Store apps, so you cannot use the ExtractToDirectory extension method of the ZipFileExtensions class.
Instead of DirectoryInfo, FileInfo, etc. use StorageFile. See "Accessing data and files" and the File access sample for more information on how to read and write files in Metro style apps. Then you'll need to read the data from the file into a stream and pass that to one of the following classes (your choice), as sketched after this list:
DeflateStream (which internally uses zlib as of .NET 4.5)
The ZipArchive or GZipStream classes. These are available to Metro style apps, even if the file-specific extension methods are not.
The Windows Runtime Decompressor type to decompress files.
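As an example of the file-access part, here is a minimal sketch (not from the original answer) that reads a StorageFile into a .NET stream and opens it with ZipArchive. Note that, as the question points out, ZipArchive has no password support, so this only demonstrates how to get at the data in a Metro app:
using System;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Windows.Storage;

public static class ZipSketch
{
    public static async Task ListEntriesAsync(StorageFile file)
    {
        // OpenStreamForReadAsync is the WinRT-to-.NET stream interop extension method.
        using (Stream zipStream = await file.OpenStreamForReadAsync())
        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Read))
        {
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                // A password-protected entry would fail when its stream is read,
                // since ZipArchive cannot decrypt it.
                System.Diagnostics.Debug.WriteLine(entry.FullName);
            }
        }
    }
}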
You can use SharpCompress (https://sharpcompress.codeplex.com/).
It supports opening password-protected zip files.
Code below:
// Assumes the zip contains a PDF file and an XML file
async void Read(StorageFile file)
{
    MemoryStream memoryFilePDf = new MemoryStream();
    MemoryStream memoryFileXml = new MemoryStream();
    FilePdf = null;
    FileXml = null;

    using (var zipStream = await file.OpenStreamForReadAsync())
    {
        using (MemoryStream zipMemoryStream = new MemoryStream((int)zipStream.Length))
        {
            await zipStream.CopyToAsync(zipMemoryStream);
            try
            {
                using (var archive = ZipArchive.Open(zipMemoryStream, PassWord))
                {
                    bool isFilePdf = false;
                    foreach (var entry in archive.Entries)
                    {
                        if (!entry.Key.ToLower().EndsWith(".pdf") && !entry.Key.ToLower().EndsWith(".xml"))
                        {
                            continue;
                        }
                        if (entry.Key.ToLower().EndsWith(".pdf"))
                        {
                            isFilePdf = true;
                            entry.WriteTo(memoryFilePDf);
                        }
                        else
                        {
                            isFilePdf = false;
                            entry.WriteTo(memoryFileXml);
                        }

                        var fileName = entry.Key.Split(new char[] { '/' }, StringSplitOptions.RemoveEmptyEntries).LastOrDefault();
                        var createFile = await ApplicationData.Current.TemporaryFolder.CreateFileAsync(fileName, Windows.Storage.CreationCollisionOption.GenerateUniqueName);
                        using (IRandomAccessStream stream = await createFile.OpenAsync(FileAccessMode.ReadWrite))
                        {
                            // Write the decompressed data from memory to the file
                            using (Stream outstream = stream.AsStreamForWrite())
                            {
                                byte[] buffer = isFilePdf ? memoryFilePDf.ToArray() : memoryFileXml.ToArray();
                                outstream.Write(buffer, 0, buffer.Length);
                                outstream.Flush();
                            }
                        }

                        if (isFilePdf)
                        {
                            FilePdf = createFile;
                        }
                        else
                        {
                            FileXml = createFile;
                        }
                    }
                }
            }
            catch (Exception ex)
            {
                throw new Exception(ex.Message);
            }
        }
    }
}
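Hypothetical usage (not part of the original answer), picking the zip with a FileOpenPicker; note that Read is declared async void above, so it runs fire-and-forget here:
using Windows.Storage;
using Windows.Storage.Pickers;

var picker = new FileOpenPicker();
picker.SuggestedStartLocation = PickerLocationId.DocumentsLibrary;
picker.FileTypeFilter.Add(".zip");

StorageFile zipFile = await picker.PickSingleFileAsync();
if (zipFile != null)
{
    PassWord = "your-password"; // the field the Read method passes to ZipArchive.Open
    Read(zipFile);              // fire-and-forget because Read is async void
}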
