Getting big data through SignalR - Blazor - C#

I have a component library that uses JS code to generate an image as a base64 string, and the image needs to be transferred to C#. The image size is larger than MaximumReceiveMessageSize.
Can I get the value of the MaximumReceiveMessageSize property in C#? I need a way to correctly split the picture into chunks, or some other way to transfer it.
My component can be used in a Wasm or Server application. I can't change the value of the MaximumReceiveMessageSize property.
Thanks

Using a stream, as described in Stream from JavaScript to .NET, solved my problem.
From Microsoft docs:
In JavaScript:
function streamToDotNet() {
    return new Uint8Array(10000000);
}
In C# code:
var dataReference = await JS.InvokeAsync<IJSStreamReference>("streamToDotNet");
using var dataReferenceStream = await dataReference.OpenReadStreamAsync(maxAllowedSize: 10_000_000);
var outputPath = Path.Combine(Path.GetTempPath(), "file.txt");
using var outputFileStream = File.OpenWrite(outputPath);
await dataReferenceStream.CopyToAsync(outputFileStream);
In the preceding example: JS is an injected IJSRuntime instance. The dataReferenceStream is written to disk (file.txt) at the current user's temporary folder path (GetTempPath).
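Applied to the original problem, the same pattern can be used to pull the generated image into memory instead of writing it to disk. Below is a minimal sketch, assuming a hypothetical JS function getImageBytes that returns the image as a Uint8Array (for example by decoding the base64 string the component already produces); the maxAllowedSize value is an arbitrary cap you can adjust.
// getImageBytes is a hypothetical JS function returning the image as a Uint8Array
var imageReference = await JS.InvokeAsync<IJSStreamReference>("getImageBytes");

// The stream transfers the data in chunks, avoiding the single-message size limit;
// maxAllowedSize is an arbitrary upper bound
using var imageStream = await imageReference.OpenReadStreamAsync(maxAllowedSize: 50_000_000);
using var memoryStream = new MemoryStream();
await imageStream.CopyToAsync(memoryStream);

byte[] imageBytes = memoryStream.ToArray();               // raw image bytes
string base64Image = Convert.ToBase64String(imageBytes);  // only if the base64 form is still needed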

Related

C#.Net Download Image from URL, Crop, and Upload without Saving or Displaying

I have a large number of images on a Web server that need to be cropped. I would like to automate this process.
So my thought is to create a routine that, given the URL of the image, downloads the image, crops it, then uploads it back to the server (as a different file). I don't want to save the image locally, and I don't want to display the image to the screen.
I already have a project in C#.Net that I'd like to do this in, but I could do .Net Core if I have to.
I have looked around, but all the information I could find for downloading an image involves saving the file locally, and all the information I could find about cropping involves displaying the image to the screen.
Is there a way to do what I need?
It's perfectly possible to issue a GET request to a URL and have the response returned to you as a byte[] using HttpClient.GetByteArrayAsync. With that binary content, you can read it into an Image using Image.FromStream.
Once you have that Image object, you can use the answer from here to do your cropping.
//Note: You only want a single HttpClient in your application
//and re-use it where possible to avoid socket exhaustion issues
using (var httpClient = new HttpClient())
{
    //Issue the GET request to a URL and read the response into a
    //stream that can be used to load the image
    var imageContent = await httpClient.GetByteArrayAsync("<your image url>");
    using (var imageBuffer = new MemoryStream(imageContent))
    {
        var image = Image.FromStream(imageBuffer);
        //Do something with image
    }
}
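For the crop-and-upload half of the question, here is a minimal sketch continuing from the image and httpClient variables above, assuming System.Drawing and a hypothetical endpoint that accepts an HTTP POST of the image bytes; the crop rectangle and content type are placeholders.
// Region of the source image to keep (placeholder values)
var cropArea = new Rectangle(0, 0, 200, 200);

using (var cropped = new Bitmap(cropArea.Width, cropArea.Height))
{
    using (var graphics = Graphics.FromImage(cropped))
    {
        // Draw the selected region of the source image onto the new bitmap
        graphics.DrawImage(image, new Rectangle(0, 0, cropArea.Width, cropArea.Height),
            cropArea, GraphicsUnit.Pixel);
    }

    // Save the cropped image to memory rather than to disk
    using (var uploadBuffer = new MemoryStream())
    {
        cropped.Save(uploadBuffer, ImageFormat.Jpeg);

        // Hypothetical upload endpoint on your server
        var content = new ByteArrayContent(uploadBuffer.ToArray());
        content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
        await httpClient.PostAsync("<your upload url>", content);
    }
}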

How to set ContentMD5 in DataLakeFileClient?

When uploading to an Azure Data Lake using Microsoft Azure Storage Explorer, a value for the ContentMD5 property is automatically generated and stored for the file. The same happens automatically in a function app that uses a Blob binding.
However, this value is not generated automatically when uploading from a C# DLL.
I want to use this value to compare files in the future.
My code for the upload is very simple.
DataLakeFileClient fileClient = await directoryClient.CreateFileAsync("testfile.txt");
await fileClient.UploadAsync(fileStream);
I also know I can generate an MD5 using the below code, but I'm not certain if this is the same way that Azure Storage Explorer does it.
using (var md5gen = MD5.Create())
{
    md5hash = md5gen.ComputeHash(fileStream);
}
But I have no idea how to set this value on the ContentMD5 property of the file.
I have found the solution.
The UploadAsync method has an overload that accepts a parameter of type DataLakeFileUploadOptions. This class contains an HttpHeaders object, which in turn has a ContentHash property that is stored with the file.
var uploadOptions = new DataLakeFileUploadOptions();
uploadOptions.HttpHeaders = new PathHttpHeaders();
uploadOptions.HttpHeaders.ContentHash = md5hash;
await fileClient.UploadAsync(fileStream, uploadOptions);
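Putting the two pieces together, note that ComputeHash reads the stream to the end, so the stream position has to be reset before uploading. A minimal sketch, assuming fileStream is seekable:
byte[] md5hash;
using (var md5gen = MD5.Create())
{
    md5hash = md5gen.ComputeHash(fileStream);
}

// ComputeHash has consumed the stream, so rewind it before uploading
fileStream.Position = 0;

var uploadOptions = new DataLakeFileUploadOptions
{
    HttpHeaders = new PathHttpHeaders { ContentHash = md5hash }
};

DataLakeFileClient fileClient = await directoryClient.CreateFileAsync("testfile.txt");
await fileClient.UploadAsync(fileStream, uploadOptions);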

Opening a pdf file in a Cordova App via web service

I have a Cordova app that uses a C# webService to communicate with a SQL database.
This works great.
My problem is that I have some pdf documents on the server with the local filePath held in the database and I need to open these in the app.
I have done a similar thing before where the documents had a URL they could be reached at, so they just opened, but in this case there is no external access to the file.
So my question is this: how do I best get the file from the server to the app to open it?
I don't need to store the file on the device, just open it so it can be read.
I would be really grateful if someone could steer me in the right direction, as I have no clue as to the best method for achieving what I'm after.
***** UPDATE *****
Right, I don't think I'm a million miles away, but I have a feeling I'm doing something fundamentally wrong.
I'm creating a byte[] using:
byte[] bytes = System.IO.File.ReadAllBytes(filepath);
which the web service returns to the app as a really long string.
In the app, I'm getting that string and using the following to reconstitute it as a file:
var bytes = new Uint8Array(data);
saveByteArray("mytest.txt", data);
function base64ToArrayBuffer(base64) {
    var binaryString = window.atob(base64);
    var binaryLen = binaryString.length;
    var bytes = new Uint8Array(binaryLen);
    for (var i = 0; i < binaryLen; i++) {
        var ascii = binaryString.charCodeAt(i);
        bytes[i] = ascii;
    }
    return bytes;
}

function saveByteArray(reportName, byte) {
    var blob = new Blob([byte], {type: "application/txt"});
    var link = document.createElement('a');
    link.href = window.URL.createObjectURL(blob);
    var fileName = reportName;
    link.download = fileName;
    link.click();
}
This will either create an empty file or a corrupt one.
Can anyone help with this please?
A fresh pair of eyes would be gratefully received.
Thanks
What you could do is create a new endpoint in your C# web service backend, download the file from that endpoint, store it locally, and display it from your app.
Behind the endpoint, you would use the file location from the database and read the file content as a stream from wherever the PDF is stored. That stream of data would be placed in a JSON result object as an array of bytes. Finally, your app would get this JSON object and build the PDF file from the array of bytes and the file name.
Hope it helps.
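A minimal sketch of such an endpoint, assuming an ASP.NET Web API 2 controller; the route and the GetFilePathFromDatabase helper are hypothetical. The app can then decode the Base64 string (for example with the base64ToArrayBuffer function from the question) and hand the bytes to a PDF viewer.
[HttpGet]
[Route("api/documents/{id}")]
public IHttpActionResult GetDocument(int id)
{
    // Hypothetical lookup returning the local file path stored in the database
    string filePath = GetFilePathFromDatabase(id);

    byte[] bytes = System.IO.File.ReadAllBytes(filePath);

    // Return the content as Base64 so it travels cleanly inside JSON
    return Ok(new
    {
        FileName = Path.GetFileName(filePath),
        ContentBase64 = Convert.ToBase64String(bytes)
    });
}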

Azure Functions - Blob Stream Dynamic Input bindings

I'm running a C# function on Azure which needs to take in files from a container. The only problem is that the paths to the input files will (potentially) be different each time, and the number of input files will vary from 1 to about 4 or 5. Accordingly, I can't just use the default input blob bindings, as far as I'm aware. My options are to give the container anonymous access and just grab the files through a link, or to figure out how to use dynamic input bindings.
Does anyone know how to declare the path for the input blob stream at runtime (in the C# code)?
If it helps, I've managed to find this for dynamic output bindings:
using (var writer = await binder.BindAsync<TextWriter>(
    new BlobAttribute(containerPath + fileName)))
{
    writer.Write(OutputVariable);
}
Thanks in advance, Cuan
Try the below code:
string filename = string.Format("{0}/{1}_{2}.json", blobname, DateTime.UtcNow.ToString("ddMMyyyy_hh.mm.ss.fff"), Guid.NewGuid().ToString("n"));

using (var writer = await binder.BindAsync<TextWriter>(
    new BlobAttribute(filename, FileAccess.Write)))
{
    writer.Write(JsonConvert.SerializeObject(a_object));
}
For dynamic output bindings, you could leverage the following code snippet:
var attributes = new Attribute[]
{
    new BlobAttribute("{container-name}/{blob-name}"),
    new StorageAccountAttribute("brucchStorage") // connection string name for storage connection
};

using (var writer = await binder.BindAsync<TextWriter>(attributes))
{
    writer.Write(userBlobText);
}
Note: The above code creates the target blob if it does not exist and overwrites the existing blob if it does. Moreover, if you do not specify the StorageAccountAttribute, the target blob will be created in the storage account configured by the AzureWebJobsStorage app setting.
Additionally, you could follow Azure Functions imperative bindings for more details.
UPDATE:
For dynamic input binding, you could just change the binding type as follows:
var blobString = await binder.BindAsync<string>(attributes);
Or you could set the binding type to CloudBlockBlob and add the following namespace for azure storage blob:
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Blob;
CloudBlockBlob blob = await binder.BindAsync<CloudBlockBlob>(attributes);
Moreover, for more details about the operations available on CloudBlockBlob, you could follow here.
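Applied to the original scenario of 1 to 4-5 files whose paths are only known at runtime, here is a minimal sketch, assuming the "container/blob" paths arrive in a hypothetical blobPaths collection and a Stream binding is used for reading:
var contents = new List<string>();

// blobPaths is a hypothetical IEnumerable<string> of "container/blob" paths
foreach (var blobPath in blobPaths)
{
    var attributes = new Attribute[]
    {
        new BlobAttribute(blobPath, FileAccess.Read)
    };

    // Bind the blob as a readable stream and read its content
    using (var stream = await binder.BindAsync<Stream>(attributes))
    using (var reader = new StreamReader(stream))
    {
        contents.Add(await reader.ReadToEndAsync());
    }
}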

Getting contents of Media Library Item in Sitecore

Using Sitecore 7.5, I am trying to store several HTML files inside the Media Library. Then in my sublayout codebehind I am attempting to grab the inner content of those HTML files.
I had this working when I was storing the html file on the server. I would upload the file into the Media Library using 'upload as file', and then use the following code to read the content:
string filename = htmlMediaItem.Fields["File Path"].ToString();
string path = Server.MapPath(filename);
string content = System.IO.File.ReadAllText(path);
However, I would now like to do this without storing the files on the server and instead have them only inside the Media Library. Is there any way I can do this?
So far I have had a hard time trying to find information on the subject.
Thank you.
From what I understand, you want to read the content of an HTML file stored in the Media Library.
Sitecore.Data.Items.Item sampleItem = Sitecore.Context.Database.GetItem("/sitecore/media library/Files/yourhtmlfile");
Sitecore.Data.Items.Item sampleMedia = new Sitecore.Data.Items.MediaItem(sampleItem);

using (var reader = new StreamReader(MediaManager.GetMedia(sampleMedia).GetStream().Stream))
{
    string text = reader.ReadToEnd();
}
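Tying this back to the sublayout codebehind, a minimal sketch, assuming htmlMediaItem is the media item reference from the question and litHtmlContent is a hypothetical asp:Literal control on the sublayout:
// htmlMediaItem is the item referenced in the question; wrap it as a MediaItem
var mediaItem = new Sitecore.Data.Items.MediaItem(htmlMediaItem);

using (var reader = new StreamReader(MediaManager.GetMedia(mediaItem).GetStream().Stream))
{
    // litHtmlContent is a hypothetical Literal control used to render the HTML
    litHtmlContent.Text = reader.ReadToEnd();
}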
