Accessing a POSTed JSON File in Web API - c#

I have created a Web API service that accepts a POSTed JSON file, and I want to parse that file with JSON.NET. I have seen multiple posts on the subject; however, I don't want to save the file to disk. I want to keep the file in memory, parse it, and then dispose of it.
I'm also using .NET Framework 4.0.
EDIT: I should be clearer. When the file is POSTed, it arrives as a file stream. The part I don't know is how to convert that stream to JSON.
public HttpResponseMessage Post()
{
    HttpResponseMessage result = null;
    int FileLen;
    var httpRequest = HttpContext.Current.Request;
    if (httpRequest.Files.Count > 0)
    {
        string MyString = string.Empty;
        var postedFile = httpRequest.Files[0];
        FileLen = postedFile.ContentLength;
        byte[] input = new byte[FileLen];
        System.IO.Stream testStream = postedFile.InputStream;
        testStream.Read(input, 0, FileLen);

        // NOTE: this concatenates each byte's numeric value rather than
        // decoding the bytes as text.
        for (int Loop1 = 0; Loop1 < FileLen; Loop1++)
            MyString = MyString + input[Loop1].ToString();

        CurrentRate.JSONSerializer(MyString);
    }

    return result;
}

Since JSON.NET has the ability to deserialize a file (converted to a stream) with the following code:
using (StreamReader file = File.OpenText(@"<pathToFile>"))
{
    JsonSerializer serializer = new JsonSerializer();
    MyObject myObject = (MyObject)serializer.Deserialize(file, typeof(MyObject));
}
By swapping out the stream from File.OpenText with something similar to the following
using (var stream = new MemoryStream(<FileByteArrayFromMemory>))
using (StreamReader file = new StreamReader(stream))
{
    JsonSerializer serializer = new JsonSerializer();
    MyObject myObject = (MyObject)serializer.Deserialize(file, typeof(MyObject));
}
we never have to deal with an actual file. There may be a more efficient way to create the MemoryStream, or you may already have a Stream and could skip that step.
EDIT 2: Showed swapping in the StreamReader more explicitly.
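
For completeness, here is a minimal sketch (not the original poster's code) that skips the intermediate byte array and string entirely by wrapping the posted file's InputStream in a StreamReader and handing it to JSON.NET. MyObject is the placeholder type from the snippets above, and the response handling is an assumption:
// Minimal sketch: deserialize the posted file straight from its InputStream,
// so the JSON never touches disk and no intermediate byte[] or string is built.
public HttpResponseMessage Post()
{
    var httpRequest = HttpContext.Current.Request;
    if (httpRequest.Files.Count > 0)
    {
        var postedFile = httpRequest.Files[0];
        using (var streamReader = new StreamReader(postedFile.InputStream))
        using (var jsonReader = new JsonTextReader(streamReader))
        {
            var serializer = new JsonSerializer();
            MyObject myObject = serializer.Deserialize<MyObject>(jsonReader);
            // ... work with myObject here ...
        }
        return Request.CreateResponse(HttpStatusCode.OK);
    }
    return Request.CreateResponse(HttpStatusCode.BadRequest);
}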

Related

System.Text.Json.* & deserialization of streams

I have this original code:
public async Task<ActionResult> Chunk_Upload_Save(IEnumerable<IFormFile> files, string metaData)
{
    if (metaData == null)
    {
        return await Save(files);
    }

    MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(metaData));
    JsonSerializer serializer = new JsonSerializer();
    ChunkMetaData chunkData;
    using (StreamReader streamReader = new StreamReader(ms))
    {
        chunkData = (ChunkMetaData)serializer.Deserialize(streamReader, typeof(ChunkMetaData));
    }

    string path = String.Empty;
    // The Name of the Upload component is "files"
    if (files != null)
    {
        foreach (var file in files)
        {
            path = Path.Combine(WebHostEnvironment.WebRootPath, "App_Data", chunkData.FileName);
            //AppendToFile(path, file);
        }
    }

    FileResult fileBlob = new FileResult();
    fileBlob.uploaded = chunkData.TotalChunks - 1 <= chunkData.ChunkIndex;
    fileBlob.fileUid = chunkData.UploadUid;
    return Json(fileBlob);
}
I converted it using only System.Text.Json.* to this:
public async Task<ActionResult> Chunk_Upload_Save(IEnumerable<IFormFile> files, string metaData)
{
    if (metaData == null)
    {
        return await Save(files);
    }

    var ms = new MemoryStream(Encoding.UTF8.GetBytes(metaData));
    ChunkMetaDataModel chunkData;
    using (var streamReader = new StreamReader(ms))
    {
        // Here is the issue
        chunkData = (ChunkMetaDataModel) await JsonSerializer.DeserializeAsync(streamReader, typeof(ChunkMetaDataModel));
    }

    // The Name of the Upload component is "files"
    if (files != null)
    {
        foreach (var file in files)
        {
            Path.Combine(hostEnvironment.WebRootPath, "App_Data", chunkData!.FileName);
            //AppendToFile(path, file);
        }
    }

    var fileBlob = new FileResultModel
    {
        uploaded = chunkData!.TotalChunks - 1 <= chunkData.ChunkIndex,
        fileUid = chunkData.UploadUid
    };
    return Json(fileBlob);
}
I get the error:
Argument 1: cannot convert from 'System.IO.StreamReader' to 'System.IO.Stream'.
By Argument 1, VS is pointing to the streamReader argument on this line:
chunkData = (ChunkMetaDataModel) await JsonSerializer.DeserializeAsync(streamReader, typeof(ChunkMetaDataModel));
How do I convert this to the System.Text.Json API?
System.Text.Json is designed to deserialize most efficiently from UTF-8 byte sequences rather than UTF-16 strings, so there is no overload that deserializes from a StreamReader. Instead, deserialize directly from the MemoryStream ms:
chunkData = await JsonSerializer.DeserializeAsync<ChunkMetaDataModel>(ms);
Notes:
There is no reason to use async deserialization when deserializing from a MemoryStream. Instead use synchronous deserialization like so:
chunkData = JsonSerializer.Deserialize<ChunkMetaDataModel>(ms);
And since you already have a string metaData containing the JSON to be deserialized, you can deserialize directly from it using the Deserialize<TValue>(ReadOnlySpan<Char>, JsonSerializerOptions) overload:
chunkData = JsonSerializer.Deserialize<ChunkMetaDataModel>(metaData);
System.Text.Json will do the UTF16 to UTF8 conversion for you internally using memory pooling.
If you really must deserialize from a StreamReader for some reason (e.g. incremental integration of System.Text.Json with legacy code), see Reading string as a stream without copying for suggestions on how to do this.
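Putting those notes together, the converted action might look like the following sketch. It reuses the ChunkMetaDataModel, FileResultModel, hostEnvironment, and Save members from the question and only changes how the metadata is deserialized:
// Sketch: deserialize the metadata string directly with System.Text.Json,
// so no MemoryStream or StreamReader is needed at all.
public async Task<ActionResult> Chunk_Upload_Save(IEnumerable<IFormFile> files, string metaData)
{
    if (metaData == null)
    {
        return await Save(files);
    }

    var chunkData = JsonSerializer.Deserialize<ChunkMetaDataModel>(metaData);

    if (files != null)
    {
        foreach (var file in files)
        {
            var path = Path.Combine(hostEnvironment.WebRootPath, "App_Data", chunkData!.FileName);
            //AppendToFile(path, file);
        }
    }

    var fileBlob = new FileResultModel
    {
        uploaded = chunkData!.TotalChunks - 1 <= chunkData.ChunkIndex,
        fileUid = chunkData.UploadUid
    };
    return Json(fileBlob);
}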

System.ArgumentNullException for Xamarin.Forms using StreamReader

In the public ScenarioPage() constructor of ScenarioPage.cs I have the following code to read from a JSON file:
var assembly = typeof(ScenarioPage).GetTypeInfo().Assembly;
Stream stream = assembly.GetManifestResourceStream("firstSession.json");
using (StreamReader reader = new StreamReader(stream)) // System.ArgumentNullException
{
    var json = reader.ReadToEnd();
    List<SessionModel> data = JsonConvert.DeserializeObject<List<SessionModel>>(json);
    foreach (SessionModel scenario in data)
    {
        label.Text = scenario.title;
        break;
    }
}
I am getting an ArgumentNullException for the stream input. firstSession.json is in the same folder as ScenarioPage.cs, and it is set as an embedded resource. It seems like Visual Studio is not recognizing that my JSON file is there. Is this a bug, or is there something wrong with my code?
Where did you put the JSON file? I put mine in the root of the PCL project. Then use the following code to read it:
void GetJsonData()
{
    string jsonFileName = "firstSession.json";
    ContactList ObjContactList = new ContactList();
    var assembly = typeof(MainPage).GetTypeInfo().Assembly;
    Stream stream = assembly.GetManifestResourceStream($"{assembly.GetName().Name}.{jsonFileName}");
    using (var reader = new System.IO.StreamReader(stream))
    {
        var jsonString = reader.ReadToEnd();
        // Converting the JSON array into a generic list
        ObjContactList = JsonConvert.DeserializeObject<ContactList>(jsonString);
    }
    EmployeeView.ItemsSource = ObjContactList.contacts;
}
I have updated my demo so that you can test it:
https://github.com/851265601/Xamarin.Android_ListviewSelect/blob/master/PlayMusicInBack.zip
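The key difference from the code in the question is the resource name: GetManifestResourceStream wants the full manifest name, which by default is the assembly's default namespace (plus any folder names) prefixed to the file name, not just "firstSession.json". Applied to the ScenarioPage code above, a minimal sketch might look like this; it assumes the default namespace matches the assembly name, and GetManifestResourceNames() lists the exact names if in doubt:
// Sketch: prefix the resource name with the assembly/default namespace so
// GetManifestResourceStream can locate the embedded firstSession.json.
var assembly = typeof(ScenarioPage).GetTypeInfo().Assembly;
var resourceName = $"{assembly.GetName().Name}.firstSession.json";
// var allNames = assembly.GetManifestResourceNames(); // handy for checking the real name

using (Stream stream = assembly.GetManifestResourceStream(resourceName))
using (var reader = new StreamReader(stream))
{
    var json = reader.ReadToEnd();
    List<SessionModel> data = JsonConvert.DeserializeObject<List<SessionModel>>(json);
    foreach (SessionModel scenario in data)
    {
        label.Text = scenario.title;
        break;
    }
}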

Using a generic stream to create a zipped file with SharpCompress

Since System.IO.Compression seems to be out of reach for now if I want to target both .NET Core and net461, I've tried SharpCompress.
The "read zip" part was easy, but I am having trouble finding out how to write to a zip stream.
The project's wiki is a bit outdated. This is the only example I've found that applies to writing to streams. I've tried to follow it and adapt it to my needs, but I am stuck on the exception it throws:
using Microsoft.VisualStudio.TestTools.UnitTesting;
using SharpCompress.Common;
using SharpCompress.Compressors.Deflate;
using SharpCompress.Writers;
using System;
using System.IO;

namespace DbManager.DjdbCore.Tests
{
    [TestClass]
    public class ZipTests
    {
        public ZipTests()
        {
            Directory.SetCurrentDirectory(AppContext.BaseDirectory);
        }

        [TestMethod]
        public void Test()
        {
            var zip = File.OpenWrite(@"..\..\..\..\..\test-resources\zip_file_test.zip");
            var writerOptions = new WriterOptions(CompressionType.Deflate);
            var zipWriter = WriterFactory.Open(zip, ArchiveType.Zip, writerOptions);

            var memoryStream = new MemoryStream();
            var binaryWriter = new BinaryWriter(memoryStream);
            binaryWriter.Write("Test string inside binary file - text to fill it up: qoiwjqefñlawijfñlaskdjfioqwjefñalskvndñaskvnqo`wiefowainvñaslkfjnwpowiqjfeopwiqjnfjñlaskdjfñlasdfjiowiqjefñaslkdjfñalskjfpqwoiefjqw");

            var deflateStream = new DeflateStream(memoryStream, SharpCompress.Compressors.CompressionMode.Compress);
            deflateStream.Write(memoryStream.ToArray(), 0, Convert.ToInt32(memoryStream.Length));

            // EXCEPTION: SharpCompress.Compressors.Deflate.ZlibException: 'Cannot Read after Writing.'
            // Source code: if (_streamMode != StreamMode.Reader) { throw new ZlibException("Cannot Read after Writing."); }
            zipWriter.Write("test_file_inside_zip.bin", deflateStream, DateTime.Now);

            zip.Flush();
            zipWriter.Dispose();
            zip.Dispose();
        }
    }
}
In case it helps, this is what I used (and it worked, but only in dotnet core) using the library System.IO.Compression:
private void WriteAsZipBinary()
{
    // Open the zip file if it exists, else create a new one
    var zip = ZipPackage.Open(this.FileFullPath, FileMode.OpenOrCreate, FileAccess.ReadWrite);
    var zipStream = ZipManager.GetZipWriteStream(zip, nameOfFileInsideZip);

    var memoryStream = new MemoryStream();
    var binaryWriter = new BinaryWriter(memoryStream);

    // Here is where strings etc are written to the binary file:
    WriteStuffInBinaryStream(ref binaryWriter);

    // Read all of the bytes from the file to add to the zip file
    byte[] bites = new byte[Convert.ToInt32(memoryStream.Length - 1) + 1];
    memoryStream.Position = 0;
    memoryStream.Read(bites, 0, Convert.ToInt32(memoryStream.Length));
    binaryWriter.Dispose();
    binaryWriter = null;
    memoryStream.Dispose();
    memoryStream = null;

    zipStream.Position = 0;
    zipStream.Write(bites, 0, bites.Length);
    zip.Close();
}

public static Stream GetZipWriteStream(Package zip, string renamedFileName)
{
    // Replace spaces with an underscore (_)
    string uriFileName = renamedFileName.Replace(" ", "_");
    // A Uri always starts with a forward slash "/"
    string zipUri = string.Concat("/", Path.GetFileName(uriFileName));
    Uri partUri = new Uri(zipUri, UriKind.Relative);
    string contentType = "Zip"; // System.Net.Mime.MediaTypeNames.Application.Zip;

    // The PackagePart contains the information:
    //   Where to extract the file when it's extracted (partUri)
    //   The type of content stream (MIME type): (contentType)
    //   The type of compression: (CompressionOption.Normal)
    PackagePart pkgPart = zip.CreatePart(partUri, contentType, CompressionOption.Normal);

    // Compress and write the bytes to the zip file
    return pkgPart.GetStream();
}
I'll post here the answer from GitHub by @adamhathcock (the owner of the project):
[TestMethod]
public void Test()
{
    var writerOptions = new WriterOptions(CompressionType.Deflate);

    using (var zip = File.OpenWrite(@"..\..\..\..\..\test-resources\zip_file_test.zip"))
    using (var zipWriter = WriterFactory.Open(zip, ArchiveType.Zip, writerOptions))
    {
        var memoryStream = new MemoryStream();
        var binaryWriter = new BinaryWriter(memoryStream);
        binaryWriter.Write("Test string inside binary file - text to fill it up: qoiwjqefñlawijfñlaskdjfioqwjefñalskvndñaskvnqo`wiefowainvñaslkfjnwpowiqjfeopwiqjnfjñlaskdjfñlasdfjiowiqjefñaslkdjfñalskjfpqwoiefjqw");

        memoryStream.Position = 0;
        zipWriter.Write("test_file_inside_zip.bin", memoryStream, DateTime.Now);
    }
}
Two things:
1. You forgot to reset the MemoryStream after writing to it, so that it can be read.
2. You don't need to use the DeflateStream manually. You've already told the ZipWriter what compression to use; if it had worked, you would have double-compressed the bytes, which would really be garbage.
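If you would rather build the archive in memory and only then write it out, SharpCompress also has an archive-style API. The following is a rough sketch of that approach (not from the original answer) using the same Deflate settings; the class and method names are mine:
// Rough sketch: build the zip with SharpCompress's archive API, then save it
// to any writable stream (file, network, etc.).
using System;
using System.IO;
using SharpCompress.Archives.Zip;
using SharpCompress.Common;
using SharpCompress.Writers;

public static class ZipSketch
{
    public static void WriteZip(Stream output)
    {
        using (var archive = ZipArchive.Create())
        using (var content = new MemoryStream())
        {
            var writer = new BinaryWriter(content);
            writer.Write("Test string inside binary file");
            writer.Flush();
            content.Position = 0; // rewind so the archive can read the entry data

            archive.AddEntry("test_file_inside_zip.bin", content);
            archive.SaveTo(output, new WriterOptions(CompressionType.Deflate));
        }
    }
}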

Prevent JsonTextReader from consuming the stream during deserialization

I'm using Json.Net to consume some seekable streams.
// reset the input stream, in case it was previously read
inputStream.Position = 0;
using (var textReader = new StreamReader(inputStream))
{
    using (var reader = new JsonTextReader(textReader))
    {
        deserialized = serializer.Deserialize(reader, expectedType);
    }
}
However, this method 'consumes' the stream, meaning the first valid JSON token it contains is removed from the stream.
That is very annoying, and it seems meaningless: the stream's Position property exists precisely so consumption can be emulated, and 'reading' generally implies 'not modifying'.
Of course, I can dump the stream into a MemoryStream to protect my precious source stream, but that's a huge overhead, especially when doing trial-and-error on a deserialization.
If there is a way to just 'read' and not 'read-and-consume', thanks for your help; I could not find documentation about it (and I hope this post will help others google the solution ^^).
JsonTextReader is a forward-only reader, meaning it cannot be set back to a position earlier in the JSON to re-read a portion of it, even if the underlying stream supports seeking. However, the reader does not actually "consume" the stream, as you said. If you set the CloseInput property on the reader to false to prevent it from closing the underlying reader and stream when it is disposed, you can position the stream back to the beginning and open a new reader on the same stream to re-read the JSON. Here is a short program to demonstrate reading the same stream twice:
class Program
{
    static void Main(string[] args)
    {
        string json = @"{ ""name"": ""foo"", ""size"": ""10"" }";
        MemoryStream inputStream = new MemoryStream(Encoding.UTF8.GetBytes(json));
        JsonSerializer serializer = new JsonSerializer();

        using (var textReader = new StreamReader(inputStream))
        {
            for (int i = 0; i < 2; i++)
            {
                inputStream.Position = 0;
                using (var reader = new JsonTextReader(textReader))
                {
                    reader.CloseInput = false;
                    Widget w = serializer.Deserialize<Widget>(reader);
                    Console.WriteLine("Name: " + w.Name);
                    Console.WriteLine("Size: " + w.Size);
                    Console.WriteLine();
                }
            }
        }
    }
}

class Widget
{
    public string Name { get; set; }
    public int Size { get; set; }
}
Output:
Name: foo
Size: 10
Name: foo
Size: 10
Fiddle: https://dotnetfiddle.net/fftZV7
A stream may be consumed once read. The solution could be to copy it to a memory or file stream as below:
MemoryStream ms = new MemoryStream();
inputStream.CopyTo(ms);
ms.Position = 0;

using (var textReader = new StreamReader(ms))
    (...)
Please let me know if it works.
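For reference, here is a sketch of that copy-based approach with the elided part filled in using the deserialization code from the question (serializer, expectedType, and inputStream are the question's own variables):
// Sketch: copy the source stream once into a seekable MemoryStream, then
// rewind the copy and deserialize from it; the original stream is only read
// by the single CopyTo call.
var ms = new MemoryStream();
inputStream.CopyTo(ms);
ms.Position = 0;

using (var textReader = new StreamReader(ms))
using (var reader = new JsonTextReader(textReader))
{
    deserialized = serializer.Deserialize(reader, expectedType);
}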

ResXResourceWriter not writing to stream?

I'm trying to create a resx file and write it to a stream so that I can return it as a string instead of immediately saving it to a file. However, when I try to read that stream, it is empty. What am I doing wrong here? I did verify that the entries are not null. I can actually use the ResXResourceWriter constructor that saves to disk successfully, but I'm trying to avoid using temp files. Also, I can see the stream is 0 KB before the loop and about 8 KB in length after the loop.
using (var stream = new MemoryStream())
{
    using (var resx = new ResXResourceWriter(stream))
    {
        // build the resx and write to memory
        foreach (var entry in InputFile.Entries.Values)
        {
            resx.AddResource(new ResXDataNode(entry.Key, entry.Value) { Comment = entry.Comment });
        }

        var reader = new StreamReader(stream);
        var text = reader.ReadToEnd(); // text is an empty string here!
        return null;
    }
}
You need to flush and reset the output/stream before trying to read it. This should work, using Generate and Position:
resx.Generate();
stream.Position = 0;
var reader = new StreamReader(stream);
var text = reader.ReadToEnd();
return text;
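Folded back into the method from the question (reusing its InputFile and entry fields), a minimal sketch looks like this:
using (var stream = new MemoryStream())
using (var resx = new ResXResourceWriter(stream))
{
    foreach (var entry in InputFile.Entries.Values)
    {
        resx.AddResource(new ResXDataNode(entry.Key, entry.Value) { Comment = entry.Comment });
    }

    // Generate() writes the .resx XML into the MemoryStream; until it is
    // called (or the writer is disposed) the stream stays empty.
    resx.Generate();

    // Rewind before reading, otherwise ReadToEnd() starts at the end of the stream.
    stream.Position = 0;
    var reader = new StreamReader(stream);
    return reader.ReadToEnd();
}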
