Having an issue with my small & simple serializer for xml files - c#

I was using a simple serializer to load an XML file into my game (I'm implementing mod support).
The way I had it first (it's not my own serializer) was like this:
public static EnemyList Load(string path)
{
    TextAsset _xml = Resources.Load<TextAsset>(path);
    XmlSerializer serializer = new XmlSerializer(typeof(EnemyList));
    StringReader reader = new StringReader(_xml.text);
    EnemyList enemies = serializer.Deserialize(reader) as EnemyList;
    reader.Close();
    return enemies;
}
The problem with that is that "Resources.Load" is being used. Since I want players to write/install mods, the Resources folder can't be used here (as far as I know they can't access it). Therefore I created a "Mods" folder in the build folder, and that Mods folder contains other people's mod folders (for example, if I made a mod I would have a folder like "MyMod" containing subfolders such as "entities" with an "entities.xml" file). To get all the files from a folder I used DirectoryInfo.
And here is my problem: I'm trying to change the serializer to work with DirectoryInfo. This is the code so far:
public static EnemyList LoadXml(string path)
{
    DirectoryInfo direcInfo = new DirectoryInfo(path);
    FileInfo[] fileInfo = direcInfo.GetFiles();
    XmlSerializer serializer = new XmlSerializer(typeof(EnemyList));
    StringReader reader = new StringReader(fileInfo[0].ToString());
    EnemyList enemies = serializer.Deserialize(reader) as EnemyList;
    reader.Close();
    return enemies;
}
But when I start the game I'm getting the error: XmlException: Data at the root level is invalid. Line 1, position 1.
I've also tried things like File.ReadAllText(path), but then I get an UnauthorizedAccessException.
When I googled that problem I found out that I need to specify a file in the path and not just the directory (not Mods/entities but Mods/entities/entities.xml), but I don't want to read just one single file. I want to read every XML file that is in there. And even if I change it to entities.xml I'm still getting an IOException: Error 267 (couldn't find any answers to fix that).
I hope someone can help me with that. I've already googled but the people on the forum did completely different things, I couldn't apply that to my case.
Thank you in advance!
In case the xml is needed:

The XmlException in your current code happens because FileInfo.ToString() returns the file's path, not the file's contents, so the StringReader is handed a path string rather than XML. Instead, first get the path to the directory you want to search in and then loop through each XML file in that directory. With SearchOption.AllDirectories you also search all subdirectories of the current directory.
Loop through your Directory:
string folderWithModFiles = "YourPath";
List<Entity> enemyList = new List<Entity>();
foreach (string xmlFile in Directory.EnumerateFiles(folderWithModFiles, "*.xml", SearchOption.AllDirectories)) {
    try {
        enemyList.AddRange(DeserializeXml(xmlFile));
    }
    catch (Exception e) {
        Debug.LogException(e);
    }
}
The try/catch ensures that a faulty XML file is logged and skipped instead of aborting the whole load.
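As for what to put in folderWithModFiles: since your Mods folder lives next to the built player, one possible way to resolve it is via Application.dataPath. This is only a sketch under the assumption of the standard Unity standalone layout, where Application.dataPath points to the <Game>_Data folder and "Mods" sits one level above it:

// Assumes "Mods" sits next to the player executable, i.e. one level above Application.dataPath.
// Requires using System.IO; and using UnityEngine;
string folderWithModFiles = Path.GetFullPath(Path.Combine(Application.dataPath, "..", "Mods"));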
Deserializing an XML file:
private List<Entity> DeserializeXml(string filePath) {
    using (FileStream fileStream = System.IO.File.Open(filePath, FileMode.Open)) {
        var serializer = new XmlSerializer(typeof(EnemyList));
        var enemyList = serializer.Deserialize(fileStream) as EnemyList;
        // Read the data you need from the XML file.
        // If the user didn't specify something they need to specify,
        // you can throw an ArgumentException and the file will be skipped.
        return enemyList.ListOfEnemies;
    }
}
Example of the EnemyList classes used for the XML mapping:
[XmlRoot("Entities")]
[CreateAssetMenu(menuName = "Nations of Cubion/Item/List of Enemies")]
public class EnemyList {
    [XmlArray]
    [XmlArrayItem("Entity")]
    public List<Entity> ListOfEnemies { get; set; }
}

public class Entity {
    [XmlAttribute("name")]
    public string name { get; set; }

    [XmlElement("mesh")]
    public string mesh { get; set; }

    [XmlElement("material")]
    public string material { get; set; }

    [XmlElement("enemyType")]
    public string enemyType { get; set; }

    [XmlElement("elementType")]
    public string elementType { get; set; }

    [XmlElement("maxHealth")]
    public string maxHealth { get; set; }
}
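For reference, an XML file that maps onto these attributes would look roughly like this (the values are purely illustrative, not taken from your mod):

<Entities>
  <ListOfEnemies>
    <Entity name="Slime">
      <mesh>slime_mesh</mesh>
      <material>slime_material</material>
      <enemyType>Melee</enemyType>
      <elementType>Water</elementType>
      <maxHealth>20</maxHealth>
    </Entity>
  </ListOfEnemies>
</Entities>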

Related

Serializing and deserializing a collection of objects

I'm trying to create an RSS feed reader which saves info about a podcast to a JSON file, and I'm having trouble serializing to and deserializing from that file.
I realize that there are other threads regarding this subject, but I cannot grasp or comprehend how to apply it to my code or the reasoning behind it.
So I have a bit of code that creates a file if it doesn't exist and writes JSON data to it which looks like:
public void SaveFile(Podcast podcast)
{
    try
    {
        JsonSerializer serializer = new JsonSerializer();
        if (!File.Exists(@"C: \Users\Kasper\Desktop\Projektuppgift\Projektuppgift - Delkurs2\Projektet\Projektet\bin\Debug\podcasts.json"))
        {
            string json = JsonConvert.SerializeObject(new { Podcast = podcast });
            StreamWriter sw = File.CreateText(@"C:\Users\Kasper\Desktop\Projektuppgift\Projektuppgift-Delkurs2\Projektet\Projektet\bin\Debug\podcasts.json");
            using (JsonWriter writer = new JsonTextWriter(sw))
            {
                serializer.Serialize(writer, json);
            }
        }
        else
        {
            var filepath = @"C:\Users\Kasper\Desktop\Projektuppgift\Projektuppgift-Delkurs2\Projektet\Projektet\bin\Debug\podcasts.json";
            var jsonData = File.ReadAllText(filepath);
            var podcasts = JsonConvert.DeserializeObject<List<Podcast>>(jsonData) ?? new List<Podcast>();
            podcasts.Add(podcast);
            jsonData = JsonConvert.SerializeObject(new { PodcastList = podcasts });
            File.WriteAllText(filepath, jsonData);
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("IO Exception ", ex.Message);
    }
}
What I can't get to work is to deserialize from this file and add an object to it. Is there an easier way to add more data to the JSON file or am I missing something?
The Podcast class looks like this:
public class Podcast
{
public string url { get; set; }
public string name { get; set; }
public int updateInterval { get; set; }
public string category { get; set; }
//public Category category = new Category();
public List<Episode> episodes { get; set; }
public Podcast(string url, string name, Category category, List<Episode> episodes, int updateInterval)
{
this.url = url;
this.name = name;
this.category = category.name;
this.episodes = episodes;
this.updateInterval = updateInterval;
}
public Podcast(Podcast p)
{
this.url = p.url;
this.name = p.name;
this.category = p.category;
this.episodes = p.episodes;
this.updateInterval = p.updateInterval;
}
}
There appear to be a couple of issues here:
You are checking for the existence of a different file than the one you are reading/writing. The former filename has extra spaces in it. The best way to avoid this problem is to use a variable to contain the filename rather than hardcoding it in three separate places.
You are inconsistent about the JSON format you are writing and reading:
When you first create the file (in the first branch), you are writing a JSON object that contains a property Podcast which then contains a single podcast.
When you attempt to read the JSON file, you are treating the entire JSON as a list of podcasts.
After tacking the new podcast onto the list, you are writing the JSON as a single object containing a PodcastList property, which then contains the list.
You need to use a consistent JSON format. I would recommend breaking your code into smaller methods to read and write the podcasts.json file like this so that it is easier to reason about:
public static List<Podcast> ReadPodcastsFromFile(string filepath)
{
if (!File.Exists(filepath)) return new List<Podcast>();
string json = File.ReadAllText(filepath);
return JsonConvert.DeserializeObject<List<Podcast>>(json);
}
public static void WritePodcastsToFile(List<Podcast> podcasts, string filepath)
{
string json = JsonConvert.SerializeObject(podcasts);
// This will overwrite the file if it exists, or create a new one if it doesn't
File.WriteAllText(filepath, json);
}
Then, you can simplify your SaveFile method down to this (I would be tempted to rename it to SavePodcast):
public void SaveFile(Podcast podcast)
{
var filepath = @"C:\Users\Kasper\Desktop\Projektuppgift\Projektuppgift-Delkurs2\Projektet\Projektet\bin\Debug\podcasts.json";
List<Podcast> podcasts = ReadPodcastsFromFile(filepath);
podcasts.Add(podcast);
WritePodcastsToFile(podcasts, filepath);
}
Notice I've also removed the exception handling from SaveFile. You should move that up to wherever SaveFile is called, so that you can take appropriate action at that point if an exception is thrown, e.g.:
try
{
SaveFile(podcast);
}
catch (Exception ex)
{
// Show a message to the user indicating that the file did not save
}
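For reference, with this consistent format, podcasts.json simply holds a JSON array of podcast objects, along the lines of the following (the field values are made up):

[
  {
    "url": "http://example.com/feed.xml",
    "name": "Some Podcast",
    "updateInterval": 60,
    "category": "News",
    "episodes": []
  }
]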
I'm still learning C#, but it might be that you deserialize into a list of podcasts while, when you serialize, you're serializing a wrapper object type.

Is there a size limit for a property to be serialized?

I'm working against an interface that requires an XML document. So far I've been able to serialize most of the objects using XmlSerializer. However, there is one property that is proving problematic. It is supposed to be a collection of objects that wrap a document. The document itself is encoded as a base64 string.
The basic structure is like this:
//snipped out of a parent object
public List<Document> DocumentCollection { get; set; }
//end snip
public class Document
{
public string DocumentTitle { get; set; }
public Code DocumentCategory { get; set; }
/// <summary>
/// Base64 encoded file
/// </summary>
public string BinaryDocument { get; set; }
public string DocumentTypeText { get; set; }
}
The problem is that smaller values work fine, but if the document is too big the serializer just skips over that document item in the collection.
Is there some limitation that I'm bumping up against?
Update: I changed
public string BinaryDocument { get; set; }
to
public byte[] BinaryDocument { get; set; }
and I'm still getting the same result. The smaller document (~150kb) is serializing just fine, but the rest aren't. To be clear, it's not just the value of the property, it's the entire containing Document object that gets dropped.
UPDATE 2:
Here's the serialization code with a simple repro. It's out of a console project I put together. The problem is that this code works fine in the test project. I'm having difficulty getting the full object structure packed in here because it's near impossible to use the actual objects in a test case because of the complexity of filling the fields, so I tried to cut down the code in the main application. The populated object goes into the serialization code with the DocumentCollection filled with four Documents and comes out with one Document.
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Xml;
using System.Xml.Serialization;
namespace ConsoleApplication2
{
class Program
{
static void Main(string[] args)
{
var container = new DocumentContainer();
var docs = new List<Document>();
foreach (var f in Directory.GetFiles(@"E:\Software Projects\DA\Test Documents"))
{
var fileStream = new MemoryStream(File.ReadAllBytes(f));
var doc = new Document
{
BinaryDocument = fileStream.ToArray(),
DocumentTitle = Path.GetFileName(f)
};
docs.Add(doc);
}
container.DocumentCollection = docs;
var serializer = new XmlSerializer(typeof(DocumentContainer));
var ms = new MemoryStream();
var writer = XmlWriter.Create(ms);
serializer.Serialize(writer, container);
writer.Flush();
ms.Seek(0, SeekOrigin.Begin);
var reader = new StreamReader(ms, Encoding.UTF8);
File.WriteAllText(@"C:\temp\testexport.xml", reader.ReadToEnd());
}
}
public class Document
{
public string DocumentTitle { get; set; }
public byte[] BinaryDocument { get; set; }
}
// test class
public class DocumentContainer
{
public List<Document> DocumentCollection { get; set; }
}
}
XmlSerializer has no limit on the length of a string it can serialize.
.Net, however, has a maximum string length of int.MaxValue. Furthermore, since internally a string is implemented as a contiguous memory buffer, on a 32 bit process you're likely to be unable to allocate a string anywhere near that large due to process space fragmentation. And since a C# base64 string requires roughly 2.67 times the memory of the byte[] array from which it was created (1.33 for the encoding times 2 since the .Net char type is actually two bytes) you might be getting an OutOfMemoryException encoding a large binary document as a complete base64 string, then swallowing and ignoring it, leaving the BinaryDocument property null.
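As a rough sketch of that arithmetic (an illustration only, assuming the whole file is read into memory before encoding; largeFilePath is a placeholder):

byte[] data = File.ReadAllBytes(largeFilePath);   // n bytes
string base64 = Convert.ToBase64String(data);     // ~1.33 * n characters
// Each .NET char is 2 bytes, so the string costs roughly 2.67 * n bytes on top of
// the original byte[] -- a plausible source of OutOfMemoryException for big files,
// especially in a 32-bit process.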
That being said, there is no reason for you to manually encode your binary documents into base64, because XmlSerializer does this for you automatically. I.e. if I serialize the following class:
public class Document
{
public string DocumentTitle { get; set; }
public Code DocumentCategory { get; set; }
public byte [] BinaryDocument { get; set; }
public string DocumentTypeText { get; set; }
}
I get the following XML:
<Document>
<DocumentTitle>my title</DocumentTitle>
<DocumentCategory>Default</DocumentCategory>
<BinaryDocument>AAECAwQFBgcICQoLDA0ODxAREhM=</BinaryDocument>
<DocumentTypeText>document text type</DocumentTypeText>
</Document>
As you can see, BinaryDocument is base64 encoded. Thus you should be able to keep your binary documents in a more compact byte [] representation and still get the XML output you want.
Even better, under the covers, XmlWriter uses System.Xml.Base64Encoder to do this. This class encodes its inputs in chunks, thereby avoiding the excessive memory use and potential out-of-memory exceptions described above.
I can't reproduce the problem you are having. Even with individual files as large as 267 MB to 1.92 GB, I'm not seeing any elements being skipped. The only problem I am seeing is that the temporary var ms = new MemoryStream(); exceeds its 2 GB buffer limit eventually, whereupon an exception gets thrown. I replaced this with a direct stream, and that problem went away:
using (var stream = File.Open(outputPath, FileMode.Create, FileAccess.ReadWrite))
That being said, your design will eventually run up against memory limits for a sufficiently large number of sufficiently large files, since you load all of them into memory before serializing. If this is happening, somewhere in your production code you may be catching and swallowing the OutOfMemoryException without realizing it, leading to the problem you are seeing.
As an alternative, I would suggest a streaming solution where you incrementally copy each file's contents to the XML output from within XmlSerializer by making your Document class implement IXmlSerializable:
public class Document : IXmlSerializable
{
public string DocumentPath { get; set; }
public string DocumentTitle
{
get
{
if (DocumentPath == null)
return null;
return Path.GetFileName(DocumentPath);
}
}
const string DocumentTitleName = "DocumentTitle";
const string BinaryDocumentName = "BinaryDocument";
#region IXmlSerializable Members
System.Xml.Schema.XmlSchema IXmlSerializable.GetSchema()
{
return null;
}
void ReadXmlElement(XmlReader reader)
{
if (reader.Name == DocumentTitleName)
DocumentPath = reader.ReadElementContentAsString();
}
void IXmlSerializable.ReadXml(XmlReader reader)
{
reader.ReadXml(null, ReadXmlElement);
}
void IXmlSerializable.WriteXml(XmlWriter writer)
{
writer.WriteElementString(DocumentTitleName, DocumentTitle ?? "");
if (DocumentPath != null)
{
try
{
using (var stream = File.OpenRead(DocumentPath))
{
// Write the start element if the file was successfully opened
writer.WriteStartElement(BinaryDocumentName);
try
{
var buffer = new byte[6 * 1024];
int read;
while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
writer.WriteBase64(buffer, 0, read);
}
finally
{
// Write the end element even if an error occurred while streaming the file.
writer.WriteEndElement();
}
}
}
catch (Exception ex)
{
// You could log the exception as an element or as a comment, as you prefer.
// Log as a comment
writer.WriteComment("Caught exception with message: " + ex.Message);
writer.WriteComment("Exception details:");
writer.WriteComment(ex.ToString());
// Log as an element.
writer.WriteElementString("ExceptionMessage", ex.Message);
writer.WriteElementString("ExceptionDetails", ex.ToString());
}
}
}
#endregion
}
// test class
public class DocumentContainer
{
public List<Document> DocumentCollection { get; set; }
}
public static class XmlSerializationExtensions
{
public static void ReadXml(this XmlReader reader, Action<IList<XAttribute>> readXmlAttributes, Action<XmlReader> readXmlElement)
{
if (reader.NodeType != XmlNodeType.Element)
throw new InvalidOperationException("reader.NodeType != XmlNodeType.Element");
if (readXmlAttributes != null)
{
var attributes = new List<XAttribute>(reader.AttributeCount);
while (reader.MoveToNextAttribute())
{
attributes.Add(new XAttribute(XName.Get(reader.Name, reader.NamespaceURI), reader.Value));
}
// Move the reader back to the element node.
reader.MoveToElement();
readXmlAttributes(attributes);
}
if (reader.IsEmptyElement)
{
reader.Read();
return;
}
reader.ReadStartElement(); // Advance to the first sub element of the wrapper element.
while (reader.NodeType != XmlNodeType.EndElement)
{
if (reader.NodeType != XmlNodeType.Element)
// Comment, whitespace
reader.Read();
else
{
using (var subReader = reader.ReadSubtree())
{
while (subReader.NodeType != XmlNodeType.Element) // Read past XmlNodeType.None
if (!subReader.Read())
break;
if (readXmlElement != null)
readXmlElement(subReader);
}
reader.Read();
}
}
// Move past the end of the wrapper element
reader.ReadEndElement();
}
}
Then use it as follows:
public static void SerializeFilesToXml(string directoryPath, string xmlPath)
{
var docs = from file in Directory.GetFiles(directoryPath)
select new Document { DocumentPath = file };
var container = new DocumentContainer { DocumentCollection = docs.ToList() };
using (var stream = File.Open(xmlPath, FileMode.Create, FileAccess.ReadWrite))
using (var writer = XmlWriter.Create(stream, new XmlWriterSettings { Indent = true, IndentChars = " " }))
{
new XmlSerializer(container.GetType()).Serialize(writer, container);
}
Debug.WriteLine("Wrote " + xmlPath);
}
Using the streaming solution, when serializing 4 files of around 250 MB each, my memory use went up by 0.8 MB. Using the original classes, my memory went up by 1022 MB.
Update
If you need to write your XML to a memory stream, be aware that the c# MemoryStream has a hard maximum stream length of int.MaxValue (i.e. 2 GB) because the underlying memory is simply a byte array. On a 32-bit process the effective max length will be much smaller, see OutOfMemoryException while populating MemoryStream: 256MB allocation on 16GB system.
To programmatically check to see if your process is actually 32 bit, see How to determine programmatically whether a particular process is 32-bit or 64-bit. To change to 64 bit, see What is the purpose of the “Prefer 32-bit” setting in Visual Studio 2012 and how does it actually work?.
If you are sure you are running in 64 bit mode and are still exceeding the hard size limits of a MemoryStream, perhaps see alternative to MemoryStream for large data volumes or MemoryStream replacement?.
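For the programmatic check mentioned above, if you only need to inspect the current process and are on .NET 4 or later, a one-liner should suffice (this is my addition, not taken from the linked answers):

bool is64BitProcess = Environment.Is64BitProcess;   // false means the 32-bit limits above apply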

Cross-Platform reading of xml files

I have some Config files as part of my solution on Windows Mobile. I am porting the code to MonoForAndroid and MonoTouch, so I want to keep my code unchanged as much as possible.
Loading these XML files works fine on Windows Mobile, and in my last prototype it also worked on iOS, but the code does not work on MonoForAndroid.
I have these files
/solution folder
    /My Documents
        /Business
            App.Config
            Settings.Config
I have these files' build action set to Content, and I can see that they are being copied to /bin/Debug/, but when I try to read these files, I get the following exception:
System.IO.DirectoryNotFoundException
I see that there is a similar question on here, but they advised using AndroidResources, which I do not want to do; these files are needed in many places, and I do not want to change the code in all of them.
AndroidResources is out of the question, and if possible I would also like to avoid using EmbeddedResource.
Ah, and the way I am reading it is very straightforward: xmlDoc.Load(filePath). I also tried File.ReadAllText(). I made sure that the filePath is correct, and I generated the path using Path.Combine() to avoid any issues with the path separators/slashes.
Here is how I construct my file path
var filePath = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().GetName().CodeBase).Replace(FileURIPrefix, ""), "My Documents", "Business");
filePath = Path.Combine(filePath, "App.Config");
And I can see in the debugger that the filePath is correct
Thanks for the help in advance
After searching all around, I could not get the MonoDroid to load (or include) my files when their build action is set to Content.
I had to create a FileHelper class which is implemented differently on Android; I then call FileHelper.ReadAllText(string filename).
I will put my implementation here, hoping that it would benefit somebody else.
Windows Mobile and iOS
public class FileHelper
{
    public static string ReadAllText(string filePath)
    {
        var path = filePath.GetFullPath();
        if (!File.Exists(path))
        {
            Logging.LogHandler.LogError("File " + path + " does not exist");
            return string.Empty;
        }
        using (var reader = new StreamReader(path))
        {
            return reader.ReadToEnd();
        }
    }
}
Android version
public class FileHelper : BaseFileHelper
{
    public static string ReadAllText(string filePath)
    {
        var entryAssemblyPath = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().GetName().CodeBase).Replace("file:", ""), "MyExecutableAssemblyName.dll");
        // This is because Assembly.GetEntryAssembly() returns null on Android... Booohhh
        var assembly = Assembly.LoadFrom(entryAssemblyPath);
        using (var stream = assembly.GetManifestResourceStream(filePath.GetFullPath()))
        {
            using (var reader = new StreamReader(stream))
            {
                return reader.ReadToEnd();
            }
        }
    }
}
I had shared code for Constants and an extension method for paths, as below.
Constants.cs
public static class Constants
{
private static string _RootPath;
private static string _iOSRootPath;
private static string _AndroidResourcePath;
public static string RootPath
{
get
{
if (string.IsNullOrEmpty(_RootPath))
{
_RootPath = Path.GetDirectoryName(Assembly.GetExecutingAssembly().GetName().CodeBase).Replace(FileURIPrefix, "") + "\\My Documents\\Business";
}
return _RootPath;
}
}
public static string iOSRootPath
{
get
{
if (string.IsNullOrEmpty(_iOSRootPath))
{
_iOSRootPath = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().GetName().CodeBase).Replace(FileURIPrefix, "").Replace("file:", ""), Path.Combine("My_Documents", "Business"));
}
return _iOSRootPath;
}
}
public static string AndroidResourcePath
{
get
{
if (string.IsNullOrEmpty(_AndroidResourcePath))
{
_AndroidResourcePath = "Leopard.Delivery.My_Documents.Business.";
}
return _AndroidResourcePath;
}
}
}
PathExtensions.cs
public static class PathExtensions
{
public static string GetFullPath(this string filePath)
{
if (Platform.IsAndroid) // Platform is a class that I use to tell which platform I am on :)
{
return Constants.AndroidResourcePath + filePath;
}
if (Platform.IsIOS)
{
return Path.Combine(Constants.iOSRootPath, filePath);
}
return Path.Combine(Constants.RootPath, filePath);
}
}
After setting this up, using my FileHelper is as easy as this:
string configurationContents = FileHelper.ReadAllText(configurationPath);
To whoever uses this code: remember to set the build action to EmbeddedResource on Android, and to Content on iOS and Windows Mobile.

XmlSerializer Access Denied

public void SerializeObject(string filename, T data)
{
// Get the path of the save game
string fullpath = filename;
// Open the file, creating it if necessary
//if (container.FileExists(filename))
// container.DeleteFile(filename);
FileStream stream = (FileStream)File.Open(fullpath, FileMode.OpenOrCreate);
try
{
// Convert the object to XML data and put it in the stream
XmlSerializer serializer = new XmlSerializer(typeof(T));
serializer.Serialize(stream, data); //Thrown HERE
}
finally
{
// Close the file
stream.Close();
}
}
How do I make the above code stop throwing an InvalidOperationException?
The full error message is:
Unable to generate a temporary class (result=1).
error CS0016: Could not write to output file 'c:\Users[MYUSERNAME]\AppData\Local\Temp\czdgjjs0.dll' -- 'Access is denied.
I have no idea how to get around this error.
I am attempting to serialize my Level class which looks like this:
[Serializable]
public class Level : ISerializable
{
public string Name { get; set; }
public int BestTime { get; set; } //In seconds
public List<Block> levelBlocks { get; set; }
public int Width { get; set; }
public int Height { get; set; }
public Level()
{
}
public Level(SerializationInfo info, StreamingContext ctxt)
{
this.Name = (String)info.GetValue("Name", typeof(String));
this.BestTime = (int)info.GetValue("BestTime", typeof(int));
this.levelBlocks = (List<Block>)info.GetValue("Blocks", typeof(List<Block>));
this.Width = (int)info.GetValue("Width", typeof(int));
this.Height = (int)info.GetValue("Height", typeof(int));
}
public void GetObjectData(SerializationInfo info, StreamingContext ctxt)
{
info.AddValue("Name", this.Name);
info.AddValue("BestTime", this.BestTime);
info.AddValue("Blocks", this.levelBlocks);
info.AddValue("Width", this.Width);
info.AddValue("Height", this.Height);
}
}
My Block class is implemented in a similar way and holds only a Position vector that is saved.
Below, my save method:
public static void Save()
{
string filename = "saved.xml";
Level toSave = new Level();
toSave.levelBlocks = new List<Block>();
//TODO: build toSave
toSave.Name = "This is a level!";
toSave.BestTime = 0;
foreach (Entity e in EntityController.Entities)
{
if (e is Block)
{
toSave.levelBlocks.Add((Block)e);
if (e.Position.X > toSave.Width)
toSave.Width = (int)e.Position.X;
if (e.Position.Y > toSave.Height)
toSave.Height = (int)e.Position.Y;
}
}
serializer.SerializeObject(filename, toSave);
}
My program is an XNA game.
Do you use COMODO antivirus and get the CS0016 error?
Open the COMODO Command Window (Main Window) and check the SANDBOX. If your application is listed as one that has been flagged 'limited', simply right-click it and select the option from the popup to add it as a Trusted Application. Or just uninstall COMODO and reboot. That should resolve the CS0016 error.
The accepted answer here System.InvalidOperationException: Unable to generate a temporary class (result=1) is likely to have a reasonable solution for you.
One possibility they didn't suggest: if you're using ASP.NET, change the temp directory in web.config. Check the tempDirectory attribute of the compilation element (info here: http://msdn.microsoft.com/en-us/library/s10awwz0.aspx) and point it somewhere your ASP.NET process does have access to.
Ultimately, though, your problem is that the process doing the serialization needs to generate and write some code to disk and doesn't have permissions. You can give that process permissions, change the location to somewhere it does have permissions, or use sgen.exe, depending on what works best for your situation.
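If you go the sgen.exe route, the XML serialization assembly is generated at build time, so nothing has to be compiled into the temp folder at runtime. A typical invocation might look like the line below (MyGame.exe is a placeholder for your own assembly); it produces MyGame.XmlSerializers.dll next to the executable, which XmlSerializer will pick up instead of generating code on the fly.

sgen.exe /assembly:MyGame.exe /force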

Use Protobuf-net to stream large data files as IEnumerable

I'm trying to use Protobuf-net to save and load data to disk but got stuck.
I have a portfolio of assets that I need to process, and I want to be able to do that as fast as possible. I can already read from a CSV but it would be faster to use a binary file, so I'm looking into Protobuf-Net.
I can't fit all assets into memory so I want to stream them, not load them all into memory.
So what I need to do is expose a large set of records as an IEnumerable. Is this possible with Protobuf-Net? I've tried a couple of things but haven't been able to get it running.
Serializing seems to work, but I haven't been able to read them back in again; I get 0 assets back. Could someone point me in the right direction, please? I looked at the methods in the Serializer class but can't find any that covers this case. Is this use-case supported by Protobuf-net? I'm using v2, by the way.
Thanks in advance,
Gert-Jan
Here's some sample code I tried:
public partial class MainWindow : Window {
// Generate x Assets
IEnumerable<Asset> GenerateAssets(int Count) {
var rnd = new Random();
for (int i = 1; i < Count; i++) {
yield return new Asset {
ID = i,
EAD = i * 12345,
LGD = (float)rnd.NextDouble(),
PD = (float)rnd.NextDouble()
};
}
}
// write assets to file
private void Write(string path, IEnumerable<Asset> assets){
using (var file = File.Create(path)) {
Serializer.Serialize<IEnumerable<Asset>>(file, assets);
}
}
// read assets from file
IEnumerable<Asset> Read(string path) {
using (var file = File.OpenRead(path)) {
return Serializer.DeserializeItems<Asset>(file, PrefixStyle.None, -1);
}
}
// try it
private void Test() {
Write("Data.bin", GenerateAssets(100)); // this creates a file with binary gibberish that I assume are the assets
var x = Read("Data.bin");
MessageBox.Show(x.Count().ToString()); // returns 0 instead of 100
}
public MainWindow() {
InitializeComponent();
}
private void button2_Click(object sender, RoutedEventArgs e) {
Test();
}
}
[ProtoContract]
class Asset {
[ProtoMember(1)]
public int ID { get; set; }
[ProtoMember(2)]
public double EAD { get; set; }
[ProtoMember(3)]
public float LGD { get; set; }
[ProtoMember(4)]
public float PD { get; set; }
}
Figured it out. To deserialize, use PrefixStyle.Base128, which apparently is the default.
Now it works like a charm!
GJ
using (var file = File.Create("Data.bin")) {
Serializer.Serialize<IEnumerable<Asset>>(file, Generate(10));
}
using (var file = File.OpenRead("Data.bin")) {
var ps = Serializer.DeserializeItems<Asset>(file, PrefixStyle.Base128, 1);
int i = ps.Count(); // got them all back :-)
}
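One caveat worth adding (my note, not part of the original answer): DeserializeItems returns a lazily evaluated sequence, so the stream has to stay open while you enumerate it. Consuming the items inside the using block, rather than returning the sequence as in the Read method above, also keeps memory usage flat:

using (var file = File.OpenRead("Data.bin")) {
    foreach (var asset in Serializer.DeserializeItems<Asset>(file, PrefixStyle.Base128, 1)) {
        // Process each asset here, while the stream is still open.
        // Only materialize with .ToList()/.Count() if everything fits in memory.
    }
}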
