Silverlight: save/load with IsolatedStorageFile and IsolatedStorageFileStream exceptions - C#

Windows Phone 7 app.
The goal of the application is a simple to-do list.
I have a class ToDoItem, and I add those objects to the Items class.
It seems to me I'm doing something overly complicated, and most likely the code isn't clean or decent, but I have some serious problems with IsolatedStorageFile.
public class ToDoItem
{
    public string ToDoName { get; set; } // add controls, etc.
    public string ToDoDescription { get; set; }
    internal Priority PriortiySelection { get; set; }
    ...
}
The Items class (basically a wrapper class so I can access it):
public class Items
{
    public static List<ToDoItem> Itemslist = new List<ToDoItem>();
    public static List<ToDoItem> GetList() { return Itemslist; }
    // ... more static methods here
}
The code below throws the following exception:
"Attempt to access the method failed: System.IO.StreamReader..ctor(System.String)"
and afterwards I get:
"Operation not permitted on IsolatedStorageFileStream"
if (store.FileExists(@"items.std"))
{
    ToDoItem item = new ToDoItem();
    try
    {
        IsolatedStorageFileStream save = new IsolatedStorageFileStream(@"items.std", FileMode.Open, store);
        BinaryReader reader = new BinaryReader(save);
    }
    catch (Exception exc)
    {
        MessageBox.Show(exc.Message);
    }
}
In public partial class NewToDo : PhoneApplicationPage I added the following method, which throws the exceptions above again. I can only assume it isn't allowed for some reason, or that I'm making some huge mistake.
private void saveItem(ToDoItem toDoItem)
{
    try
    {
        using (StreamWriter sw = new StreamWriter(store.OpenFile(@"items.std", FileMode.Append)))
        {
            sw.WriteLine(toDoItem.ToDoName);
            sw.WriteLine(toDoItem.ToDoDescription);
            sw.WriteLine(toDoItem.PriortiySelection.ToString());
        }
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}
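(As an aside, a minimal sketch of reading those items back in the same three-lines-per-item format, assuming the same store field, that Priority is an enum, and that the loadItems name is illustrative:)
private List<ToDoItem> loadItems()
{
    var items = new List<ToDoItem>();
    if (!store.FileExists(@"items.std"))
        return items;
    using (var stream = store.OpenFile(@"items.std", FileMode.Open, FileAccess.Read))
    using (var sr = new StreamReader(stream))
    {
        string name;
        while ((name = sr.ReadLine()) != null)
        {
            items.Add(new ToDoItem
            {
                ToDoName = name,
                ToDoDescription = sr.ReadLine(),
                PriortiySelection = (Priority)Enum.Parse(typeof(Priority), sr.ReadLine(), true)
            });
        }
    }
    return items;
}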
Should you need more information, I'm always glad to provide it. I'm currently a second-year student at a college in Belgium, and I'm playing around with Windows Phone 7 apps.

The following will read the contents of a file from isolated storage:
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
    if (!store.FileExists(VIEW_MODEL_STORAGE_FILE))
    {
        return result;
    }
    using (var isfs = new IsolatedStorageFileStream(VIEW_MODEL_STORAGE_FILE, FileMode.Open, store))
    {
        using (var sr = new StreamReader(isfs))
        {
            string lineOfData;
            while ((lineOfData = sr.ReadLine()) != null)
            {
                result += lineOfData;
            }
        }
    }
}
The example builds up a string of data (result). In my case this is a serialized object, which is itself a collection of other objects; it can then be deserialized back into the collection. This is probably preferable to what you were trying to do, writing out properties to a file one at a time.
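A rough sketch of that approach with your ToDoItem list (assuming XmlSerializer and an items.xml file name of my choosing; note that XmlSerializer only writes public members, so your internal PriortiySelection property would be skipped):
public static void SaveItems(List<ToDoItem> items)
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    using (var stream = store.OpenFile("items.xml", FileMode.Create))
    {
        new XmlSerializer(typeof(List<ToDoItem>)).Serialize(stream, items);
    }
}

public static List<ToDoItem> LoadItems()
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        if (!store.FileExists("items.xml"))
            return new List<ToDoItem>();
        using (var stream = store.OpenFile("items.xml", FileMode.Open))
            return (List<ToDoItem>)new XmlSerializer(typeof(List<ToDoItem>)).Deserialize(stream);
    }
}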
Here's how to write the file:
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
    using (var isfs = new IsolatedStorageFileStream(VIEW_MODEL_STORAGE_FILE, FileMode.Create, store))
    {
        using (var sw = new StreamWriter(isfs))
        {
            sw.Write(serializedCollectionObject);
            sw.Close(); // redundant inside a using block, but harmless
        }
    }
}

Is it possible you're not disposing all your disposable objects, and you encounter a problem when you try to access a resource a second time because it's still in use?
The using statement is a good way to handle this easily; more on that here:
Dispose with Using
A bit more background on the topic here, where Jm47 was getting the same error message for this reason:
Problem opening a stream to an isolatedstorage image already the source on an image?
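To illustrate the failure mode with a minimal sketch (file name taken from the question; the first snippet is the anti-pattern):
// Anti-pattern: the stream is never disposed, so the file stays locked and a
// second open fails with "Operation not permitted on IsolatedStorageFileStream".
var save = new IsolatedStorageFileStream(@"items.std", FileMode.Open, store);
var reader = new BinaryReader(save);

// With using, the handle is released at the end of the block,
// so the file can be opened again afterwards.
using (var stream = new IsolatedStorageFileStream(@"items.std", FileMode.Open, store))
using (var reader2 = new BinaryReader(stream))
{
    // read here...
}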

Related

Lock JSON File for Reading and Writing

Hi, I have an app that uses a JSON file to monitor the run intervals of emails. I use it as a simple way to store the times at which I run an email notification. But I've run into an issue where the user can run concurrent instances of this app. What happens then is that two instances can read that JSON file, and if they both read that an email has not been sent, both instances will send one, leading to duplicate emails.
Here is the code (trimmed down for reproducibility):
public class SummaryEmailStatus
{
    public DateTime DateSent { get; set; }
    public string ToolType { get; set; }
    public bool IsSent { get; set; }
}

public static class JsonUtil
{
    public static Dictionary<string, SummaryEmailStatus> EmailStatusKVP = new Dictionary<string, SummaryEmailStatus>();

    public static bool CreateEmailItemJasonFile(string jsonFileName)
    {
        var summCfgEmailStatus = new SummaryEmailStatus
        {
            DateSent = DateTime.Parse("2022-01-01"),
            ToolType = "CIM",
            IsSent = false
        };
        EmailStatusKVP.Add(SummaryEmailJsonObjs.ConfigError, summCfgEmailStatus);
        string json = JsonConvert.SerializeObject(EmailStatusKVP, Formatting.Indented);
        File.WriteAllText(jsonFileName, json);
        return true; // indicate success
    }

    public static Dictionary<string, SummaryEmailStatus> ReadEmailItemJsonFromFile(string jsonFileName)
    {
        string json = File.ReadAllText(jsonFileName);
        EmailStatusKVP = JsonConvert.DeserializeObject<Dictionary<string, SummaryEmailStatus>>(json);
        return EmailStatusKVP;
    }

    public static void WriteSummEmailStatusJsonToFile(string summaryEmailType, SummaryEmailStatus emailItem)
    {
        //string json = JsonConvert.SerializeObject(emailItem, Formatting.Indented);
        EmailStatusKVP[summaryEmailType] = emailItem;
        string json = JsonConvert.SerializeObject(EmailStatusKVP, Formatting.Indented);
        File.WriteAllText(ParserConfigFilesConstants.SummaryEmailJsonFilePath, json);
    }
}
The issue is that I am using File.WriteAllText and File.ReadAllText. Is there a way to do what I am doing, but in a way that locks the file each time CreateEmailItemJasonFile, ReadEmailItemJsonFromFile, or WriteSummEmailStatusJsonToFile is called?
I want only one instance of the console application to use this file at a time. If another instance tries to use it, it should get some "being used by another program" exception.
I saw this solution, How to lock a file with C#?, but being new to C# I am not sure how to adapt it to my own needs.
I also thought about using a lock object around my File.Write and File.Read sections, but I was under the impression that would only work for other threads within the same console application instance:
lock (fileReadLock)
{
    string json = File.ReadAllText(jsonFileName);
}
I fixed it by using FileStream:
For reading I used:
using (FileStream fileStream = new FileStream(jsonFileName, FileMode.Open, FileAccess.Read, FileShare.None))
using (StreamReader reader = new StreamReader(fileStream))
{
    json = reader.ReadToEnd();
}
And for Writing I used:
using (FileStream fs = new FileStream(ParserConfigFilesConstants.SummaryEmailJsonFilePath, FileMode.Create, FileAccess.Write, FileShare.None))
using (StreamWriter writer = new StreamWriter(fs))
{
    writer.WriteLine(json);
    writer.Flush();
}
This lets me lock other processes out of the file whenever my app is using it.
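If you would rather have a second instance wait its turn instead of failing with an IOException, a named system-wide Mutex is the usual cross-process alternative to the lock keyword (a minimal sketch; the mutex name is arbitrary, it just has to match in every instance):
using System.IO;
using System.Threading;

static string ReadJsonExclusive(string jsonFileName)
{
    // Unlike lock, a named mutex is visible to other processes on the machine.
    using (var mutex = new Mutex(false, @"Global\SummaryEmailJsonMutex"))
    {
        mutex.WaitOne(); // blocks until any other instance releases it
        try
        {
            return File.ReadAllText(jsonFileName);
        }
        finally
        {
            mutex.ReleaseMutex();
        }
    }
}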

Is there a size limit for a property to be serialized?

I'm working against an interface that requires an XML document. So far I've been able to serialize most of the objects using XmlSerializer. However, there is one property that is proving problematic. It is supposed to be a collection of objects that wrap a document. The document itself is encoded as a base64 string.
The basic structure is like this:
// snipped out of a parent object
public List<Document> DocumentCollection { get; set; }
// end snip

public class Document
{
    public string DocumentTitle { get; set; }
    public Code DocumentCategory { get; set; }
    /// <summary>
    /// Base64 encoded file
    /// </summary>
    public string BinaryDocument { get; set; }
    public string DocumentTypeText { get; set; }
}
The problem is that smaller values work fine, but if the document is too big the serializer just skips over that document item in the collection.
Is there some limitation that I'm bumping up against?
Update: I changed
public string BinaryDocument { get; set; }
to
public byte[] BinaryDocument { get; set; }
and I'm still getting the same result. The smaller document (~150 KB) serializes just fine, but the rest don't. To be clear, it's not just the value of the property that's missing; the entire containing Document object gets dropped.
UPDATE 2:
Here's the serialization code with a simple repro, taken from a console project I put together. The problem is that this code works fine in the test project. It's nearly impossible to use the actual objects in a test case because of the complexity of filling in their fields, so I cut the code down from the main application instead. The populated object goes into the serialization code with the DocumentCollection holding four Documents and comes out with one Document.
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Xml;
using System.Xml.Serialization;

namespace ConsoleApplication2
{
    class Program
    {
        static void Main(string[] args)
        {
            var container = new DocumentContainer();
            var docs = new List<Document>();
            foreach (var f in Directory.GetFiles(@"E:\Software Projects\DA\Test Documents"))
            {
                var fileStream = new MemoryStream(File.ReadAllBytes(f));
                var doc = new Document
                {
                    BinaryDocument = fileStream.ToArray(),
                    DocumentTitle = Path.GetFileName(f)
                };
                docs.Add(doc);
            }
            container.DocumentCollection = docs;
            var serializer = new XmlSerializer(typeof(DocumentContainer));
            var ms = new MemoryStream();
            var writer = XmlWriter.Create(ms);
            serializer.Serialize(writer, container);
            writer.Flush();
            ms.Seek(0, SeekOrigin.Begin);
            var reader = new StreamReader(ms, Encoding.UTF8);
            File.WriteAllText(@"C:\temp\testexport.xml", reader.ReadToEnd());
        }
    }

    public class Document
    {
        public string DocumentTitle { get; set; }
        public byte[] BinaryDocument { get; set; }
    }

    // test class
    public class DocumentContainer
    {
        public List<Document> DocumentCollection { get; set; }
    }
}
XmlSerializer has no limit on the length of a string it can serialize.
.NET, however, has a maximum string length of int.MaxValue. Furthermore, since a string is internally implemented as a contiguous memory buffer, in a 32-bit process you're likely to be unable to allocate a string anywhere near that large, due to process address-space fragmentation. And since a C# base64 string requires roughly 2.67 times the memory of the byte[] array from which it was created (1.33 for the encoding, times 2 because the .NET char type is two bytes), you might be getting an OutOfMemoryException while encoding a large binary document as a single base64 string, then swallowing and ignoring it, leaving the BinaryDocument property null.
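To make the 2.67 factor concrete, a back-of-the-envelope estimate (a sketch, not part of the original code):
// Estimated managed memory to hold a byte[] document as one base64 string:
static long EstimateBase64StringBytes(long documentBytes)
{
    long base64Chars = (documentBytes + 2) / 3 * 4; // 4 chars per 3 bytes (~1.33x)
    return base64Chars * 2;                         // 2 bytes per .NET char
}
// e.g. a 600 MB document needs ~800 million chars, i.e. ~1.6 GB for the string
// alone, on top of the original byte[] - an easy OutOfMemoryException in 32 bit.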
That being said, there is no reason for you to manually encode your binary documents into base64, because XmlSerializer does this for you automatically. I.e. if I serialize the following class:
public class Document
{
    public string DocumentTitle { get; set; }
    public Code DocumentCategory { get; set; }
    public byte[] BinaryDocument { get; set; }
    public string DocumentTypeText { get; set; }
}
I get the following XML:
<Document>
    <DocumentTitle>my title</DocumentTitle>
    <DocumentCategory>Default</DocumentCategory>
    <BinaryDocument>AAECAwQFBgcICQoLDA0ODxAREhM=</BinaryDocument>
    <DocumentTypeText>document text type</DocumentTypeText>
</Document>
As you can see, BinaryDocument is base64 encoded. Thus you should be able to keep your binary documents in a more compact byte [] representation and still get the XML output you want.
Even better, under the covers, XmlWriter uses System.Xml.Base64Encoder to do this. This class encodes its inputs in chunks, thereby avoiding the excessive memory use and potential out-of-memory exceptions described above.
I can't reproduce the problem you are having. Even with individual files as large as 267 MB to 1.92 GB, I'm not seeing any elements being skipped. The only problem I am seeing is that the temporary var ms = new MemoryStream(); exceeds its 2 GB buffer limit eventually, whereupon an exception gets thrown. I replaced this with a direct stream, and that problem went away:
using (var stream = File.Open(outputPath, FileMode.Create, FileAccess.ReadWrite))
That being said, your design will eventually run up against memory limits for a sufficiently large number of sufficiently large files, since you load all of them into memory before serializing. If this is happening, somewhere in your production code you may be catching and swallowing the OutOfMemoryException without realizing it, leading to the problem you are seeing.
As an alternative, I would suggest a streaming solution where you incrementally copy each file's contents to the XML output from within XmlSerializer by making your Document class implement IXmlSerializable:
public class Document : IXmlSerializable
{
    public string DocumentPath { get; set; }

    public string DocumentTitle
    {
        get
        {
            if (DocumentPath == null)
                return null;
            return Path.GetFileName(DocumentPath);
        }
    }

    const string DocumentTitleName = "DocumentTitle";
    const string BinaryDocumentName = "BinaryDocument";

    #region IXmlSerializable Members

    System.Xml.Schema.XmlSchema IXmlSerializable.GetSchema()
    {
        return null;
    }

    void ReadXmlElement(XmlReader reader)
    {
        if (reader.Name == DocumentTitleName)
            DocumentPath = reader.ReadElementContentAsString();
    }

    void IXmlSerializable.ReadXml(XmlReader reader)
    {
        reader.ReadXml(null, ReadXmlElement);
    }

    void IXmlSerializable.WriteXml(XmlWriter writer)
    {
        writer.WriteElementString(DocumentTitleName, DocumentTitle ?? "");
        if (DocumentPath != null)
        {
            try
            {
                using (var stream = File.OpenRead(DocumentPath))
                {
                    // Write the start element if the file was successfully opened.
                    writer.WriteStartElement(BinaryDocumentName);
                    try
                    {
                        var buffer = new byte[6 * 1024];
                        int read;
                        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                            writer.WriteBase64(buffer, 0, read);
                    }
                    finally
                    {
                        // Write the end element even if an error occurred while streaming the file.
                        writer.WriteEndElement();
                    }
                }
            }
            catch (Exception ex)
            {
                // You could log the exception as an element or as a comment, as you prefer.
                // Log as a comment:
                writer.WriteComment("Caught exception with message: " + ex.Message);
                writer.WriteComment("Exception details:");
                writer.WriteComment(ex.ToString());
                // Log as an element:
                writer.WriteElementString("ExceptionMessage", ex.Message);
                writer.WriteElementString("ExceptionDetails", ex.ToString());
            }
        }
    }

    #endregion
}

// test class
public class DocumentContainer
{
    public List<Document> DocumentCollection { get; set; }
}

public static class XmlSerializationExtensions
{
    public static void ReadXml(this XmlReader reader, Action<IList<XAttribute>> readXmlAttributes, Action<XmlReader> readXmlElement)
    {
        if (reader.NodeType != XmlNodeType.Element)
            throw new InvalidOperationException("reader.NodeType != XmlNodeType.Element");
        if (readXmlAttributes != null)
        {
            var attributes = new List<XAttribute>(reader.AttributeCount);
            while (reader.MoveToNextAttribute())
            {
                attributes.Add(new XAttribute(XName.Get(reader.Name, reader.NamespaceURI), reader.Value));
            }
            // Move the reader back to the element node.
            reader.MoveToElement();
            readXmlAttributes(attributes);
        }
        if (reader.IsEmptyElement)
        {
            reader.Read();
            return;
        }
        reader.ReadStartElement(); // Advance to the first sub-element of the wrapper element.
        while (reader.NodeType != XmlNodeType.EndElement)
        {
            if (reader.NodeType != XmlNodeType.Element)
                // Comment, whitespace
                reader.Read();
            else
            {
                using (var subReader = reader.ReadSubtree())
                {
                    while (subReader.NodeType != XmlNodeType.Element) // Read past XmlNodeType.None
                        if (!subReader.Read())
                            break;
                    if (readXmlElement != null)
                        readXmlElement(subReader);
                }
                reader.Read();
            }
        }
        // Move past the end of the wrapper element.
        reader.ReadEndElement();
    }
}
Then use it as follows:
public static void SerializeFilesToXml(string directoryPath, string xmlPath)
{
    var docs = from file in Directory.GetFiles(directoryPath)
               select new Document { DocumentPath = file };
    var container = new DocumentContainer { DocumentCollection = docs.ToList() };
    using (var stream = File.Open(xmlPath, FileMode.Create, FileAccess.ReadWrite))
    using (var writer = XmlWriter.Create(stream, new XmlWriterSettings { Indent = true, IndentChars = " " }))
    {
        new XmlSerializer(container.GetType()).Serialize(writer, container);
    }
    Debug.WriteLine("Wrote " + xmlPath);
}
Using the streaming solution, when serializing 4 files of around 250 MB each, my memory use went up by 0.8 MB. Using the original classes, my memory went up by 1022 MB.
Update
If you need to write your XML to a memory stream, be aware that the C# MemoryStream has a hard maximum stream length of int.MaxValue (i.e. 2 GB), because the underlying storage is simply a byte array. In a 32-bit process the effective maximum length will be much smaller; see OutOfMemoryException while populating MemoryStream: 256MB allocation on 16GB system.
To programmatically check to see if your process is actually 32 bit, see How to determine programmatically whether a particular process is 32-bit or 64-bit. To change to 64 bit, see What is the purpose of the “Prefer 32-bit” setting in Visual Studio 2012 and how does it actually work?.
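On .NET 4.0 and later there is also a direct property for the current process (a one-liner, simpler than the linked techniques when you only care about your own process):
// True when the current process is running as 64-bit.
bool is64Bit = Environment.Is64BitProcess;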
If you are sure you are running in 64 bit mode and are still exceeding the hard size limits of a MemoryStream, perhaps see alternative to MemoryStream for large data volumes or MemoryStream replacement?.

Windows Phone 8: saving and loading a list of objects

I would like to save and load my history list, which is filled with HistoryEntry objects. I am trying to do this through isolated storage, so that when the user opens and closes the app none of their browsing history is lost: it is saved, and can be loaded once the app is opened again. I had a look around and saw this question on Stack Overflow, and I tried to follow it but came across some errors: Isolated Storage & Saving Multiple Objects.
Here is the code.
The HistoryEntry class:
public string URL { get; set; }
public string timestamp { get; set; }
public string date { get; set; }
The MainPage code:
using System.Xml.Serialization;
using System.Collections.ObjectModel;
using System.IO;

List<HistoryEntry> urls = new List<HistoryEntry>();
public HistoryEntry selectedHistory;

public MainPage()
{
    InitializeComponent();
    Deserialize<>(urls, ???);
}

void Browser_Navigated(object sender, System.Windows.Navigation.NavigationEventArgs e)
{
    HistoryEntry urlObj = new HistoryEntry();
    urlObj.URL = url;
    urlObj.timestamp = DateTime.Now.ToString("HH:mm");
    urlObj.date = url.Remove(url.LastIndexOf('.'));
    urls.Add(urlObj);
    textBox1.Text = url;
    listBox.ItemsSource = null;
    listBox.ItemsSource = urls;
    Serialize(urlObj, urls);
}
private static void Serialize(string fileName, object source)
{
    var userStore = IsolatedStorageFile.GetUserStoreForApplication();
    using (var stream = new IsolatedStorageFileStream(fileName, FileMode.Create, userStore))
    {
        XmlSerializer serializer = new XmlSerializer(source.GetType());
        serializer.Serialize(stream, source);
    }
}

public static void Deserialize<T>(ObservableCollection<T> list, string filename)
{
    list = new ObservableCollection<T>();
    var userStore = IsolatedStorageFile.GetUserStoreForApplication();
    if (userStore.FileExists(filename))
    {
        using (var stream = new IsolatedStorageFileStream(filename, FileMode.Open, userStore))
        {
            XmlSerializer serializer = new XmlSerializer(list.GetType());
            var items = (ObservableCollection<T>)serializer.Deserialize(stream);
            foreach (T item in items)
            {
                list.Add(item);
            }
        }
    }
}
Serialize is called with invalid arguments, and the same happens when Deserialize is called. What are the appropriate values to pass to these methods, and will this successfully save and load the history objects?
Thank you in advance :)
If you need any more details please comment and I will be happy to explain in further detail :)
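For reference, the compile errors come from the argument order and the open generic type: Serialize expects the file name first and the object second, and Deserialize needs a concrete type argument plus a collection whose type matches. A minimal corrected sketch (the file name history.xml is hypothetical):
// Field type changed to match Deserialize's parameter:
ObservableCollection<HistoryEntry> urls = new ObservableCollection<HistoryEntry>();

public MainPage()
{
    InitializeComponent();
    Deserialize<HistoryEntry>(urls, "history.xml");
}

// ...and in Browser_Navigated, file name first, whole collection second:
Serialize("history.xml", urls);
Note that Deserialize also reassigns its list parameter (list = new ObservableCollection<T>();), so the caller's collection never receives the loaded items; dropping that line, or returning the collection instead, is needed for the loaded history to actually appear.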
Did you try this approach using MemoryStream? It worked for Windows 8:
To Serialize:
MemoryStream sessionData = new MemoryStream();
DataContractSerializer serializer = new DataContractSerializer(typeof(ObservableCollection<NewsByTag>));
serializer.WriteObject(sessionData, data); // 'data' is the ObservableCollection<NewsByTag> being saved
StorageFile file = await ApplicationData.Current.LocalFolder.CreateFileAsync(sFileName);
using (Stream fileStream = await file.OpenStreamForWriteAsync())
{
    sessionData.Seek(0, SeekOrigin.Begin);
    await sessionData.CopyToAsync(fileStream);
    await fileStream.FlushAsync();
}
To Deserialize it back:
StorageFile file = await ApplicationData.Current.LocalFolder.GetFileAsync(sFileName);
using (IInputStream inStream = await file.OpenSequentialReadAsync())
{
    DataContractSerializer serializer = new DataContractSerializer(typeof(ObservableCollection<NewsByTag>));
    var data = (ObservableCollection<NewsByTag>)serializer.ReadObject(inStream.AsStreamForRead());
}
Hope it helps!

How can I serialize a class to XML using the Windows Phone 8 SDK?

I've got one field (_XMLPlaylist) that I want to serialize to an XML file, as shown in the code:
private void btn_Save_Click(object sender, RoutedEventArgs e)
{
    _Playlist.Pl_Name = tb_Name.Text.ToString();
    _XMLPlaylist.Playlists.Add(_Playlist);
    IsolatedStorageFile isoStore = IsolatedStorageFile.GetUserStoreForApplication();
    using (IsolatedStorageFileStream isoStream = new IsolatedStorageFileStream(StudiCast.Resources.AppResources.Playlists, FileMode.CreateNew, isoStore))
    {
        using (StreamWriter writer = new StreamWriter(isoStream))
        {
            var ser = new XmlSerializer(typeof(XMLPlaylist));
            ser.Serialize(writer, _XMLPlaylist);
            writer.Close();
        }
        isoStream.Close();
    }
}
The Type XMLPlaylist looks as follows:
class XMLPlaylist
{
    public XMLPlaylist()
    {
        Playlists = new List<Playlist>();
    }
    public List<Playlist> Playlists;
}
And the Playlist class looks like this:
class Playlist
{
    public Playlist()
    {
        Casts = new List<Cast>();
    }
    public string Pl_Name;
    public List<Cast> Casts;
}
'Cast' holds two strings. In .NET 4 I used the [Serializable] attribute in front of the class name, but that attribute isn't available anymore.
I need help fast, please!
Edit:
Error at 'var ser = new XmlSerializer(typeof(XMLPlaylist));':
An unhandled exception of type System.InvalidOperationException occurred in System.Xml.Serialization.ni.dll.
XmlSerializer can only serialize public classes - make your XMLPlaylist class public (and the same goes for all properties/classes you want to serialize, so Playlist should be public as well).
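Applied to the classes in the question, that looks like this (only the visibility changes matter here; the Cast members are hypothetical, since the question only says Cast owns two strings):
public class XMLPlaylist
{
    public XMLPlaylist()
    {
        Playlists = new List<Playlist>();
    }
    public List<Playlist> Playlists;
}

public class Playlist
{
    public Playlist()
    {
        Casts = new List<Cast>();
    }
    public string Pl_Name;
    public List<Cast> Casts;
}

public class Cast
{
    public string Title; // hypothetical field name
    public string Url;   // hypothetical field name
}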

CsvHelper not writing anything to memory stream

I have the following method:
public byte[] WriteCsvWithHeaderToMemory<T>(IEnumerable<T> records) where T : class
{
    using (var memoryStream = new MemoryStream())
    using (var streamWriter = new StreamWriter(memoryStream))
    using (var csvWriter = new CsvWriter(streamWriter))
    {
        csvWriter.WriteRecords<T>(records);
        return memoryStream.ToArray();
    }
}
This is called with a list of objects - eventually it will come from a database, but since something is not working I'm just populating a static collection. The objects being passed look like this:
using CsvHelper.Configuration;

namespace Application.Models.ViewModels
{
    public class Model
    {
        [CsvField(Name = "Field 1", Ignore = false)]
        public string Field1 { get; set; }

        [CsvField(Name = "Statistic 1", Ignore = false)]
        public int Stat1 { get; set; }

        [CsvField(Name = "Statistic 2", Ignore = false)]
        public int Stat2 { get; set; }

        [CsvField(Name = "Statistic 3", Ignore = false)]
        public int Stat3 { get; set; }

        [CsvField(Name = "Statistic 4", Ignore = false)]
        public int Stat4 { get; set; }
    }
}
}
What I'm trying to do is write a collection to a CSV for download in an MVC application. Every time I call the method, though, the MemoryStream comes back with zero length, and nothing is written to it. I've used this approach before, but for some reason it's just not working, and I'm somewhat confused. Can anyone point out what I've done wrong here?
Cheers
You already have a using block, which is great. That will flush your writer for you. You can just change your code slightly for it to work:
using (var memoryStream = new MemoryStream())
{
    using (var streamWriter = new StreamWriter(memoryStream))
    using (var csvWriter = new CsvWriter(streamWriter))
    {
        csvWriter.WriteRecords<T>(records);
    } // StreamWriter gets flushed here.
    return memoryStream.ToArray();
}
If you turn AutoFlush on, you need to be careful. This will flush after every write. If your stream is a network stream and over the wire, it will be very slow.
Put csvWriter.Flush(); before you return to flush the writer/stream.
EDIT: Per Jack's response, it should be the StreamWriter that gets flushed, not the CsvWriter: streamWriter.Flush();. Leaving the original solution, but adding this correction.
EDIT 2: My preferred answer is https://stackoverflow.com/a/22997765/1795053 - let the using statements do the heavy lifting for you.
Putting all these together (and the comments for corrections), including resetting the memory stream position, the final solution for me was:
using (MemoryStream ms = new MemoryStream())
{
    using (TextWriter tw = new StreamWriter(ms))
    using (CsvWriter csv = new CsvWriter(tw, CultureInfo.InvariantCulture))
    {
        csv.WriteRecords(errors); // converts error records to CSV
        tw.Flush(); // flush the buffered text to the stream
        ms.Seek(0, SeekOrigin.Begin); // reset stream position
        Attachment a = new Attachment(ms, "errors.csv"); // create attachment from the stream
        // I sent an email here with the CSV attached.
    }
}
In case this helps someone else!
There is no Flush on the CsvWriter; the flush happens on the StreamWriter. When
csvWriter.Dispose();
is called, it will flush the stream.
Another approach is to set
streamWriter.AutoFlush = true;
which will automatically flush the stream after every write.
Here is a working example:
void Main()
{
    var records = new List<dynamic>
    {
        new { Id = 1, Name = "one" },
        new { Id = 2, Name = "two" },
    };
    Console.WriteLine(records.ToCsv());
}

public static class Extensions
{
    public static string ToCsv<T>(this IEnumerable<T> collection)
    {
        using (var memoryStream = new MemoryStream())
        {
            using (var streamWriter = new StreamWriter(memoryStream))
            using (var csvWriter = new CsvWriter(streamWriter))
            {
                csvWriter.WriteRecords(collection);
            } // StreamWriter gets flushed here.
            return Encoding.ASCII.GetString(memoryStream.ToArray());
        }
    }
}
Based on this answer.
using CsvHelper;
using CsvHelper.Configuration.Attributes;

public class TwentyFoursStock
{
    [Name("sellerSku")]
    public string ProductSellerSku { get; set; }

    [Name("shippingPoint")]
    public string ProductShippingPoint { get; set; }
}

// 'stockRecords' is an IEnumerable<TwentyFoursStock> containing the rows to write.
using (var writer = new StreamWriter("file.csv"))
using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
{
    csv.WriteRecords(stockRecords);
}
