Exception when writing data with CsvHelper - c#

I'm trying to write data to a CSV-file using CsvHelper. However, I always get the following exception:
CsvHelper.Configuration.ConfigurationException: "Types that inherit
IEnumerable cannot be auto mapped. Did you accidentally call GetRecord
or WriteRecord which acts on a single record instead of calling
GetRecords or WriteRecords which acts on a list of records?"
This is my code (C#):
TextWriter outfile = new StreamWriter("blatest.csv");
List<string> test = new List<string>
{
    "hello",
    "world"
};
CsvWriter csv = new CsvWriter(outfile);
csv.WriteRecords(test);
I would like to write a List<string> or (ideally) a List<Dictionary<string, string>> to CSV. What would be the correct code for this? And how can I set the header row?
Any help is appreciated. I really can't wrap my head around this.

As for the error: it occurs because string itself implements IEnumerable (a string is a sequence of char). Generally, with WriteRecords you pass in an IEnumerable of custom objects, one object per row.
You could try another way (Example)
using (var stream = new MemoryStream())
using (var reader = new StreamReader(stream))
using (var writer = new StreamWriter(stream))
using (var csvWriter = new CsvHelper.CsvWriter(writer))
{
    //csvWriter.Configuration.HasHeaderRecord = false;
    foreach (var s in test)
    {
        csvWriter.WriteField(s);
    }
    writer.Flush();
    stream.Position = 0;
    reader.ReadToEnd(); // dump it where you want
}
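For the List<Dictionary<string, string>> part of the question, one approach (a sketch, not the only way; note that recent CsvHelper versions also require a CultureInfo argument in the CsvWriter constructor) is to write the header row from the first dictionary's keys with WriteField, then one record per dictionary:

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using CsvHelper;

var records = new List<Dictionary<string, string>>
{
    new Dictionary<string, string> { ["name"] = "hello", ["value"] = "world" }
};

using (var writer = new StreamWriter("blatest.csv"))
using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
{
    // Header row from the keys of the first record
    foreach (var key in records[0].Keys)
        csv.WriteField(key);
    csv.NextRecord();

    // One line per dictionary, fields emitted in the same key order as the header
    foreach (var record in records)
    {
        foreach (var key in records[0].Keys)
            csv.WriteField(record[key]);
        csv.NextRecord();
    }
}
```

This assumes every dictionary in the list has the same keys; if they can differ, you would need to collect the union of keys first and write empty fields for missing ones.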

Related

Deserialize object one by one from file .Net

I'm trying to deserialize a list of heavy objects from a JSON file. I don't want to deserialize it the classic way, directly into a list, because that would expose me to an OutOfMemory exception. So I'm looking for a way to handle the objects one by one, storing them in the database one at a time, while staying memory-safe.
I already handle the serialization and it works well, but I'm facing some difficulties with deserialization.
Any ideas?
Thanks in advance.
// Serialization
using (var fileStream = new FileStream(DirPath + "/TPV.Json", FileMode.Create))
using (var sw = new StreamWriter(fileStream))
using (var jw = new JsonTextWriter(sw))
{
    jw.WriteStartArray();
    using (var _Database = new InspectionBatimentsDataContext(TheBrain.DBClient.ConnectionString))
    {
        var ser = new JsonSerializer();
        foreach (var TPVId in TPVIds)
        {
            var pic = (from p in _Database.TPV
                       where p.Release == TPVId.Release
                          && p.InterventionId == TPVId.InterventionId
                       select p).FirstOrDefault();
            ser.Serialize(jw, pic);
            jw.Flush();
        }
    }
    jw.WriteEndArray();
}
I finally found a way to do it, by inserting a custom separator between objects during serialization. For deserialization, I simply read the JSON file as a string until I find my custom separator, deserialize the string read so far, and repeat in a loop. It's not a perfect answer because it breaks the JSON format of my files, but that's not a constraint in my case.
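A separator-free alternative that keeps the files as valid JSON (a sketch, assuming the file holds a standard JSON array and using Json.NET's JsonTextReader; TPV stands in for your record type) is to stream through the array and deserialize one element at a time, so only the current object is held in memory:

```csharp
using System.IO;
using Newtonsoft.Json;

using (var fileStream = File.OpenRead(DirPath + "/TPV.Json"))
using (var sr = new StreamReader(fileStream))
using (var jr = new JsonTextReader(sr))
{
    var ser = new JsonSerializer();
    while (jr.Read())
    {
        // Each time the reader lands on the start of an array element,
        // materialize just that one object
        if (jr.TokenType == JsonToken.StartObject)
        {
            var item = ser.Deserialize<TPV>(jr);
            // store 'item' in the database here; it goes out of scope afterwards
        }
    }
}
```

Deserialize consumes exactly one object and leaves the reader positioned at its end, so the loop naturally advances to the next element.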

how to serialize a YamlDocument to yaml string

I use YamlDotNet and I have a YamlDocument. Now I want to convert it to its YAML text representation in memory, but I don't see how to achieve that.
var yaml = new YamlDocument(new YamlMappingNode());
yaml.Add("one", "other");
var text = yaml.ToYamlText();
and in text I should get something like:
one: "other"
I tried with the Serializer class but with no success.
OK, so I found the solution in the unit tests of the source code:
var yaml = new YamlDocument(new YamlMappingNode());
yaml.Add("one", "other");
var yamlStream = new YamlStream(yaml);
var buffer = new StringBuilder();
using (var writer = new StringWriter(buffer))
{
    yamlStream.Save(writer);
    var yamlText = writer.ToString();
}
Anyway, I now have another problem: I need all my values to be surrounded by double quotes. In another application I was using a QuoteSurroundingEventEmitter : ChainedEventEmitter with object graph serialization, but with yamlStream.Save() I don't see how to implement this mechanism.
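There is no emitter hook on yamlStream.Save(), but if the data can be serialized as an object graph instead of a YamlStream, a sketch along the lines of the emitter mentioned above (assuming YamlDotNet's SerializerBuilder and event-emitter API; the class body here is a guess at what such an emitter typically looks like) would be:

```csharp
using System.Collections.Generic;
using YamlDotNet.Core;
using YamlDotNet.Core.Events;
using YamlDotNet.Serialization;
using YamlDotNet.Serialization.EventEmitters;

// Forces every scalar value to be emitted with double quotes
class QuoteSurroundingEventEmitter : ChainedEventEmitter
{
    public QuoteSurroundingEventEmitter(IEventEmitter nextEmitter) : base(nextEmitter) { }

    public override void Emit(ScalarEventInfo eventInfo, IEmitter emitter)
    {
        eventInfo.Style = ScalarStyle.DoubleQuoted;
        base.Emit(eventInfo, emitter);
    }
}

var serializer = new SerializerBuilder()
    .WithEventEmitter(next => new QuoteSurroundingEventEmitter(next))
    .Build();
var yamlText = serializer.Serialize(new Dictionary<string, string> { ["one"] = "other" });
```

This quotes keys as well as values; distinguishing the two inside Emit requires tracking emitter state and is more involved.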

add a line to SPFile object

I simply want to add a line to an SPFile object, which is a simple txt file.
Is there a simple way to do this?
Thanks
EDIT: this is what I have for the moment:
public static void addLine(SPFile file, string line)
{
    using (System.IO.StreamWriter strWriter = new System.IO.StreamWriter(file.OpenBinaryStream()))
    {
        strWriter.WriteLine(line);
    }
}
I don't get any error here, but the file doesn't get saved. I've tried something like:
file.SaveBinary( args )
But I don't know what to put in args.
If you can help me, thanks.
You need SPFile.OpenBinaryStream to read and one of the SPFile.SaveBinary overloads to write back. Read the existing content through a TextReader created over the stream (e.g. TextReader.ReadToEnd), append your text, write the result to a MemoryStream through a writer, and save that buffer.
Warning: non-compiled code below:
using (var readStream = file.OpenBinaryStream())
using (var reader = new StreamReader(readStream))
{
    var allText = reader.ReadToEnd();
    using (var writeStream = new MemoryStream())
    {
        using (var writer = new StreamWriter(writeStream))
        {
            writer.Write(allText);
            writer.Write(extraText);
            writer.Flush();
        }
        file.SaveBinary(writeStream.ToArray());
    }
}

StreamWriter to Reference Type

Persisting a reference type with the StreamWriter is easy enough, and retrieving this data back into a string is just as easy with the StreamReader.
But how can I convert the string the StreamReader returns back into my custom reference type?
Persisting reference types (objects of classes) is called serialization. Reverse process is called deserialization. Both can be done easily in .net using XmlSerializer:
XmlSerializer serializer = new XmlSerializer(typeof(OrderedItem));
OrderedItem item = new OrderedItem();
//do stuff
using (StreamWriter sw = new StreamWriter(filename))
serializer.Serialize(sw, item);
And the reverse:
XmlSerializer serializer = new XmlSerializer(typeof(OrderedItem));
OrderedItem item = null;
using (StreamReader sr = new StreamReader(filename))
item = (OrderedItem)serializer.Deserialize(sr);

how to read the txt file from the database(line by line)

I have stored a txt file in a SQL Server database.
I need to read the txt file line by line to get its content.
My code:
DataTable dtDeleteFolderFile = new DataTable();
dtDeleteFolderFile = objutility.GetData("GetTxtFileonFileName", new object[] { ddlSelectFile.SelectedItem.Text }).Tables[0];
foreach (DataRow dr in dtDeleteFolderFile.Rows)
{
    name = dr["FileName"].ToString();
    records = Convert.ToInt32(dr["NoOfRecords"].ToString());
    bytes = (Byte[])dr["Data"];
}
FileStream readfile = new FileStream(Server.MapPath("txtfiles/" + name), FileMode.Open);
StreamReader streamreader = new StreamReader(readfile);
string line = "";
line = streamreader.ReadLine();
But here I have used a FileStream to read from a particular path, whereas I saved the txt file into my database in byte format. How can I read the txt file content from the byte[] value instead of from a path?
Given the fact that you have the file in a byte array, you can make use of the MemoryStream class.
Something like
using (MemoryStream m = new MemoryStream(buffer))
using (StreamReader sr = new StreamReader(m))
{
while (!sr.EndOfStream)
{
string s = sr.ReadLine();
}
}
Also make sure to use the using Statement (C# Reference):
Defines a scope, outside of which an object or objects will be disposed. The using statement allows the programmer to specify when objects that use resources should release them. The object provided to the using statement must implement the IDisposable interface. This interface provides the Dispose method, which should release the object's resources.
You could try something like this at the end of your foreach:
String txtFileContent = Encoding.Unicode.GetString((Byte[])dr["Data"]);
