CsvHelper "HasHeaderRecord" get only? How to append without header? - C#

It seems that in a recent update the entire configuration class became read-only, so I can no longer set HasHeaderRecord to false to append without a header.
How am I supposed to append without a header now? My last working implementation is below:
internal void Write(Record record)
{
    using (var writer = new StreamWriter(_filepath, true))
    using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
    {
        csv.Configuration.HasHeaderRecord = false;
        csv.WriteRecord(record);
        csv.NextRecord();
    }
}

You now have to pass a CsvConfiguration to the CsvWriter constructor and set the option there. Note that CsvConfiguration lives in the CsvHelper.Configuration namespace.
var config = new CsvConfiguration(CultureInfo.InvariantCulture)
{
    HasHeaderRecord = false
};
using (var writer = new StreamWriter(_filepath, true))
using (var csv = new CsvWriter(writer, config))
{
    csv.WriteRecord(record);
    csv.NextRecord();
}
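Putting it together, an appending writer might look like the sketch below. This is my own elaboration, not part of the answer: the `Record` class and the `File.Exists` check (write the header only when the file is new, then append header-less rows) are assumptions about the intended behavior.

```csharp
using System;
using System.Globalization;
using System.IO;
using CsvHelper;
using CsvHelper.Configuration;

public class Record
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CsvAppender
{
    public static void Append(string path, Record record)
    {
        var config = new CsvConfiguration(CultureInfo.InvariantCulture)
        {
            // Suppress automatic headers; we emit one manually, once, below.
            HasHeaderRecord = false
        };
        // Assumption: a header is wanted only when the file is first created.
        bool writeHeader = !File.Exists(path);

        using (var writer = new StreamWriter(path, append: true))
        using (var csv = new CsvWriter(writer, config))
        {
            if (writeHeader)
            {
                csv.WriteHeader<Record>();
                csv.NextRecord();
            }
            csv.WriteRecord(record);
            csv.NextRecord();
        }
    }
}
```

Each call opens, appends one row, and closes the file; for bulk writes you would keep the writer open across records instead.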

Related

CSV appears to be corrupt on Double quotes in Headers - C#

I was trying to read a CSV file in C#.
I tried the File.ReadAllLines(path).Select(a => a.Split(';')) approach, but it does not work when a cell contains \n (a multi-line value).
So I tried the following:
using LumenWorks.Framework.IO.Csv;

var csvTable = new DataTable();
using (TextReader fileReader = File.OpenText(path))
using (var csvReader = new CsvReader(fileReader, false))
{
    csvTable.Load(csvReader);
}
for (int i = 0; i < csvTable.Rows.Count; i++)
{
    if (!(csvTable.Rows[i][0] is DBNull))
    {
        var row1 = csvTable.Rows[i][0];
    }
    if (!(csvTable.Rows[i][1] is DBNull))
    {
        var row2 = csvTable.Rows[i][1];
    }
}
The issue is that the above code throws this exception:
The CSV appears to be corrupt near record '0' field '5' at position '63'
This is because the CSV's header contains doubled double quotes, as below:
"Header1",""Header2""
Is there a way to ignore the extra double quotes and process the CSV?
Update:
I have tried TextFieldParser as below:
public static void GetCSVData()
{
    using (var parser = new TextFieldParser(path))
    {
        parser.HasFieldsEnclosedInQuotes = false;
        parser.Delimiters = new[] { "," };
        while (parser.PeekChars(1) != null)
        {
            string[] fields = parser.ReadFields();
            foreach (var field in fields)
            {
                Console.Write(field + " ");
            }
            Console.WriteLine(Environment.NewLine);
        }
    }
}
Any help is appreciated.
Hope this works!
First replace the doubled double quotes in the CSV:
using (FileStream fs = new FileStream(Path, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
{
    StreamReader sr = new StreamReader(fs);
    string contents = sr.ReadToEnd();
    // replace "" with "
    contents = contents.Replace("\"\"", "\"");
    // go back to the beginning of the stream
    fs.Seek(0, SeekOrigin.Begin);
    // adjust the length to make sure all of the original
    // content is overwritten
    fs.SetLength(contents.Length);
    StreamWriter sw = new StreamWriter(fs);
    sw.Write(contents);
    sw.Close();
}
Then use the same LumenWorks CsvReader as before:
using LumenWorks.Framework.IO.Csv;

var csvTable = new DataTable();
using (TextReader fileReader = File.OpenText(path))
using (var csvReader = new CsvReader(fileReader, false))
{
    csvTable.Load(csvReader);
}
Thanks.
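One caveat with the approach above: replacing every `""` in the whole file will also mangle legitimately escaped quotes inside data fields. If only the header line is malformed, a narrower fix (my own sketch, not part of the answer above) is to rewrite just the first line and leave the data rows untouched:

```csharp
using System.IO;

public static class CsvHeaderFixer
{
    // Collapses doubled quotes like ""Header2"" into "Header2" on the
    // first (header) line only; every other physical line is written
    // back unchanged, so escaped quotes in data fields survive.
    public static void FixHeader(string path)
    {
        var lines = File.ReadAllLines(path);
        if (lines.Length == 0) return;
        lines[0] = lines[0].Replace("\"\"", "\"");
        File.WriteAllLines(path, lines);
    }
}
```

Note this assumes the header itself fits on one physical line; multi-line data cells are unaffected because each physical line after the first is copied verbatim.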

CSV file being saved in software folder - CsvHelper C#

So I have a program that extracts a CSV table from a CSV file, structured like:
string Output_Path = Output_Path_value(output_path);
string Conversion_Logic = Conversion_Logic_value(conversion_logic); // Conversion_Logic

var textwriter = Console.Out;
using (var csvWriter = new CsvWriter(textwriter, CultureInfo.InvariantCulture))
using (var writer = new StreamWriter(Output_Path + @"\" + Conversion_Logic + "_" + fileName))
using (var csv_write = new CsvWriter(writer, CultureInfo.InvariantCulture))
using (var reader = new StringReader(sb.ToString()))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
    csv.Configuration.MissingFieldFound = null;
    csv.Configuration.IgnoreBlankLines = true;
    csv.Configuration.RegisterClassMap<CsvTable_Headers_Map>();
    csv.Read();
    csv.ReadHeader();
    csv.Configuration.ShouldSkipRecord = row => row[0].StartsWith("__") || row[0].StartsWith("T");
    var records = csv.GetRecords<CsvTable_Headers>();
    csv_write.WriteRecords(records);
    //csvWriter.WriteRecords(records); // this has to be before, or not exist, for the file to save correctly
}
The purpose of this code is to save the CSV table to a separate file at the specified output path. Everything here works, however the file is also saved to the Debug folder, when it shouldn't be. How should I structure the code so that it saves only to the output path?
I am also using the open source library CsvHelper.

CsvHelper CsvWriter is empty when source DataTable contains less than 12 rows

When writing to a stream (maybe other destinations too), CsvHelper does not return anything if my DataTable contains fewer than 12 rows. I tested by adding rows one by one until I got a result in the myCsvAsString variable.
Has anyone run into this problem? Here is the code I am using to reproduce it:
var stream = new MemoryStream();
using (var writer = new StreamWriter(stream))
using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
{
    if (includeHeaders)
    {
        foreach (DataColumn column in dataTable.Columns)
        {
            csvWriter.WriteField(column.ColumnName);
        }
        csvWriter.NextRecord();
    }
    foreach (DataRow row in dataTable.Rows)
    {
        for (var i = 0; i < dataTable.Columns.Count; i++)
        {
            csvWriter.WriteField(row[i]);
        }
        csvWriter.NextRecord();
    }
    csvWriter.Flush();
    stream.Position = 0;
    StreamReader reader = new StreamReader(stream);
    string myCsvAsString = reader.ReadToEnd();
}
OK, I found the issue: I was flushing the csvWriter but not the underlying StreamWriter.
I added writer.Flush() just after csvWriter.Flush() and the stream is now complete.
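The underlying cause is not CsvHelper-specific: StreamWriter buffers its output (1 KB by default), so small payloads never reach the stream until the writer is flushed or disposed. A stdlib-only sketch of the same pitfall:

```csharp
using System.IO;

public static class FlushDemo
{
    public static string RoundTrip()
    {
        var stream = new MemoryStream();
        var writer = new StreamWriter(stream);
        writer.Write("Name\r\nAlice\r\n");

        // Without this Flush(), the text can still sit in the StreamWriter's
        // internal buffer and the MemoryStream stays empty -- which is why
        // short tables appeared to produce no output above.
        writer.Flush();

        stream.Position = 0;
        return new StreamReader(stream).ReadToEnd();
    }
}
```

Alternatively, read the stream only after the using block has disposed the writers; Dispose flushes automatically.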

Read Excel files dynamically, not depending on rows, and write JSON in C#

I'm trying to generate a JSON file from Excel files. I have different Excel files and I would like to read them and generate a JSON file from each. I imagine it must be quite easy, but I'm having some trouble.
I'm using the Excel reader tool my leader says we should use, and I tried following this post: https://www.hanselman.com/blog/ConvertingAnExcelWorksheetIntoAJSONDocumentWithCAndNETCoreAndExcelDataReader.aspx
I always get the ReadTimeout and WriteTimeout error. It also never reads my Excel file; it always writes null to my JSON document.
public static IActionResult GetData(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
    var inFilePath = "C:\\Users\\a\\Desktop\\exelreader\\Wave.xlsx";
    var outFilePath = "C:\\Users\\a\\Desktop\\exelreader\\text.json";
    using (var inFile = File.Open(inFilePath, FileMode.Open, FileAccess.Read))
    using (var outFile = File.CreateText(outFilePath))
    {
        using (var reader = ExcelReaderFactory.CreateReader(inFile, new ExcelReaderConfiguration()
            { FallbackEncoding = Encoding.GetEncoding(1252) }))
        using (var writer = new JsonTextWriter(outFile))
        {
            writer.Formatting = Formatting.Indented; // I likes it tidy
            writer.WriteStartArray();
            reader.Read(); // SKIP FIRST ROW, it's TITLES.
            do
            {
                while (reader.Read())
                {
                    // Peek ahead? Bail before we start anything so we don't get an empty object.
                    var status = reader.GetString(1);
                    if (string.IsNullOrEmpty(status)) break;
                    writer.WriteStartObject();
                    writer.WritePropertyName("Source");
                    writer.WriteValue(reader.GetString(1));
                    writer.WritePropertyName("Event");
                    writer.WriteValue(reader.GetString(2));
                    writer.WritePropertyName("Campaign");
                    writer.WriteValue(reader.GetString(3));
                    writer.WritePropertyName("EventDate");
                    writer.WriteValue(reader.GetString(4));
                    //writer.WritePropertyName("FirstName");
                    //writer.WriteValue(reader.GetString(5).ToString());
                    //writer.WritePropertyName("LastName");
                    //writer.WriteValue(reader.GetString(6).ToString());
                    writer.WriteEndObject();
                }
            } while (reader.NextResult());
            writer.WriteEndArray();
        }
    }
    // never mind this return
    return null;
}
Can anybody help with this? The idea is to read the first row of each Excel file as headers and the other rows as values, so I can write the JSON.
For converting Excel data to JSON, you could read the Excel data as a DataSet and then serialize the DataSet to JSON.
Try the code below:
public async Task<IActionResult> ConvertExcelToJson()
{
    var inFilePath = @"xx\Wave.xlsx";
    var outFilePath = @"xx\text.json";
    using (var inFile = System.IO.File.Open(inFilePath, FileMode.Open, FileAccess.Read))
    using (var outFile = System.IO.File.CreateText(outFilePath))
    {
        using (var reader = ExcelReaderFactory.CreateReader(inFile, new ExcelReaderConfiguration()
            { FallbackEncoding = Encoding.GetEncoding(1252) }))
        {
            var ds = reader.AsDataSet(new ExcelDataSetConfiguration()
            {
                ConfigureDataTable = (_) => new ExcelDataTableConfiguration()
                {
                    UseHeaderRow = true
                }
            });
            var table = ds.Tables[0];
            var json = JsonConvert.SerializeObject(table, Formatting.Indented);
            outFile.Write(json);
        }
    }
    return Ok();
}
For AsDataSet, install the ExcelDataReader.DataSet package. If you get an error related to Encoding.GetEncoding(1252), add the line below to Startup.cs:
System.Text.Encoding.RegisterProvider(System.Text.CodePagesEncodingProvider.Instance);
Reference: ExcelDataReader

Create multiple ZipArchiveEntry in ZipArchive class with different extensions C#

I am trying to figure out how to create a zip archive that contains files with different extensions, e.g. a .txt file and an .html file.
If I do the following:
using (var zipArchive = new ZipArchive(memory, ZipArchiveMode.Create, true))
{
    var file1 = zipArchive.CreateEntry("file1.html");
    var file2 = zipArchive.CreateEntry("file2.txt");
    using (var entryStream = file1.Open())
    using (var sw = new StreamWriter(entryStream))
    {
        sw.Write("testing testinsg steing");
    }
    using (var entryStream = file2.Open())
    using (var sw = new StreamWriter(entryStream))
    {
        sw.Write("xyxyxyxyxyxy");
    }
}
I get the exception:
Entries in create mode may only be written to once, and only one entry may be held open at a time.
How can I solve this?
You just need to move this line down:
var file2 = zipArchive.CreateEntry("file2.txt");
...and place it after you've finished writing to the previous entry:
using (var zipArchive = new ZipArchive(memory, ZipArchiveMode.Create, true))
{
    var file1 = zipArchive.CreateEntry("file1.html");
    using (var entryStream = file1.Open())
    using (var sw = new StreamWriter(entryStream))
    {
        sw.Write("testing testinsg steing");
    }
    var file2 = zipArchive.CreateEntry("file2.txt"); // <-- moved down
    using (var entryStream = file2.Open())
    using (var sw = new StreamWriter(entryStream))
    {
        sw.Write("xyxyxyxyxyxy");
    }
}
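The pattern generalizes: in ZipArchiveMode.Create you must finish writing one entry before creating the next. A small helper sketch (the names `ZipText`/`WriteEntries` are my own, not a standard API) that writes any number of text entries sequentially:

```csharp
using System.IO;
using System.IO.Compression;

public static class ZipText
{
    // Writes each (name, content) pair as its own entry, fully closing
    // one entry's stream before creating the next -- the rule that
    // ZipArchiveMode.Create enforces.
    public static void WriteEntries(Stream output, params (string Name, string Content)[] files)
    {
        using (var archive = new ZipArchive(output, ZipArchiveMode.Create, leaveOpen: true))
        {
            foreach (var file in files)
            {
                var entry = archive.CreateEntry(file.Name);
                using (var entryStream = entry.Open())
                using (var sw = new StreamWriter(entryStream))
                {
                    sw.Write(file.Content);
                }
            }
        }
    }
}
```

Because each entry's StreamWriter is disposed inside the loop, only one entry is ever open at a time, regardless of how many files you pass in.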
