I am trying to use the CsvHelper library to parse my CSV, but I am having an issue: it says that Itemcode does not exist when it is there in the file.
// Adding stock item code
Sage.Accounting.Stock.StockItem stockItem = new Sage.Accounting.Stock.StockItem();
string line = null;
public void ImportCsv(string filename)
{
TextReader reader = File.OpenText(filename);
var csv = new CsvReader(reader);
csv.Configuration.HasHeaderRecord = true;
csv.Read();
// Dynamic
// Using anonymous type for the class definition
var anonymousTypeDefinition = new
{
Itemcode = string.Empty,
Barcode = string.Empty
};
var records = csv.GetRecords(anonymousTypeDefinition);
}
This is the CSV structure:
"Itemcode","Barcode","description"
"P4S100001","303300054486","Test Product"
This is my first time using CsvHelper, as shown here: https://joshclose.github.io/CsvHelper/
You are better off creating a strongly typed model to hold the data, if one does not already exist:
public class Item {
public string Itemcode { get; set; }
public string Barcode { get; set; }
public string description { get; set; }
}
and using GetRecords<T>() to read the records by type
TextReader reader = File.OpenText(filename);
var csv = new CsvReader(reader);
var records = csv.GetRecords<Item>();
Your GetRecords function needs a type specifier like so:
var records = csv.GetRecords<type>();
Also, you may want to put csv.Read() in a while loop, depending on your needs.
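For example, a minimal sketch of a read loop (assuming a strongly typed Item class like the one above; the exact calls differ slightly between CsvHelper versions):
using (var reader = File.OpenText(filename))
{
    var csv = new CsvReader(reader);
    csv.Configuration.HasHeaderRecord = true;
    // Read() advances one row at a time; GetRecord<T> maps the current row to Item.
    // Newer CsvHelper versions also expect csv.ReadHeader() after the first Read().
    while (csv.Read())
    {
        var item = csv.GetRecord<Item>();
        Console.WriteLine($"{item.Itemcode} / {item.Barcode}");
    }
}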
Since all your values have quotes, you need to specify that in the configuration. Working with quotes in CsvHelper is frustrating. If not all of the values have quotes, there are ways to handle that as well, but not as nicely as this:
var csv = new CsvReader(reader,new CsvHelper.Configuration.Configuration
{
HasHeaderRecord = true,
QuoteAllFields = true
});
var anonymousTypeDefinition = new
{
Itemcode = string.Empty,
Barcode = string.Empty
};
var records = csv.GetRecords(anonymousTypeDefinition);
I'm writing a program to read in CSV files and validate the data. The CSV file is comma-delimited.
The CSV file contains a sales order that is retrieved online, so we can't actually edit the file itself. I need to read in the file and split it into cells. However, the product description contains further commas, which affects how I access the data.
My code for pulling the values out is below.
private void csvParse()
{
List<string> products = new List<string>();
List<string> quantities = new List<string>();
List<string> price = new List<string>();
using (var reader = new StreamReader(txt_filePath.Text.ToString()))
{
while (!reader.EndOfStream)
{
var line = reader.ReadLine();
var values = line.Split(',');
products.Add(values[0]);
quantities.Add(values[2]);
values[3] = values[3].Substring(4);
price.Add(values[3]);
}
}
if (validateData(products, quantities, price) != "")
{
MessageBox.Show(validateData(products, quantities, price));
}
}
Is there any way to ignore the extra commas within a given cell, or can the columns be distinguished by another delimiter?
A snippet of a row in my CSV file (raw data) is below:
TO12345,"E45 Dermatological Moisturising Lotion, 500 ml",765,GBP 1.75
You can use LinqToCSV from NuGet, e.g.:
void Main()
{
List<MyData> sample = new List<MyData> {
new MyData {Id=1, Name="Hammer", Description="Everything looks like a nail to a hammer, doesn't it?"},
new MyData {Id=2, Name="C#", Description="A computer language."},
new MyData {Id=3, Name="Go", Description="Yet another language, from Google, cross compiles natively."},
new MyData {Id=3, Name="BlahBlah"},
};
string fileName = @"c:\temp\MyCSV.csv";
File.WriteAllText(fileName,"Id,My Product Name,Ignore1,Ignore2,Description\n");
File.AppendAllLines(fileName, sample.Select(s => $@"{s.Id},""{s.Name}"",""ignore this"",""skip this too"",""{s.Description}"""));
CsvContext cc = new CsvContext();
CsvFileDescription inputFileDescription = new CsvFileDescription
{
SeparatorChar = ',',
FirstLineHasColumnNames = true,
IgnoreUnknownColumns=true
};
IEnumerable<MyData> fromCSV = cc.Read<MyData>(fileName, inputFileDescription);
foreach (var d in fromCSV)
{
Console.WriteLine($@"ID:{d.Id},Name:""{d.Name}"",Description:""{d.Description}""");
}
}
public class MyData
{
[CsvColumn(FieldIndex = 1, Name="Id", CanBeNull = false)]
public int Id { get; set; }
[CsvColumn(FieldIndex = 2, Name="My Product Name",CanBeNull = false, OutputFormat = "C")]
public string Name { get; set; }
[CsvColumn(FieldIndex = 5, Name="Description",CanBeNull = true, OutputFormat = "C")]
public string Description { get; set; }
}
It should work. :)
var csvSplit = new Regex("(?:^|,)(\"(?:[^\"]+|\"\")*\"|[^,]*)", RegexOptions.Compiled);
string[] csvlines = File.ReadAllLines(txt_filePath.Text.ToString());
var query = csvlines.Select(csvline => new
{
data = csvSplit.Matches(csvline)
}).Select(t => t.data);
var row = query.Select(matchCollection =>
(from Match m in matchCollection select (m.Value.Contains(',')) ? m.Value.Replace(",", "") : m.Value)
.ToList()).ToList();
You can also use the Microsoft.VisualBasic.FileIO.TextFieldParser class. More detailed answer here: TextFieldParser
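A minimal TextFieldParser sketch (it honours quoted fields, so the comma inside the product description stays in one cell; txt_filePath is the control from the question, and a reference to the Microsoft.VisualBasic assembly is needed):
using Microsoft.VisualBasic.FileIO;

using (var parser = new TextFieldParser(txt_filePath.Text))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(",");
    parser.HasFieldsEnclosedInQuotes = true;
    while (!parser.EndOfData)
    {
        // ReadFields strips the quotes and keeps "E45 Dermatological Moisturising Lotion, 500 ml" as one field.
        string[] fields = parser.ReadFields();
        // fields[0] = order no., fields[1] = description, fields[2] = quantity, fields[3] = price
    }
}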
I have a database called ebookstore.db as below:
and JSON as below:
When a slug in the JSON does not match any title in the database, I want ukomikText to display the number of JSON entries whose slug is not in the database.
Code:
string judulbuku;
try
{
string urlPath1 = "https://...";
var httpClient1 = new HttpClient(new HttpClientHandler());
httpClient1.DefaultRequestHeaders.TryAddWithoutValidation("KIAT-API-KEY", "....");
var values1 = new List<KeyValuePair<string, string>>
{
new KeyValuePair<string, string>("halaman", "1"),
new KeyValuePair<string, string>("limit", "100"),
};
var response1 = await httpClient1.PostAsync(urlPath1, new FormUrlEncodedContent(values1));
response1.EnsureSuccessStatusCode();
if (!response1.IsSuccessStatusCode)
{
MessageDialog messageDialog = new MessageDialog("Memeriksa update Komik gagal", "Gangguan Server");
await messageDialog.ShowAsync();
}
string jsonText1 = await response1.Content.ReadAsStringAsync();
JsonObject jsonObject1 = JsonObject.Parse(jsonText1);
JsonArray jsonData1 = jsonObject1["data"].GetArray();
foreach (JsonValue groupValue in jsonData1)
{
JsonObject groupObject = groupValue.GetObject();
string id = groupObject["id"].GetString();
string judul = groupObject["judul"].GetString();
string slug = groupObject["slug"].GetString();
BukuUpdate file1 = new BukuUpdate();
file1.ID = id;
file1.Judul = judul;
file1.Slug = slug;
List<String> title = sqlhelp.GetKomikData();
foreach (string juduldb in title)
{
judulbuku = juduldb.Substring(juduldb.IndexOf('.') + 1);
if (judulbuku != file1.Slug.Replace("-", "_") + ".pdf")
{
BukuData.Add(file1);
ListBuku.ItemsSource = BukuData;
}
else
{
ukomikText.Text = "belum tersedia komik yang baru";
ukomikText.Visibility = Visibility.Visible;
}
}
}
if (ListBuku.Items.Count > 0)
{
ukomikText.Text = BukuData.Count + " komik baru";
ukomikText.Visibility = Visibility.Visible;
jumlahbuku = BukuData.Count;
}
else
{
ukomikText.Text = "belum tersedia komik yang baru";
ukomikText.Visibility = Visibility.Visible;
}
public static List<String> GetKomikData()
{
List<String> entries = new List<string>();
using (SqliteConnection db =
new SqliteConnection("Filename=ebookstore.db"))
{
db.Open();
SqliteCommand selectCommand = new SqliteCommand
("SELECT title FROM books where folder_id = 67", db);
SqliteDataReader query = selectCommand.ExecuteReader();
while (query.Read())
{
entries.Add(query.GetString(0));
}
db.Close();
}
return entries;
}
BukuUpdate.cs:
public string ID { get; set; }
public string Judul { get; set; }
public string Slug { get; set; }
I have a problem: when checking the slugs in the JSON, the first slug is displayed repeatedly, as many times as there are rows in the database, then the second slug is displayed repeatedly the same number of times, and so on, as below:
How can I solve this so that each slug in the JSON is displayed only once (i.e. according to the amount of data in the JSON)?
The problem is that you have two nested foreach loops. What the code does in simplified pseudocode:
For each item in JSON
Load all rows from DB
And for each loaded row
Check if the current JSON item matches the row from DB and if not, output
As you can see, if you have N items in the JSON and M rows in the database, this inevitably leads to N*M lines of output, except for those rare cases where the JSON item matches a specific row in the database.
If I understand it correctly, I assume that you instead want to check if there is a row that matches the JSON item and if not, output it. You could do this the following way:
List<String> title = sqlhelp.GetKomikData();
HashSet<string> dbItems = new HashSet<string>();
foreach (string juduldb in title)
{
judulbuku = juduldb.Substring(juduldb.IndexOf('.') + 1);
dbItems.Add( judulbuku );
}
...
foreach ( JsonValue groupValue in jsonData1 )
{
...
//instead of the second foreach
if ( !dbItems.Contains( file1.Slug.Replace("-", "_") + ".pdf" ) )
{
//item is not in database
}
else
{
//item is in database
}
}
Additional tips (a rough sketch applying them follows this list):
Avoid calling GetKomikData inside the foreach. This method does not take any arguments, which means you are just querying the database again and again for no reason; that takes time and slows down the execution significantly. Instead, call GetKomikData only once before the first foreach and then just use the title variable.
Don't assign ItemsSource every time the collection changes. This will unnecessarily slow down the UI thread, as it will have to reload all the items on each loop iteration. Instead, assign the property only once after the outer foreach.
Write your code in one language. Mixing English and Indonesian variable names makes the code confusing, less readable, and adds cognitive overhead.
Avoid non-descriptive variable names like file1 or jsonObject1. The variable name should be clear and tell you what it contains. A number at the end usually means it could be named more clearly.
Use plurals for list variable names: instead of title, use titles.
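A rough sketch with those tips applied (the names follow the question's code and are assumptions):
// Query the database once, before the JSON loop.
List<string> titles = sqlhelp.GetKomikData();
var dbItems = new HashSet<string>(titles.Select(t => t.Substring(t.IndexOf('.') + 1)));

foreach (JsonValue groupValue in jsonData1)
{
    JsonObject groupObject = groupValue.GetObject();
    var buku = new BukuUpdate
    {
        ID = groupObject["id"].GetString(),
        Judul = groupObject["judul"].GetString(),
        Slug = groupObject["slug"].GetString()
    };
    // Add only the entries whose slug is not already in the database.
    if (!dbItems.Contains(buku.Slug.Replace("-", "_") + ".pdf"))
    {
        BukuData.Add(buku);
    }
}

// Bind once, after the loop has finished.
ListBuku.ItemsSource = BukuData;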
I tried to write to CSV file using CsvHelper in C#.
This is the link to the library http://joshclose.github.io/CsvHelper/
Nothing is written to the CSV file. I tried exportCsv.WriteField("Hello"); but still nothing happened.
List<string> ColumnOne = new List<string>();
List<string> ColumnTwo = new List<string>();
var csvTextWriter = new StreamWriter(@"C:\Users\Public\Documents\ExportTest.csv");
var exportCsv = new CsvWriter(csvTextWriter);
//creating a list to store workflows then adding name and description to the myWorkflowsList list
if (myWorkflows.WorkFlowCollection.Any())
{
foreach (var Workflow in myWorkflows.WorkFlowCollection)
{
ColumnOne.Add(Workflow.WorkflowName);
ColumnTwo.Add(Workflow.WorkflowDescription);
}
exportCsv.WriteField(ColumnOne);
//exportCsv.WriteField(ColumnTwo);
exportCsv.NextRecord();
exportCsv.Flush();
Console.WriteLine("File is saved:
C:\\Users\\Public\\Documents\\ExportTest.csv");
Console.ReadLine();
}
Your code doesn't add any records. It doesn't have any calls to WriteRecords or WriteRecord. It looks like it's trying to write an entire list of strings into a single field instead.
To write two columns out to a file you can use WriteRecords, e.g.:
var data = from flow in myWorkflows.WorkFlowCollection
select new { flow.WorkflowName,flow.WorkflowDescription};
using (var writer = new StreamWriter("test.csv"))
using (var csv = new CsvWriter(writer))
{
csv.WriteRecords(data);
}
This will write a file with the field names WorkflowName and WorkflowDescription.
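With the data from the question, the file would start with a header row generated from the property names, roughly like this (the workflow values are made up for illustration):
WorkflowName,WorkflowDescription
Approval,Routes a document for sign-off
Archive,Moves closed items to cold storage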
You can change how the fields are written by creating a small class that accepts only the fields you want and sets the names etc. through attributes:
class Flow
{
[NameAttribute("Workflow Name")]
public string WorkflowName { get; set; }
[NameAttribute("Workflow Description")]
public string WorkflowDescription { get; set; }
public Flow(string workflowName, string workflowDescription)
{
WorkflowName = workflowName;
WorkflowDescription = workflowDescription;
}
}
//...
var data = from flow in myWorkflows.WorkFlowCollection
select new Flow(flow.WorkflowName,flow.WorkflowDescription);
using (var writer = new StreamWriter("test.csv"))
using (var csv = new CsvWriter(writer))
{
csv.WriteRecords(data);
}
I am looking to load repeated elements from an XML file called initialInspections.xml. The problem is that the system needs to be dynamic and allow as many inspectionNote elements as needed to be added. I need to read all of them even though they have the same name.
If someone could give me a way of doing this I would be extremely appreciative; I have been searching for almost 3 hours now and I haven't found anything that works.
I need all of the data from within each inspectionNote node, and it will be put into an array of a structure called initialInspectionNotes.
Here is what I have up to now:
public int propertyID;
public string initialInspectorUsername;
public DateTime intialDateTime;
public struct initialInspectionNotes
{
public string locationWithinProperty;
public string locationExtraNote;
public string costCode;
public float estimatedTime;
}
private void finalInspection_Load(object sender, EventArgs e)
{
//Open the intialInspections xml file and load the values into the form
XmlDocument xdoc = new XmlDocument();
FileStream rFile = new FileStream(values.xmlInitialFileLocation, FileMode.Open);
xdoc.Load(rFile);
XmlNodeList list = xdoc.GetElementsByTagName("initialInspection");
for (int i = 0; i < list.Count; i++)
{
XmlElement initialInspection = (XmlElement)xdoc.GetElementsByTagName("initialInspection")[i];
XmlElement initialInspector = (XmlElement)xdoc.GetElementsByTagName("userInspection")[i];
XmlElement dateTime = (XmlElement)xdoc.GetElementsByTagName("dateTime")[i];
propertyID = int.Parse(initialInspection.GetAttribute("propertyID"));
initialInspectorUsername = initialInspector.InnerText;
intialDateTime = DateTime.Parse(dateTime.InnerText);
}
rFile.Close();
}
The XML looks like this:
<?xml version="1.0" standalone="yes"?>
<initialInspections>
<initialInspection propertyID="1">
<userInspection>defaultadmin</userInspection>
<dateTime>07/11/2015 17:15:20</dateTime>
<inspectionNote>
<location>Dining Room</location>
<locationNote>Remove whole carpet, leave underlay</locationNote>
<CostCode>L1</CostCode>
<estimatedTime>5</estimatedTime>
</inspectionNote>
<inspectionNote>
<location>Other - See Notes</location>
<locationNote>On the marked area with orange spray paint.</locationNote>
<CostCode>B1</CostCode>
<estimatedTime>12</estimatedTime>
</inspectionNote>
</initialInspection>
</initialInspections>
Any help would be much appreciated.
class Note
{
public string Location { get; set; }
public string LocationNote { get; set; }
public string CodeCost { get; set; }
public string EstimatedTime { get; set; }
}
var xml = XElement.Load(...your xml path here );
var data = xml.Descendants("initialInspection").Elements("inspectionNote").Select(n => new Note()
{
Location = n.Element("location").Value,
LocationNote = n.Element("locationNote").Value,
CodeCost = n.Element("CostCode").Value,
EstimatedTime = n.Element("estimatedTime").Value
}).ToList();
One possibility would be to use LINQ to XML and the Descendants method to fetch all inspectionNote elements at once:
var xml = XDocument.Load(fileLocation); // for example c:\temp\input.xml
// fetch all inspectionNotes
var inspectionNotes = xml.Root.Descendants("inspectionNote").ToList();
// TODO: error handling!
// map inspectionNote node to custom structure
var arrayOfNotes = inspectionNotes.Select (n => new initialInspectionNotes
{
costCode = n.Element("CostCode").Value,
estimatedTime = float.Parse(n.Element("estimatedTime").Value),
locationExtraNote = n.Element("locationNote").Value,
locationWithinProperty = n.Element("location").Value,
})
// and convert the result to array holding elements of the custom structure
.ToArray();
foreach (var note in arrayOfNotes)
{
Console.WriteLine(note.locationExtraNote);
}
The output is:
Remove whole carpet, leave underlay
On the marked area with orange spray paint.
The same logic applies if you want to read and map other XML nodes (e.g. initialInspection), as sketched below.
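For instance, a sketch that maps the outer initialInspection elements with the same pattern (element and attribute names taken from the XML in the question):
var inspections = xml.Root
    .Descendants("initialInspection")
    .Select(i => new
    {
        PropertyID = (int)i.Attribute("propertyID"),
        Inspector = (string)i.Element("userInspection"),
        DateTime = (string)i.Element("dateTime"),
        NoteCount = i.Elements("inspectionNote").Count()
    })
    .ToList();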
If you need to stick with XmlDocument, use XPath to fetch the inner inspectionNote elements and the values of every inspectionNote element, via XmlNode.SelectSingleNode and XmlNode.SelectNodes:
//Open the intialInspections xml file and load the values into the form
XmlDocument xdoc = new XmlDocument();
FileStream rFile = new FileStream(values.xmlInitialFileLocation, FileMode.Open);
xdoc.Load(rFile);
XmlNodeList list = xdoc.GetElementsByTagName("initialInspection");
// create list of initialInspectionNotes in order to add as many nodes as needed
var notes = new List<initialInspectionNotes>();
// map data
for (int i = 0; i < list.Count; i++)
{
// read data
XmlElement initialInspection = (XmlElement)xdoc.GetElementsByTagName("initialInspection")[i];
XmlElement initialInspector = (XmlElement)xdoc.GetElementsByTagName("userInspection")[i];
XmlElement dateTime = (XmlElement)xdoc.GetElementsByTagName("dateTime")[i];
propertyID = int.Parse(initialInspection.GetAttribute("propertyID"));
initialInspectorUsername = initialInspector.InnerText;
intialDateTime = DateTime.Parse(dateTime.InnerText);
// fetch notes!
var inspectionNotes = initialInspection.SelectNodes("inspectionNote");
foreach (XmlNode inspectionNote in inspectionNotes)
{
// insert data into list
notes.Add(new initialInspectionNotes
{
locationExtraNote = inspectionNote.SelectSingleNode("locationNote").InnerText,
costCode = inspectionNote.SelectSingleNode("CostCode").InnerText,
locationWithinProperty = inspectionNote.SelectSingleNode("location").InnerText
});
}
}
// convert to array if needed
//var arrayOfNotes = notes.ToArray();
rFile.Close();
Regardless of how many inspectionNote elements the XML contains, the list (or array) will hold them all.
class InitialInspectionNotes
{
public string Location { get; set; }
public string LocationNote { get; set; }
public string CodeCost { get; set; }
public string EstimatedTime { get; set; }
}
var xdoc = XDocument.Load(@"yourpath\filename.xml");
var dataXml = xdoc.Descendants("initialInspection").Elements("inspectionNote").Select(n => new InitialInspectionNotes()
{
Location = n.Element("location").Value,
LocationNote = n.Element("locationNote").Value,
CodeCost = n.Element("CostCode").Value,
EstimatedTime = n.Element("estimatedTime").Value
}).ToList();
var xmlList = new List<object>();
for (int i = 0; i < dataXml.Count; i++)
{
xmlList.Add(dataXml[i]);
}
While the answers above may well work, I found them quite complicated for just reading elements. I found a simpler solution which I would like to share for future readers.
Suppose you have read the document stream into an XmlDocument object named document:
var dataNodes = document.GetElementsByTagName("Data");
var toList = dataNodes.OfType<XmlElement>().ToList();
In my case, the Data node was not being read repeatedly. Hence this works: now I was able to append multiple nodes with the same name, as shown below.
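Applied to the inspection XML from the earlier question, the same idea might look like this (a sketch; element names come from that XML):
var noteNodes = xdoc.GetElementsByTagName("inspectionNote")
                    .OfType<XmlElement>()
                    .ToList();

foreach (var note in noteNodes)
{
    // XmlElement's string indexer returns the first child element with that name.
    Console.WriteLine(note["location"]?.InnerText);
}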
I need to read data from a .csv file and store the header and the content in my objects in the following format: a list of the class shown below.
public class MappingData
{
private string ColumnName { get; set; }
private List<string> Data { get; set; }
}
So, for example, say I have a table as shown below:
| Name    | Phone       | City   |
|:--------|------------:|:------:|
| Yassser | 32342342234 | Mumbai |
| Sachin  | 32342342234 | Surat  |
| Will    | 32342342234 | London |
So for the above data my class should have 3 objects; the first object will have the following details:
ColumnName : 'Name'
Data: ['Yasser', 'Sachin', 'Will']
So this is what I am trying to do. Below is what I have started with: I am reading the file using a StreamReader and splitting each line on the comma separator.
private List<MappingData> GetData(string filename)
{
var data = new List<MappingData>();
string fullPath = GetFilePath(filename);
StreamReader reader = new StreamReader(fullPath);
while (!reader.EndOfStream)
{
string line = reader.ReadLine();
if (!String.IsNullOrWhiteSpace(line))
{
string[] values = line.Split(',');
}
}
return data;
}
Can someone please help me mold this data into the required format? Thanks.
You should create a small int variable (or a bool, if you prefer) to determine whether the first row has been processed, and create each of the MappingData objects you will need (mpName, mpPhone, mpCity). On the first row you set the ColumnName property, and on the subsequent rows you add to each object's Data list.
Then you add each of the objects (mpName, mpPhone, mpCity) to the data list for the method and return it.
private List<MappingData> GetData(string filename) {
List<MappingData> data = new List<MappingData>();
int NumRow = 0;
// Note: this assumes MappingData's ColumnName and Data members are accessible
// and that Data is initialised to an empty List<string>.
MappingData mpName = new MappingData();
MappingData mpPhone = new MappingData();
MappingData mpCity = new MappingData();
string fullPath = GetFilePath(filename);
StreamReader reader = new StreamReader(fullPath);
while (!reader.EndOfStream) {
string line = reader.ReadLine();
if (!String.IsNullOrWhiteSpace(line)) {
string[] values = line.Split(',');
if (NumRow == 0) {
mpName.ColumnName = values[0];
mpPhone.ColumnName = values[1];
mpCity.ColumnName = values[2];
NumRow = 1;
} else {
mpName.Data.Add(values[0]);
mpPhone.Data.Add(values[1]);
mpCity.Data.Add(values[2]);
}
}
}
data.Add(mpName);
data.Add(mpPhone);
data.Add(mpCity);
return data;
}
Hope this helps.
There's an excellent library for processing CSV files: KentBoogard.
It is very easy to read content from CSV files with that third-party library. I would suggest it because you seem to be just starting out, so don't reinvent the wheel.
Still, if you want to process the file in your own way, here's one working implementation. Enjoy:
var csvData = File.ReadAllLines("d:\\test.csv");
var dataRows = csvData.Skip(1).ToList();
var csvHeaderColumns = csvData.First().Split(',').ToList();
var outputList = new List<MappingData>();
foreach (var columnName in csvHeaderColumns)
{
var obj = new MappingData { ColumnName = columnName, Data = new List<string>() };
foreach (var rowStrings in dataRows.Select(dataRow => dataRow.Split(',').ToList()))
{
obj.Data.Add(rowStrings[csvHeaderColumns.IndexOf(columnName)]);
}
outputList.Add(obj);
}
This will populate your MappingData class.
Assuming the rest of your method is correct, try this:
private List<MappingData> GetData(string filename)
{
var raw = new List<string[]>();
var data = new List<MappingData>();
string fullPath = GetFilePath(filename);
using(var reader = new StreamReader(fullPath))
{
while (!reader.EndOfStream)
{
string line = reader.ReadLine();
if (!String.IsNullOrWhiteSpace(line))
{
raw.Add(line.Split(','));
}
}
}
Func<int, MappingData> extract =
n => new MappingData()
{
ColumnName = raw[0][n],
Data = raw.Skip(1).Select(x => x[n]).ToList(),
};
data.Add(extract(0));
data.Add(extract(1));
data.Add(extract(2));
return data;
}
You'd have to make your MappingData properties accessible though, for example:
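A sketch of how the class could be opened up (this changes the original class, so treat it as an assumption):
public class MappingData
{
    public string ColumnName { get; set; }
    public List<string> Data { get; set; } = new List<string>();
}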