I am new to programming and would like to read a CSV file into textboxes on a form. Right now I am reading the file into a DataTable, thinking I would then copy the values into the textboxes, but I am not sure whether I am going about this correctly. Is there an easier way to do this? This is what I have so far:
protected void getftp()
{
    // Create a DataTable for temporary storage
    var myTable = new DataTable();

    // Add the columns
    myTable.Columns.Add("Start_date");
    myTable.Columns.Add("End_date");
    //...snip...
    myTable.Columns.Add("Comments");

    // The 'using' block closes the reader when it is done
    using (var reader = new StreamReader(File.OpenRead(@"C:\ftp\inbox\test.csv")))
    {
        while (!reader.EndOfStream)
        {
            // Read one line of the file
            string line = reader.ReadLine();

            // Split the line into an array of values
            string[] values = line.Split(',');

            // Add the array as a row in the DataTable
            myTable.Rows.Add(values);
        }
    }
}
See the following link; it shows the basics of working with the CSV format: http://www.codeproject.com/Articles/30705/C-CSV-Import-Export
Here is another SO question regarding reading CSV files in .NET:
Reading a CSV file in .NET?
This particular answer references two CSV readers. You could use either of them to read in the CSV file and then set the values in the textboxes on your Windows Form (or Web Form/Razor page; you did not indicate your front-end).
I would recommend reusing one of these projects instead of re-inventing the wheel and rolling your own CSV parser. It's easy enough to write in C#, but reading/parsing CSV files is a problem that has been solved many times over.
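Once the DataTable is populated, copying a row into the textboxes is straightforward. A minimal sketch, assuming a Windows Forms front-end with textboxes named txtStartDate, txtEndDate and txtComments (those names are placeholders; substitute your actual controls):

// Belongs in the Form class; requires using System.Data;
// Shows one row of the DataTable built in getftp() in the textboxes.
private void ShowRow(DataTable myTable, int rowIndex)
{
    if (rowIndex < 0 || rowIndex >= myTable.Rows.Count)
        return;

    DataRow row = myTable.Rows[rowIndex];
    txtStartDate.Text = row["Start_date"].ToString();
    txtEndDate.Text   = row["End_date"].ToString();
    txtComments.Text  = row["Comments"].ToString();
}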
I am working on measurement software whose results are written to a huge list of lists, which I am trying to put into a .csv file using CsvHelper. The problem is that, to keep things fairly readable, I need to create thousands of columns, and I don't see any simple and effective way to do it.
How can I create a .csv file which consists of, let's say, 2000 columns?
EDIT 1.
Sorry for my inaccurate post (it was my first ever on Stack Overflow).
The problem I am trying to solve is to create a .csv file with thousands of columns while using only a couple of lines of code, so that the program stays as simple as possible. The main reason I want to do it this way is to keep the file readable by different data-analysis software.
What about adding properties to an ExpandoObject (perhaps programmatically, in a loop) and then writing it out like so:
void Main()
{
    var records = new List<dynamic>();

    dynamic record = new ExpandoObject();
    record.Id = 1;
    record.Name = "one";
    records.Add(record);

    using (var writer = new StringWriter())
    // Recent CsvHelper versions require a CultureInfo in the constructor
    using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
    {
        csv.WriteRecords(records);
        writer.ToString().Dump(); // Dump() is a LINQPad helper; use Console.WriteLine elsewhere
    }
}
https://joshclose.github.io/CsvHelper/examples/writing/write-dynamic-objects
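To get to the thousands of columns the question asks about, the same idea works if you treat each ExpandoObject as an IDictionary<string, object> and add the members in a loop. A rough sketch (the column names Col_1..Col_2000, the row count and the values are made up purely for illustration):

using System;
using System.Collections.Generic;
using System.Dynamic;
using System.Globalization;
using System.IO;
using CsvHelper;

class Program
{
    static void Main()
    {
        var records = new List<dynamic>();

        for (int row = 0; row < 10; row++)
        {
            // The IDictionary view lets us add members by name in a loop
            var record = new ExpandoObject() as IDictionary<string, object>;
            for (int i = 1; i <= 2000; i++)
            {
                record["Col_" + i] = row * i * 0.5; // placeholder value
            }
            records.Add(record);
        }

        using (var writer = new StreamWriter("wide.csv"))
        using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
        {
            // CsvHelper derives the header from the dynamic members of the first record
            csv.WriteRecords(records);
        }
    }
}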
I have code where I read a .csv file and store its columns in lists.
var pathskill = System.IO.Path.Combine(System.AppDomain.CurrentDomain.BaseDirectory, "skill.csv");

using (var fs1 = File.OpenRead(pathskill))
using (var reader1 = new StreamReader(fs1))
{
    while (!reader1.EndOfStream)
    {
        var line = reader1.ReadLine();
        var values = line.Split(',');

        list_MainId.Add(Convert.ToDouble(values[0]));
        list_MainName.Add(values[1]);
        list_AmountMade.Add(Convert.ToInt32(values[2]));
        list_Level.Add(Convert.ToDouble(values[3]));
        list_Exp.Add(Convert.ToDouble(values[4]));
        list_MadeFrom_One_Id.Add(Convert.ToDouble(values[5]));
        list_Amount_MadeFrom_One.Add(Convert.ToInt32(values[6]));
        list_MadeFrom_Two_Id.Add(Convert.ToDouble(values[7]));
        list_Amount_MadeFrom_Two.Add(Convert.ToInt32(values[8]));
    }
}
This code works great and I get 9 lists of values.
However, I have many .csv files, and I think it would be better if each one were a sheet in an .xlsx file so I could choose which one to read by name.
For example, there would be sheets called skill1, skill2 and so on.
Is there a way to read a specific sheet from an .xlsx file by its name and store its columns in lists?
Thank you
There are a lot of ways to do this. You could unzip the .xlsx file and read the sheet XML directly, or you could use a library that works with .xlsx files (e.g. EPPlus, which is free).
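For example, a rough sketch with EPPlus (the workbook name skills.xlsx, the sheet name and the single-column helper below are assumptions; the remaining columns would be read the same way):

// Requires: using System.Collections.Generic; using System.IO; using OfficeOpenXml;
// (EPPlus 5+ also needs ExcelPackage.LicenseContext set before first use.)
static List<double> ReadMainIds(string xlsxPath, string sheetName)
{
    var mainIds = new List<double>();

    using (var package = new ExcelPackage(new FileInfo(xlsxPath)))
    {
        // Look the worksheet up by its name, e.g. "skill1"
        ExcelWorksheet sheet = package.Workbook.Worksheets[sheetName];
        if (sheet == null || sheet.Dimension == null)
            return mainIds; // sheet missing or empty

        int lastRow = sheet.Dimension.End.Row;
        for (int row = 1; row <= lastRow; row++)
        {
            // Column 1 holds MainId in this sketch; read the other columns the same way
            mainIds.Add(sheet.Cells[row, 1].GetValue<double>());
        }
    }

    return mainIds;
}

// Usage: var ids = ReadMainIds(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "skills.xlsx"), "skill1");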
I'm working in C#, making use of Entity Framework 6. I have a service that calls a stored procedure (using the DbContext) and places the results in an IList. I then have a controller that makes use of this service. Originally I was using the results combined with EPPlus to save them as an Excel (.xlsx) file - this worked perfectly/as intended.
However, I now need to save the data as a CSV file. I have found several links, such as this and this, which convert Excel to CSV (although I can now skip that step, as I can convert the result set to CSV directly, with no need for the Excel file); I also found this link.
From what I understand, it is fairly easy to export/convert a dataset/result set to CSV using a StringBuilder. However, given that I already have EPPlus and the ability to save as Excel - is there not a cleaner way of doing it? Or is it best to take the data, use a StringBuilder to comma-delimit the values, and use that for the CSV?
I know similar topics (like this one) have been posted before - but I felt my question was unique enough for a new post.
Using EPPlus is not a cleaner way of doing this; you would just write much more code to accomplish exactly the same result. Creating a CSV file is nothing more than writing a text file with commas in it. So why not just do that?
StringBuilder sb = new StringBuilder();

foreach (DataRow dr in yourDataSet.Tables[0].Rows)
{
    List<string> fields = new List<string>();

    foreach (object field in dr.ItemArray)
    {
        // ItemArray holds objects, so convert each value to a string
        fields.Add(Convert.ToString(field));
    }

    sb.AppendLine(string.Join(",", fields));
}

// ...and save sb.ToString() as a .csv file
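One caveat with the hand-rolled approach: if a field can itself contain a comma, a quote or a line break, it has to be quoted to stay valid CSV. A small helper along these lines (a sketch, not part of the code above) covers that; call it as fields.Add(EscapeCsvField(Convert.ToString(field))):

// Quote a field if it contains a comma, quote or line break; double any embedded quotes
static string EscapeCsvField(string field)
{
    if (field == null)
        return string.Empty;

    bool needsQuotes = field.Contains(",") || field.Contains("\"") ||
                       field.Contains("\r") || field.Contains("\n");

    return needsQuotes ? "\"" + field.Replace("\"", "\"\"") + "\"" : field;
}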
I am reading a text file in C# and trying to save it to a SQL database. I am fine, except that I don't want the first line, which contains the column names, included in the import. What's the easiest way to exclude it?
The code looks like this:
while (textIn.Peek() != -1)
{
    string row = textIn.ReadLine();
    string[] columns = row.Split(' ');

    Product product = new Product();
    product.Column1 = columns[0];
    // etc.
    product.Save();
}
thanks
If you are writing the code yourself to read in the file and then import it... why don't you just skip over the first line?
Here's my suggestion:
string[] file_rows;

using (var reader = File.OpenText(filepath))
{
    // Split the whole file into rows (handles both \r\n and \n line endings)
    file_rows = reader.ReadToEnd().Split(new[] { "\r\n", "\n" }, StringSplitOptions.None);
}

// Start at index 1 to skip the header row
for (var i = 1; i < file_rows.Length; i++)
{
    var row = file_rows[i];
    var cells = row.Split('\t');
    // ...
}
How are you importing the data? If you are looping in C# and inserting the rows one at a time, construct your loop to skip the first insert!
Or just delete the first inserted row after the import.
Give more info, get more details...
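Applied to the loop in the question, skipping the header can be as simple as reading and discarding one line before the while loop. A sketch based on the question's own code:

// Discard the header row once, then process the data rows as before
if (textIn.Peek() != -1)
{
    textIn.ReadLine(); // first line holds column names, not data
}

while (textIn.Peek() != -1)
{
    string row = textIn.ReadLine();
    string[] columns = row.Split(' ');

    Product product = new Product();
    product.Column1 = columns[0];
    // etc.
    product.Save();
}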
Pass a flag into the program (in case the first line is also data in the future) that causes it to skip the first line of text.
If the file uses the same column names as the database, you could also parse the header to grab the column names instead of hard-coding them (assuming that's what you're doing currently :)).
As a final note, if you're using a MySQL database and you have command line access, you may want to look at the LOAD DATA LOCAL INFILE syntax which lets you import pretty arbitrarily defined CSV data.
For future reference, have a look at this awesome package: FileHelpers Library
I can't add links just yet, but Google should help; it's on SourceForge.
It makes our lives here a little easier when people insist on using files for integration.
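For completeness, a rough sketch of what that looks like with FileHelpers (the record layout, field names and space delimiter below are assumptions; adjust them to your file). The [IgnoreFirst] attribute also takes care of skipping the header row:

using FileHelpers;

// Hypothetical record layout; field names and the space delimiter are assumptions
[DelimitedRecord(" ")]
[IgnoreFirst(1)] // skip the header line with the column names
public class ProductRecord
{
    public string Column1;
    public string Column2;
}

// Usage:
// var engine = new FileHelperEngine<ProductRecord>();
// ProductRecord[] rows = engine.ReadFile(@"C:\data\products.txt");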
I need a CSV parser class file
A class file which parses a CSV file and returns a DataSet as the result, in ASP.NET.
I'm pretty sure that CsvReader (from CodeProject) can read into a DataTable.
DataTable table = new DataTable();
// Set up the schema... (Columns.Add)

using (TextReader text = File.OpenText(path))
using (CsvReader csv = new CsvReader(text, hasHeaders))
{
    table.Load(csv);
}
Note that manually setting up the schema is optional; if you don't, I believe it assumes that everything is string.
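If you specifically need a DataSet rather than a DataTable (as the question asks), wrapping the loaded table is a one-liner:

// Wrap the loaded table in a DataSet
var dataSet = new DataSet();
dataSet.Tables.Add(table);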
A simple Google search gives plenty of results.
I've had luck with this parser. It will return results to a DataSet.
Another tool you might want to check out is FileHelpers. I see there's a tag for this resource here on SO.