Read CSV file and insert into LocalDB (ASP.NET MVC) - C#

I'm trying out a project with ASP.NET MVC and have a large CSV file that I want to save to LocalDB.
I have been following this tutorial (and the ones before it about MVC): https://learn.microsoft.com/en-us/aspnet/mvc/overview/getting-started/introduction/creating-a-connection-string
Now I want to add data to the database I have set up, and I would like to read this data from a CSV file and then save it to my database.
I have tried this: https://www.aspsnippets.com/Articles/Upload-Read-and-Display-CSV-file-Text-File-data-in-ASPNet-MVC.aspx
but when I try to upload my file I get an error saying the file is too large.
Ideally this would be automated, so that when I start my application the database is populated with the data from my CSV file (and if it is already populated, the import does not run again). Failing that, I would like some way of coding it so that I can add the data from my CSV file to the database (LocalDB).
protected override void Seed(ProductsDBContext context)
{
    // Read the CSV that is embedded in the assembly as a resource.
    Assembly assembly = Assembly.GetExecutingAssembly();
    string resourceName = "WebbApplication.App_Data.SeedData.price_detail.csv";

    using (Stream stream = assembly.GetManifestResourceStream(resourceName))
    using (StreamReader reader = new StreamReader(stream, Encoding.UTF8))
    {
        var csvReader = new CsvReader(reader);
        var products = csvReader.GetRecords<PriceDetail>().ToArray();

        // AddOrUpdate keys on PriceValueId, so re-running Seed
        // does not duplicate rows.
        context.PriceDetails.AddOrUpdate(c => c.PriceValueId, products);
    }
}
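If you only want the import to run when the table is empty, rather than relying on AddOrUpdate to de-duplicate, a minimal guard at the top of Seed would work (a sketch, assuming the PriceDetails DbSet shown above and a using for System.Linq):
if (context.PriceDetails.Any())
{
    // Database already populated; skip the slow CSV import.
    return;
}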

Your second link includes the following line:
string csvData = System.IO.File.ReadAllText(filePath);
If you are getting an OutOfMemoryException, then you should not load the entire file into memory at once - i.e. do not read all of the text.
StreamReader has a built-in method to handle this: read the file one line at a time.
// Note: StreamReader takes a file-system path, not an embedded-resource
// name, so point it at the .csv file itself.
string line;
using (var file = new System.IO.StreamReader(@"App_Data\SeedData\price_detail.csv"))
{
    while ((line = file.ReadLine()) != null)
    {
        System.Console.WriteLine(line);
        // Replace with your operation below
    }
}
Potentially the same problem is solved in this question.
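Applied to the seeding scenario above, a sketch that streams CSV records with CsvHelper and saves them in batches, so the whole file is never held in memory at once (the file path and batch size here are assumptions):
string path = @"App_Data\SeedData\price_detail.csv"; // assumed location
using (var reader = new StreamReader(path, Encoding.UTF8))
{
    var csv = new CsvReader(reader);
    int count = 0;

    // GetRecords is lazy, so rows are materialized one at a time.
    foreach (PriceDetail record in csv.GetRecords<PriceDetail>())
    {
        context.PriceDetails.Add(record);

        // Flush every 1000 rows to keep the EF change tracker small.
        if (++count % 1000 == 0)
            context.SaveChanges();
    }
    context.SaveChanges();
}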

With Cinchoo ETL - an open source library - you can bulk load a CSV file into SQL Server with a few lines of code.
using (var p = new ChoCSVReader("** YOUR CSV FILE **")
    .WithFirstLineHeader()
    )
{
    p.Bcp("** ConnectionString **", "** tablename **");
}
For more information, please visit the CodeProject article.
Hope it helps.

Related

What is the best way to check if a file contains a key before adding a new line?

I have a CSV file containing the following columns -
Key,Value
First,Line
Second,Line
Third,Line
I want to add a new key-value pair to this file, given the key is not already present in the file, using C#. What would be the best way to do this? Would I have to traverse line by line and check the keys, or is there a better way?
I am not using the CSVHelper package or any other CSV writer.
You could do this:
string path = @"PathToFile.csv";
string content = string.Empty;

using (StreamReader reader = new StreamReader(path))
{
    content = reader.ReadToEnd();
}

if (!content.Contains("YourKey"))
{
    using (StreamWriter sw = new StreamWriter(path))
    {
        sw.WriteLine(content + "\nYourKey,YourValue");
    }
}
Read the file into a string variable, check whether the key exists, and if it doesn't, write the content back to the file along with your new key. As the file grows it will take longer and longer to search the whole file, but it will work well for a couple of thousand lines.
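If the file does get big, a line-by-line check avoids holding the whole file in memory. A minimal sketch (path, key, and value are placeholders; it needs System.IO and System.Linq, and assumes the key is always the first comma-separated field):
// ReadLines streams the file instead of loading it all at once.
bool exists = File.ReadLines(path)
    .Any(line => line.Split(',')[0] == key);

if (!exists)
{
    // AppendAllText leaves the existing content untouched.
    File.AppendAllText(path, Environment.NewLine + key + "," + value);
}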

CSV Reader not working anymore... without making any changes

I have a little app that has been working for a month now. I made absolutely no changes to this part of the code, and from one moment to the next the app stopped working.
I even tried old files that I had already imported in the past. No rights issue. Same drive. I can open all the files. No changes at all in the files.
The error is always:
CsvHelper.HeaderValidationException: Header with name 'Betrag der Rate' was not found.
How can I solve this?
if (filename != string.Empty)
{
    using (var reader = new StreamReader(filename))
    using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
    {
        csv.Configuration.RegisterClassMap<LastschriftMap>();
        csv.Configuration.HasHeaderRecord = true;
        var records = csv.GetRecords<Lastschriften>();
        alleLastschriften = records.ToList();
    }
}
If this is reading German data, try using CultureInfo.CreateSpecificCulture("de-DE") instead of CultureInfo.InvariantCulture. Or use CultureInfo.CurrentCulture, which was the default in older versions of CsvHelper.
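In the code above that is a one-line change, for example:
// The culture controls how numbers and dates are parsed, and in some
// CsvHelper versions also the default delimiter (';' for de-DE).
using (var csv = new CsvReader(reader, CultureInfo.CreateSpecificCulture("de-DE")))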

How to store a very large file into a database record

A client uses our web application to parse an XML file into a database, and then uses the information from that XML file for various other things. Right now I am stream-reading and stream-writing the file. When the files were smaller, they used to be parsed into a string and then inserted into a database record. The problem is that the files are now so large that the machines don't have enough RAM to process a string that big. How can I process this file and still store it in a database?
using (var reader = new StreamReader(outputPath))
{
    string line = string.Empty;
    bool save = false;

    using (var sWriter = new StreamWriter(inputPath))
    {
        while ((line = reader.ReadLine()) != null)
        {
            System.Diagnostics.Debug.WriteLine(line);
            if (line.Contains("<SaveDataFromScout>"))
            {
                sWriter.WriteLine("<SaveDataFromScout>");
                save = true;
                continue;
            }
            else if (line.Contains("</SaveDataFromScout>"))
            {
                sWriter.WriteLine(line);
                save = false;
            }
            if (save)
            {
                if (line.Contains("ELEMENT TEXT"))
                    line = line.Replace("ELEMENT TEXT", "ELEMENTTEXT");
                sWriter.WriteLine(line);
            }
        }
    }
}
//string workbookXML = System.IO.File.ReadAllText(outputPath);
transactionID = DAC.ExecuteScalar
(
    db.ConnectionString,
    "dbo.cwi_InsertTransaction",
    new SqlParameter("@TransactionTypeID", transactionTypeID),
    new SqlParameter("@UploadedFileName", fileDataLink),
    //new SqlParameter("@UploadedFileXml", workbookXML),
    new SqlParameter("@CurrentUserID", CurrentUser.UserID)
);
Do you see the commented-out code where the file was previously converted into a string to be inserted into the database? Well, that works, but it does not work for files that are around 888 MB.
Load the file directly into SQL Server, into an XML column, using bulk load:
INSERT INTO T(XmlCol)
SELECT * FROM OPENROWSET(
    BULK 'c:\SampleFolder\SampleData3.txt',
    SINGLE_BLOB) AS x;
More documentation:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/examples-of-bulk-import-and-export-of-xml-documents-sql-server?view=sql-server-2017
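If you want to trigger that bulk load from the application, a sketch with plain ADO.NET could look like this (table T, column XmlCol, and the file path are the placeholders from the SQL above; connectionString is assumed, and the path is resolved on the SQL Server machine, not the web server):
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    @"INSERT INTO T(XmlCol)
      SELECT * FROM OPENROWSET(
          BULK 'c:\SampleFolder\SampleData3.txt',
          SINGLE_BLOB) AS x;", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}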

Using FileHelpers to import Excel data using MVC 5

I'm trying to write an application in MVC 5 that will accept a file specified by a user and upload that file information into the database. The file itself has multiple worksheets, which I think FileHelpers handles gracefully, but I can't find any good documentation about working with a byte array. I can get the file just fine, and get to my controller, but don't know where to go from there. I am currently doing this in the controller:
public ActionResult UploadFile(string filepath)
{
    // We want to check here that the first file in the request is not null.
    if (Request.Files[0] != null)
    {
        var file = Request.Files[0];
        byte[] data = new byte[file.ContentLength];
        ParseInputFile(data);
        //file.InputStream.Read(data, 0, data.Length);
    }
    ViewBag.Message = "Success!";
    return View("Index");
}

private void ParseInputFile(byte[] data)
{
    ExcelStorage provider = new ExcelStorage(typeof(OccupationalGroup));
    provider.StartRow = 3;
    provider.StartColumn = 2;
    provider.FileName = "test.xlsx";
}
Am I able to use the Request like that in conjunction with FileHelpers? I just need to read the Excel file into the database. If not, should I be looking into a different way to handle the upload?
So, I decided instead to use ExcelDataReader to do my reading from Excel. It reads the stream (test in the code below) into a DataSet that I can manipulate manually. I'm sure it might not be the cleanest way to do it, but it made sense to me, and it allows me to work with multiple worksheets fairly easily as well. Here is the snippet of code that I ended up using:
// test is a stream here that I get using reflection
IExcelDataReader excelReader = ExcelReaderFactory.CreateOpenXmlReader(test);

// AsDataSet pulls every worksheet into the DataSet in one go.
DataSet result = excelReader.AsDataSet();

// Alternatively, read row by row instead of materializing everything
// (AsDataSet consumes the reader, so use one approach or the other):
// while (excelReader.Read()) { /* process the current row */ }

excelReader.Close();
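Once the DataSet is filled, each worksheet comes back as its own DataTable, so multiple worksheets are just a couple of loops. A minimal sketch (the column index is an assumption):
// Each worksheet becomes one DataTable in result.Tables.
foreach (DataTable sheet in result.Tables)
{
    foreach (DataRow row in sheet.Rows)
    {
        // Hypothetical mapping - adjust the index to your layout.
        string firstColumn = row[0].ToString();
        // ... insert into the database here ...
    }
}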

Reading contents of CSV file

I have loaded a CSV file.
Here is a sample of the content available in the CSV file:
Name,Address,Address1,LandMark,User_location,City,State,Phone1,Phone2,Email,Category
Sriram Electricals and Plumbing Contractors,No 12, Vinayakar Koil Street Easa,"Back Side Of Therasa School,",Pallavaram,Chennai,Tamil Nadu,(044) 66590405,,sriram@gmail.com,Electrican
I've tried to convert the file to a list:
public ActionResult UserCsv(HttpPostedFileBase uploadfile)
{
    using (var sr = new StreamReader(uploadfile.InputStream, Encoding.UTF8))
    {
        var reader = new CsvReader(sr);
        // GetRecords is lazily evaluated, so materialize the records
        // with ToList() before the reader is disposed.
        List<UserCSVModel> records = reader.GetRecords<UserCSVModel>().ToList();
    }
    return View();
}
I'm unable to get correct output.
Try this article:
http://www.codeproject.com/Articles/415732/Reading-and-Writing-CSV-Files-in-Csharp
Or this question on Stack Overflow:
Reading CSV file and storing values into an array
Hope it helps.
Have a look at http://www.filehelpers.net/. It's a great library for working with CSV files and will give you an enumerable that you can work with.
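For reference, a minimal FileHelpers sketch for the sample data above (the record class and field names are assumptions derived from the CSV header):
using FileHelpers;

// Hypothetical record type matching the CSV header shown above.
[DelimitedRecord(",")]
[IgnoreFirst(1)] // skip the header row
public class UserRecord
{
    public string Name;
    public string Address;
    [FieldQuoted('"', QuoteMode.OptionalForBoth)]
    public string Address1;
    public string LandMark;
    public string User_location;
    public string City;
    public string State;
    public string Phone1;
    public string Phone2;
    public string Email;
    public string Category;
}

// Usage:
var engine = new FileHelperEngine<UserRecord>();
UserRecord[] records = engine.ReadFile("users.csv");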
