Simple Input/Output method - C#

I'm writing a simple method that should read data from a text file. I can't understand why this method reads only the 2nd, 4th, and 6th lines. The method is below. What is wrong in my code?
public static List<Employee> ReadFromFile(string path = "1.txt")
{
    List<Employee> employees = new List<Employee>();
    Stream stream = null;
    StreamReader sr = null;
    try
    {
        stream = new FileStream(path, FileMode.Open, FileAccess.Read);
        stream.Seek(0, SeekOrigin.Begin);
        sr = new StreamReader(stream);
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            Employee employee = new DynamicEmployee();
            string str = sr.ReadLine();
            employee.FirstName = str.Substring(1, 20).Trim();
            employee.LasttName = str.Substring(20, 20).Trim();
            employee.Paynment = Convert.ToDouble(str.Substring(40, 20).Trim());
            Console.WriteLine("{0} {1} {2}", employee.FirstName, employee.LasttName, employee.Paynment);
            employees.Add(employee);
            //Console.WriteLine(str);
        }
    }
    catch //(System.FormatException)
    {
        Console.WriteLine("File format is incorrect");
    }
    finally
    {
        sr.Close();
        stream.Close();
    }
    return employees;
}

You are calling sr.ReadLine() twice per iteration: once in the while condition and once in the loop body, so every other line is consumed and skipped. That is why only the 2nd, 4th, and 6th lines are processed.
Remove the line string str = sr.ReadLine(); and use the variable line instead.

It should look like this:
public static List<Employee> ReadFromFile(string path = "1.txt")
{
    List<Employee> employees = new List<Employee>();
    Stream stream = null;
    StreamReader sr = null;
    try
    {
        stream = new FileStream(path, FileMode.Open, FileAccess.Read);
        stream.Seek(0, SeekOrigin.Begin);
        sr = new StreamReader(stream);
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            Employee employee = new DynamicEmployee();
            // string str = sr.ReadLine(); // WRONG: reads a second line, skipping every other record
            employee.FirstName = line.Substring(1, 20).Trim();
            employee.LasttName = line.Substring(20, 20).Trim();
            employee.Paynment = Convert.ToDouble(line.Substring(40, 20).Trim());
            Console.WriteLine("{0} {1} {2}", employee.FirstName, employee.LasttName, employee.Paynment);
            employees.Add(employee);
        }
    }
    catch //(System.FormatException)
    {
        Console.WriteLine("File format is incorrect");
    }
    finally
    {
        sr.Close();
        stream.Close();
    }
    return employees;
}
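
As a side note, the try/finally with explicit Close calls can be replaced entirely: File.ReadLines streams the file and closes it automatically, and it also avoids the NullReferenceException the finally block would throw if the file failed to open. A minimal sketch, assuming the same fixed-width layout (including the Substring(1, 20) offset) and the Employee/DynamicEmployee types from the question, with error handling omitted for brevity:

public static List<Employee> ReadFromFile(string path = "1.txt")
{
    List<Employee> employees = new List<Employee>();
    // File.ReadLines reads lazily, line by line, and disposes the file handle for us.
    foreach (string line in File.ReadLines(path))
    {
        Employee employee = new DynamicEmployee();
        employee.FirstName = line.Substring(1, 20).Trim();
        employee.LasttName = line.Substring(20, 20).Trim();
        employee.Paynment = Convert.ToDouble(line.Substring(40, 20).Trim());
        employees.Add(employee);
    }
    return employees;
}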

Related

Sum Variable in file txt C# Console

I have data stored in txt format, and the data is displayed as shown below. I want to get a total of the QOH column, but I'm having a problem computing the sum.
public void DisTranswe()
{
    Console.Clear();
    FileStream fs = new FileStream("TransactionHistory\\weekend\\transcationhistory.txt", FileMode.Open, FileAccess.Read);
    StreamReader sr = new StreamReader(fs);
    string str;
    while ((str = sr.ReadLine()) != null)
    {
        string[] data = str.Split('#');
        string id = data[0];
        string date = data[1];
        string qty = data[2];
        string payment = data[3];
        string note = data[4];
        //output
        Console.WriteLine("IdTransaksi");
        Console.WriteLine(id);
        Console.WriteLine("DateTransaksi");
        Console.WriteLine(date);
        Console.WriteLine("QOH");
        Console.WriteLine(qty);
        Console.WriteLine("TotalPayment");
        Console.WriteLine(payment);
        Console.WriteLine("Note");
        Console.WriteLine(note);
    }
    sr.Close();
    fs.Close();
}
Thank you!
Add a running total for your QOH, like this:
public void DisTranswe()
{
    Console.Clear();
    FileStream fs = new FileStream("TransactionHistory\\weekend\\transcationhistory.txt", FileMode.Open, FileAccess.Read);
    StreamReader sr = new StreamReader(fs);
    string str;
    int sum = 0;
    while ((str = sr.ReadLine()) != null)
    {
        string[] data = str.Split('#');
        string id = data[0];
        string date = data[1];
        string qty = data[2];
        string payment = data[3];
        string note = data[4];
        sum = sum + int.Parse(qty);
        //output
        Console.WriteLine("IdTransaksi");
        Console.WriteLine(id);
        Console.WriteLine("DateTransaksi");
        Console.WriteLine(date);
        Console.WriteLine("QOH");
        Console.WriteLine(qty);
        Console.WriteLine("TotalPayment");
        Console.WriteLine(payment);
        Console.WriteLine("Note");
        Console.WriteLine(note);
    }
    Console.WriteLine("SUM");
    Console.WriteLine(sum);
    sr.Close();
    fs.Close();
}
Try this:
int qoh = File
    .ReadAllLines("TransactionHistory\\weekend\\transcationhistory.txt")
    .Sum(x => int.Parse(x.Split('#')[2]));
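
Note that this one-liner needs using System.Linq; (for Sum) and will throw a FormatException on blank or malformed lines. A slightly more defensive sketch, assuming the same '#'-separated layout:

int qoh = File
    .ReadAllLines("TransactionHistory\\weekend\\transcationhistory.txt")
    .Where(line => !string.IsNullOrWhiteSpace(line)) // skip empty lines
    .Sum(line => int.Parse(line.Split('#')[2]));     // QOH is the third field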

How to read excel file data using memory stream?

I want to read an Excel file from JSON data which I am sending from ARC. Can anyone help me sort this out?
public bool ControlAttachment(AttachmentFile file)
{
    try
    {
        if (file != null && file.File != null)
        {
            string xlsfile = file.File;
            string[] xls = { "application/excel", "application/vnd.msexcel", "xls", "xlsx", "application/vnd.ms-excel" };
            if (xls.ToList().Contains(file.FileType.Trim()))
            {
                file.FileType = ".xls";
                byte[] contents = Convert.FromBase64String(xlsfile);
                string LogFilePaths = ConfigurationManager.AppSettings["ExcelMapperPath"];
                string fileName = file.FileName.Split('.')[0] + file.FileType;
                string LogFile = HttpContext.Current.Server.MapPath(LogFilePaths + file.FileName.Split('.')[0] + file.FileType);
                System.IO.File.WriteAllBytes(LogFile, contents);
                if (!File.Exists(LogFile))
                {
                    File.Create(LogFile).Dispose();
                }
                MemoryStream ms = new MemoryStream();
                using (var fs = new FileStream(LogFile, FileMode.Open, FileAccess.Write))
                {
                    ms.CopyTo(fs);
                    ms.Dispose();
                }
            }
        }
        return true;
    }
    catch
    {
        return false;
    }
}
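
A note on the stream copy at the end: ms is a brand-new, empty MemoryStream, so ms.CopyTo(fs) writes nothing, and the FileStream is opened with FileAccess.Write even though the goal is to read. To load the saved file into a MemoryStream, the copy has to run in the other direction. A minimal sketch, reusing the LogFile path from the code above:

// Read the saved workbook back into memory; the copy goes file -> memory.
using (var fs = new FileStream(LogFile, FileMode.Open, FileAccess.Read))
using (var ms = new MemoryStream())
{
    fs.CopyTo(ms);
    ms.Position = 0; // rewind before handing the stream to an Excel-reading library
    // ... pass ms to whatever Excel parser you use here ...
}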

SqlBulkCopy and File Archiving

I have a process that loads data into a SQL table from a flat file and then needs to immediately move the file to an archive folder.
However, when running the code it imports the data but throws an IOException:
{"The process cannot access the file because it is being used by another process."}
There appears to be some contention in the process. Where and how should I avoid this?
internal class Program
{
    private static void Main(string[] args)
    {
        string sourceFolder = @"c:\ImportFiles\";
        string destinationFolder = @"c:\ImportFiles\Archive\";
        foreach (string fileName in Directory.GetFiles(sourceFolder, "*.*"))
        {
            string sourceFileName = Path.GetFileName(fileName);
            string destinationFileName = Path.GetFileName(fileName) + ".arc";
            ProcessFile(fileName);
            string source = String.Concat(sourceFolder, sourceFileName);
            string destination = String.Concat(destinationFolder, destinationFileName);
            File.Move(source, destination);
        }
    }

    static void ProcessFile(string fileName)
    {
        Encoding enc = new UTF8Encoding(true, true);
        DataTable dt = LoadRecordsFromFile(fileName, enc, ',');
        SqlBulkCopy bulkCopy = new SqlBulkCopy("Server=(local);Database=test;Trusted_Connection=True;",
            SqlBulkCopyOptions.TableLock);
        bulkCopy.DestinationTableName = "dbo.tblManualDataLoad";
        bulkCopy.WriteToServer(dt);
        bulkCopy.Close();
    }

    public static DataTable LoadRecordsFromFile(string fileName, Encoding encoding, char delimeter)
    {
        DataTable table = null;
        if (fileName != null &&
            !fileName.Equals(string.Empty) &&
            File.Exists(fileName))
        {
            try
            {
                string tableName = "DataImport";
                FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
                List<string> rows = new List<string>();
                StreamReader reader = new StreamReader(fs, encoding);
                string record = reader.ReadLine();
                while (record != null)
                {
                    rows.Add(record);
                    record = reader.ReadLine();
                }
                List<string[]> rowObjects = new List<string[]>();
                int maxColsCount = 0;
                foreach (string s in rows)
                {
                    string[] convertedRow = s.Split(new char[] { delimeter });
                    if (convertedRow.Length > maxColsCount)
                        maxColsCount = convertedRow.Length;
                    rowObjects.Add(convertedRow);
                }
                table = new DataTable(tableName);
                for (int i = 0; i < maxColsCount; i++)
                {
                    table.Columns.Add(new DataColumn());
                }
                foreach (string[] rowArray in rowObjects)
                {
                    table.Rows.Add(rowArray);
                }
                //Remove header row from import file
                DataRow row = table.Rows[0];
                row.Delete();
                table.AcceptChanges();
            }
            catch
            {
                //TODO SEND EMAIL ALERT ON ERROR
                throw new Exception("Error in ReadFromFile: IO error.");
            }
        }
        else
        {
            //TODO SEND EMAIL ALERT ON ERROR
            throw new FileNotFoundException("Error in ReadFromFile: the file path could not be found.");
        }
        return table;
    }
}
Your program is likely holding the file open. Wrap the FileStream and StreamReader objects in using statements; this closes and disposes them as soon as the using block finishes, so File.Move no longer sees the file as in use.
The part of your LoadRecordsFromFile function that reads the file should look something like:
...
string tableName = "DataImport";
List<string> rows = new List<string>();
using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (StreamReader reader = new StreamReader(fs, encoding))
    {
        string record = reader.ReadLine();
        while (record != null)
        {
            rows.Add(record);
            record = reader.ReadLine();
        }
    }
}
...
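
The SqlBulkCopy in ProcessFile is IDisposable as well, so the same pattern applies there. A sketch of ProcessFile reworked with a using block, keeping the connection string and table name from the question:

static void ProcessFile(string fileName)
{
    Encoding enc = new UTF8Encoding(true, true);
    DataTable dt = LoadRecordsFromFile(fileName, enc, ',');
    // Disposing the SqlBulkCopy releases its connection even if WriteToServer throws.
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
        "Server=(local);Database=test;Trusted_Connection=True;",
        SqlBulkCopyOptions.TableLock))
    {
        bulkCopy.DestinationTableName = "dbo.tblManualDataLoad";
        bulkCopy.WriteToServer(dt);
    }
}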

Searching for string using OpenFileDialog and Multiselect

I am doing an exercise where I need to find a string in a group of files.
I managed to find the string by selecting each file individually.
How can I do the same by selecting all the files at once?
openFileDialog.Multiselect = true;
DialogResult result = openFileDialog.ShowDialog();
string filename = openFileDialog.SafeFileName;
FileStream fs = new FileStream(filename, FileMode.Open, FileAccess.Read);
BufferedStream bs = new BufferedStream(fs);
StreamReader sr = new StreamReader(fs);
String s;
if (result == DialogResult.OK)
{
    while ((s = sr.ReadLine()) != null)
    {
        if (s.Contains("Specified string"))
        {
            MessageBox.Show(filename + " Contains the Specified string");
            break;
        }
    }
}
fs.Close();
sr.Close();
OpenFileDialog has properties (FileNames, SafeFileNames) that return all selected files.
First of all, you should use the SafeFileNames property:
if (result == DialogResult.OK)
{
    foreach (string filename in openFileDialog.SafeFileNames)
    {
        FileStream fs = new FileStream(filename, FileMode.Open, FileAccess.Read);
        BufferedStream bs = new BufferedStream(fs);
        StreamReader sr = new StreamReader(fs);
        String s;
        while ((s = sr.ReadLine()) != null)
        {
            if (s.Contains("Specified string"))
            {
                MessageBox.Show(filename + " Contains the Specified string");
                break;
            }
        }
        sr.Close();
        fs.Close();
    }
}
Second, you can use the Parallel class to process the files simultaneously, as sketched below.
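
A minimal sketch of that approach, assuming using System.IO;, System.Linq;, and System.Threading.Tasks; are available. It uses FileNames rather than SafeFileNames, since FileNames includes the full paths and so works regardless of the current directory:

if (result == DialogResult.OK)
{
    Parallel.ForEach(openFileDialog.FileNames, path =>
    {
        // File.ReadLines streams each file lazily, line by line.
        if (File.ReadLines(path).Any(line => line.Contains("Specified string")))
        {
            MessageBox.Show(Path.GetFileName(path) + " Contains the Specified string");
        }
    });
}

MessageBox.Show is tolerable from worker threads for an exercise, but touching other UI controls inside the loop would require marshalling back to the UI thread with Invoke.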

need to speed up C# application that converts timestamps

I'm having a tough time getting a small application to work faster. I'm not a developer, and it took me some time to get this working as is. Can anyone offer any suggestions or alternate code to speed this process up? It's taking about 1 hour to process 10m of the input file.
The code is listed below, and here is an example line of the input file:
4401,imei:0000000000,2012-09-01 12:12:12.9999
using System;
using System.Globalization;
using System.IO;

class Sample
{
    public static void Main(string[] args)
    {
        if (args.Length == 0)
        {
            return;
        }
        using (FileStream stream = File.Open(args[0], FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            using (StreamReader streamReader = new StreamReader(stream))
            {
                System.Text.StringBuilder builder = new System.Text.StringBuilder();
                while (!streamReader.EndOfStream)
                {
                    var line = streamReader.ReadLine();
                    var values = line.Split(',');
                    DateTime dt = new DateTime();
                    DateTime.TryParse(values[2], out dt);
                    values[2] = Convert.ToString(dt.Ticks);
                    string[] output = new string[values.Length];
                    bool firstColumn = true;
                    for (int index = 0; index < values.Length; index++)
                    {
                        if (!firstColumn)
                            builder.Append(',');
                        builder.Append(values[index]);
                        firstColumn = false;
                    }
                    File.WriteAllText(args[1], builder.AppendLine().ToString());
                }
            }
        }
    }
}
The biggest performance hit is that every time a line is read, the entire output built so far is written back to disk. For a quick win, move the File.WriteAllText call out of the loop and write the file once at the end:
System.Text.StringBuilder builder = new System.Text.StringBuilder();
using (FileStream stream = File.Open(args[0], FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (StreamReader streamReader = new StreamReader(stream))
    {
        while (!streamReader.EndOfStream)
        {
            var line = streamReader.ReadLine();
            var values = line.Split(',');
            DateTime dt = new DateTime();
            DateTime.TryParse(values[2], out dt);
            values[2] = Convert.ToString(dt.Ticks);
            string[] output = new string[values.Length];
            bool firstColumn = true;
            for (int index = 0; index < values.Length; index++)
            {
                if (!firstColumn)
                    builder.Append(',');
                builder.Append(values[index]);
                firstColumn = false;
            }
            builder.AppendLine();
        }
    }
}
File.WriteAllText(args[1], builder.ToString());
If you want to refactor further, replace the hand-rolled comma logic with string.Join:
System.Text.StringBuilder builder = new System.Text.StringBuilder();
using (FileStream stream = File.Open(args[0], FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (StreamReader streamReader = new StreamReader(stream))
    {
        while (!streamReader.EndOfStream)
        {
            var line = streamReader.ReadLine();
            var values = line.Split(',');
            DateTime dt = new DateTime();
            DateTime.TryParse(values[2], out dt);
            values[2] = Convert.ToString(dt.Ticks);
            builder.AppendLine(string.Join(",", values));
        }
    }
}
File.WriteAllText(args[1], builder.ToString());
Edit: To avoid the memory usage, remove the StringBuilder and use a second FileStream to write to disk as you go. Your proposed solution (using a List) will still use a substantial amount of memory and will likely break on larger files:
using (FileStream input = File.Open(args[0], FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (FileStream output = File.Create(args[1]))
{
    using (StreamReader streamReader = new StreamReader(input))
    using (StreamWriter streamWriter = new StreamWriter(output))
    {
        while (!streamReader.EndOfStream)
        {
            var line = streamReader.ReadLine();
            var values = line.Split(',');
            DateTime dt = new DateTime();
            DateTime.TryParse(values[2], out dt);
            values[2] = Convert.ToString(dt.Ticks);
            streamWriter.WriteLine(string.Join(",", values));
        }
    }
}
Here is what I found can fix this and handle the large files.
Thanks to @Muzz and @Vache for the assistance.
string line = "";
System.IO.StreamReader file = new System.IO.StreamReader("c:/test.txt");
List<string> convertedLines = new List<string>();
while ((line = file.ReadLine()) != null)
{
    string[] lineSplit = line.Split(',');
    DateTime dt = new DateTime();
    DateTime.TryParse(lineSplit[2], out dt);
    lineSplit[2] = Convert.ToString(dt.Ticks);

    string convertedline = lineSplit[0] + "," + lineSplit[1] + "," + lineSplit[2];
    convertedLines.Add(convertedline);
}
file.Close();
File.WriteAllLines("c:/newTest.txt", convertedLines);
