I am creating a banking program and I want to be able to read my text file of accounts and add them to a list. My problem is that it only reads one line, and after that it gets an error saying the line is null, but it shouldn't be, because the second line should be the age.
I want it to keep going through the accounts, adding the data to the List; each account is separated by a blank line.
Code:
StreamReader FileToRead = new StreamReader(@"C:\Users\...\Accounts.txt");
Account NewAccount = new Account();
string line;
do
{
NewAccount.Name = FileToRead.ReadLine();
NewAccount.Age = int.Parse(FileToRead.ReadLine());
NewAccount.Balance = int.Parse(FileToRead.ReadLine());
NewAccount.Address.Country = FileToRead.ReadLine();
NewAccount.Address.City = FileToRead.ReadLine();
NewAccount.Address.FirstLine = FileToRead.ReadLine();
NewAccount.Address.SecondLine = FileToRead.ReadLine();
NewAccount.Address.PostCode = FileToRead.ReadLine();
NewAccount.AccountNumber = int.Parse(FileToRead.ReadLine());
Accounts.Add(NewAccount);
} while ((line = FileToRead.ReadLine()) != null);
Text file: http://pastebin.com/raw.php?i=1r9TEUPx
Well, the only real error I can see offhand is that you're not creating a new instance of Account, so what you'll be doing is changing the values on a single account and re-adding it to the list; you'll only end up with the last account in the file stored. You need to create a new Account for each iteration of the loop.
I tried with your file, and the code fails on the second loop, not the first one.
This is because the blank line at the end triggers a second loop, but then there is no more data to read.
If you are sure that every 'record' is separated by a blank line, then you can simply add another read at the end of the loop:
do
{
NewAccount = new Account();
NewAccount.Name = FileToRead.ReadLine();
NewAccount.Age = int.Parse(FileToRead.ReadLine());
NewAccount.Balance = int.Parse(FileToRead.ReadLine());
NewAccount.Address.Country = FileToRead.ReadLine();
NewAccount.Address.City = FileToRead.ReadLine();
NewAccount.Address.FirstLine = FileToRead.ReadLine();
NewAccount.Address.SecondLine = FileToRead.ReadLine();
NewAccount.Address.PostCode = FileToRead.ReadLine();
NewAccount.AccountNumber = int.Parse(FileToRead.ReadLine());
FileToRead.ReadLine(); // here to absorb the empty line between 'records'
Accounts.Add(NewAccount);
} while ((line = FileToRead.ReadLine()) != null);
Now when you reach the end of the file, the while loop exits correctly.
EDIT: After seeing the answer from Eric, I added the correct initialization of a new Account for every loop iteration.
Another way to do it:
string[] lines = System.IO.File.ReadAllLines(@"C:\Users\...\Accounts.txt");
if (lines != null && lines.Length > 0)
{
Account NewAccount = new Account();
NewAccount.Name = lines[0];
NewAccount.Age = int.Parse(lines[1]);
NewAccount.Balance = int.Parse(lines[2]);
NewAccount.Address.Country = lines[3];
NewAccount.Address.City = lines[4];
NewAccount.Address.FirstLine = lines[5];
NewAccount.Address.SecondLine = lines[6];
NewAccount.Address.PostCode = lines[7];
NewAccount.AccountNumber = int.Parse(lines[8]);
Accounts.Add(NewAccount);
}
If you have such problems with the StreamReader, consider using File.ReadAllLines instead:
var lines = File.ReadAllLines(path);
var NewAccount = new Account();
NewAccount.Name = lines.First();
NewAccount.Age = int.Parse(lines.ElementAt(1));
NewAccount.Balance = int.Parse(lines.ElementAt(2));
NewAccount.Address.Country = lines.ElementAt(3);
NewAccount.Address.City = lines.ElementAt(4);
NewAccount.Address.FirstLine = lines.ElementAt(5);
NewAccount.Address.SecondLine = lines.ElementAt(6);
NewAccount.Address.PostCode = lines.ElementAt(7);
NewAccount.AccountNumber = int.Parse(lines.ElementAt(8));
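If you need every account rather than just the first, here is a minimal sketch extending the same ReadAllLines idea over the whole file; it assumes each record is exactly nine lines followed by one blank separator line, and that the Address property must be created explicitly (the Address type name is a guess, not something from the question):
var lines = File.ReadAllLines(path);
var Accounts = new List<Account>();
for (int i = 0; i + 8 < lines.Length; i += 10)   // 9 data lines + 1 blank separator per record
{
    var account = new Account();
    account.Address = new Address();             // assumption: Address is a class that must be instantiated
    account.Name = lines[i];
    account.Age = int.Parse(lines[i + 1]);
    account.Balance = int.Parse(lines[i + 2]);
    account.Address.Country = lines[i + 3];
    account.Address.City = lines[i + 4];
    account.Address.FirstLine = lines[i + 5];
    account.Address.SecondLine = lines[i + 6];
    account.Address.PostCode = lines[i + 7];
    account.AccountNumber = int.Parse(lines[i + 8]);
    Accounts.Add(account);
}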
If your file contains valid data, your code probably throws an exception on:
NewAccount.Address.Country = FileToRead.ReadLine();
It looks like you have some kind of class for the address. You have to instantiate this property in the Account constructor or in the loop:
do
{
...
NewAccount.Balance = int.Parse(FileToRead.ReadLine());
NewAccount.Address = new Account.AddressClass();
NewAccount.Address.Country = FileToRead.ReadLine();
...
} while ((line = FileToRead.ReadLine()) != null);
I also assumed you previously instantiated the other variables, like:
Account NewAccount = new Account();
List<Account> Accounts = new List<Account>();
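A minimal sketch of the constructor option, assuming Address is exposed through a nested AddressClass as guessed above:
public class Account
{
    public Account()
    {
        Address = new AddressClass();   // ensures Address is never null when the reader assigns to it
    }

    public AddressClass Address { get; set; }
    // ... Name, Age, Balance, AccountNumber and the rest of the properties as in the question ...
}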
I have difficulty understanding this example of how to use facets:
https://lucenenet.apache.org/docs/4.8.0-beta00008/api/Lucene.Net.Demo/Lucene.Net.Demo.Facet.SimpleFacetsExample.html
My goal is to create an index in which each document field has a facet, so that at search time I can choose which facets to use to navigate the data.
What I am confused about is the setup of facets during index creation. To summarize my question: is an index with facets compatible with ReferenceManager?
Does the DirectoryTaxonomyWriter need to actually be written and persisted on disk, or will it be embedded into the index itself and is it just temporary? Given the code indexWriter.AddDocument(config.Build(taxoWriter, doc)); from the example, I expect it is temporary and will be embedded into the index (but then the example also shows you need the taxonomy to drill down into facets). So can the taxonomy be tied to the index in some way so that they are handled altogether with ReferenceManager?
If not, may I just use the same folder I use for storing the index?
Here is a more detailed list of the points that confuse me:
In my scenario I am indexing the documents asynchronously (in a background process) and then fetching the index as soon as possible through ReferenceManager in an ASP.NET application. I hope this way of fetching the index is compatible with the DirectoryTaxonomyWriter needed by facets.
Then I modified the code I wrote, introducing the taxonomy writer as indicated in the example, but I am a bit confused: it seems like I can't store the DirectoryTaxonomyWriter in the same folder as the index because the folder is locked. Do I need to persist it, or will it be embedded into the index (so a RAMDirectory is enough)? If I need to persist it in a different directory, can I safely persist it into a subdirectory?
Here is the code I am currently using:
private static void BuildIndex (IndexEntry entry)
{
string targetFolder = ConfigurationManager.AppSettings["IndexFolder"] ?? string.Empty;
//** LOG
if (System.IO.Directory.Exists(targetFolder) == false)
{
string message = #"Index folder not found";
_fileLogger.Error(message);
_consoleLogger.Error(message);
return;
}
var metadata = JsonConvert.DeserializeObject<IndexMetadata>(File.ReadAllText(entry.MetdataPath) ?? "{}");
string[] header = new string[0];
List<dynamic> csvRecords = new List<dynamic>();
using (var reader = new StreamReader(entry.DataPath))
{
CsvConfiguration csvConfiguration = new CsvConfiguration(CultureInfo.InvariantCulture);
csvConfiguration.AllowComments = false;
csvConfiguration.CountBytes = false;
csvConfiguration.Delimiter = ",";
csvConfiguration.DetectColumnCountChanges = false;
csvConfiguration.Encoding = Encoding.UTF8;
csvConfiguration.HasHeaderRecord = true;
csvConfiguration.IgnoreBlankLines = true;
csvConfiguration.HeaderValidated = null;
csvConfiguration.MissingFieldFound = null;
csvConfiguration.TrimOptions = CsvHelper.Configuration.TrimOptions.None;
csvConfiguration.BadDataFound = null;
using (var csvReader = new CsvReader(reader, csvConfiguration))
{
csvReader.Read();
csvReader.ReadHeader();
csvReader.Read();
header = csvReader.HeaderRecord;
csvRecords = csvReader.GetRecords<dynamic>().ToList();
}
}
string targetDirectory = Path.Combine(targetFolder, "Index__" + metadata.Boundle + "__" + DateTime.Now.ToString("yyyyMMdd_HHmmss") + "__" + Path.GetRandomFileName().Substring(0, 6));
System.IO.Directory.CreateDirectory(targetDirectory);
//** LOG
{
string message = #"..creating index : {0}";
_fileLogger.Information(message, targetDirectory);
_consoleLogger.Information(message, targetDirectory);
}
using (var dir = FSDirectory.Open(targetDirectory))
{
using (DirectoryTaxonomyWriter taxoWriter = new DirectoryTaxonomyWriter(dir))
{
Analyzer analyzer = metadata.GetAnalyzer();
var indexConfig = new IndexWriterConfig(LuceneVersion.LUCENE_48, analyzer);
using (IndexWriter writer = new IndexWriter(dir, indexConfig))
{
long entryNumber = csvRecords.Count();
long index = 0;
long lastPercentage = 0;
foreach (dynamic csvEntry in csvRecords)
{
Document doc = new Document();
IDictionary<string, object> dynamicCsvEntry = (IDictionary<string, object>)csvEntry;
var indexedMetadataFiled = metadata.IdexedFields;
foreach (string headField in header)
{
if (indexedMetadataFiled.ContainsKey(headField) == false || (indexedMetadataFiled[headField].NeedToBeIndexed == false && indexedMetadataFiled[headField].NeedToBeStored == false))
continue;
var field = new Field(headField,
((string)dynamicCsvEntry[headField] ?? string.Empty).ToLower(),
indexedMetadataFiled[headField].NeedToBeStored ? Field.Store.YES : Field.Store.NO,
indexedMetadataFiled[headField].NeedToBeIndexed ? Field.Index.ANALYZED : Field.Index.NO
);
doc.Add(field);
var facetField = new FacetField(headField, (string)dynamicCsvEntry[headField]);
doc.Add(facetField);
}
long percentage = (long)(((decimal)index / (decimal)entryNumber) * 100m);
if (percentage > lastPercentage && percentage % 10 == 0)
{
_consoleLogger.Information($"..indexing {percentage}%..");
lastPercentage = percentage;
}
writer.AddDocument(doc);
index++;
}
writer.Commit();
}
}
}
//** LOG
{
string message = #"Index Created : {0}";
_fileLogger.Information(message, targetDirectory);
_consoleLogger.Information(message, targetDirectory);
}
}
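For comparison, here is a minimal sketch of the pattern the linked SimpleFacetsExample follows, with the taxonomy kept in its own FSDirectory; the "taxonomy" subfolder name and its placement under the index folder are assumptions on my part, not something taken from the example:
using Lucene.Net.Facet;
using Lucene.Net.Facet.Taxonomy.Directory;

// ...
using (var indexDir = FSDirectory.Open(targetDirectory))
using (var taxoDir = FSDirectory.Open(Path.Combine(targetDirectory, "taxonomy")))   // separate folder, so the two writers do not fight over write.lock
using (var taxoWriter = new DirectoryTaxonomyWriter(taxoDir))
using (var writer = new IndexWriter(indexDir, new IndexWriterConfig(LuceneVersion.LUCENE_48, analyzer)))
{
    var facetsConfig = new FacetsConfig();
    foreach (dynamic csvEntry in csvRecords)
    {
        var doc = new Document();
        // ... add the Field and FacetField instances exactly as in the code above ...
        writer.AddDocument(facetsConfig.Build(taxoWriter, doc));   // Build converts the FacetFields into indexable fields
    }
    writer.Commit();
    taxoWriter.Commit();
}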
I'm fairly new to C# and I'm writing a rental vehicle management system. I'm trying to retrieve all lines from a CSV file that is set up like this:
[Registration][Grade][Make][Model][Year][NumSeats][Transmission][Fuel][GPS][SunRoof][DailyRate][Colour]
[123ABC][Economy][Toyota][Camry][2005][5][Automatic][Petrol][No][No][30][White]
[234BCD][Economy][Ford][Focus][2012][5][Automatic][Petrol][Yes][No][45][Blue]
[987ZYX][Economy][Holden][Cruise][2016][5][Manual][Diesel][Yes][No][60][Red]
and then iterate it through a for loop before it's sent to another method.
In the following method beyond the one shown, it's being put into an ArrayList so that the values retrieved can be searched for by the user in the program.
I'm stuck on the for loop, as it gives me an error on vehicles1.Length saying that vehicles1 is an unassigned local variable. I don't know if initializing the array is my problem, because I've tried that and it gives no errors, but the program just breaks.
void setUpVehicles(out Fleet fleetVehicles)
{
const char DELIM = ',';
Vehicle veh = new Vehicle();
FileStream inFile = new FileStream(FILENAME3, FileMode.Open, FileAccess.Read);
StreamReader reader = new StreamReader(inFile);
string recordIn;
string[] vehicles1;
recordIn = reader.ReadLine();
while (recordIn != null)
{
string year = veh.Year.ToString();
string seats = veh.NumSeats.ToString();
string gps = veh.GPS.ToString();
string sunRoof = veh.SunRoof.ToString();
string dailyRate = veh.DailyRate.ToString();
vehicles1 = recordIn.Split(DELIM);
veh.Registration = vehicles1[0];
veh.Grade = vehicles1[1];
veh.Make = vehicles1[2];
veh.Model = vehicles1[3];
year = vehicles1[4];
seats = vehicles1[5];
veh.Transmission = vehicles1[6];
veh.Fuel = vehicles1[7];
gps = vehicles1[8];
sunRoof = vehicles1[9];
dailyRate = vehicles1[10];
veh.Colour = vehicles1[11];
}
fleetVehicles = new Fleet();
for (int i = 0; i < vehicles1.Length; i++)
{
fleetVehicles.insertVehicle(vehicles1[i]);
}
}
One approach is to rewrite the method as an iterator that parses the file with TextFieldParser and yields one Vehicle per row:
IEnumerable<Vehicle> setUpVehicles(string fileName)
{
using(var reader = new StreamReader(fileName))
using(var parser = new Microsoft.VisualBasic.FileIO.TextFieldParser(reader))
{
parser.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited;
parser.Delimiters = new string[] {","};
string[] row;
while(!parser.EndOfData)
{
row = parser.ReadFields();
var vehicle = new Vehicle {
Registration = row[0],
Grade = row[1],
Make = row[2],
Model = row[3],
Year = row[4],
NumSeats = row[5],
Transmission = row[6],
Fuel = row[7],
GPS = row[8],
SunRoof = row[9],
DailyRate = row[10],
Colour = row[11]
};
yield return vehicle;
}
}
}
Then you would call it to make a fleet like this:
var fleetVehicles = new Fleet();
foreach(var vehicle in setUpVehicles(FILENAME3))
{
fleetVehicles.insertVehicle(vehicle);
}
Below is the code I'm working with to pass multiple line items and create a sales order through the GP web service. I can pass a single line item without any problem, but when I pass multiple items it only takes the last one. The array has around 5 item IDs and I'm passing a fixed quantity of 15; I need to make both dynamic, but for testing purposes I'm trying it like this. I suspect the problem is in the creation/initialization of some of the web service objects. As a novice to this whole area, I couldn't find the exact problem.
C# Code
CompanyKey companyKey;
Context context;
SalesOrder salesOrder;
SalesDocumentTypeKey salesOrderType;
CustomerKey customerKey;
BatchKey batchKey;
// SalesOrderLine salesOrderLine;
ItemKey orderedItem;
Quantity orderedAmount;
Policy salesOrderCreatePolicy;
DynamicsGPClient wsDynamicsGP = new DynamicsGPClient();
wsDynamicsGP.ClientCredentials.Windows.ClientCredential.UserName = "Admin";
wsDynamicsGP.ClientCredentials.Windows.ClientCredential.Password = "pass";
wsDynamicsGP.ClientCredentials.Windows.ClientCredential.Domain = "Gp";
System.ServiceModel.WSHttpBinding binding;
binding = new System.ServiceModel.WSHttpBinding(System.ServiceModel.SecurityMode.None);
context = new Context();
companyKey = new CompanyKey();
companyKey.Id = (1);
context.OrganizationKey = (OrganizationKey)companyKey;
salesOrder = new SalesOrder();
salesOrderType = new SalesDocumentTypeKey();
salesOrderType.Type = SalesDocumentType.Order;
salesOrder.DocumentTypeKey = salesOrderType;
customerKey = new CustomerKey();
customerKey.Id = "121001";
salesOrder.CustomerKey = customerKey;
batchKey = new BatchKey();
batchKey.Id = "RMS";
salesOrder.BatchKey = batchKey;
// SalesOrderLine[] orders = new SalesOrderLine[6];
SalesOrderLine[] lines = { };
for (int i = 1; i < 5; i++)
{
SalesOrderLine salesOrderLine = new SalesOrderLine();
orderedItem = new ItemKey();
orderedItem.Id = Arr[i].ToString();
salesOrderLine.ItemKey = orderedItem;
orderedAmount = new Quantity();
orderedAmount.Value = 15;
salesOrderLine.Quantity = orderedAmount;
lines = new SalesOrderLine[] { salesOrderLine };
MessageBox.Show(lines.Count().ToString());
}
salesOrder.Lines = lines;
//salesOrder.Lines = orders;
salesOrderCreatePolicy = wsDynamicsGP.GetPolicyByOperation("CreateSalesOrder", context);
wsDynamicsGP.CreateSalesOrder(salesOrder, context, salesOrderCreatePolicy);
if (wsDynamicsGP.State != CommunicationState.Faulted)
{
wsDynamicsGP.Close();
}
MessageBox.Show("Success");
lines = new SalesOrderLine[] { salesOrderLine }; will recreate the lines array object each time, meaning you lose any previously added objects. Effectively, only the final object in the loop is actually added.
Try using a List<T> like so:
SalesOrderLine[] lines = { }; Becomes List<SalesOrderLine> lines = new List<SalesOrderLine>();
lines = new SalesOrderLine[] { salesOrderLine }; Becomes: lines.Add(salesOrderLine);
If it's important that you end up with an array as the input:
salesOrder.Lines = lines; Becomes: salesOrder.Lines = lines.ToArray();
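Putting those substitutions together, a minimal sketch of the corrected loop (Arr and the fixed quantity of 15 are kept from the question):
List<SalesOrderLine> lines = new List<SalesOrderLine>();
for (int i = 1; i < 5; i++)
{
    SalesOrderLine salesOrderLine = new SalesOrderLine();
    orderedItem = new ItemKey();
    orderedItem.Id = Arr[i].ToString();
    salesOrderLine.ItemKey = orderedItem;
    orderedAmount = new Quantity();
    orderedAmount.Value = 15;
    salesOrderLine.Quantity = orderedAmount;
    lines.Add(salesOrderLine);           // accumulates every line instead of replacing the array
}
salesOrder.Lines = lines.ToArray();      // the Lines property expects an array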
In an ASP.Net MVC4 application, I'm using the following code to process a Go To Webinar Attendees report (CSV format).
For some reason, the file that is being loaded is not being released by IIS and it is causing issues when attempting to process another file.
Do you see anything out of the ordinary here?
The CSVHelper (CsvReader) is from https://joshclose.github.io/CsvHelper/
public AttendeesData GetRecords(string filename, string webinarKey)
{
StreamReader sr = new StreamReader(Server.MapPath(filename));
CsvReader csvread = new CsvReader(sr);
csvread.Configuration.HasHeaderRecord = false;
List<AttendeeRecord> record = csvread.GetRecords<AttendeeRecord>().ToList();
record.RemoveRange(0, 7);
AttendeesData attdata = new AttendeesData();
attdata.Attendees = new List<Attendee>();
foreach (var rec in record)
{
Attendee aa = new Attendee();
aa.Webinarkey = webinarKey;
aa.FullName = String.Concat(rec.First_Name, " ", rec.Last_Name);
aa.AttendedWebinar = 0;
aa.Email = rec.Email_Address;
aa.JoinTime = rec.Join_Time.Replace(" CST", "");
aa.LeaveTime = rec.Leave_Time.Replace(" CST", "");
aa.TimeInSession = rec.Time_in_Session.Replace("hour", "hr").Replace("minute", "min");
aa.Makeup = 0;
aa.RegistrantKey = Registrants.Where(x => x.email == rec.Email_Address).FirstOrDefault().registrantKey;
List<string> firstPolls = new List<string>()
{
rec.Poll_1.Trim(), rec.Poll_2.Trim(),rec.Poll_3.Trim(),rec.Poll_4.Trim()
};
int pass1 = firstPolls.Count(x => x != "");
List<string> secondPolls = new List<string>()
{
rec.Poll_5.Trim(), rec.Poll_6.Trim(),rec.Poll_7.Trim(),rec.Poll_8.Trim()
};
int pass2 = secondPolls.Count(x => x != "");
aa.FirstPollCount = pass1;
aa.SecondPollCount = pass2;
if (aa.TimeInSession != "")
{
aa.AttendedWebinar = 1;
}
if (aa.FirstPollCount == 0 || aa.SecondPollCount == 0)
{
aa.AttendedWebinar = 0;
}
attdata.Attendees.Add(aa);
attendeeToDB(aa); // adds to Oracle DB using EF6.
}
// Should I call csvread.Dispose() here?
sr.Close();
return attdata;
}
Yes. You have to dispose of those objects too:
sr.Close();
csvread.Dispose();
sr.Dispose();
A better strategy is to use the using keyword.
You should use using statements for your stream readers and writers (see the sketch below).
You should follow some naming conventions (lists always contain multiple entries, so rename record to records).
You should use clear names (not aa).
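Along those lines, a minimal sketch of the method with using blocks; it assumes the same CsvReader constructor the question already uses and only shows the reading part:
public AttendeesData GetRecords(string filename, string webinarKey)
{
    using (var reader = new StreamReader(Server.MapPath(filename)))
    using (var csv = new CsvReader(reader))
    {
        csv.Configuration.HasHeaderRecord = false;
        List<AttendeeRecord> records = csv.GetRecords<AttendeeRecord>().ToList();
        records.RemoveRange(0, 7);

        var attendeesData = new AttendeesData { Attendees = new List<Attendee>() };
        foreach (var record in records)
        {
            // ... map each record to an Attendee exactly as in the question ...
        }
        return attendeesData;
    }   // the CsvReader and the StreamReader are disposed here, so IIS releases the file
}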
I'm writing some simple code to read data from a text file and store it in a C# List, but I'm having problems with it. Please help me figure out whether the problem is on my side or in the library. I've written the following function:
public List<EmpBO> ReadData()
{
EmpBO temp = new EmpBO();
List<EmpBO> lis = new List<EmpBO>(100);
string[] tokens;
string data;
StreamReader sw = new StreamReader(new FileStream("emp.txt",FileMode.OpenOrCreate));
int ind = 0;
while ((data = sw.ReadLine())!=null)
{
Console.WriteLine("Reading " + data);
tokens = data.Split(';');
temp.Id = int.Parse(tokens[0]);
temp.Name = tokens[1];
temp.Salary = double.Parse(tokens[2]);
temp.Br = double.Parse(tokens[3]);
temp.Tax = double.Parse(tokens[4]);
temp.Designation = tokens[5];
//lis.Add(temp);
lis.Insert(ind,temp);
ind++;
}
sw.Close();
Console.WriteLine("Read this material and returning list");
for (int i = 0; i < lis.Count; i++)
{
Console.WriteLine("" + (lis.ElementAt(i)).Name);
}
//foreach (EmpBO ob in lis)
//{
// Console.WriteLine("" + ob.Id + ob.Name);
//}
return lis;
}
File emp.txt Contains:
1;Ahmed;100000;20;1000;manager
2;Bilal;200000;15;2000;ceo
Now, as you can see, in the while loop I've displayed what the StreamReader has read; it does 2 iterations in this case and displays:
Reading 1;Ahmed;100000;20;1000;manager
Reading 2;Bilal;200000;15;2000;ceo
As you can see, I'm saving this info in temp and inserting it into the list.
After the while loop is finished, when I traverse the list to see what is stored in it, it displays:
Read this material and returning list
Bilal
BIlal
Well, the second record is stored in the list twice and the first record is absent. What seems to be the problem? I've used the Add() method too, and a foreach loop for traversing the list (as you can see, it's commented out), but the result was the same. Please help.
Move this line
EmpBO temp = new EmpBO();
into the while loop so that it looks like this:
while ((data = sw.ReadLine())!=null){
EmpBO temp = new EmpBO();
Console.WriteLine("Reading " + data);
tokens = data.Split(';');
temp.Id = int.Parse(tokens[0]);
temp.Name = tokens[1];
temp.Salary = double.Parse(tokens[2]);
temp.Br = double.Parse(tokens[3]);
temp.Tax = double.Parse(tokens[4]);
temp.Designation = tokens[5];
//lis.Add(temp);
lis.Insert(ind,temp);
ind++;
}
You are not creating a new EmpBO for each entry, but rather overwriting the same object with the read values and adding it to the List again.
The effect is that you add the same object multiple times to the List.
In your code you have created the EmpBO object only once. In the second iteration you modified the values in the same object. You have to create an instance of EmpBO inside the while loop, like below.
while ((data = sw.ReadLine())!=null)
{
Console.WriteLine("Reading " + data);
tokens = data.Split(';');
EmpBO temp = new EmpBO();
temp.Id = int.Parse(tokens[0]);
temp.Name = tokens[1];
temp.Salary = double.Parse(tokens[2]);
temp.Br = double.Parse(tokens[3]);
temp.Tax = double.Parse(tokens[4]);
temp.Designation = tokens[5];
//lis.Add(temp);
lis.Insert(ind,temp);
ind++;
}
This isn't a direct answer to the question, but your code has other problems.
Both your FileStream and StreamReader should be disposed of after use.
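A minimal sketch of that disposal pattern, keeping the rest of the loop from the question unchanged:
using (var stream = new FileStream("emp.txt", FileMode.OpenOrCreate))
using (var reader = new StreamReader(stream))
{
    string data;
    while ((data = reader.ReadLine()) != null)
    {
        // ... parse the tokens and add a new EmpBO, as shown in the answers above ...
    }
}   // both the StreamReader and the FileStream are disposed here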
Alternatively, you could write your code like this:
public List<EmpBO> ReadData()
{
return File
.ReadAllLines("emp.txt")
.Select(data =>
{
var tokens = data.Split(';');
return new EmpBO()
{
Id = int.Parse(tokens[0]),
Name = tokens[1],
Salary = double.Parse(tokens[2]),
Br = double.Parse(tokens[3]),
Tax = double.Parse(tokens[4]),
Designation = tokens[5],
};
})
.ToList();
}
That, hopefully, should be even easier.
You've inserted the same object twice. You have to create a new object in the loop; otherwise you will overwrite the attributes on each iteration and simply add a reference to the same object over and over again. It's safe to assume that standard operations on the BCL classes work correctly, or as Eric Lippert puts it, "Maybe there's something wrong with the universe, but probably not."
You simply need to change the start of the loop to this:
while ((data = sw.ReadLine())!=null)
{
EmpBO temp = new EmpBO();
If you try to add the same object twice to a list, it will overwrite the values entered the first time and will show only the values from the second object, but twice.
For example: take a list and add an object to it, modify that object, then add it again.
When you try to print the values, you will get the values of the last object.
ob1.a = 5;
list1.Add(ob1);
// list1[0].a --> 5
ob1.a = 7;
list1.Add(ob1);
// list1[0].a --> 7, list1[1].a --> 7