I have an array whose entries consist of First Name _ Last Name, so they read like so:
Michael_Jordan
Javier_Lopez
George_Jones
I have a loop set up to iterate through each of these, but I only want to take what's after the "_". The problem I have is that the array was declared globally, and it is referenced in far too many places for me to change. If I try to use the .Split function I receive the error 'System.Array' does not contain a definition for 'Split'. What is another option to take the data after the "_" in the array?
public static string GetEmployees()
{
    string queryString = "select employeeName from tbl_GlobalEmployeeData where state = 'AL'";
    SqlConnection connection = new SqlConnection(Connection.MyConnectionString.ConnectionStrings[0]);
    {
        SqlCommand cmd = new SqlCommand(queryString, connection);
        connection.Open();
        List<string> tempList = new List<string>();
        SqlDataReader reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            try
            {
                if (!reader.IsDBNull(0))
                {
                    tempList.Add(reader[0].ToString() + "_" + reader[1].ToString());
                }
            }
            catch
            {
                if (!reader.IsDBNull(0))
                {
                    tempList.Add(reader[0].ToString() + "_" + reader[1].ToString());
                }
            }
        }
        reader.Close();
        AllCompanyEmployees.State.ThisStore = tempList.ToArray();
        for (int q = AllCompanyEmployees.State.ThisStore.GetLowerBound(0); q <= AllCompanyEmployees.State.ThisStore.GetUpperBound(0); q++)
        {
            return AllCompanyEmployees.State.ThisStore[q];
        }
        return null;
    }
}
for (int q = AllCompanyEmployees.State.ThisStore.GetLowerBound(0); q <= AllCompanyEmployees.State.ThisStore.GetUpperBound(0); q++)
{
    //This line is where I get the error mentioned above
    string lastName = AllCompanyEmployees.State.ThisStore.Split('_')[1];
}
I think your question is: "I want to split each entry of the array; so where it reads Javier_Lopez, I want to take Lopez from it."
Very easy:
string last = yourString.Split(new char[] { '_' })[1];
Again, you seem to be calling this on the array itself, which is why you are getting that error. You need to iterate through your array and call it on each individual string.
EDIT: To modify the array and leave only last names, try this:
for (int i = 0; i < stringArray.Length; i++)
{
    // Replace each "First_Last" entry with just the part after the underscore.
    stringArray[i] = stringArray[i].Split(new char[] { '_' })[1];
}
You can only use Split on strings. So you could do something like this:
List<string> lastNames = new List<string>();
for (int q = AllCompanyEmployees.State.ThisStore.GetLowerBound(0); q <= AllCompanyEmployees.State.ThisStore.GetUpperBound(0); q++)
{
    string lastName = AllCompanyEmployees.State.ThisStore[q].Split('_')[1];
    lastNames.Add(lastName);
}
At the end you would have a List<string> containing all of your employees' last names, which you can then continue working with.
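For example, a small usage sketch of what you might do with that list afterwards (the array conversion and the Console output are purely illustrative):

// Work with the collected last names, e.g. as an array or by printing them.
string[] lastNameArray = lastNames.ToArray();
foreach (string name in lastNameArray)
{
    Console.WriteLine(name);
}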
Why am I getting just the table's field names but no data in the CSV file? I wrote the C# code below:
private string GetConnectionString()
{
    return @"Data Source=BLC-LT-26\SQLEXPRESS;Initial Catalog=Quality;Integrated Security=True;";
}
private string GetCSV()
{
    SqlConnection cn = new SqlConnection("Data Source=BLC-LT-26\\SQLEXPRESS;Initial Catalog=Quality;Integrated Security=True;");
    {
        cn.Open();
        return CreateCSV(new SqlCommand("select SNumber, Jnumber from J0_118 ", cn).ExecuteReader());
    }
}
private string CreateCSV(IDataReader reader)
{
    string file = @"C:\CSV\ExportedData.csv";
    List<string> lines = new List<string>();
    string headerLine = "";
    if (reader.Read())
    {
        string[] columns = new string[reader.FieldCount];
        for (int i = 0; i < reader.FieldCount; i++)
        {
            columns[i] = reader.GetName(i);
        }
        headerLine = string.Join(",", columns);
        lines.Add(headerLine);
    }
    while (reader.Read())
    {
        object[] values = new object[reader.FieldCount];
        reader.GetValues(values);
        lines.Add(string.Join(",", values));
    }
    System.IO.File.WriteAllLines(file, lines);
    return file;
}
Calling the Read() method of a DataReader loads the first record (if it exists) into the internal buffer and prepares the reader to load the next record. Calling Read() again loads the second record (if it exists).
In your code you call reader.Read() a first time to prepare the header for the CSV, and then the code enters a while loop to read the data. This logic causes the code to skip the first record and, of course, if there is only one record, the file ends up with just the column names.
To fix it, change the CreateCSV method signature so the parameter is an SqlDataReader; then, instead of using reader.Read() to check whether there are records, you can check the reader.HasRows property:
private string CreateCSV(SqlDataReader reader)
{
    string file = @"C:\CSV\ExportedData.csv";
    List<string> lines = new List<string>();
    string headerLine = "";
    if (reader.HasRows)
    {
        string[] columns = new string[reader.FieldCount];
        for (int i = 0; i < reader.FieldCount; i++)
        {
            columns[i] = reader.GetName(i);
        }
        headerLine = string.Join(",", columns);
        lines.Add(headerLine);
    }
    while (reader.Read())
    {
        object[] values = new object[reader.FieldCount];
        reader.GetValues(values);
        lines.Add(string.Join(",", values));
    }
    System.IO.File.WriteAllLines(file, lines);
    return file;
}
Still, I don't understand why you want to write the column names only if there are records. I would write them in any case; if the result set is empty, the file will simply contain just the header line.
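If you did want the header written unconditionally, a minimal sketch (same method and output path as above, with the header part simply moved out of the if) could look like this:

private string CreateCSV(SqlDataReader reader)
{
    string file = @"C:\CSV\ExportedData.csv";
    List<string> lines = new List<string>();

    // Column metadata is available before the first Read(), so the header
    // line can be written whether or not any rows come back.
    string[] columns = new string[reader.FieldCount];
    for (int i = 0; i < reader.FieldCount; i++)
    {
        columns[i] = reader.GetName(i);
    }
    lines.Add(string.Join(",", columns));

    while (reader.Read())
    {
        object[] values = new object[reader.FieldCount];
        reader.GetValues(values);
        lines.Add(string.Join(",", values));
    }
    System.IO.File.WriteAllLines(file, lines);
    return file;
}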
I have a large CSV file with millions of rows. Sample lines:
CODE,COMPANY NAME, DATE, ACTION
A,My Name , LLC,2018-01-28,BUY
B,Your Name , LLC,2018-01-25,SELL
C,
All Name , LLC,2018-01-21,SELL
D,World Name , LLC,2018-01-20,BUY
Row C contains a newline, but it is actually part of the same record. I want to remove the newline character from the CSV line within the cell/field/column.
I tried \r\n, Environment.NewLine and many other things, but could not make it work.
Here is my code:
private DataTable CSToDataTable(string csvfile)
{
    Int64 row = 0;
    try
    {
        string CSVFilePathName = csvfile; //@"C:\test.csv";
        string[] Lines = File.ReadAllLines(CSVFilePathName.Replace(Environment.NewLine, ""));
        string[] Fields;
        Fields = Lines[0].Split(new char[] { ',' });
        int Cols = Fields.GetLength(0);
        DataTable dt = new DataTable();
        //1st row must be column names; force lower case to ensure matching later on.
        for (int i = 0; i < Cols; i++)
            dt.Columns.Add(Fields[i].ToLower(), typeof(string));
        DataRow Row;
        for (row = 1; row < Lines.GetLength(0); row++)
        {
            Fields = Lines[row].Split(new char[] { ',' });
            Row = dt.NewRow();
            //Console.WriteLine(row);
            for (int f = 0; f < Cols; f++)
            {
                Row[f] = Fields[f];
            }
            dt.Rows.Add(Row);
            if (row == 190063)
            {
            }
        }
        return dt;
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
How can I remove the newline character and read the row correctly? I don't want to skip such rows, as per the business requirement.
Your CSV file is not in a valid format. In order to parse and load it successfully, you will have to sanitize it. There are a couple of issues:
The COMPANY NAME column contains the field separator in it. Fix this by surrounding the value with quotes.
A newline inside a CSV value - this can be fixed by combining the adjacent rows into one.
With Cinchoo ETL, you can sanitize and load your large file as below:
string csv = @"CODE,COMPANY NAME, DATE, ACTION
A,My Name , LLC,2018-01-28,BUY
B,Your Name , LLC,2018-01-25,SELL
C,
All Name , LLC,2018-01-21,SELL
D,World Name , LLC,2018-01-20,BUY";

string bufferLine = null;
var reader = ChoCSVReader.LoadText(csv)
    .WithFirstLineHeader()
    .Setup(s => s.BeforeRecordLoad += (o, e) =>
    {
        string line = (string)e.Source;
        string[] tokens = line.Split(',');
        if (tokens.Length == 5)
        {
            //Fix the second and third value with quotes
            e.Source = @"{0},""{1},{2}"",{3}, {4}".FormatString(tokens[0], tokens[1], tokens[2], tokens[3], tokens[4]);
        }
        else
        {
            //Fix the breaking lines, assuming a csv record is broken into at most 2 lines
            if (bufferLine == null)
            {
                bufferLine = line;
                e.Skip = true;
            }
            else
            {
                line = bufferLine + line;
                tokens = line.Split(',');
                e.Source = @"{0},""{1},{2}"",{3}, {4}".FormatString(tokens[0], tokens[1], tokens[2], tokens[3], tokens[4]);
                bufferLine = null; // reset the buffer so the next broken record can be stitched too
            }
        }
    });

foreach (var rec in reader)
    Console.WriteLine(rec.Dump());

//Careful about loading millions of rows into a DataTable
//var dt = reader.AsDataTable();
Hope it helps.
You haven't made it clear under what conditions an unwanted newline can appear in the file. So, assuming that a 'proper' line in the CSV file does NOT end with a comma, and that a line ending with a comma is therefore incomplete, you could do something like this:
static void Main(string[] args)
{
    string path = @"CSVFile.csv";
    List<CSVData> data = new List<CSVData>();
    using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
    {
        using (StreamReader sr = new StreamReader(fs))
        {
            sr.ReadLine(); // Header
            while (!sr.EndOfStream)
            {
                var line = sr.ReadLine();
                while (line.EndsWith(","))
                {
                    line += sr.ReadLine();
                }
                var items = line.Split(new string[] { "," }, StringSplitOptions.None);
                data.Add(new CSVData() { CODE = items[0], NAME = items[1], COMPANY = items[2], DATE = items[3], ACTION = items[4] });
            }
        }
    }
    Console.ReadLine();
}

public class CSVData
{
    public string CODE { get; set; }
    public string NAME { get; set; }
    public string COMPANY { get; set; }
    public string DATE { get; set; }
    public string ACTION { get; set; }
}
Obviously there's a lot of error handling to be done here (for example, when creating a new CSVData object, make sure items contains all the data you want), but I think this is the start you need.
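As a hedged illustration of that error handling, a hypothetical TryParseLine helper (the name and the skip-on-short-line policy are assumptions, not part of the answer above) could do the field-count check before the CSVData object is built:

// Hypothetical helper: returns null when a line doesn't contain the expected 5 fields,
// so the caller can decide whether to skip, log, or throw.
private static CSVData TryParseLine(string line)
{
    var items = line.Split(new string[] { "," }, StringSplitOptions.None);
    if (items.Length < 5)
    {
        return null;
    }
    return new CSVData() { CODE = items[0], NAME = items[1], COMPANY = items[2], DATE = items[3], ACTION = items[4] };
}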
I have the following method:
public static string loadLista(string pStrOp)
{
    System.Data.SqlClient.SqlConnection conn = new System.Data.SqlClient.SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["WXYZ"].ConnectionString);
    System.Data.SqlClient.SqlCommand cmd = new System.Data.SqlClient.SqlCommand("select distinct RTRIM(LTRIM(c.str_val)) as Id , RTRIM(LTRIM(c.str_val)) as Value " +
        "from dbo.toc a " +
        "inner join dbo.propval c on c.tocid = a.tocid and c.PROP_ID = 698 " +
        "where a.pset_id = 114 and c.str_val is not null " +
        "order by 2 asc;", conn);
    var items = new List<Parametro>();
    try
    {
        conn.Open();
        using (var dr = cmd.ExecuteReader())
        {
            if (dr.HasRows)
            {
                int fc = dr.FieldCount;
                var colums = new Dictionary<int, string>();
                for (int i = 0; i < fc; i++)
                    colums.Add(i, dr.GetName(i));
                object[] values = new object[fc];
                while (dr.Read())
                {
                    dr.GetValues(values); //Get All Values
                    Parametro item = Activator.CreateInstance<Parametro>();
                    var props = item.GetType().GetProperties();
                    foreach (var p in props)
                    {
                        foreach (var col in colums)
                        {
                            if (p.Name != col.Value) continue;
                            var value = values[col.Key];
                            if (value.GetType() != typeof(DBNull))
                            {
                                p.SetValue(item, value, null);
                            }
                        }
                    }
                    items.Add(item);
                }
            }
        }
    }
    catch (Exception ex)
    {
        return new System.Web.Script.Serialization.JavaScriptSerializer().Serialize("");
    }
    return new System.Web.Script.Serialization.JavaScriptSerializer().Serialize(items);
}
As you can see, I use a SqlCommand to run a query, read the results with a DataReader, and finally save them in a list. The method works perfectly; the question is how I can replace the use of List in this method with an array instead, since I have a limitation that prevents me from using Lists.
If you could give me an applied example, it would be very helpful. Thanks in advance for the help.
I don't quite understand why you can't use List<T>, and you should figure that out first; however, for academic purposes:
Array.Resize Method (T[], Int32)
Changes the number of elements of a one-dimensional array to the
specified new size.
Remarks
This method allocates a new array with the specified size, copies elements from the old array to the new one, and then replaces the old array with the new one. array must be a one-dimensional array. If array is null, this method creates a new array with the specified size.
So instead of using a List<T>:
Parametro[] items;
...
Array.Resize(ref items, (items?.Length ?? 0) + 1); // resize
items[items.Length - 1] = item;                    // add item at the last index
Note that this is fairly inefficient; when List<T> has to grow its internal array, it actually resizes the capacity to a factor of 2 of what's needed rather than reallocating on every add.
Disclaimer: not tested, there may be typos.
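To avoid reallocating on every add, a rough sketch of the doubling strategy List<T> uses internally might look like this (the GrowAndAdd helper name and the starting capacity of 4 are assumptions for illustration):

// Hypothetical helper: appends an item, doubling the backing array only when it is full.
static void GrowAndAdd<T>(ref T[] buffer, ref int count, T item)
{
    if (buffer == null)
    {
        buffer = new T[4]; // arbitrary starting capacity
    }
    else if (count == buffer.Length)
    {
        Array.Resize(ref buffer, buffer.Length * 2); // double, like List<T> does internally
    }
    buffer[count++] = item;
}

// Usage: keep a separate count while filling, then trim once at the end.
// Parametro[] items = null; int count = 0;
// GrowAndAdd(ref items, ref count, item);
// ...
// Array.Resize(ref items, count); // shrink to the actual number of elements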
How do I insert an array of data into a database with LINQ to SQL?
DataClassesDataContext db = new DataClassesDataContext();
simpleTbl tbl = new simpleTbl();
string[] str = File.ReadAllLines(Server.MapPath("~/str.txt"));
for (int i = 0; i <= 10; i++)
{
    tbl.Name = str[i];
}
db.simpleTbl.InsertOnSubmit(tbl);
db.SubmitChanges();
but it doesn't work.
using (DataClassesDataContext db = new DataClassesDataContext())
{
    string[] strings = File.ReadAllLines(Server.MapPath("~/str.txt"));
    foreach (var str in strings)
    {
        db.simpleTbl.InsertOnSubmit(new simpleTbl() { Name = str });
    }
    db.SubmitChanges();
}
You need to insert a new item for each entry in the string array.
This code is correct, but it works for one array; if I have more than one array, how do I implement this?
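One possible way to extend this to more than one source array is simply to insert from each array before calling SubmitChanges; a rough sketch, where the second file name str2.txt and the use of Enumerable.Concat (requires System.Linq) are assumptions for illustration:

using (DataClassesDataContext db = new DataClassesDataContext())
{
    string[] first = File.ReadAllLines(Server.MapPath("~/str.txt"));
    string[] second = File.ReadAllLines(Server.MapPath("~/str2.txt")); // hypothetical second file

    // Insert one row per entry from both arrays, then submit once.
    foreach (var str in first.Concat(second))
    {
        db.simpleTbl.InsertOnSubmit(new simpleTbl() { Name = str });
    }
    db.SubmitChanges();
}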
Q:
When I try to execute the following parameterized query:
INSERT INTO days (day,short,name,depcode,studycode,batchnum) values (?,?,?,?,?,?);SELECT SCOPE_IDENTITY();
through command.ExecuteScalar();
it throws the following exception:
ERROR [07001] [Informix .NET provider]Wrong number of parameters.
Where is the problem?
EDIT:
public static int InsertDays(List<Day> days)
{
    int affectedRow = -1;
    Dictionary<string, string> daysParameter = new Dictionary<string, string>();
    try
    {
        foreach (Day a in days)
        {
            daysParameter.Add("day", a.DayId.ToString());
            daysParameter.Add("short", a.ShortName);
            daysParameter.Add("name", a.Name);
            daysParameter.Add("depcode", a.DepCode.ToString());
            daysParameter.Add("studycode", a.StudyCode.ToString());
            daysParameter.Add("batchnum", a.BatchNum.ToString());
            affectedRow = DBUtilities.InsertEntity_Return_ID("days", daysParameter);
            daysParameter.Clear();
            if (affectedRow < 0)
            {
                break;
            }
        }
    }
    catch (Exception ee)
    {
        string message = ee.Message;
    }
    return affectedRow;
}

public static int InsertEntity_Return_ID(string tblName, Dictionary<string, string> dtParams)
{
    int Result = -1;
    DBConnectionForInformix DAL_Helper = new DBConnectionForInformix("");
    string[] field_names = new string[dtParams.Count];
    dtParams.Keys.CopyTo(field_names, 0);
    string[] field_values = new string[dtParams.Count];
    string[] field_valuesParam = new string[dtParams.Count];
    dtParams.Values.CopyTo(field_values, 0);
    for (int i = 0; i < field_names.Length; i++)
    {
        field_valuesParam[i] = "?";
    }
    string insertCmd = @"INSERT INTO " + tblName + " (" + string.Join(",", field_names) + ") values (" + string.Join(",", field_valuesParam) + ");SELECT SCOPE_IDENTITY();";
    Result = int.Parse(DAL_Helper.Return_Scalar(insertCmd));
    return Result;
}
You haven't shown where you're actually populating the parameter values. Given that you've got the right number of question marks, I suspect that's where the problem lies.
EDIT: Okay, now that you've posted more code, it's obvious what's going wrong: your Return_Scalar method isn't accepting any actual values! You're not using field_values anywhere after populating it. You need to set the parameters on the command.
(You should also look at .NET naming conventions, by the way...)
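A minimal sketch of that fix, assuming the values can be bound through a plain ADO.NET IDbCommand (the Return_Scalar overload shown here, and passing in an open connection, are hypothetical; the internals of DBConnectionForInformix aren't shown in the question):

// requires: using System.Data;
// Hypothetical overload that actually binds the values to the command's "?" placeholders.
public string Return_Scalar(string commandText, string[] parameterValues, IDbConnection connection)
{
    using (IDbCommand command = connection.CreateCommand()) // connection is assumed to be open
    {
        command.CommandText = commandText;
        foreach (string value in parameterValues)
        {
            // One parameter per "?" placeholder, in the same order as the column list.
            IDbDataParameter p = command.CreateParameter();
            p.Value = (object)value ?? DBNull.Value; // pass DBNull rather than null
            command.Parameters.Add(p);
        }
        object result = command.ExecuteScalar();
        return result?.ToString();
    }
}

InsertEntity_Return_ID would then pass field_values (and the connection) along with insertCmd instead of the command text alone.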
Ensure that, where you are providing the parameters, none of the values is null; a null value may cause the provider to ignore the parameter. If this is your issue, pass DBNull.Value.
EDIT
As Jon stated, you need to use command.Parameters to give the command the parameter values to use in the query.