This question already exists:
I have around 20,000 key-value pairs for a dictionary; how do I create it in C# without it getting slow or running into memory problems? [closed]
Need to create a dictionary (not a C# Dictionary) which holds 20,000 words. What is the best way to store and search the data? I have 20,000 key-value pairs. [duplicate]
Closed 1 year ago.
OleDbConnection connection;
OleDbCommand command;
string commandText = "SELECT pali,sinhala FROM [Sheet3$]";
string oledbConnectString = "Provider=Microsoft.ACE.OLEDB.12.0;" +
                            "Data Source=Book2.xlsx;" +
                            "Extended Properties=\"Excel 12.0;HDR=YES\";";
connection = new OleDbConnection(oledbConnectString);
command = new OleDbCommand(commandText, connection);
OleDbDataAdapter da = new OleDbDataAdapter(command);
DataTable dt = new DataTable();
da.Fill(dt);           // Fill opens and closes the connection itself
connection.Close();    // close only after filling, not before
dataGridView1.DataSource = dt;
The Excel sheet has 20,000 rows (UTF-8 Unicode). Even on my laptop (8 GB RAM, i3) it takes about 8 seconds to load. Is there a better way to store this data? I want to build a dictionary application (not a C# Dictionary).
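One approach worth sketching (not a tested solution): read the two columns once with an OleDbDataReader and load them into a C# Dictionary<string, string>, which then gives constant-time lookups in memory. The file, sheet, and column names below are the ones from the question; error handling is omitted.

```csharp
using System;
using System.Collections.Generic;
using System.Data.OleDb;

static Dictionary<string, string> LoadWordList()
{
    string connect = "Provider=Microsoft.ACE.OLEDB.12.0;" +
                     "Data Source=Book2.xlsx;" +
                     "Extended Properties=\"Excel 12.0;HDR=YES\";";

    // Pre-size to 20,000 entries to avoid repeated rehashing while loading.
    var words = new Dictionary<string, string>(20000);

    using (var connection = new OleDbConnection(connect))
    using (var command = new OleDbCommand("SELECT pali,sinhala FROM [Sheet3$]", connection))
    {
        connection.Open();
        using (OleDbDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // Assumes no empty cells; add IsDBNull checks otherwise.
                words[reader.GetString(0)] = reader.GetString(1);
            }
        }
    }
    return words;
}
```

Loading once at startup and keeping the dictionary in memory (20,000 short strings is only a few megabytes) avoids re-querying the sheet, and binding the grid to a snapshot of the data is then independent of lookups.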
Sorry for the vague title, but I'm not sure how best to word it.
Basically, I was initially using an OleDbConnection in C# to open a connection to a spreadsheet, query it, and load the results into a .NET DataTable. Unfortunately, it maxes out at 255 fields, and I have around 600.
So instead I created a DataSet and tried to load 4 separate DataTables with separate queries. What's crazy to me is that if I load, let's say, the first DataTable with 190 fields and then query the spreadsheet again, going over that 250 mark (which would leave me 60) raises the following error:
An exception of type 'System.Data.OleDb.OleDbException' occurred in System.Data.dll but was not handled in user code
Additional information: No value given for one or more required parameters.
If I reduce the number of fields in the second table so the total stays under 250, it works. So, is there a way to clear whatever the OleDbConnection is caching, dropping the first 190 fields or whatever is holding the result of the Excel query, so I can move on to the next query? I tried disposing the data adapter, but that didn't help; same issue. If I dispose the connection, I have to re-initialize it anyway. Is there a way to do this without dropping the connection? Code below:
OleDbConnection cnn = new OleDbConnection(Settings.ExcelCN);
OleDbCommand fillT1 = new OleDbCommand(Settings.Excel_Proj_Select_1, cnn);
OleDbCommand fillT2 = new OleDbCommand(Settings.Excel_Proj_Select_2, cnn);
OleDbCommand fillT3 = new OleDbCommand(Settings.Excel_Proj_Select_3, cnn);
OleDbCommand fillT4 = new OleDbCommand(Settings.Excel_Proj_Select_4, cnn);
OleDbCommand updateCnt = new OleDbCommand(Settings.Excel_Update_Count_Select, cnn);
cnn.Open();
OleDbDataAdapter adp1 = new OleDbDataAdapter(fillT1);
OleDbDataAdapter adp2 = new OleDbDataAdapter(fillT2);
OleDbDataAdapter adp3 = new OleDbDataAdapter(fillT3);
OleDbDataAdapter adp4 = new OleDbDataAdapter(fillT4);
DataTable dt1 = new DataTable();
DataTable dt2 = new DataTable();
DataTable dt3 = new DataTable();
DataTable dt4 = new DataTable();
DataSet ds1 = new DataSet();
adp1.Fill(dt1);
ds1.Tables.Add(dt1);
adp2.Fill(dt2);
ds1.Tables.Add(dt2);
adp3.Fill(dt3);
ds1.Tables.Add(dt3);
adp4.Fill(dt4);
ds1.Tables.Add(dt4);
int rowcount = updateCnt.ExecuteNonQuery();
cnn.Close();
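One workaround worth trying, under the assumption that whatever is accumulating the queried fields is per-connection state (I have not verified this against the ACE provider): give each adapter its own short-lived connection, so no single connection ever sees more than one query's worth of fields. The Settings names are the ones from the question.

```csharp
using System.Data;
using System.Data.OleDb;

// Hypothetical helper: runs one SELECT on its own connection and disposes it.
static DataTable FillOnFreshConnection(string selectSql)
{
    var table = new DataTable();
    using (var cnn = new OleDbConnection(Settings.ExcelCN))
    using (var adp = new OleDbDataAdapter(selectSql, cnn))
    {
        adp.Fill(table); // Fill opens and closes the connection itself
    }
    return table;
}

var ds1 = new DataSet();
ds1.Tables.Add(FillOnFreshConnection(Settings.Excel_Proj_Select_1));
ds1.Tables.Add(FillOnFreshConnection(Settings.Excel_Proj_Select_2));
ds1.Tables.Add(FillOnFreshConnection(Settings.Excel_Proj_Select_3));
ds1.Tables.Add(FillOnFreshConnection(Settings.Excel_Proj_Select_4));
```

Opening an OLE DB connection to a local workbook is cheap compared to the fills themselves, so the extra open/close cycles should not cost much.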
Given the following code that fills in a DataTable object with data from a query:
var response = new DataTable();
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    SqlCommand sqlCommand = new SqlCommand(query, connection);
    sqlCommand.CommandTimeout = 30;
    SqlDataAdapter da = new SqlDataAdapter(sqlCommand);
    da.Fill(response);
    connection.Close();
    da.Dispose();
}
Suppose any query may be sent through this code. If the query returns tens of thousands of rows involving several gigabytes of data, there could be performance problems. Is there a way to alter the Fill call so it only fills in, e.g., the first 1,000 rows and stops accepting data after that?
Currently, when larger queries are returned, a large amount of heap space is taken (the picture shows a snapshot before and after the fill), and I'm looking for a way to programmatically fill only a certain amount.
Is this possible?
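One option (a sketch; connectionString and query are placeholders as in the question): DbDataAdapter exposes a Fill(int startRecord, int maxRecords, params DataTable[]) overload that stops adding rows after maxRecords. Note the server may still produce the full result set; limiting in SQL is usually cheaper when possible.

```csharp
using System.Data;
using System.Data.SqlClient;

var response = new DataTable();
using (var connection = new SqlConnection(connectionString))
using (var sqlCommand = new SqlCommand(query, connection))
using (var da = new SqlDataAdapter(sqlCommand))
{
    sqlCommand.CommandTimeout = 30;
    // Start at record 0 and load at most 1000 rows into 'response'.
    da.Fill(0, 1000, response);
}
```

Alternatively, rewriting the query as SELECT TOP (1000) ... limits the work on the server side as well, instead of only discarding rows on the client.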
At work I've used ClosedXML in the past to go from a DataTable or DataSet to Excel extremely quickly, without looping. Now I need to go the other way, but the only documentation I can find for ClosedXML, or anything else, to go from Excel to a DataTable involves looping. I can't imagine that, with the current demand for speed, the large amounts of data that can go into Excel, and the widespread use of Office, nobody has figured out a faster way than looping.
Is there a way in ClosedXML, or another reasonably sized, safe library, to quickly move Excel data into DataTables or other System.Data objects such as DataSets without looping?
You can use a plain OleDb connection to read data from a worksheet into a DataTable:
string strExcelConn = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\filename.xlsx;Extended Properties='Excel 12.0 Xml;HDR=YES;'";
using (OleDbConnection connExcel = new OleDbConnection(strExcelConn))
{
    string selectString = "SELECT * FROM [CA$A1:D500]";
    using (OleDbCommand cmdExcel = new OleDbCommand(selectString, connExcel))
    {
        connExcel.Open();
        DataTable dt = new DataTable();
        OleDbDataAdapter adp = new OleDbDataAdapter();
        adp.SelectCommand = cmdExcel;
        adp.FillSchema(dt, SchemaType.Source);
        adp.Fill(dt);
        int columnCount = dt.Columns.Count;
        int rowCount = dt.Rows.Count;
    }
}
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
I want to convert this code to use a DataSet. I don't have any idea how to retrieve data from a DataSet.
conn.Open();
string strquery = "select * from tblclientinfo where clientId=" + txtid.Text;
SqlCommand sqlcmd = new SqlCommand(strquery, conn);
SqlDataReader sqldreader = sqlcmd.ExecuteReader();
if (sqldreader.HasRows)
{
    while (sqldreader.Read())
    {
        txtid.Text = sqldreader["clientId"].ToString();
        txtname.Text = sqldreader["name"].ToString();
        txtmobile.Text = sqldreader["mobile"].ToString();
        txtcnic.Text = sqldreader["cnic"].ToString();
    }
}
else
{
    MessageBox.Show("Record for ID " + txtid.Text + " not found!");
}
sqldreader.Close();
conn.Close();
Do you want something like this?
SqlDataAdapter da = new SqlDataAdapter();
da.SelectCommand = new SqlCommand("select * from tblclientinfo where clientId = @id", conn);
da.SelectCommand.Parameters.AddWithValue("@id", txtid.Text);
DataSet ds = new DataSet();
da.Fill(ds);
foreach (DataRow row in ds.Tables[0].Rows)
{
    txtid.Text = row["clientId"].ToString();
    txtname.Text = row["name"].ToString();
    ...
}
The DataReader is the active part, reading the data in from the database connection.
The DataSet is a container defined in a generic way, so that it can hold any structure of data that is read in by a DataReader.
You can use this approach:
// Assumes that conn is a valid SqlConnection object.
string strquery = "select * from tblclientinfo where clientId=" + txtid.Text;
SqlDataAdapter adapter = new SqlDataAdapter(strquery, conn);
DataSet resultSet = new DataSet();
adapter.Fill(resultSet, "resultSet");
// Now you can read data from the DataTables contained in your DataSet.
DataTable firstTableOfDataSet = resultSet.Tables[0];
As your question asks about the difference, here it goes:
A DataReader is used in a connected architecture. That means the connection remains open while you iterate through and fetch the records one by one; it only closes when you call con.Close() after fetching.
A DataSet and DataAdapter are used in a disconnected architecture. They bring the data over and do not hold the connection open for long, so if the database changes after the data is fetched, you won't see the updates, because the data is cached.
Take an example: an oil company is located in, say, Dubai (the data source, the database), and it needs to send oil to, say, the USA (the application UI). There are two ways:
A jet plane carries the oil to the USA: disconnected. The jet plane is the DataAdapter, and the container in which the oil is brought is the DataSet/DataTable.
A pipeline runs from Dubai to the USA: connected. The pipeline is the reader.
An awkward example ;) but I hope it makes the difference a little clearer.
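A minimal sketch of the two patterns side by side (the table and column names are placeholders, and conn is assumed to be an open SqlConnection):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

// Connected: the connection stays busy while rows stream through the reader.
using (var cmd = new SqlCommand("select name from tblclientinfo", conn))
using (SqlDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
        Console.WriteLine(reader["name"]);
} // once the reader is closed, the rows are gone

// Disconnected: Fill copies all rows into memory, then releases the connection.
var ds = new DataSet();
using (var da = new SqlDataAdapter("select name from tblclientinfo", conn))
{
    da.Fill(ds, "clients");
}
// The cached copy can be re-read any number of times, even with the connection closed.
foreach (DataRow row in ds.Tables["clients"].Rows)
    Console.WriteLine(row["name"]);
```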
This question already has answers here:
Is datareader quicker than dataset when populating a datatable?
(9 answers)
Closed 10 years ago.
I am using the following code (Variant DataReader):
public DataTable dtFromDataReader(List<string> lstStrings)
{
    OleDBConn_.Open();
    using (OleDbCommand cmd = new OleDbCommand())
    {
        DataTable dt = new DataTable();
        cmd.Connection = OleDBConn_;
        cmd.CommandText = "SELECT * from TableX where SUID=?";
        foreach (string aString in lstStrings)
        {
            cmd.Parameters.AddWithValue("?", aString);
            using (OleDbDataReader reader = cmd.ExecuteReader())
            {
                dt.Load(reader);
            }
            cmd.Parameters.Clear();
        }
        return dt;
    }
}
and compare it to (Variant DataAdapter):
public DataTable dtFromDataAdapter(List<string> lstStrings)
{
    DataTable dt = new DataTable();
    foreach (string aString in lstStrings)
    {
        string sOledb_statement = String.Concat("SELECT * FROM TableX where SUID='", aString, "'");
        using (var oleDbAdapter = new OleDbDataAdapter(sOledb_statement, OleDBConn_))
        {
            GetOleDbRows = oleDbAdapter.Fill(dt);
        }
    }
    return dt;
}
When I connect to an offline database (Microsoft Access), my reading times are (~1.5k retrieved items):
DataReader 420 ms
DataAdapter 5613 ms
When reading from an Oracle server (~30k retrieved items):
DataReader 323845 ms
DataAdapter 204153 ms
(Several tests; the times do not change much.)
Even changing the order of the commands (DataAdapter before DataReader) didn't change much (I thought there might have been some precaching).
I thought DataTable.Load should be somewhat faster than DataAdapter.Fill? And even though I see the results, I still believe it should be faster. Where am I losing my time? (There are no unhandled exceptions.)
Your comparison isn't really Adapter vs. DataReader the way your code is set up. You are really comparing the Adapter.Fill and DataTable.Load methods.
The DataReader would normally be faster on a per-record basis, because you traverse the records one at a time and can react accordingly as you read each one.
Since you are returning a DataTable in both cases, the Adapter.Fill method is probably the optimal choice; it was designed to do exactly that.
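One thing worth trying regardless of Fill vs. Load (a sketch, untested against Oracle): both of your variants issue one round trip per string, so for ~30k items most of the time may be per-query overhead rather than either API. Batching the keys into fewer queries with an IN list can cut the round trips dramatically.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.OleDb;
using System.Linq;

// Batch keys into IN (...) lists of, say, 500 to reduce round trips.
static DataTable dtBatched(OleDbConnection conn, List<string> keys)
{
    var dt = new DataTable();
    const int batchSize = 500;
    for (int i = 0; i < keys.Count; i += batchSize)
    {
        var batch = keys.Skip(i).Take(batchSize)
                        .Select(k => "'" + k.Replace("'", "''") + "'"); // naive escaping, not injection-safe
        string sql = "SELECT * FROM TableX WHERE SUID IN (" + string.Join(",", batch) + ")";
        using (var da = new OleDbDataAdapter(sql, conn))
        {
            da.Fill(dt);
        }
    }
    return dt;
}
```

Providers cap the number of IN-list items (and total statement length), so the batch size may need tuning; parameterized batches would be safer where the provider supports them.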