I have a SQLite database consisting of 50 columns and more than 1.2 million rows. I'm using System.Data.SQLite with C# in Visual Studio 2013.
I use a very simple piece of code to retrieve data from the database, but it takes far too long.
private SQLiteConnection sqlite;

public MySqlite(string path)
{
    sqlite = new SQLiteConnection("Data Source=" + path + "\\DBName.sqlite");
}

public DataTable selectQuery(string query)
{
    SQLiteDataAdapter ad;
    DataTable dt = new DataTable();
    try
    {
        SQLiteCommand cmd;
        sqlite.Open();
        cmd = sqlite.CreateCommand();
        cmd.CommandText = query; // set the passed query
        ad = new SQLiteDataAdapter(cmd);
        ad.Fill(dt); // fill the datasource
    }
    catch (SQLiteException ex)
    {
        // exception code here.
    }
    sqlite.Close();
    return dt;
}
And the select statement is:
select * from table
As I said, it is very simple code.
I want to know how to improve the performance of this select operation. For this code the process takes up to a minute, and I would like to get it under one second.
Another thing: there seem to be some settings for configuring an SQLite database, but I don't know where to apply them. Could someone tell me how to configure an SQLite database with System.Data.SQLite?
Consider narrowing your result set by selecting only the columns you need, or by paging.
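To make that concrete, here is a minimal sketch of that suggestion, using hypothetical column names (col1, col2, col3) and the connection string from the question: select only the columns you need and page through the rows with LIMIT/OFFSET instead of materialising all 1.2 million rows into one DataTable. As for the configuration settings, PRAGMA statements (for example PRAGMA cache_size) can be executed like any other command once the connection is open.
public DataTable SelectPage(int pageSize, int pageIndex)
{
    DataTable dt = new DataTable();
    using (SQLiteConnection conn = new SQLiteConnection("Data Source=" + path + "\\DBName.sqlite"))
    using (SQLiteCommand cmd = new SQLiteCommand(
        "SELECT col1, col2, col3 FROM myTable LIMIT @limit OFFSET @offset", conn))
    {
        // Only the columns that are actually needed, and only one page of rows.
        cmd.Parameters.AddWithValue("@limit", pageSize);
        cmd.Parameters.AddWithValue("@offset", pageIndex * pageSize);
        using (SQLiteDataAdapter adapter = new SQLiteDataAdapter(cmd))
        {
            adapter.Fill(dt); // Fill opens and closes the connection itself
        }
    }
    return dt;
}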
Related
Currently I'm working with my local MSSQL database, and when I make a connection everything works fine. However, that is not my question right now; I want to make my code cleaner and avoid duplicated (or nearly duplicated) code.
At the moment I'm working with one large class that holds all the methods for selecting, creating, updating and deleting a user. But I think it could be written better with a reusable method that the SQL string is passed into.
The only thing is that I'm (for now) a complete beginner and have no idea how to accomplish this... please help? As an example I've included the two regions below; I might change them to classes.
#region Select Data from Database
public DataTable Select()
{
    // connection to the database using the shared connection string
    SqlConnection conn = new SqlConnection(myconnstring);
    // to hold the data from the database
    DataTable dt = new DataTable();
    try
    {
        // sql query to get data from the database
        String sql = "SELECT * FROM tbl_users";
        // for executing the command
        SqlCommand cmd = new SqlCommand(sql, conn);
        // getting data from the database
        SqlDataAdapter adapter = new SqlDataAdapter(cmd);
        // open the database connection
        conn.Open();
        // fill the datatable
        adapter.Fill(dt);
    }
    catch (Exception ex)
    {
        // show a message if any error occurs
        MessageBox.Show(ex.Message);
    }
    finally
    {
        // closing connection
        conn.Close();
    }
    // return the datatable
    return dt;
}
#endregion
#region Search User on Database using KeyWords
public DataTable Search(string keywords)
{
    // connection to the database using the shared connection string
    SqlConnection conn = new SqlConnection(myconnstring);
    // to hold the data from the database
    DataTable dt = new DataTable();
    try
    {
        // sql query to get data from the database
        String sql = "SELECT * FROM tbl_users WHERE id LIKE '%"+keywords+"%' OR first_name LIKE '%"+keywords+"%' OR last_name LIKE '%"+keywords+"%' OR username LIKE '%"+keywords+"%'";
        // for executing the command
        SqlCommand cmd = new SqlCommand(sql, conn);
        // getting data from the database
        SqlDataAdapter adapter = new SqlDataAdapter(cmd);
        // open the database connection
        conn.Open();
        // fill the datatable
        adapter.Fill(dt);
    }
    catch (Exception ex)
    {
        // show a message if any error occurs
        MessageBox.Show(ex.Message);
    }
    finally
    {
        // closing connection
        conn.Close();
    }
    // return the datatable
    return dt;
}
#endregion
Step 1: read https://xkcd.com/327/ and, whatever solution you go with, fix id LIKE '%"+keywords+"%'.
I encourage you to look into an object mapper like Dapper, which will make your methods return typed objects (e.g. User) rather than raw DataTables. An ORM helps push you into the pit of success.
As for reuse, notice that your SELECT methods look very similar, so you could extract a helper method, DataTable ExecuteSelect(string sql), and reuse it from both Search and Select (a sketch follows below).
You really must fix the '%"+keywords+"%' issue. SQL injection is no joke.
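A sketch of that helper, assuming myconnstring is the same field used elsewhere in the class; Search passes its keyword as a parameter, which removes the injection risk (the try/catch from the original methods is left out for brevity, and the numeric-looking id column is omitted from the LIKE list):
private DataTable ExecuteSelect(string sql, params SqlParameter[] parameters)
{
    // One place that owns the connection/command plumbing for every SELECT.
    DataTable dt = new DataTable();
    using (SqlConnection conn = new SqlConnection(myconnstring))
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddRange(parameters);
        conn.Open();
        dt.Load(cmd.ExecuteReader());
    }
    return dt;
}

public DataTable Select()
{
    return ExecuteSelect("SELECT * FROM tbl_users");
}

public DataTable Search(string keywords)
{
    return ExecuteSelect(
        "SELECT * FROM tbl_users WHERE first_name LIKE @pattern OR last_name LIKE @pattern OR username LIKE @pattern",
        new SqlParameter("@pattern", "%" + keywords + "%"));
}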
Think about changing your whole approach and working with Entity Framework.
That way you create a connection to the database and a class for each main entity you have there.
Afterwards all your selects, updates, deletes and so on are generated for you, with all the functionality you have in MSSQL.
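A minimal sketch of what that can look like with Entity Framework 6; the class names are hypothetical and the mapping attributes assume the tbl_users schema from the question:
// using System.ComponentModel.DataAnnotations.Schema;
// using System.Data.Entity;
// using System.Linq;

[Table("tbl_users")]
public class User
{
    [Column("id")]
    public int Id { get; set; }
    [Column("first_name")]
    public string FirstName { get; set; }
    [Column("last_name")]
    public string LastName { get; set; }
    [Column("username")]
    public string Username { get; set; }
}

public class UsersContext : DbContext
{
    public DbSet<User> Users { get; set; }
}

// Example query: LINQ replaces the hand-written SELECT/WHERE strings and is
// parameterised for you, so the injection issue goes away as well.
public List<User> SearchUsers(string keywords)
{
    using (var db = new UsersContext())
    {
        return db.Users
                 .Where(u => u.FirstName.Contains(keywords) ||
                             u.LastName.Contains(keywords) ||
                             u.Username.Contains(keywords))
                 .ToList();
    }
}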
(This would be a mess as a comment.)
First of all, write code that is trustworthy and has no unnecessary additions, i.e.:
#region Select Data from Database
public DataTable Select(string sql)
{
    DataTable dt = new DataTable();
    try
    {
        // getting data from the database
        SqlDataAdapter adapter = new SqlDataAdapter(sql, myconnstring);
        // fill the datatable
        adapter.Fill(dt);
    }
    catch (Exception ex)
    {
        // show a message if any error occurs
        MessageBox.Show(ex.Message);
    }
    // return the datatable
    return dt;
}
#endregion
#region Search User on Database using KeyWords
public DataTable Search(string keywords)
{
    // to hold the data from the database
    DataTable dt = new DataTable();
    String sql = @"SELECT *
                   FROM tbl_users
                   WHERE id LIKE @pattern OR
                         first_name LIKE @pattern OR
                         last_name LIKE @pattern OR
                         username LIKE @pattern";
    try
    {
        using (SqlConnection conn = new SqlConnection(myconnstring))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.Add("@pattern", SqlDbType.VarChar).Value = "%" + keywords + "%";
            // open the database connection
            conn.Open();
            // fill the datatable
            dt.Load(cmd.ExecuteReader());
        }
    }
    catch (Exception ex)
    {
        // show a message if any error occurs
        MessageBox.Show(ex.Message);
    }
    // return the datatable
    return dt;
}
#endregion
First, take some time to understand why you should use parameters and how you would use them.
Next, you will quickly see that this pattern is not flexible and not worth turning into a class as-is.
You should think about moving away from DataTable and DataSet as well. Look into LINQ and Entity Framework (LINQ to Entities). Others have already invented this wheel for you. Read about patterns such as the Repository pattern (a bare-bones sketch follows below). And also read about different backends and their advantages/disadvantages, weighing them against your use cases, before coding specifically for one of them.
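A bare-bones sketch of the Repository idea, reusing the hypothetical User and UsersContext classes from the Entity Framework example earlier; the point is only that callers depend on the interface, not on a particular backend:
public interface IUserRepository
{
    IList<User> GetAll();
    IList<User> Search(string keywords);
}

// The implementation can use EF, Dapper, or raw ADO.NET without the callers
// knowing or caring which backend sits behind it.
public class EfUserRepository : IUserRepository
{
    public IList<User> GetAll()
    {
        using (var db = new UsersContext())
        {
            return db.Users.ToList();
        }
    }

    public IList<User> Search(string keywords)
    {
        using (var db = new UsersContext())
        {
            return db.Users
                     .Where(u => u.Username.Contains(keywords))
                     .ToList();
        }
    }
}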
I have a handheld device that connects to a SQL Server database, reads the SQL Server data and copies it into the SQL Server Compact database located on the device. This is my code:
public void InsertData() // inserts data into the SQL Compact database
{
    dt = new DataTable();
    dt = SqlServer_GetData_dt("Select id_price, price, id_item from prices", SqlCeConnection); // get data from SQL Server
    if (dt.Rows.Count > 0)
    {
        for (int i = 0; i < dt.Rows.Count; i++)
        {
            string sql = "";
            sql = "insert into prices" +
                  " ( id_prices, price, id_item) values('"
                  + dt.Rows[i]["id_price"].ToString().Trim() + "', '"
                  + dt.Rows[i]["price"].ToString().Trim() + "', '"
                  + dt.Rows[i]["id_item"].ToString().Trim() + "')";
            obj.SqlCE_WriteData_bit(sql, connection.ConnectionString); // insert into SQL Compact
        }
    }
}
public DataTable SqlServer_GetData_dt(string query, string conn)
{
    try
    {
        DataTable dt = new DataTable();
        string SqlCeConnection = conn;
        SqlConnection sqlConnection = new SqlConnection(SqlCeConnection);
        sqlConnection.Open();
        {
            SqlDataReader darSQLServer;
            SqlCommand cmdCESQLServer = new SqlCommand();
            cmdCESQLServer.Connection = sqlConnection;
            cmdCESQLServer.CommandType = CommandType.Text;
            cmdCESQLServer.CommandText = query;
            darSQLServer = cmdCESQLServer.ExecuteReader();
            dt.Load(darSQLServer);
            sqlConnection.Close();
        }
        return dt;
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
        DataTable dt = new DataTable();
        return dt;
    }
}
public object SqlCE_WriteData_bit(string query, string conn)
{
    try
    {
        string SqlCeConnection = conn;
        SqlCeConnection sqlConnection = new SqlCeConnection(SqlCeConnection);
        if (sqlConnection.State == ConnectionState.Closed)
        {
            sqlConnection.Open();
        }
        SqlCeCommand cmdCESQLServer = new SqlCeCommand();
        cmdCESQLServer.Connection = sqlConnection;
        cmdCESQLServer.CommandType = CommandType.Text;
        cmdCESQLServer.CommandText = query;
        object i = cmdCESQLServer.ExecuteScalar();
        sqlConnection.Close();
        return i;
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
        return 0;
    }
}
This all works, but the problem is that it is very slow. I have 20,000 rows that need to be inserted into the SQL Compact database.
Is there any way to insert faster?
Thanks.
Aside from the obvious poor usage of the connection for every call, you can greatly improve things by eliminating the query processor altogether. That means: don't use SQL. Instead, open the destination table with TableDirect and a SqlCeResultSet, then iterate through the source data (a DataTable is a bad idea, but that's a separate issue) and use a series of CreateRecord, SetValues and Insert calls.
A pretty good example can be found here (though again, I'd use SetValues to set the entire row, not each individual field).
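A sketch of what the TableDirect approach can look like with System.Data.SqlServerCe, reusing the already-filled dt from the question; ceConnectionString is a placeholder for the device database connection string, and the column order in dt is assumed to match the prices table:
using (SqlCeConnection conn = new SqlCeConnection(ceConnectionString))
using (SqlCeCommand cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandType = CommandType.TableDirect;
    cmd.CommandText = "prices";                                   // table name, not a SQL statement
    using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable))
    {
        foreach (DataRow row in dt.Rows)
        {
            SqlCeUpdatableRecord record = rs.CreateRecord();
            record.SetValues(row.ItemArray);                      // set the whole row in one call
            rs.Insert(record);
        }
    }
}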
Reuse your connection and don't create a new connection for every INSERT statement.
Instead of passing a connection string to your SqlCE_WriteData_bit method, create the connection once in the InsertData method, and pass the connection object to SqlCE_WriteData_bit.
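One shape this could take, sketched as a replacement for the insert loop: a single connection is opened once, and one parameterised, prepared SqlCeCommand is reused for every row (ceConnectionString and the NVarChar parameter types are assumptions, since the original code inserts everything as quoted strings):
using (SqlCeConnection conn = new SqlCeConnection(ceConnectionString))
using (SqlCeCommand cmd = new SqlCeCommand(
    "insert into prices (id_prices, price, id_item) values (@id_price, @price, @id_item)", conn))
{
    conn.Open();
    cmd.Parameters.Add("@id_price", SqlDbType.NVarChar, 50);
    cmd.Parameters.Add("@price", SqlDbType.NVarChar, 50);
    cmd.Parameters.Add("@id_item", SqlDbType.NVarChar, 50);
    cmd.Prepare();                                                // compile the plan once

    foreach (DataRow row in dt.Rows)
    {
        cmd.Parameters["@id_price"].Value = row["id_price"].ToString().Trim();
        cmd.Parameters["@price"].Value = row["price"].ToString().Trim();
        cmd.Parameters["@id_item"].Value = row["id_item"].ToString().Trim();
        cmd.ExecuteNonQuery();
    }
}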
Put all the data into a DataTable and then use a SqlCeDataAdapter to save all the data with one call to Update. You may have to fiddle with the UpdateBatchSize to get the best performance.
Looking more closely, I see that you already have a DataTable. Looping through it yourself is therefore ludicrous. Just note that, as you have it, the RowState of every DataRow will be Unchanged, so they will not be inserted. I think that you can call DataTable.Load such that all RowState values are left as Added but, if not, then use a SqlDataAdapter instead, set AcceptChangesDuringFill to false and call Fill.
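A sketch of that combination; the connection string names are placeholders, the SqlCeCommandBuilder used to generate the INSERT is an assumption, and the id_price column is renamed so it matches the SQL CE table:
// Fill from SQL Server, leaving every row in the Added state.
DataTable dt = new DataTable();
using (SqlDataAdapter serverAdapter = new SqlDataAdapter(
    "select id_price, price, id_item from prices", sqlServerConnectionString))
{
    serverAdapter.AcceptChangesDuringFill = false;   // rows stay Added instead of Unchanged
    serverAdapter.Fill(dt);
}
dt.Columns["id_price"].ColumnName = "id_prices";     // match the SQL CE column name

// Push everything into SQL CE with a single Update call.
using (SqlCeConnection ceConn = new SqlCeConnection(ceConnectionString))
using (SqlCeDataAdapter ceAdapter = new SqlCeDataAdapter("select id_prices, price, id_item from prices", ceConn))
using (SqlCeCommandBuilder builder = new SqlCeCommandBuilder(ceAdapter))
{
    ceAdapter.Update(dt);                            // generates an INSERT for each Added row
}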
Is there any way to insert faster?
Yes, but it probably won't be "acceptably fast enough" when we're talking about inserting 20k rows.
The problem I can see is that you open a new connection for every single row retrieved by SqlServer_GetData_dt; that is, you open a connection to insert data 20,000 times, and opening a connection is an expensive operation. You could build the whole query with a StringBuilder object and then execute all the insert statements in one batch.
This will bring some performance gains, but don't expect it to solve your problem; inserting 20k rows will still take some time, especially if indexes need to be rebuilt. My suggestion is that you thoroughly analyse your requirements and be a bit smarter about how you approach this. Options are:
- bundle a pre-populated database if possible, so your app doesn't have to suffer the population performance penalty at all
- if not, run the insert process in the background and access the data only when the pre-population is finished (a sketch follows below)
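For the background option, a minimal sketch using ThreadPool (which is available on the Compact Framework); the dataReady flag is hypothetical and stands in for whatever signal the UI checks before touching the local data:
// Run the population off the UI thread so the device stays responsive.
ThreadPool.QueueUserWorkItem(delegate(object state)
{
    InsertData();          // the existing population routine from the question
    dataReady = true;      // hypothetical flag/event the UI waits on
});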
I have been writing database software for a few years now and know that there are multiple ways to access data. Personally I do everything manually when pulling data, using a data reader object. This has started to get on my nerves when working with tables that have a large number of columns; it becomes very inefficient to have to write 30 lines of this kind of code at a time:
if (reader[count] != DBNull.Value)
    someObject = reader.GetString(count++);
else
    count++;
It is bad enough that the queries themselves take a long time to type out.
I was thinking of using a DataTable to retrieve my records, since that can be done in a few lines, and then writing a few helper methods that iterate through the rows and return an observable collection of objects. Is there another simpler, or more concise, way to do this?
I just use the DataTable:
using (SqlConnection conn = new SqlConnection(connectionString)) // your connection string here
{
    conn.Open();
    DataTable dt = new DataTable();
    using (SqlDataAdapter adapter = new SqlDataAdapter("select * from mytable", conn))
    {
        adapter.Fill(dt);
    }
}
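For the "few helper methods" the question mentions, a minimal mapping sketch, assuming a hypothetical User class and column names; Field<T> (from System.Data.DataSetExtensions) takes care of the DBNull checks that the hand-written reader code was doing:
private static ObservableCollection<User> ToUsers(DataTable dt)
{
    ObservableCollection<User> users = new ObservableCollection<User>();
    foreach (DataRow row in dt.Rows)
    {
        users.Add(new User
        {
            Id        = row.Field<int>("id"),
            FirstName = row.Field<string>("first_name"),  // returns null instead of throwing on DBNull
            LastName  = row.Field<string>("last_name")
        });
    }
    return users;
}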
I'm using a datatable as the datasource of some dropdowns on a page, but have noticed that the page is very slow during the postbacks.
I've tracked it through to here:
DataTable dt = new DataTable();
dt.Load(sqlCmd.ExecuteReader()); // this takes ages
The SQL command is a parameterised query, not a stored procedure (the return values and WHERE clause are quite 'dynamic', so a stored procedure wouldn't be practical), but nevertheless a simple SELECT UNION query.
It usually returns between 5 and 20 options per dropdown, depending on what has been selected in the other dropdowns.
When I run the query in Management Studio it finishes in under a second. Here it can take up to 7 seconds per dropdown, and with 6 dropdowns on the page it soon adds up.
I have also tried with a SqlDataAdapter:
SqlDataAdapter sqlDa = new SqlDataAdapter(sqlCmd);
sqlDa.Fill(dt); // this takes ages
but this was just as slow.
I have this on two different systems, and both have the same performance issues.
If anyone knows a better (faster) method, or knows why this is so slow, that would be great.
Not the best thread I've seen on the issue, but there are good links inside, and it's in my post history:
SQL Query that runs fine in SSMS runs very slow in ASP.NET
The SQL optimizer sometimes decides for itself what's best, and you'll have to break out your query with some tracing and logging of execution plans. It may very well be something as buried as a bad index, or your query code might need optimization. Seeing as we don't have the query code, and having it may or may not be helpful, I'd recommend you follow the guides linked in the post above and close your question.
Here is an example of how you can load a DataTable very quickly; notice how I select only the specific columns that I want returned:
private DataTable GetTableData()
{
    string sql = "SELECT Id, FirstName, LastName, [Desc] FROM MySqlTable";
    using (SqlConnection myConnection = new SqlConnection(connectionString))
    {
        using (SqlCommand myCommand = new SqlCommand(sql, myConnection))
        {
            myConnection.Open();
            using (SqlDataReader myReader = myCommand.ExecuteReader())
            {
                DataTable myTable = new DataTable();
                myTable.Load(myReader);
                myConnection.Close();
                return myTable;
            }
        }
    }
}
If you want to use a DataAdapter to fill the DataTable, here is a simple example:
private void FillAdapter()
{
    using (SqlConnection conn = new SqlConnection(connectionString)) // your connection string here
    {
        conn.Open();
        using (SqlDataAdapter dataAdapt = new SqlDataAdapter("SELECT * FROM EmployeeIDs", conn))
        {
            DataTable dt = new DataTable();
            dataAdapt.Fill(dt);
            // dataGridView1.DataSource = dt; // if you want to display the data in a DataGridView
        }
    }
}
I'm having problems with the Visual FoxPro ODBC driver (and OLE DB). The code I'm using to dynamically get all the column names for a DBF table works fine for small datasets, but if I try getting the table schema with the code below for a large table, it can take 60 seconds to retrieve the schema.
OdbcConnection conn = new OdbcConnection(@"Dsn=Boss;sourcedb=u:\32BITBOSS\DATA;sourcetype=DBF;exclusive=No;backgroundfetch=Yes;collate=Machine;null=Yes;deleted=Yes");
OdbcCommand cmd = new OdbcCommand("Select * from " + listBox1.SelectedItem.ToString(), conn);
conn.Open();
try
{
    OdbcDataReader reader = cmd.ExecuteReader();
    DataTable dt = reader.GetSchemaTable();
    listBox2.Items.Clear();
    foreach (DataRow row in dt.Rows)
    {
        listBox2.Items.Add(row[0].ToString());
    }
    conn.Close();
}
catch (Exception ex)
{
    Console.WriteLine(ex.ToString());
    conn.Close();
}
Now, I get that this is because the SELECT command drags all 500,000 or so records into the data reader before it can extract the column names, but Visual FoxPro is pretty bad at limiting the rows returned.
Ideally I want to limit the return to just one record, but since I don't know any of the column names, I can't use a WHERE clause.
VFP does support the TOP clause, but to use it you need an ORDER BY to go with it, and since I don't have any of the column names, that doesn't work either.
So I'm a bit stumped trying to think of a clever way to dynamically get the table schema without the slowdown in VFP.
I don't think on-the-fly database conversions will be any faster. Has anyone got any clever ideas (apart from changing the database system; it's inherited with the job and I currently have to work with it :) )?
Change this line:
OdbcCommand cmd = new OdbcCommand("Select * from " + listBox1.SelectedItem.ToString(), conn);
To this line:
OdbcCommand cmd = new OdbcCommand(string.Format("Select * from {0} where 1 = 0", listBox1.SelectedItem.ToString()), conn);
This will bring back zero records, but it will still fill the schema. Now that you have the schema, you can build a further filter.
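Putting that together with the original code, the zero-row query returns almost instantly but the reader still carries the column metadata (the DSN connection string is shortened to a placeholder here):
using (OdbcConnection conn = new OdbcConnection(connectionString))
using (OdbcCommand cmd = new OdbcCommand(
    string.Format("Select * from {0} where 1 = 0", listBox1.SelectedItem), conn))
{
    conn.Open();
    using (OdbcDataReader reader = cmd.ExecuteReader())
    {
        DataTable dt = reader.GetSchemaTable();   // column metadata only, no data rows
        listBox2.Items.Clear();
        foreach (DataRow row in dt.Rows)
        {
            listBox2.Items.Add(row["ColumnName"].ToString());
        }
    }
}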