MySqlDataAdapter.Fill() is slow/fails - c#

I currently use the MySQL.Data package to load my MySQL database information into my C# application. I use the MySqlDataAdapter.Fill() method to load the data into a DataTable.
This works for the majority of my tables, except the larger ones. For example, I have a table that is about 1,310,634 rows long. When I attempt to fill a DataTable with that data, the application hangs and eventually fails with an exception:
Exception was Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
I've tried increasing the timeouts of the MySQL connection with commands like set net_write_timeout=99999; and set net_read_timeout=99999;, but it doesn't seem to help.
This is my C# code so far:
// Construct output DataTable
DataTable output = new DataTable();
MySqlDataAdapter ad = new MySqlDataAdapter(query, connection);
try
{
    ad.FillSchema(output, SchemaType.Mapped); // Filling columns, types, limitations
    ad.Fill(output);                          // Filling data
    ad.Dispose();
}
catch (Exception ex)
{
    throw;
}
return output;

As stated in the comments above, the best fix is to refactor your approach. Instead of fetching whole tables from the database, use SQL SELECT statements that return only the rows you actually need for the task at hand.
If you really do need to process the whole table, e.g. for some kind of consistency check that cannot be expressed in SQL, you could load the data in portions, selecting only 100K rows at a time, as in the sketch below.
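A minimal sketch of that chunked approach, assuming the table has an auto-increment key column named id that can be used for keyset paging (the table name, key column, and processing step are placeholders, not from the question):

using System;
using System.Data;
using MySql.Data.MySqlClient;

// Sketch: process a large table in 100K-row chunks instead of one huge Fill().
// Table name, key column, and the processing step are placeholders.
static void ProcessInChunks(MySqlConnection connection)
{
    long lastId = 0;
    const int chunkSize = 100000;

    while (true)
    {
        var chunk = new DataTable();
        string query = "SELECT * FROM big_table WHERE id > @lastId ORDER BY id LIMIT @chunk";

        using (var ad = new MySqlDataAdapter(query, connection))
        {
            ad.SelectCommand.Parameters.AddWithValue("@lastId", lastId);
            ad.SelectCommand.Parameters.AddWithValue("@chunk", chunkSize);
            ad.Fill(chunk);
        }

        if (chunk.Rows.Count == 0)
            break; // no more rows

        // ... run the consistency check / processing on this chunk here ...

        // advance the keyset cursor to the last id seen in this chunk
        lastId = Convert.ToInt64(chunk.Rows[chunk.Rows.Count - 1]["id"]);
    }
}

This keeps memory usage bounded by the chunk size instead of the full table size.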
But again, I doubt that fetching several tables of over a million rows each at once is a valid use case, so it is best avoided.

Related

AdsConnection throws EntryPointNotFoundException on second connection but works the first time

I have a piece of code that I reuse that helps me connect to an adt database and read the data.
using Advantage.Data.Provider;
...
protected DataTable FillTable(string tableName)
{
    DataTable table = new DataTable();
    using (var conn = new AdsConnection(connectionString))
    using (var adapter = new AdsDataAdapter())
    using (var cmd = new AdsCommand())
    {
        cmd.Connection = conn;
        cmd.CommandText = "select * from " + tableName;
        adapter.SelectCommand = cmd;
        conn.Open();
        adapter.Fill(table);
        conn.Close();
    }
    return table;
}
This code works perfectly the first time I go through it, but gives the following exception the second time I call it with a different table name.
System.EntryPointNotFoundException: 'Unable to find an entry point named 'AdsIsConnectionAlive' in DLL 'ace32.dll'.'
I would like an explanation.
I've tried to read up on this error, but none of the scenarios I've found explain why it works the first time. They mention problems with the DLL, such as it being the wrong version or some incompatibility with the .NET version, and so on.
If I change the order of the calls, the code still fails on the second call, so I know the problem isn't the table name or the way I call my code. The problem is probably that I'm not closing the connection correctly. I've tried adding extra braces just to make sure that part runs correctly, and I've debugged to confirm that the first conn.Close(); is executed.
I could put all of my code inside this method and use a single connection that I keep open as long as I need it. That would bypass my problem, but I would like to avoid that and understand what I'm doing wrong.
This is most likely caused by a newer version of the ADO.NET components loading an older version of ace32.dll. The AdsIsConnectionAlive entry point was introduced in a later version of the DLL - I'm not sure of the exact version, probably 6.0 or later.
The first time the connection is made, the ADO.NET component knows the connection is not alive, so there is no need to call the IsAlive entry point. The second time around, since a connection to the same connection path already exists, it tries to reuse it by checking whether it is still alive. I think there is a way to disable the connection caching, but I don't remember the details. A better solution would be to make sure that the Advantage DLL versions match.
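One quick way to check whether the versions line up is to compare the loaded managed provider against the file version of ace32.dll. A hedged sketch (the DLL path is an assumption and depends on where your installation actually loads it from):

using System;
using System.Diagnostics;
using Advantage.Data.Provider;

class VersionCheck
{
    static void Main()
    {
        // Version of the managed Advantage ADO.NET provider actually loaded.
        Version providerVersion = typeof(AdsConnection).Assembly.GetName().Version;
        Console.WriteLine("Advantage.Data.Provider: " + providerVersion);

        // File version of the native client DLL; adjust the path to wherever
        // ace32.dll is picked up on your machine (this path is an assumption).
        string acePath = @"C:\Windows\SysWOW64\ace32.dll";
        FileVersionInfo aceInfo = FileVersionInfo.GetVersionInfo(acePath);
        Console.WriteLine("ace32.dll: " + aceInfo.FileVersion);
    }
}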

SQL Slow in code not SSMS

I am running into issues working with a very large table from C# .Net 4.0.
For reference the table has ~120 Million rows.
I can't even do a simple query like
SELECT TOP(50) *
FROM TableName
WHERE CreatedOn < '2015-06-01';
From code it will time out (at the default setting - 15 seconds, I believe), but in SSMS it is instant.
There is an index on the column in the WHERE clause. I have also tried explicitly casting the string to a DateTime, and using a DateTime parameter instead of a literal value.
I tried a different query that filters by the PK (bigint, identity, clustered index). If I do something like "Where TableRowID = 1" it works fine from code, but if I try to use "<" or "<=" instead, it will time out (while returning instantly in SSMS), regardless of how many rows are returned.
The execution plans are very simple and are exactly the same.
I have tried changing ARITHABORT but that has had no effect.
I tried having the Application connect with my own account (SSPI) instead of its own credentials with no effect.
I have been researching this issue for a few days, but everything I have found blames different execution plans.
Does anyone have any idea what could be causing this issue?
The .Net code looks like this:
private DataSet ExecuteQuery(string query, string db, List<SqlParameter> parms = null)
{
    string connectionString = ConfigurationManager.ConnectionStrings[db].ToString();
    SqlConnection con = new SqlConnection(connectionString.Trim());
    SqlDataAdapter sqlDataAdapter = new SqlDataAdapter();
    try
    {
        con.Open();
        DataSet ds = new DataSet();
        sqlDataAdapter.SelectCommand = new SqlCommand(query, con);
        sqlDataAdapter.SelectCommand.CommandType = CommandType.Text;
        if (parms != null)
        {
            foreach (SqlParameter p in parms)
            {
                sqlDataAdapter.SelectCommand.Parameters.Add(p);
            }
        }
        sqlDataAdapter.Fill(ds);
        if (ds.Tables.Count > 0 && ds.Tables[0].Rows.Count > 0)
        {
            return ds;
        }
        return null;
    }
    finally
    {
        if (sqlDataAdapter != null)
            sqlDataAdapter.Dispose();
        if (con != null)
            con.Dispose();
    }
}
The error message I get in .Net is the standard timeout message:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Here is my experience when dealing with these issues.
First, verify the SQL that your C# code actually passes to SQL Server. In this case it is
SELECT TOP(50) *
FROM TableName
WHERE CreatedOn < '2015-06-01'
Check the criteria. If the records come back instantly in SSMS, the SQL statement and the database optimization are fine.
As other people have pointed out, you should post your C# code and see what happens; there could be other issues. Could it be the network? Could it be web.config? Do you call the query directly from C# code, or via a web service?
When you say it times out, does it time out at the moment you execute the query? There is very little information here. Do you use third-party routines (written by a vendor or your company) to execute the queries? If possible, put a breakpoint at the code that executes the SQL statement - dig all the way down to the code that actually runs the query and set breakpoints there.
120 million records: the database looks well optimized if the query runs very fast in SSMS, so I would look outside SQL Server.
Good luck.
My first step would be to look at what your code is actually sending to SQL Server. I'd begin by running SQL Server Profiler; if you're not familiar with it, here is a link on how to use it:
https://www.mssqltips.com/sqlservertip/2040/use-sql-server-profiler-to-trace-database-calls-from-third-party-applications/
After that, you may want to look into network traffic times between the two servers.
Then look at IIS and see how it's set up - a setting could be wrong.
Check the error logs and see if you have any errors as well.
When you execute code in SSMS, the system is setting some default values on your connection. Specifically, look at Tools --> Options --> Query Execution --> Sql Server --> Advanced (and also ANSI). These statements are executed on your behalf when you open a new query window.
When you create your connection object in C#, are you setting these same options? If you don't explicitly set them here, you are taking the default values as defined in the actual SQL Server instance. In SSMS, you can get this by viewing the properties of the server instance and choosing the Connections page. This shows the default connection options.
You can also get this information without using the UI (you can do this in a separate application that uses the same connection string, for example). This article on MSSQLTips should guide you in the right direction.
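One concrete thing to try along these lines: SSMS runs with SET ARITHABORT ON by default, while ADO.NET connections typically do not, and that single option difference is a classic cause of the application getting a different (slower) cached plan than SSMS. A hedged sketch of forcing the option before the Fill (the connection string and query are whatever your application already passes in):

using System.Data;
using System.Data.SqlClient;

// Sketch: run the query with the same ARITHABORT setting SSMS uses.
static DataSet FillWithSsmsLikeOptions(string connectionString, string query)
{
    using (var con = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter())
    {
        con.Open();

        // SSMS sets ARITHABORT ON by default; ADO.NET clients generally leave it OFF,
        // which can yield a different (and sometimes far slower) cached execution plan.
        using (var setCmd = new SqlCommand("SET ARITHABORT ON;", con))
        {
            setCmd.ExecuteNonQuery();
        }

        adapter.SelectCommand = new SqlCommand(query, con);
        adapter.SelectCommand.CommandTimeout = 120; // raise the default while investigating

        var ds = new DataSet();
        adapter.Fill(ds);
        return ds;
    }
}

If this makes the query fast from code, the root cause is almost certainly a plan difference tied to the SET options rather than the network or IIS.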

Does instantiating a DataSet object automatically create a connection to a SQL service-based database for CRUD operations?

Here is everything that I did:
In a visual studio 2013 C# project, I created a service database (.mdf file). Note: I changed the name from Database1.mdf to fghLocalDB.mdf.
I opened this database in the server explorer.
I created 2 tables called Country and CarbonDioxide using the table designer.
I added an entry to the Country table, as shown in the data table view for the Country table.
I did the following to create a DataSet my application can use. I created a Data Source by clicking on the "Project" option on the top menu bar and clicking on the "Add New Data Source ..." option from the drop down.
This is what my project files looked like at this point.
I wrote the following code in the main method thinking that this would be all I need to write to the database.
// Create a connection to the DataSet and TableAdapters that will communicate with our
// local database to handle CRUD operations.
fghLocalDBDataSet dataSet = new fghLocalDBDataSet();
fghLocalDBDataSetTableAdapters.CountryTableAdapter countryTableAdapter =
    new fghLocalDBDataSetTableAdapters.CountryTableAdapter();
try
{
    // Insert a row into Country table. EDIT 1 Will comment after first program run.
    Console.WriteLine(countryTableAdapter.Insert("United States"));
    // Actually writeback information to the database?
    // dataSet.AcceptChanges(); EDIT 2 commented this as LeY suggested it was not needed.
    // EDIT 3 Validation code as suggested by Ley.
    var dt = new fghLocalDBDataSet.CountryDataTable();
    var adapter = new fghLocalDBDataSetTableAdapters.CountryTableAdapter();
    adapter.Fill(dt);
    foreach (var row in dt)
    {
        // This does not get executed after a second run of the program.
        // Nothing is printed to the screen.
        Console.WriteLine("Id:" + row.Id + "----Name: " + row.Name);
    }
    Console.Read();
}
catch (SqlException exception)
{
    Console.WriteLine("ERROR: " + exception.ToString());
}
Console.ReadLine();
I ran the program and everything seemed fine.
I opened the tables by right clicking on these tables in the server explorer and pressing "Show Data Table".
The "United States" row was not added as wanted.
I think it has to do with the connection string. I right-clicked on my project and opened the properties.
Here I made sure the connection string matched that of the local database by looking at the string in the properties of the database. They are the same.
I copied and pasted the actual text for each connection string:
Connection string of project:
Data Source=(LocalDB)\v11.0;AttachDbFilename=|DataDirectory|\fghLocalDB.mdf;Integrated Security=True
Connection string of actual database (.mdf file):
Data Source=(LocalDB)\v11.0;AttachDbFilename=C:\Users\gabriel\Source\Workspaces\Capstone\Sandbox\aduclos\QueryDataMarketConsole\QueryDataMarketConsole\fghLocalDB.mdf;Integrated Security=True
I am assuming |DataDirectory| is equal to C:\Users\gabriel\Source\Workspaces\Capstone\Sandbox\aduclos\QueryDataMarketConsole\QueryDataMarketConsole\fghLocalDB.mdf; since in the picture above, when I clicked the button to expand the Value of the connection string, the connection properties window opened and showed this path as the database file name.
My question in a nutshell is does instantiating a DataSet object in the code automatically create a connection to a SQL service-based database for CRUD operations?
If not how do I connect my DataSet object to my sql database so that way I can actually write to the database when using the TableAdapters?
I read the following links:
Insert method of TableAdapter not working?
TableAdapter Insert not persisting data
Use connectionstring from web.config in source code file
Do I need an actual SqlConnection object? And how do I connect it to the DataSet and TableAdapters?
I had never used the TableAdapter.Insert() method, but I tried it on my local machine and it works.
I can't figure out your problem based on the information you provided, sorry, but I can point you in a direction.
If you created everything from the wizard, you don't need to worry about the connection; the table adapters handle the connection for you. The connection string (the one you circled) is added to your app.config file, as well as to your settings class, automatically. That is how your application (or you) uses it.
var countryTableAdapter = new CountryTableAdapter();
countryTableAdapter.Insert("United States");
These two lines of code are enough to insert the row into the database if no exception is thrown; I don't know why it doesn't work for you. Maybe the way you verify it somehow goes wrong, but you can verify it another way.
The countryTableAdapter.Insert method returns the number of rows affected, which in your case should be one. So put the following code in and set a breakpoint after it; if rowAffected == 1, then the insert worked.
var rowAffected = countryTableAdapter.Insert("Test2");
If you need more confirmation, try this.
var dt = new fghLocalDBDataSet.CountryDataTable();
var adapter = new CountryTableAdapter();
adapter.Fill(dt);
foreach (var row in dt)
{
    Console.WriteLine("Id:" + row.Id + "----Name: " + row.Name);
}
Console.Read();
You will see all the records in your table.
I hope this will help.
By the way, regarding this line from your code:
dataSet.AcceptChanges();
That line does not update the database at all. It only modifies your local data storage:
it overwrites each DataRow's original version with the current version and changes the current row state to Unchanged.
Only the table adapters talk to the database (not strictly true, I know, but I just want to make the point that a DataSet cannot talk to the database directly).
I usually only need the TableAdapter.Update method, passing in the DataSet or DataTable with the correct RowState.
The TableAdapter.Update method will eventually call AcceptChanges on each row if it successfully updated the database.
You should never need to call AcceptChanges explicitly unless you only want to update your dataset in memory.
I recommend reading the ADO.NET Architecture documentation to get the big picture of how DataSet and TableAdapter work.
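For completeness, a hedged sketch of that Fill/modify/Update round trip using the generated types from the question (assuming Id is an auto-increment column, so only Name is set on the new row):

// Sketch: the usual typed-DataSet round trip with the generated types.
var adapter = new fghLocalDBDataSetTableAdapters.CountryTableAdapter();
var table = new fghLocalDBDataSet.CountryDataTable();

adapter.Fill(table);                 // load the current rows

var newRow = table.NewCountryRow();  // generated typed-row factory
newRow.Name = "Canada";
table.AddCountryRow(newRow);         // row state is now Added

adapter.Update(table);               // writes pending changes back to the database
                                     // and calls AcceptChanges on the rows that succeed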
It was my connection string after all. In my original post, I said I had two connection strings:
Connection string in project settings:
Data Source=(LocalDB)\v11.0;AttachDbFilename=|DataDirectory|\fghLocalDB.mdf;Integrated Security=True
Actual connection string in fghLocalDB.mdf file:
Data Source=(LocalDB)\v11.0;AttachDbFilename=C:\Users\gabriel\Source\Workspaces\Capstone\Sandbox\aduclos\QueryDataMarketConsole\QueryDataMarketConsole\fghLocalDB.mdf;Integrated Security=True
Something went wrong with
|DataDirectory| = C:\Users\gabriel\Source\Workspaces\Capstone\Sandbox\aduclos\QueryDataMarketConsole\QueryDataMarketConsole\fghLocalDB.mdf;
in my App.config.
My Solution:
What I did was copy the actual connection string of the .mdf file from the .mdf properties panel and paste it into the project properties => Settings => Value field for the connection string.
Afterwards I ran my code again, and sure enough the data persisted in the tables.
I did not need dataSet.AcceptChanges(); as @LeY pointed out. I also did not need a TableAdapter.Update(dataset) call as posted in other solutions. I just needed the TableAdapter.Insert("...") call.
EDIT: Also, most importantly, to answer my original question: instantiating a DataSet does not create a connection to the local database. Instantiating a TableAdapter is what establishes the connection to the database!
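For reference, a quick way to see what |DataDirectory| resolves to at runtime (and therefore which copy of the .mdf the program is really writing to) is a sketch like this; when the value is not set, ADO.NET typically falls back to the application's base directory, such as bin\Debug:

using System;

// |DataDirectory| is a substitution value stored on the AppDomain.
// If it is null (common for console apps), the base directory is used instead,
// which usually holds a copy of the .mdf - not the one sitting in the project folder.
object dataDirectory = AppDomain.CurrentDomain.GetData("DataDirectory");
Console.WriteLine(dataDirectory ?? AppDomain.CurrentDomain.BaseDirectory);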

C# Sync MS Access database to sql server

I am trying to set up a synchronization routine in C# to send data from an MS Access database to a SQL Server. MS Access is not my choice; it's just the way it is.
I am able to query the MS Access database and get an OleDbDataReader record set. I could potentially read each individual record and insert it into SQL Server, but that seems wasteful.
Is there a better way to do this? I know I could link MS Access to SQL Server and perform the update easily, but this is for end users and I don't want them messing with Access.
EDIT:
Just looking at SqlBulkCopy, I think that may be the answer if I can get my results into a DataRow[].
I found a solution in .NET that I am very happy with. It allows me to give access to the sync routine to any user within my program. It uses the SqlBulkCopy class.
private static void BulkCopyAccessToSQLServer
    (CommandType commandType, string sql, string destinationTable)
{
    using (DataTable dt = new DataTable())
    {
        using (OleDbConnection conn = new OleDbConnection(Settings.Default.CurriculumConnectionString))
        using (OleDbCommand cmd = new OleDbCommand(sql, conn))
        using (OleDbDataAdapter adapter = new OleDbDataAdapter(cmd))
        {
            cmd.CommandType = commandType;
            cmd.Connection.Open();
            adapter.SelectCommand.CommandTimeout = 240;
            adapter.Fill(dt);
            adapter.Dispose();
        }
        using (SqlConnection conn2 = new SqlConnection(Settings.Default.qlsdat_extensionsConnectionString))
        {
            conn2.Open();
            using (SqlBulkCopy copy = new SqlBulkCopy(conn2))
            {
                copy.DestinationTableName = destinationTable;
                copy.BatchSize = 1000;
                copy.BulkCopyTimeout = 240;
                copy.WriteToServer(dt);
                copy.NotifyAfter = 1000;
            }
        }
    }
}
Basically this puts the data from MS Access into a DataTable; it then uses the second connection, conn2, and the SqlBulkCopy class to send the data from that DataTable to SQL Server. It's probably not the best code, but it should give anyone reading this the idea.
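As a side note, if the Access table is very large you can avoid buffering it all in a DataTable: SqlBulkCopy.WriteToServer also accepts an IDataReader, so rows can be streamed straight from the OleDb reader. A hedged sketch (connection strings and table names are placeholders):

using System.Data.OleDb;
using System.Data.SqlClient;

// Sketch: stream rows from Access to SQL Server without building a DataTable first.
static void StreamAccessToSqlServer(string accessConnStr, string sqlConnStr,
                                    string sourceTable, string destinationTable)
{
    using (var source = new OleDbConnection(accessConnStr))
    using (var cmd = new OleDbCommand("select * from " + sourceTable, source))
    {
        source.Open();
        using (OleDbDataReader reader = cmd.ExecuteReader())
        using (var dest = new SqlConnection(sqlConnStr))
        {
            dest.Open();
            using (var copy = new SqlBulkCopy(dest))
            {
                copy.DestinationTableName = destinationTable;
                copy.BulkCopyTimeout = 240;
                copy.WriteToServer(reader); // streams rows as they are read
            }
        }
    }
}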
You should harness the power of set-based queries over RBAR (row-by-agonizing-row) efforts.
Look into a SSIS solution to synchronize the data and then schedule the package to run at regular intervals using SQL Server Agent.
You can call an SSIS package from the command line so you can effectively do it from MS Access or from C#.
Also, the SQL Server, the MS Access DB and the SSIS package do not have to be on the same machine. As long as your calling program can see the SSIS package, and the package can connect to the SQL Server and the MS Access DB, you can transfer data from one place to another.
It sounds like what you are doing is ETL. There are several tools that are built to do this and to me, there is little reason to reinvent the functionality. You have SQL Server, therefore you have SSIS. It has a ton of tools for automated transformations, cleanups, lookups, etc. that you can use out of the box.
Unless this is a real cut-and-dry data load and there is absolutely no scope for the complexity of the upload to increase later on (yeah, right!) I would go with a tried and tested ETL tool.
If SQL Server Integration Services isn't an option, you could write out to a temporary text file the data that you read from Access and then call bcp.exe to load it to the database.
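For what it's worth, a hedged sketch of that approach: after dumping the rows to a character-format file, shell out to bcp (the server, database, table, and file path below are placeholders; -T uses integrated security, -c selects character format, -t sets the field terminator):

using System.Diagnostics;

// Sketch: bulk-load a character-format text file with bcp.exe.
// Server, database, table, and file path are placeholders.
var startInfo = new ProcessStartInfo
{
    FileName = "bcp.exe",
    Arguments = @"MyDb.dbo.TargetTable in C:\temp\export.txt -S MyServer -T -c -t,",
    UseShellExecute = false
};

using (Process bcp = Process.Start(startInfo))
{
    bcp.WaitForExit(); // check bcp.ExitCode for success (0) afterwards
}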
I have done something like this before.
I used
OleDbConnection aConnection = new OleDbConnection(String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0}", fileName));
aConnection.Open();
to open the access db. Then
OleDbCommand aCommand = new OleDbCommand(String.Format("select * from {0}", accessTable), aConnection);
OleDbDataReader aReader = aCommand.ExecuteReader();
to execute the read from the table. Then
int fieldCount = aReader.FieldCount;
to get the field count
while (aReader.Read())
to loop the records and
object[] values = new object[fieldCount];
aReader.GetValues(values);
to retrieve the values.
There are several ways to sync, but problems can arise when you change a field name in SQL Server, or add or delete a column. The best option would be:
Create connections for SQL Server and OLE DB.
Write a custom query to fetch records from one connection and save them to the other.
Before executing, make sure your program updates all table definitions.
In my case this helped because the load on SQL Server went down.
Can you not transfer the Access file to the server and delete it once the sync is complete?
You could create a Windows service for that.

Always get exception when trying to Fill data to DataTable

The code below is just a test that connects to an Oracle database and fills a DataTable. After executing the statement da.Fill(dt);, I always get the exception
"Exception of type 'System.OutOfMemoryException' was thrown."
Has anyone encountered this kind of error? My project runs on VS 2005, and my Oracle database version is 11g. My computer runs Windows Vista. If I copy this code and run it on Windows XP, it works fine.
Thank you.
using System.Data;
using Oracle.DataAccess.Client;
...
string cnString = "data source=net_service_name; user id=username; password=xxx;";
OracleDataAdapter da = new OracleDataAdapter("select 1 from dual", cnString);
try
{
    DataTable dt = new DataTable();
    da.Fill(dt); // Got error here
    Console.Write(dt.Rows.Count.ToString());
}
catch (Exception e)
{
    Console.Write(e.Message); // Exception of type 'System.OutOfMemoryException' was thrown.
}
Update
I have no idea what happened to my computer. I just reinstalled Oracle 11g, and now my code works normally.
How big is your dual table? This query:
select 1 from dual
will return a single-column table with as many rows as the dual table, with 1 in every row. If the table has millions of rows, it wouldn't surprise me if it threw an out-of-memory exception.
Edit: Of course, this doesn't explain why it would work on XP but not on Vista, unless it's something implementation-specific (querying a different instance of the database on the two different workstations, for example).
Edit 2:
Ok, so presumably there is only one row in dual since your comment indicates that the query only returns a single row.
A couple of things to investigate:
The Oracle ADO.NET connection requires the Oracle client software, right? Is the Oracle software on your Vista box the same version as on the XP box? Perhaps there's a discrepancy there.
Instead of showing e.Message, try showing e.ToString() to get a full stack trace - it might give you more insight as to where the error is being thrown from.
