The code below is just a test that connects to an Oracle database and fills a DataTable. After executing the statement da.Fill(dt);, I always get the exception
"Exception of type 'System.OutOfMemoryException' was thrown."
Has anyone encountered this kind of error? My project runs in VS 2005, and my Oracle database version is 11g. My computer runs Windows Vista. If I copy this code and run it on Windows XP, it works fine.
Thank you.
using System.Data;
using Oracle.DataAccess.Client;
...
string cnString = "data source=net_service_name; user id=username; password=xxx;";
OracleDataAdapter da = new OracleDataAdapter("select 1 from dual", cnString);
try
{
    DataTable dt = new DataTable();
    da.Fill(dt); // Got the error here
    Console.Write(dt.Rows.Count.ToString());
}
catch (Exception e)
{
    Console.Write(e.Message); // Exception of type 'System.OutOfMemoryException' was thrown.
}
Update
I have no idea what happened to my computer. I simply reinstalled Oracle 11g, and now my code works normally.
How big is your dual table? This query:
select 1 from dual
will return a single-column result set with as many rows as the dual table, with 1 in every row. If the table has millions of rows, it wouldn't surprise me if it threw an out-of-memory exception.
Edit: Of course, this doesn't explain why it would work on XP but not on Vista, unless it's something implementation-specific (querying a different instance of the database on the two different workstations, for example).
Edit 2:
OK, so presumably there is only one row in dual, since your comment indicates that the query returns a single row.
A couple of things to investigate:
The Oracle ADO.NET connection requires the Oracle client software, right? Is the Oracle software on your Vista box the same version as on the XP box? Perhaps there's a discrepancy there.
Instead of showing e.Message, try showing e.ToString() to get a full stack trace - it might give you more insight as to where the error is being thrown from.
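For example, a minimal tweak to the snippet from the question (same code, with the diagnostic swapped in):

try
{
    DataTable dt = new DataTable();
    da.Fill(dt);
    Console.Write(dt.Rows.Count.ToString());
}
catch (Exception e)
{
    // ToString() includes the exception type, message, inner exceptions,
    // and the full stack trace showing where the error originated.
    Console.Write(e.ToString());
}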
Related
I use SqlBulkCopy to insert approximately 3.7 million rows, and it throws the error
Exception of type 'System.OutOfMemoryException' was thrown
Here is my code. The columns are added dynamically; otherwise it is straightforward code.
using (var bulkCopy = new SqlBulkCopy(connection))
{
    connection.Open();
    using (var reader = ObjectReader.Create(inputRecordsToProcess))
    {
        bulkCopy.BatchSize = 100000;  // rows sent to the server per batch
        bulkCopy.BulkCopyTimeout = 0; // no timeout
        bulkCopy.DestinationTableName = schemaName + "." + tableName;
        // Map source properties to destination columns dynamically.
        var classProperties = GetClassPropertiesByAttributes<T>();
        foreach (var property in classProperties)
        {
            bulkCopy.ColumnMappings.Add(property.Item2, property.Item1);
        }
        try
        {
            bulkCopy.WriteToServer(reader);
        }
        catch (Exception ex)
        {
            throw;
        }
    }
}
One important point: I was able to insert ~3.6 million rows, but it throws the exception when the count goes over that. Do I need to make any changes to the code?
This occurs on all servers (dev, prod, and even local).
Thank you all for your responses.
I figured out what the problem was. First, the error didn't actually occur in the SqlBulkCopy call. I had assumed it did because, in a previous run, I was able to process a few thousand fewer records than in the current one, so I thought SqlBulkCopy had limitations or needed certain settings to handle a large number of records. When I debugged the code, I found that a method is called before the database insert to "convert object type to strong type": I have dynamic class objects that need to be converted to strongly typed objects, so that method loops over all the records to convert them, and that is where it fails. Still, that same method had worked earlier with over 3 million records.
After googling, I found that the culprit was the 32-bit platform. The application was set to run as "Any CPU", but "Prefer 32-bit" was checked. When I unchecked it, I was able to process the records and insert them into the database using SqlBulkCopy!
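As a quick sanity check (my own sketch, not from the original post), the process bitness can be logged at startup; a 32-bit process is capped at roughly 2 GB of address space, which fits a failure that only appears past a few million records:

// Both properties are part of the standard System.Environment class.
Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
Console.WriteLine("64-bit OS: " + Environment.Is64BitOperatingSystem);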
I changed it after going through the answer at this link: https://social.msdn.microsoft.com/Forums/en-US/ace563e2-66fe-4666-9f04-cbfc90ab59bc/system-out-of-memory-exception-due-to-mismanaged-memory-using-threads?forum=csharplanguage
Thank you all for taking the time to read through my post! I appreciate it!
I am running into issues working with a very large table from C# .NET 4.0.
For reference, the table has ~120 million rows.
I can't run even a simple query like
SELECT TOP(50) *
FROM TableName
WHERE CreatedOn < '2015-06-01';
From code it times out (at the default setting, which I believe is 30 seconds for a command), but in SSMS it is instant.
There is an index on the column in the WHERE clause. I have also tried explicitly casting the string to a DateTime, and using a DateTime parameter instead of a literal value.
I tried a different query that filters by the PK (bigint, identity, clustered index). If I do something like WHERE TableRowID = 1, it works fine from code, but if I use < or <= instead, it times out (while returning instantly in SSMS), regardless of how many rows are returned.
The execution plans are very simple and are exactly the same.
I have tried changing ARITHABORT but that has had no effect.
I tried having the application connect with my own account (SSPI) instead of its own credentials, with no effect.
I have been researching this issue for a few days, but everything I have found blames different execution plans.
Does anyone have any idea what could be causing this issue?
The .NET code looks like this:
private DataSet ExecuteQuery(string query, string db, List<SqlParameter> parms = null)
{
    string connectionString = ConfigurationManager.ConnectionStrings[db].ToString();
    SqlConnection con = new SqlConnection(connectionString.Trim());
    SqlDataAdapter sqlDataAdapter = new SqlDataAdapter();
    try
    {
        con.Open();
        DataSet ds = new DataSet();
        sqlDataAdapter.SelectCommand = new SqlCommand(query, con);
        sqlDataAdapter.SelectCommand.CommandType = CommandType.Text;
        if (parms != null)
        {
            foreach (SqlParameter p in parms)
            {
                sqlDataAdapter.SelectCommand.Parameters.Add(p);
            }
        }
        sqlDataAdapter.Fill(ds);
        if (ds.Tables.Count > 0 && ds.Tables[0].Rows.Count > 0)
        {
            return ds;
        }
        return null;
    }
    finally
    {
        if (sqlDataAdapter != null)
            sqlDataAdapter.Dispose();
        if (con != null)
            con.Dispose();
    }
}
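For reference, here is a sketch of how the parameterized attempt described above is invoked through this helper ("MyDb" is a placeholder connection-string name):

var parms = new List<SqlParameter>
{
    // Typed parameter instead of a string literal in the WHERE clause.
    new SqlParameter("@createdOn", SqlDbType.DateTime) { Value = new DateTime(2015, 6, 1) }
};
DataSet ds = ExecuteQuery(
    "SELECT TOP(50) * FROM TableName WHERE CreatedOn < @createdOn",
    "MyDb",
    parms);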
The error message I get in .NET is the standard timeout message:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Here is my experience dealing with these issues.
Verify that the SQL passed from C# to SQL Server is
SELECT TOP(50) *
FROM TableName
WHERE CreatedOn < '2015-06-01'
Make sure the criteria match. If it takes "instant time" to retrieve records in SSMS, the SQL statement and database optimization are fine.
As other people have pointed out, you should post your C# code and see what happens. There could be other issues. Could it be the network? Could it be web.config? Do you call the database directly from C# code, or via a web service?
When you say it times out, does it time out at the moment you execute the query? There is very little information provided. Do you use third-party routines (such as ones written by a vendor or your company) to execute queries? If possible, put a breakpoint at the code that executes the SQL statement; dig all the way down to the native calls and set breakpoints there.
120 million records: the database looks well optimized if it runs very fast in SSMS. I would look outside SQL Server.
Good luck.
My first step would be to look at what your code is sending to the SQL server. I'd begin by running SQL Profiler; if you're not familiar with it, here is a link on how to use it:
https://www.mssqltips.com/sqlservertip/2040/use-sql-server-profiler-to-trace-database-calls-from-third-party-applications/
After this, you may want to look into network traffic times between the two servers.
Then look at IIS and see how it's set up; a setting could be wrong.
Check the error logs and see if you have any errors as well.
When you execute code in SSMS, the system is setting some default values on your connection. Specifically, look at Tools --> Options --> Query Execution --> Sql Server --> Advanced (and also ANSI). These statements are executed on your behalf when you open a new query window.
When you create your connection object in C#, are you setting these same options? If you don't explicitly set them here, you are taking the default values as defined in the actual SQL Server instance. In SSMS, you can get this by viewing the properties of the server instance and choosing the Connections page. This shows the default connection options.
You can also get this information without using the UI (you can do this in a separate application that uses the same connection string, for example). This article on MSSQLTips should guide you in the right direction.
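As an illustration (a sketch, not a definitive list of options; SET ARITHABORT is the option that most often differs between SSMS and ADO.NET and leads to a different cached plan), the SSMS defaults can be replicated right after opening the connection:

using (var con = new SqlConnection(connectionString))
{
    con.Open();
    // Align this session's SET options with SSMS so both get the same cached plan.
    using (var cmd = new SqlCommand("SET ARITHABORT ON;", con))
    {
        cmd.ExecuteNonQuery();
    }
    // ... run the actual query on this same connection ...
}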
I currently use the MySQL.Data package to load my MySQL database information into my C# application. I use the MySqlDataAdapter.Fill() method to load the data into a DataTable.
This solution works for the majority of my tables, except the larger ones. For example, I have a table that is 1,310,634 rows long. When I attempt to fill a DataTable with that data, the application hangs and eventually fails with this exception:
Exception was Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
I've tried solutions that increase the timeouts of the MySQL connection with commands like set net_write_timeout=99999; set net_read_timeout=99999;, but they don't seem to help.
This is my C# code so far:
//Construct output DataTable
DataTable output = new DataTable();
MySqlDataAdapter ad = new MySqlDataAdapter(query, connection);
try
{
    ad.FillSchema(output, SchemaType.Mapped); // Filling columns, types, constraints
    ad.Fill(output);                          // Filling data
    ad.Dispose();
}
catch (Exception ex)
{
    throw;
}
return output;
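One thing worth noting (my addition, not from the original post): the server-side net_read_timeout/net_write_timeout variables are independent of the ADO.NET command timeout, which can be raised on the adapter's SelectCommand:

// 0 disables the client-side timeout entirely (CommandTimeout is standard
// on DbCommand, which MySqlCommand derives from).
ad.SelectCommand.CommandTimeout = 0;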
As stated in the comments above, the best approach is to refactor your solution: instead of fetching whole tables from your database, use SQL SELECT statements that return only the rows you really need to accomplish the task at hand.
If you really need to process the whole table, e.g. for some kind of consistency check that cannot be accomplished in SQL, then you could load the data in portions, selecting only 100K rows at a time, as in the sketch below.
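A minimal sketch of that portioned approach, assuming the table has an auto-increment id column to page on (table and column names are placeholders):

DataTable output = new DataTable();
long lastId = 0;
const int chunkSize = 100000;
while (true)
{
    // Keyset pagination: much cheaper than OFFSET on large tables.
    using (var ad = new MySqlDataAdapter(
        "SELECT * FROM my_table WHERE id > @lastId ORDER BY id LIMIT @chunk",
        connection))
    {
        ad.SelectCommand.Parameters.AddWithValue("@lastId", lastId);
        ad.SelectCommand.Parameters.AddWithValue("@chunk", chunkSize);
        if (ad.Fill(output) == 0)
            break; // no more rows
        lastId = Convert.ToInt64(output.Rows[output.Rows.Count - 1]["id"]);
    }
}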
But again, I doubt that fetching multiple tables with over a million rows at the same time is a valid use case, and therefore it should not be performed...
I run a simple query against SQL Server and can't seem to get it working correctly.
DataTable loDTDOffre = new DataTable();
SqlCommand loSQLCommand = new SqlCommand("dbo.SP_StoredProc", loConnectionBD.ConnectionSql);
loSQLCommand.CommandType = CommandType.StoredProcedure;
loSQLCommand.Parameters.Add("@liNoUnit", SqlDbType.Int);
loSQLCommand.Parameters["@liNoUnit"].Value = noUniteProduction;
SqlDataAdapter loSqlDataAdapter = new SqlDataAdapter(loSQLCommand);
loSqlDataAdapter.Fill(loDTDOffre);
My database connection is open, and the stored procedure executed in SQL Server Management Studio works fine. All I get as an error message in VS2010 is:
Warning: Fatal error 50000 occurred at May 30 2013 11:17AM. Note the error and time, and contact your system administrator.
Process ID 59 has raised user error 50000, severity 20. SQL Server is terminating this process.
Is there any way to get a clearer message of what is wrong? The code seems right, but the error message is so general that I can't figure out what I'm doing wrong.
My stored procedure returns a simple select, one row.
Thanks
It turns out the query runs under an application role; this came out:
The SELECT permission was denied on the object 'thisTable', database 'thisDatabase', schema 'dbo'.
Fixed, thank you.
I want to insert about 2000 records every time a button is clicked.
It works fine until record 511, then throws this exception:
Unspecified Error \r\n Object invalid or no longer set
I've debugged it several times with different records and in a different order, and I always get the same error on the 511th record.
What's happening?
CODE:
(I read the ID of the last record before I insert another one.)
string CmdText = "SELECT TOP 1 Id FROM MyTable ORDER BY Id DESC";
OleDbCommand com = new OleDbCommand(CmdText, tran.Connection, tran);
com.CommandType = CommandType.Text;
OleDbDataReader reader = com.ExecuteReader(); // the exception starts here
It sounds like somehow the Jet engine is not working properly or is corrupted.
When opening and closing connections or recordsets using the Microsoft ODBC Driver for Access or the Microsoft OLE DB Provider for Jet, the following error may be reported:
Object invalid or no longer set.
To resolve this problem, install the latest Microsoft Jet 4.0 Service Pack 6. For additional information, see FIX: "Object invalid or no longer set" Error with Microsoft Jet.
I've figured it out, guys.
I have to close the OleDbDataReader every time before I insert a new record.
Now it works fine. Thanks.
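A minimal sketch of that fix (based on the snippet from the question; the Read call and field access are illustrative), wrapping the command and reader in using blocks so the reader is closed before the next INSERT:

long lastId = 0;
string cmdText = "SELECT TOP 1 Id FROM MyTable ORDER BY Id DESC";
using (OleDbCommand com = new OleDbCommand(cmdText, tran.Connection, tran))
using (OleDbDataReader reader = com.ExecuteReader())
{
    if (reader.Read())
        lastId = Convert.ToInt64(reader["Id"]);
} // reader and its cursor are released here, before the next INSERT runs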
The best way to resolve this problem is to delete the table that raises the error on insert/update and then re-create it; but be sure to back up the table data first.