ODP.NET memory leak when using UDTs - C#

When using ODP.NET to load data into a spatial database, I'm using a UDT to define the SDO_GEOMETRY type.
I then use ArrayBindCount on the OracleCommand to load batches of data. Everything works, but I see a constant increase in the process's memory usage, and performance counters show the same thing.
The parameter is created using:
var param = new OracleParameter("geom", OracleDbType.Object);
param.UdtTypeName = "MDSYS.SDO_GEOMETRY";
param.Direction = ParameterDirection.Input;
cmd.Parameters.Add(param);
I also set cmd.AddToStatementCache = false to prevent data from ending up in the statement cache.
When adding data I use:
param.Value = new object[numRowsToInsert];
for (int row = 0; row < numRowsToInsert; row++)
{
    OracleUDT.SdoGeometry geoVal = rowstoinsert[row].geom;
    (param.Value as object[])[row] = geoVal;
}
...
cmd.ExecuteNonQuery(); // THIS IS WHERE THE MEMORY LEAK APPEARS TO BE
..
I tried running the program with ExecuteNonQuery() removed, and then there is no memory leak at all.
Edit:
I also tried removing the UDT parameter and running through the program, again without any leak. So it looks like the problem is closely related to UDTs and to when statements are executed.
I'm using ODP.NET 11.2.0.2.1
Anyone got any clue?
Is there something I need to clean up that does not get created if ExecuteNonQuery() is not run?
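For context, a minimal sketch of how the pieces above fit together (the table and column names are made up for illustration, and conn is an open OracleConnection):
using (var cmd = conn.CreateCommand())
{
    cmd.CommandText = "INSERT INTO my_spatial_table (geom) VALUES (:geom)"; // hypothetical table
    cmd.AddToStatementCache = false;
    cmd.ArrayBindCount = numRowsToInsert;

    var param = new OracleParameter("geom", OracleDbType.Object)
    {
        UdtTypeName = "MDSYS.SDO_GEOMETRY",
        Direction = ParameterDirection.Input,
        Value = geometryArray // object[] of SdoGeometry values, filled as shown above
    };
    cmd.Parameters.Add(param);

    cmd.ExecuteNonQuery(); // executes the whole batch; this is where the leak shows up
}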

Thought I'd give a follow-up on this one.
After numerous emails with Oracle Tech Support I finally got this accepted as a bug.
This appears to be Bug 10157396, which is fixed in 12.1, is planned to be fixed in 11.2.0.4, and has been backported to 11.2.0.2 (available in Patch Bundle 18). It can be downloaded from My Oracle Support as patches 10098816 (11.2.0.2) and 13897456 (Bundle 18) as a temporary solution while we wait for a backport to 11.2.0.3 or until 11.2.0.4 is released.


How To DEALLOCATE PREPARE Statement In C# Using Mysql.Data Client?

Hope you guys are doing fine.
OK, I am using the MySql.Data client/library to access a MySQL database. I had been using it happily for some time on quite a few projects, but I am suddenly facing a new issue that is putting my current project on hold. :(
The current project makes a lot of DB queries, and I am facing the following exception:
Can't create more than max_prepared_stmt_count statements (current value: 16382)
I am closing and disposing the DB engine/connection every time I am done with it, but I'm damn confused about why I am still getting this error.
Here is some sample code just to give you an idea (unnecessary parts trimmed out):
// this loop calls an API with pagination and gets the API response
while (ContinueSalesOrderPage(apiClient, ref pageNum, days, out string response, window) == true)
{
    // this handles the API data for the current page; it's normally 500 entries per page,
    // and it throws the error on the 4th page
    KeyValueTag error = HandleSalesOrderPageData(response, pageNum, out int numOrders, window);
}
private KeyValueTag HandleSalesOrderPageData(string response, int pageNum, out int numOrders, WaitWindow window)
{
    numOrders = json.ArrayOf("List").Size;
    // init db
    DatabaseWriter dbEngine = new DatabaseWriter()
    {
        Host = dbHost,
        Name = dbName,
        User = dbUser,
        Password = dbPass,
    };
    // connecting to the database
    bool pass = dbEngine.Connect();
    // loop through all the entries for the page; generally it's 500 entries
    for (int orderLoop = 0; orderLoop < numOrders; orderLoop++)
    {
        // this actually handles the queries; per loop there can be 3 to 10+ insert/update
        // queries using prepared statements
        KeyValueTag error = InsertOrUpdateSalesOrder(dbEngine, item, config, pageNum, orderLoop, numOrders, window);
    }
    // here, as you can see, I disconnect from the db engine; the following method also
    // closes the db connection
    dbEngine.Disconnect();
}
// code from the DatabaseWriter class; as you can see, this method closes and disposes the connection properly
public void Disconnect()
{
    _CMD.Dispose();
    _engine.Close();
    _engine.Dispose();
}
So, as you can see, I close/dispose the database connection for each page processed, but it still shows me that error on the 4th page. FYI, the 4th page's data is not the problem; I checked that. If I skip the earlier pages and only process the 4th page, it processes successfully.
After some more digging on Google, I found that prepared statements are saved on the database server and need to be closed/deallocated. But I can't find any way to do that using the MySql.Data client. :(
The following page says:
https://dev.mysql.com/doc/refman/8.0/en/sql-prepared-statements.html
A prepared statement is specific to the session in which it was created. If you terminate a session without deallocating a previously prepared statement, the server deallocates it automatically.
But that seems incorrect, as I am facing the error even after closing the connection on each loop.
So I am at a dead end and looking for some help here.
Thanks in advance,
best regards
From the official docs, the role of max_prepared_stmt_count is:
This variable limits the total number of prepared statements in the server.
Therefore, you need to increase the value of this variable in your MySQL server's configuration so as to raise the maximum number of allowed prepared statements:
1. Open the my.cnf file.
2. Under the mysqld section there is a variable max_prepared_stmt_count. Edit the value accordingly (remember the upper limit of this value is 1048576).
3. Save and close the file. Restart the MySQL service for the changes to take effect.
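For illustration, the relevant fragment of my.cnf might look like this (the value shown is only an example, not a recommendation):
[mysqld]
max_prepared_stmt_count = 65536   # example value; the documented upper limit is 1048576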
You're probably running into bug 77421 in MySql.Data: by default, it doesn't reset connections.
This means that temporary tables, user-declared variables, and prepared statements are never cleared on the server.
You can fix this by adding Connection Reset = True; to your connection string.
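A minimal sketch of the fix, assuming a MySql.Data connection (the server, database, and credential values are placeholders):
using MySql.Data.MySqlClient;

// placeholder connection details; the important part is "Connection Reset = True"
var connectionString = "Server=myhost;Database=mydb;Uid=myuser;Pwd=mypass;Connection Reset=True;";
using (var conn = new MySqlConnection(connectionString))
{
    conn.Open();
    // prepared statements created on this connection are now cleared
    // when the pooled connection is reset
}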
Another fix would be to switch to MySqlConnector, an alternative ADO.NET provider for MySQL that fixes this and other bugs. (Disclaimer: I'm the lead author.)

System.AccessViolationException - Oracle client tools

Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.AccessViolationException
Stack:
at Oracle.DataAccess.Client.OracleParameter.SetStatus(Int32)
at Oracle.DataAccess.Client.OracleParameter.PreBind(Oracle.DataAccess.Client.OracleConnection, IntPtr, Int32, Boolean)
at Oracle.DataAccess.Client.OracleCommand.ExecuteNonQuery()
My application is crashing with this exception.
The context: we are trying to execute a stored procedure that accepts two parameters, and we are using named binding for the parameters in the command text.
Example:
OraCommand.CommandText = "begin LOAD_UTILS.TRUNCATE_TABLE(:a1,:a2); end;"
OraCommand.Parameters.Add(New OracleParameter("a1", OracleDbType.Varchar2, 1000))
OraCommand.Parameters(0).Direction = ParameterDirection.Input
OraCommand.Parameters(0).Value = params(2)
OraCommand.Parameters.Add(New OracleParameter("a2", OracleDbType.Varchar2, 1000))
OraCommand.Parameters(1).Direction = ParameterDirection.Input
OraCommand.Parameters(1).Value = params(3)
The oddity I have observed is that we run these statements in a for loop for a certain number of retries in case there are any errors. OraCommand is an instance variable, so the parameters collection does not get cleared.
In iteration #1: parameters a1 and a2 get added.
In iteration #2: the first two parameters are already there, and we add parameters named a1 and a2 again.
...
I am not seeing the issue when I clear the parameter collection at the start of every iteration (see the sketch below), but I am not able to come up with a theory for what is causing the issue here. Any thoughts?
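A hedged sketch of that workaround, with the parameter collection cleared at the top of each retry pass (maxRetries, tableOwner, and tableName are placeholders, and the snippet is C# rather than the VB.NET above):
for (int attempt = 0; attempt < maxRetries; attempt++)
{
    OraCommand.Parameters.Clear(); // start each attempt with an empty collection
    OraCommand.CommandText = "begin LOAD_UTILS.TRUNCATE_TABLE(:a1,:a2); end;";
    OraCommand.Parameters.Add(new OracleParameter("a1", OracleDbType.Varchar2, 1000)
        { Direction = ParameterDirection.Input, Value = tableOwner });
    OraCommand.Parameters.Add(new OracleParameter("a2", OracleDbType.Varchar2, 1000)
        { Direction = ParameterDirection.Input, Value = tableName });
    try
    {
        OraCommand.ExecuteNonQuery();
        break; // success, stop retrying
    }
    catch (OracleException)
    {
        // log and retry on the next iteration
    }
}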
You are using ODP.NET, and the way it works is that the .NET C# APIs are just wrappers exposing its service layer written in C, which communicates with OCI (Oracle Call Interface). The access violation almost certainly comes from that service layer or from OCI, because an access violation (AV) is not really feasible in managed code (.NET C#); the CLR doesn't allow it. An easier way to debug the issue would be to use WinDbg: load the symbols and it will point you to the exact method causing the problem, which will be in unmanaged code. The challenge is that you will have the release version with, at most, public symbols, which doesn't help much. Another option is to post the issue with code on Oracle Technet, where the ODP.NET developers can provide a workaround and a fix. Before that, however, in my view there are certain issues with your code:
In a loop, the following lines will add new parameters a1 and a2 every time:
OraCommand.Parameters.Add(New OracleParameter("a1", OracleDbType.Varchar2, 1000))
OraCommand.Parameters.Add(New OracleParameter("a2", OracleDbType.Varchar2, 1000))
The behavior you are experiencing is expected: after creating the new parameters you set the value and direction at indexes 0 and 1, which are the same every time, yet in every iteration you add two more parameters to the OracleCommand object, and somewhere that messes up its internal structure, since what you are doing is incorrect. The correct way would be the following code in the loop:
OracleParameter A1 = new OracleParameter("a1", OracleDbType.Varchar2, 1000);
A1.Direction = ParameterDirection.Input;
A1.Value = params[2];
OraCommand.Parameters[0] = A1;
OracleParameter A2 = new OracleParameter("a2", OracleDbType.Varchar2, 1000);
A2.Direction = ParameterDirection.Input;
A2.Value = params[3];
OraCommand.Parameters[1] = A2;
In fact, another odd thing in your code is that you access the collection like an array with the following code:
OraCommand.Parameters(0).Value = params(2)
This is not possible in C#; you have to use square brackets []. In my view you must be using VB.NET, since even your New is not C# syntax: in C# it would be lowercase new.
@Mrinal Kamboj You are right, the code is in VB.NET. The exception is thrown when it tries to execute ExecuteNonQuery. I tried WinDbg, and this is what it gives:

EzAPI OLE DB Destination

I've searched all over and I now have to ask SO. I'm trying to construct a simple dataflow using EzAPI. It's been anything but easy, but I'm committed to figuring this out. What I can't figure out is how to get the EzOleDbDestination working. Here's my complete code:
var a = new Application();
// using a template since it's impossible to set up an ADO.NET connection to MySQL
// using EzAPI and potentially even with the raw SSIS API...
var pkg = new EzPackage(a.LoadPackage(@"C:\...\Package.dtsx", null));
pkg.Name = "Star";
var df = new EzDataFlow(pkg);
df.Name = "My DataFlow";
var src = new EzAdoNetSource(df);
src.Name = "Source Database";
src.SqlCommand = "SELECT * FROM enum_institution";
src.AccessMode = AccessMode.AM_SQLCOMMAND;
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
src.ReinitializeMetaData();
var derived = new EzDerivedColumn(df);
derived.AttachTo(src);
derived.Name = "Prepare Dimension Attributes";
derived.LinkAllInputsToOutputs();
derived.Expression["SourceNumber"] = "id";
derived.Expression["Name"] = "(DT_STR,255,1252)description";
// EDIT: reordered the operation here and I no longer get an error, but
// I'm not getting any mappings or any input columns when I open the package in the designer
var dest = new EzOleDbDestination(df);
dest.AttachTo(derived, 0, 0);
dest.Name = "Target Database";
dest.AccessMode = 0;
dest.Table = "[dbo].[DimInstitution]";
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
// this comes from Yahia's link
var destInput = dest.Meta.InputCollection[0];
var destVirInput = destInput.GetVirtualInput();
var destInputCols = destInput.InputColumnCollection;
var destExtCols = destInput.ExternalMetadataColumnCollection;
var sourceColumns = derived.Meta.OutputCollection[0].OutputColumnCollection;
foreach (IDTSOutputColumn100 outputCol in sourceColumns)
{
    // Now getting a COM exception here...
    var extCol = destExtCols[outputCol.Name];
    if (extCol != null)
    {
        // Create an input column from an output column of the previous component.
        destVirInput.SetUsageType(outputCol.ID, DTSUsageType.UT_READONLY);
        var inputCol = destInputCols.GetInputColumnByLineageID(outputCol.ID);
        if (inputCol != null)
        {
            // map the input column to an external metadata column
            dest.Comp.MapInputColumn(destInput.ID, inputCol.ID, extCol.ID);
        }
    }
}
Basically, anything that involves a call to ReinitializeMetaData() results in 0xC0090001; that method is where the error happens. There's no real documentation to help me, so I have to rely on any gurus here.
I should mention that the source DB is MySQL and the target DB is SQL Server. Building packages like this using the SSIS designer works fine, so I know it's possible.
Feel free to tell me if I'm doing anything else wrong.
EDIT: here's a link to the base package I'm using as a template: http://www.filedropper.com/package_1 . I've redacted the connection details, but any MySQL and SQL Server database will do. The package will read from MySQL (using the MySQL ADO.NET Connector) and write to SQL Server.
The database schema is mostly irrelevant. For testing, just make a table in MySQL that has two columns: id (int) and description (varchar), with id being the primary key. Make equivalent columns in SQL Server. The goal here is simply to copy from one to the other. It may end up being more complex at some point, but I have to get past this hurdle first.
I can't test this now BUT I am rather sure that the following will help you get it working:
- Calling ReinitializeMetaData() causes the component to fetch the table metadata. This should only be called after setting the AccessMode and related properties. You are calling it before setting AccessMode (see the sketch after this list).
- Various samples, including advice on debugging problems
- Define the derived column(s) directly in the SQL command instead of using an EzDerivedColumn.
- Try to get it working with two SQL Server DBs first; some of the available MySQL ADO.NET providers have shortcomings under some circumstances.
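A hedged illustration of that ordering on the destination side, reusing the property names from the question's code:
var dest = new EzOleDbDestination(df);
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
dest.AccessMode = 0;                     // set the access mode and related properties first
dest.Table = "[dbo].[DimInstitution]";
dest.ReinitializeMetaData();             // only now ask the component to fetch the table metadata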
UPDATE: as per the comments, here is some more information on debugging this and a link to a complete end-to-end sample with source:
http://blogs.msdn.com/b/mattm/archive/2009/08/03/looking-up-ssis-hresult-comexception-errorcode.aspx
http://blogs.msdn.com/b/mattm/archive/2009/08/03/debugging-a-comexception-during-package-generation.aspx
Complete working sample with source
I've had this exact same issue and was able to resolve it with a lot of experimentation. In short, you must set the connection for both the source and the destination, and then call AttachTo after both connections are set. You must call AttachTo for every component.
I've written a blog post about starting with an SSIS package as a template and then manipulating it programmatically to produce a set of new packages.
The article explains the issue in more detail.
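A minimal sketch of the ordering this answer describes, using the same names as the question's code (the derived-column step is omitted and the connection manager names are assumptions):
// set connections on both components before attaching anything
var src = new EzAdoNetSource(df);
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
src.AccessMode = AccessMode.AM_SQLCOMMAND;
src.SqlCommand = "SELECT * FROM enum_institution";
src.ReinitializeMetaData();

var dest = new EzOleDbDestination(df);
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
dest.Table = "[dbo].[DimInstitution]";

// only after both connections are set, attach and refresh the destination metadata
dest.AttachTo(src);
dest.ReinitializeMetaData();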

Retrieve data from browser local storage using c#

Is it possible to retrieve data from chrome/firefox local storage using C#?
Disclaimer: I have tested this on my Windows 7 x64 running Google Chrome 13.0.782.220 at the moment. The information provided here is a result of my own research and is not any official way or API to retrieve this information. Use at your own risk. Also the technique presented here might break with any future release if Chrome changes the way to store this information.
So, Google Chrome uses SQLite to persist local storage data. You could use the System.Data.SQLite managed driver to read it from your .NET application. If you are running on Windows 7 (don't know for others as that's the one I have and can test), you will have the following folder:
c:\Users\SOMEUSERNAME\AppData\Local\Google\Chrome\User Data\Default\Local Storage\
This folder contains multiple files with the .localstorage extension, one per site. For example, for StackOverflow I have http_stackoverflow.com_0.localstorage, but of course this naming is totally arbitrary and you cannot rely upon it. Each file is a SQLite database.
I have noticed that this database contains a table called ItemTable with 2 string columns called key and value.
So to read the values it's a simple matter of sending a SQL query:
class Program
{
    static void Main()
    {
        using (var conn = new SQLiteConnection("Data Source=http_stackoverflow.com_0.localstorage;Version=3;"))
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();
            cmd.CommandText = "SELECT key, value FROM ItemTable";
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(
                        "key: {0}, value: {1}",
                        reader.GetString(reader.GetOrdinal("key")),
                        reader.GetString(reader.GetOrdinal("value"))
                    );
                }
            }
        }
    }
}
When using the Chromium Embedded Framework I found that the above solution has many limitations. It seems Chromium has moved to using LevelDB instead.
I ended up with a solution where I inject JS code that modifies local storage in FrameLoadStart. (It should be easy to read values as well: JavascriptResponse.Result can be cast to an IDictionary when using the script "window.localStorage;" instead; see the sketch after the snippet below.)
// writing local storage in FrameLoadStart
string script = "";
foreach (var entry in LocalStorage)
{
    script += $"window.localStorage.{entry.Key} = '{entry.Value}';";
}
IFrame frame = chromeBrowser.GetMainFrame();
var result = await frame.EvaluateScriptAsync(script, frame.Url, 0);
if (!result.Success)
{
    throw new Exception(result.Message);
}
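For the read direction mentioned above, a minimal sketch under the same assumptions (the same chromeBrowser instance and CefSharp's JavascriptResponse as in the snippet above; the exact dictionary type of Result is an assumption):
IFrame frame = chromeBrowser.GetMainFrame();
JavascriptResponse response = await frame.EvaluateScriptAsync("window.localStorage;", frame.Url, 0);
if (response.Success && response.Result is IDictionary<string, object> storage)
{
    foreach (var entry in storage)
    {
        Console.WriteLine($"key: {entry.Key}, value: {entry.Value}");
    }
}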

IDataRecord.IsDBNull causes an System.OverflowException (Arithmetic Overflow)

I have an OdbcDataReader that gets data from a database and returns a set of records.
The code that executes the query looks as follows:
OdbcDataReader reader = command.ExecuteReader();
while (reader.Read())
{
    yield return reader.AsMovexProduct();
}
The method returns an IEnumerable of a custom type (MovexProduct). The conversion from an IDataRecord to my custom type MovexProduct happens in an extension method that looks like this (abbreviated):
public static MovexProduct AsMovexProduct(this IDataRecord record)
{
    var movexProduct = new MovexProduct
    {
        ItemNumber = record.GetString(0).Trim(),
        Name = record.GetString(1).Trim(),
        Category = record.GetString(2).Trim(),
        ItemType = record.GetString(3).Trim()
    };
    if (!record.IsDBNull(4))
        movexProduct.Status1 = int.Parse(record.GetString(4).Trim());
    // Additional properties with IsDBNull checks follow here.
    return movexProduct;
}
As soon as I hit the if (!record.IsDBNull(4)) I get an OverflowException with the exception message "Arithmetic operation resulted in an overflow."
StackTrace:
System.OverflowException was unhandled by user code
Message=Arithmetic operation resulted in an overflow.
Source=System.Data
StackTrace:
at System.Data.Odbc.OdbcDataReader.GetSqlType(Int32 i)
at System.Data.Odbc.OdbcDataReader.GetValue(Int32 i)
at System.Data.Odbc.OdbcDataReader.IsDBNull(Int32 i)
at JulaAil.DataService.Movex.Data.ExtensionMethods.AsMovexProduct(IDataRecord record) [...]
I've never encountered this problem before and I cannot figure out why I get it. I have verified that the record exists, that it contains data, and that the indexes I provide are correct. I should also mention that I get the same exception if I change the if-statement to if (record.GetString(4) != null). What does work is wrapping the property assignment in a try {} catch (NullReferenceException) {} block, but that can lead to performance loss (can it not?).
I am running the x64 version of Visual Studio and I'm using a 64-bit odbc driver.
Has anyone else come across this? Any suggestions as to how I could solve / get around this issue?
Many thanks!
For anyone experiencing the same issue: the way I solved this was to switch from the Odbc* classes to their OleDb* counterparts. This of course requires that your data driver supports OleDb connections.
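A minimal sketch of that switch, assuming the same MovexProduct type and AsMovexProduct extension method as above (the provider and connection string are placeholders that depend on your driver):
using System.Collections.Generic;
using System.Data.OleDb;

public static IEnumerable<MovexProduct> ReadMovexProducts(string connectionString, string sql)
{
    using (var connection = new OleDbConnection(connectionString))
    using (var command = new OleDbCommand(sql, connection))
    {
        connection.Open();
        using (OleDbDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // same extension method as above, now on an OleDb reader
                yield return reader.AsMovexProduct();
            }
        }
    }
}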
Which DB are you trying to talk to? If it uses some "private" column types, that can cause problems. That doesn't apply to SQL Server, of course :-)
Also check that you are compiling and running as x64 (Process Explorer will show you this, and even plain Task Manager shows it). devenv.exe will still be x86, but your actual binary should run as x64. The reason I mention this is that crossing the 32/64-bit boundary is notorious for breaking third-party ODBC drivers.
