How do I save an image in an Informix database using C# code?
At the moment, I'm unable to save an image in an Informix database using C# code.
All the steps work, but when the query is executed it throws the error "E42000: (-201) A syntax error has occurred."
Below is the code.
mycon.Open();
int len = Upload.PostedFile.ContentLength;
byte[] pic = new byte[len];
HttpPostedFile img = Upload.PostedFile;
Response.Write("Size of file = " + pic);
img.InputStream.Read(pic, 0, len);
//Upload.PostedFile.InputStream.Read(pic,0,len);
string str = "insert into imageinfo (name,address,photo) values('" + txtname.Text + "','" + txtaddress.Text + "'," + pic + ")";//,photo,#photo
mycmd = new OleDbCommand(str, mycon);
//mycmd.Parameters.AddWithValue("@id", int.Parse(txtid.Text));
mycmd.Parameters.AddWithValue("@name", txtname.Text);
mycmd.Parameters.AddWithValue("@address", txtaddress.Text);
mycmd.Parameters.AddWithValue("@photo", pic);
mycmd.ExecuteNonQuery();
mycon.Close();
lblMessage.Text = "Image details inserted successfully";
Response.Redirect("~/RetriveImage.aspx");
mycon.Close();
It starts with your code being a copy/paste mash-up from at least two sources, because as written it makes no sense.
Your INSERT embeds the values directly - and you cannot concatenate pic into the SQL string like that; string-concatenating a byte[] just yields "System.Byte[]", which is what produces the syntax error - yet at the same time you define parameters that are never used.
On top of that, you define named parameters, but OleDb does not support named parameters (http://social.msdn.microsoft.com/Forums/en-US/vsreportcontrols/thread/637db5d4-e205-489c-b127-7ca14abc48e3/), only parameters by position, which tells me you never actually used parameters with Informix via OleDb and just copied the code together.
Remove the inline data from the INSERT statement and put parameter markers in its place.
Then add the parameters in marker order, as per the MS link above (positional; the marker is a ?).
Then it will work.
Occasionally it may require you to actually read the documentation instead of just stitching together code from different sources and asking for help.
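A minimal sketch of what that looks like, reusing the variables from the question (the column layout is assumed from the original INSERT; with OleDb only the order of the parameters matters, not their names):

```csharp
// Positional markers (?) instead of values concatenated into the SQL text.
string str = "insert into imageinfo (name, address, photo) values (?, ?, ?)";

using (OleDbCommand cmd = new OleDbCommand(str, mycon))
{
    // Added in the same order as the ? markers - the names are ignored.
    cmd.Parameters.AddWithValue("name", txtname.Text);
    cmd.Parameters.AddWithValue("address", txtaddress.Text);
    cmd.Parameters.Add("photo", OleDbType.Binary).Value = pic;
    cmd.ExecuteNonQuery();
}
```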
I have a C# API which I use to read and insert PDF files into a SQL Server table.
On the front-end side I have a React application which consumes the API to do CRUD operations.
The React part works perfectly; I can do all the operations and I have no errors/warnings there.
I have already inserted some PDF documents into my table with this query:
INSERT INTO [DEFLEGOPINION] (content,extension,title)
SELECT BulkColumn , '.pdf' , 'Title123'
FROM OPENROWSET(BULK N'C:\temp\test.pdf', SINGLE_BLOB) AS BulkColumn
SELECT CAST('string' as varbinary(max)) FOR XML PATH(''), BINARY BASE64
So when I try to open and read these documents which are inserted this way, I can easily do that without any problem.
On the other hand, when I insert new files from API, the insertion works correctly but when I try to open the file it says that it's in a bad/corrupted format...
Here's how I am trying to insert files from the C# API:
connection.Open();
string[] newContent = content.Split(',');
//here I removed the 'data:application/pdf;base64' part since I read somewhere that this causes a problem and should be removed
string encodedStr = Convert.ToBase64String(Encoding.UTF8.GetBytes(content));
String sql = "INSERT INTO myTable(Id,Name,Content) values('" + Guid.NewGuid() + "', '"
+ name + "', CONVERT(varbinary(max), '" + newContent[1] + "', 0))";
using (SqlCommand command = new SqlCommand(sql, connection))
{
command.ExecuteNonQuery();
return true;
}
//Note: Name and Content are string parameters
Since the React part works as it should when reading the other documents, I don't think I have an issue there.
What am I missing? Should I change something in the method I use on the C# side, or should I implement the first method but using a file path (which would be dynamic, and I cannot know the users' file paths)?
Posting this here so it might help someone else in the future.
I just found out that you must use parameters in order to insert a file into SQL Server from a C# API.
It goes like this:
using (SqlCommand command = new SqlCommand(sql, connection))
{
    command.Parameters.Add(new SqlParameter("@Ident", Guid.NewGuid()));
    command.Parameters.Add(new SqlParameter("@Name", name));
    command.Parameters.Add(new SqlParameter("@Content", file));
    command.ExecuteNonQuery();
}
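For completeness, a hedged sketch of the whole flow that avoids the corruption in the first place - the root cause in the question's code was that the base64 text was being stored instead of the decoded bytes (table and column names are illustrative, matching the question):

```csharp
// 'content' arrives as a data URL: "data:application/pdf;base64,JVBERi0x..."
string base64Payload = content.Split(',')[1];

// Decode to the raw PDF bytes - storing the base64 text itself
// is what produces a "corrupted" file on the way back out.
byte[] fileBytes = Convert.FromBase64String(base64Payload);

string sql = "INSERT INTO myTable (Id, Name, Content) VALUES (@Id, @Name, @Content)";
using (SqlCommand command = new SqlCommand(sql, connection))
{
    command.Parameters.Add("@Id", SqlDbType.UniqueIdentifier).Value = Guid.NewGuid();
    command.Parameters.Add("@Name", SqlDbType.NVarChar, 255).Value = name;
    // Size -1 means varbinary(max).
    command.Parameters.Add("@Content", SqlDbType.VarBinary, -1).Value = fileBytes;
    command.ExecuteNonQuery();
}
```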
I'm not the first to have these issues, and will list some reference posts below, but am still looking for a proper solution.
I need to call a stored procedure (Oracle 10g database) from a C# web service. The web server has an Oracle 9i client installed and I am using Microsoft's System.Data.OracleClient.
The procedure takes an XML as a CLOB. When the XML was over 4000 Bytes (which is likely in a normal use case), I stumbled over the following error:
ORA-01460 - unimplemented or unreasonable conversion requested
I've found this, this and this post.
Further I found a promising workaround which doesn't call the stored procedure directly from C# but defines a piece of anonymous PL/SQL code instead. This code is run as an OracleCommand. The XML is embedded as a string literal and the procedure call is done from within that piece of code:
private const string LoadXml =
"DECLARE " +
" MyXML CLOB; " +
" iStatus INTEGER; " +
" sErrMessage VARCHAR2(2000); " +
"BEGIN " +
" MyXML := '{0}'; " +
" iStatus := LoadXML(MyXML, sErrMessage); " +
" DBMS_OUTPUT.ENABLE(buffer_size => NULL); " +
" DBMS_OUTPUT.PUT_LINE(iStatus || ',' || sErrMessage); " +
"END;";
OracleCommand oraCommand = new OracleCommand(
string.Format(LoadXml, xml), oraConnection);
oraCommand.ExecuteNonQuery();
Unfortunately, this approach now fails as soon as the XML is over 32 KBytes or so, which still is very likely in my application. This time the error stems from the PL/SQL compiler which says:
ORA-06550: line 1, column 87: PLS-00172: string literal too long
After some research I conclude that it's simply not feasible to solve the problem with my second approach.
Following the above-mentioned posts I have the following two options.
Switch to ODP.NET (because it is supposed to be a bug in Microsoft's deprecated DB client)
Insert the CLOB into a table and make the stored proc read from there
(The first post said some clients are buggy, but mine (9i) does not fall in the mentioned range of 10g/11g versions.)
Can you confirm that these are the only two options left? Or is there another way to help me out?
Just to clarify: the XML won't eventually be saved in any table, but it is processed by the stored procedure which inserts some records in some table based on the XML contents.
My considerations about the two options:
Switching to ODP.NET is difficult because I have to install it on a web server on which I don't have system access so far, and because we might also want to deploy the piece of code on clients, so each client would have to install ODP.NET as part of the deployment.
The detour over a table makes the client code quite a bit more complicated and also takes quite some effort on the database adapting/extending the PL/SQL routines.
I found that there is another way to work around the problem! My fellow employee saved my day pointing me to this blog, which says:
Set the parameter value when BeginTransaction has already been called on the DbConnection.
Could it be simpler? The blog relates to Oracle.DataAccess, but it works just as well for System.Data.OracleClient.
In practice this means:
var cmd = new OracleCommand("LoadXML", _oracleConnection);
cmd.CommandType = CommandType.StoredProcedure;
var xmlParam = new OracleParameter("XMLFile", OracleType.Clob);
cmd.Parameters.Add(xmlParam);
// DO NOT assign the parameter value yet at this point
cmd.Transaction = _oracleConnection.BeginTransaction();
try
{
    // Assign the value here, AFTER starting the transaction
    xmlParam.Value = xmlWithWayMoreThan4000Characters;
    cmd.ExecuteNonQuery();
    cmd.Transaction.Commit();
}
catch (OracleException)
{
    cmd.Transaction.Rollback();
}
In my case, chiccodoro's solution did not work. I'm using ODP.NET (Oracle.DataAccess).
For me the solution was to use an OracleClob object.
OracleCommand cmd = new OracleCommand("LoadXML", _oracleConnection);
cmd.CommandType = CommandType.StoredProcedure;
// In ODP.NET the type enum is OracleDbType (OracleType belongs to System.Data.OracleClient)
OracleParameter xmlParam = new OracleParameter("XMLFile", OracleDbType.Clob);
cmd.Parameters.Add(xmlParam);
// The connection must be open before creating the OracleClob!
OracleClob clob = new OracleClob(_oracleConnection);
// xmlData: a string with way more than 4000 chars
clob.Write(xmlData.ToCharArray(), 0, xmlData.Length);
xmlParam.Value = clob;
try
{
    cmd.ExecuteNonQuery();
}
catch (OracleException e)
{
    // handle or log the error here
}
I guess I just googled this for you to get cheap points, but there's a great explanation here:
http://www.orafaq.com/forum/t/48485/0/
Basically you cannot use more than 4000 chars in a string literal, and if you need more than that, you must use a stored procedure. Even then you are limited to 32 KB at most, so you have to "chunk" the inserts. Blech.
-Oisin
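To illustrate the chunking x0n mentions, here is one possible pattern, sketched with two hypothetical stored procedures: append_xml_chunk(p_chunk IN VARCHAR2), which appends each piece to a package-level CLOB (e.g. via DBMS_LOB.WRITEAPPEND), and load_accumulated_xml, which then processes the completed CLOB. The idea is to push the XML across in pieces small enough for a VARCHAR2 bind:

```csharp
// Hypothetical helpers: append_xml_chunk accumulates chunks server-side,
// load_accumulated_xml runs the real processing once the CLOB is complete.
const int ChunkSize = 8000; // stay well under the bind/VARCHAR2 limits

for (int offset = 0; offset < xml.Length; offset += ChunkSize)
{
    string chunk = xml.Substring(offset, Math.Min(ChunkSize, xml.Length - offset));
    using (OracleCommand chunkCmd = new OracleCommand("append_xml_chunk", oraConnection))
    {
        chunkCmd.CommandType = CommandType.StoredProcedure;
        chunkCmd.Parameters.Add(new OracleParameter("p_chunk", OracleType.VarChar)).Value = chunk;
        chunkCmd.ExecuteNonQuery();
    }
}

// Finally invoke the procedure that reads the accumulated CLOB.
using (OracleCommand loadCmd = new OracleCommand("load_accumulated_xml", oraConnection))
{
    loadCmd.CommandType = CommandType.StoredProcedure;
    loadCmd.ExecuteNonQuery();
}
```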
chiccodoro is right.
public static int RunProcedure(string storedProcName, IDataParameter[] parameters)
{
using (OracleConnection connection = new OracleConnection(connectionString))
{
int rowsAffected;
OracleCommand command = new OracleCommand(storedProcName, connection);
command.CommandText = storedProcName;
command.CommandType = CommandType.StoredProcedure;
foreach (OracleParameter parameter in parameters)
{
command.Parameters.Add(parameter);
}
connection.Open();
try
{
// start transaction
command.Transaction = connection.BeginTransaction();
rowsAffected = command.ExecuteNonQuery();
command.Transaction.Commit();
}
catch (System.Exception)
{
command.Transaction.Rollback();
throw; // rethrow without resetting the stack trace
}
connection.Close();
return rowsAffected;
}
}
I have recently started a new job and they use VistaDB, so I cannot change the software package (before people suggest that). I have obtained from the database a byte[] from a column of data type image; the column is used by different systems, so its data type cannot be changed from image to varbinary. I have made alterations to the byte[] and now need to put it back into the database in a new record, however I can't seem to work out what the SQL query for it should be.
So far I have the following, where zz is the byte[]; the rest of it works fine, I just need a way to get that into my SQL query:
sql = "INSERT INTO TimeHistory(\"Data\",\"Name\",\"Units\",\"ParameterData\",\"StartTime\",\"EndTime\",\"StorageRate\",\"Measurement\") SELECT \'" +zz+ "\',\'" + Name + "\',\'" + Units + "\',\'" + ParameterData + "\',\'" + start + "\',\'" + end + "\',\'" + storage + "\'" + ",SELECT Max(ID)From Measurement;";
ExecuteScript(sql);
This is done with C# (.NET) using WPF forms.
The key to doing what you want is to use parameters to pass data to your SQL operation, not to convert it to a string and embed it in the T-SQL code. This is a best practice not just because it prevents needless type conversions (say, from DateTime to string and back to DateTime for storage) but also for security: it ensures the database engine only attempts to execute as code the things you intended to be code, not data that happened to be escaped in a way that made it evaluate as part of the string.
We have a good example of how to do this in our ADO.NET samples at:
Common Operations in ADO.NET
If you go down the page you'll see an example "Inserting Data Using a Parameterized Command" which will work with any type, like this:
using (VistaDBConnection connection = new VistaDBConnection())
{
connection.ConnectionString = @"Data Source=C:\mydatabase.vdb5";
connection.Open();
using (VistaDBCommand command = new VistaDBCommand())
{
int Age = 21;
command.Connection = connection;
command.CommandText = "INSERT INTO MyTable (MyColumn) VALUES (@age)";
command.Parameters.Add("@age", Age);
command.ExecuteNonQuery();
}
}
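Applied to the question's TimeHistory insert, a hedged sketch might look like the following (assuming zz and the other variables from the question; the ADO.NET pattern is the same as in the sample above):

```csharp
string sql = "INSERT INTO TimeHistory (\"Data\",\"Name\",\"Units\",\"ParameterData\",\"StartTime\",\"EndTime\",\"StorageRate\",\"Measurement\") " +
             "SELECT @data, @name, @units, @parameterData, @start, @end, @storage, Max(ID) FROM Measurement";

using (VistaDBCommand command = new VistaDBCommand())
{
    command.Connection = connection;
    command.CommandText = sql;
    // The byte[] goes in as a parameter - no string conversion needed.
    command.Parameters.Add("@data", zz);
    command.Parameters.Add("@name", Name);
    command.Parameters.Add("@units", Units);
    command.Parameters.Add("@parameterData", ParameterData);
    command.Parameters.Add("@start", start);
    command.Parameters.Add("@end", end);
    command.Parameters.Add("@storage", storage);
    command.ExecuteNonQuery();
}
```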
I am using SqlFileStream to store files in a database, however I have a small issue when inserting new records that I can't seem to resolve. The problem is that I need to create a temporary file when inserting a record, to provide a file path to stream the actual file data to. I was expecting the temporary file to be overwritten when the actual data is streamed to that path, but another file is added instead, leaving the null file in the file system.
Here is the code (please ignore the query string format and the way I am inserting the parameters, I have been hacking away trying different things and have lost the will to code nicely ;) ):
using (TransactionScope transactionScope = new TransactionScope())
{
string InsertTSql = "Insert Into Documents(file_name, file_type, category, uploaded_by, file_data) values('" + request.fileInfo.fileName + "', '" + request.fileInfo.fileType + "', '" + request.fileInfo.category + "', '" + request.fileInfo.uploadedBy + "',Cast('' As varbinary(Max))); Select file_data.PathName() As Path From Documents Where document_id = @@Identity";
SqlConnection dbConnection = new SqlConnection(@"Data Source=SAM\SQLEXPRESS;Initial Catalog=PIRS;Integrated Security=True;MultipleActiveResultSets=True;Application Name=EntityFramework");//ConfigurationManager.ConnectionStrings["PIRSDBCon"].ToString()) )
var cmd = new SqlCommand(InsertTSql, dbConnection);
dbConnection.Open();
string filePath = (string)cmd.ExecuteScalar();
string GetTcTSql = "Select GET_FILESTREAM_TRANSACTION_CONTEXT() As TransactionContext";
cmd = new SqlCommand(GetTcTSql, dbConnection);
byte[] transactionContext =(byte[]) cmd.ExecuteScalar();
SqlFileStream sqlFileStream = new SqlFileStream(filePath, transactionContext, FileAccess.Write);
request.fileData.CopyTo(sqlFileStream);
sqlFileStream.Close();
transactionScope.Complete();
}
Please just let me know if I can add further information or more clarity regarding the issue.
The filestream system uses a garbage collector - every change always generates a new file, and then eventually the old file is deleted. But that doesn't happen instantly.
See e.g. How It Works: FileStream Garbage Collection or How It Works: File Stream the Before and After Image of a File for some discussion of the mechanics behind this.