I am using SqlFileStream to store files in a database, but I have run into an issue when inserting new records that I can't seem to resolve. The problem is that I need to create a temporary file when inserting a record so that I have a file path to stream the actual file data to. I expected the temporary file to be overwritten when the real data is streamed to that path, but instead another file is added, leaving the empty file behind in the file system.
Here is the code (please ignore the query string format and the way I am inserting the parameters, I have been hacking away trying different things and have lost the will to code nicely ;) ):
using (TransactionScope transactionScope = new TransactionScope())
{
    // Insert a zero-length placeholder value so the row has a FILESTREAM path to write to,
    // and return that path in the same batch.
    string InsertTSql = "Insert Into Documents(file_name, file_type, category, uploaded_by, file_data) " +
        "values('" + request.fileInfo.fileName + "', '" + request.fileInfo.fileType + "', '" + request.fileInfo.category + "', '" + request.fileInfo.uploadedBy + "', Cast('' As varbinary(Max))); " +
        "Select file_data.PathName() As Path From Documents Where document_id = @@Identity";

    SqlConnection dbConnection = new SqlConnection(@"Data Source=SAM\SQLEXPRESS;Initial Catalog=PIRS;Integrated Security=True;MultipleActiveResultSets=True;Application Name=EntityFramework"); //ConfigurationManager.ConnectionStrings["PIRSDBCon"].ToString()
    var cmd = new SqlCommand(InsertTSql, dbConnection);
    dbConnection.Open();
    string filePath = (string)cmd.ExecuteScalar();

    // Get the transaction context required by SqlFileStream.
    string GetTcTSql = "Select GET_FILESTREAM_TRANSACTION_CONTEXT() As TransactionContext";
    cmd = new SqlCommand(GetTcTSql, dbConnection);
    byte[] transactionContext = (byte[])cmd.ExecuteScalar();

    // Stream the actual file data to the FILESTREAM path.
    SqlFileStream sqlFileStream = new SqlFileStream(filePath, transactionContext, FileAccess.Write);
    request.fileData.CopyTo(sqlFileStream);
    sqlFileStream.Close();

    transactionScope.Complete();
}
Please just let me know if I can add further information or more clarity regarding the issue.
FILESTREAM storage uses a garbage collector: every change generates a new file, and the old file is eventually deleted, but that doesn't happen instantly.
See e.g. How It Works: FileStream Garbage Collection or How It Works: File Stream the Before and After Image of a File for some discussion of the mechanics behind this.
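If the leftover file bothers you while testing, you can usually nudge the collector along yourself. Here is a minimal sketch, assuming SQL Server 2012 or later (where sp_filestream_force_garbage_collection is available) and that the relevant log records are no longer needed (for example, the database uses the simple recovery model or the log has been backed up); the connection details are taken from the question:

using System.Data.SqlClient;

string connectionString = @"Data Source=SAM\SQLEXPRESS;Initial Catalog=PIRS;Integrated Security=True";

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // A checkpoint makes superseded FILESTREAM files eligible for collection sooner.
    using (var checkpoint = new SqlCommand("CHECKPOINT", connection))
    {
        checkpoint.ExecuteNonQuery();
    }

    // Ask SQL Server to run the FILESTREAM garbage collector now (SQL Server 2012+ only).
    using (var gc = new SqlCommand("EXEC sp_filestream_force_garbage_collection @dbname = N'PIRS'", connection))
    {
        gc.ExecuteNonQuery();
    }
}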
I have a C# API that I use to read PDF files and insert them into a SQL Server table.
On the front end I have a React application which consumes the API to do CRUD operations.
The React part works perfectly; I can do all the operations and I have no errors or warnings there.
I have already inserted some PDF documents into my table with this query:
INSERT INTO [DEFLEGOPINION] (content,extension,title)
SELECT BulkColumn , '.pdf' , 'Title123'
FROM OPENROWSET(BULK N'C:\temp\test.pdf', SINGLE_BLOB) AS BulkColumn
SELECT CAST('string' as varbinary(max)) FOR XML PATH(''), BINARY BASE64
When I open and read the documents that were inserted this way, I can do so without any problem.
On the other hand, when I insert new files from the API, the insertion works correctly, but when I try to open the file it says that it's in a bad/corrupted format...
Here's how I am trying to insert files from the C# API:
connection.Open();
string[] newContent = content.Split(',');
// here I removed the 'data:application/pdf;base64' part since I read somewhere that it causes a problem and should be removed
string encodedStr = Convert.ToBase64String(Encoding.UTF8.GetBytes(content));

String sql = "INSERT INTO myTable(Id, Name, Content) values('" + Guid.NewGuid() + "', '"
    + name + "', CONVERT(varbinary(max), '" + newContent[1] + "', 0))";

using (SqlCommand command = new SqlCommand(sql, connection))
{
    command.ExecuteNonQuery();
    return true;
}
// Note: Name and Content are string parameters
Since the React part works as it should when reading the other documents, I don't think the issue is there.
What am I missing? Should I change something in the method I use on the C# side, or should I use the first approach but with a file path (which would have to be dynamic, and I can't know the users' file paths)?
Posting this here so it might help someone else in the future.
I just found out that you must use parameters in order to insert a file into SQL Server from a C# API.
It goes like this:
using (SqlCommand command = new SqlCommand(sql, connection))
{
    command.Parameters.Add(new SqlParameter("@Ident", Guid.NewGuid()));
    command.Parameters.Add(new SqlParameter("@Name", name));
    command.Parameters.Add(new SqlParameter("@Content", file));
    command.ExecuteNonQuery();
}
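For completeness, here is roughly what the full insert looks like with parameters. The SQL text and the Convert.FromBase64String call are assumptions on my part about what sql and file contain; the important point is that the binary content travels as a parameter value instead of being spliced into the SQL string:

// Assumed: 'content' is the data URL string from the client, e.g. "data:application/pdf;base64,JVBERi0x..."
string base64Part = content.Split(',')[1];           // strip the "data:application/pdf;base64," prefix
byte[] file = Convert.FromBase64String(base64Part);  // the raw PDF bytes

string sql = "INSERT INTO myTable (Id, Name, Content) VALUES (@Ident, @Name, @Content)";

using (SqlCommand command = new SqlCommand(sql, connection))
{
    command.Parameters.Add(new SqlParameter("@Ident", Guid.NewGuid()));
    command.Parameters.Add(new SqlParameter("@Name", name));
    // A byte[] parameter is sent as varbinary(max), so the PDF bytes arrive untouched.
    command.Parameters.Add(new SqlParameter("@Content", file));
    command.ExecuteNonQuery();
}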
I am trying to insert new data into an old .dbf database created with FoxPro.
The database has a lot of columns and I don't need to fill every single one.
The connection itself works, but now I'm getting the exception "Field XY does not allow null values" for every column I'm not including in my INSERT statement, even though the database is configured to allow null values.
I am using the following code:
OleDbConnection dbfcon = new OleDbConnection("Provider=VFPOLEDB.1;" +
    "Data Source=" + Application.StartupPath + "\\Daten;");
dbfcon.Open();

String query = "INSERT INTO TB_KUVG (KDNR, Kuvg_id) " +
               "VALUES(?,?)";
OleDbCommand cmd = new OleDbCommand(query, dbfcon);
cmd.Parameters.AddWithValue("@KDNR", 1);
cmd.Parameters.AddWithValue("@Kuvg_id", 1);
cmd.ExecuteNonQuery();
dbfcon.Close();
So what am I doing wrong?
Is there a better way to write to a .dbf from C#?
You are almost doing it right. Note that the parameter names are not important; parameters are matched positionally (i.e. @KDNR is added first, so it corresponds to the first ? placeholder). What you are missing is that if the fields you don't pass do not accept NULL values, you need to tell the connection that you want "empty" values for those fields instead ('' for character, / / for date, 0 for numeric, and so on). To do that, execute 'SET NULL OFF' on the same connection.
While adding that, I also revised your existing code a bit:
string dataFolder = Path.Combine(Application.StartupPath, "Daten");
String query = @"INSERT INTO TB_KUVG
                   (KDNR, Kuvg_id)
                 VALUES
                   (?,?)";

using (OleDbConnection dbfcon = new OleDbConnection("Provider=VFPOLEDB;Data Source=" + dataFolder))
{
    OleDbCommand cmd = new OleDbCommand(query, dbfcon);
    cmd.Parameters.AddWithValue("@KDNR", 1);
    cmd.Parameters.AddWithValue("@Kuvg_id", 1);

    dbfcon.Open();
    // Tell the VFP driver to store blank/empty values instead of NULL for columns not supplied.
    new OleDbCommand("set null off", dbfcon).ExecuteNonQuery();
    cmd.ExecuteNonQuery();
    dbfcon.Close();
}
PS: Application.StartupPath might not be a good idea, as it may be under "Program Files", which is read-only.
PS2: It would be better if you added the "VFP" tag there instead of "DBF".
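On the first PS: if the data folder needs to be writable, one common pattern is to keep it under a per-user location instead of the install directory. A small sketch; the "MyApp" subfolder name is just an example:

// A per-user, writable location such as %LOCALAPPDATA%\MyApp\Daten
string writableDataFolder = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
    "MyApp", "Daten");
Directory.CreateDirectory(writableDataFolder);  // does nothing if the folder already exists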
I have a WPF application that reads data from a file like so:
foreach (String line in File.ReadAllLines(file, Encoding.UTF8))
{}
Each line is then parsed and displayed on the screen, which all works fine. Some of the data contains Cyrillic characters, and the strings I use to store this data are also displayed fine in the app window.
However, I then use those same strings to insert the data into a MySQL database. I build a query and fire it with MySqlCommand cmd = new MySqlCommand(query, conn);, which successfully inserts a new row in the database with the appropriate information. Numbers are all fine, but all the strings that go into the database and contain Cyrillic letters are displayed as ????????
The database engine is InnoDB and the encoding of the table and all varchar fields in it is utf8_general_ci, so any idea what is going on and how I can save the correct string in the database?
EDIT:
Per request, here's some code. Database connection:
conn = new MySqlConnection();
conn.ConnectionString = "//censored//";
And the file reading / db loading, shortened for the purposes of this code snippet:
foreach (String line in File.ReadAllLines(file, Encoding.UTF8))
{
    string[] tokens = line.Split('|');
    string query = "INSERT INTO myTable SET first_name = '" + tokens[0] + "'" + ", last_name = '" + tokens[1] + "'";
    MessageBox.Show(tokens[0]);
    MySqlCommand cmd = new MySqlCommand(query, conn);
    cmd.ExecuteNonQuery();
}
The message box shows the name as it should be but what goes into the database is ???????.
After some head-banging I figured out where the problem was, so I'm posting an answer for all to see:
The key part is the way you establish your connection to the database:
conn.ConnectionString = @"Server=YOURSERVER; Database=YOURDB; Uid=YOURUSER; Pwd='YOURPASSWORD'; charset=utf8;";
I was missing the charset=utf8; part before, so I assume all kinds of non-UTF-8 junk was going to the database even though I was encoding in UTF-8 on both sides of the connection. Hope this helps!
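It is also worth parameterizing the insert while you are at it, both for safety and so the connector handles value encoding for you. A rough sketch using the table and column names from the question:

string query = "INSERT INTO myTable SET first_name = @first_name, last_name = @last_name";

using (MySqlCommand cmd = new MySqlCommand(query, conn))
{
    cmd.Parameters.AddWithValue("@first_name", tokens[0]);
    cmd.Parameters.AddWithValue("@last_name", tokens[1]);
    cmd.ExecuteNonQuery();
}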
How do I save an image in an Informix database using C# code?
At the moment, I'm unable to save an image in an Informix database using C# code.
All the steps work, but when the query is executed it throws the error "E42000: (-201) A syntax error has occurred."
Below is the code.
mycon.Open();
int len = Upload.PostedFile.ContentLength;
byte[] pic = new byte[len];
HttpPostedFile img = Upload.PostedFile;
Response.Write("Size of file = " + pic);
img.InputStream.Read(pic, 0, len);
//Upload.PostedFile.InputStream.Read(pic, 0, len);
string str = "insert into imageinfo (name,address,photo) values('" + txtname.Text + "','" + txtaddress.Text + "'," + pic + ")"; //,photo,@photo
mycmd = new OleDbCommand(str, mycon);
//mycmd.Parameters.AddWithValue("@id", int.Parse(txtid.Text));
mycmd.Parameters.AddWithValue("@name", txtname.Text);
mycmd.Parameters.AddWithValue("@address", txtaddress.Text);
mycmd.Parameters.AddWithValue("@photo", pic);
mycmd.ExecuteNonQuery();
mycon.Close();
lblMessage.Text = "Image details inserted successfully";
Response.Redirect("~/RetriveImage.aspx");
mycon.Close();
It starts with your code being a copy/paste mix of at least two sources, because as written it makes no sense.
Your INSERT has the values inlined directly - and you cannot put pic into the string like that, sorry - yet you also define parameters that are never used. That is code from two different sources.
Then the code was copied together without being checked. You define named parameters, but OleDb does not support named parameters (http://social.msdn.microsoft.com/Forums/en-US/vsreportcontrols/thread/637db5d4-e205-489c-b127-7ca14abc48e3/), only positional ones, which tells me you never actually used parameters with Informix via OleDb and just stitched the code together.
Remove the direct data from the INSERT statement and put in parameter markers.
Then use the parameters as per the MS link above (positional; the marker is a ?).
Then it will work.
Occasionally it may require you to actually read the documentation instead of just stitching together code from different sources and asking for help.
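For reference, a sketch of what the parameterized version might look like: ? markers in the statement, parameters added in the same order they appear. The table and column names come from the question; whether a byte[] maps cleanly onto your Informix photo column (BYTE/BLOB) depends on the provider, so treat this as a starting point rather than a guaranteed fix:

string str = "insert into imageinfo (name, address, photo) values (?, ?, ?)";

using (OleDbCommand mycmd = new OleDbCommand(str, mycon))
{
    // OleDb ignores the parameter names; only the order matters, and it must match the ? markers.
    mycmd.Parameters.AddWithValue("@name", txtname.Text);
    mycmd.Parameters.AddWithValue("@address", txtaddress.Text);
    mycmd.Parameters.AddWithValue("@photo", pic);
    mycmd.ExecuteNonQuery();
}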
What I have is an extremely large text file that needs to go into a specific column in a specific row. The file is around 100k lines. What I want to do is read the whole file and, for each line, append that line to the specific SQL column. Here's what I have, but I really need help with the SQL query:
string[] primaryfix = File.ReadAllLines(dinfo + "\\" + filex);
string filename = filex.ToString();
string[] spltifilename = filename.Split('.');

foreach (string primary in primaryfix)
{
    string sqltable = ("dbo.amu_Textloadingarea");
    string sql = "update " + sqltable + " set [Text] = [Text] + '" + primary + "' where begbates = '" + spltifilename[0] + "'";

    SqlConnection con = new SqlConnection("Data Source=Corvette;Initial Catalog=GSK_Avandia_SSECASE;Integrated Security=SSPI");
    con.Open();
    SqlCommand cmd = new SqlCommand(sql, con);
    SqlDataReader reader = cmd.ExecuteReader();
    con.Close();
}
Everything is fine except for the sql string; it doesn't update the way I would like it to.
Any help is always appreciated.
It looks like you're trying to read from the database with that code inside the loop. SqlDataReader provides a way to read rows from the database, not to write them.
Replace
SqlDataReader reader = cmd.ExecuteReader();
with
cmd.ExecuteNonQuery();
The first thing I see is that you are not escaping the input from the text file; any SQL escape characters (like a single quote) will break that command. I'd recommend using parameters so you needn't worry about escapes at all.
Other than that, nothing springs to mind that would explain why the command isn't working, but given how large the file is, it might cause fewer problems to read it line by line rather than all at once.
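Putting both suggestions together, here is a rough sketch of what the loop could look like with a parameterized command and line-by-line reading. The names are taken from the question; treat it as a sketch rather than a drop-in replacement:

string filename = filex.ToString();
string[] spltifilename = filename.Split('.');
string sql = "update dbo.amu_Textloadingarea set [Text] = [Text] + @line where begbates = @begbates";

using (SqlConnection con = new SqlConnection("Data Source=Corvette;Initial Catalog=GSK_Avandia_SSECASE;Integrated Security=SSPI"))
using (SqlCommand cmd = new SqlCommand(sql, con))
{
    cmd.Parameters.AddWithValue("@begbates", spltifilename[0]);
    SqlParameter lineParam = cmd.Parameters.AddWithValue("@line", string.Empty);
    con.Open();

    // File.ReadLines streams the file one line at a time instead of loading all 100k lines into memory.
    foreach (string primary in File.ReadLines(dinfo + "\\" + filex))
    {
        lineParam.Value = primary;
        cmd.ExecuteNonQuery();  // an UPDATE returns no rows, so ExecuteNonQuery is the right call
    }
}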