I'm trying to grab an image from the web and add it to my database.
String lsResponse = string.Empty;
using (HttpWebResponse lxResponse = (HttpWebResponse)req1.GetResponse())
{
    using (BinaryReader reader = new BinaryReader(lxResponse.GetResponseStream()))
    {
        // Read up to 10 MB from the response stream.
        Byte[] lnByte = reader.ReadBytes(1 * 1024 * 1024 * 10);
        using (FileStream lxFS = new FileStream(id + ".png", FileMode.Create))
        {
            lxFS.Write(lnByte, 0, lnByte.Length);
        }
    }
}
My SQL Server column type is varbinary(MAX). I tried other types (image, ...) and they did not work either.
Then I put it into the database:
SqlCommand cmd = new SqlCommand("INSERT INTO PlayersDB (id) VALUES (@id)");
cmd.Parameters.AddWithValue("@id", lnByte);
I keep getting an error:
Implicit conversion from data type nvarchar to varbinary(max) is not allowed. Use the CONVERT function to run this query.
So my program doesn't see my lnByte as binary?
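One likely cause: AddWithValue infers the SQL type from the .NET value, so if anything other than a byte[] (for example a Base64 string) reaches the parameter, it is sent to the server as nvarchar, which produces exactly this conversion error. Declaring the parameter type explicitly removes the guesswork. A minimal sketch, reusing the table and column names from the question (the connection string is assumed):

```csharp
// Assumes PlayersDB has a varbinary(MAX) column named id.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("INSERT INTO PlayersDB (id) VALUES (@id)", conn))
{
    // Explicitly typed parameter; size -1 means varbinary(MAX).
    cmd.Parameters.Add("@id", SqlDbType.VarBinary, -1).Value = lnByte;
    conn.Open();
    cmd.ExecuteNonQuery();
}
```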
First, be sure the data type is correct. I don't know much about SQL Server, but I've encountered a similar problem in MySQL: I changed the column type to BLOB and it worked.
We use blobs for storing images in a SQL Server database at my workplace. I'm not particularly experienced with it myself, but here are some TechNet resources that can explain it better than I can:
Binary Large Object (Blob) Data (SQL Server)
FILESTREAM (SQL Server)
Related
I need to read blob data into a memory stream from an Image type column from an SQL Server database. How can I do this using Dapper?
I was reading the Dapper manual but was unable to find information about this.
UPDATE: I need to read the data from the database (from a query). All the links suggested so far have information about how to store the blob in the database.
Figured it out. The dynamic result exposes the column as a byte[]:
var row = con.QueryFirst("SELECT BLOBFIELD FROM TABLE WHERE ID = 1");
byte[] bytes = row.BLOBFIELD;
using (var stream = new System.IO.FileStream(@"C:\Temp\Test.dat", System.IO.FileMode.CreateNew))
    stream.Write(bytes, 0, bytes.Length);
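Since the original question asked for a MemoryStream rather than a file, the same byte[] can be wrapped directly; Dapper will also map a single varbinary/image column to byte[] with a typed query. A sketch, using the placeholder names from the answer above:

```csharp
// Typed query: Dapper maps the single selected column straight to byte[].
byte[] bytes = con.QueryFirst<byte[]>("SELECT BLOBFIELD FROM TABLE WHERE ID = 1");

// Wrap the array; the stream reads from the array without copying it.
using (var stream = new MemoryStream(bytes, writable: false))
{
    // consume the stream here, e.g. Image.FromStream(stream)
}
```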
I am trying to insert an image into my online database, but on insertion the server closes my connection during execution. I have tested other INSERT statements against other tables on the same database and they work fine. I have tried changing the column type to varbinary(MAX) and image; still the same error.
My image is saved in a byte[] variable and then inserted with AddWithValue.
Here is my code to convert a signature to a byte[]:
var signatureImage = sig.GetImage();
byte[] bitmapData;
using (var memStream = new MemoryStream())
{
signatureImage.Compress(Bitmap.CompressFormat.Jpeg, 50, memStream);
bitmapData = memStream.ToArray();
}
Here is my code for adding into a parameter for my SQL command:
command.Parameters.AddWithValue("@waiverSigned", user.waiverSigned);
The Error I am getting:
System.Data.SqlClient.SqlException (0x80131904): Server closed the connection.
Environment: Xamarin Android
Let me know if you need anything else.
I want to select a picture that is saved as a large object in a PostgreSQL database.
I know lo_export can do this.
But there is a problem: I want to save the picture directly to my computer, because I can't access files saved on the server using lo_export.
(I think it would be best if the picture could be transferred to my computer by a SELECT query.)
I don't exactly know my way around C# but the Npgsql Manual has an example sort of like this of writing a bytea column to a file:
command = new NpgsqlCommand("select blob from t where id = 1", conn);
Byte[] result = (Byte[])command.ExecuteScalar();
FileStream fs = new FileStream(args[0] + "database", FileMode.Create, FileAccess.Write);
BinaryWriter bw = new BinaryWriter(new BufferedStream(fs));
bw.Write(result);
bw.Flush();
bw.Close(); // close the writer before the underlying stream
fs.Close();
So you just read it out of the database pretty much like any other column and write it to a local file. The example is about halfway down the page I linked to; search for "bytea" and you'll find it.
UPDATE: For large objects, the process appears to be similar but less SQL-ish. The manual (as linked to above) includes a few large object examples:
NpgsqlTransaction t = Polacz.BeginTransaction();
LargeObjectManager lbm = new LargeObjectManager(Polacz);
// takeOID(idtowaru) returns the picture's OID
LargeObject lo = lbm.Open(takeOID(idtowaru), LargeObjectManager.READWRITE);
byte[] buf = lo.Read(lo.Size());
MemoryStream ms = new MemoryStream();
ms.Write(buf, 0, buf.Length);
ms.Position = 0; // rewind before handing the stream to Image.FromStream
// ...
Image zdjecie = Image.FromStream(ms);
Search the manual for "large object" and you'll find it.
Not familiar with C#, but if you have contrib/dblink around and better access to a separate PostgreSQL server, this might work:
select the large object from the bad db server.
copy the large object into the good db server using dblink.
lo_export on the good db server.
If your pictures don't exceed 1 GB (and you don't need to access only parts of the bytes), then bytea is the better choice for storing them.
Many SQL GUI tools can directly download (or even view) the content of bytea columns.
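A minimal Npgsql sketch of pulling a bytea column straight to a local file (the connection string, table, and column names are assumptions, not from the question):

```csharp
using (var conn = new NpgsqlConnection(connString))
using (var cmd = new NpgsqlCommand("SELECT picture FROM pictures WHERE id = @id", conn))
{
    conn.Open();
    cmd.Parameters.AddWithValue("id", 1);
    // Npgsql returns a bytea column as byte[].
    var data = (byte[])cmd.ExecuteScalar();
    File.WriteAllBytes(@"C:\Temp\picture.png", data);
}
```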
I am building a C# desktop application and I need to save a file into the database. I have a file chooser that gives me the correct path of the file. Now my question is how to save that file into the database using its path.
It really depends on the type and size of the file. If it's a text file, then you could use File.ReadAllText() to get a string that you can save in your database.
If it's not a text file, then you could use File.ReadAllBytes() to get the file's binary data, and then save that to your database.
Be careful, though: databases are not a great way to store large files (you'll run into performance issues).
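A short sketch of the binary route (the connection string, table, and column names here are assumptions, not from the question):

```csharp
// Read the file the chooser pointed at and insert it as varbinary(MAX).
byte[] data = File.ReadAllBytes(filePath);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO Files (FileName, Content) VALUES (@name, @content)", conn))
{
    cmd.Parameters.AddWithValue("@name", Path.GetFileName(filePath));
    // Explicit type avoids implicit-conversion surprises; size -1 = MAX.
    cmd.Parameters.Add("@content", SqlDbType.VarBinary, -1).Value = data;
    conn.Open();
    cmd.ExecuteNonQuery();
}
```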
using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read))
using (BinaryReader br = new BinaryReader(fs))
{
    // FileInfo.Length is a long; BinaryReader.ReadBytes takes an int.
    int numBytes = (int)new FileInfo(fileName).Length;
    byte[] buff = br.ReadBytes(numBytes);
}
Then you upload it to the DB like anything else; I assume you are using a varbinary (BLOB) column.
So FILESTREAM would be it, but since you're using SQL 2005 you will have to do it the read-into-memory way, which consumes a lot of resources.
First off, the column type varchar(max) is your friend: it gives you ~2 GB of data to play with, which is pretty big for most uses.
Next, read the data into a byte array and convert it to a Base64 string:
FileInfo _fileInfo = new FileInfo(openFileDialog1.FileName);
if (_fileInfo.Length < 2147483647) // 2147483647 bytes (~2 GB) is the max column size
{
    // ReadAllBytes reads the whole file and closes the stream for us.
    byte[] _fileData = File.ReadAllBytes(_fileInfo.FullName);
    string _data = Convert.ToBase64String(_fileData);
}
else
{
    MessageBox.Show("File is too large for database.");
}
And reverse the process to recover
byte[] _fileData = Convert.FromBase64String(_data);
You'll want to release those strings as quickly as possible by setting them to string.Empty as soon as you have finished using them!
But if you can, just upgrade to 2008 and use FILESTREAM.
If you're using SQL Server 2008, you could use FILESTREAM (getting started guide here). An example of using this functionality from C# is here.
You would need to read the file into a byte array, then store that as a blob field in the database, possibly together with the name you want to give the file and the file type.
You can just reverse the process for reading the file out again.
I'm trying to load text files (.aspx, .cs, .html, etc.) into a SQL Server 2008 database. So far I'm able to load all files that are less than 64 KB. I have two questions: how do I get around the 64 KB limit, and is the method I'm using the best way to do this?
Thanks for the help.
Database:
file_length int,
file_path varchar(250),
file_string varchar(MAX)
private static void Load_Files()
{
string source = HttpContext.Current.Server.MapPath("~/website/");
DirectoryInfo di = new DirectoryInfo(source);
FileInfo[] files = di.GetFiles();
foreach (FileInfo f in files)
{
string sourceFile = f.FullName;
FileStream fs_reader = new FileStream(sourceFile, FileMode.Open, FileAccess.Read);
StreamReader reader = new StreamReader(fs_reader);
string content = reader.ReadToEnd();
Int32 file_length = content.Length;
string CS = ConfigurationManager.ConnectionStrings["MCP_CS"].ConnectionString;
SqlConnection SQL_Conn_01 = new SqlConnection(CS);
string SQL_01 = "INSERT INTO Page_File_Store (file_length, file_path, file_string) VALUES (@file_length, @file_path, @file_string)";
SqlCommand SQL_File_Load = new SqlCommand(SQL_01, SQL_Conn_01);
SQL_File_Load.Parameters.Add(new SqlParameter("@file_length", file_length));
SQL_File_Load.Parameters.Add(new SqlParameter("@file_path", sourceFile));
// Declare the parameter as varchar(max) (size -1) so content over 8000 bytes is not truncated.
SqlParameter contentParameter = new SqlParameter("@file_string", SqlDbType.VarChar, -1);
contentParameter.Value = content;
SQL_File_Load.Parameters.Add(contentParameter);
SQL_Conn_01.Open();
SQL_File_Load.ExecuteNonQuery();
SQL_Conn_01.Close();
reader.Close();
}
}
}
Please note: this is a copy of a question I asked earlier and lost control of when I cleared my cookies. How do I load text files greater than the 64 KB buffer size limit?
There is no 64 KB limit in SQL Server. The limits are 8000 bytes for in-row data types (char, varchar, nchar, nvarchar, binary and varbinary) and 2 GB for LOB types (varchar(max), nvarchar(max) and varbinary(max)). The 64 KB limitation you see must come from something else, most likely from IIS upload or ASP or CLR processing limitations.
But you're not going to be able to process arbitrarily large files like this: .NET will not load a huge stream into a string via Stream.ReadToEnd() because the memory allocation won't succeed. You will have to load the file in chunks and append each chunk to the database using the LOB-specific UPDATE table SET column.WRITE ... syntax.
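A sketch of that chunked approach, assuming the Page_File_Store schema from the question, that the row already exists with file_string initialized to '' (the .WRITE clause rejects NULL values), and that the files are plain ASCII (multi-byte characters split across chunk boundaries would need extra care):

```csharp
const int chunkSize = 64 * 1024;

using (var fs = File.OpenRead(sourceFile))
using (var conn = new SqlConnection(CS))
using (var cmd = new SqlCommand(
    "UPDATE Page_File_Store SET file_string.WRITE(@chunk, NULL, NULL) " +
    "WHERE file_path = @path", conn))
{
    // .WRITE(value, NULL, NULL) appends @chunk to the current column value.
    var chunkParam = cmd.Parameters.Add("@chunk", SqlDbType.VarChar, -1);
    cmd.Parameters.AddWithValue("@path", sourceFile);
    conn.Open();

    var buffer = new byte[chunkSize];
    int read;
    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        chunkParam.Value = Encoding.ASCII.GetString(buffer, 0, read);
        cmd.ExecuteNonQuery();
    }
}
```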
P.S. Some responses recommend use of the old LOB types like text, ntext and image. Don't use those types with SQL Server 2008, they are deprecated and discouraged.
Use a TEXT column instead of a VARCHAR/CHAR column. If you need something even bigger than TEXT, or will be loading binary files, look into BINARY/VARBINARY/IMAGE, etc.
MSDN provides documentation on all of the available data types. For text files you'll probably want the TEXT type; for binary files, use one of the binary string types.
In addition to phoebus's response: if your working buffer is too small (even smaller than 64 KB), you can read in the first segment, update the text field with it, read another buffer, update the field with text + new buffer, and repeat until all the data is loaded.
Databases are not made for storing big files. Store the files on the hard disk instead and store the file names in the database.
If you still want to store them in the database anyway, you can use a compression library like #ziplib to decrease file sizes (source code compresses very well) and use binary column types as phoebus proposes.
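The answer suggests #ziplib; the same idea also works with the framework's built-in GZipStream, sketched here without any external library (System.IO and System.IO.Compression namespaces):

```csharp
// Compress a file's bytes before storing them in a varbinary column.
static byte[] CompressFile(string path)
{
    byte[] raw = File.ReadAllBytes(path);
    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            gzip.Write(raw, 0, raw.Length);
        } // disposing the GZipStream flushes the final compressed block
        return output.ToArray();
    }
}
```

Decompression reverses the process with CompressionMode.Decompress before handing the bytes back to the caller.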