Dapper and Varbinary(max) stream parameters - C#

I'm trying to store a binary blob in a database table and get an output back from it. The data is held in a MemoryStream object, and I'm trying to save it to a SQL Server 2012 table using QueryAsync. The call succeeds, but no data is inserted into the column (I get a 0x0 entry when I query it back).
Sure enough, checking a trace I see Dapper sending 0x0. The MemoryStream has a length, so am I doing something wrong, or does Dapper not support this scenario?
My query string is just a simple insert that returns the id and insertion time.
I'm using the following call:
using (var conn = new SqlConnection(_connstr))
{
    var dynParams = new DynamicParameters();
    dynParams.Add("BinaryBlob", _memoryStream, DbType.Binary, ParameterDirection.Input, -1);
    var res = await conn.QueryAsync<MyResultType>(_queryStr, dynParams);
}
The query inserts a row and gets a timestamp back, but no data is actually inserted. What am I doing wrong?

Make sure you seek to the beginning of the MemoryStream; streams have positional state. Another approach would be to convert the MemoryStream to a byte[] before trying to persist it.
e.g.
_memoryStream.Seek(0, SeekOrigin.Begin);
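
A minimal sketch of both options, reusing _memoryStream, _connstr, _queryStr and MyResultType from the question (nothing else is assumed):

// Option 1: rewind the stream before handing it to Dapper,
// so the provider reads from the start rather than the current position.
_memoryStream.Seek(0, SeekOrigin.Begin);

// Option 2: copy the stream into a byte[] and bind that instead.
// MemoryStream.ToArray() returns the full contents regardless of Position.
byte[] blob = _memoryStream.ToArray();

using (var conn = new SqlConnection(_connstr))
{
    var dynParams = new DynamicParameters();
    dynParams.Add("BinaryBlob", blob, DbType.Binary, ParameterDirection.Input, -1);
    var res = await conn.QueryAsync<MyResultType>(_queryStr, dynParams);
}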

Related

SQL Calls from Azure Functions

I have an Azure Function that uses an Event Hub trigger to post data to Blob storage. I read the incoming payload to determine the folder structure of the blob, and I also insert some values from the payload into a MS SQL database. However, these values have to be inserted every hour, not on every trigger. How can I achieve this?
I'm reading the incoming message like this:
var msg = JsonConvert.DeserializeObject<DeviceInfo>(Convert.ToString(myEventHubMessage));
and storing it in the blob:
using (var writer = binder.Bind<TextWriter>(new BlobAttribute(path)))
{
    writer.Write(myEventHubMessage);
}
Here I check whether the record has already been inserted into the database, and insert it if not. But the method CurrentTimeUnprocessed() hits the DB on every trigger, which I want to avoid.
if (CurrentTimeUnprocessed(parameter_array) == 0)
    AddToUnprocessed(parameter_array);
What's the best way to achieve this?
Azure Functions don't maintain any state, so you would have to store the timestamp somewhere.
Is there any reason why you don't want to check the database on every request? It could be a very lightweight query that simply checks a timestamp on a table.
If that is not an option, another alternative would be to use something like Redis with an expiring key.
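
A minimal sketch of the Redis option using StackExchange.Redis; the connection setting name, key format, and helper method are made up for illustration:

// requires the StackExchange.Redis NuGet package
private static readonly ConnectionMultiplexer _redis =
    ConnectionMultiplexer.Connect(Environment.GetEnvironmentVariable("RedisConnection"));

// Returns true only for the first caller in the current hour;
// the key then expires, so the first trigger of the next hour wins again.
private static bool TryClaimCurrentHour()
{
    var db = _redis.GetDatabase();
    var key = $"unprocessed:{DateTime.UtcNow:yyyyMMddHH}";
    return db.StringSet(key, "1", TimeSpan.FromHours(1), When.NotExists);
}

// In the function body:
// if (TryClaimCurrentHour())
//     AddToUnprocessed(parameter_array);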

Entity Framework SqlQuery Select LONG RAW into byte array

I am executing a query in Entity Framework to select LONG RAW data into a byte array.
var result = db.Database.SqlQuery<byte[]>("SELECT MESSAGE FROM FOCUS.ENTRIES");
var list = await result.ToListAsync();
When I execute this code, I get a list of byte arrays, but all of them are empty. In the database they are not empty.
The table looks like this:
CREATE TABLE "FOCUS"."ENTRY"
( "PRIMKEY" NUMBER,
"TITLE" VARCHAR2,
"MESSAGE" LONG RAW
);
I am using ODP.NET, Managed Driver as DB provider.
I guess it's some mapping problem, but I can't figure it out.
Any help would be welcome.
Thanks!
SqlQuery expects a class with a member name matching the SQL column.
public class MessageInfo
{
    public byte[] Message { get; set; }
}

var result = await db.Database
    .SqlQuery<MessageInfo>("SELECT MESSAGE FROM FOCUS.ENTRIES")
    .ToListAsync();
var list = result.Select(x => x.Message);
A few notes from the Oracle docs (https://docs.oracle.com/html/A96160_01/features.htm):
When an OracleDataReader is created containing LONG or LONG RAW types,
OracleDataReader defers the fetch of the LONG or LONG RAW column data.
The initial number of characters for LONG or bytes for LONG RAW
fetched on the client side depends on the InitialLONGFetchSize
property of the OracleCommand. By default, InitialLONGFetchSize is 0.
ODP.NET does not support CommandBehavior.SequentialAccess. Therefore,
LONG and LONG RAW data can be fetched in a random fashion.
To obtain data beyond InitialLONGFetchSize bytes or characters, a
primary key column must be provided in the list of selected columns.
The requested data is fetched from the database when the appropriate
typed accessor method (GetOracleString for LONG or GetOracleBinary for
LONG RAW) is called on the OracleDataReader object.
So try adding the primary key column to the list of selected columns and see if you can retrieve the data. Otherwise, if you cannot change the configuration, you will have to skip SqlQuery and use ODP.NET directly to do the fetch. Or you can create an instance of OracleConnection and pass it as a parameter to your DbContext's constructor.
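
A minimal sketch of the first suggestion, keeping the table and column names used in the question and adding the PRIMKEY column from the DDL (the decimal mapping for NUMBER is an assumption):

public class MessageInfo
{
    public decimal Primkey { get; set; }   // the NUMBER primary key
    public byte[] Message { get; set; }
}

// Selecting the primary key alongside the LONG RAW column lets the
// provider fetch the data beyond InitialLONGFetchSize.
var result = await db.Database
    .SqlQuery<MessageInfo>("SELECT PRIMKEY, MESSAGE FROM FOCUS.ENTRIES")
    .ToListAsync();
var list = result.Select(x => x.Message);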

Right way to store Image in database through Entity Framework

I have converted an image to a byte array using the code below in order to store it in the database:
if (Request.Content.IsMimeMultipartContent())
{
    var provider = new MultipartMemoryStreamProvider();
    await Request.Content.ReadAsMultipartAsync(provider);
    foreach (var file in provider.Contents)
    {
        var filename = file.Headers.ContentDisposition.FileName.Trim('\"');
        var attachmentData = new AttachmentData();
        attachmentData.File = file.ReadAsByteArrayAsync().Result;
        db.AttachmentData.Add(attachmentData);
        db.SaveChanges();
        return Ok(attachmentData);
    }
}
Here the File column in the DB is of type varbinary(max), and in the EF model it is a byte array (byte[]).
Using the above code I was able to save the image in the File column as something similar to "0x30783839353034453437304430413143136303832....." (This length is exceeding 43679 characters which is more than default given to any column so the data got truncated while storing it.)
I would have to change the default length (43679) of the column to store it in the database.
Am I retrieving and storing the image in the database the correct way? I was also thinking to store the image as "Base64 String" but 43679 will still exceed.
I am using AngularJS to show the image on the front end, which uses the WebAPI to fetch and save the image as a byte array in the database.
Yes, it really helps to not know how databases work.
First:
varbinary(max)
This stores up to 2 GB, not 43679 bytes.
Then:
similar to "0x30783839353034453437304430413143136303832.....
This is not how it is stored. This is a textual representation in the output.
(This length is exceeding 43679 characters which is more than default given to any column so the data got truncated while storing it)
There is no such default on any column - outside of, basically, SQL Server Management Studio, which likely will not be able to handle images as images and has a lot of limitations. But that is not the database; it is an admin UI.
I was also thinking to store the image as "Base64 String" but 43679 will still exceed.
Actually no, it will exceed it by even more - your data will be significantly longer, since Base64 encoding is roughly a third larger than the raw binary.
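
For reference, a minimal sketch of how the EF entity could be mapped so the column is varbinary(max); the AttachmentData and File names come from the question, the rest is illustrative:

// requires System.ComponentModel.DataAnnotations.Schema
public class AttachmentData
{
    public int Id { get; set; }

    // EF Code First maps byte[] to varbinary(max) by default when no MaxLength
    // is configured; the explicit attribute just makes the intent visible.
    [Column(TypeName = "varbinary(max)")]
    public byte[] File { get; set; }
}

No column length change is needed to store images of this size.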

Entity Framework and appending VARBINARY field

I am trying to append the bytes of a file to a database field of type VARBINARY, but this needs to be done in pieces due to file size constraints.
Is there any example code/website showing how to do this? Is it even possible to append the bytes to this field using Entity Framework?
I need to append the data because building a byte array of 1GB+ is going to cause memory exceptions, so I think this is the only way.
Some code I have so far:
using (var stream = File.OpenRead(fn))
{
    long bytesToRead = 1;
    byte[] buffer = new byte[bytesToRead];
    while (stream.Read(buffer, 0, buffer.Length) > 0)
    {
        Item = buffer;
    }
}
Thanks for any help
The basic idea is writing a stored procedure that implements an update like this:
UPDATE MyTable SET Col = Col + @newdata WHERE Id = @Id
and invoking it using ExecuteSqlCommand (see the MSDN docs for that method).
But in this case you're only transferring the problem to the SQL Server side (the column must be retrieved, modified, and written back).
To really get rid of the memory problem, implement your stored procedure using UPDATETEXT, which is much more efficient for your requirements:
Updates an existing text, ntext, or image field. Use UPDATETEXT to change only a part of a text, ntext, or image column in place. Use WRITETEXT to update and replace a whole text, ntext, or image field.
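
A minimal sketch of the chunked append driven from C#. It inlines the concatenating UPDATE from above via ExecuteSqlCommand instead of wrapping it in a stored procedure; the table, column and key names and the 1 MB chunk size are placeholders:

using (var stream = File.OpenRead(fn))
{
    var buffer = new byte[1024 * 1024];   // 1 MB per round trip
    int read;
    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Send only the bytes actually read (the last chunk may be partial).
        var chunk = new byte[read];
        Array.Copy(buffer, chunk, read);

        db.Database.ExecuteSqlCommand(
            "UPDATE MyTable SET Col = ISNULL(Col, 0x) + @p0 WHERE Id = @p1",
            chunk, id);
    }
}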
When storing large files in a database, it is usual to store the file on disk on the web server rather than in the database, and to store only the path to the file in the database; your code can then get at the contents without having to push gigabytes of data through the database.
However, if you are attempting to manipulate the contents of a 1GB+ file in memory, that is going to be interesting however you do it...
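
A minimal sketch of that approach, assuming a hypothetical FileEntry entity with a Path property and an _uploadRoot folder (none of these names come from the question):

// Stream the upload straight to disk; only the path goes into the database.
var path = Path.Combine(_uploadRoot, Guid.NewGuid().ToString("N") + ".bin");

using (var source = File.OpenRead(fn))
using (var target = File.Create(path))
{
    source.CopyTo(target);   // copies in chunks, never loads the whole file
}

db.FileEntries.Add(new FileEntry { Path = path });
db.SaveChanges();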

How to get Bytea Data in string from Postgres Database

Is there a way to get the actual encoded string saved in the database for a column of data type bytea? When the record is fetched in C# it comes back as System.Byte[], which I don't want; I want the data as it is saved in that column, e.g.
E'\\\142\\\247\\\ and so on till the data ends.
I will appreciate your support.
When I query the data with
SELECT tpl::TEXT from Cards where ecode="xyz";
I get the following error:
Error: Cannot cast type bytea to text
Line1: Select tpl::TEXT from cards
Thank you
As you can see, the bytea column is showing System.Byte[]. That value was overwritten by my application: the C# code stores the data in a DataTable column as System.Byte[], and while updating the data I didn't decode it before writing it back.
I am using Navicat Premium; when I query the data and right-click on the grid result to copy it as an INSERT statement, it shows me two results for different rows, like this:
INSERT INTO "public"."cards" ("ecode", "tpl") VALUES ('4210149888659', E'System.Byte[]');
INSERT INTO "public"."cards" ("ecode", "tpl") VALUES('3650257637661',E '\\247\\377\\001\\021\\340\\000\\230\\000\\002U\\000e\\000\\362\\000\\002-\\000\\253\\000p\\000\\002\\207\\000~\\000g\\000\\002\\215\\000{\\000\\317\\000\\002\\334\\000h\\000\\222\\000\\001|\\000\\004\\001U\\000\\002\\202\\000K\\000\\201\\000\\001\\000\\000\\204\\000\\241\\000\\001w\\000\\213\\000\\305\\000\\002\\021\\000V\\000\\237\\000\\002L\\001=\\001\\364\\000\\001X\\001"\\001\\313\\000\\002J\\000\\010\\001\\324\\000\\001\\370\\000\\037\\001J\\000\\002;\\0017\\000\\202\\000\\002\\300\\000\\317\\0007\\000\\002\\215\\000[\\000\\004\\011\\017\\007\\012\\005\\015\\014\\006\\016\\012\\007\\010\\005\\005\\007\\011\\010\\001\\004\\012\\017\\002\\003\\010\\012\\004\\010\\005\\003\\013\\014\\005\\017\\007\\003\\010\\003\\001\\011\\004\\012\\006\\020\\011\\005\\013\\015\\010\\002\\004\\005\\010\\007\\011\\012\\000\\002\\002\\020\\012\\003\\015\\000\\005\\002\\017\\003\\000\\006\\016\\020\\010\\017\\014\\000\\001\\012\\001\\010\\011\\002\\004\\007\\010\\000\\002\\006\\011\\007\\003\\020\\011\\003\\001\\005\\011\\000\\007\\002\\012\\002\\000\\020\\000\\016\\004\\017\\004\\003\\011\\017\\000\\003\\004\\000\\001\\007\\017\\002\\001\\017\\014\\006\\002\\016\\015\\011\\015\\006\\014\\016\\010\\020\\013\\000\\003\\006\\015\\002\\005\\020\\015\\016\\015\\004\\001\\003\\015\\010\\010\\006\\014\\002\\007\\020\\014\\011\\001\\000\\014\\010\\003\\016\\001\\015\\017\\020\\013\\006\\013\\016\\013\\011\\001\\014\\013\\004\\013\\002\\013\\001\\000'
);
You can't just convert it, because PostgreSQL can't guarantee it can be converted safely. The best you can do is convert the escaped form into a string, and that's probably not what you want. Keep in mind that since bytea is binary data, there is no way PostgreSQL can be sure the string that comes out will be legit. You could have embedded nulls in a UTF-8 string, for example, which could cause some fun with buffer overruns if you aren't careful.
This is the sort of thing that should be done on the client side, and you should assume that the data is binary and not necessarily a valid string. If you want to store strings, store text fields, not bytea.
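
A minimal client-side sketch using Npgsql, reusing the cards/tpl/ecode names from the question; the connection string variable is assumed, and the hex output is just one possible textual form, not the E'\\...' escape format shown above:

// requires the Npgsql NuGet package
using (var conn = new NpgsqlConnection(_connString))
using (var cmd = new NpgsqlCommand("SELECT tpl FROM cards WHERE ecode = @ecode", conn))
{
    conn.Open();
    cmd.Parameters.AddWithValue("ecode", "xyz");

    // bytea always comes back as byte[]; any textual form is built client-side.
    var bytes = (byte[])cmd.ExecuteScalar();
    string hex = BitConverter.ToString(bytes).Replace("-", "");
}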
