I have an ASP.NET (3.5) website. I have the following code that uploads a file as a binary to a SQL database:
Print("
protected void UploadButton_Click(object sender, EventArgs e)
{
//Get the posted file
Stream fileDataStream = FileUpload.PostedFile.InputStream;
//Get length of file
int fileLength = FileUpload.PostedFile.ContentLength;
//Create a byte array with file length
byte[] fileData = new byte[fileLength];
//Read the stream into the byte array
fileDataStream.Read(fileData, 0, fileLength);
//get the file type
string fileType = FileUpload.PostedFile.ContentType;
//Open Connection
WebSysDataContext db = new WebSysDataContext(Contexts.WEBSYS_CONN());
//Create New Record
BinaryStore NewFile = new BinaryStore();
NewFile.BinaryID = "1";
NewFile.Type = fileType;
NewFile.BinaryFile = fileData;
//Save Record
db.BinaryStores.InsertOnSubmit(NewFile);
try
{
db.SubmitChanges();
}
catch (Exception)
{
throw;
}
}");
The files that will be uploaded are PDFs. Can you please help me write the code to get the PDF out of the SQL database and display it in the browser? (I am able to get the binary file using a LINQ query but am not sure how to process the bytes.)
So are you really just after how to serve a byte array in ASP.NET? It sounds like the database part is irrelevant, given that you've said you are able to get the binary file with a LINQ query.
If so, look at HttpResponse.BinaryWrite. You should also set the content type of the response appropriately, e.g. application/pdf.
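For instance, a minimal sketch (hypothetical names; assumes you're in a WebForms page or handler where Response is in scope):

// Sketch: serve a byte[] you've already loaded as an inline PDF
byte[] pdfBytes = GetPdfBytesFromDatabase(); // hypothetical helper - e.g. the result of your LINQ query

Response.Clear();
Response.ContentType = "application/pdf";
// "inline" displays in the browser; "attachment" would force a download
Response.AddHeader("Content-Disposition", "inline; filename=document.pdf");
Response.BinaryWrite(pdfBytes);
Response.End();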
How big are the files? Huge buffers (i.e. byte[fileLength]) are usually a bad idea.
Personally, I'd look at things like this and this, which show reading/writing data as streams (the second shows pushing the stream as an HTTP response). But updated to use varbinary(max) ;-p
protected void Test_Click(object sender, EventArgs e)
{
    WebSysDataContext db = new WebSysDataContext(Contexts.WEBSYS_CONN());
    var GetFile = from x in db.BinaryStores
                  where x.BinaryID == "1"
                  select x.BinaryFile;

    FileStream MyFileStream;
    long FileSize;

    MyFileStream = new FileStream(GetFile, FileMode.Open);
    FileSize = MyFileStream.Length;

    byte[] Buffer = new byte[(int)FileSize];
    MyFileStream.Read(Buffer, 0, (int)FileSize);
    MyFileStream.Close();

    Response.Write("<b>File Contents: </b>");
    Response.BinaryWrite(Buffer);
}
I tried this and it did not work. I get a compile error on the line "MyFileStream = new FileStream(GetFile, FileMode.Open);".
I'm not sure where I am going wrong; is it due to the way I have stored it?
When you store binary files in SQL Server, it adds an OLE header to the binary data, so you must strip that header before actually reading the byte[] out to a file. Here's how you do this.
// First strip out the OLE header
const int OleHeaderLength = 78;
byte[] rawData = (byte[])datarow["Field"];                 // cast the column value to byte[]
int strippedDataLength = rawData.Length - OleHeaderLength;
byte[] strippedData = new byte[strippedDataLength];
Array.Copy(rawData, OleHeaderLength, strippedData, 0, strippedDataLength);
Once you run this code, strippedData will contain the actual file data. You can then use MemoryStream or FileStream to perform I/O on the byte[].
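For instance, a rough sketch (paths and names are illustrative) of doing I/O on that byte[]:

// Sketch: strippedData is the byte[] produced above
using (MemoryStream ms = new MemoryStream(strippedData))
{
    // hand ms to anything that expects a Stream
}

// ...or just persist it to disk to check the result
File.WriteAllBytes(@"C:\temp\output.pdf", strippedData);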
Related
I'm trying to save .docx files in a database, and the code shown here is where I'm converting the .docx file into a byte array and then trying to save it into the database.
I'm getting an error
String or binary data would be truncated
I used a column of type varbinary(max) in the database, and the same code is working for PDF and text files but it's not working for .docx.
Please guide me.
Controller:
try
{
    byte[] byteDocument = new byte[0];
    if (file.Length > 0)
    {
        long length = file.Length;
        using var fileStream = file.OpenReadStream();
        byteDocument = new byte[length];
        fileStream.Read(byteDocument, 0, (int)file.Length);
        _attachmentDto = new ReviewAttachmentDto
        {
            ReviewId = reviewId,
            DocumentType = file.ContentType,
            Document = byteDocument
        };
    }
    string requestBody = JsonConvert.SerializeObject(_attachmentDto);
    // Call API
    var _responseObj = await WebAPIHelper.PostDataToAPI(appSettings.SaveUrl, requestBody);
}
Database save:
public void SaveAction(ReviewAttachment reviewAttachment)
{
    Entities.Surveillance.ReviewAttachment reviewAttachmentDB = new Entities.Surveillance.ReviewAttachment();
    reviewAttachmentDB.ReviewId = Int32.Parse(reviewAttachment.ReviewId);
    reviewAttachmentDB.DocumentType = reviewAttachment.DocumentType;
    reviewAttachmentDB.Document = reviewAttachment.Document;
    context.Add(reviewAttachmentDB);
    context.SaveChanges();
}
Since I doubt that your Word doc is over 2 GB in size, I would suggest that you check whether you have reached the size limit of your actual database file. You may need to enable autogrowth on your database. Check the answer here; that person was getting the same error message with a varbinary(max) column.
https://stackoverflow.com/a/11006473/1461269
Also, to find out the implications of enabling this option you can read about it on Microsoft's support site: https://support.microsoft.com/en-ca/help/315512/considerations-for-the-autogrow-and-autoshrink-settings-in-sql-server
I'm trying to read a DAT file with BinaryReader but I get an exception and don't see why. "Unable to read beyond the end of the stream" is the message I get. My code looks like this:
private void button1_Click(object sender, EventArgs e)
{
    OpenFileDialog OpenFileDialog = new OpenFileDialog();
    OpenFileDialog.Title = "Open File...";
    OpenFileDialog.Filter = "Binary File (*.dat)|*.dat";
    OpenFileDialog.InitialDirectory = @"C:\";
    if (OpenFileDialog.ShowDialog() == DialogResult.OK)
    {
        FileStream fs = new FileStream(OpenFileDialog.FileName, FileMode.Open);
        BinaryReader br = new BinaryReader(fs);
        label1.Text = br.ReadString();
        label2.Text = br.ReadInt32().ToString();
        fs.Close();
        br.Close();
    }
}
I had a certain DAT file with a lot of information and was hoping to be able to read it out and maybe even place it in a table and plot the data. But it's been a while since I worked with C#, so if anyone could help me I would highly appreciate it.
BinaryReader is very rarely a good choice for reading an external file; it only really makes sense when used in parallel with code that writes the file using BinaryWriter, since they use the same conventions.
I imagine that what is happening here is that your call to ReadString is trying to use conventions that aren't valid for your file - specifically, it will read one or more bytes as a length prefix for the string (BinaryWriter writes a 7-bit-encoded integer there), and then try to read that many bytes as the string. But if that isn't what the file contents actually are, it could easily read gibberish, interpret it as a huge number, and then fail to read that many bytes.
If you're processing an arbitrary file (nothing to do with BinaryWriter), then you really need to know a lot about the protocol/format. Given that file extensions are ambiguous, I'm not going to infer anything from ".dat" - what matters is: what is the data and where did it come from? Only with that information can a sensible comment on reading it be made.
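To make the pairing point concrete, here's a small hedged round trip where ReadString and ReadInt32 only work because BinaryWriter wrote the matching prefixes (the file name is arbitrary):

// Sketch: BinaryReader understands what BinaryWriter produced, and little else
using (var fs = File.Create("sample.dat"))
using (var bw = new BinaryWriter(fs))
{
    bw.Write("hello"); // writes a length prefix, then the encoded chars
    bw.Write(42);      // writes 4 bytes, little-endian
}

using (var fs = File.OpenRead("sample.dat"))
using (var br = new BinaryReader(fs))
{
    string s = br.ReadString(); // works here because the prefix really is there
    int i = br.ReadInt32();
}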
From the comments, here's some (untested) code that should get you started in terms of parsing the contents as a span:
public static YourResultType Process(string path)
{
    byte[] oversized = null;
    try
    {
        int len, offset = 0, read;
        // read the file into a leased buffer, for simplicity
        using (var stream = File.OpenRead(path))
        {
            len = checked((int)stream.Length);
            oversized = ArrayPool<byte>.Shared.Rent(len);
            while (offset < len &&
                   (read = stream.Read(oversized, offset, len - offset)) > 0)
            {
                offset += read;
            }
        }
        // now process the payload from the buffered data
        return Process(new ReadOnlySpan<byte>(oversized, 0, len));
    }
    finally
    {
        if (oversized is object)
            ArrayPool<byte>.Shared.Return(oversized);
    }
}

private static YourResultType Process(ReadOnlySpan<byte> payload)
    => throw new NotImplementedException(); // your code here
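And purely as an illustration of what the span-based overload might do once the format is known (this layout is invented; substitute your real protocol, and note it needs using System.Buffers.Binary):

// Sketch: hypothetically assumes the file is a 4-byte little-endian record count
// followed by that many 8-byte little-endian doubles
private static YourResultType Process(ReadOnlySpan<byte> payload)
{
    int count = BinaryPrimitives.ReadInt32LittleEndian(payload);
    payload = payload.Slice(4);

    var values = new double[count];
    for (int i = 0; i < count; i++)
    {
        long bits = BinaryPrimitives.ReadInt64LittleEndian(payload.Slice(i * 8));
        values[i] = BitConverter.Int64BitsToDouble(bits);
    }
    return new YourResultType(values); // however your result type is constructed
}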
I've got a pesky problem with GZipStream targeting .NET 3.5. This is my first time working with GZipStream; however, I have modeled it after a number of tutorials (including here) and I'm still stuck.
My app serializes a DataTable to XML and inserts it into a database, storing the compressed data in a varbinary(max) field along with the original length of the uncompressed buffer. Then, when I need it, I retrieve this data, decompress it, and recreate the DataTable. The decompress is what seems to fail.
EDIT: Sadly, after changing GetBuffer to ToArray as suggested, my issue remains. Code updated below.
Compress code:
DataTable dt = new DataTable("MyUnit");
//do stuff with dt

//okay... now compress the table
using (MemoryStream xmlstream = new MemoryStream())
{
    //instead of stream, use xmlwriter?
    System.Xml.XmlWriterSettings settings = new System.Xml.XmlWriterSettings();
    settings.Encoding = Encoding.GetEncoding(1252);
    settings.Indent = false;
    System.Xml.XmlWriter writer = System.Xml.XmlWriter.Create(xmlstream, settings);
    try
    {
        dt.WriteXml(writer);
        writer.Flush();
    }
    catch (ArgumentException)
    {
        //likely an encoding issue... okay, base64 encode it
        var base64 = Convert.ToBase64String(xmlstream.ToArray());
        xmlstream.Write(Encoding.GetEncoding(1252).GetBytes(base64), 0, Encoding.GetEncoding(1252).GetBytes(base64).Length);
    }
    using (MemoryStream zipstream = new MemoryStream())
    {
        GZipStream zip = new GZipStream(zipstream, CompressionMode.Compress);
        log.DebugFormat("Compressing commands...");
        zip.Write(xmlstream.GetBuffer(), 0, xmlstream.ToArray().Length);
        zip.Flush();
        float ratio = (float)zipstream.ToArray().Length / (float)xmlstream.ToArray().Length;
        log.InfoFormat("Resulting compressed size is {0:P2} of original", ratio);
        using (SqlCommand cmd = new SqlCommand())
        {
            cmd.CommandText = "INSERT INTO tinydup (lastid, command, compressedlength) VALUES (@lastid, @compressed, @length)";
            cmd.Connection = db;
            cmd.Parameters.Add("@lastid", SqlDbType.Int).Value = lastid;
            cmd.Parameters.Add("@compressed", SqlDbType.VarBinary).Value = zipstream.ToArray();
            cmd.Parameters.Add("@length", SqlDbType.Int).Value = xmlstream.ToArray().Length;
            cmd.ExecuteNonQuery();
        }
    }
}
Decompress Code:
/* This is an encapsulation of what I get from the database
public class DupUnit
{
    public uint lastid;
    public uint complength;
    public byte[] compressed;
}*/

//I have already retrieved my list of work to do from the database in a List<DupUnit> dupunits
foreach (DupUnit unit in dupunits)
{
    DataSet ds = new DataSet();
    //DataTable dt = new DataTable();
    //uncompress and extract to original datatable
    try
    {
        using (MemoryStream zipstream = new MemoryStream(unit.compressed))
        {
            GZipStream zip = new GZipStream(zipstream, CompressionMode.Decompress);
            byte[] xmlbits = new byte[unit.complength];
            //WHY ARE YOU ALWAYS 0!!!!!!!!
            int bytesdecompressed = zip.Read(xmlbits, 0, unit.compressed.Length);
            MemoryStream xmlstream = new MemoryStream(xmlbits);
            log.DebugFormat("Uncompressed XML against {0} is: {1}", m_source.DSN, Encoding.GetEncoding(1252).GetString(xmlstream.ToArray()));
            try
            {
                ds.ReadXml(xmlstream);
            }
            catch (Exception)
            {
                //it may have been base64 encoded... decode first.
                ds.ReadXml(Encoding.GetEncoding(1254).GetString(
                    Convert.FromBase64String(
                        Encoding.GetEncoding(1254).GetString(xmlstream.ToArray())))
                );
            }
            xmlstream.Dispose();
        }
    }
    catch (Exception e)
    {
        log.Error(e);
        Thread.Sleep(1000); //sleep a sec!
        continue;
    }
}
Note the comment above... bytesdecompressed is always 0. Any ideas? Am I doing it wrong?
EDIT 2:
So this is weird. I added the following debug code to the decompression routine:
GZipStream zip = new GZipStream(zipstream, CompressionMode.Decompress);
byte[] xmlbits = new byte[unit.complength];
int offset = 0;
while (zip.CanRead && offset < xmlbits.Length)
{
    while (zip.Read(xmlbits, offset, 1) == 0) ;
    offset++;
}
When debugging, sometimes that loop would complete, but other times it would hang. When I'd stop the debugging, it would be at byte 1600 out of 1616. I'd continue, but it wouldn't move at all.
EDIT 3: The bug appears to be in the compress code. For whatever reason, it is not saving all of the data. When I try to decompress the data using a third party gzip mechanism, I only get part of the original data.
I'd start a bounty, but I really don't have much reputation to give as of now :-(
Finally found the answer. The compressed data wasn't complete because GZipStream.Flush() does absolutely nothing to ensure that all of the data is out of the buffer - you need to use GZipStream.Close(), as pointed out here. Of course, if you get a bad compress, it all goes downhill: if you try to decompress it, you will always get 0 returned from Read().
I'd say this line, at least, is the most wrong:
cmd.Parameters.Add("#compressed", SqlDbType.VarBinary).Value = zipstream.GetBuffer();
MemoryStream.GetBuffer:
Note that the buffer contains allocated bytes which might be unused. For example, if the string "test" is written into the MemoryStream object, the length of the buffer returned from GetBuffer is 256, not 4, with 252 bytes unused. To obtain only the data in the buffer, use the ToArray method.
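A tiny illustration of the difference (a sketch; assumes the usual System.IO/System.Text usings, and the exact capacity may vary):

// Sketch: GetBuffer() exposes the whole internal buffer, ToArray() only the bytes written
using (var ms = new MemoryStream())
{
    byte[] payload = Encoding.ASCII.GetBytes("test");
    ms.Write(payload, 0, payload.Length);

    Console.WriteLine(ms.GetBuffer().Length); // internal capacity, e.g. 256
    Console.WriteLine(ms.ToArray().Length);   // 4 - just the written data
}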
It should be noted that in the zip format, it first works by locating data stored at the end of the file - so if you've stored more data than was required, the required entries at the "end" of the file don't exist.
As an aside, I'd also recommend a different name for your compressedlength column - I'd initially taken it (despite your narrative) as being intended to store, well, the length of the compressed data (and written part of my answer to address that). Maybe originalLength would be a better name?
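Putting both answers' points together, a hedged sketch of the compress step (illustrative only; xmlBytes stands in for whatever serialized payload you produce):

// Sketch: dispose/close the GZipStream before grabbing the compressed bytes,
// and use ToArray() rather than GetBuffer() for the parameter value
byte[] compressed;
using (var zipstream = new MemoryStream())
{
    using (var zip = new GZipStream(zipstream, CompressionMode.Compress))
    {
        zip.Write(xmlBytes, 0, xmlBytes.Length);
    } // disposing the GZipStream writes the final gzip footer

    // ToArray still works on a closed MemoryStream
    compressed = zipstream.ToArray();
}
// compressed can now be assigned to the varbinary(max) parameter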
I'm not entirely new to programming, but I still see myself as a novice. I'm currently creating an invoicing system with a maximum of 5 line items. That being said, I'm creating a String<> item, serializing it to store and then de-serializing it to display.
So far I've managed the serializing and de-serializing, and from the de-serialized value I've managed to display the relevant information in the correct fields.
My question comes to: HOW do I add the list of items in the String<> object to either a binary or XML field in my SQL table?
I know it should be similar to adding an Image object to binary, but there's a catch there. Usually:
byte[] convertToByte(string sourcePath)
{
    //get the byte file size of image
    FileInfo fInfo = new FileInfo(sourcePath);
    long byteSize = fInfo.Length;

    //read the file using file stream
    FileStream fStream = new FileStream(sourcePath, FileMode.Open, FileAccess.Read);

    //read again as byte using binary reader
    BinaryReader binRead = new BinaryReader(fStream);

    //convert image to byte (already)
    byte[] data = binRead.ReadBytes((int)byteSize);

    return data;
}
This kind of thing is done for an image; however, the whole "long" (file length) part does not apply to the List<> object.
Any assistance would be helpful.
If you simply want to store your data as "readable" text, you can use the varchar(MAX) or nvarchar(MAX) (depending on whether you need extended character support). That translates directly into a string in ADO.NET or EntityFramework.
If all you need are bytes from a string, the Encoding class will do that:
System.Text.Encoding.Default.GetBytes(yourstring);
See: http://msdn.microsoft.com/en-us/library/ds4kkd55%28v=vs.110%29.aspx
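For example, a rough sketch (names and separator are assumptions) of flattening a list of line items to bytes and back:

// Sketch: join the items into one string, then encode/decode it
List<string> lineItems = new List<string> { "Item A", "Item B" };

string joined = string.Join("|", lineItems.ToArray()); // pick a separator that never appears in the data
byte[] bytes = Encoding.UTF8.GetBytes(joined);         // UTF-8 here; Encoding.Default as in the snippet above also works
// ...store bytes in a varbinary column...

string restored = Encoding.UTF8.GetString(bytes);
string[] restoredItems = restored.Split('|');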
A way of saving a binary file in a string is to convert the image to a Base64 string. This can be done with the Convert.ToBase64String (Byte[]) method:
Convert.ToBase64String msdn
string convertImageToBase64(string sourcePath)
{
    //get the byte file size of image
    FileInfo fInfo = new FileInfo(sourcePath);
    long byteSize = fInfo.Length;

    //read the file using file stream
    FileStream fStream = new FileStream(sourcePath, FileMode.Open, FileAccess.Read);

    //read again as byte using binary reader
    BinaryReader binRead = new BinaryReader(fStream);

    //convert image to byte (already)
    byte[] data = binRead.ReadBytes((int)byteSize);

    return Convert.ToBase64String(data);
}
Now you will be able to save the Base64 string in a string field in your database.
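Reading it back is just the reverse (a sketch; LoadBase64FromDatabase is a hypothetical helper standing in for your own data access):

// Sketch: turn the stored Base64 string back into the original file bytes
string base64FromDb = LoadBase64FromDatabase();        // hypothetical
byte[] fileData = Convert.FromBase64String(base64FromDb);
File.WriteAllBytes(@"C:\temp\restored.png", fileData); // or wrap in a MemoryStream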
I'm trying to convert a .db file to binary so I can stream it across a web server. I'm pretty new to C#. I've gotten as far as looking at code snippets online, but I'm not really sure if the code below puts me on the right track. How can I write the data once I read it? Does BinaryReader automatically open up and read the entire file so I can then just write it out in binary format?
class Program
{
    static void Main(string[] args)
    {
        using (FileStream fs = new FileStream("output.bin", FileMode.Create))
        {
            using (BinaryWriter bw = new BinaryWriter(fs))
            {
                long totalBytes = new System.IO.FileInfo("input.db").Length;
                byte[] buffer = null;
                BinaryReader binReader = new BinaryReader(File.Open("input.db", FileMode.Open));
            }
        }
    }
}
Edit: Code to stream the database:
[WebGet(UriTemplate = "GetDatabase/{databaseName}")]
public Stream GetDatabase(string databaseName)
{
    string fileName = "\\\\computer\\" + databaseName + ".db";
    if (File.Exists(fileName))
    {
        FileStream stream = File.OpenRead(fileName);
        if (WebOperationContext.Current != null)
        {
            WebOperationContext.Current.OutgoingResponse.ContentType = "binary/.bin";
        }
        return stream;
    }
    return null;
}
When I call my server, I get nothing back. When I use this same type of method for a content-type of image/.png, it works fine.
All the code you posted will actually do is copy the file input.db to the file output.bin. You could accomplish the same using File.Copy.
BinaryReader will just read in all of the bytes of the file. It is a suitable start to streaming the bytes to an output stream that expects binary data.
Once you have the bytes corresponding to your file, you can write them to the web server's response like this:
using (BinaryReader binReader = new BinaryReader(File.Open("input.db", FileMode.Open)))
{
    byte[] bytes = binReader.ReadBytes(int.MaxValue); // See note below
    Response.BinaryWrite(bytes);
    Response.Flush();
    Response.Close();
    Response.End();
}
Note: The code binReader.ReadBytes(int.MaxValue) is for demonstrating the concept only. Don't use it in production code as loading a large file can quickly lead to an OutOfMemoryException. Instead, you should read in the file in chunks, writing to the response stream in chunks.
See this answer for guidance on how to do that
https://stackoverflow.com/a/8613300/141172
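As a rough illustration of that chunked approach (a sketch in the spirit of the linked answer, not copied from it):

// Sketch: copy the file to the response in fixed-size chunks instead of one huge array
const int chunkSize = 64 * 1024;
byte[] buffer = new byte[chunkSize];

using (FileStream fs = File.OpenRead("input.db"))
{
    int read;
    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        Response.OutputStream.Write(buffer, 0, read);
        Response.Flush(); // push this chunk to the client before reading the next
    }
}
Response.End();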