Deserialize SQL Server image field back in Excel format - c#

I have a SQL Server table that contains serialized Excel files, with 3 fields:
IdDocument -> int (PK)
DataFile -> image
FileName -> nvarchar
where DataFile contains the serialized Excel file and FileName the name of the file (with path).
Something like this:
0xD0CF11E0A1B11AE100.....
U:\SAP_R3V4_Validation_Documents\March2012.xls
Now I need to get these files back in Excel format.
How can I accomplish this?
Using C# console application or SQL Server features could be fine.
Thank you in advance.
Luis

Excel files are binary. The xls format is obsolete, replaced since 2007 (15 years ago) by xlsx, a ZIP package containing XML files. What the question shows is how binary data looks in SSMS, not some kind of serialized format.
BTW the image type is deprecated; it was replaced by varbinary(max) in SQL Server 2005.
In any case, reading binary data is the same as reading any other data. A DbDataReader is used to retrieve the query results and strongly typed methods are used to read specific fields per row. In this particular case GetStream() can be used to retrieve the data as a Stream that can be saved to disk:
using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, con))
{
    con.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // FileName holds the original full path; keep only the file name itself
            var path = reader.GetString(2);
            var finalPath = Path.Combine(root, Path.GetFileName(path));

            // Stream the binary column straight to disk
            using (var stream = reader.GetStream(1))
            using (var fileStream = File.Create(finalPath))
            {
                stream.CopyTo(fileStream);
            }
        }
    }
}
The only thing that's different is the code that reads the field as a stream and saves it to disk:
using (var stream = reader.GetStream(1))
using (var fileStream = File.Create(finalPath))
{
    stream.CopyTo(fileStream);
}
The using blocks ensure the data and file streams are closed even in case of error. The path itself is constructed by combining a root folder with the stored file name, not the full stored path.
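For completeness, a minimal sketch of the inputs the snippet above assumes; the query, root folder and connection string here are illustrative, not taken from the question:
string connectionString = @"Data Source=.;Initial Catalog=MyDb;Integrated Security=True";
// column order matches GetStream(1) and GetString(2) above
string sql = "SELECT IdDocument, DataFile, FileName FROM Documents";
string root = @"C:\ExportedExcelFiles";
Directory.CreateDirectory(root);  // make sure the target folder exists before writing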

Related

I want to back up the data in an XML file, but I couldn't find how to save the newly received data without adding the same data to the existing file

I want to back up the data for today's date in an XML file every 10 minutes. I managed to create the XML file, but I couldn't find how to save the newly received data without duplicating the data that is already in the file.
Can I convert the file I created to a DataSet with DataSet.ReadXml, add the new data I got from the query, and convert it back to an XML file and save it? What method should I use?
string QueryString = "SELECT * FROM dbo.db_records WHERE DAY(datetime) = DAY(CURRENT_TIMESTAMP)";

public void run()
{
    while (true)
    {
        try
        {
            Thread.Sleep(600000);  // wait 10 minutes between snapshots
            if (odbcConnection.State != ConnectionState.Open)
            {
                odbcConnection.Close();
                odbcConnection.Open();
            }
            DataSet dataSet = new DataSet("XMLDB");
            odbcDataAdapter.Fill(dataSet, "#ID");
            if (File.Exists(Path))
            {
                // the file already exists - this is where I don't know how to add
                // only the new data without duplicating what is already saved
            }
            else
            {
                using (FileStream fs = File.Create(Path))
                {
                    dataSet.WriteXml(fs);
                }
            }
        }
        catch (Exception) { }
    }
}
XML is not a great format if you want to append data, since it uses tags that need to be closed. So you have a few options:
Save separate files
Since you seem to fetch data for the current day, just attach date info to your file name. When reading the data you may need to read all files in the folder fitting the pattern and merge them.
Use a format that is trivial to append
If your data model is simple tabular data you may use a .csv file instead. You can add data to it using one of the File.Append* methods (e.g. File.AppendAllLines).
Overwrite all data
Get the complete data you want to save each time, and overwrite any existing data. This is simple, but may be slow if you have lots of data. But if the database is small and grows slowly this might be perfectly fine.
Parse the existing data
You could read the existing file with ReadXml as you suggest, and use DataSet.Merge to merge it with your new set before overwriting the existing file (see the sketch below). This may also be slow, since it needs to process all the data. But it may put less load on the database than fetching all the data from the database each time.
In any case, you might want to periodically save full backups, or have some other way to handle corrupt files. You should also have some way to test the backups. I would also consider using the backup options built into most database engines if that is an alternative.
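A minimal sketch of that last option, reusing the Path, odbcDataAdapter and table name from the question. Note that Merge only collapses duplicate rows if the DataSet has primary keys defined, so the schema is written and read along with the data here:
// Load whatever was saved earlier (if anything), including its schema.
DataSet merged = new DataSet("XMLDB");
if (File.Exists(Path))
{
    merged.ReadXml(Path, XmlReadMode.ReadSchema);
}

// Fetch today's rows and merge them in; rows with matching primary keys are not duplicated.
DataSet current = new DataSet("XMLDB");
odbcDataAdapter.Fill(current, "#ID");
merged.Merge(current);

// Overwrite the file with the combined data, keeping the schema for the next read.
using (FileStream fs = File.Create(Path))
{
    merged.WriteXml(fs, XmlWriteMode.WriteSchema);
}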

zip HUGE csv files in c#

I am working in .NET 4.7.2. We have a List of objects, say MyObject, which is to be converted to a single CSV file.
Currently, using the code below, I create a HUGE CSV file (10 GB and upwards).
using (var writ = new StreamWriter(fileStream, Encoding.UTF8))
{
    using (var csvWrit = new CsvWriter(writ))
    {
        //logic
        csvWrit.NextRecord();
    }
}
ZipFile.CreateFromDirectory(<sourcefileName>, <destFileName>);
Now I need to create a zip of these HUGE files. I found ZipFile.CreateFromDirectory in C#. After the CSV is created I call ZipFile.CreateFromDirectory to create the zip file.
My question: should I continue to first create the CSV and then zip it,
OR
is there a more efficient way to do this?
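As a hedged sketch of one possible alternative: the CSV rows can be written straight into a zip entry, so the uncompressed 10 GB file never has to exist on disk. This assumes the same CsvWriter usage as above plus the System.IO.Compression ZipArchive API; the file names are illustrative:
using (var zipStream = File.Create(destZipPath))
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create))
{
    // Create a single entry inside the zip and stream the CSV rows into it.
    ZipArchiveEntry entry = archive.CreateEntry("data.csv", CompressionLevel.Optimal);
    using (var entryStream = entry.Open())
    using (var writ = new StreamWriter(entryStream, Encoding.UTF8))
    using (var csvWrit = new CsvWriter(writ))
    {
        //logic - same record-writing loop as before
        csvWrit.NextRecord();
    }
}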

Retrieving Compressed files from database

I have a SQL Server 2008 R2 database that has a table storing different types of files (Word docs, PDFs, TIFFs, etc.). I can successfully retrieve these files from the database using the following method:
private void GetFilesFromDatabase()
{
    try
    {
        const string connStr = @"Data Source=localhost\SQLInstance;Initial Catalog=MyData;Integrated Security=True;";
        //Initialize SQL Server connection.
        SqlConnection CN = new SqlConnection(connStr);
        //Initialize SQL adapter.
        SqlDataAdapter ADAP = new SqlDataAdapter("Select ole_id, ole_object From OLE Where ole_id = 21601", CN);
        //Initialize Dataset.
        DataSet DS = new DataSet();
        //Fill dataset with FilesStore table.
        ADAP.Fill(DS, "FilesStore");
        //Get File data from dataset row.
        byte[] FileData = (byte[])DS.Tables["FilesStore"].Rows[0]["ole_object"];
        string FileName = @"C:\Temp\Text.doc";
        //Write file data to selected file.
        using (FileStream fs = new FileStream(FileName, FileMode.Create))
        {
            fs.Write(FileData, 0, FileData.Length);
            fs.Close();
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
    }
}
The file I am retrieving is a Word doc according to other information contained in that particular row. When I attempt to open the file after saving it to disk, the data all appears to be gibberish and cannot be read. I believe these files were compressed prior to being saved to the database, but I don't know how to decompress them so that they can be viewed. Any thoughts on how I might accomplish this? My ultimate goal is to move these files into another database.
Your code to save files looks OK. Assuming your original data was a .DOC file and you got a bad file after saving, you're pretty much out of luck. You may want to look at the content of the file in a binary editor (e.g. Visual Studio's) to confirm that the file is not something obviously different (text/image/...).
You need to ask around how the files were stored in the database. There is no way this can be answered remotely, as the data could be compressed, encrypted, split into chunks or even simply corrupted.
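As a hedged sketch of the "look at the content" suggestion: dump the first few bytes of the retrieved blob and compare them against well-known file signatures (variable names reuse the question's code; the signature list is not exhaustive):
byte[] FileData = (byte[])DS.Tables["FilesStore"].Rows[0]["ole_object"];

// Print the first 8 bytes as hex and compare against known signatures.
string header = BitConverter.ToString(FileData, 0, Math.Min(8, FileData.Length));
Console.WriteLine(header);
// D0-CF-11-E0-A1-B1-1A-E1 -> OLE compound document (classic .doc/.xls), i.e. not compressed
// 50-4B-03-04             -> ZIP container (.docx/.xlsx, or a zipped blob)
// 1F-8B                   -> gzip stream (could be unwrapped with GZipStream)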

How to save a file in a SQL Server database when I have the file path?

I am building a C# desktop application and I need to save a file into a database. I have come up with a file chooser which gives me the correct path of the file. Now my question is how to save that file into the database using its path.
It really depends on the type and size of the file. If it's a text file, then you could use File.ReadAllText() to get a string that you can save in your database.
If it's not a text file, then you could use File.ReadAllBytes() to get the file's binary data, and then save that to your database.
Be careful though: databases are not a great way to store large files (you'll run into performance issues).
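A minimal sketch of that second case, assuming a varbinary(max) column; the table name, column names and connection string are illustrative, not from the question:
byte[] data = File.ReadAllBytes(filePath);

using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO Documents (FileName, DataFile) VALUES (@name, @data)", con))
{
    cmd.Parameters.AddWithValue("@name", Path.GetFileName(filePath));
    cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = data;  // -1 = varbinary(max)
    con.Open();
    cmd.ExecuteNonQuery();
}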
FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read);
BinaryReader br = new BinaryReader(fs);
long numBytes = new FileInfo(fileName).Length;
byte[] buff = br.ReadBytes((int)numBytes);
Then you upload it to the DB like anything else; I assume you are using a varbinary column (BLOB).
So FILESTREAM would be the way to go, but since you're using SQL Server 2005 you will have to read the data into memory, which consumes a lot of resources.
First off, the column type varchar(max) is your friend; it gives you ~2 GB of data to play with, which is pretty big for most uses.
Next, read the data into a byte array and convert it to a Base64 string:
FileInfo _fileInfo = new FileInfo(openFileDialog1.FileName);
if (_fileInfo.Length < 2147483647) // 2147483647 is the max size of the data (~2 GB)
{
    byte[] _fileData = new byte[_fileInfo.Length];
    using (FileStream fs = _fileInfo.OpenRead())
    {
        fs.Read(_fileData, 0, (int)_fileInfo.Length);
    }
    string _data = Convert.ToBase64String(_fileData);
}
else
{
    MessageBox.Show("File is too large for database.");
}
And reverse the process to recover the file:
byte[] _fileData = Convert.FromBase64String(_data);
You'll want to discard those strings as quickly as possible, setting them to string.Empty as soon as you have finished using them, since they hold the whole file in memory!
But if you can, just upgrade to 2008 and use FILESTREAM.
If you're using SQL Server 2008, you could use FILESTREAM; getting-started guides and examples of using this functionality from C# are readily available.
You would need to read the file into a byte array and then store that in a blob field in the database, possibly along with the name you want to give the file and the file type.
You can just reverse the process to get the file back out again.

How to paste CSV data to Windows Clipboard with C#

What I'm trying to accomplish
My app generates some tabular data
I want the user to be able to launch Excel and click "paste" to place the data as cells in Excel
Windows accepts a format called "CommaSeparatedValue" that is used with its APIs, so this seems possible
Putting raw text on the clipboard works, but trying to use this format does not
NOTE: I can correctly retrieve CSV data from the clipboard; my problem is with pasting CSV data to the clipboard.
What I have tried that isn't working
Clipboard.SetText()
System.Windows.Forms.Clipboard.SetText(
    "1,2,3,4\n5,6,7,8",
    System.Windows.Forms.TextDataFormat.CommaSeparatedValue
);
Clipboard.SetData()
System.Windows.Forms.Clipboard.SetData(
    System.Windows.Forms.DataFormats.CommaSeparatedValue,
    "1,2,3,4\n5,6,7,8"
);
In both cases something is placed on the clipboard, but when pasted into Excel it shows up as one cell of garbage text: "–§žý;pC¦yVk²ˆû"
Update 1: Workaround using SetText()
As BFree's answer shows, SetText with TextDataFormat.Text serves as a workaround:
System.Windows.Forms.Clipboard.SetText(
    "1\t2\t3\t4\n5\t6\t7\t8",
    System.Windows.Forms.TextDataFormat.Text
);
I have tried this and can confirm that pasting into Excel and Word now works correctly. In each case it pastes as a table with cells instead of plain text.
I'm still curious why CommaSeparatedValue is not working.
The .NET Framework places DataFormats.CommaSeparatedValue on the clipboard as Unicode text. But as mentioned at http://www.syncfusion.com/faq/windowsforms/faq_c98c.aspx#q899q, Excel expects CSV data to be a UTF-8 memory stream (it is difficult to say whether .NET or Excel is at fault for the incompatibility).
The solution I've come up with in my own application is to place two versions of the tabular data on the clipboard simultaneously as tab-delimited text and as a CSV memory stream. This allows the destination application to acquire the data in its preferred format. Notepad and Excel prefer the tab-delimited text, but you can force Excel to grab the CSV data via the Paste Special... command for testing purposes.
Here is some example code (note that the WPF equivalents of the WinForms clipboard classes are used here):
// Generate both tab-delimited and CSV strings.
string tabbedText = //...
string csvText = //...
// Create the container object that will hold both versions of the data.
var dataObject = new System.Windows.DataObject();
// Add tab-delimited text to the container object as is.
dataObject.SetText(tabbedText);
// Convert the CSV text to a UTF-8 byte stream before adding it to the container object.
var bytes = System.Text.Encoding.UTF8.GetBytes(csvText);
var stream = new System.IO.MemoryStream(bytes);
dataObject.SetData(System.Windows.DataFormats.CommaSeparatedValue, stream);
// Copy the container object to the clipboard.
System.Windows.Clipboard.SetDataObject(dataObject, true);
Use tabs instead of commas, i.e.:
Clipboard.SetText("1\t2\t3\t4\t3\t2\t3\t4", TextDataFormat.Text);
Just tested this myself, and it worked for me.
I have had success pasting into Excel using \t (see BFree's answer) as column separators and \n as row separators.
I had the most success defeating formatting issues by using a CSV library (KBCsv) to write the data into a CSV file in the temp folder and then opening it in Excel with Process.Start(). Once it is in Excel the formatting bit is easier; copy-paste from there.
string filePath = System.IO.Path.GetTempPath() + Guid.NewGuid().ToString() + ".csv";
using (var streamWriter = new StreamWriter(filePath))
using (CsvWriter csvWriter = new CsvWriter(streamWriter))
{
    // optional header
    csvWriter.WriteRecord(new List<string>() { "Heading1", "Heading2", "YouGetTheIdea" });
    csvWriter.ValueSeparator = ',';
    foreach (var thing in YourListOfThings ?? new List<OfThings>())
    {
        if (thing != null)
        {
            List<string> csvLine = new List<string>
            {
                thing.Property1, thing.Property2, thing.YouGetTheIdea
            };
            csvWriter.WriteRecord(csvLine);
        }
    }
}
Process.Start(filePath);
BYO error handling & logging.
