Archiving SQL data into the file system - C#

I have a SQL database and need to archive it into the file system (Excel files).
Will the size of the data increase when I migrate it to the file system?
I am doing the archiving through C# and saving the data into files.
Some experts suggested that the data takes more space in the file system than it does in the SQL database.
Is that right or wrong?
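Since the archiving is being done from C#, here is a minimal sketch of that kind of export, assuming a connection string and table name of my own choosing; it dumps one table to a CSV file (which Excel can open) using plain ADO.NET:

    using System;
    using System.Data.SqlClient;
    using System.IO;

    class TableArchiver
    {
        static void Main()
        {
            // Placeholder connection string and table name -- adjust to your environment.
            const string connectionString = @"Server=.\SQLEXPRESS;Database=MyDb;Integrated Security=true";
            const string table = "dbo.Orders";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT * FROM " + table, connection))
            using (var writer = new StreamWriter("Orders.csv"))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    // Header row: column names.
                    var names = new string[reader.FieldCount];
                    for (int i = 0; i < reader.FieldCount; i++)
                        names[i] = reader.GetName(i);
                    writer.WriteLine(string.Join(",", names));

                    // Data rows: every value written as quoted text.
                    while (reader.Read())
                    {
                        var values = new string[reader.FieldCount];
                        for (int i = 0; i < reader.FieldCount; i++)
                            values[i] = "\"" + Convert.ToString(reader[i]).Replace("\"", "\"\"") + "\"";
                        writer.WriteLine(string.Join(",", values));
                    }
                }
            }
        }
    }

As for size, text files usually do come out somewhat larger than the same rows in SQL Server, because compact types such as int, datetime and decimal become longer character strings once written as text.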

Related

SqlPackage import maximum db size

I want to import a database from a .bacpac file to a SQL Server in Azure. I read the documentation here: https://learn.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-import?view=sql-server-ver15
It says there is a property called DatabaseMaximumSize=(INT32). I want to know whether there is a limit to the database size SqlPackage can handle. For example, if I have 8 GB of RAM available, will SqlPackage be able to load .bacpacs larger than that, i.e. does it avoid loading the whole file into memory?
SqlPackage.exe is not constrained by your system RAM; it will export and import databases much larger than the available memory.
When you import a bacpac file using SqlPackage.exe into Azure and the target database has not been created beforehand, a database is created on the target server with a maximum size of only 32 GB by default.
So if the database you exported as a bacpac was larger than 32 GB, you should create the target database (and keep it empty) before importing the bacpac file. Also, the size of the bacpac is not indicative of the actual database size; a bacpac is highly compressed and bears little relation to the size of the database itself.
As for the import parameter /p:DatabaseMaximumSize, I believe it is there to specify the maximum size limit of your target database.
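For reference, the import invocation with that property might look something like the sketch below; the server, database, credentials, and file path are placeholders, not values from the question:

    SqlPackage.exe /Action:Import /SourceFile:"C:\backups\mydb.bacpac" ^
        /TargetServerName:"myserver.database.windows.net" /TargetDatabaseName:"MyDb" ^
        /TargetUser:"admin" /TargetPassword:"<password>" ^
        /p:DatabaseMaximumSize=100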

Manually creating a SQL Server .BAK file from code

I'm trying to write about 70 million data rows into a SQL Server as fast as possible. The bulk inserter is taking too long to write my chunks, so I'm trying to manually create a .BAK file from my C# code to speed up the import to multiple other SQL Servers.
Is there any documentation about the structure of .BAK files? I've tried to find anything on Google, but all results just show how to export/import .BAK files using SQL.
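For context, a minimal sketch of the SqlBulkCopy path referred to above as the bulk inserter, with the options that usually matter most for throughput; the connection string, target table name, and source DataTable are assumptions, not details from the question:

    using System.Data;
    using System.Data.SqlClient;

    class BulkLoader
    {
        static void BulkWrite(DataTable chunk, string connectionString)
        {
            // TableLock allows minimally logged inserts under the simple or bulk-logged recovery model.
            using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
            {
                bulk.DestinationTableName = "dbo.TargetTable"; // hypothetical target table
                bulk.BatchSize = 50000;                        // rows sent to the server per batch
                bulk.BulkCopyTimeout = 0;                      // no timeout for very large loads
                bulk.WriteToServer(chunk);                     // streams the rows of this chunk
            }
        }
    }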

Exporting BLOB data from a SQLite table into a CSV file (or another format) without corrupting binary data

I've tried exporting a table which has 5 TEXT columns and 1 BLOB column; this last column is used for storing images.
But when I export the database table into a CSV file (using the software SQLite Database Browser), the information in this column is converted to TEXT as well. (Perhaps I was doing it wrong.)
How can I export BLOB data from my SQLite table to a SQL Server table?
I will be very thankful if you can help me. (I'm programming a C# application, but that doesn't really matter, because I'm using SQLite Database Browser 2.0 b1.exe to view the data.)
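Rather than going through CSV, one way to move the BLOB column directly is ADO.NET on both sides. A minimal sketch, assuming the System.Data.SQLite provider, made-up table and column names, and a varbinary(max) column on the SQL Server side:

    using System.Data;
    using System.Data.SqlClient;     // SQL Server provider
    using System.Data.SQLite;        // SQLite provider (System.Data.SQLite package)

    class BlobCopier
    {
        static void CopyImages(string sqliteConnStr, string sqlServerConnStr)
        {
            using (var source = new SQLiteConnection(sqliteConnStr))
            using (var target = new SqlConnection(sqlServerConnStr))
            {
                source.Open();
                target.Open();

                using (var select = new SQLiteCommand("SELECT Id, Image FROM Pictures", source))
                using (var reader = select.ExecuteReader())
                while (reader.Read())
                {
                    // The BLOB column comes back as a byte array, unchanged.
                    byte[] image = (byte[])reader["Image"];

                    using (var insert = new SqlCommand(
                        "INSERT INTO dbo.Pictures (Id, Image) VALUES (@id, @image)", target))
                    {
                        insert.Parameters.AddWithValue("@id", reader["Id"]);
                        insert.Parameters.Add("@image", SqlDbType.VarBinary, -1).Value = image;
                        insert.ExecuteNonQuery();
                    }
                }
            }
        }
    }

Exporting to CSV converts everything to text, which is what corrupts the binary column; keeping the bytes as a byte[] end to end avoids that.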

How to save binary data back as a file from a SQL table

I am using a Microsoft SQL Server database file (a file with the .mdf extension) as the database.
I managed to save files into a SQL table as varbinary(max). Now my question is: how do I get the files back?
I am thinking I still need to use a SELECT statement to get the file from the SQL table, but how do I save the file to my hard disk from its binary format? What are the steps to do that?
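A minimal sketch of the read-back side, assuming a table and column name of my own choosing: SELECT the varbinary(max) value and write the bytes to disk unchanged.

    using System.Data.SqlClient;
    using System.IO;

    class FileReadBack
    {
        static void SaveFileFromDb(string connectionString, int documentId, string outputPath)
        {
            const string sql = "SELECT Content FROM dbo.Documents WHERE Id = @id"; // hypothetical table/column

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@id", documentId);
                connection.Open();

                // The varbinary(max) value arrives as a byte[]; writing it out unchanged
                // reproduces the original file.
                byte[] bytes = (byte[])command.ExecuteScalar();
                File.WriteAllBytes(outputPath, bytes);
            }
        }
    }

For files too large to buffer in memory, SqlDataReader with CommandBehavior.SequentialAccess and GetBytes lets you stream the column out in chunks instead.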

C# app to select data from MySQL and insert into SQL Server (BLOB files)

I want to run a query on a MySQL database and insert the results into a SQL Server 2008 R2 database.
In my table (MySQL database) there are multiple columns, and one of them contains a file path. I want to use that path to insert the actual file as a BLOB into my SQL Server.
So all columns from MySQL need to be inserted into SQL Server, plus the actual file as a BLOB.
I can connect to and query the MySQL database, and I can also connect to my SQL Server.
But how can I insert the results? (Some files are very large!)
I found something about OPENROWSET, but I could not find a good example that inserts both the metadata and the file.
I want to write a C# app for this. I appreciate any help.
SQL Server 2008 R2 supports FILESTREAM (http://technet.microsoft.com/en-us/library/bb933993.aspx).
BLOBs can be stored either as standard varbinary(max) data, which keeps the data in tables, or as FILESTREAM varbinary(max) objects, which store the data in the file system.
If the objects being stored are, on average, larger than 1 MB, you should go with FILESTREAM.
As @Pongsathon.keng mentioned in his response, FILESTREAM is an option. You need to enable your database to support FILESTREAM, and the user doing the writing cannot be a SQL login (keep that in mind). You can also use varbinary(max), as mentioned (the image data type will be deprecated).
We decided to go with FILESTREAM so we could also use it in conjunction with Full-Text Indexing. You could also go with the typical "store the file path" approach and keep the actual files/blobs in the file system. FILESTREAM gives you the added benefit of backing up the files with the database and applying permissions, transactions, etc.
It really depends on what you are doing.
Some good articles on FILESTREAM: Here and Here.
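If you go with the plain varbinary(max) option described above rather than FILESTREAM, the copy loop could look roughly like the following; the MySql.Data provider, the table and column names, and reading each file fully into memory are assumptions (the last only works while the files fit in RAM):

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;
    using MySql.Data.MySqlClient;    // MySQL Connector/NET provider

    class MySqlToSqlServerCopier
    {
        static void CopyRowsWithFiles(string mySqlConnStr, string sqlServerConnStr)
        {
            using (var mysql = new MySqlConnection(mySqlConnStr))
            using (var sqlserver = new SqlConnection(sqlServerConnStr))
            {
                mysql.Open();
                sqlserver.Open();

                using (var select = new MySqlCommand("SELECT Id, Name, FilePath FROM documents", mysql))
                using (var reader = select.ExecuteReader())
                while (reader.Read())
                {
                    string path = (string)reader["FilePath"];
                    byte[] fileBytes = File.ReadAllBytes(path);   // the actual file the row points to

                    using (var insert = new SqlCommand(
                        "INSERT INTO dbo.Documents (Id, Name, Content) VALUES (@id, @name, @content)",
                        sqlserver))
                    {
                        insert.Parameters.AddWithValue("@id", reader["Id"]);
                        insert.Parameters.AddWithValue("@name", reader["Name"]);
                        insert.Parameters.Add("@content", SqlDbType.VarBinary, -1).Value = fileBytes;
                        insert.ExecuteNonQuery();
                    }
                }
            }
        }
    }

On .NET 4.5 or later, SqlClient also accepts a Stream as the value of a varbinary(max) parameter, which avoids buffering the very large files in memory.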
