Manually Creating a SQL Server .BAK File from Code - C#

I’m trying to write about 70 million data rows into a SQL Server instance as fast as possible. The bulk inserter is taking too long to write my chunks, so I’m trying to manually create a .BAK file from my C# code to speed up the import to multiple other SQL Server instances.
Is there any documentation about the structure of .BAK files? I've tried to find anything on Google, but all results just show how to export/import .BAK files using SQL.
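For reference, before giving up on the bulk inserter it is worth exhausting the SqlBulkCopy tuning options. A minimal sketch, assuming a hypothetical destination table and an IDataReader over the 70 million source rows:

    using System.Data;
    using System.Data.SqlClient;

    static void BulkLoad(string connectionString, IDataReader sourceReader)
    {
        // TableLock + large batches are the usual first knobs to turn
        // before reaching for anything more exotic than SqlBulkCopy.
        using (var bulk = new SqlBulkCopy(
            connectionString,
            SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.UseInternalTransaction))
        {
            bulk.DestinationTableName = "dbo.TargetTable"; // hypothetical table
            bulk.BatchSize = 100000;    // rows committed per batch
            bulk.BulkCopyTimeout = 0;   // disable the timeout for very large loads
            bulk.WriteToServer(sourceReader);
        }
    }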

Related

How to read a more-than-400 MB Excel file into SQL

How can I import that Excel file into SQL Server? I have a more-than-400 MB Excel file which has two sheets.
I want to know if there is any way to read it in chunks, or parse it to CSV and then import it. I need to know whether this can be done without using OLE DB, OpenXML, or any other kind of automation tools.
I have checked the options with OpenXML and tried to send the data to SQL Server, but without much luck. I have also tried OLE DB, but it seems that OLE DB has its own limitations.
Thanks
If you have SQL Server Integration Services or Data Tools installed, you can use the SQL Server "Import and Export Data" wizard. There you choose "Excel File" as the source and your database as the destination, and then you can define table and column mappings, including the necessary data conversions. You can also define pre- and post-import SQL commands (like dropping and (re)creating the table that you want to import the data into... or simply emptying the table).
As soon as you have the whole import defined, you can save a so-called SSIS (SQL Server Integration Services) package, either into your database or into a file nested in your solution. That *.dtsx file can be opened with the "SQL Server Integration Services Package Execution Utility" to run the import automatically.
So you could run the import at some point in your build process, or even start it in a background process. Just make sure you place your Excel file where you've defined the source for the import to be.
If you have Visual Studio 12 and up, SQL Server Data Tools should be installed automatically. With those in place you can open the *.dtsx files at any time in Visual Studio and modify the import behaviour. I've only recently copied such a .dtsx file because I needed to import an Excel file into 4 different staging databases, so for each database I just modified the database connection credentials in the corresponding copy of the .dtsx file.
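A *.dtsx package saved that way can also be executed straight from C#. A minimal sketch, assuming the Microsoft.SqlServer.ManagedDTS assembly is referenced; the package path is a placeholder:

    using System;
    using Microsoft.SqlServer.Dts.Runtime;

    class PackageRunner
    {
        static void Main()
        {
            // Load the package from disk and run it in-process.
            var app = new Application();
            Package package = app.LoadPackage(@"C:\packages\import.dtsx", null);
            DTSExecResult result = package.Execute();
            Console.WriteLine("Import finished: " + result); // Success / Failure
        }
    }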
I would suggest taking a look at Python + pandas: pandas has a method for reading from Excel into a pandas DataFrame, read_excel(), and one for writing to your SQL database, to_sql().

Importing huge data from text file to SQL using C# WinForms

I want to add the ability to import massive CSV files into a database table. I've built an SSIS package that does this, but I just wanted to make sure this is the correct way of doing it.
The text files are millions of rows with 50 columns. A lot of the time they don't open in Notepad or Notepad++. The SSIS package handles them with no problem and gets everything imported. Is SSIS the right way to deal with this? I just need to pass the file location parameter to the job and execute it, right? Is there an easier way that I am overlooking?
The text files are millions of rows with 50 columns
Small. Why an SSIS package?
They don't open in Notepad or Notepad++
Because both OPEN them, loading the whole file into memory - there is no need to do that; a proper application can read them line by line.
Is SSIS the right way to deal with this?
No. Simply because the title is: Importing huge data from text file to SQL using C# WinForms.
WinForms can do that effectively - I am inserting around 100,000 rows into a database in C# with quite easy coding (apart from some pages of code and a day to get SqlBulkCopy to work properly; see the sketch below) - and you say nothing about any transformations. SSIS is just another unneeded technology AND it makes things complicated (as in: more to install, or having a package on the server but then needing to find a location for the file that the server can reach, etc.).
I am all for SSIS, and if you have a larger SSIS infrastructure or do a lot of processing and it makes architectural sense in a larger context - yes. But as the question stands, without additional reasons - absolutely not.
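A minimal sketch of that approach - StreamReader plus SqlBulkCopy, flushing in chunks so the multi-million-row file is never fully in memory. The table name, the 50-column layout and the naive comma split are assumptions (real CSVs with quoted commas need a proper parser):

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    static void ImportCsv(string csvPath, string connectionString)
    {
        // Buffer rows into a DataTable and flush it in chunks.
        var table = new DataTable();
        for (int i = 0; i < 50; i++)
            table.Columns.Add("col" + i, typeof(string)); // hypothetical schema

        using (var bulk = new SqlBulkCopy(connectionString))
        using (var reader = new StreamReader(csvPath))
        {
            bulk.DestinationTableName = "dbo.Staging"; // hypothetical table
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                table.Rows.Add(line.Split(',')); // naive split, see note above
                if (table.Rows.Count == 10000)
                {
                    bulk.WriteToServer(table);
                    table.Clear();
                }
            }
            if (table.Rows.Count > 0)
                bulk.WriteToServer(table); // flush the tail
        }
    }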
Heck, there is a good chance you can load the whole thing into SQL Server in one command, as SQL Server has some CSV processing capabilities:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
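The BULK INSERT from that link can also be issued from C# as an ordinary command. A minimal sketch; the table name and file path are placeholders, and the path must be visible to the SQL Server service account, not the client:

    using System.Data.SqlClient;

    static void BulkInsertCsv(string connectionString)
    {
        const string sql =
            @"BULK INSERT dbo.Staging
              FROM 'C:\data\import.csv'
              WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0; // large files can exceed the default timeout
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }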

MySQL using bulk SQL statements with C#

I realize that LOAD DATA INFILE can be used to load text/CSV data. However, this seems to be limited to inserts. Is there a way to get MySQL to bulk import SQL files using C#?
My hope is to build up the SQL in a text file and then send it to my connection. Since this process runs daily, I need it automated so that there is no user intervention once the program is started. It just takes too long inserting and updating row by row when it can be done in bulk.
Have you tried MySqlBulkLoader? It's just a C# class that wraps LOAD DATA INFILE.
http://dev.mysql.com/doc/refman/5.1/en/connector-net-programming-bulk-loader.html
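A minimal sketch of MySqlBulkLoader, assuming the MySql.Data connector is referenced; the table and file names are placeholders:

    using System;
    using MySql.Data.MySqlClient;

    static void LoadCsvIntoMySql(string connectionString)
    {
        using (var conn = new MySqlConnection(connectionString))
        {
            conn.Open();
            var loader = new MySqlBulkLoader(conn)
            {
                TableName = "staging",          // hypothetical table
                FileName = @"C:\data\rows.csv", // placeholder path
                FieldTerminator = ",",
                LineTerminator = "\n",
                NumberOfLinesToSkip = 1,        // skip a header row
                Local = true                    // read the file from the client
            };
            int rows = loader.Load(); // wraps LOAD DATA INFILE
            Console.WriteLine(rows + " rows loaded");
        }
    }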

C# app to select data from MySQL and insert into SQL Server (BLOB files)

I want to run a query on a MySQL database and insert the results into a SQL Server 2008 R2 database.
In my table (in the MySQL database) there are multiple columns, and one of them contains a file path. I want to use that path to insert the actual file as a BLOB into my SQL Server.
So all columns from MySQL need to be inserted into SQL Server, plus the actual file as a BLOB.
I can connect to and query the MySQL database and also connect to my SQL Server.
But how can I insert the results? (Some files are very large!)
I found something about OPENROWSET, but I could not find a good example of inserting both the metadata and the file.
I want to write a C# app for this. I appreciate any help.
SQL Server 2008 R2 supports FILESTREAM (http://technet.microsoft.com/en-us/library/bb933993.aspx):
BLOBs can be standard varbinary(max) data that stores the data in tables, or FILESTREAM varbinary(max) objects that store the data in the file system.
If the objects being stored are, on average, larger than 1 MB, you should go with FILESTREAM.
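If plain varbinary(max) is chosen instead of FILESTREAM, very large files can be streamed into the parameter rather than buffered in memory. A minimal sketch, assuming .NET 4.5+ (which added streaming parameter support) and a hypothetical dbo.Files table:

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    static void InsertFileAsBlob(string connectionString, string filePath, string name)
    {
        // Streams the file into a varbinary(max) column so even very
        // large files never have to fit in the app's memory at once.
        const string sql = "INSERT INTO dbo.Files (Name, Data) VALUES (@name, @data)";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        using (var file = File.OpenRead(filePath))
        {
            cmd.Parameters.AddWithValue("@name", name);
            // Size -1 = varbinary(max); a Stream value is streamed, not copied.
            cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = file;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }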
As #Pongsathon.keng mentioned in his response, FILESTREAM is an option. You need to enable your DB to support FILESTREAM, and the user that is writing cannot be a SQL login (keep that in mind). You can also use varbinary(max), as mentioned (the image type will be deprecated).
We decided to go with FILESTREAM so we could also use it in conjunction with full-text indexing. You could also go with the typical "store the file path" approach and keep the actual files / blobs in the file system. FILESTREAM gives you the added benefit of backing up the files with the DB and applying permissions / transactions, etc.
Really depends on what you are doing.
Some good articles on FileStream Here and Here

LINQ: saving files to a database

I want to save PDF and MP3 file(s) to a SQL Server database and be able to retrieve them from it.
I'm still starting out with LINQ and haven't mastered it yet.
You need to convert these into byte arrays (System.Data.Linq.Binary). Loading one takes a single line:
    var myMp3 = new Binary(File.ReadAllBytes(mp3Filename));
If you create your database schema (varbinary in the database) and drag the table over from Server Explorer into the DBML designer, it'll do everything for you.
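For completeness, a minimal end-to-end sketch - here with a hypothetical Track entity mapped via attributes instead of the DBML designer, so the snippet is self-contained; the table and column names are assumptions:

    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.IO;

    // Hypothetical entity mapped to a table with a varbinary(max) column.
    [Table(Name = "Tracks")]
    public class Track
    {
        [Column(IsPrimaryKey = true, IsDbGenerated = true)]
        public int Id;
        [Column]
        public string Name;
        [Column(DbType = "VarBinary(MAX)")]
        public Binary Data;
    }

    public static class TrackSaver
    {
        public static void SaveMp3(string connectionString, string mp3Filename)
        {
            using (var db = new DataContext(connectionString))
            {
                // Read the file into a Binary and let LINQ to SQL do the insert.
                db.GetTable<Track>().InsertOnSubmit(new Track
                {
                    Name = Path.GetFileName(mp3Filename),
                    Data = new Binary(File.ReadAllBytes(mp3Filename))
                });
                db.SubmitChanges();
            }
        }
    }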
To start off with, you're going to need a binary field in your database to save the file to.
Are you using LinqToSql, EntityToSql, or something else? Some more information is needed...
But once you have an object with a byte[] to save the file to, it's just a matter of making the appropriate Save() call... without more information it's hard to say.
Did you google tutorials?
Here is one that I found: Uploading Binary files or Images using LINQ to SQL
It has example code and SQL to generate dummy tables...
