MySQL using bulk SQL statements with C#

I realize that LOAD DATA INFILE can be used to load text/CSV data, but it seems to be limited to inserts. Is there a way to get MySQL to bulk-import SQL files using C#?
My hope is to build up the SQL in a text file and then send it to my connection. Since this process runs daily, it needs to be automated so that no user intervention is required once the program starts. Inserting and updating row by row simply takes too long when it could be done in bulk.

Have you tried MySqlBulkLoader? It's a C# class that wraps LOAD DATA INFILE.
http://dev.mysql.com/doc/refman/5.1/en/connector-net-programming-bulk-loader.html
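A hedged sketch of how MySqlBulkLoader might be used from Connector/NET; the connection string, file path, and table name are placeholders for your own values:

```csharp
using MySql.Data.MySqlClient;

// Assumes a CSV file produced by the daily job and an existing target table;
// all names here are illustrative, not prescribed.
var conn = new MySqlConnection("server=localhost;uid=user;pwd=pass;database=mydb");
conn.Open();

var loader = new MySqlBulkLoader(conn)
{
    TableName = "my_table",          // target table (must already exist)
    FileName = @"C:\data\daily.csv", // file built up by the daily process
    FieldTerminator = ",",
    LineTerminator = "\n",
    NumberOfLinesToSkip = 1          // skip the header row, if present
};

int rowsInserted = loader.Load();    // executes LOAD DATA INFILE under the hood
conn.Close();
```

Note that, like LOAD DATA INFILE itself, this only does inserts; updates would still need a separate pass (for example, loading into a staging table and running an UPDATE ... JOIN afterwards).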

Related

How to copy large datasets from one data source to another in Oracle (SQL Developer) / MS SQL Server using C#?

Using C#,
I have tried BulkCopy and BulkInsert in batches of 5,000. But for 300 million rows it still takes a lot of time and memory, and eventually throws an OutOfMemoryException.
For bulk copying:
We get data from source.
We upload it to target.
Is there a way to copy from one data source to another into a table with the same name without loading any data? (If the table does not exist, create a new one.)
(Without loading data, meaning: without fetching the data into the application first.)
I want to copy all the rows directly from source to target through code in C#.
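One likely cause of the OutOfMemoryException is buffering each batch into a DataTable. SqlBulkCopy can instead consume an IDataReader, so rows stream from source to target without ever being fully materialized in memory. A hedged sketch, with placeholder connection strings and table names:

```csharp
using System.Data.SqlClient;

// Stream rows from source to target without buffering them all in memory.
// Note: the destination table must already exist; SqlBulkCopy does not create it.
using (var source = new SqlConnection("source-connection-string"))
using (var target = new SqlConnection("target-connection-string"))
{
    source.Open();
    target.Open();

    var cmd = new SqlCommand("SELECT * FROM dbo.BigTable", source);
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(target))
    {
        bulk.DestinationTableName = "dbo.BigTable";
        bulk.BatchSize = 5000;        // commit in batches, as in the question
        bulk.BulkCopyTimeout = 0;     // disable the timeout for very large copies
        bulk.WriteToServer(reader);   // streams directly from the reader
    }
}
```

Memory usage then stays roughly constant regardless of row count, because only one buffered batch is in flight at a time.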
You can use SSIS to extract, transform, and load data from any source to another. Install SQL Server Data Tools from
https://learn.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-ver16
Then you can create an SSIS package, deploy it to SQL Server, and use it in your application.
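If the package is deployed to the SSISDB catalog, it can be started from application code via the catalog stored procedures. A hedged sketch; the folder, project, and package names are placeholders for whatever you deploy:

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumes a package already deployed to the SSISDB catalog; all names are illustrative.
using (var conn = new SqlConnection("Server=.;Database=SSISDB;Integrated Security=true"))
{
    conn.Open();

    // Create an execution instance for the deployed package.
    var create = new SqlCommand("[catalog].[create_execution]", conn)
    {
        CommandType = CommandType.StoredProcedure
    };
    create.Parameters.AddWithValue("@folder_name", "MyFolder");
    create.Parameters.AddWithValue("@project_name", "MyProject");
    create.Parameters.AddWithValue("@package_name", "CopyData.dtsx");
    var execId = create.Parameters.Add("@execution_id", SqlDbType.BigInt);
    execId.Direction = ParameterDirection.Output;
    create.ExecuteNonQuery();

    // Kick off the execution.
    var start = new SqlCommand("[catalog].[start_execution]", conn)
    {
        CommandType = CommandType.StoredProcedure
    };
    start.Parameters.AddWithValue("@execution_id", (long)execId.Value);
    start.ExecuteNonQuery();
}
```

start_execution returns immediately; if you need to wait for completion you can poll the catalog.executions view for the execution's status.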

Manually creating a SQL Server .BAK file from code

I'm trying to write about 70 million data rows into a SQL Server as fast as possible. The bulk inserter is taking too long to write my chunks, so I'm trying to manually create a .BAK file from my C# code to speed up the import into multiple other SQL Servers.
Is there any documentation about the structure of .BAK files? I've tried to find anything on Google, but all results just show how to export/import .BAK files using SQL.

How to read a 400+ MB Excel file into SQL Server

How can I import that Excel file into SQL Server? I have an Excel file larger than 400 MB, which has two sheets.
I want to know if there is any way to read it in chunks, or to parse it to CSV and then import it. I need to know if it can be done without using OLE DB, OpenXML, or any other kind of automation tools.
I have checked the options with OpenXML and tried to send the data to SQL Server without much luck, and OLE DB seems to have its own limitations.
Thanks
If you have sql server integration services or data tools installed you can use the Sql Server Wizard to "Import and Export Data". There you choose "Excel File" as Source, your Database as Destination and then you can define table and column mappings including the necessary data conversions. You can also define pre- and post-import sql commands (like drop and (re)create the table that you want to import the data into... or simply empty the table).
As soon as you have the whole import defined, you can save it as a so-called SSIS (SQL Server Integration Services) package, either into your database or into a file nested in your solution. That *.dtsx file can be opened with the "SQL Server Integration Services Package Execution Utility" to automatically run the import.
So you could run the import at some point in your build process, or even start it in a background process. Just make sure you place your Excel file where you've defined the source for the import to be.
If you have Visual Studio 12 and up, SQL Server Data Tools should be installed automatically. With those in place you can open the *.dtsx files at any time in Visual Studio and modify the import behaviour. I recently copied such a .dtsx file because I needed to import an Excel file into 4 different staging databases; for each database I just modified the corresponding copy's database connection credentials.
I would suggest taking a look at Python + pandas: pandas has a method for reading from Excel into pandas DataFrames, read_excel(), and one to write to your SQL database, to_sql().
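The pandas route can be sketched like this. An in-memory SQLite database stands in for SQL Server so the example is self-contained; in the real job you would read the spreadsheet with read_excel() and point to_sql() at a SQLAlchemy engine for your server:

```python
import sqlite3
import pandas as pd

# In the real job you would start from the spreadsheet, e.g.:
#   df = pd.read_excel("big_file.xlsx", sheet_name="Sheet1")
# Here a small DataFrame stands in for the Excel data so the sketch runs anywhere.
df = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})

# to_sql also works against SQL Server via a SQLAlchemy engine;
# SQLite is used here only to keep the example dependency-free.
conn = sqlite3.connect(":memory:")
df.to_sql("staging", conn, index=False, if_exists="replace")

rows = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(rows)  # -> 3
```

For a 400 MB file you would likely combine this with to_sql's chunksize parameter so the rows are written in batches rather than all at once.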

Is it possible to write data into csv files and then use SSRS reports with SQL queries to view reports?

All of my data is in CSV files, and I wanted to ask if it is possible to display the data as SSRS reports. Is it possible to achieve this without extracting the data from the CSV files and dumping it into a SQL database?
Possible? Certainly.
As easy as you hope it will be? Probably not.
There is no out-of-the-box connection from SSRS to .csv files, so you'll need to get the data into SSRS somehow. If you are doing client-side SSRS, this means creating a dataset from the .csv. If you are using a server-based report, then you need to create a Data Processing Extension. They aren't too hard to write, but you definitely need to be familiar with the .NET language of your choice.
If the task of reporting on .csv files were given to me, I would implement an SSIS package to import the files into a SQL database, then write the report against the database. This would provide better performance and more flexibility, and would also be quicker to implement than reporting directly on the .csv files.
Have you tried using your CSV file as an ODBC data source within your report?

SqlBulkCopy equivalent in MySql?

I am more used to SQL Server, and I'd like to know if MySQL has something analogous to SqlBulkCopy. I have a data editor written in WPF/Silverlight and C# that uses Connector/NET to connect to a MySQL server, and I need to provide a function inside the editor to make a full backup of some MySQL databases. I have no direct access to the server, so I cannot use dump or other command-line tools.
What would then be the best way of dumping a whole database using only C# code via the Connector? Should I just massively export data into XML, CSV, or the like with my own SQL queries, or is there a suggested way for such tasks?
Sounds like MySQL LOAD DATA might be what you are looking for. See the docs here.
You could also use OUTFILE (from the same doc page), for example:
SELECT * INTO OUTFILE 'somefile.txt'
FIELDS TERMINATED BY ','
FROM your_table;
SELECT INTO OUTFILE is the standard way to do this, but it can only dump data to the server's local file system. If you don't have access to the server, you won't be able to access the dump files.
I'd suggest a combination of SHOW CREATE TABLE XXXX to retrieve the table SQL, and some form of SELECT * FROM XXXX to retrieve the data. The Maatkit tools might be useful as a reference; you could figure out what they're doing and copy the SQL:
http://www.maatkit.org/doc/mk-archiver.html
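That SHOW CREATE TABLE plus SELECT approach can be sketched with Connector/NET. This is an illustrative outline only, assuming string-convertible column values; a real dump tool needs proper type handling (numbers, dates, BLOBs) rather than quoting everything:

```csharp
using System.Text;
using MySql.Data.MySqlClient;

// Illustrative sketch: dump one table's schema and rows as SQL text.
static string DumpTable(MySqlConnection conn, string table)
{
    var sb = new StringBuilder();

    // Schema: SHOW CREATE TABLE returns (table name, CREATE statement).
    using (var cmd = new MySqlCommand($"SHOW CREATE TABLE `{table}`", conn))
    using (var reader = cmd.ExecuteReader())
    {
        if (reader.Read())
            sb.AppendLine(reader.GetString(1) + ";");
    }

    // Data: emit one INSERT per row (real code must handle types, not just strings).
    using (var cmd = new MySqlCommand($"SELECT * FROM `{table}`", conn))
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            var values = new string[reader.FieldCount];
            for (int i = 0; i < reader.FieldCount; i++)
                values[i] = reader.IsDBNull(i)
                    ? "NULL"
                    : "'" + MySqlHelper.EscapeString(reader.GetValue(i).ToString()) + "'";
            sb.AppendLine($"INSERT INTO `{table}` VALUES ({string.Join(", ", values)});");
        }
    }
    return sb.ToString();
}
```

Running this for each table listed by SHOW TABLES would give a crude logical backup that can be replayed with plain SQL, all without server file-system access.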
