I want to make a MySQL database converter that can copy the data from database A to database B.
Both databases have the same tables and columns.
How can I make this using C#?
Can anyone show me the flow or how to do it?
Try the following steps:
Execute the query to retrieve the data from database A;
For every record retrieved, insert a new record into database B.
This mechanism can be automated by generating the queries and inserts. To retrieve the tables and fields of a MySQL database, refer to the INFORMATION_SCHEMA tables in MySQL. Using the information retrieved from these tables, you can construct the required queries and migrate an entire database automatically.
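A minimal C# sketch of that idea, assuming the MySql.Data connector and that both databases share the same schema (connection strings and the schema name 'db_a' are placeholders):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using MySql.Data.MySqlClient;

class DbCopier
{
    static void Main()
    {
        // Placeholder connection strings -- adjust for your servers.
        using var srcConn = new MySqlConnection("server=hostA;database=db_a;uid=user;pwd=pass");
        using var dstConn = new MySqlConnection("server=hostB;database=db_b;uid=user;pwd=pass");
        srcConn.Open();
        dstConn.Open();

        // 1. List the tables of database A from INFORMATION_SCHEMA.
        var tables = new List<string>();
        using (var cmd = new MySqlCommand(
            "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = 'db_a'", srcConn))
        using (var reader = cmd.ExecuteReader())
            while (reader.Read())
                tables.Add(reader.GetString(0));

        // 2. For each table, read every row from A and insert it into B.
        foreach (var table in tables)
        {
            using var select = new MySqlCommand($"SELECT * FROM `{table}`", srcConn);
            using var reader = select.ExecuteReader();
            while (reader.Read())
            {
                // Build "INSERT INTO table VALUES (@p0, @p1, ...)" from the row.
                var names = string.Join(", ",
                    Enumerable.Range(0, reader.FieldCount).Select(i => $"@p{i}"));
                using var insert = new MySqlCommand(
                    $"INSERT INTO `{table}` VALUES ({names})", dstConn);
                for (int i = 0; i < reader.FieldCount; i++)
                    insert.Parameters.AddWithValue($"@p{i}", reader.GetValue(i));
                insert.ExecuteNonQuery();
            }
        }
    }
}
```

Row-by-row inserts are the simplest form; for large tables you would batch the inserts or use a bulk-load mechanism instead.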
Summary
I have a requirement to modify the contents of a database based on an input .txt file that modifies thousands of records, each one being a business transaction (around 50 transactions will be performed daily).
My application will read that .txt file and perform the modifications to the data in a SQL Server database.
The current application imports the data from the database, performs the modifications in memory (in a DataTable), and then pushes the result back to a SQL Server 2008 table using SqlBulkCopy.
Does anyone know of a way to use SqlBulkCopy while preventing duplicate rows, without a primary key? Or any suggestions for a different way to do this?
Already implemented and dropped for performance reasons: before this I was generating SQL statements automatically for the data modifications, but it was really slow, so I thought of loading the complete database table into a DataTable in C#, performing the lookups and modifications in memory, and accepting the changes there.
I have one more approach to implement; please give me some feedback on it and correct me if I am wrong.
Steps
Load the database table into a C# DataTable (fill the DataTable using a SqlDataAdapter).
Once the DataTable is in memory, perform the data modifications on it.
Load the base table from the database again, compare it in memory, and prepare the
non-existing records.
Finally, push those records to the database using a bulk insert.
I can't have a primary key!
Please give me any suggestions on my workflow, and whether I am taking the right approach to my problem.
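The "compare in memory, insert only new records" step can be sketched without a primary key by building a set of composite keys from the existing rows. This is only an illustration; the column names "ColA"/"ColB" are placeholders for whichever columns identify a record in your table:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

class Dedupe
{
    // Builds a composite key from the columns that identify a record.
    // "ColA" and "ColB" are placeholders for your identifying columns.
    static string KeyOf(DataRow row) => $"{row["ColA"]}|{row["ColB"]}";

    static DataTable NewRowsOnly(DataTable existing, DataTable modified)
    {
        var seen = new HashSet<string>();
        foreach (DataRow row in existing.Rows)
            seen.Add(KeyOf(row));

        var result = modified.Clone();          // same schema, no rows
        foreach (DataRow row in modified.Rows)
            if (seen.Add(KeyOf(row)))           // true only the first time a key appears
                result.ImportRow(row);
        return result;
        // The returned table can then be pushed with SqlBulkCopy.WriteToServer(result).
    }
}
```

Note that this also drops duplicates within the modified set itself, since each key is admitted only once.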
To explain: I have a database, db_Name. Over the course of development of my program, the structure of this database has changed through the use of MySQL Workbench. Unfortunately, the hard-coded construction of the database in my program has not been kept up to date.
Is there a way to get back the code for constructing the latest version of my database?
To explain: I'm looking for this code:
CREATE TABLE Table_Name_1(Tbl_Id CHAR(25) NOT NULL AUTO_INCREMENT, Field_1 CHAR(25) UNIQUE, Field_2 INT, PRIMARY KEY(Tbl_Id));
Table_Name_2(..........
etc.
If you need the SQL for the database structure, you can back up your database to a *.sql file without the table data.
You can do that in MySQL Workbench.
Step-by-step guide:
How to take MySQL database backup using MySQL Workbench?
You can also use the SHOW CREATE TABLE TableName SQL statement to return a table's structure as a result set.
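From C#, the second approach can be sketched like this, assuming the MySql.Data connector (the connection string is a placeholder):

```csharp
using System;
using MySql.Data.MySqlClient;

class SchemaDump
{
    static void Main()
    {
        using var conn = new MySqlConnection(
            "server=localhost;database=db_Name;uid=user;pwd=pass");
        conn.Open();

        // SHOW CREATE TABLE returns two columns: the table name
        // and the full CREATE TABLE statement for it.
        using var cmd = new MySqlCommand("SHOW CREATE TABLE Table_Name_1", conn);
        using var reader = cmd.ExecuteReader();
        if (reader.Read())
            Console.WriteLine(reader.GetString(1));   // the up-to-date CREATE code
    }
}
```

Running this per table (the names can come from INFORMATION_SCHEMA.TABLES) gives you the current construction code for the whole database.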
So I have two systems I often have to join together to display certain reports. I've got one system that stores metadata on documents that is stored in SQL Server, usually by part number. The list of part numbers I would want to get documents for come from an Oracle table in our ERP system. My current method goes like this:
Get data from ERP (Oracle) system into a DataTable.
Compile string[] of part numbers from a column.
Use an IN() statement to get all document information from docs (MSSQLSVR) system into another DataTable.
Add columns to ERP DataTable, loop through rows.
Fill out document info from docs DataTable, if(erpRow["ITEMNO"] == docRow["ITEMNO"])
This, to me, feels really inefficient. Now obviously I can't use one connection string to JOIN the two tables, or use a database link, so I assume there will have to be two calls, one to each database. Is there another way to join these two sets together?
I would suggest a Linked Server approach (http://msdn.microsoft.com/en-us/library/ms188279.aspx). Write a stored procedure on the SQL Server side that pulls the data over from an Oracle linked server, does the JOIN locally, and returns the combined data.
SQL Server has been designed to execute JOINs efficiently. No need to try to recreate that functionality in the app layer.
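Such a procedure might look roughly like this; the linked server name ORACLE_ERP, the Oracle schema ERPUSER, and both table names are hypothetical:

```sql
-- Four-part naming for an Oracle linked server omits the catalog:
-- LinkedServer..SCHEMA.TABLE
CREATE PROCEDURE dbo.GetDocsWithErpData
AS
BEGIN
    SELECT d.ITEMNO, d.DocTitle, e.DESCRIPTION
    FROM dbo.Documents AS d
    INNER JOIN ORACLE_ERP..ERPUSER.ITEMS AS e
        ON d.ITEMNO = e.ITEMNO;
END
```

The C# side then makes a single call to the procedure instead of two queries and a manual loop.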
Since you've ruled out a database link, I would do the following:
Get data from ERP (Oracle) system into a DataTable.
Pass DataTable as a Parameter to SQL Server via a Table-Valued Parameter
Return your data (no loops updating an older set)
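A sketch of the table-valued parameter call in C#. The table type dbo.PartNumberList, the procedure dbo.GetDocsForParts, and the column names are assumptions; you would create them on the SQL Server side first:

```csharp
using System.Data;
using System.Data.SqlClient;

class TvpExample
{
    // Passes the ERP rows to SQL Server as a table-valued parameter.
    // Assumes a user-defined table type dbo.PartNumberList(ItemNo varchar(50))
    // and a procedure dbo.GetDocsForParts(@Parts dbo.PartNumberList READONLY)
    // that does the JOIN against the documents table and returns the result.
    static DataTable GetJoinedReport(DataTable erpParts, string connectionString)
    {
        using var conn = new SqlConnection(connectionString);
        using var cmd = new SqlCommand("dbo.GetDocsForParts", conn)
        {
            CommandType = CommandType.StoredProcedure
        };
        var p = cmd.Parameters.AddWithValue("@Parts", erpParts);
        p.SqlDbType = SqlDbType.Structured;       // marks it as a TVP
        p.TypeName = "dbo.PartNumberList";

        var result = new DataTable();
        using var adapter = new SqlDataAdapter(cmd);
        adapter.Fill(result);                     // the JOIN happens server-side
        return result;
    }
}
```

This keeps the JOIN in SQL Server, where it belongs, while the Oracle data still arrives through your existing DataTable.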
We use LINQ to SQL in our project. One of the tables is "Users" used in every action in the project.
Recently we were asked to add an "IsDeleted" column to the table and to take that column into account in every LINQ to SQL query that fetches data.
We don't want to add "WHERE IsDeleted = false" to all queries.
Is it possible to intercept LINQ after the data is fetched but before it is passed on to the rest of the code?
This can be solved by C# but it would really be the wrong tool for the job.
Create a view in the database that includes this condition and only work with the view from now on. You can even enforce this by no longer granting privileges on the table.
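A sketch of such a view; the view name is hypothetical:

```sql
-- Hypothetical view name; map your LINQ to SQL entity to this view
-- instead of the Users table, then revoke direct access to the table.
CREATE VIEW dbo.ActiveUsers
AS
SELECT *            -- or an explicit column list
FROM dbo.Users
WHERE IsDeleted = 0;
```

Pointing the "Users" entity in the .dbml at dbo.ActiveUsers means every existing query picks up the filter without any code changes.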
I'm writing a program in C# that will grab data from a staging table, and then insert that same data back into their new respective locations in a SQL Server database. The program will do the following steps sequentially:
Select columns from first row of staging table
Store each column as unique variable
Insert data into new respective locations in the database (each value is going to multiple different tables in the DB, and the values are duplicated between many of the tables)
Move to the next Record
Repeat from step 1 until all records have been processed
So is there a way to iterate through the entire record set, storing each column's value in a unique variable, without having to write a separate query for each value you want to store? There are 51 columns that all have to go somewhere, and I didn't think it would be very efficient to hard-code 51 variables, each with a custom query to the database.
I thought about doing this with a multidimensional array, but then that would just be one string with a ton of values. Any advice would be greatly appreciated.
Although you can do this through a .NET application, really this would be much easier to achieve with a SQL statement. SQL has good syntax for moving data between tables:
INSERT INTO [Destination] (Column1, Column2, ...)
SELECT Column1, Column2, ...
FROM [Source]
If you're moving data between databases, you just need to link one of the databases to the other and then run the query. If you're using SQL Server Management Studio, you can follow this article to set up linked servers. Otherwise, you can use the sp_addlinkedserver procedure to register the linked server.
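Once the servers are linked, the same INSERT ... SELECT works across them via four-part names. The server, database, and table names here are placeholders:

```sql
-- [SourceServer] is the registered linked server.
INSERT INTO dbo.DestinationTable (Column1, Column2)
SELECT Column1, Column2
FROM [SourceServer].[SourceDb].dbo.StagingTable;
```

If both tables live in the same SQL Server instance, no linked server is needed; a three-part name ([SourceDb].dbo.StagingTable) is enough.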
You can create a class that contains a property for each column in your table and use a micro ORM like Dapper to populate a list of instances of those classes from your database. You can then iterate over the list and do your inserts to other tables.
You could even create other classes for your individual inserts and use AutoMapper to create instances of those from your source class.
But... this might all be overkill for what you are trying to achieve.
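If you do go the Dapper route, a sketch might look like this; the class, table, and column names are hypothetical, and only a few of the 51 columns are shown:

```csharp
using System.Linq;
using System.Data.SqlClient;
using Dapper;

// One property per staging-table column; Dapper matches them by name.
class StagingRecord
{
    public int Id { get; set; }
    public string PartNumber { get; set; }
    public decimal Price { get; set; }
}

class Loader
{
    static void Process(string connectionString)
    {
        using var conn = new SqlConnection(connectionString);

        // Dapper maps each row of the result set to a StagingRecord.
        var records = conn.Query<StagingRecord>("SELECT * FROM StagingTable").ToList();

        foreach (var r in records)
        {
            // Insert each value into its destination tables; Dapper binds
            // the @-parameters from the matching properties of r.
            conn.Execute("INSERT INTO Parts (PartNumber) VALUES (@PartNumber)", r);
            conn.Execute("INSERT INTO Prices (PartNumber, Price) VALUES (@PartNumber, @Price)", r);
        }
    }
}
```

Each record is a typed object rather than 51 loose variables, and the insert statements are written once, not once per row.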