I have a lot of data in several tables that I am pulling from and combining into one view. I need a daily job in C# that pulls all of this data and inserts it into a separate database/table running on a different server. The data consists of some 150+ columns once combined, and I don't want to call reader.Read() / reader.GetString() / etc. for every column and then concatenate it all into a string to insert again. Is there a way to just pass the results of a SQL query to an insert in a simple and compact way?
private static void GetPrimaryData(string query)
{
    using (MySqlConnection connection = new MySqlConnection(_awsOptionsDBConnectionString))
    {
        connection.Open();
        using (MySqlCommand command = new MySqlCommand(query, connection))
        using (MySqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine(reader.GetInt32("tsid"));
            }
        }
        // no explicit Close() needed; the using block disposes the connection
    }
}
Ideally I'd just replace the Console.WriteLine(reader... part of code with some sort of insert where I pass the reader or the entire result of the reader query in.
Whether the data comes directly from a database or from a user, you still need to send it to the target database in the structure that database defines.
Your question is asking for a magical converter that would work with any type of data and send it to any database. There is no such tool or library in C#/.NET.
You could look into an existing ETL solution such as Azure Data Factory, Informatica, etc.
Otherwise, write the mappings/translations as needed in C# and use an ORM to send the data to the database.
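That said, if the destination happens to be SQL Server, there is one useful shortcut: SqlBulkCopy.WriteToServer accepts an IDataReader, so you can stream the combined result set straight into the destination table without writing per-column code. A minimal sketch, assuming a hypothetical destination connection string and table name (columns map by ordinal unless you add ColumnMappings):
using System.Data.SqlClient;
using MySql.Data.MySqlClient;

private static void CopyPrimaryData(string query)
{
    using (var source = new MySqlConnection(_awsOptionsDBConnectionString))
    using (var destination = new SqlConnection(_destinationConnectionString)) // hypothetical
    {
        source.Open();
        destination.Open();
        using (var command = new MySqlCommand(query, source))
        using (var reader = command.ExecuteReader())
        using (var bulkCopy = new SqlBulkCopy(destination))
        {
            bulkCopy.DestinationTableName = "dbo.CombinedData"; // hypothetical table
            bulkCopy.WriteToServer(reader); // streams all 150+ columns without per-column code
        }
    }
}
If the destination is also MySQL, the MySqlConnector package offers a similar MySqlBulkCopy class that consumes a DbDataReader the same way.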
I'm starting to program with ASP.NET MVC, building an application with Angular on the front-end and SQL Server for the database. In some cases I have complex queries that I have to use and cannot modify because of a business restriction. I am using a structure similar to this one: using simple queries in ASP.NET MVC, but I don't know the correct way to handle a lot of data and show it in the front-end.
I have a ViewModel with the data structure of the query results, a DomainModel where the query lives, and the Controller to communicate with the front-end.
My problem is that I don't know how to develop what I'm trying to do. Right now I create as many objects in a list as there are rows in my query, but when this method runs my computer locks up with no error shown (I can guess it is because it uses all the memory).
Note that the table in the front-end only has to show 25 results per page; maybe I could run the query each time the user chooses a different page of the table, fetching a different batch of results. I haven't tried this option yet.
This is part of the DomainModel:
public IEnumerable<OperationView> GetOperations()
{
    List<OperationView> operationsList = new List<OperationView>();
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand("", connection))
    {
        command.CommandText = /*Query joining 8-10 tables*/;
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                var operationView = new OperationView();
                operationView.IdOperacion = reader["ID_OPERACION"].ToString();
                // Loading here some other variables of OperationView
                operationsList.Add(operationView);
            }
        }
    }
    return operationsList;
}
This is part of the Controller:
public IEnumerable<OperationView> GetOperaciones()
{
    var operationDomainModel = new OperationPDomainModel();
    return operationDomainModel.GetOperations();
}
I think my front-end and ViewModel are not important for this problem, but I can include them if needed.
Currently, if I try to execute it, the computer shuts down unexpectedly...
As your system is running out of memory, you need pagination.
The paging should be done on the database side; the UI just needs to pass the page index and the number of records displayed per page.
So your query should be something like the one below (note that a ROW_NUMBER alias cannot be referenced in the WHERE clause of the same SELECT, so it has to go in a derived table):
SELECT a, b, c
FROM (
    SELECT a, b, c, ROW_NUMBER() OVER (ORDER BY a) AS rnum
    FROM foo
) t
WHERE t.rnum BETWEEN (25 * @PageIndex) + 1 AND (25 * @PageIndex) + 25
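A minimal sketch of calling that paged query from the DomainModel (assuming the question's OperationView class and connection string, and a hypothetical OPERATIONS source in place of the real 8-10 table join):
using System.Collections.Generic;
using System.Data.SqlClient;

public List<OperationView> GetOperationsPage(int pageIndex)
{
    const string sql = @"
        SELECT IdOperacion FROM (
            SELECT ID_OPERACION AS IdOperacion,
                   ROW_NUMBER() OVER (ORDER BY ID_OPERACION) AS rnum
            FROM OPERATIONS
        ) t
        WHERE t.rnum BETWEEN (25 * @PageIndex) + 1 AND (25 * @PageIndex) + 25;";

    var page = new List<OperationView>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        command.Parameters.AddWithValue("@PageIndex", pageIndex);
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
                page.Add(new OperationView { IdOperacion = reader["IdOperacion"].ToString() });
        }
    }
    return page;
}
Each page change on the front-end then triggers this call with the new page index, so at most 25 rows are in memory at a time.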
There are a few improvements you could make.
Make the call async
The operation hangs because it blocks the main thread. If possible, run it asynchronously: use task-based programming to move the work onto a different thread. That keeps the UI responsive, though it won't reduce the memory usage by itself.
Use pagination
Get only the number of records that you need to display on the page. This should be the biggest improvement available for the code you have. It would also help to add more filters if possible, but fetching only 25 records when you display only 25 is the way to go.
It would also help to use modern data-access techniques such as Entity Framework and LINQ instead of traditional ADO.NET; a paged LINQ query is sketched below.
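For example, with EF the paging above collapses to a Skip/Take pair that is translated to SQL and executed server-side. A minimal sketch, assuming a hypothetical OperationsDbContext with an Operations DbSet:
using System.Linq;

// pageIndex comes from the UI, zero-based
using (var db = new OperationsDbContext()) // hypothetical EF context
{
    var page = db.Operations
                 .OrderBy(o => o.IdOperacion) // paging needs a stable ordering
                 .Skip(25 * pageIndex)
                 .Take(25)
                 .ToList();                   // only 25 rows are materialized
}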
Use Ajax
Such large processing should be done using AJAX calls. If you do not want the user to wait for the data to be loaded, you can load the page and make the data retrieval a part of a separate AJAX call.
Also check this article on viewing millions of records:
https://www.c-sharpcorner.com/article/how-to-scroll-and-view-millions-of-records/
We are importing a CSV file with CsvReader, then using SqlBulkCopy to insert that data into SQL Server. This code works for us and is very simple, but we are wondering if there is a faster method (some of our files have 100,000+ rows) that would not get too complex either.
SqlConnection conn = new SqlConnection(connectionString);
conn.Open();
SqlTransaction transaction = conn.BeginTransaction();
try
{
    using (TextReader reader = File.OpenText(sourceFileLocation))
    {
        CsvReader csv = new CsvReader(reader, true);
        SqlBulkCopy copy = new SqlBulkCopy(conn, SqlBulkCopyOptions.KeepIdentity, transaction);
        copy.DestinationTableName = reportType.ToString();
        copy.WriteToServer(csv);
        transaction.Commit();
    }
}
catch (Exception ex)
{
    transaction.Rollback();
    success = false;
    SendFileImportErrorEmail(Path.GetFileName(sourceFileLocation), ex.Message);
}
finally
{
    conn.Close();
}
Instead of building your own tool for this, have a look at the SQL Server Import and Export Wizard / SSIS. You can target flat files and SQL Server databases directly. The output .dtsx package can also be run from the command line or as a job through the SQL Server Agent.
The reason I am suggesting it is because the wizard is optimized for parallelism and works really well on large flat files.
You should consider using a Table-Valued Parameter (TVP), which is based on a User-Defined Table Type (UDTT). This ability was introduced in SQL Server 2008 and allows you to define a strongly-typed structure that can be used to stream data into SQL Server (if done properly).
An advantage of this approach over using SqlBulkCopy is that you can do more than a simple INSERT into a table; you can apply any logic you want (validate / upsert / etc.) since the data arrives in the form of a table variable. You can deal with all of the import logic in a single stored procedure that can easily use local temporary tables if any of the data needs to be staged first. This makes it rather easy to isolate the process such that you can run multiple instances at the same time, as long as you have a way to logically separate the rows being imported.
I posted a detailed answer on this topic here on S.O. a while ago, including example code and links to other info:
How can I insert 10 million records in the shortest time possible?
There is even a link to a related answer of mine that shows another variation on that theme. I have a third answer somewhere that shows a batched approach if you have millions of rows, which you don't, but as soon as I find it I will add the link here.
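In the meantime, here is a minimal sketch of the TVP approach. All object names (ReportRowType, ImportReportRows, the two columns) are hypothetical, and the one-time T-SQL setup is shown in comments; the iterator streams rows to the stored procedure one at a time, so the file contents never have to sit in memory as a whole:
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

// One-time T-SQL setup (hypothetical names):
//   CREATE TYPE dbo.ReportRowType AS TABLE (Col1 INT, Col2 NVARCHAR(100));
//   CREATE PROCEDURE dbo.ImportReportRows @Rows dbo.ReportRowType READONLY AS
//       INSERT INTO dbo.Report (Col1, Col2) SELECT Col1, Col2 FROM @Rows;

static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<string[]> csvRows)
{
    var meta = new[]
    {
        new SqlMetaData("Col1", SqlDbType.Int),
        new SqlMetaData("Col2", SqlDbType.NVarChar, 100)
    };
    foreach (var row in csvRows)
    {
        var record = new SqlDataRecord(meta);
        record.SetInt32(0, int.Parse(row[0]));
        record.SetString(1, row[1]);
        yield return record; // streamed row by row
    }
}

static void Import(string connectionString, IEnumerable<string[]> csvRows)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.ImportReportRows", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        var tvp = cmd.Parameters.Add("@Rows", SqlDbType.Structured);
        tvp.TypeName = "dbo.ReportRowType";
        tvp.Value = ToRecords(csvRows); // enumerated lazily while sending
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}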
I'm coming over from PHP and am having a hard time with storing information into my newly created local database. I'm using Microsoft Visual C# 2010 to help me learn and develop.
I'm reading that many people do not like DataSets and would opt to ignore them altogether. That is fine, as long as I am able to hard-wire into my local database. (I did not use the server database option provided, because I'll turn my completed product into a commercial solution, and this will require the users to store their information in a local database that holds their project data.)
I've made a video showing my windows form and my database, and the extent of my knowledge so far. Maybe you guys can help? http://screencast.com/t/x9Qt1NtOgo6X
There are many ways to access a database from your application. These range from low-level ADO.NET commands (SqlDataReader, etc.) to using an Object-Relational Mapper (ORM) such as Entity Framework.
All of them will require that you learn the technologies, but you can start here:
http://windowsclient.net/learn/videos.aspx
Here's some code that does a direct insert into SQL Server, although you'll need a connection string to your database.
Include the SQL Server namespaces:
using System.Data.SqlClient;
using System.Data.SqlTypes;
.
.
.
using (SqlConnection cn = new SqlConnection("XXXXX")) // must put a connection string to your database here
{
    cn.Open();
    using (SqlCommand cmd = new SqlCommand("INSERT INTO Session(field1, field2) VALUES(@Value1, @Value2)", cn))
    {
        cmd.Parameters.AddWithValue("@Value1", 4);
        cmd.Parameters.AddWithValue("@Value2", "test");
        cmd.ExecuteNonQuery();
    }
}
Well, if you want quick code that stays close to the wire, the way you're used to with PHP, the following should work.
var conn = new SqlConnection("Your Connection String");
var command = conn.CreateCommand();
command.CommandText = "insert into sessions (id, name) values (@id, @name)";
command.Parameters.AddWithValue("@id", "");
command.Parameters.AddWithValue("@name", "test");
conn.Open();
command.ExecuteNonQuery();
command.Dispose();
conn.Close();
In the long run, it would be better if you get accustomed to one of the data-related / ORM frameworks such as Entity Framework, NHibernate and the like. That would really help a lot in data manipulation and make your life a whole lot easier.
It depends on your requirements, but for most situations I would highly recommend Entity Framework or LINQ to SQL data classes. You'd be much better off... go with the latter as a start... hope it helps.
If you want to see how easy an ORM can be:
Right-click on your project
Select Add New Item
Choose LINQ to SQL Classes
When you've added it, you'll have a blank .dbml file
Go to Server Explorer and add a connection to the SQL database
Drag and drop the tables wherever you like
Start using the entities like this:
using (DataClasses1DataContext db = new DataClasses1DataContext(@"Data Source=localhost\sqlexpress; Initial Catalog=myDBName; Integrated Security=true"))
{
    IEnumerable<City> citiesForUSA = db.Cities.Where(x => x.Country.Name == "United States");
    City city = new City();
    city.Name = "Metropolis";
    // etc.
    db.Cities.InsertOnSubmit(city);
    db.SubmitChanges(); // <-- INSERT INTO completed
    // etc.
}
Good luck!
:-)
Suppose I have a myTest.sql script, which contains thousands of "CREATE TABLE blahblah" statements.
myTest.sql :
CREATE TABLE A;
CREATE TABLE A1;
....
CREATE TABLE A1000;
What I'm trying to achieve is a C# program that makes the MySQL server execute the myTest.sql file, instead of doing this:
using (MySqlConnection cn = new MySqlConnection(connectionString))
{
    MySqlCommand newCmd = new MySqlCommand("create statement here 1", cn);
    cn.Open();
    newCmd.ExecuteNonQuery();
    cn.Close();
}
I don't want to repeat that 1000 times or write a for loop or something like that. Thanks for all the help, and please forgive my grammar problems.
Could you load the myTest.sql file into a string and pass it to the MySqlCommand?
string myTestSql = File.ReadAllText("myTest.sql");
...
MySqlCommand newCmd = new MySqlCommand(myTestSql, cn);
This should work as long as MySQL accepts commands separated by semicolons, i.e. as long as batching is enabled (the Allow Batch connection-string option, which defaults to true).
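Alternatively, Connector/NET ships a MySqlScript helper that splits the script on delimiters and runs each statement in turn, so you don't depend on batching at all. A minimal sketch, reusing the question's connection string:
using System;
using System.IO;
using MySql.Data.MySqlClient;

using (var cn = new MySqlConnection(connectionString))
{
    cn.Open();
    var script = new MySqlScript(cn, File.ReadAllText("myTest.sql"));
    int executed = script.Execute(); // returns the number of statements run
    Console.WriteLine("{0} statements executed", executed);
}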
You certainly don't have to open and close the connection every single time, but the cleanest approach is to run the statements one at a time and check each result to ensure it completed successfully. Unfortunately, if you run one giant batch of 1000 statements and it fails, you have no easy way of determining which steps succeeded and which have to be repeated.
You can also do it with the mysql command-line client, launched from .NET/C# via the System.Diagnostics.Process class.
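A minimal sketch of that approach, piping the script into the mysql client through standard input (the user, password, and database names are placeholders):
using System.Diagnostics;
using System.IO;

var psi = new ProcessStartInfo
{
    FileName = "mysql",
    Arguments = "-u myUser -pMyPassword myDatabase", // placeholder credentials
    RedirectStandardInput = true,
    UseShellExecute = false // required for stream redirection
};
using (var proc = Process.Start(psi))
{
    proc.StandardInput.Write(File.ReadAllText("myTest.sql"));
    proc.StandardInput.Close(); // signal end of input so mysql exits
    proc.WaitForExit();
}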
We document our SQL Server database by creating table and column level Description extended properties. We usually enter these via SSMS.
My question is this. I'm creating a C# application where I'd like to read the extended properties for a particular table and its associated columns.
Can someone show me how I might go about doing this?
Thanks - Randy
You simply ask for them using the built-in fn_listextendedproperty function. The result of this function is an ordinary table result set that you read in C# using your data access tool of choice (SqlCommand/SqlDataReader, LINQ, DataSets, etc.).
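A minimal sketch with plain SqlCommand/SqlDataReader, assuming a hypothetical Customers table in the dbo schema and properties named MS_Description (the name SSMS uses for the Description field):
using System;
using System.Data.SqlClient;

const string sql = @"
    SELECT objname, value
    FROM fn_listextendedproperty('MS_Description',
        'schema', 'dbo',
        'table',  @tableName,
        'column', default);"; // default = list the property for every column

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@tableName", "Customers"); // hypothetical table
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0}: {1}", reader["objname"], reader["value"]);
    }
}
Passing default for the last two arguments instead returns the table-level description itself.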
Read this: Extract SQL Column Extended Properties From LINQ in C# and see if that's something you could do in your situation.
A full example for a simple database-level property (assuming it was created earlier with sp_addextendedproperty, e.g. @name = N'MinimumClientVersion'):
string strVersion;
string cmd = "SELECT value FROM sys.extended_properties WHERE name = 'MinimumClientVersion'";
using (var connection = new SqlConnection(connectionString))
using (var comm = new SqlCommand(cmd, connection))
{
    connection.Open();
    strVersion = (string)comm.ExecuteScalar();
}
Version minimumVersion = new Version(strVersion);