I am creating an application in Visual Studio that, of course, uses a database. I don't understand why we create a DataSet, since I tried some queries without creating a DataSet and they worked perfectly. The queries I will be using are UPDATE, DELETE, INSERT and SELECT (simple and complex ones).
So should I use a DataSet, and why?
Note: the database is a big one, and as I understand it, creating a DataSet will create a copy of the database, so will this cause a storage (memory) problem?
You are looking for ways to work without a DataSet.
For INSERT, UPDATE and DELETE you can use System.Data.Sql and System.Data.SqlClient: open your own SqlConnection and proceed from there, see https://learn.microsoft.com/en-us/dotnet/api/system.data.sqlclient.sqlconnection?view=dotnet-plat-ext-3.1 .... but for reads (SELECT), it is practical to use a DataSet. At this level (below Entity Framework!) a DataSet can be filled with any data you want via a DataAdapter's Fill() method. The only class I know of that can read without a DataSet is the DataReader, see https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/retrieving-data-using-a-datareader
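A minimal sketch of this low-level approach, assuming a hypothetical Customers table and connection string (both are placeholders, not from the original question):

```csharp
using System;
using System.Data.SqlClient;

class AdoSketch
{
    // Hypothetical connection string and table; adjust to your own database.
    const string ConnStr = "Server=.;Database=MyDb;Integrated Security=true";

    static void Main()
    {
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();

            // INSERT/UPDATE/DELETE: ExecuteNonQuery returns the number of affected rows.
            using (var cmd = new SqlCommand(
                "UPDATE Customers SET City = @city WHERE Id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@city", "Berlin");
                cmd.Parameters.AddWithValue("@id", 42);
                Console.WriteLine($"{cmd.ExecuteNonQuery()} row(s) updated");
            }

            // SELECT without a DataSet: a SqlDataReader streams rows forward-only.
            using (var cmd = new SqlCommand("SELECT Id, Name FROM Customers", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
            }
        }
    }
}
```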
NOTE: as mentioned in the comments above, these low-level ways of accessing a database are unsafe if you build the SQL strings yourself, because explicit SQL passed to ADO can be contaminated by user input. This can be avoided by using parameterized commands (for example a parameterized DataAdapter + DataSet) or Entity Framework; either way you avoid SQL injection.
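A minimal sketch of the parameterized DataAdapter + DataSet variant; the Customers table and connection string are again just assumptions for illustration:

```csharp
using System.Data;
using System.Data.SqlClient;

static class SafeQuery
{
    // Hypothetical connection string; Customers is only an example table.
    const string ConnStr = "Server=.;Database=MyDb;Integrated Security=true";

    // The city value may come from user input, but it is passed as a parameter,
    // so it can never be interpreted as SQL.
    public static DataSet LoadCustomersByCity(string city)
    {
        using (var adapter = new SqlDataAdapter(
            "SELECT Id, Name, City FROM Customers WHERE City = @city", ConnStr))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@city", city);
            var ds = new DataSet();
            adapter.Fill(ds, "Customers"); // only the matching rows are loaded into memory
            return ds;
        }
    }
}
```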
If you have a big database you should consider using an ORM like Entity Framework. If you stay with classic ADO.NET, a DataSet helps you handle your data: inside it you have a DataTable that represents your data, so whenever you run a SELECT the returned rows are stored temporarily in your DataSet/DataTable. That said, you don't need your whole database in there, only the rows you need to work on.
Every time you add a new record to the DataTable it is flagged as new; when you modify or delete a row, it is flagged as modified or deleted accordingly. So after all your edits you can save the DataSet back to the database.
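A minimal sketch of that round trip, again assuming a hypothetical Customers table; the SqlCommandBuilder derives the INSERT/UPDATE/DELETE commands that Update() sends for each flagged row:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class RowStateSketch
{
    // Hypothetical connection string and table.
    const string ConnStr = "Server=.;Database=MyDb;Integrated Security=true";

    public static void Demo()
    {
        var adapter = new SqlDataAdapter("SELECT Id, Name FROM Customers", ConnStr);
        var builder = new SqlCommandBuilder(adapter); // generates the write commands

        var table = new DataTable("Customers");
        adapter.Fill(table);

        DataRow added = table.NewRow();
        added["Id"] = 1001;                 // assuming Id is not an identity column here
        added["Name"] = "New customer";
        table.Rows.Add(added);
        Console.WriteLine(added.RowState);  // Added

        table.Rows[0]["Name"] = "Renamed customer";
        Console.WriteLine(table.Rows[0].RowState); // Modified

        adapter.Update(table); // sends the corresponding INSERT/UPDATE back to the database
    }
}
```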
I've hit a wall when it comes to adding a new entity object (a regular SQL table) to the Data Context using LINQ-to-SQL. This isn't about the drag-and-drop method that is cited regularly across many other threads; that method has worked repeatedly without issue.
The end goal is relatively simple. I need to find a way to add a table that gets created during runtime via stored procedure to the current Data Context of the LINQ-to-SQL dbml file. I'll then need to be able to use the regular LINQ query methods/extension methods (InsertOnSubmit(), DeleteOnSubmit(), Where(), Contains(), FirstOrDefault(), etc.) on this new table object through the existing Data Context. Essentially, I need to find a way to procedurally create the code that would otherwise be generated automatically when you use the drag-and-drop method during development (when the application isn't running), but have it generated while the application is running, via a command and/or event trigger.
More Detail
There's one table that gets used a lot and, over the course of an entire year, collects many thousands of rows. Each row contains a timestamp and this table needs to be divided into multiple tables based on the year that the row was added.
Current Solution (using one table)
Single table with tens of thousands of rows which are constantly queried against.
Table is added to Data Context during development using drag-and-drop, so there are no additional coding issues
Significant performance decrease over time
Goals (using multiple tables)
(Complete) While the application is running, use C# code to check if a table for the current year already exists. If it does, no action is taken. If not, a new table gets created using a stored procedure with the current year as a prefix on the table name (2017_TableName, 2018_TableName, 2019_TableName, and so on...).
(Incomplete) While the application is still running, add the newly created table to the active LINQ-to-SQL Data Context (the same code that would otherwise be added using drag-and-drop during development).
(Incomplete) Run regular LINQ queries against the newly added table.
Final Thoughts
Other than the above, my only other concern is how to write C# code that references a table which may or may not already exist. Is it possible to use a variable in place of the standard 'DB_DataContext.2019_TableName' approach in order to actually get the table's data into a UI control? Is there a way to simply create an IEnumerable of all the tables whose names are prefixed with a year and then select the most current one?
From what I've read so far, the most likely solution seems to involve the use of a SQL add-on like SQLMetal or Huagati which (based solely on what I've read) will generate the code I need at runtime and update the corresponding dbml file. I have no experience using these types of add-ons, so any additional insight into them would be appreciated.
Lastly, I've seen some references to LINQ-to-Entities and/or LINQ-to-Objects. Would these be the components I'm looking for?
Thanks for reading through a rather lengthy first post. Any comments/criticisms are welcome.
The simplest way to achieve what you want is to redirect in SQL Server and leave your client code alone. At design time, create your L2S Data Context or EF DbContext against a database with only a single table. Then at run time, substitute a view or synonym for that table that points to the "current year" table.
HOWEVER, this should not be necessary in the first place. SQL Server supports partitioning, so you can store all the data in physically separate structures but keep a single logical table. And SQL Server supports columnstore indexes, which can compress and store many millions of rows with excellent performance.
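If you do go the redirection route, a minimal sketch of the run-time synonym swap could look like this; the connection string and table names are assumptions, and the year value comes from code rather than user input:

```csharp
using System.Data.SqlClient;

static class SynonymSwap
{
    // Hypothetical connection string; dbo.TableName is the name the L2S/EF model
    // was generated against, the yearly tables are the real ones.
    const string ConnStr = "Server=.;Database=MyDb;Integrated Security=true";

    public static void PointSynonymAtYear(int year)
    {
        string sql =
            "IF OBJECT_ID('dbo.TableName', 'SN') IS NOT NULL DROP SYNONYM dbo.TableName; " +
            $"CREATE SYNONYM dbo.TableName FOR dbo.[{year}_TableName];";

        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```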
I am developing a C# app where I have to read/write an existing MS SQL database. I decided to use object classes for the database, but the table columns can change at runtime, and that causes an exception when a new row is written (in the case of a new NOT NULL column).
Is there any recommendation on how to preserve the object approach to the database and deal with variable database tables? It is not necessary to have the objects updated at runtime, just to handle the new columns: fill them with a valid default value.
More details to my solution:
I used the Data Source Configuration Wizard in VS2015, which generates objects for the database, and everything is fine. When a table gets a new column I have to run the wizard again to update the objects and define an appropriate new value.
I can't modify anything in the database structure (it's an existing ERP system). The database is huge (hundreds of tables, each with around 60+ columns), so I am looking for automated ways to generate the database objects.
I hope I just overlooked (as a newbie) some obvious solution.
Thanks for all suggestions in advance.
Petr
I would recommend doing the following (a rough sketch follows these steps):
Create a set of import tables with the needed columns and leave those tables fixed
Let your application copy data to the import tables
Update the production tables on the database from the import tables with a stored procedure
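A rough sketch of steps 2 and 3, assuming a hypothetical Import_Orders table and a MergeImportedOrders stored procedure (both names are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

static class ImportSketch
{
    const string ConnStr = "Server=.;Database=Erp;Integrated Security=true"; // hypothetical

    public static void CopyAndMerge(DataTable rows)
    {
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();

            // Step 2: copy the application's data into the fixed import table.
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.Import_Orders" })
                bulk.WriteToServer(rows);

            // Step 3: a stored procedure moves the data into the production tables,
            // filling any new NOT NULL columns with valid defaults on the server side.
            using (var cmd = new SqlCommand("dbo.MergeImportedOrders", conn)
                   { CommandType = CommandType.StoredProcedure })
                cmd.ExecuteNonQuery();
        }
    }
}
```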
I currently have a populated DataTable, but I'm having trouble inserting it into my existing (empty) database.
I have looked into SqlBulkCopy as well, but haven't had much luck.
Although, using Entity Framework, I would expect:
_db.TableName.AddRange(dt);
to properly insert the new data from my DataTable.
Where am I going wrong here?
You are missing a few steps. First you need to create the table in the database; this is generally not done at run time, for a lot of complex reasons. Research Entity Framework Code First for more details on how to do this.
Secondly, if you truly want to use Entity Framework, you will need to convert your DataTable into a set of objects of the correct entity type (AddRange will not accept a DataTable).
Honestly, though, I would avoid Entity Framework here: embed a CREATE TABLE script (or, even better, define the table before you execute), and then use SqlBulkCopy; just note that SqlBulkCopy needs the table to be defined ahead of time.
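A minimal sketch of that SqlBulkCopy route, with a placeholder table name, columns and connection string (adjust the script and the column mappings to your own schema):

```csharp
using System.Data;
using System.Data.SqlClient;

static class BulkInsertSketch
{
    const string ConnStr = "Server=.;Database=MyDb;Integrated Security=true"; // hypothetical

    public static void BulkInsert(DataTable dt)
    {
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();

            // The destination table must exist before the copy; here it is created
            // from an embedded script (the columns are placeholders for your schema).
            using (var create = new SqlCommand(
                @"IF OBJECT_ID('dbo.MyTable', 'U') IS NULL
                      CREATE TABLE dbo.MyTable (Id INT NOT NULL, Name NVARCHAR(100) NULL);", conn))
                create.ExecuteNonQuery();

            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.MyTable" })
            {
                // Map columns explicitly in case the DataTable column order differs.
                bulk.ColumnMappings.Add("Id", "Id");
                bulk.ColumnMappings.Add("Name", "Name");
                bulk.WriteToServer(dt);
            }
        }
    }
}
```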
I have a DataGrid in my app. This DataGrid fetches some data from a MySQL DB. To tell the truth, the data is fetched into a List<>, because I'm not able to fetch it into a DataSet (and I don't know why).
Anyway, when I update a field in my app I want those changes to be reflected in the list, and therefore in the table in my DB.
Any idea?
Also, is it a good option to save table data in a List<>, or is it better to save it in a DataSet/DataTable?
Thank you.
I'll answer your last question first. In general, you want to use DataSet/DataTable, because they have many methods and properties related to database functionality. A List<> may get you where you're going for now, but extending it in the future will be a nightmare.
I would focus on getting a DataSet properly filled from your database (see http://dev.mysql.com/usingmysql/dotnet/ if you don't know where to start), and then simply setting your DataGrid to use that DataSet as its DataSource. You can then use things like LINQ to SQL or Entity Framework to better model that data in code. Assuming you have the proper MySQL connector/ODBC drivers installed, it should be as simple as creating the correct connection string and doing everything normally from there.
You can definitely do things the way you're doing them now, but you will have to manually send any SQL UPDATE statements instead of relying on the automatic mechanisms. I would seriously consider reworking it to use the proper .NET data objects.
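A minimal sketch of the DataSet approach, assuming the MySQL Connector/NET package, a hypothetical Products table, and a WinForms DataGridView (the binding line changes slightly for a WPF DataGrid):

```csharp
using System.Data;
using System.Windows.Forms;
using MySql.Data.MySqlClient;   // MySQL Connector/NET

static class GridBinding
{
    // Hypothetical connection string and table.
    const string ConnStr = "Server=localhost;Database=shop;Uid=user;Pwd=secret;";

    public static DataSet BindProducts(DataGridView grid)
    {
        var adapter = new MySqlDataAdapter("SELECT Id, Name, Price FROM Products", ConnStr);

        var ds = new DataSet();
        adapter.Fill(ds, "Products");

        // Bind the grid straight to the filled table; edits land in the DataTable
        // and can later be pushed back via adapter.Update (with a MySqlCommandBuilder
        // supplying the write commands).
        grid.DataSource = ds.Tables["Products"];
        return ds;
    }
}
```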
I need to create a quick-and-dirty solution to migrate data from one database into another. This is only being used as a proof of concept; long term we will use .NET's Sync Framework.
The databases are identical. The solution is going to be used as an OCA (occasionally connected application).
I read from some XML which tables they want to migrate.
Disable all constraints on the target for each table.
For each table they want to migrate data from I create a DataTable from the source.
Create a DataTable pointing to the target.
Import all the rows from the source into the target and insert them
Enable all constraints on the target tables again.
I am not sure if the above is possible. I had most of it working while I was cloning the source DataTable, but then hit the problem that the cloned DataTable wasn't pointing anywhere.
Can I point it to the target and then insert?
Is there a better way to do this?
The alternative is to create INSERT INTO statements, using metadata to identify identity columns and exclude them from the column list.
What you're proposing should work. But you might find it easier (and you'll definitely see better performance) with the SqlBulkCopy class.
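A minimal sketch of that flow, assuming hypothetical source/target connection strings and a table name that comes from the XML configuration rather than user input:

```csharp
using System.Data;
using System.Data.SqlClient;

static class MigrationSketch
{
    // Hypothetical connection strings.
    const string SourceConnStr = "Server=source;Database=AppDb;Integrated Security=true";
    const string TargetConnStr = "Server=target;Database=AppDb;Integrated Security=true";

    public static void CopyTable(string tableName)
    {
        // 1. Read the source table into a DataTable.
        var source = new DataTable(tableName);
        using (var adapter = new SqlDataAdapter($"SELECT * FROM [{tableName}]", SourceConnStr))
            adapter.Fill(source);

        using (var target = new SqlConnection(TargetConnStr))
        {
            target.Open();

            // 2. Disable constraints on the target table.
            using (var disableCmd = new SqlCommand(
                $"ALTER TABLE [{tableName}] NOCHECK CONSTRAINT ALL", target))
                disableCmd.ExecuteNonQuery();

            // 3. Bulk copy the rows; KeepIdentity preserves existing identity values.
            using (var bulk = new SqlBulkCopy(target, SqlBulkCopyOptions.KeepIdentity, null)
                   { DestinationTableName = tableName })
                bulk.WriteToServer(source);

            // 4. Re-enable and re-check the constraints.
            using (var enableCmd = new SqlCommand(
                $"ALTER TABLE [{tableName}] WITH CHECK CHECK CONSTRAINT ALL", target))
                enableCmd.ExecuteNonQuery();
        }
    }
}
```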
(This is a code-focused solution)
Short answer: You can load your DataTable and save it into another database by using a different DataAdapter.
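A minimal sketch of the two-adapter approach, with hypothetical connection strings; setting AcceptChangesDuringFill to false leaves every fetched row flagged as Added, so the second adapter's Update() inserts them all into the target:

```csharp
using System.Data;
using System.Data.SqlClient;

static class AdapterCopySketch
{
    // Hypothetical connection strings.
    const string SourceConnStr = "Server=source;Database=AppDb;Integrated Security=true";
    const string TargetConnStr = "Server=target;Database=AppDb;Integrated Security=true";

    public static void CopyWithAdapters(string tableName)
    {
        var sourceAdapter = new SqlDataAdapter($"SELECT * FROM [{tableName}]", SourceConnStr);
        // Leave every fetched row in the Added state so Update() will INSERT it later.
        sourceAdapter.AcceptChangesDuringFill = false;

        var table = new DataTable(tableName);
        sourceAdapter.Fill(table);

        // A second adapter, pointed at the target database, generates the INSERTs
        // (identity or computed columns may need extra handling).
        var targetAdapter = new SqlDataAdapter($"SELECT * FROM [{tableName}]", TargetConnStr);
        var builder = new SqlCommandBuilder(targetAdapter);

        targetAdapter.Update(table);
    }
}
```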
But for a code-less approach, you can use the SQL Server Database Publishing Toolkit, as stated here.
You can use the SQL Server Import and Export Wizard (dtswizard.exe).
It creates an Integration Services package that you can then save and execute whenever you want.