Generating Entity Framework model takes too long - c#

Some friends and I are building a system in C#. They develop the desktop application with Code First, and I build the web application against the same database, but using Database First (adding the model from the existing database).
The problem is that when I go to add the model, select the tables, and start generating them, it takes at least two hours.
I need your help because this wastes a lot of time and I cannot find a solution online.

As discussed in this question, it appears to be a bug in the reverse-engineering step when targeting SQL Server.
The issue was fixed by a cumulative update or a service pack, depending on the version you are running.
The update is available here

Related

Entity Framework Code-First too slow at startup

I know this has been asked a lot before, but I still haven't found a working fix.
I'm creating a desktop application that will regularly be started and stopped. The database is a MySQL database hosted online, and I'm using the newest versions of EF and the MySQL connector.
I'm also working code-first. For now I only have 3 small entities, but these will grow a lot over time. The database is already generated, so nothing needs to be created at startup.
Every time the application starts (even when deployed), retrieving the first data from the database (only about 50 records; I've also tried just 10 and it makes no difference) is slow: around 5 seconds. After that, subsequent queries are fairly fast (around 1 second).
I've already tried generating views, but it doesn't change anything. I also create only one DbContext.
If I use plain ADO.NET instead, I get the results almost instantly, even on the first query (retrieving all 50 records), so it has nothing to do with a connection issue.
I'm not sure what information I have to give in order for you to help me, so feel free to ask more info.
Any idea what I could try? Is it really supposed to take like 5 seconds before the user can start working with the program?
With EF, the first time a query is run it has to be compiled, even though the program itself is already compiled.
I would suggest reading this article on LINQ query performance, http://www.codeproject.com/Articles/38174/How-to-improve-your-LINQ-query-performance-by-X, and this MSDN page, https://msdn.microsoft.com/en-us/library/vstudio/bb896297%28v=vs.100%29.aspx, and then trying again to see if this helps.
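One common workaround for the one-time model-compilation cost is to pay it on a background thread during startup instead of on the user's first real query. A minimal sketch, assuming a hypothetical code-first context named `MyDbContext` standing in for your own:

```csharp
using System.Threading.Tasks;

public static class ContextWarmup
{
    // Kick this off as early as possible in application startup, before
    // the UI needs any data. MyDbContext is a placeholder for your context.
    public static Task WarmUpAsync()
    {
        return Task.Run(() =>
        {
            using (var ctx = new MyDbContext())
            {
                // Forces EF to build and compile the model (and run any
                // pending initializer) without creating the database again.
                ctx.Database.Initialize(force: false);
            }
        });
    }
}
```

By the time the user triggers a real query, the model metadata is already compiled, so the first visible query should be close to the "1 second" speed of subsequent ones.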
Good luck!

c# Service to copy data between two sql servers

I have two SQL Server instances installed on my computer: SQL2008 EXPRESS, and the SQL2008 instance that comes with the specific software we are using.
I need to make a service that runs all the time and, at a specific time, copies the records that don't yet exist in SQL2008 EXPRESS over from SQL2008. Can you suggest a way of doing this?
Currently the best thing I have is making a local copy in an Excel file, but that would result in 365 Excel files per year, which I don't think is a good idea :)
P.S. Sorry if my English is bad :)
You don't have to hand-craft your own software for that. There are 3rd-party tools like OpenDbDiff or RedGate dbdiff that do this. These tools generate a differential SQL script that you can apply to your target database.
I'm confused when you mention Excel. What would Excel have anything to do with moving data from one SQL database to another?
The short answer is, if you need a C# service, then write a C# service that copies the data directly from one database to the other. The problem that you are trying to solve is not very clear.
Having said all that, and with my limited understanding of the problem, it sounds like what you need is a SQL job that is scheduled to run once a day that copies the data from one server to the other. Since it sounds like they are on separate instances, you'll just need to set up a linked server on either the source or destination database and either push or pull the data into the correct table(s).
EDIT:
Ok, so if a windows service is a requirement, that is perfectly acceptable. But, like I mentioned, you should forget about Excel. You wouldn't want to go from SQL->Excel->SQL if you have no other reason for the data to exist in Excel.
Here is some information on creating a windows service:
Easiest language for creating a Windows service
Here is a simple tutorial on accessing SQL in C#: http://www.codeproject.com/Articles/4416/Beginners-guide-to-accessing-SQL-Server-through-C
If you want a more formal solution (read: data access layer), I'd point you toward Entity Framework. The complexity of the project will probably be the driving factor on whether you just want to do SQL statements in your code vs. going with a full blown DAL.
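If you do write the copy directly in the C# service, one way to move only the missing rows is to read the destination's existing keys, filter the source rows against them, and bulk-insert the remainder. A sketch under stated assumptions: both tables share an `int` key column named `Id`, and the connection strings and table name are placeholders you would supply.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class TableCopier
{
    public static void CopyMissingRows(string sourceCs, string destCs, string table)
    {
        // 1. Collect the keys already present in the destination.
        var existing = new DataTable();
        using (var dest = new SqlConnection(destCs))
        using (var da = new SqlDataAdapter($"SELECT Id FROM {table}", dest))
            da.Fill(existing);

        var ids = new HashSet<int>();
        foreach (DataRow r in existing.Rows)
            ids.Add((int)r["Id"]);

        // 2. Read the source rows and keep only the ones not yet copied.
        var source = new DataTable();
        using (var src = new SqlConnection(sourceCs))
        using (var da = new SqlDataAdapter($"SELECT * FROM {table}", src))
            da.Fill(source);

        var missing = source.Clone(); // same schema, no rows
        foreach (DataRow r in source.Rows)
            if (!ids.Contains((int)r["Id"]))
                missing.ImportRow(r);

        // 3. Bulk-insert the missing rows into the destination.
        using (var dest = new SqlConnection(destCs))
        using (var bulk = new SqlBulkCopy(dest) { DestinationTableName = table })
        {
            dest.Open();
            bulk.WriteToServer(missing);
        }
    }
}
```

For large tables you would want a `WHERE` clause or a last-modified column on step 2 rather than pulling everything, but for a once-a-day copy of modest row counts this shape is usually enough.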

Data Sync with Sync Framework

I've been working with Microsoft's Sync Framework in C#, trying to synchronize about 35 tables from a local database to a database stored on a central server. The main problem is that one of my tables has roughly a million records, and even with filters it takes too long to synchronize. I don't know if there's a way, or any other framework, that works a bit faster than this.
A complete sync takes about 4-6 hours.
Any help would be good; thanks in advance.
Sync framework is just like any other database app and you can troubleshoot/optimize it similarly.
You can enable Sync Framework tracing to see where it's spending its time: querying for changes, serialising changes, applying changes, locks/concurrency issues, network latency, etc...
Do you have pre-existing data on the destination DB? Have you tried initializing replicas from backup? Have you enabled batching? etc...
I am not sure if you are still looking for a solution, but when configuring the sync, select upload/download of incremental changes. That way, each sync transfers only the new changes. The process is time-consuming on the first sync only; after that, only changes are uploaded/downloaded, so it will be fairly quick.
If a table has a million records, it will take a long time to sync. It may be better to build your own sync architecture. We faced the same issue, so we created our own framework with REST-based web APIs returning JSON. Only rows updated since the last sync are transferred, based on a last-updated timestamp, so there is no need to compare the whole database.
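The last-updated-timestamp approach described above can be sketched as follows. This is a minimal illustration, not Sync Framework code: it assumes a hypothetical `Orders` table with a `LastUpdated` column, and that the client persists the timestamp of its previous successful sync.

```csharp
using System;
using System.Data.SqlClient;

public static class IncrementalSync
{
    public static void DownloadChanges(string connectionString, DateTime lastSync)
    {
        // Only rows changed since the previous sync cross the wire.
        const string sql =
            "SELECT * FROM Orders WHERE LastUpdated > @lastSync";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@lastSync", lastSync);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Apply each changed row to the local database here
                    // (insert-or-update keyed on the row's primary key).
                }
            }
        }
        // On success, persist the sync start time as the new lastSync value,
        // so rows modified during the sync are picked up next run.
    }
}
```

The first sync still moves everything, but every run after that is proportional to the number of changed rows, not the size of the table.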

Updating database for desktop application (patching)

I wonder what you use to update a client's database when your program is patched.
Let's take a look at this scenario:
You have a desktop application (.net, entity framework) which is using sql server compact database.
You release a new version of your application which is using extended database.
The user downloads a patch with modified files
How do you update the database?
I wonder how you handle this process. I have some ideas, but I think more experienced people can give me better, proven solutions or advice.
You need a migration framework.
There are existing OSS libraries like FluentMigrator
project page
wiki
long "Getting started" blogpost
Entity Framework Code First will also get its own migration framework, but it's still in beta:
Code First Migrations: Beta 1 Released
Code First Migrations: Beta 1 ‘No-Magic’ Walkthrough
Code First Migrations: Beta 1 ‘With-Magic’ Walkthrough (Automatic Migrations)
You need to provide a DB upgrade mechanism, either explicit or hidden in your code, and thus implement something like a DB versioning chain.
There are a couple of aspects to it.
First is versioning. You need some way of tying the version of the db to the version of the program; it could be something as simple as a table with a version number in it. You need to check it when the application starts as well.
One fun scenario: you 'update' the application and db successfully, and then for some operational reason the customer restores a previous version of the db. Or, if you are on a frequent patch cycle, do you have to apply each patch in order, or can they catch up? Do you want to handle application-only and database-only upgrades differently?
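The version-number-in-a-table check can be sketched like this. The `SchemaVersion` table and column names are illustrative, not from the original post; the idea is simply to refuse to run against a db that doesn't match the build.

```csharp
using System;
using System.Data.SqlClient;

public static class SchemaGuard
{
    // Bump this constant with every schema migration you ship.
    public const int ExpectedVersion = 7;

    public static void EnsureCompatible(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT TOP 1 Version FROM SchemaVersion", conn))
        {
            conn.Open();
            int dbVersion = (int)cmd.ExecuteScalar();

            if (dbVersion < ExpectedVersion)
                throw new InvalidOperationException(
                    $"Database is at schema version {dbVersion}; " +
                    $"apply migrations up to {ExpectedVersion} before starting.");

            if (dbVersion > ExpectedVersion)
                throw new InvalidOperationException(
                    "Database is newer than this application build; " +
                    "update the application first.");
        }
    }
}
```

Checking in both directions also catches the restored-backup scenario mentioned below: a rolled-back db shows up as "too old" rather than silently corrupting data.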
There's no one right way for this, you have to look at what sort of changes you make, and what level of complexity you are prepared to maintain in order to cope with everything that could go wrong.
A couple of things worth looking at.
Two databases: one for static 'read-only' data, and one for more dynamic stuff. Upgrading the static data can then simply be a restore from a resource within the upgrade package.
The other is how much you can do with metadata stored in db tables. For instance, a version-based XSD to describe your objects instead of a concrete class. That goes in your read-only db; now you can update code and application with a restore and possibly some transforms.
Lots of ways to go; just remember:
'users' will always find some way of making you look like an eejit, by doing something you never thought they would.
The more complex you make the system, the more chance of the above.
And last but not least, don't take short cuts on data version conversions, if you lose data integrity, everything else you do will be wasted.

How to determine which tables are used by ASP.NET web solution?

The problem is a bit complex (or not), so I don't know exactly how to ask this question.
My project is a hybrid of the MVC and WebForms approaches. It uses 3rd-party software and is packed into one quite big solution with a lot of little sub-systems. The whole project uses two databases, A and B.
I want to stop using both databases and merge them into a single one; let's say it will be A. To achieve this I have to create a deploy script to migrate B's tables into A. I am not even sure which tables I really need.
The main problem is that I am not able to determine which tables are currently in use (some of them certainly are not; this is really messy). Additionally, some table names are the same in A and B. Is there any automated way to do this? The project is quite big and we were never given documentation, so identifying this manually would be quite horrible.
Any ideas to speed this up would be appreciated.
Best Regards.
You can use SQL Profiler to see which users are accessing your database.
If the two applications use different SQL logins, you can quite easily see which queries are being run and which tables are being accessed.
Sql Profiler
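A complementary approach to running Profiler is to ask SQL Server directly which tables have actually been read since the last service restart, via the `sys.dm_db_index_usage_stats` DMV. A sketch, with the connection string as a placeholder; note that these statistics are reset whenever the SQL Server service restarts, so let the applications run for a while first.

```csharp
using System;
using System.Data.SqlClient;

public static class TableUsage
{
    public static void PrintAccessedTables(string connectionString)
    {
        // Tables with any recorded seek/scan have been read by some query
        // since the last restart; tables absent from the DMV have not.
        const string sql = @"
            SELECT OBJECT_NAME(s.object_id) AS TableName,
                   MAX(s.last_user_seek) AS LastSeek,
                   MAX(s.last_user_scan) AS LastScan
            FROM sys.dm_db_index_usage_stats s
            WHERE s.database_id = DB_ID()
            GROUP BY s.object_id";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var r = cmd.ExecuteReader())
                while (r.Read())
                    Console.WriteLine(
                        $"{r["TableName"]}: seek={r["LastSeek"]}, scan={r["LastScan"]}");
        }
    }
}
```

Run it against database B: any table that never shows up over a representative period is a strong candidate for being unused, which narrows down what actually needs migrating into A.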