I realize this is not a very specific question and that it is open to different opinions, but answering it requires technical expertise that I do not currently have.
My team of 10 people continually maintains and updates a SQL database of approximately 1 million records and 100 fields. We are a market research outfit - most of the data points are modeled assumptions that we need to update frequently as new information becomes available.
We use a .NET web interface to update/insert/delete. However, this interface is clunky and slow. For example:
We cannot copy/paste in a row of numbers; e.g., budgeted amounts across several time periods;
We cannot copy/paste similar dimension values across multiple records;
We cannot create new records and fill their contents quickly;
We cannot assess the effect of our changes on the dataset, as we could in Excel.
Most of us are more inclined to navigate data, change values and test our assumptions in Excel, and then go back and make changes in the .NET interface.
My question is:
Is it possible for multiple users to simultaneously use Excel as a content management system for a custom SQL database? I would design a workbook with tabs specifically designed to upload to the database; on other tabs analysts could quickly and easily perform their calculations, copy/paste, etc.
If Excel is not ideal, are there other CMS solutions out there that I could adapt to my database? I was looking at something like this, but I have no idea if it is ideal: http://sqlspreads.com/
If the above is not realistic, are there ways that a .NET CMS interface can be optimized to 1) run queries faster, 2) allow copy/paste, or 3) improve in other ways?
Having multiple people working on one Excel sheet won't work. What you want to do is create an Excel template that is the same for everyone. Then you have everyone enter data on their own copy of the template. Write a script that takes the template and uploads it to the database table. You can have a template for each table/view, and then use join tables or views to get a bigger picture of all the data.
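As a very rough sketch of that upload script (assuming SQL Server; the workbook path, the "Upload$" sheet name and the dbo.StagingBudget table name are placeholders, not anything from your setup), it could read the template tab with the ACE OLE DB provider and bulk-load it into a staging table:

// Rough sketch: read one tab of the template workbook and bulk-load it into a
// staging table. The sheet name and table name below are placeholders.
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class TemplateUploader
{
    public static void Upload(string workbookPath, string sqlConnectionString)
    {
        string excelConn = "Provider=Microsoft.ACE.OLEDB.12.0;" +
                           $"Data Source={workbookPath};" +
                           "Extended Properties='Excel 12.0 Xml;HDR=YES'";

        var table = new DataTable();
        using (var conn = new OleDbConnection(excelConn))
        using (var cmd = new OleDbCommand("SELECT * FROM [Upload$]", conn))
        {
            conn.Open();
            table.Load(cmd.ExecuteReader());   // pull the upload tab into memory
        }

        using (var bulk = new SqlBulkCopy(sqlConnectionString))
        {
            bulk.DestinationTableName = "dbo.StagingBudget";
            bulk.WriteToServer(table);         // push all rows in one round trip
        }
    }
}

From the staging table a stored procedure can then merge the rows into the live tables, which keeps the validation logic on the server rather than in each workbook.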
It's possible to do something like that in Excel - but it's not that easy. I created such a solution for one of my customers: 400 to 500 users download data from an MS SQL Server into Excel, can change it there, and then upload it back to the server. This works for plain line-by-line entry as well as for more complex reporting decks. But as I said: building such a solution isn't a quick job.
Personally, I would try to improve the .NET frontend. If it is that slow, I would guess you are doing something wrong there. At the end of the day it doesn't make a great difference what kind of frontend you use; you will always face similar problems.
Related
I have this situation, maybe it is too basic for this site, but I still hope to get some suggestions:
There are 3 different systems that I need to collect data from, all 3 on different servers in a local network. One of them is based on a MySQL database which I have complete access to, the second is based on an MS Access database, and the third has a flat-file database whose data can only be accessed through txt exports from the application.
I need to collect the data into an independent database and create Excel and PDF reports.
I don't need charts; a nicely formatted Excel table should be just fine.
Data is updated each hour, so it should be collected and the report produced every hour.
Any suggestions about how to integrate the data, and which DBMS is best to use for this purpose?
What is the best option for creating Excel and PDF reports without having to buy any software?
I hope to get some guidelines, thank you.
Well, something to look into for Excel is the Microsoft Office Interop libraries. They're free and integrate directly with the Office tools.
http://msdn.microsoft.com/en-us/library/microsoft.office.interop.excel.aspx
As for your database situation, is there any problem with just coding the server calls and statements into different classes, then setting them on a one-hour timer to refresh and collect new data?
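If it helps, here is a minimal sketch of that timer idea; the Collect... and BuildReports methods are hypothetical stubs standing in for your MySQL, Access and txt-export sources and the reporting step:

// Minimal sketch of the hourly refresh. The four private methods are stubs;
// their real implementations depend on your schemas and report formats.
using System;
using System.Timers;

class HourlyCollector
{
    private readonly Timer _timer = new Timer(TimeSpan.FromHours(1).TotalMilliseconds);

    public void Start()
    {
        _timer.Elapsed += (s, e) => RunOnce();
        _timer.AutoReset = true;
        _timer.Start();
        RunOnce();                 // also run immediately at startup
    }

    private void RunOnce()
    {
        CollectFromMySql();        // e.g. MySqlConnection + INSERT into the central DB
        CollectFromAccess();       // e.g. OleDbConnection to the .mdb file
        CollectFromTextExport();   // parse the hourly txt dump
        BuildReports();            // write the Excel/PDF output
    }

    private void CollectFromMySql() { }
    private void CollectFromAccess() { }
    private void CollectFromTextExport() { }
    private void BuildReports() { }
}

A scheduled console application or a Windows service would be a natural host for something like this.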
We would like to give users of our system the opportunity to drag some of the data from a database into Excel. (Only reading data, no chance of writing any data back to the DB). The users do not have direct access to the database, so we would have some authentication for them in place. (Firstly to connect to the database, but also secondly to use the security settings of our system, so that user 1 is only allowed to see certain tables.)
I was instructed to begin writing a C# addin for this, but a colleague was instructed to write a VBA macro.
I was thinking of using the Entity Framework to access the data, but I haven't worked with it before. I don't know what they would be using from within the macro, but the macro-manager thinks that I will be killing the network with the heavy data transfer. He also doesn't like the idea that the users have to install the add-in on their computers. However, I have a vague uneasiness regarding macros and the notion that they're not very secure. How safe are macros? Are they tamper-proof, or could a knowledgeable user change the code?
I would like to know: what are the pros and cons of each approach, and what is the general feeling of people with more experience and knowledge than myself?
With particular regard to matters such as:
Information Security (Some tables should not be accessed.)
Network traffic
Ease of maintenance and future modifications
Any other relevant concern that I've missed
Kind regards,
What I would do in a situation like this is:
- Create views on the database and assign them to a schema. Don't restrict the rows; just specify the columns you want them to see and let them filter the data in Excel (this assumes the amount of data returned is manageable).
- Create an Active Directory group and give its members read access to that schema.
- Use Excel -> Data -> Connections (it's in Excel 2010, not sure about 2008) to connect worksheets to that view.
They can mess away with the data in Excel, but it can't be written back to the database. And you can restrict what tables/columns they can see, and do the joins for lookup tables in the view so they don't see the database IDs.
I have a system (using Entity Framework) that is deployed in various production systems and also on a quality control system. My problem is that data entry is often done on only one of those instances of my system (different databases).
I want to find the best way to transfer my data from one database to another. IDs can change, as long as the relations between my objects are maintained. 98% of my data is in the DB; some of it is external files, which I can manage separately, manually.
Currently we use an XML structure as a transfer file. The file is then imported into the destination system, and hand-written code imports the entities and re-creates the data.
I'm looking for a more generic way to do this, with less code. Since all my data is stored in entities, couldn't I simply create a big List, throw all my objects in there, serialize it in some manner into an external file, and finally import all the entities generically into my destination system? (I'll probably have to be careful to maintain relation IDs, but that should be OK...)
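To make it concrete, the naive version I am picturing is roughly the sketch below. Customer and Order are just placeholder types; my real EF entities have navigation properties, so a graph with circular references would probably need DataContractSerializer with reference tracking rather than XmlSerializer.

// Naive sketch of the "one big list" idea. Customer and Order are placeholders
// for the real entity types; circular references would need a different serializer.
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Customer { public int Id { get; set; } public string Name { get; set; } }
public class Order { public int Id { get; set; } public int CustomerId { get; set; } }

// One container holding everything that has to move between systems.
public class ExportBundle
{
    public List<Customer> Customers { get; set; } = new List<Customer>();
    public List<Order> Orders { get; set; } = new List<Order>();
}

public static class Exporter
{
    public static void Save(ExportBundle bundle, string path)
    {
        var serializer = new XmlSerializer(typeof(ExportBundle));
        using (var stream = File.Create(path))
            serializer.Serialize(stream, bundle);
    }

    public static ExportBundle Load(string path)
    {
        var serializer = new XmlSerializer(typeof(ExportBundle));
        using (var stream = File.OpenRead(path))
            return (ExportBundle)serializer.Deserialize(stream);
    }
}

On import the destination database would generate new IDs, so the old key values would only be used to rewire the relations before saving.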
Anyway, I'm wondering if anyone has smart approaches; I'm pretty sure I'm not the first with a similar problem.
Thanks!
You need to get some process around this. If all environments contain the same data (unlikely), you can replicate; it is the most automatic option. But a QA environment should not update production, so you have to really think this through.
If semi-automated is okay, there are tools out there you can use from a variety of vendors. I use Red Gate tools, personally, but others are also fine.
Can you set up a more automated push with EF? Sure, but the amount of time you spend is really not worth it.
In my opinion you can check some of the following approaches:
1) Use SQL Compare or SQL Data Compare. Those tools are from Red Gate and can be found here.
2) Regular backups and restores of the databases. You could, if it is an option, regularly back up your most up-to-date database and restore it on the destination systems. I have no experience automating this, but here is a link to do that through .net (there is also a rough sketch after this list).
3) You could always give it a go creating a version control system of your own. I would picture one such system selecting all records from a certain table (or all of them), deleting all records in the target database and inserting them. This seems pretty complex though, as you have to worry about relationships, data dependencies, etc.
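For option 2, a bare-bones sketch of kicking off the backup from .NET could look like the following; the connection string, database name and backup path are placeholders:

// Runs a full backup by sending plain T-SQL. Database name and path are
// placeholders; the restore on the destination is the mirror-image RESTORE statement.
using System.Data.SqlClient;

class BackupRunner
{
    public static void BackupDatabase(string connectionString)
    {
        const string sql =
            "BACKUP DATABASE [SourceDb] " +
            @"TO DISK = 'D:\Backups\SourceDb.bak' WITH INIT";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0;   // backups can easily exceed the 30-second default
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

Both the backup and the corresponding RESTORE ... WITH REPLACE on the destination can be wired into a scheduled task once they work interactively.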
Hope this helps in some way.
Regards
If for some reason you are not satisfied with the existing tools, maybe you'll want to take a look at the Sync Framework and implement this functionality yourself for your particular databases.
Given what you described, pushing data from one SQL Server to another for demo purposes, you should consider SQL Server Integration Services.
If you've got a simple scenario where you just move the data and objects from one DB to the next, you can use the built-in wizards. If you need to do custom stuff you can build complex workflows using C# and SQL (tools you already know). Note: most of what you're going to want comes with Standard Edition, so if you're using Express this is less interesting.
The story for Red Gate products is more compelling when you don't have SQL Server (so you have to go out and buy something) and when you are interested in finding out what the changes are between DBs (like viewing code changes in a .cs file in a source control product).
I have developed a network application that has been in use in my company for the last few years.
At the start it managed information about users, rights, etc.
Over time it grew with other functionality, to the point that I have tables with, let's say, 10-20 columns and 20,000-40,000 records.
I keep hearing that Access is not good for multi-user environments.
Second thing is the fact that when I try to read some records from the table over the network, the whole table has to be pulled to the client.
It happens because there is no database engine on the server side and data filtering is done on the client side.
I would migrate this project to SQL Server, but unfortunately that cannot be done in this case.
I was wondering if there is a more reliable solution for me than an Access database while still staying with a single-file database system.
We have quite a huge system using dBase IV.
As far as I know it is a fully multi-user database system.
Maybe it will be good to use it instead of Access?
What makes me not sure is the fact that dBase IV is much older than Access 2000.
I am not sure if it would be a good solution.
Maybe there are some other options?
If you're having problems with your Jet/ACE back end with the number of records you mentioned, it sounds like you have schema design problems or an inefficiently-structured application.
As I said in my comment to your original question, Jet does not retrieve full tables. This is a myth propagated by people who don't have a clue what they are talking about. If you have appropriate indexes, only the index pages will be requested from the file server (and then, only those pages needed to satisfy your criteria), and then the only data pages retrieved will be those that have the records that match the criteria in your request.
So, you should look at your indexing if you're seeing full table scans.
You don't mention your user population. If it's over 25 or so, you probably would benefit from upsizing your back end, especially if you're already comfortable with SQL Server.
But the problem you described for such tiny tables indicates a design error somewhere, either in your schema or in your application.
FWIW, I've had Access apps with Jet back ends with 100s of thousands of records in multiple tables, used by a dozen simultaneous users adding and updating records, and response time retrieving individual records and small data sets was nearly instantaneous (except for a few complex operations like checking newly entered records for duplication against existing data -- that's slower because it uses lots of LIKE comparisons and evaluation of expressions for comparison). What you're experiencing, while not an Access front end, is not commensurate with my long experience with Jet databases of all sizes.
You may wish to read this informative thread about Access: Is MS Access (JET) suitable for multiuser access?
For the record this answer is copied/edited from another question I answered.
Aristo,
You CAN use Access as your centralized data store.
It is simply NOT TRUE that Access will choke in multi-user scenarios--at least up to 15-20 users.
It IS true that you need a good backup strategy with the Access data file. But last I checked you need a good backup strategy with SQL Server, too. (With the very important caveat that SQL Server can do "hot" backups but not Access.)
So...you CAN use Access as your data store. Then, if you can get beyond the company politics controlling your network, perhaps you could begin moving toward upgrading your current application to use SQL Server.
I recently answered another question on how to split your database into two files. Here is the link.
Creating the Front End MDE
Splitting your database file into a front end and a back end is sort of a key to making it more performant. (Assume, as David Fenton mentioned, that you have a reasonably good design.)
If I may mention one last thing...it is ridiculous that your company won't give you other deployment options. Surely there is someone there with some power who you can get to "imagine life without your application." I am just wondering if you have more power than you might realize.
Seth
The problems you experience with an Access Database shared amongst your users will be the same with any file based database.
A read will pull a lot of data into memory, and writes are guarded with some type of file lock. In your environment it sounds like you are going to have to make the best of what you have.
"Second thing is the fact that when I try to read some records from the table over the network, the whole table has to be pulled to the client. "
Actually, no. This is a common misstatement spread by folks who do not understand the nature of how Jet, the database engine inside Access, works. Pulling down all the records, or an excessive number of records, happens because you don't have all the fields used in the selection criteria or sorting covered by an index. We've also found that indexing yes/no (Boolean) fields can make a huge difference in some queries.
What really happens is that Jet brings down only the index pages and data pages that are required. While this is more data than a server database engine would return, it is not the entire table.
I also have clients with 600K and 800K records in various tables and performance is just fine.
We have an Access database application that is used pretty heavily. I have had 23 users on it at the same time without any issues. As long as they don't access the same record, I don't have any problems.
I do have a couple of forms that are used and updated by several different departments. For instance, I have a quoting form that contains 13 different tabs and 10-20 fields on each tab. Users are typically in a single record for minutes, editing and looking for information. To avoid any write conflicts I call the function below any time a field is changed. As long as it is not a new record being entered, it saves the record.
Function funSaveTheRecord()
    If ([chkNewRecord].Value = False And Me.Dirty) Then
        'To save the record, turn off the form's Dirty property
        Me.Dirty = False
    End If
End Function
The way I have everything set up is as follows:
PDC.mdb <-- Front end, stored on the user's machine. Every user has their own copy. Links to tables found in PDC_be.mdb. Contains all forms, reports, queries, macros, and modules. I created a form that I can use to toggle the shift-key bypass on/off; only I have access to it.
PDC_be.mdb <-- Back end, stored on the server. Contains all data. The only form and VBA it contains is to toggle the shift-key bypass on/off; only I have access to it.
Secured.mdw <-- Security file, stored on the server.
Then I put a shortcut on the users desktop that ties the security file to the front end and also provides their login credentials.
This database has been running without error or corruption for over 6 years.
Access is not a flat file database system! It's a relational database system.
You can't use SQL Server Express?
Otherwise, MySQL is a good database.
But if you can't install ANYTHING (you should get into those politics sooner rather than later -- or it WILL be later), just use your existing database system.
Basically with Access, it cannot handle more than 5 people connected at the same time, or it will corrupt on you.
I am working on a business layer for a complex web application and am temporarily using the Dynamic Data Site functionality to allow data to be entered into the many tables that I need to maintain. I don't want to spend too much time on this DDS since the business layer needs to be finished first. Once the business layer is done, it gets shipped to someone else to add a better user interface.
However, while the DDS offers a lot of functionality in a very easy way, I just want to extend it with an "Export to XML" button or link. (And I'll probably add an "Export to Excel" button later.)
So, has anyone done something like this already? What would be the easiest way to implement this in .NET, without rewriting the DDS?
(I use an Entity model for the database connection and much of the business layer is built upon this entity model. Once the business layer is finished, the real GUI interface will be developed for this web application but for now I just need a good way to input/output this data.)
I have no problems converting an entity set to XML. That's the easy part. My problem lies in expanding "ListDetails.aspx" with an additional button which the user can click. Once clicked, it should export the dataset to XML. To make it interesting, if the user has set one or more filters, it should only export those filtered records.
I think I'll have to look into the "GridDataSource" object that's on this page and experiment with it. Will it return the whole table or just the filtered dataset? Or just the records that are on the current page?
Now, with export, I just want a dump of the dataset to XML. Basically, what you see should end up in the final XML. If I have access to the filtered dataset then creating the XML would be easy. (And creating the Excel sheet on top of that is a piece of cake too.) In general, the export is just used to help develop the business layer of a project I'm working on. Most of the code is business logic that will be used in other (web/desktop) client applications but while the project is still in progress, the DDS is needed for easier data entry for the project. Once it's finished (a gazillion years from now, I guess) then the DDS won't be used anymore. Nor would we use the XML exports or the export sheets. But for now, those exports are useful to evaluate the data. (Since I still have to develop the more complex analysis tools.)
This is fairly straightforward; you've got to address a few issues:
Providing a means to trigger the export
Generating the XML
Making the XML available (as a link) for download - assuming that that's what you want to do.
There's a slightly less straightforward alternative which is to create a service to generate and return the XML.
In terms of the first - there's nothing to stop you editing either the master page or the default page to add your own functionality, i.e. a button or a link to an XML generation page.
In terms of the second - LINQ makes it almost trivial to generate XML from your Entity model.
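For example, something along these lines; the Record type and its properties are made up for illustration, and "filtered" stands for whatever query the page's data source hands you (the filtered set mentioned above):

// Sketch of turning a filtered set of entities into XML with LINQ to XML.
// Record, Name and Value are illustrative names, not the real entity model.
using System.Linq;
using System.Xml.Linq;

public class Record { public int Id { get; set; } public string Name { get; set; } public decimal Value { get; set; } }

public static class XmlExport
{
    public static XDocument ToXml(IQueryable<Record> filtered)
    {
        return new XDocument(
            new XElement("Records",
                filtered.AsEnumerable().Select(r =>
                    new XElement("Record",
                        new XAttribute("Id", r.Id),
                        new XElement("Name", r.Name),
                        new XElement("Value", r.Value)))));
    }
}

From the button's click handler you can then write the document to the response with an application/xml content type and a Content-Disposition header so it comes down as a file.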
Once you've got your XML you've got various options. The key here is that you can add your own pages to the site if you want - the magic in Dynamic Data is simply a starting point, not the final product (although if it does all you need then you can walk away with a smile on your face).
I appreciate that these are generic answers, but it's a fairly generic question and the details of implementation would be better addressed by more specific questions.
In terms of specifics: I have a Dynamic Data site which needs to generate XML. The first iteration was simply a button on the default page that saved a file to disk (one file name, one file format: click, generate, save, done). The reason for the XML was as the source data for another site, so I then added a WCF service which exposes the same XML. Total time spent (less a bit for getting my head around WCF) was probably less than half a day - most of which was spent fiddling with the XML output.
Maybe you can do something with FileHelpers.