Database integration and report generator - C#

I have this situation; maybe it is too basic for this site, but I still hope to get some suggestions:
There are three different systems that I need to collect data from, all on different servers in a local network. One is based on a MySQL database to which I have complete access, the second is based on an MS Access database, and the third has a flat-file database whose data can only be accessed through txt exports from the application.
I need to collect the data into an independent database and create Excel and PDF reports.
I don't need charts; a nicely formatted Excel table should be just fine.
The data is updated each hour, so it should be collected and the report produced every hour.
Any suggestions about how to integrate the data, and which DBMS is best to use for this purpose?
What is the best option for creating Excel and PDF reports without having to buy any software?
I hope to get some guidelines, thank you.

Well, something to look into for Excel is the Microsoft Interop libraries. They're free and integrate directly with the Office tools (though they do require Office to be installed on the machine).
http://msdn.microsoft.com/en-us/library/microsoft.office.interop.excel.aspx
As for your database situation, is there any problem with just coding the server calls and statements into different classes, then setting it all on a one-hour timer to refresh and collect new data?
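A minimal sketch of that hourly loop, assuming a plain console host; the source-class names in the comments are hypothetical:

```csharp
using System;
using System.Timers;

class HourlyCollector
{
    static void Main()
    {
        var timer = new Timer(TimeSpan.FromHours(1).TotalMilliseconds);
        timer.Elapsed += (sender, args) => CollectAndReport();
        timer.AutoReset = true;
        timer.Start();

        CollectAndReport();     // run once immediately at startup
        Console.ReadLine();     // keep the process alive
    }

    static void CollectAndReport()
    {
        // One class per source behind a common interface, e.g. (hypothetical names):
        //   new MySqlSource().Pull();
        //   new AccessSource().Pull();
        //   new TextExportSource().Pull();
        // then write the consolidated rows to the central DB and emit the reports.
    }
}
```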

Related

Integrating Excel as part of database management

I realize this is not a very specific question and that it is open to different opinions, but answering it requires technical expertise that I do not currently have.
My team of 10 people continually maintains and updates a SQL database of approximately 1 million records and 100 fields. We are a market research outfit - most of the data points are modeled assumptions that we need to update frequently as new information becomes available.
We use a .NET web interface to update/insert/delete. However, this interface is clunky and slow. For example:
We cannot copy/paste in a row of numbers; e.g., budgeted amounts across several time periods;
We cannot copy/paste similar dimension values across multiple records;
We cannot create new records and fill their contents quickly;
We cannot assess the effect of changes on the dataset, as we could with Excel.
Most of us are more inclined to navigate data, change values and test our assumptions in Excel, and then go back and make changes in the .NET interface.
My question is:
is it possible for multiple users to simultaneously use Excel as a content management system for a custom SQL database? I would design a workbook with tabs that would be specifically designed to upload to the database, and on other tabs analysts could quickly and easily perform their calculations, copy/paste, etc.
If Excel is not ideal, are there other CMS solutions out there that I could adapt to my database? I was looking at something like this, but I have no idea if it is ideal: http://sqlspreads.com/
If the above is not realistic, are there ways that a .NET CMS interface can be optimized to 1) run queries faster, 2) allow copy/paste, or 3) be otherwise improved?
Having multiple people working on one Excel sheet won't work. What you want to do is create an Excel template that is the same for everyone, and then have everyone entering data on their own copy of the template. Write a script that takes the template and uploads it to the database table. You can have a template for each table/view, and then join tables or views to get a bigger picture of all the data.
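A rough sketch of that upload script in C#, assuming the ACE OLEDB provider is installed; the file path, tab name, connection string, and table name are all placeholders:

```csharp
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class TemplateUploader
{
    static void Main()
    {
        // Read the "Upload" tab of one analyst's template into a DataTable.
        var table = new DataTable();
        var excelConnStr =
            @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\templates\analyst1.xlsx;" +
            "Extended Properties=\"Excel 12.0 Xml;HDR=YES\"";
        using (var conn = new OleDbConnection(excelConnStr))
        using (var adapter = new OleDbDataAdapter("SELECT * FROM [Upload$]", conn))
            adapter.Fill(table);

        // Bulk-insert the rows into the target table.
        using (var sql = new SqlConnection("Server=.;Database=Research;Integrated Security=true"))
        {
            sql.Open();
            using (var bulk = new SqlBulkCopy(sql) { DestinationTableName = "dbo.Assumptions" })
                bulk.WriteToServer(table);   // columns must line up, or add ColumnMappings
        }
    }
}
```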
It's possible to do something like that in Excel, but it's not that easy. I created such a solution for one of my customers: 400 to 500 users download data from a MS SQL Server into Excel, can change it there, and then upload it back to the server. This works for plain line-by-line entry as well as for more complex reporting decks. But as I said: building such a solution isn't a quick job.
Personally, I would try to improve the .NET frontend, because if it is that slow then I would guess you are doing something wrong there. At the end of the day it doesn't make a great difference what kind of frontend you use; you will always face similar problems.

C# service to copy data between two SQL Servers

I have two SQL Server instances installed on my computer: SQL Server 2008 Express, and the SQL Server 2008 instance that comes with the specific software we are using.
I need to make a service that runs all the time and, at a specific time each day, inserts into the SQL 2008 Express instance the records that don't yet exist there, copied over from the SQL 2008 instance. Can you suggest a way of doing this?
Currently the best thing I have is making a local copy in an Excel file, but that would result in 365 Excel files per year, which I don't think is a good idea :)
P.S. Sorry if my English is bad :)
You don't have to hand-craft your own software for that. There are third-party tools like OpenDBDiff or Red Gate's SQL Data Compare to do it. These tools generate the differential SQL that you can apply to your target database.
I'm confused by the mention of Excel. What does Excel have to do with moving data from one SQL database to another?
The short answer is, if you need a C# service, then write a C# service that copies the data directly from one database to the other. The problem that you are trying to solve is not very clear.
Having said all that, and with my limited understanding of the problem, it sounds like what you need is a SQL job that is scheduled to run once a day that copies the data from one server to the other. Since it sounds like they are on separate instances, you'll just need to set up a linked server on either the source or destination database and either push or pull the data into the correct table(s).
EDIT:
OK, so if a Windows service is a requirement, that is perfectly acceptable. But, like I mentioned, you should forget about Excel. You wouldn't want to go SQL -> Excel -> SQL if you have no other reason for the data to exist in Excel.
Here is some information on creating a Windows service:
Easiest language for creating a Windows service
Here is a simple tutorial on accessing SQL in C#: http://www.codeproject.com/Articles/4416/Beginners-guide-to-accessing-SQL-Server-through-C
If you want a more formal solution (read: data access layer), I'd point you toward Entity Framework. The complexity of the project will probably be the driving factor on whether you just want to do SQL statements in your code vs. going with a full blown DAL.
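For completeness, a bare-bones sketch of the direct copy with plain ADO.NET; every server, database, table, and column name below is a placeholder:

```csharp
using System.Data.SqlClient;

class RecordCopier
{
    // Copies rows from the vendor instance that don't yet exist in the Express instance.
    public static void CopyMissingRows()
    {
        using (var src = new SqlConnection(@"Server=.\VENDOR;Database=AppDb;Integrated Security=true"))
        using (var dst = new SqlConnection(@"Server=.\SQLEXPRESS;Database=CopyDb;Integrated Security=true"))
        {
            src.Open();
            dst.Open();

            using (var cmd = new SqlCommand("SELECT Id, Payload FROM dbo.Records", src))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Row-by-row is fine for small volumes; switch to SqlBulkCopy
                    // into a staging table if the row counts grow.
                    var insert = new SqlCommand(
                        "IF NOT EXISTS (SELECT 1 FROM dbo.Records WHERE Id = @id) " +
                        "INSERT INTO dbo.Records (Id, Payload) VALUES (@id, @payload)", dst);
                    insert.Parameters.AddWithValue("@id", reader["Id"]);
                    insert.Parameters.AddWithValue("@payload", reader["Payload"]);
                    insert.ExecuteNonQuery();
                }
            }
        }
    }
}
```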

C# Add-in vs VBA Macro for Excel functionality

We would like to give users of our system the opportunity to drag some of the data from a database into Excel. (Only reading data, no chance of writing any data back to the DB). The users do not have direct access to the database, so we would have some authentication for them in place. (Firstly to connect to the database, but also secondly to use the security settings of our system, so that user 1 is only allowed to see certain tables.)
I was instructed to begin writing a C# add-in for this, while a colleague was instructed to write a VBA macro.
I was thinking of using the Entity Framework to access the data, but I haven't worked with it before. I don't know what they would use from within the macro, but the macro manager thinks that I will be killing the network with the heavy data transfer. He also doesn't like the idea that the users have to install the add-in on their computers. However, I have a vague uneasiness regarding macros and the notion that they're not very secure. How safe are macros? Are they tamper-proof, or could a knowledgeable user change the code?
I would like to know: what are the pros and cons of each approach, and what is the general feeling of people with more experience and knowledge than myself?
With particular regard to matters such as:
Information Security (Some tables should not be accessed.)
Network traffic
Ease of maintenance and future modifications
Any other relevant concern that I've missed
Kind regards,
What I would do in a situation like this is:
-Create views on the database and assign them to a schema. Don't restrict the rows; just specify the columns you want them to see, and let them filter the data in Excel (assuming a massive amount of data is returned).
-Create an Active Directory group and give its members read access to that schema.
-Use Excel -> Data -> Connections (it's there in Excel 2010; not sure about earlier versions) to connect worksheets to that view.
They can mess away with the data in Excel, but it can't be written back to the database. And you can restrict which tables/columns they can see, and do the joins for lookup tables in the view so they don't see the database IDs.

Building aggregation/summary reporting database against Oracle, Sql Server and Mongodb

This is a design question; I've not done anything similar in the past, and it's a good challenge. I have a server which supports Oracle, SQL Server and MongoDB; you can select which one to use at startup. Essentially, each server stores XML packets, which are split down into their component elements.
I need to build a reporting database which provides aggregation and summary data for the dashboard reports, but the problem (opportunity) is MongoDB. I could easily use SQL Server Reporting Services to build the report DB, same with Oracle; or I could use something like Crystal, which works against both; or even create a DB and set a bundle of triggers on each table, with some PL/SQL logic on Oracle or T-SQL on SQL Server, to create the reporting DB on the fly. And that would take care of reporting. But then there is MongoDB: little or no reporting infrastructure, certainly not outside BIRT or Jaspersoft (Java). I'm using C#.
I was thinking of having a C# server component which intercepts incoming XML packets, extracts the appropriate element field data, and writes it into a reporting DB, perhaps something like SQLite (which may be too small). If it was running on SQL Server or Oracle, then I would use that DB instance to support the reporting DB.
On any database, I'm really only supporting up to 6 months of data. The data will be classified as 24 hours, 1 week, 1 month, 3 months, or 6 months, with a progressive archive onto a compressed backup DB.
But this is where it gets hazy. Take, for instance, SQLite as the reporting DB and MongoDB as the XML database. If a user wants to drill down, would I have to provide some kind of dynamic update that pulls the additional reporting info from MongoDB, or could it all be done at the server component stage, when the data is written into SQLite?
Or is this all nonsense?
Any ideas or thoughts greatly appreciated.
Bob.
In terms of getting data out of MongoDB for reporting, you can write your own code on top of:
1) MongoDB queries
2) the aggregation framework
3) in-database map/reduce, or
4) the Hadoop connector.
You can use the C# driver for any of these. Apart from that, as you mentioned, there is the Jaspersoft integration, or Pentaho (http://wiki.pentaho.com/display/BAD/Create+a+Report+with+MongoDB)
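As an illustration of option 2 with the official C# driver, a hedged sketch; the database, collection, and field names are invented for the example:

```csharp
using System;
using MongoDB.Bson;
using MongoDB.Driver;

class MongoSummary
{
    static void Main()
    {
        var client = new MongoClient("mongodb://localhost:27017");
        var packets = client.GetDatabase("packetstore")
                            .GetCollection<BsonDocument>("elements");

        // Count packets per element type over the last 24 hours.
        var pipeline = new[]
        {
            new BsonDocument("$match", new BsonDocument("receivedAt",
                new BsonDocument("$gte", DateTime.UtcNow.AddHours(-24)))),
            new BsonDocument("$group", new BsonDocument
            {
                { "_id", "$elementType" },
                { "count", new BsonDocument("$sum", 1) }
            })
        };

        foreach (var doc in packets.Aggregate<BsonDocument>(pipeline).ToList())
            Console.WriteLine(doc);
    }
}
```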
I think Microsoft's BizTalk Server best suits your need. You can use BizTalk's pipeline component to process the incoming messages (simple property promotions, transformations, etc.) and BizTalk Orchestrations for the actual processing of the data. For aggregation and reporting you can use BizTalk's Business Activity Monitoring (BAM): it supports real-time aggregation of data, puts the results into your database, and has a BAM portal from which you can see all the stored and aggregated data. In case you want your own style of reports, you can use Microsoft's Report Builder 3 and deploy your reports using SSRS.
Have a look at Nucleon BI Studio. You can get a fully featured free 30-day trial, and the full version is $250. I've used it in the past; it's not bad, and a fraction of what it would cost to develop yourself.
I am not associated with the company in any way.
Perhaps I don't understand your question entirely, but I will give it a shot. First, your question, summarized:
You want to generate reports based on different types of datastores: sql this, sql that, or a document database. The current options you feel you have are the built-in reporting facilities of the various stores.
You have various points available for getting the data: you can intercept it as it comes into the system, or derive it from your databases. Whether you can make a dynamic report with drill-down really depends on the type of reporting tool you want to use. You will simply need to build a facade that hides the datastore, either by intercepting the packets and storing them in a database of your choice, or by actually building the reports from your chosen datastore through that same abstraction/facade. You can even think of a hybrid solution where you initialize from the datastore, such as Mongo, when your reporting component starts up, and then update dynamically based on incoming packets.
It all depends on where you want to go.
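To make the facade idea concrete, a minimal C# sketch; every name here is hypothetical and only shows the shape of the abstraction:

```csharp
using System;
using System.Collections.Generic;

// The reporting layer codes against this interface only, so the backing
// store (SQL Server, Oracle, MongoDB, or a SQLite summary DB) can vary.
public interface IReportingStore
{
    void RecordPacket(string xml);                                  // intercept path
    IEnumerable<SummaryRow> Summarize(DateTime from, DateTime to);  // drill-down path
}

public class SummaryRow
{
    public string Category { get; set; }
    public int Count { get; set; }
}
```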

Which one is best: OLEDB, Excel object, or database?

I need to read data from an Excel 2007 file. Which of these is the best way to do that:
Using the OLEDB provider
The Excel Interop object
Dumping the Excel data to a database and using a procedure
Kindly guide my choice.
Here are my opinions:
1. Using OLEDB Provider
will only suit your needs if you have simple, uniformly structured tables. It won't help you much, for example, if you have to extract any cell formatting information. The Jet engine's buggy row-type-guessing algorithm may make this approach almost unusable, but if the data type can be uniquely identified from the first few rows of each table, it may be enough. Pro: it is fast, and it works even on machines where MS Excel is not installed.
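A minimal read along these lines might look like this; the ACE provider, file path, and sheet name are assumptions for the example:

```csharp
using System;
using System.Data;
using System.Data.OleDb;

class OleDbReadDemo
{
    static void Main()
    {
        // "Excel 12.0 Xml" targets .xlsx; HDR=YES treats row 1 as headers,
        // IMEX=1 forces mixed-type columns to be read as text.
        var connStr =
            @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\report.xlsx;" +
            "Extended Properties=\"Excel 12.0 Xml;HDR=YES;IMEX=1\"";

        var table = new DataTable();
        using (var conn = new OleDbConnection(connStr))
        using (var adapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", conn))
            adapter.Fill(table);

        foreach (DataRow row in table.Rows)
            Console.WriteLine(row[0]);
    }
}
```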
2. Excel Interop Object
may be very slow, especially compared to option 1, and you need MS Excel to be installed. But you get complete access to Excel's object model: you can extract almost any information stored in your Excel file (formatting, colors, frames, etc.), and your sheets can be as complex in structure as you want.
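For comparison, the same read via interop; the file path is again a placeholder, and note the cleanup so the Excel process doesn't linger:

```csharp
using System;
using Excel = Microsoft.Office.Interop.Excel;

class InteropReadDemo
{
    static void Main()
    {
        var app = new Excel.Application();
        try
        {
            var workbook = app.Workbooks.Open(@"C:\data\report.xlsx", ReadOnly: true);
            var sheet = (Excel.Worksheet)workbook.Sheets[1];
            var cell = (Excel.Range)sheet.Cells[1, 1];

            // Full object model: values *and* formatting are available.
            Console.WriteLine(cell.Value2);
            Console.WriteLine(cell.Interior.Color);

            workbook.Close(false);
        }
        finally
        {
            app.Quit();   // otherwise EXCEL.EXE keeps running in the background
        }
    }
}
```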
3. Dump the Excel data to a database and use a procedure
depends on what kind of database dump you have in mind, and whether you have a database system at hand. If you are thinking of MS Access, this will internally use the Jet engine again, with the same pros and cons as approach 1 above.
Other options:
4. Write an Excel VBA macro to read the data you need and write it to a text file, then read the text file from a C# program. Pro: much faster than approach 2, with the same flexibility in accessing meta information. Con: you have to split your program into a VBA part and a C# part, and you need MS Excel on your machine.
5. Use a third-party library/component for the task. There are plenty of libraries for the job, free and commercial; just ask Google, or search here on SO. Many of them don't require MS Excel on the machine, and they are typically the best option if you are going to extract the data as part of a server process.
Options 1 and 2 are almost always an exercise in pain, no matter how you ask the question.
If you can use SSIS to move the data into a database, and if that suits your needs because of other requirements, that's also a good option.
But the preferred option is usually to use Office Open XML for Excel 2007 and later. That has none of the COM headaches you get with option 2, and none of the row-type-guessing issues you have with option 1.
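A short sketch with the Open XML SDK (the DocumentFormat.OpenXml package); the file path is a placeholder, and this deliberately reads only the first sheet:

```csharp
using System;
using System.Linq;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

class OpenXmlReadDemo
{
    static void Main()
    {
        using (var doc = SpreadsheetDocument.Open(@"C:\data\report.xlsx", false))
        {
            var wbPart = doc.WorkbookPart;
            var sheet = wbPart.Workbook.Descendants<Sheet>().First();
            var wsPart = (WorksheetPart)wbPart.GetPartById(sheet.Id.Value);
            var shared = wbPart.SharedStringTablePart?.SharedStringTable;

            foreach (var row in wsPart.Worksheet.Descendants<Row>())
            {
                foreach (var cell in row.Elements<Cell>())
                {
                    var text = cell.CellValue?.Text;
                    // String cells hold an index into the shared-string table.
                    if (cell.DataType != null && cell.DataType.Value == CellValues.SharedString && shared != null)
                        text = shared.Elements<SharedStringItem>()
                                     .ElementAt(int.Parse(text)).InnerText;
                    Console.Write(text + "\t");
                }
                Console.WriteLine();
            }
        }
    }
}
```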
With a more carefully crafted question, you can get a far better answer, though.
