Web app for synchronizing data between two systems - c#

What I want to do is this: create an ASP.NET MVC web app that periodically looks for new or updated data from an external API (e.g. a new order created or modified in an inventory management system), and then sends this data to another system. It's a similar sort of thing to this - I have tried using it, but I've found it's a bit too limited for some of the things I wanted to do.
I am currently trying to figure out the best way to go about this. I imagine that it will work like this:
Periodically (using a timer?) pull the data from one system's API, and cache the id and DateUpdated fields (in a local database, I assume) for each item.
If any items have changed compared with the cached data, convert them to the appropriate format if needed and post them to the second API.
I would be connecting to a few different third party systems, which for the most part do not support web hooks etc, so I assume polling and caching is the only option.
Am I on the right track? Haven't been able to find a lot of resources on this sort of thing, so would appreciate any advice.
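In code, I imagine the change-detection step looking something like this (the `Item` shape and field names here are just assumptions for illustration, based on the id and DateUpdated fields I mentioned):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical shape of an item pulled from the source API.
public class Item
{
    public int Id { get; set; }
    public DateTime DateUpdated { get; set; }
}

public static class ChangeDetector
{
    // Compare freshly fetched items against the cached (Id -> DateUpdated)
    // map and return only the items that are new or modified since the last
    // poll. The caller would then transform and post these to the second
    // API, and update the cache.
    public static List<Item> GetNewOrUpdated(
        IEnumerable<Item> fetched,
        IDictionary<int, DateTime> cache)
    {
        return fetched
            .Where(i => !cache.TryGetValue(i.Id, out var last) || i.DateUpdated > last)
            .ToList();
    }
}
```

The periodic part could be a `System.Threading.Timer`, though from what I've read, a scheduled console app or Windows service may be more reliable than a timer hosted inside IIS, because of app-pool recycling.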

ASP.NET show dynamic changes on all clients

I'm a beginner with ASP.NET and web applications in general.
For a project I have to interact with an engineering software package to read some data, and for this I have to use an ASP.NET project based on the .NET Framework 4.8.
For now I call these functions with buttons and display the data in GridViews. The problem is that I want to show the data on all clients, and the data should still be there when I refresh the page on one client.
To load some data into the GridView, I tested it using a function like this.
Load data to datagrid
The problem is I can't see these changes on other clients.
Is there a way to implement this?
Well, running some in-memory code for one user of course will not work for other users. You would probably be best to write a separate console application, place it on the server, and then schedule it to run every 5 minutes or whatever. That console or desktop program would then write out the data to a database table. Now, any and all web pages (and users) can have a grid display that queries against that database.
The other possible option (if for some strange reason you wanted to avoid a database to persist and store this information):
You could consider using SignalR. It is complex, but it allows you to push (send) out information to all web clients connected to your system. This involves some rather fancy footwork, and does require your web page to include some JavaScript. As a result, the simple database idea is less work, simpler, and does not require as many hand-stands.
But SignalR is what is used for, say, pushing out information to each client browser. You can get started here, since this is a vast topic well beyond that of a simple Q&A on SO.
https://learn.microsoft.com/en-us/aspnet/signalr/overview/getting-started/
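To give a rough idea of the shape of it, a minimal SignalR 2 setup (for .NET Framework, via the Microsoft.AspNet.SignalR NuGet package) looks something like this - note the hub name and the refreshGrid method are made-up names for illustration, not anything your project already has:

```csharp
using Microsoft.AspNet.SignalR;

// An empty hub is enough when the server only pushes to clients.
public class DataHub : Hub
{
}

public static class DataBroadcaster
{
    // Call this from whatever server-side code updates the data
    // (e.g. the scheduled console app writing to the database table).
    public static void NotifyClients()
    {
        var context = GlobalHost.ConnectionManager.GetHubContext<DataHub>();
        // "refreshGrid" is a JavaScript callback each page registers;
        // clients re-query the database when it fires.
        context.Clients.All.refreshGrid();
    }
}
```

Each page then includes the SignalR JavaScript client and registers `hub.client.refreshGrid = function () { /* reload the grid */ };` before starting the connection. As you can see, there are more moving parts than the plain database polling approach.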

Easy way to convert the MS Access database to Web application

As per requirement, we need to convert an existing MS Access database to a web application. Is there any easy way to do this? As of now, users insert data into the Access DB using Access forms. The users also wish to continue using the Access forms even after we create the new web application. That means a user should have the option to access the MS Access database through Access forms as well as through the web application.
Please guide me on a way to solve this issue.
Best Regards,
Ranish
You can use Office 365 and have somewhat of a web-based application.
https://blogs.office.com/en-us/2012/07/30/get-started-with-access-2013-web-apps/
Or, store Access in SharePoint, but your functionality will be quite limited. Keep in mind, no VBA will run on a web-based application.
The alternative is to use SQL Server Express, and ASP.NET, both of which are free from Microsoft. I'll tell you now, though, the learning curve will be quite steep if you have never used these technologies before. This combo, however, will give you the most control!
You can get the .NET framework from here.
https://www.microsoft.com/en-us/download/details.aspx?id=30653
You can get SQL Server Express from here.
https://www.microsoft.com/en-US/download/details.aspx?id=42299
Four years later, and according to this:
https://www.comparitech.com/net-admin/microsoft-access/
it is still a question for many. Access can be converted to a web app in almost no time. Access forms in particular are super easy to create with a library like Jam.py.
The process was discussed on Reddit in April 2021:
https://www.reddit.com/r/MSAccess/comments/mj4aya/moving_ms_access_to_web/
I have seen quite a few Access databases with more than 100 tables, all converted successfully to SQLite3. After inspecting the imported tables via the provided link, forms are automatically created. That leaves the Access reports and business logic untouched. Reports can be designed in LibreOffice as Jam.py templates, and business logic can be moved from VB to Python if there is a need to do so.
SQLite was selected as the default DB for the conversion since it is very portable. The converted app can then be moved to any DB that Jam.py supports via export/import.
Cheers
First of all, a database and a web application are not mutually exclusive.
Back to the original question: I have done multiple projects like that. A client started with a small Microsoft Access database and a couple of users; then they migrated to a web application when they got more traffic.
First, you convert the data from the MS Access database to SQL Server; an MS Access database is not meant to be accessed by multiple users simultaneously. Then you develop the web application, which uses SQL Server as the back-end database.
Right before you go live, you convert the data from MS Access to SQL Server one very last time. Then do not let them use the old MS Access database anymore.
Easy way to convert the MS Access database to Web application
Most of the time, whoever created the MS Access database was not a software engineer, so the tables are not normalized and have no relationships at all. I normally create a new, normalized database in SQL Server, then write a small program to convert the data from MS Access to the SQL database.
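A small conversion program like that can be surprisingly short. Here's a rough sketch (the connection strings, provider version and table names are placeholders you would adjust, and this assumes the normalized target table already exists in SQL Server):

```csharp
using System.Data.OleDb;
using System.Data.SqlClient;

class AccessToSqlCopier
{
    // Streams one table from an .accdb file into an existing SQL Server
    // table. For a real conversion you would map the denormalized Access
    // columns onto the new schema instead of copying them one-to-one.
    static void CopyTable(string accessFile, string sqlConnString, string table)
    {
        var accessConn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + accessFile;
        using (var source = new OleDbConnection(accessConn))
        using (var dest = new SqlConnection(sqlConnString))
        {
            source.Open();
            dest.Open();
            using (var cmd = new OleDbCommand("SELECT * FROM [" + table + "]", source))
            using (var reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dest) { DestinationTableName = table })
            {
                // SqlBulkCopy reads from the OleDb reader row by row,
                // so even large Access tables copy without loading
                // everything into memory.
                bulk.WriteToServer(reader);
            }
        }
    }
}
```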
There are generally two approaches, with more details covered in this article looking at ways to convert Microsoft Access to a web application.
Direct port means a basic migration whereby you port more or less verbatim the basic Access forms into a web portal, i.e. Microsoft Access to a browser-based version as-is, using a third-party tool. Some of these are quite mundane, in that they just allow you to run the Access application inside an internet browser (whoopee!); others can be quite drawn out and then limit how much you can change afterward, with the more complex cases requiring a consultant to help you migrate the system. It does help to know your user count, though: the higher it is, the less appealing a third-party porting service becomes, due to subscription-based pricing models.
Upsize - if your data structure is more involved or complex, an upsize using custom development, splitting the system across web and data tiers, might be worth it if:
You've got a special process or some secret sauce you're looking to keep.
Likely going to have a significant user count and want to avoid subscription
Inherently cynical or cautious, and want to handle your own architecture and security
Looking for a specific user experience
If you mean how to convert automatically, and you want to keep both Access and the web application (I don't recommend that; I would move everything to the web app), I would do the following:
1) Export your Access data to CSV/Excel.
2) Use a platform like DaDaBIK to import the CSV/Excel file and automatically create a web app based on that file, with the data stored on SQL Server, MySQL, PostgreSQL or SQLite.
3) Connect your Access front end to the SQL Server (or MySQL, ...) database created by DaDaBIK; from now on Access will only be used as a front end.
Now you have a web app created with DaDaBIK and your Access front end both working on the same DB. As I said, I would skip 3) and keep only the web app; this helps with handling data integrity when two users are accessing the same record.
Depending on how complex is your Access Application (e.g. complex validation rules or custom VB code you added), you could reach your goal without any coding or with some coding.

Retrieve huge data from rest api

I am calling a third-party REST API. This REST API returns more than 2,000 records, which takes a long time to retrieve. I want to show these records on an ASP.NET page.
The records display on the page successfully, but the problem is that it takes a long time to display/retrieve this information.
I call this API on a dropdown change, so the API may be called again and again - it is not a one-time call - and I can't cache this information, because the information on the server end may change.
So how do I reduce the download time, or is there any trick (paging is not supported)? Any suggestions?
Thanks in advance
Well, there's always a workaround for any given problem, even when it doesn't come to mind at first sight.
I had one situation like this, and this is what we did:
First, keep in mind that real-time consistency is just something that cannot be achieved without writing extensive code. For example, if a user requests data from your service and that data is then rendered on the page, the user could potentially be seeing out-of-date data if the source is updated at that precise time.
So we need to accept that the data will be consistent eventually.
Then consider the following: replicate the data into your own database, using an external service that runs periodically.
Expose this local data with paging support, and consume it from your client code.
As with any design decision, there are trade-offs that you should evaluate before making your final choice. These were the problems we had with a design like this:
Potentially the data will not be consistent when the user requests it. You can poll the server for data every xxx minutes, depending on how often it changes. (Note: usually we as developers are not happy with this kind of data inconsistency - we would like to see everything in real time - but the truth is that end users can, most of the time, live with this kind of latency, because that's the way they are used to working in real life. Consult your end users to gather their points of view about this. You can read more about this from Eric Evans in the Blue Book.)
You have to write extra code to maintain the data locally
Sadly, when you are dealing with a third-party service under these conditions, there are few options to choose from. I recommend you evaluate them all with your end users, explaining the trade-offs, to choose the solution that best fits your needs.
Alternatives:
Since I already said the data will not be consistent all the time, perhaps you could consider caching the call to the service =)
Usually the easiest way is the best way to face something like this; since you didn't mention it in your post, I will comment here: have you tried simply asking the third-party organization to allow paging? Get in touch with your third-party partner and talk about this.
If you can, evaluate another source (other third-party companies providing a similar service).
So how do I reduce the download time, or is there any trick (paging is not supported)? Any suggestions?
If the API doesn't support paging, there's not much you could do from a consumer perspective than probably caching the data to avoid the expensive HTTP call.
There's no miracle. There's no client.MaxDownloadTimeMs property that you could set to 5 for example and get your huge data in less than 5 milliseconds.
Your only option would be to cache the data for some amount of time. If the third-party API has a call you can make that is more generic than your dropdown change, you could call it before the page loads to get all possible dropdown values and store the results for a certain period of time. Then just return the appropriate data each time.
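A sketch of that "store the results for a certain period of time" idea, using a small hand-rolled time-based cache - in an actual ASP.NET app you would more likely use `System.Runtime.Caching.MemoryCache` or `HttpRuntime.Cache`, and the key and TTL here are just examples:

```csharp
using System;
using System.Collections.Concurrent;

// A minimal time-based cache for expensive API responses. Each entry
// remembers when it was stored; entries older than the TTL are refetched.
public class TimedCache<TKey, TValue>
{
    private readonly ConcurrentDictionary<TKey, Tuple<DateTime, TValue>> _entries =
        new ConcurrentDictionary<TKey, Tuple<DateTime, TValue>>();
    private readonly TimeSpan _ttl;

    public TimedCache(TimeSpan ttl) { _ttl = ttl; }

    // Returns the cached value if it is still fresh; otherwise calls
    // 'fetch' (which stands in for the expensive HTTP call), stores the
    // result, and returns it.
    public TValue GetOrFetch(TKey key, Func<TValue> fetch)
    {
        Tuple<DateTime, TValue> entry;
        if (_entries.TryGetValue(key, out entry) &&
            DateTime.UtcNow - entry.Item1 < _ttl)
        {
            return entry.Item2;
        }
        var value = fetch();
        _entries[key] = Tuple.Create(DateTime.UtcNow, value);
        return value;
    }
}
```

The TTL is the knob you agree on with your users: how stale is acceptable in exchange for a fast page.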
Also, look to see if the third-party API has any options for gzipping (compressing) the data before sending it down to you. I have created several APIs and have provided this type of option for large datasets.
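To illustrate why gzip helps here - large JSON arrays of similar records compress very well - this is a round trip with `GZipStream`. On the consuming side, an `HttpClientHandler` with `AutomaticDecompression = DecompressionMethods.GZip` will decompress responses transparently, so you usually only need code like this if you handle the bytes yourself:

```csharp
using System.IO;
using System.IO.Compression;

public static class GzipDemo
{
    public static byte[] Compress(byte[] data)
    {
        using (var output = new MemoryStream())
        {
            // Disposing the GZipStream flushes the final gzip footer.
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
                gzip.Write(data, 0, data.Length);
            return output.ToArray();
        }
    }

    public static byte[] Decompress(byte[] data)
    {
        using (var input = new MemoryStream(data))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            gzip.CopyTo(output);
            return output.ToArray();
        }
    }
}
```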

C# Sync two identical DataSets over a Web Service

What is the most effective way to sync two identically structured DataSets using a Web Service?
The design I use at the moment is simple, without a web service. All data is cached on the client, and it only updates data from the MySQL database if an update has been done; this is determined by a timestamp.
If possible I want to keep the same simple design, but add a Web Service in the middle, to allow easier access to the database over our limited VPN.
Best Regards, John
That's one heck of a question, but something I'm doing myself too. The easiest way, I guess, would be to add a "saved version" property. If it really is a simple design, then you could just rewrite only the DAL code to get things working with a web service. In fact, if the WSDL is done right, you may only need to make very minor adjustments (especially if the DB was previously designed using EF).
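To make the "saved version" / timestamp idea concrete, a last-write-wins merge over a timestamp column could look roughly like this (the `Record` type is a made-up stand-in for a row in your identically structured DataSets; over the web service, each side would only need to send rows newer than the last sync):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in for one row of the identically structured DataSets.
public class Record
{
    public int Id;
    public string Value;
    public DateTime Timestamp;
}

public static class SyncHelper
{
    // Last-write-wins merge of two identically structured sets: for every
    // Id present on either side, keep the row with the newest Timestamp.
    public static List<Record> Merge(IEnumerable<Record> local, IEnumerable<Record> remote)
    {
        return local.Concat(remote)
            .GroupBy(r => r.Id)
            .Select(g => g.OrderByDescending(r => r.Timestamp).First())
            .ToList();
    }
}
```

Note that last-write-wins silently discards the older of two conflicting edits; whether that is acceptable depends on your data.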
You say you want to sync two datasets simultaneously, but the question I guess is why? And which two? Do you have two web services, or are you wanting to sync data to both the local cache and the online web service (MSSQL db?) simultaneously?

Save Sharepoint Data outside of Lists

What is the best way to store your Data outside of SharePoint. I want to use the default Edit/View options in SharePoint (though I do plan on extending their functionality). I need to store the data outside of the SharePoint Lists as I am expecting a large record set(150,000 to start with).
I totally agree with GalacticJello; storing data outside of the regular SharePoint content database is at the moment (MOSS 2007) a complete and utter nightmare. MS provided an ExternalStorage provider base class for us to override, but there are major cons to using it:
writing and using your own custom ExternalStorage provider implementation is doable, but very difficult
ExternalStorage provider implementations only do just that: they make SharePoint store content in a different location than the regular content DB.
This means that you would need to write your own code to keep the external storage and the list items in sync, and I'm not even mentioning workflows and versioning.
last but not least (and IMHO the worst), creating and using a custom ExternalStorage provider is not web-app or site-collection targetable; its usage is farm-wide (and there are NO workarounds). So any site collection you create in that farm will have its doc libs use the external storage provider.
You can store the items in the list, the trick is to create efficient views that return paged data quickly back to the user.
Another option is to use folders to split up the data.
If you really need to store it outside of SharePoint, I would consider waiting for SharePoint 2010 and their "External Lists" feature, as there are a ton of pitfalls and things to consider if you want to mimic that functionality in SharePoint 2007 (been there, done that).
You could create a SQL Data source and use the Data Form Web part to connect it to an edit form etc.:
Data Forms and SQL Server, Part 1
Data Forms and SQL Server, Part 2
Personally, I prefer the level of control of a totally custom web part; with this you will not eventually run into limitations.
In this case it sounds like all the elements will be in the same list, otherwise SLAM (free tool on CodePlex) is a very neat data replication tool for related lists.
Based on the comments above it sure looks like a plain ol' ASP.NET page might be the best option, unless there is some good reason why SharePoint must be used.
