Reloading the model in the view - C#

I have a Google map that loads with markers and a route at the beginning (MVC in .NET 5).
The positions of the markers change every few seconds (they are saved on the server; the data comes from the mobile application).
I would like to download new data from the server every few seconds and update the view, i.e. refresh the data without reloading the page.
I did this by sending a GET request to the REST API from JavaScript every 5 seconds, but is it possible to do it differently?
Is there a way to update the model in the view so it receives new data periodically, and if so, how?
On the internet I find examples of partially reloading views, but I only want to refresh the data (coordinates and other values).

I highly suggest you look into Blazor and SignalR for this, if possible. Using WebSockets to listen for changes is going to be much more efficient and cleaner than making an AJAX call every 5 seconds.
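As a rough idea of what the client side could look like, here is a minimal sketch using the SignalR JavaScript client. It assumes a hub exposed at /markerHub that broadcasts a "MarkerMoved" message with an id and new coordinates; the hub URL, message name and the global markers lookup are all hypothetical, not something from the original question:

// Minimal sketch: listen for pushed position updates instead of polling.
// Hub URL, message name and the `markers` lookup are assumptions.
const connection = new signalR.HubConnectionBuilder()
    .withUrl("/markerHub")          // hypothetical hub endpoint
    .withAutomaticReconnect()
    .build();

// `markers` is assumed to map an asset id to its google.maps.Marker instance.
connection.on("MarkerMoved", (id, lat, lng) => {
    const marker = markers[id];
    if (marker) {
        marker.setPosition({ lat: lat, lng: lng });
    }
});

connection.start().catch(err => console.error(err));

The server would push "MarkerMoved" whenever the mobile application reports a new position, so the page only receives data when something has actually changed.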

Related

Refresh table when new record is inserted in ASP.NET Core MVC

I'm creating an app with ASP.NET Core MVC, Razor, EF Core, SQL Server as the database.
I have 3 views: List, Create, and Edit.
List - showing the data from the table (as a list)
Create - for creating a new record
Edit - to edit a selected record
These pages will be accessed by different users.
The question is: how can I update the list without refreshing the List page, so that when there's a new record the data table is updated with it, and the same when a record is edited?
I am trying the code from this sample --> https://github.com/dyatchenko/ServiceBrokerListener
But it seems that the page isn't updated.
My code:
public IActionResult Index()
{
    var connectionString = this.Configuration.GetConnectionString("SQLConnection");
    var listener = new SqlDependencyEx(connectionString, "ProductStatistic", "TrialPurpose", listenerType: SqlDependencyEx.NotificationTypes.Insert);
    listener.TableChanged += (o, e) => Console.WriteLine("Your table was changed!");
    listener.Start();
    var model = _db.TrialPurposeDataViews.FromSqlRaw("exec spTrialPurposeDataView").ToList();
    listener.Stop();
    return View(model);
}
Did I make a mistake in the code?
Any advice and help is really appreciated.
Thank you.
How can I update the list without refreshing the List page?
Do you mean that you want to see record changes in near-real-time in your web page, without requiring an entire postback / page refresh?
You need to use client-side programming for that, and the only client-side language that works in browsers is JavaScript. Microsoft is pushing something new called "Blazor" for client-side programming, but it's not really "native" as far as I can tell.
The functionality you're after is delivered using something called "AJAX", which is basically using JavaScript to make your web page behave like a thick-client application.
There are now many JavaScript plugins, libraries, frameworks and layers that do what you want. I recommend you buy a third-party tool rather than write your own. I've seen Telerik used successfully in the past.
It's not very well explained in the various ASP.NET MVC articles, but there is no way to do all that "standard" stuff using plain ASP.NET MVC alone.
If you do take the AJAX / Telerik route, keep in mind that you still need to build APIs in your controller. These basically return data, often in JSON format, which your client-side JavaScript uses to update the page in place rather than having to post back.
An option that I tried and gave up on was the free jqGrid library. Here's an example of how incredibly complicated it is to get seemingly simple things like grid updates running:
Real-time data in a grid - better method
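Whichever grid you end up with, the pattern underneath is the same: a controller action that returns JSON, and a bit of client-side JavaScript that re-queries it. A minimal polling sketch, assuming a hypothetical action at /api/trialpurposes and hypothetical table/field names:

// Re-query the server every 5 seconds and redraw the table body in place,
// instead of requiring a full page refresh. Endpoint and field names are assumptions.
function refreshTable() {
    $.getJSON("/api/trialpurposes", function (rows) {
        var body = $("#trialPurposeTable tbody").empty();
        rows.forEach(function (row) {
            body.append("<tr><td>" + row.id + "</td><td>" + row.name + "</td></tr>");
        });
    });
}
setInterval(refreshTable, 5000);

A SignalR hub that broadcasts after each insert/edit would let the server push the change instead of the client asking, but the controller-returning-JSON part stays the same either way.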

ClientSide caching DropDownList with HUGE amount of data?

SCENARIO: I have a GridView with a DropDownList in each row that gets bound in the RowDataBound event. The data comes from the database and doesn't change very often (say it changes weekly). As per my understanding, the database is hit as many times as there are rows in the GridView. One thing I can do to minimize database hits is to use ViewState or Session. BUT the dropdown data will still be transferred to the client side again and again. This is huge data (3 MB).
Even if I use AJAX calls, there would still be a lot of data being transferred.
It might not be an issue for fast internet connections, but slow internet connections will see slowdowns. I was wondering if I can save the dropdown data on the client side and bind it from there.
I came across an article, HTML5 CLIENT SIDE CACHING, that explains how I can store data on the client side in HTML5.
But I would like a solution that also works on browsers that don't support HTML5. What would be my best bet, and why?
I think the approach of using HTML5 local storage is the best, and the only one possible, for caching data larger than 100 KB on the client side, but it would be hard for you to deserialize unless you store your object as a JSON string in local storage. Since local storage / HTML5 is an issue for you, you could always use a cookie, but if it really is 3 MB (I can't imagine the size of that drop-down) that won't be possible, since a cookie can store at most ~5 KB.
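As a rough illustration of that idea, here is a minimal sketch that caches the dropdown data in localStorage as a JSON string so it only travels once per browser; the storage key and endpoint are hypothetical:

// Serve the dropdown data from localStorage when we already have it,
// otherwise fetch it once and cache it. Key name and URL are assumptions.
function loadDropdownData(callback) {
    var cached = localStorage.getItem("dropdownData");
    if (cached) {
        callback(JSON.parse(cached));   // reuse the local copy, no server hit
        return;
    }
    $.getJSON("/DropdownService/Items", function (items) {
        localStorage.setItem("dropdownData", JSON.stringify(items));
        callback(items);
    });
}

For browsers without HTML5 storage you would still need a fallback, such as the server-side cache suggested in the next answer, since cookies are far too small for this.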
As said, the data is more than 3 MB and not changing frequently. Two things I would suggest:
1. Don't store that much data in the browser. Use a server-side Cache object; if possible, use a cache dependency.
2. Use an autocomplete-style dropdown. Always bind only the top 30 (or more) items to the dropdown to reduce the load on the page; if the item isn't in the top 30, autocomplete lets the user search for the appropriate data.
Let me know if this idea helps.

remove and add google map markers dynamically asp.net

I am new to ASP.NET and Google Maps. I want to achieve tracking capability on my ASP.NET page. I want to create a number of markers and then move them along a path.
Can anyone help me understand how I do it?
I want Google Maps to work with AJAX.
This can be done in many ways; I will dish out a very easy solution:
1. Your web page should load a map and add markers representing your assets, each with an id.
2. Your web page should ping your web server every second to query for changes.
3. If your server has a change to report, it should reply with the latest lat/lon for those ids as JSON.
4. You can then update the positions of those markers (see the sketch below).
If you can implement HTTP push (long polling / WebSockets), then even better, as you will get near-real-time updates.
This assumes that your web server is being updated by your tracking device.
Your web service should always return the latest position it has for the assets.
Updating your markers (assets) on the client side is pretty easy as well.
To 'move' an existing marker, you'll want to make sure it's global, and then you can just update its position within the function with something like:
marker.setPosition(results[0].geometry.location);
marker.setPosition(results[0].geometry.location);
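Putting steps 2-4 together, a minimal polling sketch; the endpoint, the response shape and the global markers lookup are hypothetical:

// `markers` is assumed to map an asset id to its google.maps.Marker,
// filled in step 1 when the map loads.
var markers = {};

function pollPositions() {
    // Hypothetical endpoint returning [{ id, lat, lng }, ...] as JSON.
    $.getJSON("/api/assets/positions", function (positions) {
        positions.forEach(function (p) {
            var marker = markers[p.id];
            if (marker) {
                marker.setPosition(new google.maps.LatLng(p.lat, p.lng));   // step 4
            }
        });
    });
}

setInterval(pollPositions, 1000);   // step 2: ping the server every second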

Checking the database for new rows at regular intervals of time

I'm creating a web application in which I need to check the database at regular intervals for any new rows added to the table, and if any rows have been added or updated, return those rows to the user.
When I refresh the page, I update the rows on my website. But I need to know if any new rows have been added to the table without refreshing the page.
How can I achieve this?
Can you suggest a method (preferably one that doesn't slow down the website)?
Can I achieve this using web services?
The usual way to do this is to use JavaScript with setInterval to post back to a page on your website and ask about any changes; WebSockets can do this in real time but require modern browsers.
Here is a similar question that's been answered well, though it's for Asp.net MVC:
Creating an AJAX alert for a user when there's a server-side change
If I'm being forced to use vanilla ASP.NET, I usually put those kinds of requests into a web service.
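As a rough sketch of the setInterval approach, only asking the server for rows newer than the last one the page already has (the endpoint, parameter and field names are hypothetical):

// Poll every 5 seconds, but only fetch rows with an id greater than the
// highest one already shown. URL and JSON shape are assumptions.
var lastSeenId = 0;

setInterval(function () {
    $.getJSON("/api/rows/new", { sinceId: lastSeenId }, function (rows) {
        rows.forEach(function (row) {
            lastSeenId = Math.max(lastSeenId, row.id);
            $("#rowsTable tbody").append("<tr><td>" + row.id + "</td><td>" + row.text + "</td></tr>");
        });
    });
}, 5000);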

How can I speed up a web page that has server bound controls with a lot of data

I'm working on a page that has several server-side dropdown lists, one with 500 items. Based on what's selected, I show/hide other page elements during postback (I only bind the data on the initial load). The customer opens this page a lot and I don't want them pulling the 500 items down every time they open it. Currently, it takes about 2 to 5 seconds for the page to render. I've started to migrate to a fully JavaScript/jQuery version of the page, but I want your opinion because I'm not loving the new version.
Is there a way to make this page faster and limit pulling down all 500 items every time?
Note: Some users will want to enter the dental procedure code directly. Others will need to do a look up.
We work on a system where a user's name can be selected from a dropdown list and then user information is displayed below. There are approximately 600 users and one of the stakeholder requirements was that the users had to be selectable in a dropdown list - the stakeholders felt that non-technical users better "understood" how to use a dropdown list.
Our performance for loading the dropdown list is very good. We do the following:
1. Load the page as quickly as possible but DO NOT load the dropdown list.
2. On page load, display a loading indicator and then immediately fetch the data for the dropdown list.
3. Get the data by calling a web service using jQuery that returns ONLY usernames and IDs; the data is returned in JSON format.
4. The query that requests the data is cached on the server for future requests.
5. The resulting JSON object is used to populate the dropdown list.
6. Hide the loading indicator and you're done.
The above occurs extremely quickly and makes for a very pleasant user experience.
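A minimal sketch of that flow, assuming a web service at /api/users/names that returns only ids and names as JSON (the URL and element ids are hypothetical):

// Render the page first, then fetch just the ids and names and fill the dropdown.
$(function () {
    $("#loadingIndicator").show();
    $.getJSON("/api/users/names", function (users) {
        var list = $("#userDropdown");
        users.forEach(function (u) {
            list.append($("<option>").val(u.id).text(u.name));
        });
        $("#loadingIndicator").hide();
    });
});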
If anything, try very hard to do the following:
1. Avoid postbacks even if you're using an UpdatePanel - these will kill performance if you have a large viewstate.
2. Only return the absolute bare minimum of data that you need to populate the dropdown list.
3. Don't access any data that isn't immediately necessary. Get the page loaded as quickly as possible and then fetch the remaining information while the user is reading the page.
When adding large amounts of data to a page, milliseconds count. Anything you can do to reduce calls for data (and the subsequent adding of that data to the page) will drastically improve the user experience.
It's been a while since I've done ASP.NET, but I remember something from the AJAX Control Toolkit that is like a set of filtering dropdowns that group items so you don't have to get the full list.
For example, if you're getting a list of all cars, the first dropdown could be Manufacturer, which when selected activates a second dropdown with that manufacturer's range of Models. It limits the amount of data you have to load at once.
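A plain jQuery sketch of the same cascading idea, without the toolkit (the endpoint and element ids are hypothetical):

// When a manufacturer is chosen, fetch only that manufacturer's models
// instead of loading every model up front.
$("#manufacturer").on("change", function () {
    var id = $(this).val();
    var modelList = $("#model").empty();
    $.getJSON("/api/manufacturers/" + id + "/models", function (models) {
        models.forEach(function (m) {
            modelList.append($("<option>").val(m.id).text(m.name));
        });
    });
});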
A dropdown list is not a good container for 500 items, because the long list looks ugly and it's hard to locate an item. You can change it to a table-like control (from the server's point of view, a GridView or a Repeater) with a paging function (e.g. display 20 items per page). You can also add some textboxes above the table so users can quickly locate an item by typing some keywords. After that, put the table in an UpdatePanel to make the page update partially when a button is clicked.
Anything you can do on the page that doesn't require the entire page to change can be made AJAX-y by enclosing it in an UpdatePanel. UpdatePanels and ScriptManagers allow ASP.NET pages to perform partial postbacks using AJAX, which will speed up anything short of a full page reload by drastically reducing the amount of data that has to come across.
Other performance tips/tricks:
If you're using an ORM, or generic queries, to pull in records, try to pull the minimum amount of data you need to show the results. The more data that has to come from the DB and be digested into the viewmodel, the slower the back-end will be.
Avoid nested MultiViews. Multiviews are great for organizing a lot of data in a "tabbed" fashion, but behind the scenes a MultiView is rendered as a series of divs with CSS to hide/show them. That means that EVERY tab of a MultiView must be rendered on the initial page load. When multiple MultiViews are nested as Views of other MultiViews, the problem is compounded. You can avoid this by using the codebehind to dynamically select and insert the proper control into the page, or by using other code to detect whether the View that this control corresponds to is the currently-selected view, and skip any heavy lifting of data retrieval/processing that would otherwise happen. You may combine either approach with some AJAX components.
I'd start with correctly indexing the database.
One way to speed things up would be to get rid of the postbacks. Showing/hiding page elements is a client-side operation and doesn't require a postback. If you're using jQuery already, you can .show() and .hide() any element on the page.
This doesn't necessarily address the performance of the initial page load, but would improve the performance of the overall user experience when interacting with the page.
For the initial load, perhaps break out various data-bound elements into AJAX calls that happen behind the scenes after the initial page markup loads? I'm kind of shooting in the dark here without knowing a whole lot about the page, but it's worth a try. Maybe load the basic markup without the data in the lists, then on $(document).ready() make an AJAX call to a server-side handler which returns the elements for the first menu. Then, when each menu is selected, fetch the elements for the next menu in the same manner.
The overall load time would be roughly the same (maybe even a fraction of a second longer), but the UI would fully render in the meantime and you'd be using the time the user spends looking at the page and starting to interact with it, a few precious seconds, to load the rest.
Edit: In response to one of your comments above on the question, maybe you can use the jQuery UI Autocomplete to improve the user experience a little? Do the users necessarily want to select the codes from a list, or would it be easier for them to start typing the code and narrow down to the correct one? From a data-entry perspective, avoiding mouse usage is often a good idea.
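For example, a minimal jQuery UI Autocomplete sketch (the endpoint and input id are hypothetical): users who know the procedure code can just type it, and everyone else gets a filtered list instead of a 500-item dropdown.

// Attach autocomplete to the code textbox; when `source` is a URL, jQuery UI sends
// the typed text as a "term" query parameter and expects a JSON array of matches.
$("#procedureCode").autocomplete({
    minLength: 2,
    source: "/api/procedures/search"
});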
Using JavaScript or any client-side script is not a solution, because the client browser may have JavaScript disabled...
I would suggest these optimizations:
1. Optimise the database query. If your table contains 500 records and doesn't frequently have insert/update operations, create non-clustered indexes.
2. Cache the data if it doesn't change frequently.
3. Create stored procedures. They improve query performance because the query is precompiled, and they help prevent SQL injection attacks.
