I'm hesitating and thinking about how to display the search results.
The situation is the following:
I have a website where a user can fill in a few textboxes and perform a search based on them. I would then like to display the results to them in a paged way, something like:
photo, name, age | photo, name, age | photo, name, age
So I would like to have 3 columns and probably 5 rows.
But I don't know what the best way to represent something like this is. Is there a best approach, etc.?
Thanks in advance
Well, there are so many approaches you can take, from many different perspectives, that we could be talking about it for ages. But, in a nutshell, you can implement paging at the presentation layer, at the controller level, at the business logic layer or at the data layer. Basically, the lower you go down your application layers, the better performance you'll get. There's not much difference between implementing paging at the controller and at the business logic layer when it comes to performance, but design-wise it's better to keep these concerns at the business logic layer for better maintainability and scalability.
You will get much better performance if paging is implemented at the data layer, especially if you have a lot of data to display, because you can tell the data access layer to fetch just the page of data your application is interested in. Other approaches force you to retrieve all the data from the backend store, which might result in unnecessary bandwidth consumption and data transfers. See the example below, where I'm doing the paging at the data layer level.
Example
SQL Query
I'm using a custom data access layer that runs the SQL query below to page through a large number of devices...
select top (@pagesize) *
from
(
    select row_number() over (order by d.Name) as RowId
         , d.Id
         , d.Name
         , d.IsDeleted
         , dt.Id as DeviceTypeId
         , dt.Name as DeviceTypeName
    from Devices d
    left join DeviceTypes dt on dt.Id = d.DeviceTypeId
) as o
where o.RowId <= (@page * @pagesize) and o.RowId > ((@page - 1) * @pagesize)
The query is simple: you specify the page size (records per data page) and the data page to retrieve (the first page, the second page, etc.). This query is run by my data access layer, which constructs business objects and passes them to the business logic layer.
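For context, here is a minimal sketch of how a data access method could run that query with plain ADO.NET. Treat it as an illustration only: DeviceBO, PagedDevicesSql and the connection string field are assumed names, not the author's actual implementation.

// Hypothetical sketch only (requires System.Data.SqlClient); names below are assumptions
public IList<DeviceBO> GetAllDevices(int page, int pageSize)
{
    var devices = new List<DeviceBO>();
    using (var conn = new SqlConnection(connectionString))          // connectionString is an assumed field
    using (var cmd = new SqlCommand(PagedDevicesSql, conn))         // PagedDevicesSql holds the query above
    {
        // the two parameters decide which slice of rows comes back
        cmd.Parameters.AddWithValue("@page", page);
        cmd.Parameters.AddWithValue("@pagesize", pageSize);
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                devices.Add(new DeviceBO
                {
                    Id = reader.GetInt32(reader.GetOrdinal("Id")),
                    Name = reader.GetString(reader.GetOrdinal("Name"))
                });
            }
        }
    }
    return devices;
}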
Business Logic
Then, in my business logic, I call the devices data object to retrieve the data page I'm interested in. The following method does that job...
public IList<DeviceBO> GetAllDevices(int page, out int count)
{
    count = DataProvider.Devices.GetAllDevicesCount();
    return DataProvider.Devices.GetAllDevices(page, BusinessConstants.GRID_PAGE_SIZE);
}
A very straightforward operation. Yet, there are a few things to notice. One is that I'm making two round-trips to the database: one to get the count of all devices and a second one to retrieve a page of devices. You could argue that this could be avoided by doing a single trip to the database, which is true, but in my case it is a very quick query that runs in less than a second and, believe me... I have over a million records. The call to GetAllDevicesCount does nothing more than a SQL select count(Id) from dbo.Devices.
Also notice that DataProvider is just a factory object; you don't really need to worry about the implementation details. And BusinessConstants.GRID_PAGE_SIZE simply returns a number, the standard page size used across my application (I've got a few grids/tables). It's just a way to keep things in one place in case I want to change the page size across all grids/tables later on.
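As a rough idea (not the actual code), those two helpers could be as simple as the sketch below; the concrete page size and the connection handling are assumptions based on the description above.

// Sketches only; values and connection handling are assumptions
public static class BusinessConstants
{
    public const int GRID_PAGE_SIZE = 15;   // e.g. 3 columns x 5 rows
}

public int GetAllDevicesCount()
{
    // nothing more than: select count(Id) from dbo.Devices
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("select count(Id) from dbo.Devices", conn))
    {
        conn.Open();
        return (int)cmd.ExecuteScalar();
    }
}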
Controller
Then I have defined a device controller that exposes the following action method (Manage)...
public class DevicesController : Controller
{
    public ActionResult Manage(string currentPage)
    {
        return process_device_list(currentPage);
    }

    //this method was created as a result of code re-factoring since it is used in several other action methods not shown here
    private ActionResult process_device_list(string currentPage, bool redirect = false)
    {
        int count = 0;
        int page = 1;

        if (!int.TryParse(currentPage, out page))
        {
            if (!string.IsNullOrEmpty(currentPage))
                return RedirectToAction("Manage");

            page = 1;
        }

        var model = new DeviceManagementListModel();
        model.Devices = BusinessFactory.DevicesLogic.GetAllDevices(page, out count);
        model.ActualCount = count;
        model.CurrentPage = page;

        if (!redirect)
            return View(model);
        else
            return RedirectToAction("Manage", new { @currentPage = currentPage });
    }
}
View
The view is pretty much just HTML and Razor syntax; there's nothing interesting going on there. Perhaps the footer of the table/grid is where the more interesting things happen, since the paging markup is defined there...
<tfoot>
    <tr>
        <td colspan="4">
            <div class="pager">
                @using (Html.BeginForm("Manage", "Devices", FormMethod.Get)){
                    @Html.TextBoxFor(x => x.CurrentPage, new { title = "Enter a page number to change the page results" })
                    <input type="submit" value="Go" title="Change the current page result" />
                }
            </div>
            <div class="total">
                @Model.ActualCount record(s) found
            </div>
        </td>
    </tr>
</tfoot>
And this is how it looks...
Notice in the tfoot section of the table in my view that I have added a form element that makes a GET request to my action method Manage rather than a POST request. This is useful to provide a nice RESTful URL where users can manually specify the page they're interested in, for example /Devices/Manage?CurrentPage=3.
I hope it helps you get some basic ideas
Related
Having a bit of an issue with DataTables. I want all the data from my SQL table, however running it on our site is causing some severe lag as it has over 8000 records. So I tried using Take(10).ToList(); in my controller, but then jQuery DataTables will only populate with 10 records (obviously). I am hoping there is a simple method or approach in my controller that I could take to only load ten records at a time, yet still keep the pagination of the entire SQL table in DataTables. (Long shot)
For instance, if I load the entire table's data into the DataTable it has something like 800+ pages. If I only take 10 records at a time, DataTables will only show one page. I need to take 10 records at a time but also show 800 pages, and when I click next or on a specific page it should load those records. Can this be done from the controller/LINQ, without taking an exhaustively long trip down JSON/AJAX lane?
Controller
public PartialViewResult Listing()
{
    var model = _db.MyDataBase.Take(10).ToList();
    return PartialView(model);
}
View:
<table id="example" class="table table-striped table-bordered table-hover">
<tbody>
#foreach (var p in Model)
{
<tr class="gridRow" data-id="#p.MyId">
<td>#p.Id</td>
// etc....
Script:
<script type="text/javascript">
    $(function () {
        // Initialize Example
        $('#example').dataTable();
    });
</script>
I suggest you go through the server-side processing feature of DataTables.net. Refer to the DataTables.net documentation.
Just tweak your method to receive 2 parameters: pageNumber and pageResults.
Then add Skip to jump over the results you don't want to show, and use Take to start taking results from the point Skip skipped over.
Usage:
Listing(1, 20) - Will give us results 1-20
Listing(3, 20) - Will give us results 41-60
Listing(5, 10) - Will give us results 41-50
Code:
public PartialViewResult Listing(int pageNumber, int pageResults)
{
    var model = _db.MyDataBase
        .OrderBy(row => row.ID)
        .Skip((pageNumber - 1) * pageResults)
        .Take(pageResults)
        .ToList();

    return PartialView(model);
}
I have an ASP.Net MVC 3 web application, and within one particular page there is quite a bit of data processing happening when the page loads. Please see my sample code below.
Basically the code queries my database and pulls the data back into the variable shiftDates; this could be anything up to a thousand records. Next I perform a group-by and a distinct on that data and place it into a variable called groupOrgs. Then I iterate through that data and place each record and its related data into a ViewModel which my Razor view requires.
You may read this and think that this code shouldn't take too long to query, load and display to the user. However, there are two other pieces of code, very similar to the one below, within my controller method which also run when the user loads the page. So picture the code below running three times within one page load. This causes a noticeable delay in the page load time.
I would like help or advice on how to restructure my code so that on the initial page load it pulls the data from the database and does all the querying, but if the page is reloaded, the data is pulled from a cached source.
Really, any help with this is greatly appreciated.
Thanks.
Controller
public ActionResult Index()
{
    ViewModelHomepage model = new ViewModelHomepage();
    IList<ViewModelHomepageCounters> orgShiftDates = new List<ViewModelHomepageCounters>();

    //Get all available dates
    IList<ShiftDate> shiftDates = _shiftDateService.GetAllShiftDatesToBeFilled();

    //Get total dates per Organisation
    var groupOrgs = shiftDates
        .GroupBy(x => x.Shift.Organisation.description)
        .Select(x => new
        {
            _id = x.FirstOrDefault().Shift.organisationID,
            _orgName = x.Key,
            _shiftDateCount = x.Count()
        })
        .OrderBy(x => x._orgName);

    foreach (var item in groupOrgs)
    {
        ViewModelHomepageCounters t = new ViewModelHomepageCounters();
        t.id = item._id;
        t.OrgName = item._orgName.ToString();
        t.totalShiftDates = Convert.ToInt32(item._shiftDateCount);
        orgShiftDates.Add(t);
    }

    model.OrgShiftDates = orgShiftDates;

    return View(model);
}
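One minimal sketch of the caching idea described in the question, assuming System.Web's HttpRuntime.Cache is acceptable; the cache key and the five-minute expiry below are arbitrary placeholder choices.

// Hypothetical sketch: serve the expensive query from the cache on reloads
IList<ShiftDate> shiftDates = HttpRuntime.Cache["ShiftDatesToBeFilled"] as IList<ShiftDate>;
if (shiftDates == null)
{
    // cache miss: hit the database once, then keep the result for subsequent page loads
    shiftDates = _shiftDateService.GetAllShiftDatesToBeFilled();
    HttpRuntime.Cache.Insert(
        "ShiftDatesToBeFilled",
        shiftDates,
        null,
        DateTime.UtcNow.AddMinutes(5),                 // arbitrary absolute expiry
        System.Web.Caching.Cache.NoSlidingExpiration);
}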
1-problem: I need to enable users to select one or more things from a large amount of information that is grouped into a hierarchical structure for selection and data entry, where the data could have a depth of 4 or 5 parent categories.
2-functionality I'm looking for:
Similar to how eBay shows cascading lists when selecting an item's category. When the page is displayed, you only get the first list box. After selecting an item in the first, the second is displayed. The process continues until the selected category does not have sub-categories.
3-actual table and query:
table:
-int Id
-string Name
-int ParentId
query:
public IList<CategoryTable> listcategories(int parentId)
{
    var query = from c in categorytable
                where c.ParentId == parentId
                select c;

    var result = query.ToList();
    return result;
}
4-I don't know how to start; any guideline, live example (jsfiddle), demo or tutorial would be greatly appreciated.
brgds
UPDATE: I believe this functionality is not very well covered in web tutorials and questions. Consequently, I have started a bounty for a great answer. I will award the bounty to a live example of the functionality described above. Thanks!
What I have learned by handling large amounts of data is:
don't try to load all data at once to the client
load only the data the client actually needs
do the filtering, searching and sorting in the database, e.g. in stored procedures, especially for data that is distributed across multiple tables
optimize your database queries; indexes are good
always keep in mind how many simultaneous queries you expect
LINQ is good, but not for everything when handling large data
spend time thinking about and planning what data is really needed
To display the data on your webpage there are many jQuery plugins to list data where you can bind functions to a "selected" event, for example knockout.js, which comes with MVC 4. You may not need a fully loaded jQuery "hierarchical-data-list-display" plugin; perhaps you can realize it by using "selected" events, ajax loading and show/hide functions.
According to your comments I would think of a combination of jQuery and MVC:
in MVC I would create a partial view like
@model MvcApplication.Models.DataModel

<ol id="@Model.DataCategoryLevel">
    @for (var i = 0; Model.Data.Count > i; i++)
    {
        <li value="@Model.Data[i].ItemId" onclick="itemSelected(@Model.Data[i].ItemId, @Model.DataCategoryLevel);">@Model.Data[i].ItemName</li>
    }
</ol>
the javascript could be something like:
function itemSelected(selectedItemId, itemCategoryLevel) {
    // ajax call to an action which loads the next category items into the partial view and returns them
    // (the action url and container id below are placeholders)
    $.get('/Category/NextLevel', { parentId: selectedItemId, currentLevel: itemCategoryLevel }, function (html) {
        // on success remove all lists with a category level deeper than itemCategoryLevel
        $('ol').filter(function () { return Number(this.id) > itemCategoryLevel; }).remove();
        // append the returned list to the HTML container which holds the lists
        $('#listsContainer').append(html);
    });
}
in the called MVC action I would determine whether it is the last category level or not. If it is the last level, I would return a different partial view with other onclick event bindings
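A rough C# sketch of such an action, reusing the listcategories query from the question; the action name, repository field, DataItem type and partial view names are placeholders rather than anything from the answer.

// Hypothetical sketch; _repository, DataItem and the view names are assumptions (requires System.Linq)
public ActionResult NextLevel(int parentId, int currentLevel)
{
    // children of the category the user just clicked
    IList<CategoryTable> children = _repository.listcategories(parentId);

    var model = new DataModel
    {
        DataCategoryLevel = currentLevel + 1,
        Data = children.Select(c => new DataItem { ItemId = c.Id, ItemName = c.Name }).ToList()
    };

    // if none of the children have sub-categories, this is the last level:
    // return a partial view with different onclick bindings
    bool lastLevel = children.All(c => _repository.listcategories(c.Id).Count == 0);

    return PartialView(lastLevel ? "_LeafCategoryList" : "_CategoryList", model);
}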
This is what I would try to realize, before I start searching for some plugins
I'm using knockout and Web API to power cascading dropdowns in an app I'm developing at the moment.
View
I've got a basic dropdown list like below.
<select data-bind="options: CurrentList,
optionsText: 'name',
value: CurrentListSelectedItem,
optionsCaption: 'Please Select...'"></select>
View Model
self.CurrentList = ko.observableArray(CurrentListData);
self.CurrentListSelectedItem = ko.observable();
self.CurrentListSelectedItem.subscribe(function () {
    // ajax call to populate list 2
});
Server side, I've got a simple REST service that takes the Id of a point in the tree and returns all its children. This way you can chain together as many of these dropdowns as you wish (as long as your hierarchy has the levels to match).
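That service can be very small. Here is a sketch of what it might look like as a Web API controller; the controller, DTO and repository names are assumptions, not the actual code behind the fiddle below.

// Hypothetical Web API sketch: given a node id, return all of its children
public class CategoryChildrenController : ApiController
{
    private readonly ICategoryRepository _repository;   // assumed abstraction over the hierarchy table

    public CategoryChildrenController(ICategoryRepository repository)
    {
        _repository = repository;
    }

    // GET api/categorychildren/5 -> children of node 5, ready for the knockout options binding
    public IEnumerable<CategoryDto> Get(int id)
    {
        return _repository.GetChildren(id)
                          .Select(c => new CategoryDto { id = c.Id, name = c.Name });
    }
}

public class CategoryDto
{
    public int id { get; set; }
    public string name { get; set; }   // matches optionsText: 'name' in the view above
}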
See fiddle of working example with mocked data http://jsfiddle.net/tgriley1/vEBGS/
I recently had a similar problem when using cascading drop-downs and I did something like this.
Firstly, write some jQuery on the view so that when you select the first item it sends an ajax request to the server and brings back a JSON or XML response.
I did something like
<script>
    $(function () {
        $("select#ParentId").change(function (evt) {
            $.ajax({
                url: "/Home/GetChildItems",
                type: 'POST',
                data: { ParentId: $("select#ParentId").val() },
                success: function (data) {
                    var items = "";
                    $.each(data, function (i, val) {
                        items += "<option value='" + val.ChildId + "'>" + val.ChildName + "</option>";
                    });
                    $("select#ChildDropDown").empty().html(items);
                }
            });
        });
    });
</script>
On the Controller, something like
public JsonResult GetChildItems(int ParentId)
{
    //code to retrieve the data
    JsonResult result = new JsonResult();
    result.Data = **object that contains the child data**;
    return result;
}
I'm a beginner myself, so I'm not sure how good this code is, but it worked for me when creating cascading drop-downs using jQuery.
Hope it helps.
Link to the cascading drop down question : Populating dropdown with JSON result - Cascading DropDown using MVC3, JQuery, Ajax, JSON
Hi, I had the same scenario. What I used is an autocomplete list with Web API: after a specific number of characters, it calls the Web API and loads the data for that particular wildcard.
Apart from this, when I found that the data returned was still large, I added pagination at the SQL Server end.
The Telerik demo is always a good place to learn MVC from:
http://demos.telerik.com/aspnet-mvc/razor/combobox/cascadingcombobox
This does not exactly use list boxes as per your screenshots, but it could very easily be changed to use them. With a few JavaScript modifications you could have unlimited levels.
Here is another one:
http://weblogs.asp.net/raduenuca/archive/2011/04/03/asp-net-mvc-cascading-dropdown-lists-tutorial-part-5-1-cascading-using-jquery-ajax-ajax-and-dom-objects.aspx
I am currently using Entity Framework in an ASP.NET MVC 3 project.
And I am getting severe performance issues while looping through the records in the view to display them.
The data is being received quickly, so I know it's not the connection to our remote Oracle server, and there are no lazy-loaded relationships in the model I'm using, yet each record is taking 120-300 ms to process a simple 3-field output with an action link.
Currently it takes over 3 minutes to load the page with 800ish records.
I've tried tweaking with configuration options but none seem to help.
Does anyone have any ideas?
edit: controller code
readonly OracleSampleManagerContext db = new OracleSampleManagerContext();

public virtual ActionResult Index()
{
    var spList = db.SamplePoints.OrderBy(e => e.Id).ToList();
    return View(MVC.Reports.Views.SamplePointList, spList);
}
<h2>
    Selection By Sample Point
</h2>
<table>
    @foreach (var sp in Model)
    {
        System.Diagnostics.Stopwatch sw = new System.Diagnostics.Stopwatch();
        sw.Start();
        <tr>
            <td>@Html.ActionLink(sp.Id, MVC.Reports.Results(sp.Id))</td>
            <td>@sp.Description</td>
            <td>@sp.PointLocation</td>
            <td>@sw.ElapsedMilliseconds</td>
        </tr>
        sw.Stop();
        sw.Reset();
    }
</table>
Example:
0200 72" Sewer to river - Once through cooling water OUTFALLS 346ms
0400 66" Sewer to river - Combined effluent OUTFALLS 347ms
0500 54" Sewer to river - Once through cooling water OUTFALLS 388ms
06-AI-18 TBA in Water IB2 228ms
06-AI-31 TBA in Water IB2 172ms
My guess is that MVC.Reports.Results(sp.Id) does some sort of DB lookup, and since you converted your model to a list before sending it to the view, you now have to hit the database again for each record, making a page of 800 records require 801 separate trips to the database instead of one.
Remove the ToList() from your first query (see the sketch below).
Refactor MVC.Reports.Results(sp.Id) to take an object instead of an int, and within that method work on the object directly.
Both of the above may require you to move the scope of your entities context from within the action out to the controller.
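A sketch of the first suggestion applied to the question's own action; whether it actually removes the per-row cost depends on what the link helper really does, so treat it as a starting point rather than a fix.

// Sketch: keep the query deferred so the view enumerates it in a single round-trip
public virtual ActionResult Index()
{
    var spList = db.SamplePoints.OrderBy(e => e.Id);   // no ToList()
    return View(MVC.Reports.Views.SamplePointList, spList);
}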
I'm using ASP.NET MVC and Azure Table Storage in the local development fabric. My pagination code is very slow when working with a large resultset:
var PageSize = 25;

var qResult2 = from c in svc.CreateQuery<SampleEntity>(sampleTableName)
               where c.PartitionKey == "samplestring"
               select c;

TableStorageDataServiceQuery<SampleEntity> tableStorageQuery =
    new TableStorageDataServiceQuery<SampleEntity>(qResult2 as DataServiceQuery<SampleEntity>);

var result = tableStorageQuery.ExecuteAllWithRetries()
                              .Skip((page - 1) * PageSize)
                              .Take(PageSize);

var numberOfEntities = tableStorageQuery.ExecuteAllWithRetries().Count();

ViewData["TotalPages"] = (int)Math.Ceiling((double)numberOfEntities / PageSize);
ViewData["CurrentPage"] = page;

return View(result);
The ViewData is used by the View to calculate paging links using code from Sanderson's MVC book. For an Azure Table with 1000+ entities, this is very slow. For starters, "Count" takes quite a long time to calculate the total number of entities. If I'm reading my LINQ book correctly, this is because the query doesn't implement ICollection. The book is "Pro LINQ" by Joseph Rattz.
Even if I set "numberOfEntities" to the known total (e.g. 1500), the paging is still slow for pages above 10. I'm guessing that .Skip and/or .Take are slow. Also, I call ExecuteAllWithRetries() twice, and that can't be helping if in fact Azure is queried twice.
What strategy should I follow for paging through large datasets with ASP.NET MVC and Azure?
EDIT: I don't need to know the exact total number of pages.
Skip and Take aren't the problem here - they will be executed against the IEnumerable, which will already be in memory and thus very quick.
ExecuteAllWithRetries is likely to be the culprit here - you're basically retrieving all of the entities in the partition from the remote storage in this call, which will result in a very large payload.
Pagination in the manner you're showing is quite difficult in Table Storage. Here are a few issues:
The only order that's guaranteed is the PartitionKey/RowKey order, so you need to design your RowKeys with this in mind.
You can perform the Take in the query (i.e., on your qResult2), so this will reduce the number of entities going over the wire.
To perform the Skip-like functionality, you'll need to use a comparison operator. So you'll need to know where you are in the result set and query all RowKeys above that value (i.e., add something like where c.RowKey > [lastRowKey] to your query); see the sketch after this list.
There's no way to retrieve a count without keeping track of it yourself (or retrieving the entire table like you're already doing). Depending on your design, you could store the count along with each entity (ie, use an incrementing value) - but just make sure you keep track of concurrent edit conflicts, etc. If you do keep track of the count with each entity, then you can also perform your Skip using this as well. Another option would be to store the count in a single value in another entity (you could use the same table to ensure transactional behaviour). You could actually combine these approaches too (store the count in a single entity, to get the optimistic concurrency, and also store it in each entity so you know where it lies).
An alternative would be, if possible, to get rid of the count altogether. You'll notice a couple of large scalable sites do this - they don't provide an exact list of how many pages there are, but they might let you go a couple of pages ahead/back. This basically eliminates the need for count - you just need to keep track of the RowKeys for the next/prev pages.
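Putting the Take and RowKey-comparison points together, a next-page query could look roughly like the sketch below. It reuses the question's svc, sampleTableName and SampleEntity; lastRowKey is simply the RowKey of the last entity shown on the previous page (the method name and signature are assumptions).

// Hypothetical sketch: forward-only paging with a RowKey comparison instead of Skip
public IList<SampleEntity> GetNextPage(string lastRowKey, int pageSize)
{
    var query = (from c in svc.CreateQuery<SampleEntity>(sampleTableName)
                 where c.PartitionKey == "samplestring"
                       && c.RowKey.CompareTo(lastRowKey) > 0   // "skip" by key, evaluated server-side
                 select c).Take(pageSize);                     // only pageSize entities over the wire

    // remember the RowKey of the last item returned here to build the next page's query
    return query.ToList();
}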