Store search results for later use - Accessing results view - c#

I have a C# ASP.NET MVC project. I am doing a search and want to access the search results on the details page of one of the search results.
This is so that I can have < prev | next > links on the detail pages that will link to the next property in the last search results.
My approach so far is to put the search results object into a session variable, but I can't figure out the code to actually access it. When I do a watch on Session["SearchResults"] below, I can see the records in the debugger's Results View, which seems to hold an array.
I know someone is going to tell me I'm thinking about this all wrong, and I can't wait to be enlightened.
Someone suggested I should just store the last search results on the repository as a public property, would that be a better option? Or can someone recommend an altogether better way of doing what I need to do?
This is my controller
public ActionResult Search(int? page, Search search)
{
    search.regusApi = Convert.ToBoolean(ConfigurationManager.AppSettings["regusApiLiveInventory"]);
    // Run the search once and reuse the paged result for both the session and the view.
    var locations = MeetingRoomRepositoryWG.Search(search).AsPagination(page ?? 1, 7);
    Session["SearchResults"] = locations;
    return View(new SearchResultsWG { SearchCriteria = search, Locations = locations });
}
public ActionResult NiceDetails(String suburb, String locationName, int id)
{
    // **Here I want to access the session variable**
    return View(MeetingRoomRepositoryWG.RoomDetails(id).First());
}
Here is the code from the repository:
public static List<Location> Search(Search search)
{
    // Strip the surrounding parentheses from "(lat,lon)" and parse the coordinates.
    String LongLatString = search.LongLat;
    LongLatString = LongLatString.Substring(1, LongLatString.Length - 2);
    var LonLatSplit = LongLatString.Split(',');
    var latitude = Convert.ToDecimal(LonLatSplit[0]);
    var longitude = Convert.ToDecimal(LonLatSplit[1]);
    using (var context = new MyContext())
    {
        var query = context.Locations.Include("Location_LonLats").ToList();
        // OrderBy returns a new sequence rather than sorting in place, so its
        // result must be captured; sort by squared distance from the search point.
        return query.OrderBy(x => (Convert.ToDecimal(x.Location_LonLats.Lat) - latitude) * (Convert.ToDecimal(x.Location_LonLats.Lat) - latitude)
                                + (Convert.ToDecimal(x.Location_LonLats.Lon) - longitude) * (Convert.ToDecimal(x.Location_LonLats.Lon) - longitude))
                    .ToList();
    }
}

I'm not sure how large the data set you search against is, but it's better not to store search results at all. It will scale very poorly and become a resource hog quite easily. Why store, say, 500 pages of search results if the user only ends up looking at 1 or 3?
How long are you going to hold these potentially large result sets in session storage? For how many users?
Just do a page-based search, essentially redoing the search for each "next" click the client does. A good index or something like Lucene.net can help if your searches are too slow.
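To make that concrete, here is a minimal sketch of the page-based idea, reusing the repository and model names from the question (and assuming Location exposes an Id): the details action takes the search criteria plus the item's index in the result list, re-runs the search, and derives the prev/next links from the neighbouring indexes.
public ActionResult NiceDetails(Search search, int index)
{
    // Re-run the search on every request instead of holding results in session.
    var results = MeetingRoomRepositoryWG.Search(search);

    // Neighbouring indexes for the < prev | next > links (null at either end).
    ViewBag.PrevIndex = index > 0 ? index - 1 : (int?)null;
    ViewBag.NextIndex = index < results.Count - 1 ? index + 1 : (int?)null;

    return View(MeetingRoomRepositoryWG.RoomDetails(results[index].Id).First());
}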

Related

How to get all entries from the database table using Take and Skip methods?

In our current application we have the following functionality in the data layer:
public IEnumerable<User> GetUsers(IPagedAndFilteredAndSortedRequest request)
{
    var users = dbContext.Users;
    // 1) "filteredAndSorted" is the result of applying filters and sorts to "users"
    // 2) "filteredAndSorted" is IOrderedQueryable
    // 3) "rows" is the number of rows to skip, based on request.PageSize and request.PageNumber
    var result = filteredAndSorted.Skip(rows).Take(request.PageSize);
    return result.ToArray();
}
And we need to get all users from the database using this method. So, the questions are:
Is it a good idea to pass 1 as pageNumber and Int32.MaxValue as pageSize?
What is the maximum number of rows in MSSQL database table?
Is it a good idea to pass 1 as pageNumber and Int32.MaxValue as pageSize?
Not really. It would be best to add another property to the request, something like
var result = filteredAndSorted;
if (request.UsePaging)
{
    result = filteredAndSorted.Skip(rows).Take(request.PageSize);
}
return result.ToArray();
Or use request.PageSize < 1 to turn off paging.
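Put together, the whole method might look like this (a sketch only; ApplyFiltersAndSorts stands in for the filtering and sorting step described in the question's comments, and UsePaging is the new request property):
public IEnumerable<User> GetUsers(IPagedAndFilteredAndSortedRequest request)
{
    // Apply filters and sorts as described in the original comments (hypothetical helper).
    IQueryable<User> filteredAndSorted = ApplyFiltersAndSorts(dbContext.Users, request);

    if (!request.UsePaging)
    {
        // Paging turned off: return the full filtered, sorted set.
        return filteredAndSorted.ToArray();
    }

    // Rows to skip, assuming a 1-based page number.
    var rows = (request.PageNumber - 1) * request.PageSize;
    return filteredAndSorted.Skip(rows).Take(request.PageSize).ToArray();
}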

Optimize EF core query in an alphabetically ordered list

I've been dealing with an issue lately, and although I have some solutions in mind, I'd like to find the best one from every point of view.
Let's say I have a WPF app with EF Core. There are about 3000 customers in my database (SQLite in my case, but in the future this should also work with slower ones). When the user opens the customer list, I'm loading only some of them (quantity = 50, page = 0), in alphabetical order. As soon as the user scrolls down to the bottom, 50 more are loaded (quantity = 50, page = 1).
CustomerRepository.GetQueryableAll().Skip(page * quantity).Take(quantity).ToList();
Everything works fine. Here comes the problem, though: there's a button to create a new customer, which opens a modal window. Let's say the user creates a customer whose name starts with the letter W. As soon as he/she hits SAVE, the new customer is saved to the database, the window is closed, and the list must be reloaded. But loading the whole list up to W is, of course, really slow.
So far, I've tried to query the database in a background task and store how many customers start with each letter of the alphabet in a static Dictionary: as soon as SAVE is hit, I can guess more or less how many "pages" to Skip() in the database to find the group of 50 in which the new customer will be. It works, and it's quite fast, but I'm worried that it won't work in countries with non-Latin alphabets:
public async Task<Dictionary<char, int>> GetCustomersByInitialsCount()
{
    return await Task.Run(async delegate
    {
        var dictionary = new Dictionary<char, int>();
        // Only covers the Latin alphabet - the limitation the question worries about.
        for (char c = 'A'; c <= 'Z'; c++)
        {
            var count = await CustomerRepository.GetCustomerCountStartingWith(c.ToString());
            dictionary.Add(c, count);
        }
        return dictionary;
    });
}
[... and in the repository:]
public async Task<int> GetCustomerCountStartingWith(string startingLetter)
{
    using (var dbContext = new MyDbContext())
    {
        return await dbContext.Set<Customer>().CountAsync(p => p.LastName.ToUpper().StartsWith(startingLetter.ToUpper()));
    }
}
Otherwise, instead of this background query, I could also try to "guess" the right page depending on the starting char, but I'm still puzzled by the unexpected outcomes I could get with non-Latin languages.
If anybody knows better tools or has any other useful ideas, I'll gladly consider them!
Thank you very much in advance and happy coding.
What if you add a request to get all the first "letters" in your table?
public async Task<List<string>> GetCustomerFirstLetter()
{
    using (var dbContext = new MyDbContext())
    {
        // Distinct first characters actually present in the data, whatever the alphabet.
        return await dbContext.Set<Customer>().Select(x => x.LastName.Substring(0, 1)).Distinct().ToListAsync();
    }
}
and then
public async Task<Dictionary<string, int>> GetCustomersByInitialsCount()
{
    return await Task.Run(async delegate
    {
        // Keyed by string, since the initials now come from the data itself.
        var dictionary = new Dictionary<string, int>();
        var letters = await GetCustomerFirstLetter();
        foreach (var letter in letters)
        {
            var count = await CustomerRepository.GetCustomerCountStartingWith(letter);
            dictionary.Add(letter, count);
        }
        return dictionary;
    });
}
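With those counts in hand, the page that should contain the new customer can be estimated by summing the counts of every initial that sorts before the new name. A minimal sketch (the method name and the culture-aware comparison are my own choices):
public int EstimatePageForCustomer(Dictionary<string, int> countsByInitial, string lastName, int quantity)
{
    var initial = lastName.Substring(0, 1);

    // Rows occupied by customers whose initial sorts before the new customer's.
    var rowsBefore = countsByInitial
        .Where(kvp => string.Compare(kvp.Key, initial, StringComparison.CurrentCultureIgnoreCase) < 0)
        .Sum(kvp => kvp.Value);

    // 0-based page index, matching the Skip(page * quantity) call in the question.
    return rowsBefore / quantity;
}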
Alternative solution, a little more efficient from my point of view.
Your problem boils down to getting the new customer's row number in the whole dataset, ordered by customer name.
First of all, in plain SQL you can get the right page number with the ROW_NUMBER window function (the example below uses SQL Server syntax; SQLite would use LIMIT 1 instead of TOP 1). Query example:
SELECT TOP 1 rnd.rownum, rnd.LastName
FROM (SELECT ROW_NUMBER() OVER (ORDER BY c.LastName) AS rownum, c.LastName
      FROM [Customer] c) rnd
WHERE rnd.LastName = '<your new customers name here>'
So, after getting the exact row number and already having the page-size parameter, you can easily calculate the needed page.
Getting back to your code: this could be done in EF with the overload of Select that exposes the row index, but unfortunately that overload has not been implemented in EF Core for IQueryable yet (see this).
But you can still pass exact query right to db using FromSql method.
Solution consists of two steps:
To get the required data, you need to define a query type on the model builder this way (the additional fields are just for the example; you'll need RowNum only):
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Query<CustomerRownNum>();
}
public class CustomerRownNum
{
    public long RowNum { get; set; }
    public Guid Id { get; set; }
    public string LastName { get; set; }
}
Then you need to pass the SQL query mentioned above to the context's Query method, this way:
string customerLastName = "<your customer's last name>";
var result = dbContext.Query<CustomerRownNum>().FromSql(
    @"SELECT TOP 1 rnd.RowNum, rnd.Id, rnd.LastName
      FROM (SELECT ROW_NUMBER() OVER (ORDER BY c.LastName) AS RowNum,
                   c.Id, c.LastName
            FROM [Customer] c) rnd
      WHERE rnd.LastName = {0}", customerLastName).FirstOrDefault();
Finally, you'll get the data you need right in the result variable.
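From there, the 0-based page index falls out directly (quantity is the page size, 50 in the question; RowNum is 1-based, and result may be null if the name was not found):
var page = (int)((result.RowNum - 1) / quantity);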
Hope that helps!

exception when trying to fetch more than 1 thousand customer(s) and invoice(s) from QBO API

I have more than 1000 customers and invoices, and I am trying to fetch all of those customers and invoices into a drop-down list.
Documentation on the QBO site suggests that we need to use pagination if we want to load all the customers in a grid, but what I want is to load all the customers and invoices in a drop-down list.
I am getting the following exception when I try to fetch more than 1000 customers and invoices:
Validation Exception was thrown.
Details: QueryValidationError: value 100000 is too large. Max allowed value is 1000.
I am trying to fetch all the customers using the following code:
public static List<Customer> GetAllQBOCustomers(ServiceContext context)
{
    return Helper.FindAll<Customer>(context, new Customer(), 1, 100000);
}
I wrote the code below and solved my issue:
1. First, get the count of all the customers.
2. Then fetch the customers in chunks of 1000.
3. Create a list for the customers.
4. Define three integer variables for counting.
5. Use a do-while loop.
6. Add all the customers to the main customer list.
string strQuery = "Select Count(*) From Customer";
string custCount = qboAccess.GetCutomerCount(qboInz.QboServiceContext, strQuery);
List<qboData.Customer> customers = new List<Customer>();
int maxSize = 0;
int position = 1;
int count = Convert.ToInt32(custCount);
do
{
    // QBO caps each request at 1000 records, so page through by start position.
    var custList = qboAccess.GetAllQBOEntityRecords(qboInz.QboServiceContext, new Customer(), position, 1000);
    customers.AddRange(custList);
    maxSize += custList.Count();
    position += 1000;
} while (count > maxSize);
The straightforward answer is to loop enough times to get the records you need:
public static List<Customer> GetAllQBOCustomers(ServiceContext context)
{
    var list = new List<Customer>();
    // QBO start positions are 1-based; fetch in pages of 1000.
    for (int i = 1; i <= 10000; i += 1000)
    {
        var results = Helper.FindAll<Customer>(context, new Customer(), i, 1000);
        list.AddRange(results);
    }
    return list;
}
Or if you want to try to do it in parallel (and the API allows concurrent connections):
public static List<Customer> GetAllQBOCustomers(ServiceContext context)
{
    var bag = new ConcurrentBag<Customer>();
    Parallel.ForEach(Enumerable.Range(0, 10), i =>
    {
        // Start positions 1, 1001, 2001, ... ConcurrentBag has no AddRange,
        // so items are added one at a time.
        var results = Helper.FindAll<Customer>(context, new Customer(), i * 1000 + 1, 1000);
        foreach (var customer in results)
        {
            bag.Add(customer);
        }
    });
    return bag.ToList();
}
Since the series of calls is likely to be expensive, I suggest you cache the results.
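For example, with System.Runtime.Caching the combined result could be held for a few minutes (a sketch; the cache key and lifetime are arbitrary choices):
private static readonly MemoryCache Cache = MemoryCache.Default;

public static List<Customer> GetAllQBOCustomersCached(ServiceContext context)
{
    // Serve from cache when available; otherwise fetch and store for 10 minutes.
    var cached = Cache.Get("AllQboCustomers") as List<Customer>;
    if (cached != null)
    {
        return cached;
    }

    var list = GetAllQBOCustomers(context);
    Cache.Set("AllQboCustomers", list, DateTimeOffset.Now.AddMinutes(10));
    return list;
}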
You can't download those records all at once. That is what the error is telling you - very clearly. There's no magic way to avoid the server's rules.
However, I really think you should not download them all at once anyway. A drop-down list is not a good way to display that amount of data to users. Consider the user experience - would you want to scroll through a list of thousands of customers to try and find the one you want? Or would it be easier to start typing part of the name and have it pop up a short list of possible matches to choose from?
A more user-friendly way to implement this would be to use an auto-complete box instead of a drop-down list: after the user has typed a few characters, it can use AJAX to search the API for customers whose names or IDs contain those characters. Then you only need to return a small number of records each time, and the user is not stuck scrolling for 10 minutes just to find a customer at the bottom of a list of 10,000 records.
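A sketch of the endpoint behind such a box, using the QueryService from the QBO .NET SDK (GetServiceContext stands in for however the app builds its QBO context, and the 10-result cap is an arbitrary choice):
public JsonResult CustomerLookup(string term)
{
    var context = GetServiceContext(); // hypothetical helper
    // QBO query syntax: LIKE with a trailing % matches names starting with "term".
    var query = string.Format(
        "SELECT * FROM Customer WHERE DisplayName LIKE '{0}%' MAXRESULTS 10",
        term.Replace("'", "\\'"));

    var matches = new QueryService<Customer>(context).ExecuteIdsQuery(query);
    return Json(matches.Select(c => c.DisplayName), JsonRequestBehavior.AllowGet);
}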

Entity Framework and whole word matching - paging and filtering results

Background:
I am using ASP.NET MVC4, SQL Server 2008 R2, and Entity Framework 5 for a website.
The site accepts a delimited list of keywords to search database content on. It also needs to page the results to the user (currently 100 results per page).
This was going along smoothly until it was requested that the keyword searching be done with whole word matching rather than partial matching.
The problem
Performing the whole word match AFTER I already have the results back means that I might not have query.Pagesize results to show - which messes up the UI paging. Of the 100 partial matches from SQL Server on the first page, 20 may end up being removed by the whole word processing.
I am currently building my query using LINQ and doing an AND search on the keywords like so:
// Start with all the MyItems
var results = UnitOfWork.MyItemRepository.GetAll();
// Loop the keywords to AND them together
foreach (var keyword in query.Keywords)
{
    var keywordCopy = keyword;
    // Look for a hit on the keyword in the MyItem
    results = results.Where(x => x.Title.Contains(keywordCopy));
}
And later on getting the total number of results, paging, and executing the query:
var totalCount = results.Count();
// Page the results
results = results.Skip((query.Page - 1) * query.Pagesize).Take(query.Pagesize);
...
// Finalize the query and execute it
var list = results.ToList();
Because I need whole word matching and not partial matching, I process the keywords into a regex pattern and remove non-matches from the list:
var keywordsRegexPattern = "^" + string.Concat(query.Keywords.Select(keyword => string.Format(@"(?=.*\b{0}\b)", Regex.Escape(keyword))));
foreach (var item in list.ToList())
{
    var searchableData = ...; // some combined string of item data
    // See if the keywords are whole word matched in the combined content
    var isMatch = Regex.IsMatch(searchableData, keywordsRegexPattern, RegexOptions.IgnoreCase | RegexOptions.Singleline);
    // If not a match, remove the item from the results
    if (!isMatch)
    {
        list.Remove(item);
    }
}
// Change list into custom list of paged items for the UI
var pagedResult = new PagedList<MyItem>(list, query.Page, query.Pagesize, totalCount);
return pagedResult;
Question
Does anyone know of a way to do whole word matching with EF and do result paging?
Ideas I've come up with but don't like:
Chunk the results. 100 results back, 20 partial keyword matches removed, go get another 20, repeat. This could mean doing multiple queries when getting all the data at once would have been faster. It also means stealing potential results from the next page, which would have to be tracked with some sort of offset.
Get ALL the rows back (no SQL paging), then process and page in C#. Pulling every result back each time seems bad.
Well, I see two alternatives (I may be missing something easier, but anyway).
Either you use string.Contains(keyword), retrieve all the corresponding data from the db, then filter with exact matching and page the enumerated result (hoping you don't get too many results back from the db).
Or the other way:
foreach (var keyword in query.Keywords)
{
    // add a space at the start and end of the keyword for Contains
    var containsKeyword = string.Format(" {0} ", keyword);
    // add a space at the end only for StartsWith
    var startsWithKeyword = string.Format("{0} ", keyword);
    // add a space at the start only for EndsWith
    var endsWithKeyword = string.Format(" {0}", keyword);
    // Look for a hit on the keyword in the MyItem
    results = results.Where(x => x.Title.Contains(containsKeyword) || x.Title.StartsWith(startsWithKeyword) || x.Title.EndsWith(endsWithKeyword));
}
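A variant of the same trick is to pad the column value instead of the keyword, so a single Contains covers all three positions (EF translates the string concatenation to SQL). Note that both versions still miss words bounded by punctuation rather than spaces:
// Padding the title makes word boundaries at the start and end of the
// string behave like interior spaces.
results = results.Where(x => (" " + x.Title + " ").Contains(containsKeyword));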

Newbie performance issue with foreach ...need advice

This section simply reads from an Excel spreadsheet. This part works fine with no performance issues.
IEnumerable<ImportViewModel> so = data.Select(row => new ImportViewModel
{
    PersonId = (row.Field<string>("person_id")),
    ValidationResult = ""
}).ToList();
Before I pass the model to a View I want to set ValidationResult, so I have this piece of code. If I comment it out, the model is passed to the view quickly. When I use the foreach it takes over a minute. If I hardcode a value for item.PersonId then it runs quickly. I know I'm doing something wrong; I'm just not sure where to start and what best practice I should be following.
foreach (var item in so)
{
    // One database round trip per row - the source of the slowdown.
    if (db.Entity.Any(w => w.ID == item.PersonId))
    {
        item.ValidationResult = "Successful";
    }
    else
    {
        item.ValidationResult = "Error: ";
    }
}
return View(so.ToList());
You are now performing a database call per item in your list. This is really hard on your database and thus on your performance. Instead, iterate through your Excel result, gather all the users, and select them in one query. Make a list from that query result (otherwise the query is executed every time you access it), then match the result list against your Excel data.
You need to do something like this :
var ids = so.Select(i => i.PersonId).Distinct().ToList();
// Hitting the database just this once to get all matching user ids;
// a HashSet makes the Contains checks below O(1).
var usersIds = new HashSet<string>(db.Entity.Where(u => ids.Contains(u.ID)).Select(u => u.ID));
foreach (var item in so)
{
    if (usersIds.Contains(item.PersonId))
    {
        item.ValidationResult = "Successful";
    }
    else
    {
        item.ValidationResult = "Error: ";
    }
}
return View(so.ToList());
