I am trying to get a paged list of call records from Twilio
var records = Twilio.Rest.Api.V2010.Account.CallResource.Read(pageSize: 10).ToList();
But I can't find anywhere to specify which page number to retrieve. How would I do this?
Sales Engineer at Twilio here.
There isn't a way to go to a specific page. You can navigate through the pages using the previous_page_uri and next_page_uri fields. I think the philosophy is that pages are a very implicit/arbitrary way of navigating your data. It's better to be explicit about what you're actually looking for, e.g., "Records from this date to this date" by using query parameters like this:
Category=calls&StartDate=2017-10-13&EndDate=2017-10-13
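For example, a minimal sketch (assuming the twilio-csharp 5.x helper library and placeholder credentials) that filters by date instead of asking for a page number; the ResourceSet returned by Read() follows next_page_uri on demand as you enumerate it:

using System;
using Twilio;
using Twilio.Rest.Api.V2010.Account;

TwilioClient.Init(accountSid, authToken); // placeholder credentials

// Ask for the records you want rather than a page number.
var calls = CallResource.Read(
    startTimeAfter: new DateTime(2017, 10, 13),
    startTimeBefore: new DateTime(2017, 10, 14),
    pageSize: 10);

foreach (var call in calls) // each next page is fetched lazily
{
    Console.WriteLine($"{call.Sid} {call.StartTime}");
}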
In a method that takes the parameters int pageNum and int pageSize, I am trying to return data from Dynamics based on that specific page and size.
I am using a QueryExpression and set the exp.PageInfo page number and page size to achieve this. This works fine until page 51 with a page size of 100, which produces the error "Paging cookie is required when trying to retrieve a set of records on any high pages".
This brings me into near-duplicate question territory (such as Dynamics CRM - How to get the second page of a fetchXml query beyond the 5000 elements?), but the claims that "you do not need the paging cookie" seem to be simply incorrect: there is nothing I can do for results beyond 5k that does not produce that error.
I am now paging over the entire result set (which lets me take the PagingCookie from the previous results and pass it to the next page request) and then returning the data I want from that set, but that is very slow. I have made it faster by dynamically altering the query in the paging loop so that it only returns columns when the current data falls within the requested page range, which has shaved about 30 seconds off the query, but it is still very slow for such a large data set.
So, is there A) something that will let me retrieve these high pages without a paging cookie (is this a QueryExpression limitation, for example?), or B) a faster way of handling this problem than iterating over all results until I reach the page I want?
Thanks.
Unfortunately there's no way to "fast forward" to high pages in your query result.
You'll have to use the paging cookie just the way you are doing right now.
If you knew the last and next record from a previous query you could try creating a high-page-cookie to start with.
As @Aron suggests in his answer, the only improvement might come from sorting and/or filtering/partitioning the data (by createdon, etc.).
Filtering the data set to return fewer pages might be something to investigate. Or, if the desired page is closer to the end of the data set, reverse the sort order.
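To make the cookie handoff concrete, here is a minimal sketch (Microsoft.Xrm.Sdk; 'service' is assumed to be an IOrganizationService, and the entity and columns are illustrative):

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

var query = new QueryExpression("account")
{
    ColumnSet = new ColumnSet("name", "createdon"),
    PageInfo = new PagingInfo { PageNumber = 1, Count = 100 }
};
query.AddOrder("createdon", OrderType.Ascending); // a stable sort helps paging

EntityCollection page;
do
{
    page = service.RetrieveMultiple(query);
    // ... consume page.Entities for the rows you actually need ...
    query.PageInfo.PageNumber++;
    query.PageInfo.PagingCookie = page.PagingCookie; // required beyond page 50
} while (page.MoreRecords);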
I want to implement server-side paging in my Silverlight application. To get an idea of the steps required, I went through this Custom Paging in ASP.NET
article, which describes how to design a SQL query that returns results according to the requested page and the number of records per page. However, I am totally confused as to how I am going to call it from my Silverlight application, i.e., how I am going to specify it in the C# code.
The default paging using the DataPager is pretty simple.
PagedCollectionView pagingCollection = new PagedCollectionView(e.Result); // e.Result contains the List returned by the method that calls the stored procedure GetProducts
pagerProductGrids.Source = pagingCollection;
gridProductGrid.ItemsSource = pagingCollection;
But I'm clueless about how to do it on my own, e.g., which properties I will need to get and set for the page size, the total number of records, etc. In other words, how am I going to configure my DataGrid and DataPager to pass StartingRowIndex and MaximumRowCount?
Please help!
I came across this article a few years ago and it worked like a charm for me. I've added it to my framework and have been reusing this method ever since. The article is well explained, and I believe it is exactly what you are looking for.
Paging Data from the Server with Silverlight
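The core of the pattern in the article is a service method that pages on the server and returns one page of rows together with the total count, which is all the client-side pager needs. A rough sketch (all names here are illustrative, not taken from the article):

using System.Collections.Generic;
using System.Linq;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class PagedResult<T>
{
    public List<T> Items { get; set; }
    public int TotalCount { get; set; }
}

public static class ProductService
{
    // startRowIndex and maximumRows mirror the values a pager exposes.
    public static PagedResult<Product> GetProductsPaged(
        IQueryable<Product> source, int startRowIndex, int maximumRows)
    {
        return new PagedResult<Product>
        {
            TotalCount = source.Count(), // lets the client compute the page count
            Items = source.OrderBy(p => p.Id) // a stable order is required for paging
                          .Skip(startRowIndex)
                          .Take(maximumRows)
                          .ToList()
        };
    }
}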
I am really new to C# programming and would like some help from you guys (if possible). I have a shopping website with data: products, prices, descriptions, etc. Since the website has a search capability, what I would like to do is get the data from it by querying the search link and extracting only the important fields (product id, name, price and description). When I perform the search I get many pages, and every time I press Next I get a new page with a further list of products. How can I automate these tasks?
I searched a lot over the internet and found that I need to use WebClient with regular expressions, and I thought that maybe a loop over the page content and over the search result pages would be necessary.
What do you think, guys?
Website Example.
I'll appreciate any effort from your side.
What you're describing is called scraping.
What you'll want is to use something like HtmlAgilityPack to get the website. Then you find the nodes you're interested in by using the DOM, and reading their inner text.
The whole process is rather complicated, but at least I've sent you off in the right direction. For the most part, search URLs tend to have the same format.
In your link, for instance:
http://cdon.se/hemelektronik/advanced-search?manufacturer-id=&title=.&title-matchtype=1&genre-id=&page-size=15&sort-order=142&page=2
You can change 'page' to something else, and you can go through all the pages that way.
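A rough sketch with HtmlAgilityPack (the XPath selector and the page limit below are assumptions; inspect the site's actual markup to find the right selectors):

using System;
using HtmlAgilityPack;

class Scraper
{
    static void Main()
    {
        var web = new HtmlWeb();
        for (int page = 1; page <= 10; page++) // assumed upper bound on pages
        {
            var url = "http://cdon.se/hemelektronik/advanced-search" +
                      "?title=.&title-matchtype=1&page-size=15&page=" + page;
            HtmlDocument doc = web.Load(url);

            // Hypothetical selector; adjust to the site's real structure.
            var nodes = doc.DocumentNode.SelectNodes("//div[@class='product']");
            if (nodes == null) break; // no more results

            foreach (var node in nodes)
                Console.WriteLine(node.InnerText.Trim());
        }
    }
}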
Added:
Also, don't try to use regex to parse HTML. It drove one particular person mad...
RegEx match open tags except XHTML self-contained tags
I am stuck in a situation where I need to share values between pages. I want to share a value from code-behind with little or no JavaScript. I already have a question about this here on SO, but using JS; since I still haven't got any result, I am asking about another approach.
So I want to know: can I pass any .NET object in the query string, so that I can unbox it conveniently on the other end?
Update
Or is there any JavaScript approach, e.g. passing it to a modal dialog window or something like that?
What I am doing
On my parent page load, I extract the properties from my class that holds values fetched from the DB and put them in Session["mySession"]. Something like this:
Session["mySession"] = myClass.myStatus; // myStatus is a List<int>
Now, in a checkbox click event from the client side, I open a popup, and on its page load I extract the list and fill the checkbox list on the child page.
From here the user can modify the selection and close the page. Closing is done via a button called Save, in which I iterate through the checked items and write them back into Session["mySession"].
But here is the problem: whenever I click the radio button again to view the updated values, it displays the previous ones. That is, if the total count of the list from the DB is 3 and after modification it is 1, it still displays 3 instead of 1 after reopening.
Yes, you could, but you would have to serialize that value so that it could be encoded as a string. I think a much better approach would be to put the object in session rather than on the URL.
I would do something like this:
var stringNumbers = intNumbers.Select(i => i.ToString()).ToArray();
var qsValue = string.Join(",", stringNumbers);
Response.Redirect("Page.aspx?numbers=" + qsValue);
Keep in mind that if there are too many numbers, the query string is not the best option. Also remember that anyone can see the query string, so if this data needs to be secure, do not use it. And keep in mind the suggestions of the other posters.
Note
If you are using .NET 4 you can simplify the above code:
var qsValue = string.Join(",", intNumbers);
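On the receiving page ("Page.aspx"), you can parse the values back out, e.g. (a small companion sketch; assumes the query string was built as above and System.Linq is imported):

var numbers = Request.QueryString["numbers"]
    .Split(',')
    .Select(int.Parse) // will throw on malformed input; validate in real code
    .ToList();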
Make the object serializable and store it in an out-of-process session.
All pages on your web application will then be able to access the object.
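For example (a minimal sketch; SelectionState is a hypothetical type, and the out-of-process store is configured via the <sessionState> element in web.config):

using System;
using System.Collections.Generic;

[Serializable] // required so the object can leave the worker process
public class SelectionState
{
    public List<int> MyStatus { get; set; }
}

// Store on one page...
Session["mySession"] = new SelectionState { MyStatus = new List<int> { 1, 2, 3 } };

// ...and read it back on any other.
var state = (SelectionState)Session["mySession"];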
You could serialize it and make it printable, but you shouldn't.
Really, you shouldn't.
The specification does not dictate a minimum or maximum URL length, but implementation varies by browser and version. For example, Internet Explorer does not support URLs that have more than 2083 characters.[6][7] There is no limit on the number of parameters in a URL; only the raw (as opposed to URL encoded) character length of the URL matters. Web servers may also impose limits on the length of the query string, depending on how the URL and query string is stored. If the URL is too long, the web server fails with the 414 Request-URI Too Long HTTP status code.
I would probably use a cookie to store the object.
I'm trying to create a customized search that displays results based on my FullTextSqlQuery results (i.e. the user types 'Foo' and clicks Search, and my server-side code performs a FullTextSqlQuery that brings back PDF documents containing 'Foo' in their text).
My question is: what will I need to do after getting the results from my query in order to display them to the user? Will I need to provide my own results .aspx page, or does SharePoint have something out-of-the-box that I can use to pass my results to?
I'm not aware of anything OOTB, but this is a simple matter of transforming the XML results into HTML using an XSL stylesheet.
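A minimal sketch of that transform step with XslCompiledTransform (the file paths are placeholders for wherever you persist the query's XML and your stylesheet):

using System.Xml.Xsl;

var xslt = new XslCompiledTransform();
xslt.Load("results.xslt"); // your XSL stylesheet
xslt.Transform("results.xml", "results.html"); // query XML in, HTML out
// Emit the HTML inside your own results .aspx page, e.g. via a Literal control.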