Does anyone know what an AsyncPageable is? - C#

I'm trying to do a get-all from an Azure service and it returns an AsyncPageable. According to the docs it says:
A collection of values that may take multiple service requests to iterate over.
Does that mean it is equivalent to making the request for a single item multiple times in a loop?

If a service call returns multiple values in pages, it returns Pageable<T>/AsyncPageable<T> as the result. Check out Consuming Service Methods Returning AsyncPageable.
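To answer the question directly: you don't issue the per-item requests yourself. AsyncPageable<T> implements IAsyncEnumerable<T>, so a single await foreach is enough, and the SDK performs as many service requests as it needs to fetch further pages while you enumerate. A minimal sketch, assuming the same Key Vault SecretClient as in the samples below:

// AsyncPageable<T> implements IAsyncEnumerable<T>: one await foreach,
// and additional service requests for further pages happen behind the scenes.
AsyncPageable<SecretProperties> allSecretProperties = client.GetPropertiesOfSecretsAsync();

await foreach (SecretProperties secretProperties in allSecretProperties)
{
    Console.WriteLine(secretProperties.Name);
}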
To get more clarity, have a look at the examples below.
For control over how pages of values are received from the service, use the AsyncPageable<T>.AsPages method:
// call a service method, which returns AsyncPageable<T>
AsyncPageable<SecretProperties> response = client.GetPropertiesOfSecretsAsync();

await foreach (Page<SecretProperties> page in response.AsPages())
{
    // enumerate through page items
    foreach (SecretProperties secretProperties in page.Values)
    {
        Console.WriteLine(secretProperties.Name);
    }

    // get continuation token that can be used in AsPages call to resume enumeration
    Console.WriteLine(page.ContinuationToken);
}
If your project doesn't have C# 8.0 enabled you can still iterate over AsyncPageable using a while loop:
// call a service method, which returns AsyncPageable<T>
AsyncPageable<SecretProperties> response = client.GetPropertiesOfSecretsAsync();

IAsyncEnumerator<SecretProperties> enumerator = response.GetAsyncEnumerator();
try
{
    while (await enumerator.MoveNextAsync())
    {
        SecretProperties secretProperties = enumerator.Current;
        Console.WriteLine(secretProperties.Name);
    }
}
finally
{
    await enumerator.DisposeAsync();
}
Check out the Azure.Core Response samples to understand more about this.
To change the page size, pass the pageSizeHint parameter to the AsPages method.
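For example (a sketch; pageSizeHint is only a hint, and the service ultimately decides the page size):

// Request pages of (at most) 10 items each and enumerate page by page.
await foreach (Page<SecretProperties> page in client.GetPropertiesOfSecretsAsync()
                                                    .AsPages(pageSizeHint: 10))
{
    Console.WriteLine($"Received a page with {page.Values.Count} item(s)");
}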

Related

Return progress from Web API [duplicate]

I currently have a web API that
- fetches a row of data using FromSqlRaw(...).ToListAsync() within a repository, and
- returns this data as Ok(data.ToArray()) with return type Task<ActionResult<IEnumerable<MyClass>>> through a controller.
Now I am wondering whether I should, or can, use IAsyncEnumerable as a return type. The idea was to use this in both the repository and the controller. However, in this (now decrepit) thread it is stated that it should not be used. The proposed solution there would be something like:
FromSqlRaw(...).AsNoTracking().AsAsyncEnumerable()
As for the controller, I want to keep the response wrapped in ActionResult so I can explicitly set the return code. However, that currently doesn't seem to work.
Should I just apply the solution in the repository and consume the result as a List in my controller, or keep it as it is?
IAsyncEnumerable gives you an interface for pull-based asynchronous data retrieval. In other words, this API represents an iterator where the next item is fetched asynchronously.
This means that you receive the data in several rounds, and each round is received asynchronously.
Before IAsyncEnumerable you could use IEnumerable<Task<T>>, which represents a bunch of asynchronous operations, each with return type T,
whereas Task<IEnumerable<T>> represents a single asynchronous operation with return type IEnumerable<T>.
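To make these three shapes concrete, here is a small, self-contained sketch with toy data (the delays only stand in for real I/O; nothing here comes from the question):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

class Shapes
{
    // IEnumerable<Task<int>>: many independent asynchronous operations.
    static IEnumerable<Task<int>> ManyOperations() =>
        Enumerable.Range(0, 3).Select(i => Task.FromResult(i));

    // Task<IEnumerable<int>>: one asynchronous operation that returns the whole sequence at once.
    static async Task<IEnumerable<int>> OneOperation()
    {
        await Task.Delay(10);               // stand-in for a single I/O call
        return new[] { 0, 1, 2 };
    }

    // IAsyncEnumerable<int>: one sequence whose items are produced asynchronously, one by one.
    static async IAsyncEnumerable<int> OneAsyncSequence()
    {
        for (int i = 0; i < 3; i++)
        {
            await Task.Delay(10);           // stand-in for an I/O call per item
            yield return i;
        }
    }

    static async Task Main()
    {
        foreach (Task<int> operation in ManyOperations())
            Console.WriteLine(await operation);

        foreach (int value in await OneOperation())
            Console.WriteLine(value);

        await foreach (int value in OneAsyncSequence())
            Console.WriteLine(value);
    }
}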
Let's apply this knowledge to a Web API:
From an HTTP consumer's point of view there is no difference between Task<ActionResult<T>> and ActionResult<T>. It is an implementation detail from the user's perspective.
A Web API controller action implements a request-response model: a single request is sent and a single response is received on the consumer side.
If a consumer calls the same action again, a new controller instance is created to process that request.
This means that the consumer of your API can't take advantage of IAsyncEnumerable if it is exposed as the action's result type.
In .NET 6, IAsyncEnumerable handling in MVC changed when using System.Text.Json:
MVC no longer buffers IAsyncEnumerable instances. Instead, MVC relies on the support that System.Text.Json added for these types.
This means that the controller starts sending output immediately, and a client may start processing it as it receives chunks of the response.
Here is an example with the help of the new minimal APIs:
Endpoint binding:
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// this endpoint returns IAsyncEnumerable<TestData>
app.MapGet("/asyncEnumerable/{count}", (int count) => GetLotsOfDataAsyncEnumerable(count));

// and this one returns Task<IEnumerable<TestData>>
app.MapGet("/{count}", async (int count) => await GetLotsOfDataAsync(count));

app.Run();
Handler methods:
async Task<IEnumerable<TestData>> GetLotsOfDataAsync(int count)
{
    var list = new List<TestData>();
    for (int i = 0; i < count; i++)
    {
        await Task.Delay(10);
        list.Add(new TestData($"{i}"));
    }
    return list;
}

async IAsyncEnumerable<TestData> GetLotsOfDataAsyncEnumerable(int count)
{
    for (int i = 0; i < count; i++)
    {
        await Task.Delay(10);
        yield return new TestData($"{i}");
    }
}

class TestData
{
    public string Field { get; }

    public TestData(string field)
    {
        Field = field;
    }
}
The count path variable controls how much data we retrieve in a single call.
I've tested it with curl on a Windows machine (here is the answer explaining how to measure performance with curl); results for 100 entries:
                       /100         /asyncEnumerable/100
time_namelookup:       0.000045s    0.000034s
time_connect:          0.000570s    0.000390s
time_appconnect:       0.000000s    0.000000s
time_pretransfer:      0.000648s    0.000435s
time_redirect:         0.000000s    0.000000s
time_starttransfer:    1.833341s    0.014880s
                       ---------------------------------
time_total:            1.833411s    1.673477s
The important value to look at here is time_starttransfer; from the curl manpage:
The time, in seconds, it took from the start until the first byte was just about to be transferred. This includes time_pretransfer and also the time the server needed to calculate the result.
As you can see, the /asyncEnumerable endpoint started responding almost instantly. Of course, clients of such endpoints have to be aware of this behavior to make good use of it.
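On the consumer side, taking advantage of this means reading the response while it is still streaming. Below is a minimal sketch using System.Text.Json's DeserializeAsyncEnumerable; the base address/port and the client-side TestData mirror are assumptions, not part of the measurements above:

using System;
using System.Net.Http;
using System.Text.Json;

using var http = new HttpClient();
using var request = new HttpRequestMessage(HttpMethod.Get, "http://localhost:5000/asyncEnumerable/100");

// ResponseHeadersRead lets us start reading before the whole body has arrived.
using var response = await http.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
response.EnsureSuccessStatusCode();

await using var stream = await response.Content.ReadAsStreamAsync();
var options = new JsonSerializerOptions(JsonSerializerDefaults.Web);

// Items are deserialized one by one as the JSON array elements arrive.
await foreach (TestData item in JsonSerializer.DeserializeAsyncEnumerable<TestData>(stream, options))
{
    Console.WriteLine(item?.Field);
}

// Client-side mirror of the server's TestData (assumed shape).
record TestData(string Field);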

Web api call in foreach loop - Timeout Error

I am using a .NET Core Web API. I am working with some third-party web APIs which I need to call from my Web API (so my .NET Core Web API is a kind of wrapper over the third-party ones).
To get my result, I need to call more than one third-party API in a foreach loop. The details are as below:
First there is a Web API call which gives me a result of around 4000 rows (each row is an object with Id and Value fields).
After that I need to loop through these 4000 rows and, using each Id, call another API. On the result of that call I need to check some validation and return the valid ones.
I am able to make the first Web API call successfully, but when I loop to make the second API calls I get a timeout error.
I have tried the things below:
1) Making batches of the 4000 rows and processing them in batches.
2) Adding tasks in a `foreach` loop and using `Task.WhenAll`.
Example:
var batchSize = 50;
var returnData = new List<Order>();

foreach (var batchedItems in inventoriesList.Batch(batchSize)) // 4000 rows
{
    var tasks = new List<Task<Order>>();
    foreach (var item in batchedItems)
    {
        tasks.Add(GetOrder(item.Value)); // call to another api
    }

    foreach (var order in await Task.WhenAll(tasks))
    {
        returnData.Add(order);
    }
}

private async Task<Order> GetOrder(string id)
{
    var isValidOrder = false;
    var order = await GetAsync<Order>(apiUrl); // apiUrl: the third-party endpoint for this id

    if (order != null
        && order.IsAvailable == false
        && ValidateOrder(order))
    {
        isValidOrder = true;
    }

    return isValidOrder ? order : null;
}
I have tried LINQ as well, rather than doing a foreach loop for the second API call, like below:
tasks = batchedInventories.Select(t => GetOrder(t.Value));
var result = await Task.WhenAll(tasks);
I have also tried increasing the KeepAliveTimeout of Kestrel, but no luck.
Could anybody suggest a correct and working way to do this?
I managed to resolve this error by creating a static HttpClient for the GetAsync method rather than creating a new instance for each request. Thanks a ton to John!!
I initialized the static HttpClient in the constructor with a CookieContainer.
Reference : Using httpclient throughout methods without losing session and cookies
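For reference, a minimal sketch of that shape (class and method names here are illustrative, not the actual code): one HttpClient for the lifetime of the application, created with a shared CookieContainer, instead of a new HttpClient per request:

using System.Net;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

public class ThirdPartyApiClient
{
    // One handler + client for the whole application lifetime; the shared
    // CookieContainer keeps the session cookies across calls.
    private static readonly CookieContainer Cookies = new CookieContainer();
    private static readonly HttpClient Client = new HttpClient(
        new HttpClientHandler { CookieContainer = Cookies });

    public async Task<T> GetAsync<T>(string url)
    {
        using var response = await Client.GetAsync(url);
        response.EnsureSuccessStatusCode();
        var json = await response.Content.ReadAsStringAsync();
        return JsonSerializer.Deserialize<T>(json);
    }
}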

If another request is running a method, wait until it finishes

I'm developing an ASP.NET Web API application with C#, .NET Framework 4.7 and MongoDb.
I have this method:
[HttpPut]
[Route("api/Public/SendCommissioning/{serial}/{withChildren}")]
public HttpResponseMessage SendCommissioning(string serial, bool withChildren)
{
    string errorMsg = "Cannot set commissioning.";
    HttpResponseMessage response = null;
    bool serverFound = true;

    try
    {
        [...]

        // Mongo
        MongoHelper mgHelper = new MongoHelper();
        mgHelper.InsertCommissioning(serial, withChildren);
    }
    catch (Exception ex)
    {
        _log.Error(ex.Message);
        response = Request.CreateResponse(HttpStatusCode.InternalServerError);
        response.ReasonPhrase = errorMsg;
    }

    return response;
}
Sometimes this method is called in very quick succession and I get an error here:
// Mongo
MongoHelper mgHelper = new MongoHelper();
mgHelper.InsertCommissioning(serial, withChildren);
Here I'm inserting the serials I received, in order, and sometimes I get a duplicated-key error from MongoDB.
I have a method to get the latest id used in Mongo (the primary key), and two requests get the same id, so when I try to insert it into Mongo I get an invalid key exception.
I thought about using a queue to store the serials and then consuming them in the same order that I received them. But I think I will get the same error when I try to store the serials in MongoDB.
Maybe it would work if I could make the method wait whenever it is already being run by another request. This method would contain the part that inserts the serials into Mongo.
How can I do that? A method that, if it is already running, can't be run by another Web API request.
Or do you know a better option?
By the way, I can't block this method. Maybe I need to run a thread with this synchronized part.
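To illustrate what I mean, something like the following sketch (illustrative names only), where a shared SemaphoreSlim(1, 1) makes a request wait while another one is inside the critical section. Note that this would only serialize requests within a single process/instance:

using System.Threading;
using System.Threading.Tasks;

public class CommissioningWriter
{
    // Allows one caller at a time into the critical section.
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(1, 1);

    public async Task InsertAsync(string serial, bool withChildren)
    {
        await Gate.WaitAsync();   // waits here while another request holds the gate
        try
        {
            MongoHelper mgHelper = new MongoHelper();
            mgHelper.InsertCommissioning(serial, withChildren);
        }
        finally
        {
            Gate.Release();       // always release, even if the insert throws
        }
    }
}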

Best way to debug yield return that is consumed during serialization?

I'm working on an application that embeds JSON within the page. Here is a simplified example:
public ViewResult Page(IEnumerable<LolCat> lolCats)
{
    var json = new
    {
        localCats = ToJson(lolCats),
    };
    return View(json); // this gets serialized somewhere in the ASP pipeline
}

IEnumerable<object> ToJson(IEnumerable<LolCat> lolCats)
{
    foreach (var lolCat in lolCats)
        yield return new { name = lolCat.name };
}
The JSON gets automatically serialized somewhere down the line in the ASP.NET pipeline.
In this example, assume that sometimes a null slips into lolCats, throwing an exception. The problem is that the ToJson function might be called in a lot of different places throughout the application.
How do I find out which call to ToJson was the one responsible for the exception? The call stack ends in the serializer that is actually consuming this IEnumerable, so you don't see the 'original' stack trace.
One simple fix would be to call ToList() within Page, but I'm looking for a way that doesn't break the laziness of the method.
Due to its deferred nature, you will never see which call to ToJson() actually produced the exception; the collection is not inspected at all until it is first enumerated (when it is serialized).
You need to inject into your enumerator some information about what called it, e.g.:
IEnumerable<object> ToJson(IEnumerable<LolCat> lolCats, string id)
{
    // C# does not allow yield return inside a try block that has a catch clause,
    // so wrap only the MoveNext/projection step and yield outside of it.
    using (IEnumerator<LolCat> enumerator = lolCats.GetEnumerator())
    {
        while (true)
        {
            object item;
            try
            {
                if (!enumerator.MoveNext())
                    yield break;
                item = new { name = enumerator.Current.name };
            }
            catch (Exception ex)
            {
                throw new Exception(id, ex); // use a more appropriate exception type
            }
            yield return item;
        }
    }
}
Then it's just a matter of generating an id that could help identify the caller.
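One way to generate such an id without passing it by hand at every call site is to use caller info attributes. A sketch (it just builds the id and forwards to the id-taking overload above, so call sites can stay as ToJson(lolCats)):

using System.Collections.Generic;
using System.Runtime.CompilerServices;

IEnumerable<object> ToJson(
    IEnumerable<LolCat> lolCats,
    [CallerMemberName] string member = "",
    [CallerFilePath] string file = "",
    [CallerLineNumber] int line = 0)
{
    // The compiler fills in member/file/line at each call site,
    // e.g. "HomeController.cs:42 (Page)".
    return ToJson(lolCats, $"{System.IO.Path.GetFileName(file)}:{line} ({member})");
}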

AspNetCacheProfile attribute in inner method

I have an API method that gets a lot of data at application start-up.
90% of the data is relevant for all application users, and the other 10% needs to change based on the user id and their environment and app version.
To avoid calling the first method every time a user connects and making start-up slower, I added the AspNetCacheProfile attribute.
But in this situation I can't use the user id, environment and version in this method, because the cached data is that of the first user who called the method.
So I added a new method (the second one), set the AspNetCacheProfile attribute on it, and called it from the first one.
This way I can cache the general data and update the other 10% on each call.
I just wanted to know: will it work?
I was not sure because the second method is not called directly; it is called from the first one.
[WebGet(UriTemplate = "/GetData")]
public APIResponse GetData()
{
    try
    {
        MyCachedResponse data = GetMyCachedResponse();
        foreach (var info in data.info)
        {
            if (Something)
            {
                // Update some specific values inside
            }
        }
        AppState.CurrentResponse.Data = data;
    }
    catch (Exception)
    {
        // handle/log the error
    }
    return AppState.CurrentResponse;
}

[AspNetCacheProfile("OneMinuteCaching")]
private MyCachedResponse GetMyCachedResponse()
{
    return new MyCachedResponse(categories);
}
