Get responses from multiple Web API endpoints in an efficient manner - C#

I am new to ASP.NET Web API. I need to achieve the following. I have two Web API endpoints:
http://someurl.azurewebsites.net/api/recipe/recipes This returns the recipes that are available.
[ID, Name, Type]
http://someurl.azurewebsites.net/api/recipe/{ID} This returns the recipe with [ID, Ingredients, Cost....]
I need to build an application to get the cheapest recipe in a timely manner.
I am able to get the desired result using the following code, but at times it crashes and throws the following exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.
How can I achieve this in an efficient manner, using either a controller or JavaScript?
public ActionResult Index()
{
    List<Receipe> receipelist = new List<Receipe>();
    var baseAddress = "http://someurl.azurewebsites.net/api/recipe/recipes";
    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders.Add("x-access-token", "sjd1HfkjU83");
        using (var response = client.GetAsync(baseAddress).Result)
        {
            if (response.IsSuccessStatusCode)
            {
                var jsonString = response.Content.ReadAsStringAsync().GetAwaiter().GetResult();
                var Receipes = JsonConvert.DeserializeObject<List<Receipe>>(jsonString.Substring(jsonString.IndexOf("Receipes") + 8, (jsonString.Length - jsonString.IndexOf("Receipes") - 9)));
                if (Receipes != null)
                {
                    foreach (Receipe Receipe in Receipes)
                    {
                        var baseAddress1 = "http://someurl.azurewebsites.net/api/recipe/" + Receipe.ID;
                        using (var client1 = new HttpClient())
                        {
                            client1.DefaultRequestHeaders.Add("x-access-token", "sjd1HfkjU83");
                            using (var response1 = client1.GetAsync(baseAddress1).Result)
                            {
                                var jsonString1 = response1.Content.ReadAsStringAsync().GetAwaiter().GetResult();
                                receipelist.Add(JsonConvert.DeserializeObject<Receipe>(jsonString1));
                            }
                        }
                    }
                }
            }
            else
            {
                Console.WriteLine("{0} ({1})", (int)response.StatusCode, response.ReasonPhrase);
            }
        }
    }
    return View(receipelist);
}

There are several ways to go about this. Without experimenting, it is not really possible to say which would be the most efficient, or maybe we don't really need to be that efficient. Consider the following steps:
Try to cache the result of the first response, i.e. the list of available recipes (Endpoint #1). You could cache it for some timespan 'x', depending on your business requirements; that will most likely save you the first trip. My guess is that the 'list of available recipes' will not change very often, so you could probably cache it for a while (hours? days?). You could use a dictionary as the data structure and cache that (see the sketch after these steps).
If you have control of Endpoint #2, then I'd recommend providing an API that takes a 'list of IDs' instead of just one ID. In essence, you are asking Endpoint #2 to return a 'list of recipes with prices' in one API call rather than looping over one recipe at a time. You could probably cache this data as well, depending on how often the prices change; my guess is this won't change very often either. When you get the 'list of prices with IDs', efficiently plug the price info into the existing dictionary.
When you hit an exception while making an API call, you can always return stale data to the users instead of showing no data or an error. Be sure to let the users know by labeling it in the UI, something like 'Recipes as of 2 hours ago'.
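As a rough illustration of the first step, here is a minimal sketch using MemoryCache from System.Runtime.Caching. The Receipe class comes from your code; the cache key, the one-hour lifetime, and the loadFromApi delegate are placeholders for illustration only.

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public static class ReceipeCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // loadFromApi is whatever call actually hits Endpoint #1 (e.g. your existing HttpClient code).
    public static List<Receipe> GetReceipes(Func<List<Receipe>> loadFromApi)
    {
        // Serve the cached list while it is still fresh.
        if (Cache.Get("Receipes") is List<Receipe> cached)
        {
            return cached;
        }

        // Otherwise hit Endpoint #1 once and cache the result for a while (hours/days, per your needs).
        var receipes = loadFromApi();
        Cache.Set("Receipes", receipes, DateTimeOffset.Now.AddHours(1));
        return receipes;
    }
}

The same pattern works for the prices from Endpoint #2 (with a shorter lifetime), and on an exception you can simply keep returning the last cached value as the 'stale data' fallback described above.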
With the above changes, your application should perform well and, in the worst case, display stale data.
Keep in mind that, depending on the problem domain, we don't always have to be up-to-the-second or up-to-the-minute consistent with the data. In reality, data on the internet is always somewhat stale. For example, by the time your end user sees the price and decides to buy a recipe, the price of the recipe might already have changed!
Hope this helps.

Related

Dynamics CRM: CreateRequest concurrency issue

I am using the MS Dynamics CRM SDK with C#. I have a WCF service method which creates an entity record.
I am using CreateRequest in the method. The client calls this method with 2 identical requests, one immediately after the other.
There is a fetch before creating a record: if the record is available, we update it. However, the 2 inserts happen at exactly the same time,
so 2 records with identical data get created in CRM.
Can someone help me prevent this concurrency issue?
You should enforce the duplicate detection rule and decide what to do when a duplicate is detected:
Account a = new Account();
a.Name = "My account";

CreateRequest req = new CreateRequest();
req.Parameters.Add("SuppressDuplicateDetection", false);
req.Target = a;

try
{
    service.Execute(req);
}
catch (FaultException<OrganizationServiceFault> ex)
{
    if (ex.Detail.ErrorCode == -2147220685)
    {
        // Account with name "My account" already exists
    }
    else
    {
        throw;
    }
}
As Filburt commented on your question, the preferred approach would be to use an Alternate Key and Upsert requests, but unfortunately that's not an option if you're working with CRM 2013.
In your scenario, I'd implement a very lightweight cache in the WCF service, probably using the MemoryCache object from the System.Runtime.Caching.dll library. Before executing the query against CRM, check whether the record exists in the cache; if it doesn't, continue with your current processing (remembering to add the record to the cache with a small expiration time, to cover potential concurrent executions). If the record already exists in the cache, handle that scenario instead; here you can go from fairly complex checks that detect and prevent potential data loss or unnecessary updates, down to a simple and stupid Thread.Sleep(1000).
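A very rough sketch of that idea follows; the class shape, the cache key, and the 30-second lifetime are purely illustrative, not a drop-in implementation.

using System;
using System.Runtime.Caching;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public class AccountCreator
{
    private static readonly MemoryCache RecentCreates = MemoryCache.Default;

    // Account is the early-bound entity class from the code above.
    public void CreateIfNotRecentlySeen(IOrganizationService service, string accountName)
    {
        // AddOrGetExisting returns null when the key was not there before, so only the
        // first of two near-simultaneous calls with the same name takes the "create" path.
        var alreadySeen = RecentCreates.AddOrGetExisting(
            accountName, true, DateTimeOffset.Now.AddSeconds(30));

        if (alreadySeen != null)
        {
            // A concurrent or very recent request already handled this record:
            // update it, wait briefly, or simply return, depending on your needs.
            return;
        }

        var account = new Account { Name = accountName };
        service.Execute(new CreateRequest { Target = account });
    }
}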

TFS Object Model: Do WorkItems keep sessions open?

We're currently investigating an issue where, according to the firewall provider, we at times have around 1500 parallel sessions open. I have a strong suspicion that our TFS replication, a service which fetches work items via the TFS Object Model from an external TFS and saves some data into a local SQL database, is causing the issue.
The access to the object model looks like this:
internal static async Task<IReadOnlyCollection<WorkItem>> QueryWorkItemsAsync(string wiqlQuery)
{
    var tfsConfig = TFSConfiguration.Instances[Constants.TfsPrefix.External];
    var uri = new Uri(tfsConfig.WebbaseUri);
    var teamProject = new TfsTeamProjectCollection(uri, new VssBasicCredential(string.Empty, ConfigurationManager.AppSettings[Infrastructure.Constants.APP_SETTINGS_TFS_PAT]));
    var workItemStore = teamProject.GetService<WorkItemStore>();
    var query = new Query(workItemStore, wiqlQuery, null, false);

    var result = await Task.Run(
        () =>
        {
            var castedWorkItems = query.RunQuery().Cast<WorkItem>();
            return castedWorkItems.ToList();
        });

    return result;
}
Nothing too fancy: a WIQL query can be passed into the method. Currently I'm fetching blocks, so the WIQL looks like this:
var wiql = "SELECT * FROM WorkItems";
wiql += $" WHERE [System.Id] > {minWorkItemId}";
wiql += $" AND [System.Id] <= {maxWorkItemId}";
wiql += " ORDER BY [System.Id] DESC";
I'm pretty much doing nothing with these WorkItems except mapping some of their fields; I'm not writing or saving anything. I didn't find any hint of open sessions on the objects I'm using, and the WorkItem objects themselves only live in memory very briefly.
Am I missing something here that could explain the open sessions within that service?
The client object model does a number of things:
It keeps a connection pool for each user/projectcollection combination to speed up data transfers.
Each work item revision you hit needs to be fetched.
Each work item materialized from a query contains only the fields selected in the query, additional fields being accessed are fetched on demand.
The TfsTeamProjectCollection class implements IDisposable and must be cleaned up once in a while to ensure connections are closed. An internal cache is maintained, but disposing ensures that the connections are closed.
It's probably a good idea to wrap this code in a using (or try/finally) block, or to provide the Team Project Collection through dependency injection and manage the connection at a higher level (otherwise your additional fields will fail to be populated). A sketch of the first option follows.
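A minimal sketch of the using-block variant, reusing the method from the question; note that once the collection is disposed, fields that were not selected by the query can no longer be fetched lazily.

internal static async Task<IReadOnlyCollection<WorkItem>> QueryWorkItemsAsync(string wiqlQuery)
{
    var tfsConfig = TFSConfiguration.Instances[Constants.TfsPrefix.External];
    var uri = new Uri(tfsConfig.WebbaseUri);
    var credential = new VssBasicCredential(
        string.Empty,
        ConfigurationManager.AppSettings[Infrastructure.Constants.APP_SETTINGS_TFS_PAT]);

    // Disposing the collection closes its pooled connections when the query is done.
    using (var teamProject = new TfsTeamProjectCollection(uri, credential))
    {
        var workItemStore = teamProject.GetService<WorkItemStore>();
        var query = new Query(workItemStore, wiqlQuery, null, false);

        // Materialize everything needed before disposal; lazily loaded fields
        // will not be retrievable afterwards.
        return await Task.Run(() => query.RunQuery().Cast<WorkItem>().ToList());
    }
}

If you need lazy field access after the query, the dependency-injection route with a longer-lived, centrally disposed collection is the better fit.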
I don't know the exact details behind the WorkItem class, but I have observed that when you, for example, select only a few fields in the WIQL, you can still access the others ... and that is comparatively slow. If I select all the fields I later access through the indexer, it is much faster.
From that observation I would say: yes, a connection is kept open.

Calling a WCF async service in ASP.NET Core 2.0 while waiting for the result

I understand that I might be getting this wrong, since I am porting an ASP.NET application to ASP.NET Core 2.0 (mainly because of the optimizations regarding page load speed), but I will ask the question anyway.
My queries work properly when I am only fetching data. However, I ran into a problem when fetching a file path from the database in order to download the file on the client side. Since I don't need the whole model of the file, I have a 3-field DTO on the client side that I fill with the information about the file (location, size, file name). The problem is that when I send the async request to the WCF service on Azure, which holds my Entity Framework link to the database, the code continues without waiting for the data to be retrieved from the database and throws a NullReferenceException while attempting to fill the DTO object that is to be sent on to the client to retrieve the file marked for downloading.
This is my data access on the client side
internal async Task<AnimalDocument> GetAnimalDocument(int id)
{
    var data = await _context.GetAnimalDocumentAsync(id);
    var result = JsonConvert.DeserializeObject<AnimalDocument>(data);
    return result;
}
And this is where I get the null exception
public SeriliazedFile GetFile(int id, int type)
{
    var result = new SeriliazedFile();
    if (type == 1)
    {
        var data = _context.GetHumanFile(id);
        result.FileName = data.Result.DocumentName;
        result.FilePath = data.Result.DocumentLocation;
        result.FileSize = data.Result.FileSize.Value;
    }
    else if (type == 2)
    {
        var data = _context.GetAnimalDocument(id);
        result.FileName = data.Result.DocumentName;
        result.FilePath = data.Result.DocumentLocation;
        result.FileSize = data.Result.FileSize.Value;
    }
    return result;
}
Is there a way to force the async request to wait for the result before returning the Task that I retrieve from the WCF service? I've tried calling _context.GetAnimalDocument(id).Wait(); however, nothing happens; it still proceeds without any result. I've noticed that the trigger to retrieve the data fires after the AJAX request that is sent toward the page returns 200, causing something like a deadlock, but I might be wrong. If anyone could give me a workaround it would be nice. I am pretty sure I would figure it out on my own eventually, but time is scarce. Anyway, I hope you have a good day.
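For completeness, the fully asynchronous shape I was aiming for, awaiting instead of blocking on .Result, would look roughly like this (property names are from my code above; the caller, e.g. the controller action, would also have to be async and await it):

public async Task<SeriliazedFile> GetFileAsync(int id, int type)
{
    var result = new SeriliazedFile();

    if (type == 1)
    {
        // Await instead of blocking on .Result, so the request thread is not tied up.
        var data = await _context.GetHumanFile(id);
        result.FileName = data.DocumentName;
        result.FilePath = data.DocumentLocation;
        result.FileSize = data.FileSize.Value;
    }
    else if (type == 2)
    {
        var data = await _context.GetAnimalDocument(id);
        result.FileName = data.DocumentName;
        result.FilePath = data.DocumentLocation;
        result.FileSize = data.FileSize.Value;
    }

    return result;
}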
I am sorry for posting this; it was not an issue with WCF or the code in any way. async works perfectly fine with ASP.NET Core 2.0; the issue was with me. I am still adapting to the concept of having [FromBody] in front of the parameter types in the actions. It appears that I missed one, so I was getting id 0 by default in my id field (not that I can figure out why I would get 0 instead of null on an integer field when there is no data, but that doesn't matter anyway), and the data layer was returning a null value; that's why I was getting the NullReferenceException later.
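For anyone hitting the same thing, the kind of change involved looks roughly like this; the DTO, route, and action are made up for illustration, not my actual code. Without [FromBody], an int parameter bound from a JSON body silently stays at its default of 0.

public class FileRequest
{
    public int Id { get; set; }
    public int Type { get; set; }
}

[HttpPost]
public async Task<IActionResult> GetFile([FromBody] FileRequest request)
{
    // Without [FromBody], request.Id would bind to 0 and the data layer would return null.
    var file = await _fileService.GetFileAsync(request.Id, request.Type); // _fileService is illustrative
    return Ok(file);
}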

Can I get only the new Facebook comments since the last time I checked instead of all comments every time?

Right now I have an app that allows a user to schedule/post a Facebook post and then monitor the likes/comments. One of the problems I foresee is that I am currently pulling every single comment/like, whether it has been processed or not. What I would like to do instead is be able to say 'Give me all the NEW comments since XYZdate/XYZcomment.' Is this currently possible?
var accessToken = existingUserNode.Attributes["accessToken"].Value;
var facebookAPIMgr = new FacebookWrapper.FacebookAPIManager();
var msg = new FacebookWrapper.FacebookMessage()
{
    AccessToken = accessToken,
    FacebookMessageId = facebookPost.FacebookMessageId
};

// Get Facebook message comments
// Need to find a way to limit this to only new comments/likes
var comments = facebookAPIMgr.RetrieveComments(msg);
You can use time-based pagination as part of your Graph API query. If you keep a unix timestamp of when you last polled, you can simply request https://graph.facebook.com/{whatever}?since={last run}.
This worked when I was working heavily with the Graph API earlier this year, and it is still in the documentation, but considering how much Facebook loves to change things without telling anyone, you may still encounter problems. So just a warning: YMMV.
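A rough sketch of wiring that up in C#; the /comments edge and the access_token query parameter follow the usual Graph API conventions, and how you persist the last-poll timestamp is up to you.

using System;
using System.Net.Http;
using System.Threading.Tasks;

internal static async Task<string> GetNewCommentsJsonAsync(string messageId, string accessToken, DateTime lastRunUtc)
{
    // Unix timestamp of the previous poll.
    var since = (long)(lastRunUtc - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds;

    // Only comments created after 'since' should be returned.
    var url = $"https://graph.facebook.com/{messageId}/comments?since={since}&access_token={accessToken}";

    using (var http = new HttpClient())
    {
        return await http.GetStringAsync(url);
    }
}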

Should I output cache my control that looks up a Twitter RSS feed on load?

I have a user control that is featured on several pages of a heavily hit site. Some of these pages include our blog sidebar, our forum sidebar, and smack in the middle of our home page. That means this control is rendered a lot. The control is meant to read in a Twitter RSS feed for a specific account and write out the last 2 tweets. Below is the majority of my Page_Load for the control. I have it in a try/catch because it seems that on production the site is often unable to load the XML, so I need to flip on a friendly error message.
try {
    var feed = XmlReader.Create(ResourceManager.GetString("TwitterRSS"));
    var latestItems = SyndicationFeed
        .Load(feed)
        .GetRss20Formatter()
        .Feed
        .Items
        .Take(2);
    rptTwitter.ItemDataBound += new RepeaterItemEventHandler(rptTwitter_ItemDataBound);
    rptTwitter.DataSource = latestItems;
    rptTwitter.DataBind();
} catch (Exception ex) {
    phError.Visible = true;
}
As you can see, I just fetch the 2 most recent items and repeat over them in a repeater for the front end. If anything fails, I flip on the error PlaceHolder, which says something like "Twitter is unavailable".
I often see this error message on the prod site, so I'm wondering if it's making too many requests to the RSS feed. I was thinking about output caching the control for 10 minutes, but then I thought, "what if it gets cached in the error state?" Then it's guaranteed to display the error message panel for 10 minutes. My question is: is it true that if it displays the error from the catch while it's creating a newly cached version, that error output will truly be cached for 10 minutes (assuming I set Duration="600")? Does anyone have any tips as to how I can make this work better, or cache only when real Twitter data is rendered, not the error message?
Thanks in advance
Instead of caching the entire page, I would cache the application data returned by your
var latestItems = ....
statement, as well as whether you received an error. You can make the cache duration of each different, so if you successfully get the data you cache it longer than if you got an error. One implementation would look like this:
object Twitter = Cache["MyTwitter"];
if (Twitter == null)
{
    // cache empty
    try
    {
        var latestItems = (load items)
        Cache.Insert("MyTwitter", latestItems, null, DateTime.Now.AddSeconds(600),
            Cache.NoSlidingExpiration);
        Twitter = latestItems;
    }
    catch (Exception ex)
    {
        Cache.Insert("MyTwitter", ex.ToString(), null, DateTime.Now.AddSeconds(60),
            Cache.NoSlidingExpiration);
        Twitter = ex.ToString();
    }
}

if (Twitter is string)
{
    phError.Visible = true;
}
else
{
    rptTwitter.DataSource = Twitter;
    // rest of data binding code here
}
There are two parts here. The first part is to check the cache and, if the object is not in the cache, do your loading. If there's an error, just store a string in the cache.
Then, with your object, if it's a string you know you've got an error. Otherwise it's the result of retrieving the Twitter feed.
