I have business logic with many DB fetch operations and some fairly complex rules.
The data fetched rarely changes within a user's session.
For every action on the form (button click, value change in a textbox, etc.) we need to run the business logic to check whether the change is valid.
Currently we are using an ASP.NET Web Forms application, and this business logic is registered InSessionScope().
We are now migrating to a RESTful API (Web API).
Can we use sessions (InSessionScope()) in a RESTful service?
If not, how do we avoid repeated database calls, reuse the same objects on subsequent calls, and improve performance?
Based on my personal experience, NEVER use Session in a REST application such as ASP.NET Web API, even though you technically can. Instead, use tokens for authorization and user profiles (with ASP.NET Identity). For performance (i.e. not hitting the DB too many times), here are some approaches I have used:
1 - USE CACHE!! There are some great frameworks and libraries for caching, and you can cache at different layers: the query, the Web API response, and so on. For example, I cache the entire API response (JSON) and automatically invalidate it on POST / PUT / DELETE requests. In .NET you can use https://github.com/filipw/Strathweb.CacheOutput (see the sketch after this list).
You can also use Redis for caching, if you don't want to cache locally on the server but want a distributed cache instead.
2 - Try to think in a NoSQL way. In our application we use a mix of databases: SQL Server but also MongoDB (especially for large amounts of data). For example, we use SQL Server to manage ASP.NET Identity, but we use MongoDB to store our products (about 6 million of them), and queries take about 1 second (even with aggregation!).
3 - Try to use LocalStorage on the front end, if you can, to store some information, and then sync it back when you need to.
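For option 1, here is a minimal sketch of what that output caching can look like with Strathweb.CacheOutput (the WebApi.OutputCache.V2 package); the controller, data access, and cache durations are only placeholders:

using System.Collections.Generic;
using System.Web.Http;
using WebApi.OutputCache.V2;

public class ProductsController : ApiController
{
    // Cache the serialized response on the server and client for 5 minutes.
    [CacheOutput(ClientTimeSpan = 300, ServerTimeSpan = 300)]
    public IEnumerable<string> Get()
    {
        // The expensive DB work only runs when the cache entry has expired.
        return LoadProductsFromDatabase();
    }

    // The attribute below invalidates the cached GET response on writes.
    [InvalidateCacheOutput("Get")]
    public void Post([FromBody] string value)
    {
        // ... write to the database ...
    }

    private IEnumerable<string> LoadProductsFromDatabase()
    {
        // Placeholder for the real data access code.
        return new[] { "product-1", "product-2" };
    }
}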
Hope it helps. Enjoy Web API, enjoy REST! (And leave Web Forms as soon as you can, in my opinion!)
You can use tokens you implement yourself, or JWT tokens.
If you choose to implement a custom token, your login method must return a token to your app; then, on every API call, pass this token in a header or query string and decrypt and validate it on the server.
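A minimal sketch of the server-side validation step using a Web API message handler; TokenValidationHandler and ValidateToken are hypothetical names, and the real validation logic depends on how you issue the token:

using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public class TokenValidationHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Expect the token in the Authorization header (a query string parameter works too).
        var authHeader = request.Headers.Authorization;
        if (authHeader == null || !ValidateToken(authHeader.Parameter))
        {
            return new HttpResponseMessage(HttpStatusCode.Unauthorized);
        }

        return await base.SendAsync(request, cancellationToken);
    }

    private bool ValidateToken(string token)
    {
        // Hypothetical: decrypt the token here and check its signature and expiry.
        return !string.IsNullOrEmpty(token);
    }
}

You would register the handler at startup, e.g. config.MessageHandlers.Add(new TokenValidationHandler()); in WebApiConfig.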
There are cache tag helper attributes like
vary-by-query
vary-by-route
vary-by-cookie
vary-by-user
vary-by
priority
that can be used directly in CSHTML views in MVC.
What is the best way to achieve the same behavior in ASP.NET Core with a caching mechanism implemented using IDistributedCache?
There's no concept of vary in IDistributedCache. The cache tag helper attributes, and response caching in general, are implemented as part of the request pipeline. While you can use IDistributedCache within the request pipeline, it is not itself part of that pipeline and doesn't natively have access to anything from the request.
You can somewhat implement this via the key of the entry you're adding. For example, if you want to vary the cache by the logged-in user, simply prefix all your keys with something like $"User{userId}". Because the literal key is then different from user to user, the cached value will be as well. However, this is all a manual affair: you have to decide how to structure the keys and actually implement that in your app code when using the cache.
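A minimal sketch of that manual "vary by user" keying on top of IDistributedCache; the service, key layout, and expiration below are assumptions, not part of the framework:

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public class ReportService
{
    private readonly IDistributedCache _cache;

    public ReportService(IDistributedCache cache)
    {
        _cache = cache;
    }

    // "Vary by user" is achieved purely by building the user id into the key.
    public async Task<string> GetReportAsync(string userId)
    {
        string key = $"User{userId}:report";

        string cached = await _cache.GetStringAsync(key);
        if (cached != null)
        {
            return cached;
        }

        string report = await BuildReportFromDatabaseAsync(userId);

        await _cache.SetStringAsync(key, report, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
        });

        return report;
    }

    private Task<string> BuildReportFromDatabaseAsync(string userId)
    {
        // Placeholder for the expensive query this caching is meant to avoid.
        return Task.FromResult($"report for {userId}");
    }
}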
Is it possible to cache a response on the server side once it has been produced, and then redeliver it in response to the same request?
Let me explain:
I have an endpoint that takes about 5 seconds to generate a response. This includes going to the database and fetching data, processing it, performing some computations on it, serializing it, and gzipping the response; the entire thing takes 5 seconds.
Once this is done for the first time, I want the result to be available to all requests coming from all users.
In my view, client-side caching, where you either cache the result on the client and do not hit the server at all for some time, or where you hit the server but get a 304 Not Modified instead of the data, is not good enough.
What I want is to hit the server and, if this endpoint (with the same set of parameters) has already been called by anyone, get the full response. Is that possible at all?
You have a number of options for this.
One option is API-level caching: you create a key from the parameters required to generate the response, go and fetch the data, and save the key/value pair in the cache. The next time a request comes in, you rebuild the key and check your cache first. If it's there, happy days, return it; if not, go fetch it and store it.
This of course depends on the amount of data you have; with too much data, or data that is too big, this will not work. You could also store it only for a while, say 10 minutes, 1 hour, etc.
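A minimal sketch of that first option using the in-memory MemoryCache from System.Runtime.Caching; the names, key format, and the 10-minute lifetime are just placeholders:

using System;
using System.Runtime.Caching;

public class ReportProvider
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public byte[] GetReport(int customerId, DateTime from, DateTime to)
    {
        // Build the cache key from the parameters that define the response.
        string key = string.Format("report:{0}:{1:yyyyMMdd}:{2:yyyyMMdd}", customerId, from, to);

        var cached = Cache.Get(key) as byte[];
        if (cached != null)
        {
            return cached;
        }

        // Slow path: database + computation + serialization + gzip (the ~5 second work).
        byte[] result = BuildReport(customerId, from, to);

        // Keep it for 10 minutes so every user benefits from the first call.
        Cache.Set(key, result, DateTimeOffset.UtcNow.AddMinutes(10));
        return result;
    }

    private byte[] BuildReport(int customerId, DateTime from, DateTime to)
    {
        // Placeholder for the expensive work described in the question.
        return new byte[0];
    }
}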
If you have a lot of data and caching like this isn't possible, then consider something else. Maybe create your own NoSQL cache store (using something like MongoDB, perhaps), store the result there, and retrieve it from there without any further processing, so it's a straight retrieve and thus very quick.
You could also use something like Redis Cache.
Lots of options, just choose whatever is appropriate.
I'm developing a jQuery website that will display a single record from my Azure Table Storage (ATS) account. I don't want to use jQuery to directly access the table, since that would require disclosure of my ATS account name and key in the jQuery code. I've tried to find a simple C# web service example project that would be the interface, but everything I can find is much more complicated than I need.
This web service will need just one API that jQuery will use: it will be passed two strings, the Partition Key and the Row Key for ATS, which will exactly match an existing record in ATS. The result returned will be a string that jQuery will convert using JSON.parse() after it is received. If no record is found with the partition and row keys passed in, an empty string should be returned.
If you know of an example of a simple C# web service that I could use as a starting point, I would greatly appreciate a link to it. It's been many years since I developed with C#, and the complicated nature of the Table service API, with all the associated crypto, hashing, signatures, etc., has left me confused.
Edit: I now realize that maybe both my jQuery code (providing the web UI) and the C# (providing the ATS interface) might work together in one .NET solution. I'm currently running the jQuery UI app standalone in its own .NET solution, due to my path of fumbling around trying things out.
I don't want to use jQuery to directly access the table, since that would require disclosure of my ATS account name and key in the jQuery code.
It seems that you do not want the jQuery client to directly make a GET request to query an entity via the Table service REST API, and you'd like to create a backend service for querying entities in the table. As maccettura mentioned in a comment, you can create an ASP.NET Web API project and perform the Query Entities operation in a controller action.
[Route("queryentity/{pk}/{rk}")]
public CustomerEntity Get(string pk, string rk)
{
//you can install [Azure Storage Client Library for .NET](https://www.nuget.org/packages/WindowsAzure.Storage/)
//and then create a retrieve operation and pass both partition and row keys to retrieve a single entity
//TableOperation retrieveOperation = TableOperation.Retrieve<CustomerEntity>(pk, rk);
//or
//make [Query Entities](https://learn.microsoft.com/en-us/rest/api/storageservices/query-entities) operation as you did
return myCustomerEntity;
}
I don't understand some code in the Microsoft.Web.WebPages.OAuth namespace, specifically the OAuthWebSecurity class.
It's this method here:
internal static void RequestAuthenticationCore(HttpContextBase context,
string provider, string returnUrl)
{
IAuthenticationClient client = GetOAuthClient(provider);
var securityManager = new OpenAuthSecurityManager(context,
client, OAuthDataProvider);
securityManager.RequestAuthentication(returnUrl);
}
The first line is fine => grab the provider data for this authentication request. Let's pretend this is a TwitterClient(..).
Now, we need to create a SecurityManager class .. which accepts three args. What is that 3rd arg? An OAuthDataProvider? That's defined as a static, here:
internal static IOpenAuthDataProvider OAuthDataProvider =
new WebPagesOAuthDataProvider();
And this creates a WebPagesOAuthDataProvider. This is my problem. What is this? And why does it have to be tightly coupled to an ExtendedMembershipProvider? What is an ExtendedMembershipProvider? Why is this needed?
In my web application I'm trying to use a RavenDb database and my own custom principal and custom identity. Nothing to do with Membership or SimpleMembership that comes with ASP.NET.
What is that class, why is it used, and what is its purpose? Is this something that DNOA requires, and why?
I didn't write the code you mention, so I could be wrong here, but I believe the ASP.NET code you refer to is indeed bound to their Membership provider.
If you aren't using the ASP.NET membership provider, I would suggest you simply use DotNetOpenAuth directly (as opposed to through the facade that Microsoft added), which has no such tight coupling.
If you don't need the ASP.NET Membership system to provide local login accounts (accounts stored in your local membership database), I wouldn't go down the route of using any of the WebMatrix-based bits (WebSecurity / OAuthWebSecurity).
They actually make it harder to interact with DNOA and more or less hide all the interesting bits at the same time anyway ...
As I needed local accounts, I ended up pulling all the source code for this into my own source code and editing it from there (I had other reasons for doing this as well, not just to enrich the interaction with DNOA).
If you need local accounts - use WebMatrix
If you don't need local accounts - use DNOA directly.
I have a page that executes a long process, parsing over 6 million rows from several csv files into my database.
My question: when the user clicks "GO" to start processing and parsing the 6 million rows, can I set a session variable that is immediately available to the rest of the web application, so that any user of the site knows that a user with a unique ID number has started parsing files, without having to wait until all 6 million rows have been processed?
Also, with jQuery and JSON, I'd like to get feedback on a web page about which CSV file is being processed and how many rows have been processed so far.
There could be other people parsing files at the same time. How can I track all of this and prevent any mix-ups between users, even though there is no login or user authentication on the site?
I'm developing in C# with .NET 4.0 Entity Framework 4.0, jQuery, and MS SQL 2008 R2.
I was thinking of using session variables; however, in the static [WebMethod] used for my jQuery JSON calls I can only reach the session via HttpContext.Current.Session, and I'm not sure whether that approach would work.
Any guidance or ideas would be mostly appreciated.
Thanks
First of all: session variables are not supposed to be visible to every user everywhere.
When a client connects to the server, the server creates a session for them; on subsequent requests from the same user (within the session's expiration time), that session (and its variables) is available again.
You can use a static class for this if you want to.
For example:
using System.Collections.Concurrent;

public static class MyApplicationStateBag
{
    // Shared across all requests; ConcurrentDictionary keeps concurrent access thread-safe.
    public static readonly ConcurrentDictionary<string, object> Objects =
        new ConcurrentDictionary<string, object>();
}
And for your progress report, you can use an asp:Timer to check the progress percentage every second or two.
Here is some sample code I have written for an asp:Timer within an UpdatePanel:
Writing a code trigger for the updatepanel.
I suggest you use a Guid identifying the current import as the key into your state bag.
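A minimal sketch of that idea; ImportProgress and the method names are hypothetical, and MyApplicationStateBag is the static bag shown above:

using System;

public class ImportProgress
{
    public string CurrentFile { get; set; }
    public int RowsProcessed { get; set; }
}

public static class ImportJobs
{
    // Called when the user clicks GO; the returned Guid is the key the client polls with.
    public static Guid Start()
    {
        Guid jobId = Guid.NewGuid();
        MyApplicationStateBag.Objects[jobId.ToString()] = new ImportProgress();
        return jobId;
    }

    // Called from the parsing loop to record which file and row the job is on.
    public static void Report(Guid jobId, string currentFile, int rowsProcessed)
    {
        MyApplicationStateBag.Objects[jobId.ToString()] =
            new ImportProgress { CurrentFile = currentFile, RowsProcessed = rowsProcessed };
    }

    // Called from the [WebMethod] (or timer) so any user can read the progress.
    public static ImportProgress GetProgress(Guid jobId)
    {
        object value;
        return MyApplicationStateBag.Objects.TryGetValue(jobId.ToString(), out value)
            ? (ImportProgress)value
            : null;
    }
}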
The correct way of doing this is via services, for example WCF services. You don't want to put an immense load on the web server, which is not meant for that kind of work.
The usual scenario:
User clicks on GO button
Web server creates a job and starts this job on a separate WCF service
Each job has ID and metadata (status, start time, etc.) that is persisted to the storage
Web server returns response with job ID to the user
The user, via AJAX (jQuery), queries the job in the storage; once it has completed, you can retrieve the results
You can also save Job ID to the session
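A rough sketch of the web-side half of that scenario; the job store, the controller shape, and the background call are all assumptions (in the real setup the work would run in the separate WCF service and the job metadata would be persisted to real storage):

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Web.Http;

public class ImportJob
{
    public Guid Id { get; set; }
    public string Status { get; set; }       // e.g. "Running", "Completed"
    public DateTime StartTime { get; set; }
}

public class JobsController : ApiController
{
    // Stand-in for persistent job storage (a database table in a real system).
    private static readonly ConcurrentDictionary<Guid, ImportJob> Jobs =
        new ConcurrentDictionary<Guid, ImportJob>();

    // "GO" button: create the job, kick off the work, and return the job ID to the client.
    [HttpPost]
    public Guid Start()
    {
        var job = new ImportJob { Id = Guid.NewGuid(), Status = "Running", StartTime = DateTime.UtcNow };
        Jobs[job.Id] = job;

        // In the real setup this call would go to the separate WCF service.
        ThreadPool.QueueUserWorkItem(_ =>
        {
            // ... parse the CSV files here ...
            Jobs[job.Id].Status = "Completed";
        });

        return job.Id;
    }

    // Polled from jQuery via AJAX until the job reports "Completed".
    [HttpGet]
    public ImportJob Status(Guid id)
    {
        ImportJob job;
        return Jobs.TryGetValue(id, out job) ? job : null;
    }
}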
P.S. It's not a direct answer to your question, but I hope it helps.