Hi
I'm working on an ASP.NET project (a web app).
I have a lot of values, properties and fields, and in order to use them I need to store them somewhere. (I know about Session, Application and Cookie state.)
Application state isn't good because it is shared across all users; Session is better (I need these values on the server side). Cookies aren't good either, and I won't use them because of security concerns.
Session works, but if I use it heavily I have to pay for a lot of memory on the hosting server.
So is there any better place to hold them?
Edit 1
For more information: I currently fetch these values from the database (so I don't want to save them back to the database), and I use a WCF service to get them. I want to hold on to these values so I can use them later (for example, passing them to a service method, or using them to change something visually on the page).
Thanks all
As has been commented, there are many ways of implementing state management, depending on the amount of data you're looking to persist, overall volume of traffic, hosting costs, maintainability, etc.
This MS link describes the pros and cons of some of the techniques.
Yes, Session is the best option here. Of the server-side state management techniques, it actually consumes comparatively little memory.
If the values are large, you can store them in the database with the session ID as the key. That way they consume less memory on the web server.
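The suggestion above, keying stored values by the session ID, can be sketched in a few lines. This is a language-neutral illustration in Python using an in-memory SQLite table; the table name and helper functions are invented for the example, and a real ASP.NET app would use its own data-access layer:

```python
import sqlite3

# In-memory DB for illustration; a real app would use its configured database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE session_values (
        session_id TEXT NOT NULL,
        name       TEXT NOT NULL,
        value      TEXT NOT NULL,
        PRIMARY KEY (session_id, name)
    )
""")

def save_value(session_id, name, value):
    # Upsert keyed by (session_id, name): only the small session ID lives
    # in server memory; the payload lives in the database.
    conn.execute(
        "INSERT OR REPLACE INTO session_values VALUES (?, ?, ?)",
        (session_id, name, value),
    )

def load_value(session_id, name):
    row = conn.execute(
        "SELECT value FROM session_values WHERE session_id = ? AND name = ?",
        (session_id, name),
    ).fetchone()
    return row[0] if row else None

save_value("abc123", "theme", "dark")
print(load_value("abc123", "theme"))  # dark
```

A .NET version would look the same in shape: the session keeps only the ID, and everything bulky is a keyed lookup away.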
A database could be a good idea, perhaps SQL Server Express or SQL Server Compact Edition.
Related
I have to use a bulk amount of data throughout the session for each user in an MVC application (the data is user-specific).
Below are the two methods I have found:
Use a session object: load the data into it and retrieve it whenever needed.
Create an XML file on disk and retrieve the data from it whenever required.
Which of the above is good for a web application?
If neither is good, please let me know a more suitable method.
I have read about caching, but doesn't that raise the same memory concern as Session?
Please help me with a simple example if anyone has dealt with the same problem.
I would not go with either of the approaches you highlighted; both will give you problems and require code changes when you scale your application out to a load-balanced or web farm environment.
You should make your application as stateless as possible.
You can store the data in the database; if your tables are properly indexed, fetching the data will not take much time (almost the same as fetching it from an XML file).
If the data is not sensitive and you require it on the client side only, you can think about storing it in sessionStorage.
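As a rough illustration of how much indexing matters for the database approach above, here is a small Python/SQLite sketch (the table and index names are invented for the example). The query plan confirms that the per-user lookup seeks through the index instead of scanning the whole table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_data (user_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO user_data VALUES (?, ?)",
    [(u, f"row {n}") for u in range(100) for n in range(50)],
)
# The index lets the query seek straight to one user's rows
# instead of scanning all 5,000 of them.
conn.execute("CREATE INDEX ix_user ON user_data(user_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT payload FROM user_data WHERE user_id = ?", (42,)
).fetchone()
rows = conn.execute(
    "SELECT payload FROM user_data WHERE user_id = ?", (42,)
).fetchall()
print(len(rows), "USING INDEX" in plan[-1])  # 50 True
```

The same principle applies to SQL Server: an index on the user-key column turns the per-request fetch into a cheap seek.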
We have a fairly busy distributed cloud-based system that we want to introduce basic profiling into. Firstly we'd like to monitor web page render times and DB calls; we use EF and SQL Server.
The question is: what is the best (performant and easy) way to record this information? My first thought is to store it in a DB, but would this cause a performance issue, given that a single page render may require multiple DB calls and hence multiple inserts into the performance table?
Would it be better to store this information in memory only, or perhaps store it in memory short-term and then batch-persist it to a DB later? Or is some other approach recommended?
If you simply send INSERT statements to the database without block-waiting to receive the values of identity columns, it should be fairly lightweight for the web server.
If the database table does not have any keys (or only a clustered key), it should be fairly lightweight on the database server, too.
This would certainly be the easiest approach, and it would not take much to implement it and give it a try, so I would recommend that you check whether it covers your needs before trying anything else.
I have a DataTable which fetches ~550,000 records from the database (SQLite). Fetching them slows down the system.
I store these records on the backend in a SQLite database and on the frontend in a DataTable.
What should I do so that the database creation time (~10.5 hours) on the backend and the fetching time on the frontend are reduced?
Is there any other structure that can be used for this? I have read that Dictionary and binary files are fast. Can they be used for this purpose, and how?
(It's not a web app; it's a WPF desktop app where the frontend and backend are on the same machine.)
Your basic problem, I believe, is not the structure you maintain but the way you manage your data flow. Instead of fetching all the data from the database into some structure (a DataTable in your case), write a stored procedure that performs the calculations you need on the server side and returns the already-calculated data. This way you gain several benefits:
the machines hosting database servers are usually faster than development or client machines
a huge reduction in data transmission, as you return only the result of the calculation
EDIT
Considering the edited post, I would say that DataTable is already highly optimized for in-memory access, and I don't think changing it to something else will bring you notable benefits. What I think can bring a benefit is a revision of the program flow. In other words, try to answer the following questions:
do I need all of those records at the same time?
can I run some calculations in a service, say, overnight?
can I use SQL Server Express (just an example) and gain the ability to run a stored procedure, which may (this has to be measured) run the job faster, even on the same machine?
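The point about returning only calculated results can be illustrated with a small sketch. Assuming a hypothetical readings table, the aggregation runs inside the database and only two result rows cross the boundary instead of 1,500:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("a", v) for v in range(1000)] + [("b", v) for v in range(500)],
)

# Instead of pulling all 1,500 rows into a client-side structure
# (a DataTable, say) and aggregating there, let the database
# return just the result rows.
rows = conn.execute(
    "SELECT sensor, COUNT(*), AVG(value) FROM readings "
    "GROUP BY sensor ORDER BY sensor"
).fetchall()
print(rows)  # [('a', 1000, 499.5), ('b', 500, 249.5)]
```

In the SQL Server Express scenario above, the same query would live in a stored procedure, but the payoff is identical: computation moves to the data.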
Have you thought about doing the calculation at the location of the data, rather than fetching it? Can you give us some more information about what the data is stored in, how you are fetching it, and how you are processing it, please?
It's hard to make a determination about an optimisation without the metrics and the background information.
It's easy to say "yes, put the data in a file". But is the file local? Is the network the problem? Are you making the best use of cores/threading?
It's much better to start from the data, see what needs to be done to it, and then engineer the best optimisation.
Edit:
OK, so you are on the same machine? One thing you should really consider in this scenario is what you are doing with the data. Does it need to be SQL? Are you just using it to load a DataTable, or is there a complexity you are not disclosing?
I've had a similar task: I just created a large text file and used memory mapping to read it efficiently without any overhead. Is this the kind of thing you're talking about?
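The memory-mapping idea above can be sketched with Python's standard mmap module (the file name and contents are invented for the example); .NET has an equivalent in MemoryMappedFile:

```python
import mmap
import os
import tempfile

# Write a large-ish text file, then read it back via memory mapping:
# the OS pages data in on demand instead of copying the whole file
# into the process.
path = os.path.join(tempfile.mkdtemp(), "records.txt")
with open(path, "w") as f:
    for i in range(100_000):
        f.write(f"record {i}\n")

with open(path, "rb") as f, \
        mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
    # Random access without reading the whole file into memory.
    first_line = mm[:mm.find(b"\n")]
    hit = mm.find(b"record 99999")
print(first_line.decode(), hit > 0)  # record 0 True
```

For sequential, read-mostly access this can be a good fit; for ad hoc queries and joins, a real database keeps the advantage.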
You could try using a persisted dictionary like RaptorDB for storing, fetching and manipulating the data in an ArrayList. A proof of concept should not take long to build.
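RaptorDB is a .NET library, but the persisted-dictionary idea it implements can be illustrated with Python's standard shelve module: a dictionary whose entries live on disk, so they survive the process and don't all occupy memory at once. The file name and record shape here are invented for the example:

```python
import os
import shelve
import tempfile

# A shelf behaves like a dict, but each entry is persisted to disk.
path = os.path.join(tempfile.mkdtemp(), "records")
with shelve.open(path) as db:
    for i in range(1000):
        db[str(i)] = {"id": i, "value": i * 2}

with shelve.open(path) as db:  # reopen: the data survived the close
    print(len(db), db["500"]["value"])  # 1000 1000
```

A .NET persisted key/value store gives you the same access pattern with types instead of pickled dicts.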
Now I'm going to take the SessionStateProvider class from the sample provided by Microsoft into production.
In addition to clearing used session records from the blob, what else do I need to take into consideration before using it on a live site?
In the sample, SessionStateStoreData is serialized and stored in the blob. Could I instead store it in one of the columns of the table? What are the pros and cons of this approach?
When clearing unnecessary sessions from the table and the blob, what is the best and safest way to do it?
In my opinion, the table-based session state provider is not suitable for use in production on a site. The hint is that Microsoft called it a sample. My main reason is that it doesn't deal with locking session data when there are several requests for the same session in a short period of time.
I have a requirement similar to Stack Overflow's: showing a number of metrics on a page in my asp.net-mvc site that are very expensive to calculate. Stack Overflow shows a lot of metrics on the page (like user accept rate, etc.) which are clearly not calculated on the fly per page request, given that that would be too slow.
What is a recommended practice for serving up calculated data really fast without the performance penalty (assuming we can accept that this data may be a little out of date)?
Is this stored in some caching layer, or in some other "results" database table, so that every day a job calculates this data and stores the results where they can be queried directly?
Assuming I am happy to accept the delay of having this data as a snapshot, what is the best solution for this type of problem?
They may well be relying on the Redis data store for such calculations and caching. This post from marcgravell may help.
Yes, the answer is caching; how you do it is (or can be) the complicated part. If you are using NHibernate, adding caching is really easy: it is part of your configuration, and on your queries you just add .Cacheable() and it manages the cache for you. Caching also depends on the type of environment: if you're using a single worker, a web farm or a web garden, you may have to build a caching layer to accommodate your scenario.
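At its core, a caching layer for expensive metrics is just a stored value plus an expiry time. Here is a minimal sketch in Python (the names and the TTL are invented for the example; NHibernate's .Cacheable() and the ASP.NET cache do the same job with more machinery and proper concurrency handling):

```python
import time

_cache = {}  # name -> (expires_at, value)

def cached(name, ttl_seconds, compute):
    """Return the cached value for `name`, recomputing only after the
    TTL expires. Stale-but-fast beats fresh-but-slow for page metrics."""
    now = time.time()
    entry = _cache.get(name)
    if entry and entry[0] > now:
        return entry[1]
    value = compute()  # the expensive calculation
    _cache[name] = (now + ttl_seconds, value)
    return value

calls = 0
def accept_rate():
    global calls
    calls += 1
    return 0.87  # stand-in for a heavy aggregate query

a = cached("accept_rate", ttl_seconds=300, compute=accept_rate)
b = cached("accept_rate", ttl_seconds=300, compute=accept_rate)
print(a, b, calls)  # 0.87 0.87 1  (second call served from the cache)
```

In a web farm, the dictionary would be replaced by a shared store such as Redis or memcached so all workers see the same cached values.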
Although this is a somewhat recent technique, one really great way to structure your system to make this kind of thing possible is Command and Query Responsibility Segregation, more often referred to as CQRS.