I am creating a graphical tool in Silverlight which reads data from multiple files and a database.
I don't want to call the database again and again. I want to retrieve the data when required and keep it somewhere safe, so that if the same user or any other user visits the same page they can access the data.
I want to use something like ASP.NET's application state / Cache["Object"], but in Silverlight. What is the best methodology?
Since Silverlight runs client side, you need to cache server side.
You could fetch your data with WCF.
Something along these lines:
What I have done in the past is to cache the query result in a WCF service using Enterprise Library:
public class YourWcfService
{
    // Enterprise Library caching block (Microsoft.Practices.EnterpriseLibrary.Caching)
    ICacheManager _cacheManager = null;

    public YourWcfService()
    {
        _cacheManager = EnterpriseLibraryContainer.Current.GetInstance<ICacheManager>("Cache Manager");
    }
}
your web method would look something like:
[OperationContract]
public List<Guid> SomeWebMethod()
{
    List<Guid> result = null;

    if (_cacheManager.Contains("rgal")) // data in cache?
        result = (List<Guid>)_cacheManager.GetData("rgal");

    if (result == null)
    {
        result = FETCH FROM DATABASE HERE;

        // cache for 120 minutes
        _cacheManager.Add("rgal", result, CacheItemPriority.Normal, null, new AbsoluteTime(TimeSpan.FromMinutes(120)));
    }

    return result;
}
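On the Silverlight side you would then call this service asynchronously through the generated proxy. A minimal sketch, assuming a service reference was added that generated a YourWcfServiceClient proxy (the proxy name and the myDataGrid control are assumptions for illustration):

// Silverlight WCF calls are asynchronous; subscribe to the Completed event
// before starting the call.
var client = new YourWcfServiceClient();
client.SomeWebMethodCompleted += (s, e) =>
{
    if (e.Error == null)
    {
        // e.Result holds the data that was served from the server-side cache
        myDataGrid.ItemsSource = e.Result;
    }
};
client.SomeWebMethodAsync();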
Silverlight controls run in the browser, client side, per user, so caching something for all users is not possible from the control itself.
You can cache data in the control for a given user's session, or in isolated storage for a given user. But you can't do anything on the server without writing corresponding server-side code.
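For the per-user client-side option, a minimal sketch of caching to Silverlight isolated storage might look like this (the CachedData class and the file name are assumptions for illustration):

using System;
using System.Collections.Generic;
using System.IO;
using System.IO.IsolatedStorage;
using System.Runtime.Serialization;

// Hypothetical container for the data you want to keep per user.
[DataContract]
public class CachedData
{
    [DataMember]
    public List<Guid> Items { get; set; }
}

public static class IsoStoreCache
{
    private const string FileName = "cachedData.xml"; // assumed file name

    public static void Save(CachedData data)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        using (var stream = store.CreateFile(FileName))
        {
            new DataContractSerializer(typeof(CachedData)).WriteObject(stream, data);
        }
    }

    public static CachedData Load()
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            if (!store.FileExists(FileName))
                return null;

            using (var stream = store.OpenFile(FileName, FileMode.Open))
            {
                return (CachedData)new DataContractSerializer(typeof(CachedData)).ReadObject(stream);
            }
        }
    }
}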
Is the caching really necessary? Are you really pounding your database that badly?
Your DB is your storage. Unless you have a performance issue, this is premature optimization.
The new Enterprise Library Silverlight Integration Pack provides caching capabilities on the client. Two types of data caching are supported: in-memory and isolated storage. You also get flexible configuration of expiration policies (programmatically or via external config) and config-tool support.
Note: it is currently a code preview, but it should be released as final in May.
We have a 90,000-line JSON object that takes 5 seconds to load. We need to load all of this data, since there are a lot of relations that need to be included. Fortunately this data almost never changes, so it will basically always be the same (it does need to be able to drop out of the cache if the data updates).
I'm not that good at server-side caching, so my questions are:
If I cache this server side, I guess all users will be served the cached answer until the cache expires?
Can I cache this response on the server until I somehow tell the API to stop caching it? Holding a 24-hour cache is fine.
Are there any libraries I can achieve this with?
Regards
You could use MemoryCache in the .NET Framework.
MemoryCache lives in the process, so all users can access the same cached data if you want (on the same server).
You set the expiration time when saving to the cache.
It's built into the Framework.
Thanks for the input. My first attempt looks like this, and it seems to work. Please provide feedback.
MemoryCache memoryCache = MemoryCache.Default;
var test = memoryCache.Get("testKey");

if (test != null)
{
    return Ok(test);
}
else
{
    var data = GetData().ToList();

    // cache for 24 hours
    memoryCache.Add("testKey", data, DateTimeOffset.UtcNow.AddDays(1));
    return Ok(data);
}
I have a list of objects containing no sensitive data (i.e. simple settings of a widget on the client side: 5-6 integer values and a Guid, probably multiple widgets at the same time).
At the moment I am using ASP.NET MVC Session, but it doesn't seem to be reliable. From time to time it returns null for certain keys that are stored inside it. This happens when I launch the application while debugging or when the network is slow.
Can I use client-side session storage for storing all settings (i.e. sessionStorage or localStorage)?
I am concerned about IIS recycling, which causes the session to be removed,
or any server-side exception.
Edit
public ActionResult ApplySettingsForDashboard(DashboardCommonSettings settings)
{
    Session[string.Concat("SettingsToApply_", settings.PanelGuid)] = settings;
    return new EmptyResult(); // or whatever result is appropriate
}

public DashboardCommonSettings GetSettingsFromSession(string panelGuid)
{
    var settings = Session[string.Concat("SettingsToApply_", panelGuid)] as DashboardCommonSettings;
    return settings;
}
Comments and suggestions are welcome.
Thanks
Based on the comments, I want to answer it this way: use out-of-proc (persistent) session storage for reliable session storage as much as possible. It will be slower, so it needs to be used cautiously. In-proc session storage is highly unreliable, especially in a production environment. Alternatively, localStorage (client side) can be an option when we don't have sensitive data.
Other answers are also welcome.
I have implemented a REST service using Web API 2. The service is implemented to manage different sessions which are created and joined by the different clients accessing the service.
A session contains information about access to application functionality and information about the participants who have joined the same session.
Each client gets the session information and access list from the server every second for synchronization purposes. When the access changes, the client functionality changes (enable/disable).
I am using the MemoryCache class to store session info in the Web API service as below.
public static class SessionManager
{
    private static object objForLock = new object();

    public static List<Session> SessionCollection
    {
        get
        {
            lock (objForLock)
            {
                MemoryCache memoryCache = MemoryCache.Default;
                return memoryCache.Get("SessionCollection") as List<Session>;
                // return HttpContext.Current.Application["SessionCollection"] as List<Session>;
            }
        }
        set
        {
            lock (objForLock)
            {
                MemoryCache memoryCache = MemoryCache.Default;
                memoryCache.Add("SessionCollection", value, DateTimeOffset.UtcNow.AddHours(5));
                // HttpContext.Current.Application["SessionCollection"] = value;
            }
        }
    }
}
My problem is the inconsistent behavior of the cache.
When clients send the synchronization call, it gives inconsistent results. For some requests clients get proper data, and for some requests clients get null data, alternating after a number of requests.
I attached a debugger and monitored the object for the null result; "memoryCache.Get("SessionCollection")" is also null at that point. After some consecutive requests it becomes proper again. I don't understand why this object is not persistent.
As an alternative I have tried "HttpContext.Current.Application["SessionCollection"]" as well, but the same issue is there.
I have read about app pool recycling; it recycles the whole cache after a particular time. If my cached object is removed by an app pool recycle, how can I get this object again?
Can someone please help me get out of this issue? Thanks in advance.
You should store client-specific information in Session instead of Cache. Cache should be for the whole application (shared).
However, it's not recommended, as Web API is built with REST in mind and RESTful services should be stateless (APIs do not cache state). Stateless applications have many benefits:
Reduced memory usage
Better scalability: your application scales better. Imagine what happens if you store information about millions of clients at the same time.
Better in load-balancing scenarios: every server can handle every client without losing state.
No session expiration problems.
In case you want to store client state, you could do it anyway. Please try the suggestions in the following post: ASP.NET Web API session or something?
In general, caching state locally on the web server is bad (both Session and local MemoryCache). The cache can be lost for many reasons:
App pool recycle.
Load balancing environment
Multiple worker processes in IIS
Regarding your requirements:
Each client gets the session information and access list from the server
every second for synchronization purposes. When the access changes,
the client functionality changes (enable/disable).
I'm not sure whether you want to update the other clients with the new access list immediately when a client sends a synchronization call. If that's the case, SignalR would be a better choice.
Otherwise, you could just store the updated access list somewhere (a shared cache or even the database) and update the other clients whenever they reconnect with another request.
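If you go the SignalR route, a rough sketch could look like the following (SignalR 2.x; the AccessHub class and the accessChanged client method are assumptions for illustration):

using Microsoft.AspNet.SignalR;

// Hypothetical hub; clients connect to it and handle the "accessChanged" message.
public class AccessHub : Hub
{
}

// Somewhere in your Web API code, after the access list changes:
public static class AccessNotifier
{
    public static void PushAccessList(object accessList)
    {
        // Broadcast the new access list to every connected client.
        var context = GlobalHost.ConnectionManager.GetHubContext<AccessHub>();
        context.Clients.All.accessChanged(accessList);
    }
}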
@ScottHanselman wrote about a bug in .NET 4 here. I hope this fix helps you:
The temporary fix:
Create memory cache instance under disabled execution context flow
using (ExecutionContext.SuppressFlow()) {
// Create memory cache instance under disabled execution context flow
return new YourCacheThing.GeneralMemoryCache(…);
}
The Hotfix is http://support.microsoft.com/kb/2828843 and you can request it here: https://support.microsoft.com/contactus/emailcontact.aspx?scid=sw;%5BLN%5D;1422
Just a caution: MemoryCache keeps data in memory on a single server. So if you have multiple web servers (behind a load balancer), that cache will not be available to the other servers. You also use a single cache key, "SessionCollection", so that data will be shared by all clients. If you need to store data in the cache uniquely for each client, you need to return a token (a GUID) to the client and use that token to get/update the data in the cache in subsequent requests.
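A rough sketch of that token approach, using hypothetical helper names and the Session type from your code, could look like this:

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

// Hypothetical helper illustrating one cache entry per client token.
public static class ClientSessionCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Called when a client first connects; the returned token is sent back to the client.
    public static Guid CreateSession(List<Session> sessionInfo)
    {
        var token = Guid.NewGuid();
        Cache.Set(token.ToString(), sessionInfo, DateTimeOffset.UtcNow.AddHours(5));
        return token;
    }

    // Called on subsequent requests that carry the token.
    public static List<Session> GetSession(Guid token)
    {
        return Cache.Get(token.ToString()) as List<Session>;
    }
}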
Try introducing a class-level variable. Your code will then look like the snippet below (some code removed for clarity):
private readonly MemoryCache _memCache = MemoryCache.Default;
....
return _memCache.Get("SessionCollection") as List<Session>;
...
_memCache.Add("SessionCollection", value, DateTimeOffset.UtcNow.AddHours(5));
How can I use server-side caching in a C# WCF REST service?
For example, I generate a lot of data into one object (not from a database) and I do not want to do that for every call a (random) user makes. How can I cache the object?
Verifying question: is it right that an HttpContext cache object is only shared between a specific client and the host?
Is it right that a HttpContext cache object is only between a specific client and the host?
No, it is a shared object, as per MSDN:
There is one instance of the Cache class per application domain. As a
result, the Cache object that is returned by the Cache property is the
Cache object for all requests in the application domain.
Depending on the load, you may also use a database for caching (depending on what you call caching). There are also in-memory databases specifically optimised for distributed caching; see memcached, Redis, and Memcache vs. Redis?
The HttpContext.Cache is local to the Application Domain, and so is shared by all code that runs in that Application Domain. It is certainly fast and flexible enough for most applications.
How you would use it depends, of course, on your needs. You could use a serialized version of the input parameters as the key, for instance, like in this example:
public MyObject GetMyObject(int size, string cultureId, string extra)
{
    // Input validation first
    ...

    // Determine cache key from the input parameters
    string cacheKey = size.ToString() + cultureId + extra;

    // rest of your code here
}
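A sketch of what the rest of the body might do with that key, assuming a hypothetical BuildMyObject helper for the expensive generation (HttpRuntime.Cache lives in System.Web and, in WCF, requires ASP.NET compatibility mode; MemoryCache would work just as well):

// Look for a previously generated object under this key.
MyObject cached = HttpRuntime.Cache[cacheKey] as MyObject;
if (cached != null)
    return cached;

// Not cached yet: generate it once and keep it for 30 minutes (assumed policy).
MyObject generated = BuildMyObject(size, cultureId, extra); // hypothetical expensive generation
HttpRuntime.Cache.Insert(
    cacheKey,
    generated,
    null,                               // no cache dependency
    DateTime.UtcNow.AddMinutes(30),     // absolute expiration
    System.Web.Caching.Cache.NoSlidingExpiration);

return generated;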
Similar but not the same:
How to securely store database connection details
Securely connecting to database within a application
Hi all, I have a C# WinForms application connecting to a database server. The database connection string, including a generic user/pass, is placed in an NHibernate configuration file, which lies in the same directory as the exe file.
Now I have this issue: the user that runs the application should not get to know the username/password of the general database user, because I don't want him to rummage around in the database directly.
Alternatively, I could hardcode the connection string, which is bad because the administrator must be able to change it if the database is moved or if he wants to switch between dev/test/prod environments.
So far I've found three possibilities:
The first referenced question was generally answered by making the file readable only by the user that runs the application.
But that's not enough in my case (the user running the application is a person; the database user/pass is generic and shouldn't even be accessible to that person).
The first answer additionally proposed encrypting the connection data before writing it to the file.
With this approach, the administrator is no longer able to configure the connection string because he cannot encrypt it by hand.
The second referenced question provides an approach for this very scenario, but it seems very complicated.
My questions to you:
This is a very general issue, so isn't there a general "how to do it" approach, something like a design pattern?
Is there any support for this in .NET's configuration infrastructure?
(optional, maybe out of scope) Can I combine that easily with the NHibernate configuration mechanism?
Update:
In response to the first answers: there are several reasons why I would want to connect to the database directly and not use a web service:
(N)Hibernate can only be used with a database, not with web services (am I right?)
We plan to provide offline capability, i.e. if the database or network is down, the user can continue his work. To manage this, I'm thinking of having a local, in-process database, e.g. SQL Server Compact, and using the MS Sync Framework to synchronize it with the server database as soon as it is up again.
Do you have any further ideas taking this into account?
First of all, letting untrusted users connect to a database is generally not a good idea. So many things can go wrong. Put a web service in between.
If you absolutely have to do it, make it so that it doesn't matter even if they get the username and password. Limit their privileges in the database so that they can only execute a few stored procedures that have built-in security checks.
Whatever you do, you can't give the username/password of a privileged user to an untrusted person. It's just asking for trouble. No matter how well you try to hide your credentials within an encrypted string inside a binary file or whatnot, there's always a way to find them out. Of course, whether anyone will actually do it depends on how interesting your data is, but silently hoping that mean people with debuggers will just leave you alone is not a very good security measure.
Actually, the web service approach (mentioned in another answer) means that you move NHibernate and its logic to the web service. The web service then exposes the DB functionality to the application through the web service's methods.
There is practically only one user for the database, the one the web service uses, and if you want the application user to have different DB privileges you abstract that in the web service layer.
In the end, the WinForms application is only aware of the location of the web service, where it requests data through the web service's methods, and you can apply any required security measures between these two endpoints.
For offline capability, it all boils down to finding a secure way to persist your data to local storage and providing a synchronization method via the web service.
I have actually done this, using a web service that communicated with the DB and a WinForms application (.NET Compact Framework) that only talked to the web service; in case of no cellular network coverage it would serialize the changes to the memory card (the data was not important, so in my case obscure/obscene security measures were not taken).
UPDATE with a small example as requested (I do find it strange, though, to be asked for an example of this):
You have set up your domain classes, NHibernate configuration and (for example) your repository stuff in a project of type ASP.NET Web Service Application. For the sake of simplicity I'm only going to have a single web service class Foo (in Foo.asmx.cs) as well as a single Bar domain class.
So you get this (the actual implementation varies):
namespace FWS
{
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [System.ComponentModel.ToolboxItem(false)]
    // To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line.
    // [System.Web.Script.Services.ScriptService]
    public class FooService : WebService
    {
        private readonly ILog errorLogger = LogManager.GetLogger("ErrorRollingLogFileAppender");
        private readonly IDaoFactory daoFactory = new DaoFactory();
        private readonly ISession nhSession = HibernateSessionManager.Instance.GetSession();

        [WebMethod]
        public Bar[] GetFavoriteBars(string someParam, int? onceMore)
        {
            return daoFactory.GetBarDao().GetFavoriteBars(someParam, onceMore); // returns a Bar[]
        }
    }
}
and we abstract the DAO behaviour, or just use the NHibernate session directly, exposed through a web method.
Now, from the WinForms application all you need to do is add a Web Reference, which makes all necessary changes to the configuration and also generates all necessary classes (in this example it will create a Bar class as the web service exposes it).
namespace WinFormK
{
    public class KForm : System.Windows.Forms.Form
    {
        public void Do()
        {
            var service = new FWS.FooService();
            string filePath = "C:\\temp\\FooData.xml";
            Bar[] fetched = service.GetFavoriteBars("yes!", null);

            // let's write this to local storage
            var frosties = new XmlSerializer(typeof(Bar[]));
            TextWriter writer = new StreamWriter(filePath);
            try
            {
                frosties.Serialize(writer, fetched);
            }
            catch (InvalidOperationException)
            {
                // spock, do something
            }
            finally
            {
                writer.Close();
                writer.Dispose();
            }
        }
    }
}
There are certain things you have to take note of:
You essentially lose lazy loading, or at least you lose it in your WinForms application. The XML serializer cannot serialize proxies, so you either turn off lazy fetching on those collections/properties or you use the [XmlIgnore] attribute, which does what it implies during serialization.
You cannot return interfaces in the WebMethod signatures. They have to be concrete classes. So returning IList<Bar> will have to be transformed into List<Bar> or something of the like.
The web service is executed by IIS and is visible from a web browser. By default only local browser requests will be served (but that can be changed), so you can test your data access layer separately from what your WinForms app does.
The receiving end (the WinForms app) has no knowledge of NHibernate whatsoever.
In the example above I've kept the same names for the DAO methods and the web methods; as long as you didn't keep NHibernate-specific parameters in your DAOs (say, an NHibernate.Criterions.Order parameter) you will probably find no problem. In fact you can have as many .asmx classes in your web service as you want, and probably even 'map' them to the corresponding DAOs (like public class FooService : WebService, public class BarService : WebService, public class CheService : WebService, where each corresponds to a DAO).
You will probably have to write some kind of polling method between your endpoints to keep your presented data fresh.
Web service data is verbose, extremely so. It is advisable to zip it or something before sending it over the wire (and maybe encrypt it as well).
The WinForms application only knows a configuration entry: http://server/FWS/FooService.asmx
Web services have Session disabled by default. Remember that before you start using the session for user data.
You will probably have to write some kind of authentication for the web service.
In the example above I am returning a Bar[], with Bar being mapped with NHibernate. More often than not this may not be the case, and you may be required to write an auxiliary class WSBar that adapts the original Bar class to something the web service and the WinForms application can consume. This class is actually just a data carrier (a minimal sketch follows after this list). Again, this depends on how much integration exists with your domain classes and NHibernate, as well as how complicated your classes are: certain data structures cannot be serialized by default.
This model may not suit what you have already done with your application.
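As a loose illustration of the auxiliary carrier class mentioned above, a minimal sketch might be (the Id and Name properties are assumptions; map whatever your real Bar exposes):

using System;

// Hypothetical DTO exposed by the web service instead of the NHibernate-mapped Bar.
public class WSBar
{
    public Guid Id { get; set; }
    public string Name { get; set; }

    // Assumed mapping helper from the domain class to the carrier.
    public static WSBar FromDomain(Bar bar)
    {
        return new WSBar { Id = bar.Id, Name = bar.Name };
    }
}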
I think it's hard to do: it's like not wanting a user of Stack Overflow to know his own password.
A user can always trace his network traffic and see the user/password (you can add encoding, but it still won't be 100% secure, I think).
I think that you should add a web service between your user and your database, with a unique id for each user.
This is why database desktop apps suck. There is no good way to slice it. The best bet would be to use stored procedures or web services: basically, another layer that can be locked down and that controls access to the database.