I plan to store all my config settings in my application's app.config file (using ConfigurationManager.AppSettings). As the user changes settings through the app's UI (clicking checkboxes, choosing radio buttons, etc.), I plan to write those changes out to AppSettings. At the same time, while the program is running, a process that constantly processes data will be reading AppSettings just as constantly. Changes to settings made via the UI need to affect the data processing in real time, which is why that process will be reading AppSettings so frequently.
Is this a good idea with regard to performance? Using AppSettings is supposed to be "the right way" to store and access configuration settings when writing .Net apps, but I worry that this method wasn't intended for a constant load (at least in terms of settings being constantly read).
If anyone has experience with this, I would greatly appreciate the input.
Update: I should probably clarify a few points.
This is not a web application, so connecting a database to the application might be overkill simply for storing configuration settings. This is a Windows Forms application.
According to the MSDN documentation, the ConfigurationManager is for storing not just application-level settings, but user settings as well. (Especially important if, for instance, the application is installed as a partial-trust application.)
Update 2: I accepted lomaxx's answer because Properties does indeed look like a good solution, without having to add any additional layers to my application (such as a database). When using Properties, it already does all the caching that others suggested. This means any changes and subsequent reads are all done in memory, making it extremely fast. Properties only writes the changes to disk when you explicitly tell it to. This means I can make changes to the config settings on-the-fly at run time and then only do a final save out to disk when the program exits.
Just to verify it would actually be able to handle the load I need, I did some testing on my laptop and was able to do 750,000 reads and 7,500 writes per second using Properties. That is so far above and beyond what my application will ever even come close to needing that I feel quite safe in using Properties without impacting performance.
Since you're using a WinForms app, if it's in .NET 2.0 there's actually a user settings system (called Properties) that is designed for this purpose. This article on MSDN has a pretty good introduction to it.
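A minimal sketch of how that looks in code, assuming a user-scoped setting named EnableFeatureX has been defined in the project's Settings designer (the setting name is just an example):

// Read the current value; after the first load this is served from memory.
bool enabled = Properties.Settings.Default.EnableFeatureX;

// Update it when the user toggles the checkbox; the change stays in memory.
Properties.Settings.Default.EnableFeatureX = true;

// Persist everything to disk, e.g. when the program exits.
Properties.Settings.Default.Save();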
If you're still worried about performance, then take a look at SQL Server Compact Edition, which is similar to SQLite but is the Microsoft offering. I've found it plays very nicely with WinForms, and there's even the ability to make it work with LINQ.
Check out SQLite, it seems like a good option for this particular scenario.
Dylan,
Don't use the application config file for this purpose, use a SQL DB (SQLite, MySQL, MSSQL, whatever) because you'll have to worry less about concurrency issues during reads and writes to the config file.
You'll also have better flexibility in the type of data you want to store. The appSettings section is just a key/value list which you may outgrow as time passes and as the app matures. You could use custom config sections but then you're into a new problem area when it comes to the design.
The appSettings section isn't really meant for what you are trying to do.
When your .NET application starts, it reads in the app.config file and caches its contents in memory. For that reason, after you write to the app.config file, you'll have to somehow force the runtime to re-parse the app.config file so it can cache the settings again. This is unnecessary overhead.
The best approach would be to use a database to store your configuration settings.
Barring the use of a database, you could easily set up an external XML configuration file. When your application starts, you could cache its contents in a NameValueCollection or Hashtable object. As you change or add settings, you would do it to that cached copy. When your application shuts down, or at an appropriate time interval, you can write the cache contents back out to file.
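A rough sketch of that approach, assuming a hypothetical settings.xml made up of simple <setting key="..." value="..."/> elements:

using System.Collections.Specialized;
using System.Xml.Linq;

public class CachedSettings
{
    private readonly string _path;
    private readonly NameValueCollection _cache = new NameValueCollection();

    public CachedSettings(string path)
    {
        _path = path;
        // Load everything into memory once at startup.
        foreach (var element in XDocument.Load(path).Root.Elements("setting"))
            _cache[(string)element.Attribute("key")] = (string)element.Attribute("value");
    }

    // All reads and writes hit the in-memory copy only.
    public string this[string key]
    {
        get { return _cache[key]; }
        set { _cache[key] = value; }
    }

    // Call on shutdown (or at an interval) to flush the cache back to disk.
    public void Save()
    {
        var doc = new XDocument(new XElement("settings"));
        foreach (string key in _cache)
            doc.Root.Add(new XElement("setting",
                new XAttribute("key", key),
                new XAttribute("value", _cache[key])));
        doc.Save(_path);
    }
}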
Someone correct me if I'm wrong, but I don't think that AppSettings is typically meant to be used for this type of configuration setting. Normally you would only put in settings that remain fairly static (database connection strings, file paths, etc.). If you want to store customizable user settings, it would be better to create a separate preferences file, or ideally store those settings in a database.
I would not use config files for storing user data. Use a db.
Could I ask why you're not saving the user's settings in a database?
Generally, I save application settings that are changed very infrequently in the appSettings section (the default email address error logs are sent to, the number of minutes after which you are automatically logged out, etc.) The scope of this really is at the application, not at the user, and is generally used for deployment settings.
One thing I would look at doing is caching the appSettings on a read, then flushing the settings from the cache on a write, which should minimize the amount of actual load the server has to deal with when processing the appSettings.
Also, if possible, look at breaking the appSettings up into configSections so you can read, write, and cache related settings together.
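A rough sketch of that read-cache/flush-on-write idea (the class and method names are just illustrative):

using System.Collections.Generic;
using System.Configuration;

public static class SettingsCache
{
    private static readonly Dictionary<string, string> Cache = new Dictionary<string, string>();

    public static string Read(string key)
    {
        string value;
        if (!Cache.TryGetValue(key, out value))
        {
            // First read hits the config file; subsequent reads come from memory.
            value = ConfigurationManager.AppSettings[key];
            Cache[key] = value;
        }
        return value;
    }

    public static void Write(string key, string value)
    {
        var config = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
        config.AppSettings.Settings.Remove(key);
        config.AppSettings.Settings.Add(key, value);
        config.Save(ConfigurationSaveMode.Modified);
        ConfigurationManager.RefreshSection("appSettings");

        // Flush the cached value so the next read picks up the new one.
        Cache.Remove(key);
    }
}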
Having said all that, I would seriously consider looking at storing these values in a database as you seem to actually be storing user preferences, and not application settings.
Is it possible to have Redis servers running on two machines, where each server specifies in its config file the same snapshot dump file name and directory, with the directory and file being shared between both machines?
RavenDB seems to work fine with that: I can set up the whole server file directory in a Dropbox folder on my machine and do the same on the other machine, with the two Dropbox folders syncing while the RavenDB servers read and write data from/to the database stored within that folder.
I understand the two DBs' concepts are very different; I just use the RavenDB experience as an example to explain what I am trying to accomplish. Please note this is just for development purposes, not to run in production.
I am running Redis version 2.4.5 as a Windows service and use BookSleeve as the client within C#/.NET 4.5.
Thanks
Most certainly not. This would be a sure way to ensure a corrupt file.
You might want to watch progress on Redis Cluster (http://redis.io/topics/cluster-spec), currently at the specification stage.
The only time the dump file is used on a system which does not have persistence enabled is at boot time. However, if persistence is disabled it doesn't read from the dump file.
Even without server-specific data in the dump file, the possibility of corruption arises at any and every point when both services write to the file. You could set the persistence settings to only save if there have been, say, 59 million changes in 60 seconds. This would allow you to read the file on load but effectively never save to it. You would then need to use
config set save ""
to disable saving in both instances, but still be able to save when you want by issuing an explicit BGSAVE command.
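For example, from redis-cli (host and port are just placeholders):

redis-cli -h 127.0.0.1 -p 6379 config set save ""
redis-cli -h 127.0.0.1 -p 6379 bgsave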
I also have to advise against doing this over a shared file system, which is what you would need in order to have multiple machines access the same file. In your case you are talking about Dropbox as your shared file system, but this is likely to kill performance if you are persisting to disk.
But ultimately, I'd have to ask why you think you need this?
If you are using one instance for reads only, then use a slave or two and do reads on the slaves. This way you don't have to worry about multiple instances corrupting a persistence file. This avoids the need for shared storage, as you have two nodes running, each with a copy of the data. It provides redundancy, and you can relatively easily work out a master/slave failover setup.
Ultimately, if you are just using it to develop something against, I don't see the need for such a setup. Just store configuration where you can download it (Dropbox, GitHub, etc.) and develop away. It isn't difficult, and is certainly less complicated, to simply copy your dump file to Dropbox or anywhere else you need it than to do what you describe.
Problem:
I have multiple instances of the same C# application running on different PCs (OS: Windows XP, Windows 7) in the same LAN. I have to share some configuration data among them. Each process must have read-write access to the data. My employer insists on storing these shared data in a file, which is in a shared directory on one of these PCs.
Possible solutions:
Exclusive file opening: The data is stored in a TXT file (serialization to and from a binary file is also an option). Each process uses File.Open with FileShare.None when trying to open the file. Getting an IOException means that the file is already in use, so the process has to wait and try again later (a rough sketch of this appears below).
SQL Server CE embedded DB: The data is stored in an SDF file. The engine can handle at most 256 simultaneous connections (v3.5 SP2), which is more than enough.
SQLite embedded DB: The data is stored in an SQLite DB file. The documentation says SQLite works, but may be unreliable when used on a network share.
Other?
What is the preferred way to do this?
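For reference, the retry loop in option 1 might look roughly like this (the mode and delay are just placeholders):

using System.IO;
using System.Threading;

static FileStream OpenExclusive(string path)
{
    while (true)
    {
        try
        {
            // FileShare.None means no other process can open the file while we hold it.
            return File.Open(path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None);
        }
        catch (IOException)
        {
            // Another process has the file open; wait and try again.
            Thread.Sleep(200);
        }
    }
}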
Don't know if it's the best way, but I did this in C ages ago and it worked well for me.
Each process reads the file, creates a personal copy, and then works on that.
At a fixed moment (upon process termination, triggered via some UI, or whenever you feel like it), each process sends its copy of the file to a master process in charge of rebuilding the original file in the shared directory and signaling the other processes that they need to reload.
Each process then reloads the file (containing info coming from all the other processes).
Of course, this solution requires that the file-writing process knows how to rebuild the file and how to resolve conflicts (but that depends on the data format).
You don't really describe the type of data you're working with so I'd say the answer varies.
Using a proper DBMS for this would be best if the data you are working with could generally be considered record/field oriented (and under rare circumstances even if it isn't). In this case I would recommend MSSQL CE, since its runtime will mitigate multi-user issues for you.
SQLite was generally considered a single-user/application database (at least back when I used it in C), though things could have changed in the last 5 years. If you're using .NET 4, then from what I've found there are few free adapters available for use unless you're comfortable with a mixed-framework application.
I would only monitor the file locking manually if you're in a situation where the data is pretty flat by design (like a log file), though if it were log-like data I would probably look into how some of the open-source logging libraries do it. You basically said you have control over the data structure, so I'd suggest redesigning the data to be more normalized/rigid to avoid needing this solution.
Create a web service and make your programs pull the configuration from there. You can control file locking from inside the web service and not have to deal with that at the program level. This also affords you the abstraction that if you decide to change how the settings are stored (e.g. move them from a file to a database) you can do this without having to make any changes to your program.
Sometimes I need to set some string values from code, for example:
Page.Title = "This is a test page.";
or
lblSupportInfo.Text = "Please contact xxx-xxx-xxxx for support.";
These are just examples of data that can change at any time. Is it better stored in the settings file in application scope? What are the other options?
It would be better to be able to change this by updating a configuration file rather than doing a code release (as with Resources).
How do other people handle this? If there are too many of these entries, the web.config can get very long.
I prefer to store such items in a database, as updates are a LOT easier than updating a .config file, or code. Most commercial software that I've worked with does the same.
If a DB isn't an option, then the .config files or some other text file will do. Even an XML file would work as a viable option. With a separate XML or text file, you also avoid the hassle of losing your users' session state, which happens if you update the .config file and are using the standard in-proc session management.
In one of our applications, we use an XML document that was created by filling a Dataset from a database, and then using the Dataset's WriteXml() function to preserve the data as a file that can be deployed with the application. This is specifically intended for a group of people who are on the road and can't always connect to the server to get the most recent data. The data is used to populate a survey form for secret shoppers. Their results are also saved in the same way and serialized on the laptop. When they connect to the corporate network, the results are uploaded via a web service and processed into the results tables in the database.
We use key/value pairs in the database. We load those key/value pairs into the cache and then create a static class for getting the values in code easily and on demand. We have an admin page for clearing them out of the cache when needed, and the code recognizes that they have been cleared and reloads them into the cache on demand. This can be expanded to deal with other languages and be plugged into other models as needed; however, I would imagine you would only want to use it in the presentation layer. You can key your custom exceptions, catch them in the presentation layer, and have that key correspond to the appropriate message. This gives a robust environment with lots of potential for growth. Hope this helps.
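A minimal sketch of that pattern, assuming ASP.NET's HttpRuntime.Cache and a hypothetical LoadAllFromDatabase() data-access helper:

using System.Collections.Generic;
using System.Web;

public static class AppText
{
    private const string CacheKey = "AppTextEntries";

    public static string Get(string key)
    {
        // Pull the dictionary from cache; reload from the database if it has been cleared.
        var entries = HttpRuntime.Cache[CacheKey] as Dictionary<string, string>;
        if (entries == null)
        {
            entries = LoadAllFromDatabase();
            HttpRuntime.Cache.Insert(CacheKey, entries);
        }

        string value;
        return entries.TryGetValue(key, out value) ? value : key;
    }

    // The admin page calls this to force a reload on the next read.
    public static void Clear()
    {
        HttpRuntime.Cache.Remove(CacheKey);
    }

    private static Dictionary<string, string> LoadAllFromDatabase()
    {
        // Placeholder: query the key/value table here.
        return new Dictionary<string, string>();
    }
}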
I did this in ASP.NET 2.0, so things might have changed since then, but the best way to do this is to store the values in the database and then use the ASP.NET resource provider to load them.
So to your application it would look like it's loading from a resource file, but it would actually come from the database, and you get all the nice compile-time benefits and built-in ASP.NET tooling for resources, and you can change your locale if you need to.
This article was the inspiration for the solution: http://msdn.microsoft.com/en-us/library/aa905797.aspx
I am working on a Silverlight client and associated ASP.NET web services (not WCF), and I need to implement some features containing user preferences such as a "favourite items" system and whether they'd like word-wrapping or not. In order to make a pleasant (rather than infuriating) user experience, I want to persist these settings across sessions. A brief investigation suggests that there are two main possibilities.
Silverlight isolated storage
ASP.NET-accessible database
I realise that option 2 is probably the best option as it ensures that even if a user disables isolated storage for Silverlight, their preferences still persist, but I would like to avoid the burden of maintaining a database at this time, and I like the idea that the preferences are available for loading and editing even when server connectivity is unavailable. However, I am open to reasoned arguments why it might be preferable to take this hit now rather than later.
What I am looking for is suggestions on the best way to implement settings persistence, in either scenario. For example, if isolated storage is used, should I use an XML format, or some other file layout for persisting the settings; if the database approach is used, do I have to design a settings table or is there a built-in mechanism in ASP.NET to support this, and how do I serve the preferences to the client?
So:
Which solution is the better solution for user preference persistence? How might settings be persisted in that solution, and how might the client access and update them?
Prior Research
Note that I have conducted a little prior research on the matter and found the following links, which seem to advocate either solution depending on which article you read.
http://www.ddj.com/windows/208300036
http://tinesware.blogspot.com/2008/12/persisting-user-settings-in-silverlight.html
Update
It turns out that Microsoft have provided settings persistence in isolated storage as a built-in part of Silverlight (I somehow missed it until after implementing an alternative). My answer below has more details on this.
I'm keeping the question open as even though Microsoft provides client-side settings persistence, it doesn't necessarily mean this is the best approach for persisting user preferences, and I'd like to canvass more opinions and suggestions on this.
After investigating some more and implementing my own XML file-based settings persistence using IsolatedStorage, I discovered the IsolatedStorageSettings class and the IsolatedStorageSettings.ApplicationSettings object, which is a key/value collection specifically for storing user-specific application settings.
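For reference, usage looks roughly like this (inside, say, a page's code-behind; the setting name is just an example):

using System.IO.IsolatedStorage;

var settings = IsolatedStorageSettings.ApplicationSettings;

// Read a setting, falling back to a default if it hasn't been stored yet.
bool wordWrap = settings.Contains("WordWrap") ? (bool)settings["WordWrap"] : false;

// Update and persist the setting.
settings["WordWrap"] = true;
settings.Save();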
It all seems obvious now. Of course, in the long term, a mechanism for backing up and restoring settings using a server database would be a good enhancement to this client-side settings persistence.
I think in general the default would be to store on the server; only when there are specific compelling reasons to attempt to store on the client should we do so. The more you rely on storing in a medium you can't control, the more risk you take on.
That having been said, and setting myself on the "database" side of the argument, I would ask what the downside of a database is? You mentioned using XML - is your data only semi-structured? If so, why not store XML in a SQL database? Setting up something this simple would not generally be considered a "burden" by most standards. A simple web service could act as a go-between between your Silverlight client and the settings database.
If it is an important feature for you that users have access to their preferences while offline, then it looks like isolated storage is the way to go for you. If it's more important that users be able to save preferences even if they have turned off isolated storage (is this really a problem? I'd be tempted to call YAGNI on this, but I'm not terribly experienced with the Silverlight platform...) then you need to host a database. If both are important, then you're probably looking at some kind of hybrid solution: using isolated storage if available, then falling back to a database.
In other words, I think the needs of your application are more important than some abstract best practice.
I need to tweak some variables (only in a development setting) without having to restart IIS or anything (so I assume Web.Config is the wrong place to put them). Where is the easiest place to put about 500 config settings that have to be read for every request and written to, like I said, while IIS is running?
EDIT: Like I said, this is only for some Q&D development, so I don't care about performance in any way. A database is a bit of overkill (and is probably more work than I want to deal with); I want something fast (like Settings) that I don't have to worry about parsing and that I can read from and write to. If I do XML, where do I write the file so I don't have to spend time messing around with permissions?
In a database?
500 Config Settings to be read for every request? I'd put them in a database so they can be indexed and cached. A separate XML or data file would also most likely be cached in memory by the web server, but still wouldn't provide the performance an indexed database table could. But it depends on how you are accessing the settings.
You can just make your own "config" file. Just don't name it .config. Then you can read it just like a text file and set all your properties. You just have to implement your own file-monitoring class or something similar to know when the file has changed so you can update your values.
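A rough sketch of that idea, assuming a hypothetical settings.txt with one key=value pair per line (the path and file name are placeholders):

using System.Collections.Concurrent;
using System.IO;

public static class DevSettings
{
    private static readonly string FilePath = @"C:\temp\settings.txt";
    private static ConcurrentDictionary<string, string> _values = Load();
    private static readonly FileSystemWatcher Watcher = CreateWatcher();

    public static string Get(string key)
    {
        string value;
        return _values.TryGetValue(key, out value) ? value : null;
    }

    private static FileSystemWatcher CreateWatcher()
    {
        // Reload the cached values whenever the file changes on disk.
        // (A real version would retry if the writer still has the file locked.)
        var watcher = new FileSystemWatcher(Path.GetDirectoryName(FilePath), Path.GetFileName(FilePath));
        watcher.Changed += (s, e) => _values = Load();
        watcher.EnableRaisingEvents = true;
        return watcher;
    }

    private static ConcurrentDictionary<string, string> Load()
    {
        var values = new ConcurrentDictionary<string, string>();
        foreach (var line in File.ReadAllLines(FilePath))
        {
            var parts = line.Split(new[] { '=' }, 2);
            if (parts.Length == 2)
                values[parts[0].Trim()] = parts[1].Trim();
        }
        return values;
    }
}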
With that many configuration options, a database system with some well-thought-out caching is most likely going to be the best idea overall!
You have to be sure to consider the impact of loading/storing them on every request as well, as even with small values that can be a large amount of overhead. So caching is going to be very important.
I know you said you don't want a database, but with 500 settings, it just seems like the best solution.
That said, if you really don't want a database, you could always dump them into an XML file stored locally and just read/write it when needed.