Update claims for a logged-on user? - C#

I have a higher-level question about updating a logged-on user's claims in an ASP.NET 4.8 website.
How should I do it?
1. If I recreate the claims on every post, then using claims at all seems pointless. So I skipped that.
2. I could store a datetime of the latest change to the claims and check that datetime on every post. But that feels wrong; it is like option 1, just reading less data each time.
3. I could use a session-wide singleton "publisher/subscriber" (called N1 here) that records changes to users. The user id gets posted to N1, and the next time that user loads a page it checks whether its id is in N1; if so, it reloads its claims.
Option 3 seems to be the way to go, but it still doesn't feel right. Any other suggestions?

I solved this with a static class holding a thread-safe dictionary that was checked for the userId (i.e. whether that user had changed). If the id was found, the user's claims needed to be reloaded, and the userId was then removed from the thread-safe dictionary.
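A minimal sketch of that approach, assuming the illustrative class and member names below (they are not from the original code):

using System.Collections.Concurrent;

public static class StaleClaims
{
    // Used as a concurrent set: the byte value is ignored.
    private static readonly ConcurrentDictionary<string, byte> _staleUserIds =
        new ConcurrentDictionary<string, byte>();

    // Call this wherever a user's underlying data changes.
    public static void MarkStale(string userId)
    {
        _staleUserIds.TryAdd(userId, 0);
    }

    // Call this early in each request. It returns true at most once per change,
    // removing the id so the claims are rebuilt a single time.
    public static bool TryConsume(string userId)
    {
        byte ignored;
        return _staleUserIds.TryRemove(userId, out ignored);
    }
}

When TryConsume returns true, rebuild the user's ClaimsIdentity from the database and re-issue the authentication cookie.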

Related

I'm looking for a way to store search preferences for a user

I've got a C#/ASP.NET application. When a user searches for data using a few standard dropdowns and text boxes, I run a SQL query to grab all of the user's search preferences and then auto-fill the controls based on what's returned. The user can then search using those presets, or change any of the choices and search again.
The problem is, this requires a call to the DB every time that search page is loaded. I was hoping there might be a way to grab all the preferences once when the user logs in the first time and then store them somehow, to lighten the load on my SQL Server db. Has anyone ever come across this issue and discovered an efficient way to handle it?
What about using the old goodies - Cookies?
HttpCookie aCookie = new HttpCookie("SearchPreferences");
// Values is a read-only NameValueCollection; add each preference as a name/value pair.
aCookie.Values["sortOrder"] = "asc";        // illustrative preference
aCookie.Values["pageSize"] = "50";          // illustrative preference
aCookie.Expires = DateTime.Now.AddDays(30); // whatever expiration suits you
Response.Cookies.Add(aCookie);
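Reading the cookie back on a later request would then look something like this (a sketch using the same illustrative names):

HttpCookie saved = Request.Cookies["SearchPreferences"];
if (saved != null) {
    string sortOrder = saved.Values["sortOrder"];
    // ... apply the stored preferences to the search controls ...
}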
... or sessions you would destroy when a user logs out?
if (Session["SearchPreferences"] != null) {
    /* preferences are already in the session */
} else {
    /* first hit: load the preferences from the database and store them in the session */
}
You can also set their expiration using Session.Timeout.
Edit:
As the discussion below suggests, both of these methods have their pros and cons, so I thought adding a few more should give you the opportunity to choose what suits you best.
System.Web.Caching.Cache seems like the most modern and fastest way of doing this.
Cache c = HttpRuntime.Cache; // Cache has no public constructor; use the runtime's instance
c.Add(key, value, dependencies, absoluteExpiration,
      slidingExpiration, priority, onRemoveCallback);
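For example, caching a user's preferences with a 20-minute sliding expiration might look like this (the key and variable names are illustrative):

HttpRuntime.Cache.Add(
    "SearchPreferences_" + userId,   // per-user key
    preferences,                     // the object loaded from the database
    null,                            // no cache dependencies
    Cache.NoAbsoluteExpiration,
    TimeSpan.FromMinutes(20),        // sliding expiration
    CacheItemPriority.Normal,
    null);                           // no removal callback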
However, there's still an option of ApplicationState, which is an object that can hold your values while the application is running.
If you're looking for client-side persistence, Cookies and Sessions (which still depend on cookies on the client, unless cookieless) are one option (see the previous answer). You can also look into newer client-side persistence options (of which Web Storage has the most browser support). Hth...

Does IIS Metabase return sites in Id ascending order?

I'm not sure if my question on the face of it makes full sense, so let me try and elaborate. At the moment I try and check if a website already exists in IIS by creating a new DirectoryEntry:
DirectoryEntry IISWebsites = new DirectoryEntry(MetaBasePath);
MetaBasePath is defined earlier as:
private const string MetaBasePath = "IIS://Localhost/W3SVC";
I check IISWebsites' children in a foreach loop and just wondered whether this runs through the children in Id order. From what I've read, the Id is actually stored in the DirectoryEntry 'Name' property.
The reason I ask is that if the website name entered by the user in my web setup project isn't found, I want to return the highest Id so I can add 1 to it and create a new website with the name supplied by the user.
Having tested this with my IIS, it does seem to return them in this order, but I need to be sure.
EDIT
I've found the following on Microsoft support (http://support.microsoft.com/kb/240941):
Note that when the metabase is searched for configuration information,
it is enumerated from the bottom, or subkey, to top, or node.
This seems to imply that it does do what I think, but it's not 100% clear if it works on site Id as I'm not sure how this relates to the subkey.
The documentation does not specifically define the order as by site ID so it would not be safe to assume it will always be sorted that way (particularly as your current application eventually gets used with new versions of .NET/Windows/IIS in the future).
Most likely the number of websites is not going to be big enough that enumerating them to find the max would be a bottleneck.
Even so, you can run a search for websites and specify the order using DirectorySearcher.Sort.
Note that in regards to your edit and how configuration information is enumerated, that does not relate to sort order. The one sentence taken out of context is not as clear; read it in the context of the whole paragraph and it is clear that the enumeration behavior is about metabase property inheritance.
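A sketch of finding the highest site Id without relying on enumeration order (this assumes the IIS 6 metabase provider and reuses the path from the question):

using System;
using System.DirectoryServices;

int maxId = 0;
using (DirectoryEntry iisWebsites = new DirectoryEntry("IIS://Localhost/W3SVC"))
{
    foreach (DirectoryEntry child in iisWebsites.Children)
    {
        // Only IIsWebServer children are websites; their Name is the numeric site Id.
        int id;
        if (child.SchemaClassName == "IIsWebServer" && int.TryParse(child.Name, out id))
        {
            maxId = Math.Max(maxId, id);
        }
    }
}
int newSiteId = maxId + 1;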

Generating unique ID for clients on a system running over a LAN in C#

I have a simple client registration system that runs over a network. The system is supposed to generate a unique three-digit ID (the primary key) with the current year concatenated (e.g. 001-2013). However, I've encountered the problem that the same primary key is generated when two users on different computers (over a LAN) try to register different clients at the same time.
Also, what if the user cancels the registration after an ID has already been generated? I have to reuse that ID for another client. I've read about static variables, but that didn't solve my problem. I'd really appreciate your ideas.
Unique and sequential IDs are hard to implement. To achieve this completely you would have to serialize the commit that creates the client record, so the ID is generated only when the data is actually stored; otherwise you'll end up with holes whenever something goes wrong during submission.
If you don't need strictly sequential numbers, giving out ranges of IDs (1-22, 23-44, ...) to each system is a common approach. Instead of ranges you can give out lists of IDs to use ({1, 3, 233, 234}, {235, 236, 237}) if you need to use as many IDs as possible.
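A rough sketch of the range hand-out idea (the type and member names are illustrative):

public sealed class IdRange
{
    private int _next;
    private readonly int _last;

    public IdRange(int first, int last)
    {
        _next = first;
        _last = last;
    }

    // Returns the next id in this machine's reserved block, or null when the
    // block is exhausted and a new range must be requested from the server.
    public int? Take()
    {
        return _next <= _last ? _next++ : (int?)null;
    }
}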
Issue:
New item -001 is created, but not saved yet
New item -002 is created, but not saved yet
Item -001 is cancelled
What to do with ID -001?
The easiest solution is to simply not assign an ID until an item is definitely stored.
An alternative is, when finally saving an item, to look up the first free ID. If the item from step 2 (#2) is saved before the one from step 1 (#1), #2 gets ID -001. When #1 then gets saved, the saving logic sees that its claimed ID (-001) is in use, so it assigns -002. So IDs get reassigned.
Finally, you can simply find the next free ID when creating a new item. In the three steps described above, this means you initially have a gap where -001 is supposed to be. If you now create a new item, your code will see that -001 is unused and assign it to the new item.
But, and this depends entirely on requirements you didn't specify, -001 was then created later in time than -002, and I do not know whether that is allowed. Furthermore, at any given moment you can have a gap in your numbering where an item has been cancelled; if that happens at the end of a reporting period, it will cause errors (-033, -034, -036).
You might also want to use an auto-incrementing primary key instead of making this invoice number (or whatever it is) the key.
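A hedged sketch of the "assign the ID only at save time" option, using a serializable transaction so two machines cannot claim the same number (the table and column names are hypothetical):

using System;
using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlTransaction tx = conn.BeginTransaction(IsolationLevel.Serializable))
    {
        // Lock the current year's rows and compute the next sequence number.
        var cmd = new SqlCommand(
            "SELECT ISNULL(MAX(SeqNo), 0) + 1 FROM Clients WITH (UPDLOCK, HOLDLOCK) " +
            "WHERE RegYear = @year", conn, tx);
        cmd.Parameters.AddWithValue("@year", DateTime.Now.Year);
        int next = (int)cmd.ExecuteScalar();

        string clientId = string.Format("{0:000}-{1}", next, DateTime.Now.Year);
        // ... INSERT the client row with clientId inside the same transaction ...
        tx.Commit();
    }
}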

Current url into cookie file

Short question is : "How do I save current url into cookie file?"
And the whole problem is as follows. I have to save the current filters that the client has applied to my grid in a cookie, so that the next time he logs in the data looks just the way he wants it to. The service doesn't have any "login" stuff; it just provides the available data to the user. This whole thing is written in ASP.NET MVC/C#.
If you have any other solutions to this task I will be happy to discuss them!
Thanks for giving a minute.
http://msdn.microsoft.com/en-us/library/ms178194.aspx
In short, you can read the user's cookie data via Request.Cookies and write it via Response.Cookies; each is a collection of HttpCookie objects keyed by the name you give each cookie (much like the Session or ViewState data stores). You can add to it by specifying a cookie name in the indexer as if it were already there and setting the Value and Expires properties, or by creating a new HttpCookie and calling Response.Cookies.Add.
It may not be necessary to store the whole URL, although a cookie can hold up to 4 KB of data. I would instead store the query string (which has the pertinent filter settings) under a unique name that the specific page will know to get its cookie data from ("<page name here>FilterSettings", perhaps). Then, early in the request, check the Request: if its QueryString is empty but there is a cookie with saved filter settings, tack the saved query string onto the current URL and redirect.
Remember that the client has control over whether to save cookie data; the browser may accept all cookies, accept them only from trusted sources, prompt for each, or refuse all. In this case that's no big deal; it's pure convenience, which is exactly what a cookie should be used for. If this were valuable data, you might have to persist it server-side, keyed by the user.
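A minimal ASP.NET MVC sketch of that idea (the action and cookie names are illustrative):

public ActionResult Grid()
{
    if (Request.QueryString.Count > 0)
    {
        // Remember the filters the user just applied.
        var cookie = new HttpCookie("GridFilterSettings", Request.QueryString.ToString());
        cookie.Expires = DateTime.Now.AddDays(30);
        Response.Cookies.Add(cookie);
    }
    else
    {
        // No filters in the URL: restore the saved ones, if any.
        HttpCookie saved = Request.Cookies["GridFilterSettings"];
        if (saved != null && !string.IsNullOrEmpty(saved.Value))
        {
            return Redirect(Request.Path + "?" + saved.Value);
        }
    }
    return View(/* ... the filtered data ... */);
}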

Website database duplicate records

On every page of my website a token is passed in as a querystring parameter. The server-side code checks whether the token already exists in the database (the token is a uniqueidentifier field there). If the token exists it uses the existing row; if not, it creates a new row with the new token.
The problem is that once in a while I see a duplicate record in the database (two rows with the same uniqueidentifier). I have noticed the record insertion times were about half a second apart. My only guess is that when the site is visited for the first time, the aspx pages aren't fully compiled yet. So it takes a few seconds, the user goes to another page of the site by typing in a different URL, and the two requests are executed almost at the same time.
Is there a way to prevent this duplicate record problem from happening (on the server side or in the database)?
This is the code in questions that's part of every page of the website.
var record = (from x in db.Items
              where x.Token == token
              select x).FirstOrDefault();
if (record == null)
{
    var x = new Item();
    x.Id = Guid.NewGuid();
    x.Token = token;
    db.Items.InsertOnSubmit(x);
    db.SubmitChanges();
}
Yes, create a unique index on your token field.
create unique index tab_token on your_table(token);
This way, the database will make sure you never store two records with the same token value. Keep in mind that your insert might now fail because of the index constraint, so make sure you catch that exception in your code and treat it accordingly.
What is probably happening is that two requests are being served at almost exactly the same time, and a race condition lets both of them see no existing row and insert the same token.
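With the unique index in place, the insert code above could handle the race like this (a sketch; SQL Server reports duplicate keys as error numbers 2627 and 2601):

try
{
    var x = new Item();
    x.Id = Guid.NewGuid();
    x.Token = token;
    db.Items.InsertOnSubmit(x);
    db.SubmitChanges();
}
catch (System.Data.SqlClient.SqlException ex)
{
    if (ex.Number != 2627 && ex.Number != 2601) throw;
    // Another request inserted the same token first; use the existing row instead.
    record = db.Items.First(i => i.Token == token);
}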
