Running WinForms WebBrowser control as NETWORK_SERVICE - C#

I need to use the WinForms WebBrowser control in an ASP.NET application in order to take screenshots of webpages.
I'd much rather run it in a console app and communicate with the app via the ASP.NET application, but it's not my call; I've been told it has to run as part of the website.
It all pretty much works, apart from the fact that every call to Navigate starts fresh with a new session; it's as if the cookies aren't being persisted. As an experiment, I changed my IIS application pool to run as me rather than NETWORK_SERVICE, and it all works fine. So it's something strange about it running as NETWORK_SERVICE.
I'm guessing that the NETWORK_SERVICE account doesn't have the permissions to keep track of the ASP.NET_SessionId cookie and the auth cookie, but both of these are non-persistent cookies, so I don't know why that would be.
Here is some simplified code so you can get the gist of what I'm doing. There is a bunch of logging and storing of images that I cut out, so it's not complete code. Basically, a developer/tester/admin can run this code once (a task kicked off via a webpage) and it will generate bitmaps of every page in the system; they can release a new version of the website and then run it again, and it will tell you the differences (new pages, pages removed, pages changed).
public void Run()
{
    var t = new Thread(RunThread);
    t.SetApartmentState(ApartmentState.STA);
    t.Start();
}

private void RunThread()
{
    try
    {
        using (var browser = new WebBrowser())
        {
            browser.DocumentCompleted += BrowserDocumentCompleted;
            if (!VisitPage(browser, "special page that sets some session flags for testing - makes content predictable"))
                throw new TestRunException("Unable to disable randomness");
            foreach (var page in pages)
            {
                VisitPage(browser, page);
            }
        }
    }
    // An unhandled exception in a background thread in ASP.NET causes the app to be recycled, so catch everything here
    catch (Exception)
    {
        Status = TestStatus.Aborted;
    }
}
private bool VisitPage(WebBrowser browser, string page)
{
    finishedEvent.Reset();
    var timeout = false;
    stopwatch = new Stopwatch();
    browser.Navigate(page);
    stopwatch.Start();
    while (!timeout && !finishedEvent.WaitOne(0))
    {
        Application.DoEvents();
        if (stopwatch.ElapsedMilliseconds > 10000)
            timeout = true;
    }
    Application.DoEvents();
    if (timeout)
    {
        if (resource != null)
            ShopData.Shop.LogPageCrawlTestLine(testRunID, resource, sequence++, (int)stopwatch.ElapsedMilliseconds, null);
    }
    browser.Stop();
    return !timeout;
}
private void BrowserDocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
{
    try
    {
        var browser = (WebBrowser)sender;
        var width = browser.Document.Body.ScrollRectangle.Width;
        var height = browser.Document.Body.ScrollRectangle.Height;
        var timeTaken = (int)stopwatch.ElapsedMilliseconds;
        if ((width == 0) || (height == 0))
        {
            return;
        }
        browser.Width = width;
        browser.Height = height;
        byte[] buffer;
        using (var bitmap = new Bitmap(width, height))
        using (var memoryStream = new MemoryStream())
        {
            browser.DrawToBitmap(bitmap, new Rectangle(0, 0, width, height));
            bitmap.Save(memoryStream, ImageFormat.Bmp);
            buffer = memoryStream.ToArray();
        }
        // stores image
    }
    finally
    {
        finishedEvent.Set();
    }
}
So in summary:
The above code is started by an ASP.NET page and runs in the background.
It works fine if I run it in an IIS application pool set to run as a proper user.
If I run it in an IIS application pool as NETWORK_SERVICE, then every navigate gets a new session, rather than persisting a session for the lifetime of the WebBrowser.
I can get it working fine outside of ASP.NET, but that is not an option for me currently; it has to run inside this ASP.NET application.
Update:
On the advice of my colleague, I'm using impersonation in the background thread where I'm creating the WebBrowser, running as a local user. This works fine and I'm pretty happy with this solution. He's a Stack Overflow user, so I'll let him post the answer and get the credit :)

Using impersonation in the background thread allows the WebBrowser to run as a user with permission to run IE and thus store cookies, instead of running as the application pool user (NETWORK_SERVICE).
You can set up the impersonation in web.config, or programmatically to restrict it to a specific thread.
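For reference, a minimal sketch of the programmatic, per-thread approach on .NET Framework (not the actual code used here; the LogonUser/CloseHandle P/Invokes are the standard advapi32/kernel32 declarations, and the credentials are placeholders):
using System;
using System.Runtime.InteropServices;
using System.Security.Principal;

public static class ThreadImpersonation
{
    [DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    private static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool CloseHandle(IntPtr handle);

    private const int LOGON32_LOGON_INTERACTIVE = 2;
    private const int LOGON32_PROVIDER_DEFAULT = 0;

    // Runs the given action on the current thread as the specified local user.
    public static void RunAs(string user, string domain, string password, Action action)
    {
        IntPtr token;
        if (!LogonUser(user, domain, password, LOGON32_LOGON_INTERACTIVE, LOGON32_PROVIDER_DEFAULT, out token))
            throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
        try
        {
            using (var identity = new WindowsIdentity(token))
            using (identity.Impersonate()) // reverts when disposed
            {
                action(); // e.g. create and drive the WebBrowser here
            }
        }
        finally
        {
            CloseHandle(token);
        }
    }
}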

Not officially supported, due to its usage of WinInet; see http://support.microsoft.com/kb/238425 for the limitations of WinInet in a service.

Do you see the session cookie in the requests coming from the WebBrowser control? I cannot find anything on how this control is supposed to handle cookies; if it ignores them, you would get the behavior you are describing.
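One way to check what the control's cookie store holds is to ask WinInet directly, since the WebBrowser control keeps its cookies there. A minimal diagnostic sketch, assuming the standard wininet.dll InternetGetCookie P/Invoke (the URL is whatever page the control navigated to):
using System;
using System.Runtime.InteropServices;
using System.Text;

static class CookieInspector
{
    [DllImport("wininet.dll", CharSet = CharSet.Auto, SetLastError = true)]
    private static extern bool InternetGetCookie(string url, string name, StringBuilder data, ref int size);

    // Returns the cookie header WinInet would send for this URL, or null if none.
    // Note: InternetGetCookie does not return HttpOnly cookies (such as
    // ASP.NET_SessionId by default); InternetGetCookieEx with the
    // INTERNET_COOKIE_HTTPONLY flag is needed for those.
    public static string GetCookies(string url)
    {
        int size = 4096;
        var data = new StringBuilder(size);
        return InternetGetCookie(url, null, data, ref size) ? data.ToString() : null;
    }
}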

Related

ASP.NET Core download from a Network Share (sometimes works?)

I have a pretty standard ASP.NET Core 2.2 web app.
I'm running into an issue with downloading files that are stored on a network share. We are using a method of impersonation in code (a client requirement) to access the file share. Uploading to the share with the provided credentials works fine, so we know that (a) the impersonation is working and (b) the files ARE at the destination. The issue I am having comes from downloading the file. It's a pretty standard download link that points to an action in a controller, which gets the file information from the database and uses two of the database values (PathToFile and Filename) to locate the file and pull it back to the controller, followed by returning a file:
var fileRecord = // Get the record from the database.
byte[] bytes = null;
if (fileRecord != null)
{
    try
    {
        string fullPath = $"{fileRecord.PathToFile}\\{fileRecord.Filename}";
        await ImpersonationHelper.Impersonate(async () => { bytes = await System.IO.File.ReadAllBytesAsync(fullPath); }, _settings);
    }
    catch (Exception e)
    {
        return NotFound();
    }
}
return File(bytes, System.Net.Mime.MediaTypeNames.Application.Octet, fileRecord.Filename);
For reference:
public static async Task Impersonate(Action actionToExecute, ApplicationSettings settings)
{
    IntPtr tokenHandle = new IntPtr(0);
    SafeAccessTokenHandle safeAccessTokenHandle = null;
    ImpersonateLogin login = new ImpersonateLogin(settings);
    Task<bool> returnValue = Task.Run(() => LogonUser(login.username, login.domain, login.password, 2, 0, out safeAccessTokenHandle));
    if (false == returnValue.Result)
    {
        int ret = Marshal.GetLastWin32Error();
        throw new System.ComponentModel.Win32Exception(ret);
    }
    if (safeAccessTokenHandle != null)
        await returnValue.ContinueWith(antecedent => WindowsIdentity.RunImpersonated(safeAccessTokenHandle, () =>
        {
            actionToExecute();
        }));
}
Locally, it works fine (we skip the impersonation with an app setting) and the file comes back and is set up as a download in the browser.
On the server, however, it both does and doesn't work. It's strange: clicking on the link leads to an error page, but refreshing this error page (i.e. re-requesting that file) over and over again will make it work (usually every 2-4 refreshes will return the file correctly).
Has anyone encountered this, or something like this, who can offer some insight?
As it turns out, it (seemingly?) had something to do with the method(s) being async.
Once I removed the (forced) async call to the impersonation, and all async calls right up to the "Download" action, it all lined up and works 100% of the time now. From what I could find online, it looks like it was a timing issue between the async and sync calls: the impersonation would happen AFTER the file download, so the user wouldn't have permission to actually fetch the file, but in some cases the impersonation would happen first, so the file would come back. Making everything non-async fixed the issues I was having.
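A minimal sketch of the synchronous shape described above, reusing the LogonUser P/Invoke, ImpersonateLogin, and ApplicationSettings types from the question (assumed to be in scope, since they aren't shown in full):
public static void Impersonate(Action actionToExecute, ApplicationSettings settings)
{
    var login = new ImpersonateLogin(settings);
    SafeAccessTokenHandle safeAccessTokenHandle;
    if (!LogonUser(login.username, login.domain, login.password, 2, 0, out safeAccessTokenHandle))
        throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
    using (safeAccessTokenHandle)
    {
        // The action runs on this thread while the token is applied, so the
        // file read can no longer race ahead of the impersonation.
        WindowsIdentity.RunImpersonated(safeAccessTokenHandle, actionToExecute);
    }
}
The controller then calls it synchronously, e.g. ImpersonationHelper.Impersonate(() => { bytes = System.IO.File.ReadAllBytes(fullPath); }, _settings);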

Get Number of Requests Queued in IIS using C#

I need to collect the following two pieces of information from a WebRole running IIS 8 on Azure.
Number of requests queued in IIS
Number of requests currently being processed by the worker process
Since we are on Azure cloud service, I believe it would be better to stick with the default IIS configuration provided by Azure.
Approach 1: Use WorkerProcess Request Collection
public void EnumerateWorkerProcess()
{
    ServerManager manager = new ServerManager();
    foreach (WorkerProcess proc in manager.WorkerProcesses)
    {
        RequestCollection req = proc.GetRequests(1000);
        Debug.WriteLine(req.Count);
    }
}
Cons:
Requires RequestMonitor to be enabled explicitly in IIS.
Approach 2: Use PerformanceCounter class
public void ReadPerformanceCounter()
{
    var root = HostingEnvironment.MapPath("~/App_Data/PerfCount.txt");
    PerformanceCounter counter = new PerformanceCounter("ASP.NET", "Requests Current", true);
    float val = counter.NextValue();
    using (StreamWriter perfWriter = new StreamWriter(root, true))
    {
        perfWriter.WriteLine(val);
    }
}
Cons:
Requires higher privilege than currently running IIS process.
P.S. There is a four-year-old SO post on this, but it was never answered well.
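For what it's worth, the same PerformanceCounter approach can read both numbers being asked about; a minimal sketch, assuming the process has sufficient privilege to read performance data:
using System.Diagnostics;

public static void ReadAspNetRequestCounters(out float queued, out float current)
{
    // Aggregate ASP.NET counters: requests waiting in the queue and
    // requests currently executing.
    using (var queuedCounter = new PerformanceCounter("ASP.NET", "Requests Queued", true))
    using (var currentCounter = new PerformanceCounter("ASP.NET", "Requests Current", true))
    {
        queued = queuedCounter.NextValue();
        current = currentCounter.NextValue();
    }
}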

Stop process if webBrowser control hangs

I am using the WebBrowser control.
This works fine most of the time; however, when navigating to a new page or waiting for a new page to load, it can sometimes hang.
Is there a way to catch this? I.e. if the page fails to navigate or load after a certain amount of time, can I kill the process?
I am using the webBrowser1_DocumentCompleted event to pick up certain behaviours when the page loads/navigates as expected, but I'm not sure how to catch a page that is hanging.
Maybe you should try to implement some kind of timeout logic? There are quite a few samples on the web about this, e.g. this one.
Also, you might be interested in the ProgressChanged event of the WebBrowser control.
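For example, a minimal sketch of one possible timeout approach, assuming a WinForms form with a webBrowser1 field (the 30-second limit is an arbitrary choice):
private System.Windows.Forms.Timer navigationTimer;

private void StartNavigation(string url)
{
    navigationTimer = new System.Windows.Forms.Timer { Interval = 30000 };
    navigationTimer.Tick += (s, e) =>
    {
        navigationTimer.Stop();
        webBrowser1.Stop(); // abort the hung navigation (or kill/restart the process here)
        MessageBox.Show("Navigation timed out.");
    };
    navigationTimer.Start();
    webBrowser1.Navigate(url);
}

private void webBrowser1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
{
    navigationTimer.Stop(); // the page loaded in time
}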
This is because the WebBrowser component is a very basic build of Internet Explorer by default, and it gets stuck on AJAX pages. You can fix this problem by explicitly making it use the latest version of Internet Explorer, using this code:
try
{
    string installkey = @"SOFTWARE\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION";
    string entryLabel = "YourExe.exe";
    string develop = "YourExe.vshost.exe"; // This is for Visual Studio debugging...
    System.OperatingSystem osInfo = System.Environment.OSVersion;
    string version = osInfo.Version.Major.ToString() + '.' + osInfo.Version.Minor.ToString();
    uint editFlag = (uint)((version == "6.2") ? 0x2710 : 0x2328); // 6.2 = Windows 8 and therefore IE10
    Microsoft.Win32.RegistryKey existingSubKey = Microsoft.Win32.Registry.LocalMachine.OpenSubKey(installkey, false); // read-only key
    if (existingSubKey.GetValue(entryLabel) == null)
    {
        existingSubKey = Microsoft.Win32.Registry.LocalMachine.OpenSubKey(installkey, true); // writable key
        existingSubKey.SetValue(entryLabel, unchecked((int)editFlag), Microsoft.Win32.RegistryValueKind.DWord);
    }
    if (existingSubKey.GetValue(develop) == null)
    {
        existingSubKey = Microsoft.Win32.Registry.LocalMachine.OpenSubKey(installkey, true); // writable key
        existingSubKey.SetValue(develop, unchecked((int)editFlag), Microsoft.Win32.RegistryValueKind.DWord);
    }
}
catch
{
    MessageBox.Show("You don't have admin privilege to overwrite system settings");
}
Right-click both your .exe and the vshost.exe and run as administrator to update the registry for this application.

Resolve 'configuration object is read only, because it has been committed by a call to ServerManager.CommitChanges()'?

I've written a custom action for an installer project that does the following:
Checks existing websites to see if any exist with the same name entered by the user.
Creates the website in IIS if it doesn't exist.
Creates an application pool.
Assigns the application pool to the created website.
When it comes to assigning the application pool I get an error:
The configuration object is read only, because it has been committed by a call to ServerManager.CommitChanges(). If write access is required, use ServerManager to get a new reference.
This baffles me, as it seems to suggest that I can't assign the newly created application pool because of the ServerManager.CommitChanges() call. However, everything else works fine using this, which I wouldn't expect if that were the issue.
Here is my code:
I have a ServerManager instance created like so:
private ServerManager mgr = new ServerManager();
In my Install method I do the following:
Site site = CreateWebsite();
if (site != null)
{
    CreateApplicationPool();
    AssignAppPool(site);
}
Check existing websites - done in OnBeforeInstall method
private Site CheckWebsites()
{
    SiteCollection sites = null;
    Site site = null;
    try
    {
        sites = mgr.Sites;
        foreach (Site s in sites)
        {
            if (!string.IsNullOrEmpty(s.Name))
            {
                if (string.Compare(s.Name, targetSite, true) == 0) site = s;
            }
        }
    }
    catch { }
    return site;
}
CreateWebSite method:
private Site CreateWebsite()
{
    Site site = CheckWebsites();
    if (site == null)
    {
        SiteCollection sites = mgr.Sites;
        int port;
        Int32.TryParse(targetPort, out port);
        site = sites.Add(targetSite, targetDirectory, port);
        mgr.CommitChanges();
    }
    else
    {
        // TO DO - if website already exists edit settings
    }
    return site;
}
Create App Pool
// non-relevant code...
ApplicationPool NewPool = mgr.ApplicationPools.Add(ApplicationPool);
NewPool.AutoStart = true;
NewPool.ManagedRuntimeVersion = "4.0";
NewPool.ManagedPipelineMode = ManagedPipelineMode.Classic;
mgr.CommitChanges();
Assign App Pool
private void AssignAppPool(Site site)
{
    site.ApplicationDefaults.ApplicationPoolName = ApplicationPool; // ERRORS HERE
    mgr.CommitChanges();
}
I can't see why a site could be created, an app pool created but then not assigned. Help.
I finally realised that the 'configuration object' referred to in the error was the site. It seems obvious now, but basically I needed to re-get the site in order to assign the app pool to it; I think this allows the previous changes to take place and then picks them up. So I altered my code by removing the need to pass the Site into private void AssignAppPool() and just getting the site again like this:
Site site = mgr.Sites["TestWebApp"];
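Put together, the reworked method might look like this (a sketch; "TestWebApp" and the ApplicationPool field are from the code above):
private void AssignAppPool()
{
    // Re-get the site so we have a fresh, writable reference that reflects
    // the changes already committed by CommitChanges().
    Site site = mgr.Sites["TestWebApp"];
    site.ApplicationDefaults.ApplicationPoolName = ApplicationPool;
    mgr.CommitChanges();
}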

"Access is Denied" when using BreakRoleInheritence method on Drop Off Library in ItemUpdating event receiver

I have an event receiver that uses the ItemUpdating event. I am trying to disinherit the item's permissions from its parent and strip all permissions on the item when it is uploaded to the Drop Off Library. The reason for this is that the document contains sensitive information and should not be visible to anyone once it arrives in the Drop Off Library.
The code below throws the following error when executing the CurrentListItem.BreakRoleInheritance(true) line upon reaching the Drop Off Library: Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    using (SPSite oSiteCollection = new SPSite(properties.ListItem.Web.Url))
    {
        using (SPWeb oWebsite = oSiteCollection.RootWeb)
        {
            SPListItem CurrentListItem = properties.ListItem;
            SPRoleAssignmentCollection SPRoleAssColn = CurrentListItem.RoleAssignments;
            oSiteCollection.AllowUnsafeUpdates = true;
            oWebsite.AllowUnsafeUpdates = true;
            CurrentListItem.BreakRoleInheritance(true);
            oSiteCollection.AllowUnsafeUpdates = true;
            oWebsite.AllowUnsafeUpdates = true;
            for (int i = SPRoleAssColn.Count - 1; i >= 0; i--)
            {
                if (SPRoleAssColn[i].Member.Name != "System Account")
                {
                    SPRoleAssColn.Remove(i);
                }
            }
        }
    }
});
I have also done the following things:
- made sure the identity of the app pool is a site collection admin
- tried to get the list item again using GetItemById (but this throws an error that the item does not exist, since the item has not been published yet - we need to remove permissions before the document is published - and we cannot force a check-in, otherwise the Drop Off Library might process and move the document to its target library)
- tried getting the web and site objects using GUIDs
- tried many different combinations of placements with AllowUnsafeUpdates
- used the user token of a site collection admin to open the web objects
For some reason, the code above works fine when the document reaches the target library (as we are removing all the permissions AGAIN once the document arrives at the destination). This happens because the document is moved from the drop off library to the target library using the System Account.
Any thoughts on how to get around the "Access is Denied" error while utilizing a similar approach as above?
How is this item being sent to the Drop Off Library? Via a workflow, or using OfficialFile.asmx? Can you check whether this event receiver works properly when you manually upload a file? (Most probably you will have to click 'Submit' to force an update, but maybe you can set the fields in such a way that none of your rules would match.) If files get there via a workflow, make sure that the account the SPTimer4 service runs as is also a site administrator on this site collection.
Also be careful: I think there are two things that may not work as you expect in your event receiver:
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    using (var oSiteCollection = new SPSite(properties.ListItem.Web.Url))
    {
        using (var oWebsite = oSiteCollection.RootWeb)
        {
            var currentListItem = properties.ListItem;
            oSiteCollection.AllowUnsafeUpdates = true;
            oWebsite.AllowUnsafeUpdates = true;
            currentListItem.BreakRoleInheritance(true);
            // You should take the RoleAssignments after breaking inheritance, or you will be working on the parent's permissions.
            var spRoleAssColn = currentListItem.RoleAssignments;
            oSiteCollection.AllowUnsafeUpdates = true;
            oWebsite.AllowUnsafeUpdates = true;
            for (int i = spRoleAssColn.Count - 1; i >= 0; i--)
            {
                // I think it won't allow you to remove permissions for a site administrator
                if (spRoleAssColn[i].Member.Name != "System Account" && !oWebsite.EnsureUser(spRoleAssColn[i].Member.LoginName).IsSiteAdmin)
                {
                    spRoleAssColn.Remove(i);
                }
            }
        }
    }
});
