While recently deploying an MVC 4 project to a virtual directory on a shared IIS 6 server, I was able to get the server to allow reading files (a problem stemming partly from extensionless URLs).
I have not, however, been able to write anything to the database or to .txt log files since deployment. Speaking to the hosting service, I've determined that the user is Network User and that all users (including ASP.NET) have read/write privileges. I've also tried moving the log files into a separate Log folder, to no avail.
The error I'm getting is:
[UnauthorizedAccessException: Access to the path 'D:\wwwroot\wwwroot1\isphost\psychtech\psychtech.co.il\Testing\Log\LogHttpRequests.txt' is denied.]
Any and all ideas would be appreciated!
The message is pretty clear: the user of the app pool under which the IIS worker process of your site runs can't access this path to write to the file. The problem is probably the same when saving to your .mdb.
I suggest you contact your host and ask about this; they will probably be able to help.
On a side note, you may want to use a proper database server, as file-based databases (.mdb) aren't meant for the load and concurrent access that can occur in a web environment. It will also be more scalable, more reliable, more secure and more resilient to failure.
After many hours, the hosting company's IT specialist demystified the problem: because I'm deploying a .NET 4 application on an IIS 6 server, the server runs it under a different user, namely NETWORK SERVICE. Once this account was granted write privileges, the problem was solved.
I believe this is what the answer above was hinting at.
A note of caution: this configuration is highly susceptible to SQL injection and requires additional safety measures in the code.
Related
I have a site running Crystal Reports Viewer in IIS 7, but for security reasons we want to run the application pool under ApplicationPoolIdentity. We noticed the application pool keeps crashing when we switched it from Network Service to ApplicationPoolIdentity, and we want to give extra permissions to the ApplicationPoolIdentity on certain folders on the drive to remedy this.
The problem is, we've granted a ton of permissions on different folders, but we can't pinpoint why the application pool keeps crashing or which file it cannot access.
We checked a lot of different log files, but maybe we skipped over some that are more important. Is there anything out there to show us where the problem is, whether it's a certain log I haven't come across or some sort of tracing I can use to find the files I need to grant this identity permissions on?
The pool crashes when trying to load Crystal Reports Viewer and gives no errors, just says it is unreachable.
IIRC, Crystal Reports Viewer is a COM object. COM objects usually need special handling regarding permissions and can even be unreliable, because some of them need a "desktop" to work correctly. I'm not sure whether Crystal Reports Viewer is one of those, but I would recommend asking the vendor what configuration/permissions are needed in your scenario.
Actually you may be able to get this to work by setting the permissions correctly for appPoolIdentity. See the following post on the IIS team blog.
http://blogs.iis.net/webdevelopertips/archive/2009/10/02/tip-98-did-you-know-the-default-application-pool-identity-in-iis-7-5-windows-7-changed-from-networkservice-to-apppoolidentity.aspx
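If you go the permissions route, you can grant the app pool's virtual account rights on specific folders from an elevated command prompt. A sketch, where the pool name (CrystalPool) and folder paths are placeholders for your own values:

```shell
REM Grant read/execute on the app folder, inherited by subfolders and files,
REM to the virtual account of the application pool named "CrystalPool".
icacls "C:\inetpub\wwwroot\reports" /grant "IIS AppPool\CrystalPool:(OI)(CI)RX"

REM Grant modify on a folder the app actually writes to.
icacls "C:\inetpub\wwwroot\reports\output" /grant "IIS AppPool\CrystalPool:(OI)(CI)M"
```

The "IIS AppPool\&lt;poolname&gt;" principal only exists on IIS 7.5+ (or IIS 7 with the appropriate update), which matches the blog post above.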
I've got a webserver where people upload files. What I need to do is take those files and write them to a file share on the Active Directory domain. The problem -- the webserver is not on the domain.
So, what is the best way to do this? I would have thought it would be easy, something along the lines of "create a connection with some credentials and do it". But apparently not. The closest I've found is impersonation with WindowsIdentity.Impersonate, but everything I've read says that's a bad idea in a production environment.
Any ideas? I'm working on a solution that FTPs the files, but that's unsatisfying and only a fallback plan.
I'm using C# and .NET 4.0 in (obviously) a Windows environment.
Edit: I should point out that I can't run servers (or services) that access the outside on that domain. The FTPing is a temporary workaround.
I would have another program, probably a Windows service, pick up the files from the web server's file location and move them to the Active Directory file share. I would probably run this process on the machine they are being copied to. Make the files available in a share on the web server visible only to the process's user and admins.
I think that an FTP solution is better than using a Windows share; however, I would think a web service of some type would be the best option for an inter-domain file exchange. That said, if you've got it working with WindowsIdentity.Impersonate, why not use it? In what context did you read that it was a bad idea?
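For reference, the impersonation approach from a machine that is not on the domain usually uses the LOGON32_LOGON_NEW_CREDENTIALS logon type, which applies the supplied credentials to network access only. A minimal sketch; the account name, domain, password and share path below are placeholders:

```csharp
// Sketch: write to a domain file share from a non-domain web server by
// impersonating a domain account for outbound network access only.
using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Security.Principal;

class ShareWriter
{
    [DllImport("advapi32.dll", SetLastError = true)]
    static extern bool LogonUser(string user, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    const int LOGON32_LOGON_NEW_CREDENTIALS = 9; // creds used for network access only
    const int LOGON32_PROVIDER_WINNT50 = 3;

    public static void CopyToShare(string localFile)
    {
        IntPtr token;
        if (!LogonUser("svc_upload", "DOMAIN", "password",
                LOGON32_LOGON_NEW_CREDENTIALS, LOGON32_PROVIDER_WINNT50, out token))
            throw new InvalidOperationException("LogonUser failed");

        using (WindowsImpersonationContext ctx = WindowsIdentity.Impersonate(token))
        {
            // Network access from here on uses the supplied credentials.
            File.Copy(localFile,
                @"\\fileserver\uploads\" + Path.GetFileName(localFile), true);
        } // impersonation is undone when the context is disposed
    }
}
```

The usual production concern with impersonation is keeping the plaintext credentials safe and making sure the context is always undone; the using block handles the latter.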
Is there any way that you can map this file share as a network drive? If you can, you don't need to manage security, and it will be super easy to access these files as if they were local.
Hi,
I have a FileUpload control, and when I save a picture I get this error:
Access to the path 'D:\Hosting\0000000\html\images\APgt_logo.jpg' is denied.
On my local machine it works well.
What's the problem?
This is the code:
new_row["Product_imag"] = FileUpload1.FileName.ToString();
FileUpload1.SaveAs(Server.MapPath("/images/"+ FileUpload1.FileName.ToString()));
The user that is running the web server worker process (i.e. w3wp.exe) doesn't have sufficient rights on the folder you are trying to write to.
Either you have to set write rights in IIS, or you have to adjust the security in the file system. Or both.
You need to provide more information if you want help with that.
This is a permissions problem on your web server.
When you run the project locally, the local web server is executing using your permissions which has write access to the directory in question.
When running on the server, the user the app pool is executing under does not have permission to write to the directory. This is normal, as it usually shouldn't have it.
You might ask this over at serverfault.com to get some good recommendations on how to do this in a secure manner.
We've got a process currently which causes ASP.NET websites to be redeployed. The code is itself an ASP.NET application. The current method, which has worked for quite a while, is simply to loop over all the files in one folder and copy them over the top of the files in the webroot.
The problem that's arisen is that occasionally files end up being in use and hence can't be copied over. In the past this was intermittent to the point it didn't matter, but on some of our higher-traffic sites it now happens the majority of the time.
I'm wondering if anyone has a workaround or alternative approach to this that I haven't thought of. Currently my ideas are:
Simply retry each file until it works. That's going to cause errors for a short time though, which isn't really good.
Deploy to a new folder and update IIS's webroot to the new folder. I'm not sure how to do this short of running the application as an administrator and running batch files, which is very untidy.
Does anyone know what the best way to do this is, or if it's possible to do #2 without running the publishing application as a user who has admin access (Willing to grant it special privileges, but I'd prefer to stop short of administrator)?
Edit
Clarification of infrastructure... We have 2 IIS 7 webservers in an NLB running their webroots off a shared NAS (To be more clear, they're using the exact same webroot on the NAS). We do a lot of deploys, to the point where any approach we can't automate really won't be viable.
What you need to do is temporarily stop IIS from processing any incoming requests for that app, so you can copy the new files and then start it again. This will lead to a small downtime for your clients, but unless your website is mission-critical, that shouldn't be a big problem.
ASP.NET has a feature that targets exactly this scenario. Basically, it boils down to temporarily creating a file named App_Offline.htm in the root of your webapp. Once the file is there, IIS will take down the worker process for your app and unload any files in use. Once you copy over your files, you can delete App_Offline.htm and IIS will happily start churning again.
Note that while that file is there, IIS will serve its content as a response to any requests to your webapp. So be careful what you put in the file. :-)
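A deployment step built on App_Offline.htm might look like this sketch (the source and webroot paths are placeholders, and the copy is non-recursive for brevity):

```csharp
using System.IO;

class Deployer
{
    // Take the app offline, copy the new files over, then bring it back.
    public static void Deploy(string sourceDir, string webRoot)
    {
        string offline = Path.Combine(webRoot, "App_Offline.htm");

        // IIS serves this content to all requests while the file exists.
        File.WriteAllText(offline, "<html><body>Back in a minute.</body></html>");
        try
        {
            foreach (string file in Directory.GetFiles(sourceDir))
                File.Copy(file, Path.Combine(webRoot, Path.GetFileName(file)), true);
        }
        finally
        {
            File.Delete(offline); // bring the site back even if the copy failed
        }
    }
}
```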
Another solution is IIS programmatic administration.
You can copy your new/updated web app to an alternative directory and then switch the IIS root of your webapp to this alternative directory. Then it doesn't matter if files are locked in the original root. This is a good solution for website availability.
However, it requires some permission tuning...
You can do it via ADSI or WMI for IIS 6, or Microsoft.Web.Administration for IIS 7.
Regarding your option 2: note that WMI doesn't require administrator privileges the way ADSI does. You can configure rights per object. Check your WMI console (mmc).
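On IIS 7 with Microsoft.Web.Administration, switching the webroot to a freshly deployed folder is only a few lines. A sketch, assuming a reference to Microsoft.Web.Administration.dll and that the site and folder names match your setup:

```csharp
using Microsoft.Web.Administration;

class RootSwitcher
{
    // Point the site's root virtual directory at a new physical folder.
    public static void SwitchRoot(string siteName, string newPath)
    {
        using (ServerManager manager = new ServerManager())
        {
            Site site = manager.Sites[siteName];
            site.Applications["/"].VirtualDirectories["/"].PhysicalPath = newPath;
            manager.CommitChanges();
        }
    }
}
```

A common pattern is alternating between two folders (e.g. webroot_a / webroot_b): deploy to the inactive one, then flip the physical path.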
Since you're already load balancing between 2 web servers, you can:
In the load balancer, take web server A offline, so only web server B is in use.
Deploy the updated site to web server A.
(As a bonus, you can do an extra test pass on web server A before it goes into production.)
In the load balancer, take B offline and put A online, so only web server A is in use.
Deploy the updated site to web server B.
(As a bonus, you can do an extra test pass on web server B before it goes into production.)
In the load balancer, put B back online. Now both web servers are upgraded and back in use in production.
You could also try modifying the timestamp of web.config in the root folder before attempting to copy the files. This will unload the application and free the files in use.
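Touching web.config to force the app domain to recycle can be as simple as the following sketch (the webroot path is a placeholder):

```csharp
using System;
using System.IO;

class AppRecycler
{
    // Updating web.config's timestamp makes ASP.NET restart the app domain,
    // which releases the assemblies the worker process had loaded.
    public static void Recycle(string webRoot)
    {
        File.SetLastWriteTimeUtc(Path.Combine(webRoot, "web.config"), DateTime.UtcNow);
    }
}
```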
Unless you're manually opening a handle to a file on your web server, IIS won't keep locks on your files.
Try shutting down other services that might be locking your files. Some examples of common services that do just that:
Windows Search
Google Desktop Search
Windows Backup
any other anti-virus or indexing software
We had the same server (2003) and the same problem. Certain DLLs were being locked, and putting App_Offline.htm in the website root did jack diddly for us.
Solution:
File permissions!
We were using a web service running under the Network Service account or the IIS_WPG account to deploy updates to the web site, so it needed write access to all the files. I already knew this and had set the permissions on the directory a while ago. But for some strange reason, the necessary permissions were not set on this one problem DLL. You should check the permissions not only on the directory, but on the problem file as well.
We gave Network Service and IIS_WPG users read/write access to the entire web root directory and that solved our file in use, file locked, timeout, and access denied issues.
I am scratching my head about this. My scenario is that I need to upload a file from our hosting server to the company server machine (to a folder on C:), a totally different server. I don't know how I should do this. Do any of you have tips or code on how this is done?
Thanks Guys
I would set up an FTP server (like the one in IIS or a third-party server) on the company server. If security is an issue then you'll want to set up SFTP (secure FTP) rather than vanilla FTP, since FTP is not a natively secure transfer protocol. Then create a service on the hosting server to pick up the file(s) as they come in and ship them to the company server using .NET's FTP support. Honestly, it should be pretty straightforward.
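The push from the hosting server can use WebClient's built-in FTP handling. A sketch; the host name, credentials and remote folder are placeholders:

```csharp
using System.Net;

class FtpPush
{
    // Push a file from the hosting server to an FTP server on the company box.
    public static void Upload(string localFile, string remoteName)
    {
        using (WebClient client = new WebClient())
        {
            client.Credentials = new NetworkCredential("ftpuser", "password");
            // WebClient uses FtpWebRequest under the hood for ftp:// URIs.
            client.UploadFile("ftp://company-server/incoming/" + remoteName, localFile);
        }
    }
}
```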
Update: Reading your question, I am under the strong impression that you will NOT have a web site running on the company server. That is, you do not need a file upload control in your web app (or already know how to implement one given that the control is right in the web page toolbox). Your question, as I understand it, is how to get a file from the web server over to the company server.
Update 2: Added a note about security. Note that this is less of a concern if the servers are on the same subdomain and won't be routed outside of the company network and/or if the data is not sensitive. I didn't think of this at first because I am working a project like this now but our data is not, in any way, sensitive.
Darren Johnstone's File Upload control is as good a solution as you will find anywhere. It has the ability to handle large files without impacting the ASP.NET server memory, and can display file upload progress without requiring a Flash or Silverlight dependency.
http://darrenjohnstone.net/2008/07/15/aspnet-file-upload-module-version-2-beta-1/
There isn't enough info to tell what your whole hosting scenario is, but I have a few suggestions that might get you started in the right direction:
Is your external server owned by another company or group so that you can't modify it? If you can modify it, you might consider hosting the process on the same machine, either in-process or as a separate service. If it cannot be modified, you might consider hosting the service on the destination machine; that way it's in the same place the files need to show up.
Do the files need to stay in sync with the process? I.e., do they need to be uploaded, moved and verified as a single operation? If not, then a separate process is probably the best way to go. A separate process will give you some flexibility, but remember it is a separate set of code to manage and work with.
How big are the files being uploaded? Do they vary by upload? Are they plain files or binaries (zips, executables, etc.)? If the files are small you have more options than if they are large. If they are small enough, you can even relay them inline.
Depending on the answers to the above some of these might work for you:
Use MSMQ. This will work for simple messages under about 3 MB without too much hassle. It's ideal for messages that can be worked with directly (such as XML).
Use direct HTTP(S) relaying. On the host machine, open an HTTP(S) connection to the destination machine and transfer the file. Again, this will work better for smaller files (i.e. only a few KB, since it will be done inline).
If you have access to the host machine, deploy a separate process on the machine which builds or collects the files and uses any of the listed methods to send them to the destination machine.
You can use SCP or FTP (in any form: SFTP, etc.) on either the host machine (if you have access) or the target machine to host the incoming files, and use a batch process to move the files. This has a lot of issues to address, such as file size, keeping submissions in sync, and timing. I would consider it a last resort, depending on the situation.
Again depending on message size, you could also use a layer of abstraction such as a DB to act as the intermediate layer between the two machines. This will work as long as the two machines can see the DB (or other storage location) and both act on it. SQL Server Service Broker could be used for this purpose (and most other DB products offer similar products).
You can look at other products like WSO2 ESB or NServiceBus to facilitate messaging between the two apps and do it inline.
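As an illustration of the MSMQ option above, sending a small file as a message body might look like this sketch (the queue path is a placeholder, and it assumes a reference to System.Messaging):

```csharp
using System.IO;
using System.Messaging;

class QueueRelay
{
    // Send a small file (MSMQ caps messages at ~4 MB) to a remote private queue.
    public static void SendFile(string path)
    {
        using (MessageQueue queue =
               new MessageQueue(@"FormatName:DIRECT=OS:companyserver\private$\uploads"))
        using (Message message = new Message())
        {
            message.BodyStream = File.OpenRead(path);
            message.Label = Path.GetFileName(path); // receiver uses this as the file name
            queue.Send(message);
        }
    }
}
```

A service on the company server would then read the queue and write each message body out to the target folder on C:.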
Hopefully that will give you some starting points to look into.