Dear all, I have the following code to open a file on the click of a button:
System.Diagnostics.Process.Start("soffice.exe", filepath);
soffice.exe is the application that opens .odt files, and filepath contains the complete path of the file I want to open.
This works perfectly when I run the code on my local system, but once I host it on the IIS server (5.1) it takes no action (and doesn't even throw an error). My filepath points to a folder inside my project, not outside it. Kindly suggest possible reasons and solutions.
In response to your comment above...
First of all, by "web service account" I don't mean web services. I mean the service account on the web server under which the web application runs. This could perhaps be the account of the user logged in to the website, or the default IIS account, etc. The best way to address this would be to fully qualify the path to soffice.exe when calling it, that way you don't have to worry about the PATH environment variable. (Additionally, you don't have to worry as much about another application being run maliciously or by accident and doing something unexpected with unknown permissions.)
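For example, a minimal sketch assuming LibreOffice's default install location on the server (adjust sofficePath for your environment):

var sofficePath = @"C:\Program Files\LibreOffice\program\soffice.exe";
var psi = new System.Diagnostics.ProcessStartInfo(sofficePath, "\"" + filepath + "\"")
{
    UseShellExecute = false   // don't rely on the service account's PATH or shell associations
};
System.Diagnostics.Process.Start(psi);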
Second, there seems to be a critical design flaw in your approach. Even if you do manage to get the application to launch on the server, it's going to launch on the server. Is the resource manager sitting at the actual web server? If not, then opening the file in the application on the server will do him no good whatsoever. If he is sitting at the server, then he's the only person who can use this.
You don't want to open the file on the server. You want to deliver the file to the client. Then the user (the resource manager in your example) can open the file in soffice.exe on his local machine. If his environment is set up correctly, it should open automatically. (Though the browser will also give him the option to save the file locally and then open it.) Simply linking to the file should suffice. Is there any particular reason why it wouldn't?
If you need to use a form post rather than simply a link in order to deliver the file, you can still stream the file from your server-side code. Here's a previous question discussing how to do that. Basically the process involves clearing the output buffer, setting the headers (content length, content type, suggested file name, etc.), streaming the bytes, and flushing/closing the output buffer.
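For example, a rough sketch of that in a Web Forms code-behind (the MIME type and header handling are illustrative; assumes using System.IO;):

Response.Clear();
Response.ContentType = "application/vnd.oasis.opendocument.text";
Response.AddHeader("Content-Disposition", "attachment; filename=" + Path.GetFileName(filepath));
Response.AddHeader("Content-Length", new FileInfo(filepath).Length.ToString());
Response.TransmitFile(filepath);   // streams the bytes without loading the whole file into memory
Response.Flush();
Response.End();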
Related
In my web-app (WebPages, C#.NET) I have a drag-and-drop file box where users can drag files from Windows Explorer and, once dropped, they are saved to a given location on a shared drive. This part is working fine. The box looks something like this:
The problem is that it also reads files from the same directory, and my users would like to be able to open the files from this interface on double-click. I have written an ajax request with jQuery (the ajax, too, is working fine), but I can't seem to get the files to open on the user's machine no matter what I try.
Most references I look up point me towards System.Diagnostics.Process.Start(@"<directory goes here>"), but this doesn't really do anything. It starts some process on the server side, but nothing opens, either on the server or on the user's machine.
What they'd like to do, for instance, is double click 'Hazcom.xls' and it would use the default associated application to open the file. In this case, of course, Microsoft Excel.
Is this even possible or am I chasing a wild goose here?
Sources I've Tried:
Open file with associated application
http://www.csharp-examples.net/open-file-with-default-application/
How can I open Windows Explorer to a certain directory from within a WPF app?
c# open file with default application and parameters
There have been a few more sources I've tried, as well, but they're all pretty much in the same vein as these.
Additional Info:
The internal Intranet application runs on a server using IIS 8
The solution is desired to be opened on the user's machine and not, say, the server itself.
The path to the files is dynamically changing depending on what they have loaded into the interface.
Though I'm not expecting a viable client-side (jQuery) solution, I'd be happy to look into that if it's the only option available.
I'd also settle for simply opening the file location, instead of the actual file itself, but I've had no luck with this either, for what looks like the same reasons as the original problem.
I recently added a way for my web application (ASP.NET written in C#) to go to a folder which contains a bunch of spreadsheets and import them into SQL server tables. I set the folders and file names using an admin table so it knows how to handle each file and which table they should go to etc. It even keeps track of the file dates and times so it ignores anything that isn't new since the last time it imported them. Very cool but it only works on my development machine, most likely because the path is easily recognized there.
I'd like others to be able to do this, but I can't seem to get the web application to access a pre-arranged path on the user's local machine. Now I'm assuming this is normal (we shouldn't be able to have a web application reach into someone's machine and grab files!), but is there some way to either do it using a known path or by having a user select the local folder? Is it possibly done more easily if I put the files in a folder within the site?
Dana
If I understand your question correctly, the approach is that you want a user to type in a local file path and you process it.
This will not work through a website, and from a security perspective that is a good thing, as you point out. So unless you install some client application on the local machine, it is not possible.
You will need a file-upload dialog and have the user explicitly locate the files for you, click upload, and process them on the server (see the sketch below).
Some other strategies here:
https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications
but it still requires the user to select them manually.
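For illustration, a minimal Web Forms upload handler might look like this; the control name, the App_Data/Imports folder, and the ImportSpreadsheet call are all hypothetical stand-ins for your existing import code:

protected void btnUpload_Click(object sender, EventArgs e)
{
    // fileUpload is an <asp:FileUpload> control; the user picks the file,
    // so the server never reaches into the client's machine.
    if (!fileUpload.HasFile)
        return;

    // Save to a folder the application pool identity can write to.
    var savePath = Server.MapPath("~/App_Data/Imports/" + Path.GetFileName(fileUpload.FileName));
    fileUpload.SaveAs(savePath);

    // Hand off to the existing spreadsheet-import routine (hypothetical).
    ImportSpreadsheet(savePath);
}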
I have a Visual Studio project that gets installed to about half of the PCs in our company.
The application has an error logging routine that appends error messages to a text file that is kept here:
static string _appPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData), Application.CompanyName);
For the most part, this works well and allows me to get to the log files on individual PCs.
It does not work everywhere. Some users have limited accounts, which do not have permissions to write to the Environment.SpecialFolder.CommonApplicationData folder.
I could modify the path to store here:
static string _newPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), Application.CompanyName);
However, then when I need to collect log files, I must dig into every single user account folder on each PC to get to the logs.
What I would like to do is store these messages on our internal network. The problem with this is how to handle multiple write instances.
static string _netPath = Path.Combine(@"\\FileServer1", Application.CompanyName);
I'd like to use Messaging, but I'm not sure how to do that - or if it would be a good solution to this. Management wants me to push this update out this Friday (today is Wednesday).
What would be the easiest, most reliable method of using _netPath to store log files from some 100 PCs in our network?
Note: Please, if you reference MSDN, paste the article. We have an old Microsoft ISA firewall that is not able to pass MSDN websites. (The ISA server times out every time, and it is so frustrating! See ISA Proxy Server can't... for my unanswered question on it.)
Save each PC's log file under a different name in your network folder instead of having all the PCs share the same log file.
You can use their network ID for the name, or anything else you can generate automatically.
Add the PC name to the filename too...
Then you don't have all the instances writing to the same file, and when you get a support call, you can go straight to the file for the PC the user is calling from...
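For example, building on the _netPath from the question, something like this gives each machine its own file:

static string GetLogFileForThisPc()
{
    // Environment.MachineName gives the PC name; append Environment.UserName
    // too if several people share a machine.
    return Path.Combine(_netPath, Environment.MachineName + ".log");
}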
Alternatively, just use a plain WCF service (even self-hosted) to manage the writing to the file; all the other machines just call the service method.
You could use MSMQ via WCF too
What if you used a centralized service to do that? Create a simple logging webservice on an internal server somewhere and send the data to the webservice. That way only the webservice would need access to the log files, as long as everyone can touch the service. Then you can store them all in one folder or do anything you want with them.
It wouldn't have to be a webservice, of course, but that's a pretty quick and simple way, and it's the first that springs to mind. Really any centralized application that your clients can hit should do the trick.
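A minimal sketch of such a logging endpoint as an old-style .asmx web service (the namespace URI, share path, and method signature are all illustrative; assumes using System.IO; and System.Web.Services;):

[WebService(Namespace = "http://example.com/logging/")]
public class LogService : System.Web.Services.WebService
{
    [WebMethod]
    public void Log(string machineName, string message)
    {
        // Only this service needs write access to the share.
        var path = Path.Combine(@"\\FileServer1\Logs", machineName + ".log");
        File.AppendAllText(path, DateTime.Now + "  " + message + Environment.NewLine);
    }
}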
I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.
Should I code the sync logic to be part of a file upload page, protected in the admin section of the site or should I create a Windows Service that constantly looks for files to process in a particular directory that I can drop files in through FTP?
I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code it can take longer than it would if I just had to modify an ASPX.
Are there security benefits one way or another?
Performance benefits?
ASPX page wins the "ease of maintenance" category.
I would create a Windows Service to watch a secure folder and use a directory watcher to look for new files. Since the files are coming from another system, it is asynchronous in nature, and it is much more performant to have a Windows Service running separately to watch for updates as they happen. It can also parse the files and update the database for you.
Depending on who maintains the remote system, the easiest way is to grant permission to the service to access the files on a secure, shared folder. Then you won't need to do anything manually.
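A rough sketch of the watcher inside the service (in a ServiceBase subclass; the drop-folder path and the ProcessFile method are illustrative):

private FileSystemWatcher _watcher;

protected override void OnStart(string[] args)
{
    // Watch the secure drop folder for new comma-delimited files.
    _watcher = new FileSystemWatcher(@"\\server\secure-drop", "*.txt");
    _watcher.Created += (s, e) => ProcessFile(e.FullPath);   // parse the file and update the database (hypothetical)
    _watcher.EnableRaisingEvents = true;
}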
We've got a process currently which causes ASP.NET websites to be redeployed. The code is itself an ASP.NET application. The current method, which has worked for quite a while, is simply to loop over all the files in one folder and copy them over the top of the files in the webroot.
The problem that's arisen is that occasionally files end up being in use and hence can't be copied over. This has in the past been intermittent to the point it didn't matter but on some of our higher traffic sites it happens the majority of the time now.
I'm wondering if anyone has a workaround or alternative approach to this that I haven't thought of. Currently my ideas are:
Simply retry each file until it works. That's going to cause errors for a short time though which isn't really that good.
Deploy to a new folder and update IIS's webroot to the new folder. I'm not sure how to do this short of running the application as an administrator and running batch files, which is very untidy.
Does anyone know what the best way to do this is, or if it's possible to do #2 without running the publishing application as a user who has admin access (Willing to grant it special privileges, but I'd prefer to stop short of administrator)?
Edit
Clarification of infrastructure... We have 2 IIS 7 webservers in an NLB running their webroots off a shared NAS (To be more clear, they're using the exact same webroot on the NAS). We do a lot of deploys, to the point where any approach we can't automate really won't be viable.
What you need to do is temporarily stop IIS from processing any incoming requests for that app, so you can copy the new files and then start it again. This will lead to a small downtime for your clients, but unless your website is mission critical, that shouldn't be that big of a problem.
ASP.NET has a feature that targets exactly this scenario. Basically, it boils down to temporarily creating a file named App_Offline.htm in the root of your webapp. Once the file is there, IIS will take down the worker process for your app and unload any files in use. Once you copy over your files, you can delete the App_Offline.htm file and IIS will happily start churning again.
Note that while that file is there, IIS will serve its content as a response to any requests to your webapp. So be careful what you put in the file. :-)
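A sketch of what the deploy step could look like; webrootPath and sourcePath here stand in for your existing folders:

// Drop App_Offline.htm so IIS unloads the app and releases its files.
var offline = Path.Combine(webrootPath, "App_Offline.htm");
File.WriteAllText(offline, "<html><body>The site is being updated, please try again shortly.</body></html>");
try
{
    foreach (var file in Directory.GetFiles(sourcePath))
    {
        File.Copy(file, Path.Combine(webrootPath, Path.GetFileName(file)), true);
    }
}
finally
{
    File.Delete(offline);   // bring the site back up
}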
Another solution is IIS Programmatic Administration.
You can copy your new/updated web application to an alternative directory, then switch the IIS root of your webapp to that directory. It then doesn't matter if files are locked in the original root. This is a good solution for website availability.
However it requires some permission tuning...
You can do it via ADSI or WMI for IIS 6 or Microsoft.Web.Administration for IIS 7.
Regarding your #2, note that WMI doesn't require administrator privileges the way ADSI does. You can configure rights per object. Check your WMI console (mmc).
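For IIS 7, a rough sketch using Microsoft.Web.Administration might look like this (the site name and target path are illustrative, and the calling account needs rights to the IIS configuration):

using (var serverManager = new Microsoft.Web.Administration.ServerManager())
{
    // Point the root application of the site at the freshly deployed folder.
    var site = serverManager.Sites["MyWebSite"];
    site.Applications["/"].VirtualDirectories["/"].PhysicalPath = @"D:\deploys\release-2";
    serverManager.CommitChanges();
}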
Since you're already load balancing between 2 web servers, you can:
In the load balancer, take web server A offline, so only web server B is in use.
Deploy the updated site to web server A.
(As a bonus, you can do an extra test pass on web server A before it goes into production.)
In the load balancer, take B offline and put A online, so only web server A is in use.
Deploy the updated site to web server B.
(As a bonus, you can do an extra test pass on web server B before it goes into production.)
In the load balancer, put B back online. Now both web servers are upgraded and back in use in production.
You could also try to modify the timestamp of web.config in the root folder before attempting to copy the files. This will unload the application and free used files.
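For example (webrootPath is a placeholder for your site's root folder):

// "Touching" web.config forces ASP.NET to recycle the app and release its file locks.
var webConfig = Path.Combine(webrootPath, "web.config");
File.SetLastWriteTimeUtc(webConfig, DateTime.UtcNow);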
Unless you're manually opening a handle to a file on your web server, IIS won't keep locks on your files.
Try shutting down other services that might be locking your files. Some examples of common services that do just that:
Windows Search
Google Desktop Search
Windows Backup
any other anti-virus or indexing software
We had the same server (2003) and the same problem. Certain DLLs were being locked, and putting App_Offline.htm in the website root did jack diddly for us.
Solution:
File permissions!
We were using a web service which runs under the Network Service account or the IIS_WPG account to deploy updates to the web site. Thus it needed write access to all the files. I already knew this, and had already set the permissions on the directory a while ago. But for some strange reason, the necessary permissions were not set on this one problem dll. You should check the permissions not only on the directory, but on the problem file as well.
We gave Network Service and IIS_WPG users read/write access to the entire web root directory and that solved our file in use, file locked, timeout, and access denied issues.
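If you'd rather script that than use the folder's Security tab, a sketch with System.Security.AccessControl could look like this (the path is illustrative):

var dir = new DirectoryInfo(@"C:\inetpub\wwwroot\MySite");
var security = dir.GetAccessControl();
foreach (var account in new[] { "NETWORK SERVICE", "IIS_WPG" })
{
    // Grant Modify on the folder and everything beneath it.
    security.AddAccessRule(new FileSystemAccessRule(
        account,
        FileSystemRights.Modify,
        InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
        PropagationFlags.None,
        AccessControlType.Allow));
}
dir.SetAccessControl(security);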