Is it possible to edit the raw HTML content of a deployed .aspx page without recompiling or redeploying?
For example, say I have a deployed .NET 4.6 ASPX website. If I give someone FTP access to the .aspx files on the server, can that person edit and update the layout of HTML elements without recompiling?
This might occur when a website is deployed onto a web host and a third-party contractor is asked to help improve the GUI without being given full access to the .sln file (e.g. C#, WCF, classes, etc.).
Yes, this is possible.
.aspx files are not precompiled (unless you specifically say so, in which case you won't see them on the server).
One thing to keep in mind is that every time you change an .aspx file, it will be recompiled the first time it's requested. Also, after a number of recompilations (15 by default, set by the numRecompilesBeforeAppRestart attribute in web.config), the application domain will restart.
These page recompilations and application restarts could impact your site's performance.
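If those restarts become a problem during an editing session, the threshold can be raised in web.config. A minimal sketch, assuming the documented default of 15 recompilations:

```xml
<!-- web.config: raise the number of dynamic recompilations allowed
     before ASP.NET restarts the application domain (default is 15). -->
<configuration>
  <system.web>
    <compilation numRecompilesBeforeAppRestart="50" />
  </system.web>
</configuration>
```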
Apart from that, the usual guidelines apply, and I would advise against doing this:
- The changes won't be synced back to your local repository
- You have little control over, or testing of, the changes
- You have no backups of these changes
It is indeed possible to edit deployed .aspx files; just be aware that the changes will not show up in the solution that was used to deploy them, but will only exist on the server where the edits were made.
Yes, in my experience you can change .aspx, .cshtml, .html, .css, and any image file on the host server if you have access to that server. The solution does not need to be recompiled to change these files, because they are uploaded directly to the host server. So for any change to these files, whether made on the local machine or on the host server, you can simply copy the file from one location to the other over a remote connection to synchronize them manually.
For .cs files, however, only the compiled DLL is uploaded, so the solution does need to be recompiled and redeployed for any change to a class file.
So if the third party makes any changes to hosted files such as .aspx, .css, or image files, these can be copied back to the local machine to overwrite the old versions and then checked in to the repository.
In my web app (WebPages, C#.NET) I have a drag-and-drop file box where users can drag files from Windows Explorer and, once dropped, they are saved to a given location on a shared drive. This part is working fine.
The problem is that the interface also lists files from that same directory, and my users would like to be able to open those files from the interface on double-click. I have written an AJAX request with jQuery (the AJAX, too, is working fine), but I can't seem to get the files to open on the user's machine no matter what I try.
Most references I look up point me towards System.Diagnostics.Process.Start(@"<directory goes here>"), but this doesn't really do anything. It may start a process on the server side, but nothing opens, either on the server or on the user's machine.
What they'd like to do, for instance, is double-click 'Hazcom.xls' and have it open in the default associated application, in this case, of course, Microsoft Excel.
Is this even possible, or am I on a wild goose chase here?
Sources I've Tried:
- Open file with associated application
- http://www.csharp-examples.net/open-file-with-default-application/
- How can I open Windows Explorer to a certain directory from within a WPF app?
- c# open file with default application and parameters
There have been a few more sources I've tried, as well, but they're all pretty much in the same vein as these.
Additional Info:
- The internal intranet application runs on a server using IIS 8.
- The files need to open on the user's machine and not, say, on the server itself.
- The path to the files changes dynamically depending on what they have loaded into the interface.
- Though I'm not expecting a client-side (jQuery) solution to be viable, I'd be happy to look into it if that's the only option available.

I'd also settle for simply opening the file's location instead of the file itself, but I've had no luck with that either, for what looks like the same reasons as the original problem.
I recently added a way for my web application (ASP.NET written in C#) to go to a folder which contains a bunch of spreadsheets and import them into SQL Server tables. I set the folders and file names using an admin table so it knows how to handle each file and which table it should go to, etc. It even keeps track of the file dates and times so it ignores anything that isn't new since the last import. Very cool, but it only works on my development machine, most likely because the path is easily recognized there.
I'd like others to be able to do this, but I can't seem to get the web application to access a pre-arranged path on the user's local machine. Now I'm assuming this is normal (we shouldn't be able to have a web application reach into someone's machine and grab files!), but is there some way to do it, either using a known path or by having the user select the local folder? Would it be easier if I put the files in a folder within the site?
Dana
If I understand your question correctly, you want a user to type in a local file path and then have the server process the files found there.
This will not work through a website, and from a security perspective that is very wise, as you point out. Unless you install some client application on the local machine, it is not possible.
You will need a file-upload dialog and have the user explicitly locate the files for you, click upload, and then process them on the server.
Some other strategies here:
https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications
but it still requires the user to select them manually.
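As a rough illustration of the upload route, here is a minimal Web Forms sketch; the control names, button handler, and save path are assumptions for the example, not part of the original question:

```csharp
// Assumed markup: <asp:FileUpload ID="fileUpload" runat="server" />
// plus a button wired to this click handler.
protected void btnUpload_Click(object sender, EventArgs e)
{
    if (fileUpload.HasFile)
    {
        // Keep only the file name; never trust a client-supplied path.
        string fileName = System.IO.Path.GetFileName(fileUpload.FileName);
        string savePath = Server.MapPath("~/App_Data/Imports/" + fileName);
        fileUpload.SaveAs(savePath);
        // Hand the saved file to the existing parse-and-import code here.
    }
}
```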
I'm writing a multi-threaded console application which downloads PDF files from the web and copies them onto our content server (a Windows server). This is also the location from which the files are served to our website.
I am skeptical about this approach because of concurrency issues: if a user on the website requests a PDF file from the content server at the same time the file is being written to or updated by the console application, there might be an IOException. (The application also updates the PDF files if the original contents change over time.)
Is there a way to control the concurrency issue?
You probably want the operations that create and update the files in the served location to be atomic, so that any other process dealing with those files gets the correct version, not one that is still open for writing.
Instead of actually writing the files to where they will be served, you could write them to a temporary directory and then move them into the directory where they will be served from.
Similarly, when updating them, you should make sure the served files themselves are not changed until writing has finished. You could test this by making your application sleep after it has started writing to the file, for example.
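A minimal sketch of the temp-then-move idea, assuming the temporary file and the served directory sit on the same NTFS volume (so the move is a rename and readers never see a half-written file):

```csharp
using System.IO;

public static void PublishPdf(byte[] content, string servedPath)
{
    // Write to a temporary name next to the target so both
    // live on the same volume and the final move is a rename.
    string tempPath = servedPath + ".tmp";
    File.WriteAllBytes(tempPath, content);

    if (File.Exists(servedPath))
        // Swap the new file in; the old version is kept as a .bak backup.
        File.Replace(tempPath, servedPath, servedPath + ".bak");
    else
        // First publication: a plain move is enough.
        File.Move(tempPath, servedPath);
}
```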
The details depend on which web server software you are using, but the key to this problem is to give each version of the file a different name. The same URL, mind you, but a different name on the underlying file system.
Once a newer version of the file is ready, change the web server's configuration so that the URL points to the new file. In any reasonably functional web server this should be an atomic operation.
If the web server doesn't have built-in support for this, you could serve the files via a custom server-side script.
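As one hypothetical shape for such a script, an ASP.NET handler could keep the URL stable while a small pointer file names the current version (the pointer-file scheme and paths here are illustrative, not from the original answer):

```csharp
// The URL stays constant (e.g. /pdfs/report.pdf routed to this handler)
// while "current.txt" holds the file name of the latest version on disk.
public class PdfHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        string dir = context.Server.MapPath("~/App_Data/pdfs/");
        string current = System.IO.File.ReadAllText(dir + "current.txt").Trim();
        context.Response.ContentType = "application/pdf";
        context.Response.TransmitFile(dir + current);
    }

    public bool IsReusable { get { return true; } }
}
```

The console application would then write the new version under a fresh name first and rewrite current.txt last, so requests only ever see complete files.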
Mark the files hidden until the copy or update is complete.
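A minimal sketch of that idea for the update case, with an assumed path: set the hidden attribute before writing and clear it once the write completes:

```csharp
using System.IO;

string path = @"D:\content\report.pdf";  // assumed existing file

// Hide the file while it is being updated...
File.SetAttributes(path, File.GetAttributes(path) | FileAttributes.Hidden);

// ... rewrite the file's contents here ...

// ...then make it visible again once the update is complete.
File.SetAttributes(path, File.GetAttributes(path) & ~FileAttributes.Hidden);
```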
I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.
Should I code the sync logic to be part of a file upload page, protected in the admin section of the site, or should I create a Windows Service that constantly looks for files to process in a particular directory that I can drop files into through FTP?
I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code it can take longer than it would if I just had to modify an ASPX.
Are there security benefits one way or the other?
Performance benefits?
ASPX page wins the "ease of maintenance" category.
I would create a Windows Service that watches a secure folder, using a directory watcher to look for new files. Since the files are coming from another system, the process is asynchronous in nature, and it is much more performant to have a Windows Service running separately to watch for updates as they happen. It can also parse the files and update the database for you.
Depending on who maintains the remote system, the easiest way is to grant permission to the service to access the files on a secure, shared folder. Then you won't need to do anything manually.
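A minimal sketch of such a service; the drop-folder path, file filter, and ProcessFile routine are illustrative names, not part of the original answer:

```csharp
using System.IO;
using System.ServiceProcess;

public class ImportService : ServiceBase
{
    private FileSystemWatcher _watcher;

    protected override void OnStart(string[] args)
    {
        _watcher = new FileSystemWatcher(@"D:\Imports", "*.txt");
        _watcher.Created += (sender, e) =>
        {
            // Created can fire while FTP is still writing the file, so
            // real code should retry until the file opens cleanly before
            // handing it to the existing parse-and-update logic.
            ProcessFile(e.FullPath);
        };
        _watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        _watcher.Dispose();
    }

    private void ProcessFile(string path)
    {
        // Placeholder for the existing C# parsing and database code.
    }
}
```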
One of the projects I am working on includes a website that is hosted on a cheap shared hosting server. Whenever I upload updated files to the server, they don't necessarily become available immediately. It can take from 15 to 30 minutes before the server actually starts using the new files instead of the old ones, and in some cases I even need to upload the updated files again.

Some more info:
- C# WebForms files (.aspx and .aspx.cs)
- If there was no previous file with that name on the server, then the file always becomes available immediately
- But if I first delete the older file and refresh the page, I immediately get a "file not found" error; if I then upload the newer file, the "file not found" error stops immediately, but I get the older file back again

I understand that the server isn't actually serving the .aspx page but rather the compiled DLL version it has made (right?), so maybe this is a compilation problem on the server somehow?

I'm not sure if this would be better on serverfault.com, but as a programmer, SO is where I usually come.

Any idea why this is happening, and preferably a solution for how to fix this behavior so that when I upload an updated page we can start using it immediately?

Thank you.
Usually, touching your web.config file will recycle the application, which should flush any caches. Just upload a new web.config with a trivial change and see if that helps.
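If you keep a local copy of the site, a one-liner is enough to "touch" the file before re-uploading it; the path here is just an example:

```csharp
// Updating web.config's timestamp (or contents) causes ASP.NET
// to recycle the application on the next request.
System.IO.File.SetLastWriteTimeUtc(
    @"C:\projects\mysite\web.config", System.DateTime.UtcNow);
```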
If you are using a .NET 2.0 Web Site project, you can have problems with the DLLs in the bin folder. Converting to a Web Application project should solve your problem permanently.
http://webproject.scottgu.com/
I have seen this behavior on one of my sites as well.
In my case the issues began just after the hosting provider had moved my site to their new SAN-solution.
It turned out that this new storage solution did not support "file system watchers", and without them IIS never receives any notification that a file has been updated.
The workaround they introduced was to move the applications into new application pools at regular intervals. (This produces the symptoms you are describing, with updates only being applied at regular intervals.)
The only solution I found was to move my sites to a different hosting provider.