sync database to uploaded file using Windows Service or website code - c#

I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.
Should I code the sync logic as part of a file upload page protected in the admin section of the site, or should I create a Windows Service that constantly looks for files to process in a particular directory where I can drop files through FTP?
I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code it can take longer than it would if I just had to modify an ASPX.
Are there security benefits one way or the other?
Performance benefits?
ASPX page wins the "ease of maintenance" category.

I would create a Windows Service that watches a secure folder, using a directory watcher to look for new files. Since the files come from another system, the process is asynchronous by nature, and it is more efficient to have a separate Windows Service pick up updates as they arrive. It can also parse the files and update the database for you.
Depending on who maintains the remote system, the easiest way is to grant permission to the service to access the files on a secure, shared folder. Then you won't need to do anything manually.
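A minimal sketch of that service, assuming a drop-folder path of D:\Drop and treating ProcessFile as a stand-in for the existing parse/compare/update code:

    using System.IO;
    using System.ServiceProcess;

    public class ImportService : ServiceBase
    {
        private FileSystemWatcher _watcher;

        protected override void OnStart(string[] args)
        {
            // Watch the secure drop folder for new comma-delimited exports.
            _watcher = new FileSystemWatcher(@"D:\Drop", "*.txt");
            _watcher.Created += (s, e) => ProcessFile(e.FullPath);
            _watcher.EnableRaisingEvents = true;
        }

        protected override void OnStop()
        {
            _watcher.Dispose();
        }

        private void ProcessFile(string path)
        {
            // Stand-in for the existing C# logic that parses the file,
            // checks the database contents, and applies the updates.
            // Worth wrapping in a retry, since Created can fire while
            // FTP is still writing the file.
        }
    }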

Related

Opening files locally from web application

I recently added a way for my web application (ASP.NET written in C#) to go to a folder containing a bunch of spreadsheets and import them into SQL Server tables. I set the folders and file names in an admin table so the application knows how to handle each file and which table it should go to. It even keeps track of the file dates and times so it ignores anything that isn't new since the last import. Very cool, but it only works on my development machine, most likely because the folder path is local to that machine.
I'd like others to be able to do this, but I can't seem to get the web application to access a pre-arranged path on the user's local machine. Now I'm assuming this is normal (we shouldn't be able to have a web application reach into someone's machine and grab files!), but is there some way to do it using a known path, or by having the user select the local folder? Would it be easier if I put the files in a folder within the site?
Dana
If I understand your question correctly, you want the user to type in a local file path and have the server process the files at that path.
This will not work through a website, and as you point out, that is very wise from a security perspective. Unless you install some client application on the local machine, it is not possible.
You will need a file-upload dialog: have the user explicitly locate the files, click upload, and process them on the server.
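A minimal WebForms sketch of that flow; the control IDs and the ImportSpreadsheet call are hypothetical stand-ins for your existing import logic:

    // Code-behind for a page with <asp:FileUpload ID="Upload" runat="server" />
    // and <asp:Button ID="ImportButton" runat="server" OnClick="ImportButton_Click" />.
    // Requires using System.IO; for Path.
    protected void ImportButton_Click(object sender, EventArgs e)
    {
        if (Upload.HasFile)
        {
            // Save to a server-side folder, then run the existing import.
            string path = Server.MapPath("~/App_Data/" + Path.GetFileName(Upload.FileName));
            Upload.SaveAs(path);
            ImportSpreadsheet(path);   // hypothetical: your existing SQL import routine
        }
    }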
Some other strategies here:
https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications
but it still requires the user to select them manually.

Server-Clients Sharing Files on same network C#

Map Drive
C#
Server and Client are on the same network
I am building a small piece of software for a laboratory. Tests are requested, laboratory technicians perform them and enter the results, and the software creates reports, transforms them to PDF, and saves them on the server in a folder called archive. When doctors log onto the software, they can see the files in a form with a grid in it, and the files can be opened only from that form.
Everything works fine, but I need to restrict access to the folder called archive so that users cannot access it manually. Only my software should be able to access it.
So what I intend to do is this: once the client software logs onto the server software, the server software sends a username and password to the client software that remain hidden from users, because only the client software should be able to create, delete, and change files in the archive folder located on the server.
How can I put a username and password on that folder? Do I need to create an account on the server? Or is there another way to do this?
I would recommend against working with passwords. You don't want to store them, handle changes, and risk them getting out. If something is available to the client program, it's generally available to the users. You can't trust the client or the user.
Why not let the server have access to the folder? Don't give users any permissions on the archive folder but have the client send the files to the server (by adding this option to the server's API). The server can validate the user as legitimate (using your own authentication and authorization, if you have any) and that the files appear OK and then place them inside the Archive folder. That way only the user running the server process has access to the folder and you don't have to mess around with passwords and accounts.
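A sketch of what that server-side entry point might look like; the authentication check and the archive path are assumptions:

    // Runs inside the server process, which is the only account
    // granted permissions on the archive folder.
    // Requires using System; using System.IO;
    public void SaveReport(string user, string token, string fileName, byte[] pdfBytes)
    {
        if (!IsAuthenticated(user, token))              // hypothetical auth check
            throw new UnauthorizedAccessException();

        if (!fileName.EndsWith(".pdf") || pdfBytes.Length == 0)
            throw new ArgumentException("Not a valid report file.");

        // Path.GetFileName strips any directory parts the client sent,
        // so the file cannot escape the archive folder.
        string target = Path.Combine(@"D:\Archive", Path.GetFileName(fileName));
        File.WriteAllBytes(target, pdfBytes);
    }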
Another idea, if you do not wish to change the server API too much: create another folder that users do have access to. The client uploads the files to this folder, and the server either scans the folder periodically or gets notified about the file by the client, checks the file, and moves it to the protected archive folder.
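A sketch of that drop-folder variant, assuming the server sweeps it from a timer and that both folders sit on the same volume (paths are made up):

    // Called periodically, e.g. from a System.Threading.Timer on the server.
    // Requires using System.IO;
    private void SweepDropFolder()
    {
        foreach (string file in Directory.GetFiles(@"D:\Drop", "*.pdf"))
        {
            if (!LooksLikeValidReport(file))    // hypothetical validation
                continue;

            // Move is cheap and atomic when source and target share a volume.
            string target = Path.Combine(@"D:\Archive", Path.GetFileName(file));
            File.Move(file, target);
        }
    }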

Set *.exe file properties after upload to server

I'm trying to find a way to write "meta" information to EXE files that are uploaded to my IIS/ASP.NET web service. Here's a little bit of background:
I need to write one arbitrary string into the properties
It'll be a URL that I write as "metadata", if that matters
Example: https://example.com/someFolder/someOtherFolder
The files are mainly installers originally created by InnoSetup
The web server is running IIS 7.5 with ASP.NET on top of Server 2008 R2 (Standard)
Why am I trying to write this information?
Ultimately the EXEs are made available to users for download. When the application runs, it needs to know the web URL in order to execute properly. Currently we have a plain text box where the user can input the URL, but that has proven to be error prone (despite prompting/error checking/...)
Why can't I just write the metadata in the EXE when it's created?
I could do that, but the EXE could be uploaded to a variety of different servers, each with their own unique URL "metadata". I'm trying to avoid creating a separate build script for each server.
Why not just create a *.zip file with the *.exe and an extra piece of metadata?
I suppose I could do that too -- but then the user would have to actually unzip the download so that the real installer could read the metadata. I had something similar to this before, and most people never unzipped the full download, which posed its own problems.
So is this even possible? I guess as a last resort I could use the uploaded EXE to create a new EXE, but I'm trying to avoid doing that (gets into problems with signed EXEs, etc.)
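For what it's worth, one crude technique sometimes used for this is to append a tagged payload to the end of the uploaded file and have the application search its own image for the tag at run time. Appending bytes after the PE image usually leaves an EXE runnable, but it does invalidate an Authenticode signature and would need testing against InnoSetup's own integrity checks. A hedged sketch; the marker string is made up:

    using System.IO;
    using System.Text;

    static class ExeTagger
    {
        const string Marker = "##URL##";    // hypothetical sentinel

        // Server side, after the upload completes.
        public static void AppendUrl(string exePath, string url)
        {
            byte[] payload = Encoding.ASCII.GetBytes(Marker + url + Marker);
            using (var fs = new FileStream(exePath, FileMode.Append))
                fs.Write(payload, 0, payload.Length);
        }

        // Client side: the installed app reads the tail of its own EXE.
        public static string ReadUrl(string exePath)
        {
            // ASCII keeps a 1:1 byte-to-char mapping, so indices line up.
            string s = Encoding.ASCII.GetString(File.ReadAllBytes(exePath));
            int end = s.LastIndexOf(Marker);
            int start = s.LastIndexOf(Marker, end - 1);
            return s.Substring(start + Marker.Length, end - start - Marker.Length);
        }
    }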

Concurrency Issue on IO operation

I'm writing a multi-threaded console application which downloads PDF files from the web and copies them to our content server location (a Windows server). This is also the location from which the files are served to our website.
I am skeptical about the approach because of concurrency issues: if a user on the website requests a PDF file from the content server at the same time the file is being written or updated by the console application, there might be an IO exception. (The application also updates the PDF files if the original contents change over time.)
Is there a way to control the concurrency issue?
You probably want your operations on creating and updating the files where they are served to be atomic, so that any other processes dealing with those files get the correct version, not the one that is still open for writing.
Instead of actually writing the files to where they will be served, you could write them to a temporary directory and then move them into the directory where they will be served from.
Similarly, when your application updates those PDFs, you should make sure the served copy is not touched until writing has finished. You could test this by making your application sleep after it has started writing to the file, for example.
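A sketch of that write-then-move pattern; the staging folder is an assumption and must sit on the same volume as the serving folder for the move to be atomic:

    using System.IO;

    static void PublishPdf(string stagingDir, string liveDir, string name, byte[] content)
    {
        string temp = Path.Combine(stagingDir, name);
        string live = Path.Combine(liveDir, name);

        // The slow write happens off to the side, invisible to the web server.
        File.WriteAllBytes(temp, content);

        if (File.Exists(live))
            File.Replace(temp, live, null);    // atomic swap for updates
        else
            File.Move(temp, live);             // atomic on the same NTFS volume
    }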
The details depend on which web server software you are using, but the key to this problem is to give each version of the file a different name. The same URL, mind you, but a different name on the underlying file system.
Once a newer version of the file is ready, change the web server's configuration so that the URL points to the new file. In any reasonably functional web server this should be an atomic operation.
If the web server doesn't have built-in support for this, you could serve the files via a custom server-side script.
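As a sketch of that idea, the console application could write each revision under a versioned name and atomically repoint a tiny pointer file that the server-side script reads before streaming the PDF; liveDir, content, and the file names here are all made up:

    // Writer side: each revision gets a fresh name; nothing is overwritten.
    string versioned = "report_" + DateTime.UtcNow.ToString("yyyyMMddHHmmss") + ".pdf";
    File.WriteAllBytes(Path.Combine(liveDir, versioned), content);

    // Swap the pointer file that the serving script consults for
    // the current file name behind the stable URL.
    string pointer = Path.Combine(liveDir, "report.current");
    string tmp = pointer + ".tmp";
    File.WriteAllText(tmp, versioned);
    if (File.Exists(pointer))
        File.Replace(tmp, pointer, null);
    else
        File.Move(tmp, pointer);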
Mark the files hidden until the copy or update is complete.

How can I save multiple files locally in Silverlight?

My problem is I have an LOB application that can possibly save multiple files (the number of files is only known at runtime) based on user inputs. Saving these as a single file and having the user break it apart, or zipping them up into a single file, is unfortunately not an option.
SaveFileDialog seems suited to only save 1 file at a time. Third party controls may be an option but I have yet to find any that serve this purpose. Thanks!
The browser security model guidelines (outside of Silverlight) prohibit web application logic (script or otherwise) from having direct access to the local file system.
Consider what havoc a malicious web site could wreak on your computer if web application script could write arbitrary files to arbitrary locations on the local hard disk!
For this reason, Silverlight isolates your code away from the local file system. Silverlight manages the Open File or Save File dialogs, but your web app code never gets to see the full path of the file names directly for security reasons. The Silverlight dialog only supports working with one filename / path at a time.
Silverlight does offer isolated storage on the local machine in which your web app could write multiple files. However, as noted in comments, isolated storage is isolated in both directions - it keeps the web app isolated from the local file system, and that makes it difficult for the end user to access the contents of the isolated storage outside of the browser. (Difficult enough to make it infeasible for nontechnical users, but not difficult enough to call isolated storage "secure" from malicious snooping).
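For reference, a minimal sketch of writing several files to isolated storage from Silverlight; generatedFiles is a hypothetical collection of name/bytes pairs:

    using System.IO.IsolatedStorage;

    // Write each generated file into the app's isolated store.
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        foreach (var item in generatedFiles)
        {
            using (var stream = store.CreateFile(item.Name))
                stream.Write(item.Bytes, 0, item.Bytes.Length);
        }
    }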
Short of writing your own native executable browser extension (for each browser brand and version you wish to support, or a non-sandboxed JavaScript plugin for some browsers), I don't think there is a way for a web app to push data into multiple local files, convenient to use outside of the browser, in one user action.
Since this is an LOB application in the intranet zone, have you considered asking your users to install the app as OOB (out-of-browser) with elevated trust? This would allow you to write files to the user's Documents folder without the SaveFileDialog.
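A sketch of what that looks like once the app is installed out-of-browser with elevated trust; generatedFiles is again a hypothetical collection of name/bytes pairs:

    using System.IO;
    using System.Windows;

    // Only valid in an elevated-trust out-of-browser Silverlight app.
    if (Application.Current.HasElevatedPermissions)
    {
        string docs = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
        foreach (var item in generatedFiles)
        {
            using (var fs = File.Create(Path.Combine(docs, item.Name)))
                fs.Write(item.Bytes, 0, item.Bytes.Length);
        }
    }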
The other option is to zip the files with a single SaveFileDialog call.
There is no other Silverlight-oriented solution.
