My problem is that I have an LOB application that may need to save multiple files (the number of files is only known at runtime) based on user input. Saving everything as a single file and having the user break it apart, or zipping the files up into a single archive, is unfortunately not an option.
SaveFileDialog seems suited to saving only one file at a time. Third-party controls may be an option, but I have yet to find any that serve this purpose. Thanks!
The browser security model guidelines (outside of Silverlight) prohibit web application logic (script or otherwise) from having direct access to the local file system.
Consider what havoc a malicious web site could wreak on your computer if web application script could write arbitrary files to arbitrary locations on the local hard disk!
For this reason, Silverlight isolates your code away from the local file system. Silverlight manages the Open File or Save File dialogs, but your web app code never gets to see the full path of the file names directly for security reasons. The Silverlight dialog only supports working with one filename / path at a time.
Silverlight does offer isolated storage on the local machine in which your web app could write multiple files. However, as noted in comments, isolated storage is isolated in both directions - it keeps the web app isolated from the local file system, and that makes it difficult for the end user to access the contents of the isolated storage outside of the browser. (Difficult enough to make it infeasible for nontechnical users, but not difficult enough to call isolated storage "secure" from malicious snooping).
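If isolated storage is acceptable despite that limitation, writing several files into it in one go is straightforward. A minimal sketch (the file names and contents stand in for whatever your app produces):

```csharp
using System.Collections.Generic;
using System.IO.IsolatedStorage;

public static class IsoStoreSaver
{
    // Writes each in-memory file into the app's isolated storage area.
    // The user cannot easily reach these files outside the browser.
    public static void SaveAll(IDictionary<string, byte[]> files)
    {
        using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            foreach (KeyValuePair<string, byte[]> file in files)
            {
                using (IsolatedStorageFileStream stream = store.CreateFile(file.Key))
                {
                    stream.Write(file.Value, 0, file.Value.Length);
                }
            }
        }
    }
}
```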
Short of writing your own native executable browser extension (for each browser brand and version you wish to support), or a non-sandboxed JavaScript plugin for some browsers, I don't think there is a way for a web app to push data into multiple local files that are convenient to use outside of the browser in one user action.
Since this is an LOB app in the intranet zone, have you considered asking your users to install the app as OOB with elevated trust? This would allow you to write files to the user's Documents folder without the SaveFileDialog.
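A minimal sketch of that approach, assuming the app is running out-of-browser with elevated trust (Silverlight 4 or later); the file names are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Windows;

public static class OobFileWriter
{
    public static void SaveAll(IDictionary<string, byte[]> files)
    {
        // Elevated-trust OOB apps may write to the user's Documents folder directly.
        if (!Application.Current.HasElevatedPermissions)
            return; // fall back to SaveFileDialog or isolated storage

        string docs = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
        foreach (KeyValuePair<string, byte[]> file in files)
        {
            string path = Path.Combine(docs, file.Key);
            using (var stream = new FileStream(path, FileMode.Create))
            {
                stream.Write(file.Value, 0, file.Value.Length);
            }
        }
    }
}
```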
The other option is to zip the files with a single SaveFileDialog call.
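A sketch of that option, assuming a zip library with a Silverlight build such as the third-party SharpZipLib (the library choice is an assumption, not something from the question):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Windows.Controls;
using ICSharpCode.SharpZipLib.Zip; // assumed third-party zip library

public static class ZipSaver
{
    // All generated files go into one archive saved via a single SaveFileDialog.
    public static void SaveAsZip(IDictionary<string, byte[]> files)
    {
        var dialog = new SaveFileDialog { Filter = "Zip archive|*.zip" };
        if (dialog.ShowDialog() != true)
            return;

        using (Stream target = dialog.OpenFile())
        using (var zip = new ZipOutputStream(target))
        {
            foreach (KeyValuePair<string, byte[]> file in files)
            {
                zip.PutNextEntry(new ZipEntry(file.Key));
                zip.Write(file.Value, 0, file.Value.Length);
                zip.CloseEntry();
            }
            zip.Finish();
        }
    }
}
```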
There is no other Silverlight-oriented solution.
Related
I recently added a way for my web application (ASP.NET written in C#) to go to a folder which contains a bunch of spreadsheets and import them into SQL server tables. I set the folders and file names using an admin table so it knows how to handle each file and which table they should go to etc. It even keeps track of the file dates and times so it ignores anything that isn't new since the last time it imported them. Very cool but it only works on my development machine, most likely because the path is easily recognized there.
I'd like others to be able to do this, but I can't seem to get the web application to access a pre-arranged path on the user's local machine. Now I'm assuming this is normal (we shouldn't be able to have a web application reach into someone's machine and grab files!), but is there some way to either do it using a known path or by having the user select the local folder? Could it be done more easily if I put the files in a folder within the site?
Dana
If I understand your question correctly, the approach is that you want a user to type in a local file path and have your web application process the files found there.
This will not work through a website, and from a security perspective that is very wise, as you point out. So unless you install some client application on the local machine, it is not possible.
You will need a file-upload dialog and have the user explicitly locate the files for you, click upload, and then process them on the server.
Some other strategies here:
https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications
but it still requires the user to select them manually.
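For the upload route, a minimal ASP.NET (Web Forms) sketch; the page, the control names, and the ImportSpreadsheet helper are hypothetical stand-ins for your existing import code:

```csharp
using System;
using System.IO;
using System.Web.UI;

// Code-behind for a hypothetical admin page containing an
// <asp:FileUpload ID="fileUpload"> control and an upload button.
public partial class AdminImport : Page
{
    protected void btnUpload_Click(object sender, EventArgs e)
    {
        if (!fileUpload.HasFile)
            return;

        // Copy the uploaded spreadsheet to a server-side folder, then import it.
        string target = Path.Combine(Server.MapPath("~/App_Data/Imports"),
                                     Path.GetFileName(fileUpload.FileName));
        fileUpload.SaveAs(target);

        ImportSpreadsheet(target);
    }

    private void ImportSpreadsheet(string path)
    {
        // existing parsing / SQL import logic goes here
    }
}
```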
1) I am developing a desktop application that connects to an Access database to store some information. This Access database is on a server that I can reach over FTP. The server is also capable of establishing a connection to the Access database. Right now, my application downloads the database file into a folder on the computer, edits it, and puts it back on the server. I would really love to know if it is possible to connect to the Access database and make changes to it without downloading it, so that I can save time.
2) If what I am asking for in question 1 is not possible: say I share my application with my colleagues and I want them to be able to do the same database editing after I make an .exe file out of my project and send it to them. Do they need to install ACE.oledb.12 on every computer that I want to run it on?
As Access is a file based system rather than a dedicated database server, "remote connections" don't exactly exist as all data processing has to be done locally. However as long as you are able to setup either a VPN to the server where the Access file is stored, or even better map the path as a network drive then you should be able to access it without having to download the file first. If you only have FTP access though then it wouldn't be possible.
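If the file is reachable through Windows networking (the VPN or mapped-drive scenario above), the connection string simply points at the UNC path or drive letter. A minimal sketch; the server, share, table, and field names are illustrative:

```csharp
using System.Data.OleDb;

class AccessOverShare
{
    static void Main()
    {
        // The UNC path below is illustrative; a mapped drive letter works the same way.
        string connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                         @"Data Source=\\SERVER\Share\Data.accdb;";

        using (var conn = new OleDbConnection(connStr))
        using (var cmd = new OleDbCommand(
                   "UPDATE SomeTable SET SomeField = ? WHERE ID = ?", conn))
        {
            cmd.Parameters.AddWithValue("p1", "new value");
            cmd.Parameters.AddWithValue("p2", 1);

            conn.Open();
            cmd.ExecuteNonQuery(); // edits happen in place; no download/upload round trip
        }
    }
}
```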
If all you are using is Jet/ACE, the database that Access normally uses, the other users will at least have to have the drivers, which are free, or if you are working within Access itself, you will need the runtime, also free.
Actually, the terminology you are using is wrong. You don't connect to a Word file. You don't connect to a PowerPoint file.
So keep in mind that you are not really connecting to some text file or mdb file that just happens to be sitting on a hard drive.
You are simply opening a file.
It is silly to say we connect to a Word file, or that we connect to a PowerPoint file. So what are we dealing with in the case of the Office suite and those basic, simple files that reside on the disk drive?
We are talking about a plain-Jane Windows file.
A horse is a horse is a horse.
A file is a file is a file.
So you don't connect to the jpg file sitting on the hard drive, you open the jpg file. The same applies to your current setup, so it is best to use the correct terminology here: you are not connecting to that Access accdb file, you are in fact opening a plain-Jane Windows file. If you look closely at your connection string, it will always contain a fully qualified Windows path name that resolves to a file sitting in a folder.
So if you place that file on some server, say a web server, you are still faced with having to open that Windows file. This is no different from wanting to open a PowerPoint file. That means if you are going to connect over the internet, you need to extend the Windows file system (in other words, you will need some kind of VPN). At the end of the day, if you cannot use Windows networking to browse and open the folder where that file resides, then you cannot open that file with Access (or, more specifically in your case, with the JET data engine).
So for example, if the server where the file resides is a non-Windows box, say Linux, then you need to add or install the Windows file and networking system on that box. A common choice on Linux is to install and run Samba on that computer.
Keep in mind that in the case of SQL Server, you are connecting to a service running on that server. In that case you are not opening a file on the remote box; you are simply using a socket (TCP/IP) connection to a service, rather than opening a plain old Windows file that happens to be an accdb file on that system.
So when you use FTP or HTTP, you are not using a real Windows networking system that lets you open and use a plain-Jane Windows file.
FTP requires the whole file to be downloaded locally.
And that raises a problem:
If the web site or web server has the accdb file open, how are you going to have the web server close the file before you upload and overwrite it? In other words, if that file is open by the web server, you should not be making a copy, or even downloading a copy via FTP, until the server and web site have closed the file. That means you need permission to stop the web server while you do this.
So just keep in mind the concept here: you are not connecting to some file, you are simply going to open the file. You need to be able to open the file, and in the case of Access you need to be using the Windows networking system to do so. You don't have to install Access (JET) on the target computer, just as you don't have to install PowerPoint on the target computer to open a PowerPoint file.
You don't have to install Word on that system to open a Word file.
You don't have to install Excel on that system to open an Excel file.
So you are only opening a file that just happens to be on some other computer.
So the Access database engine and software must be installed on your computer (a bare .exe is not possible here). You can certainly package your application as an installable Windows application that can then be installed on each computer. A free edition of the Access runtime is available, but you will still have to install that free runtime on every computer that uses the application, even if it is only the runtime edition. These days I am not aware of any popular development system that produces a standalone .exe without requiring a runtime of some kind, be it .NET, VB6, Java, or in this case Access, so some kind of support and runtime files are a common requirement in most systems in use today.
So, just keep in mind that you are opening a plain-Jane Windows file.
An http: or ftp: path is not a valid Windows path name, and HTTP and FTP are not part of the Windows file and networking system, so such a path has nothing to do with Windows networking or with opening a simple file sitting on the hard disk.
This basic concept is not too hard to grasp, but at the end of the day the thing you need to learn is that when you open a Windows file sitting in a folder on the hard drive, you need the Windows file system to open it. The idea of opening a file in a folder might be new to you, but it is a basic requirement and understanding you need in order to solve this issue.
As noted, you can consider a VPN, but I explain why such a setup is not going to work in this article:
Using a wan with ms-access? How fast, how far?
http://www.kallal.ca/Wan/Wans.html
(Do read the above; it explains that you can open such files over the internet, but also that such connections are far too slow for this use, even over high-speed internet.)
I suppose another possibility would be to consider the new web publishing ability that Access has. In the following video, note how I switch to running the Access application 100% in the browser. The resulting application does not need any ActiveX or Silverlight, so the web pages run and work fine on my smartphone and even my iPad.
http://www.youtube.com/watch?v=AU4mH0jPntI
So you can use the new web publishing feature, and that would allow the application to be used over http.
I'm writing a multi-threaded console application which downloads pdf files from the web and copies them locally onto our content server location (a Windows server). This is also the location from which the files will be served to our website.
I am skeptical about this approach because of concurrency issues: if a user on the web site requests a pdf file from the content server at the same time the file is being written to or updated by the console application, there might be an IO exception. (The application also updates the pdf files if the original contents change over time.)
Is there a way to control the concurrency issue?
You probably want your operations on creating and updating the files where they are served to be atomic, so that any other processes dealing with those files get the correct version, not the one that is still open for writing.
Instead of actually writing the files to where they will be served, you could write them to a temporary directory and then move them into the directory where they will be served from.
Similarly, when your application is updating those pdfs, you should check that the files being served are not changed until writing has finished. You could test this by making your application sleep after it has started writing to the file, for example.
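A minimal sketch of that write-to-a-temporary-location-then-move pattern; the paths are illustrative, and the final swap is only atomic when the temporary file sits on the same volume as the served copy:

```csharp
using System.IO;

static class AtomicPublish
{
    // Write the new content next to the served copy, then swap it in as one step.
    public static void Publish(byte[] pdfBytes, string servedDirectory, string fileName)
    {
        string tempPath  = Path.Combine(servedDirectory, fileName + ".tmp");
        string finalPath = Path.Combine(servedDirectory, fileName);

        File.WriteAllBytes(tempPath, pdfBytes); // the slow write happens on the temp file

        if (File.Exists(finalPath))
            File.Replace(tempPath, finalPath, finalPath + ".bak"); // swap, keeping a backup
        else
            File.Move(tempPath, finalPath);
    }
}
```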
The details depend on which web server software you are using, but the key to this problem is to give each version of the file a different name. The same URL, mind you, but a different name on the underlying file system.
Once a newer version of the file is ready, change the web server's configuration so that the URL points to the new file. In any reasonably functional web server this should be an atomic operation.
If the web server doesn't have built-in support for this, you could serve the files via a custom server-side script.
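As a hedged sketch of that last idea in ASP.NET, a small handler can read a pointer file that names the current version and stream it; the folder, the pointer file, and its contents are illustrative:

```csharp
using System.IO;
using System.Web;

// current.txt holds the file name of the latest version, e.g. "report_v2.pdf";
// the console application writes the new pdf first and updates current.txt last.
public class PdfHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string folder = context.Server.MapPath("~/App_Data/Pdfs");
        string currentName = File.ReadAllText(Path.Combine(folder, "current.txt")).Trim();

        context.Response.ContentType = "application/pdf";
        context.Response.TransmitFile(Path.Combine(folder, currentName));
    }
}
```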
Mark the files hidden until the copy or update is complete.
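A tiny sketch of that suggestion; the paths are illustrative. (The file is created first and then hidden, because overwriting an already hidden file with FileMode.Create fails.)

```csharp
using System.IO;

static class HiddenWhileWriting
{
    public static void Write(string path, byte[] content)
    {
        // Create an empty file and hide it before any content goes in.
        using (File.Create(path)) { }
        File.SetAttributes(path, FileAttributes.Hidden);

        // Write the content while the file is hidden.
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Write))
        {
            stream.Write(content, 0, content.Length);
        }

        // Unhide it once the copy/update is complete.
        File.SetAttributes(path, FileAttributes.Normal);
    }
}
```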
I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.
Should I code the sync logic to be part of a file upload page, protected in the admin section of the site or should I create a Windows Service that constantly looks for files to process in a particular directory that I can drop files in through FTP?
I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code it can take longer than it would if I just had to modify an ASPX.
Are there security benefits one way or the other?
Performance benefits?
ASPX page wins the "ease of maintenance" category.
I would create a Windows Service to watch a secure folder and use a directory watcher to look for new files. Since the files are coming from another system, it is asynchronous in nature, and it is much more performant to have a Windows Service running separately to watch for updates as they happen. It can also parse the files and update the database for you.
Depending on who maintains the remote system, the easiest way is to grant permission to the service to access the files on a secure, shared folder. Then you won't need to do anything manually.
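A minimal sketch of the service side, assuming a drop folder and reusing the existing import code (the folder path and the ProcessFile routine are hypothetical):

```csharp
using System.IO;
using System.ServiceProcess;

public class ImportService : ServiceBase
{
    private FileSystemWatcher _watcher;

    static void Main()
    {
        ServiceBase.Run(new ImportService());
    }

    protected override void OnStart(string[] args)
    {
        // Watch the secure drop folder for new comma-delimited files.
        _watcher = new FileSystemWatcher(@"D:\Imports", "*.txt");
        _watcher.Created += (sender, e) => ProcessFile(e.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        _watcher.Dispose();
    }

    private void ProcessFile(string path)
    {
        // Parse the file and update the database here, reusing the
        // existing C# import code mentioned in the question.
    }
}
```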
Is it possible to write a filesystem for Windows in pure usermode, or more specifically purely in managed code? I am thinking of something very similar to GMAILFS. Excluding what it is doing under the covers (GMAIL, Amazon, etc..) the main goal would be to provide a drive letter and support all of the basic file operations, and possibly even adding my own structures for storing metadata, etc..
Windows provides several approaches to building a user-mode file system for different purposes, depending on your storage location and features that you need to support. Two of them, Projected File System API and Cloud Files API were recently provided as part of the Windows 10 updates.
Windows Projected File System API
Projected File System API is designed to represent some hierarchical data, such as for example Windows Registry, in the form of a file system.
Unlike Cloud Files (see below) it does not provide any information about file status and hides the fact that this is not the “real” file system. Example.
Windows Cloud Sync Engine API
Cloud Sync Engine API (Cloud Files API, Cloud Filter API) is used in OneDrive on Windows 10 under the hood. It provides folder content loading during the first request, on-demand files content loading in several different modes, and offline files support. It integrates directly into Windows File Manager and Windows Notification Center and provides file status (offline, in-sync, conflict, pinned) and file content transfer progress.
The Cloud Files API runs under regular user permissions and does not require admin privileges for file system mounting or any API calls. Example.
Windows Shell Namespace Extensions API
While a Shell Namespace Extension is not a real file system, in many cases you will use it to extend the functionality of the Projected File System and Cloud Files API. For example, you would use it to add custom commands to context menus in Windows File Manager, and you can create nodes that look and behave like a real file system (again, applications would not be able to read or write to such nodes; this is just a user interface).
The Cloud Files API uses a namespace extension to show your sync root at the top level in Windows File Manager.
It's difficult. I'd take a look at some projects which have done some of the hard work for you, e.g. Dokan.
Yes. It's possible and has been successfully done for the ext2 filesystem.
Note that you will need to write your own driver which will require Microsoft signing to be run on some OSes.
Sure, you can abstract the regular file operations and have them running in the cloud (see Google Apps, Amazon S3, Microsoft Azure etc.). But if you'd like to talk to local devices - including the local HD - you'll have to use system APIs and those use drivers (system/kernel mode).
As long as all you want is a storage service -no problem. If you want a real OS, you'll need to talk to real hardware and that means drivers.
Just as a reference - our Callback File System is a maintained and supported solution for creation of filesystems in user-mode.