I am creating an application that uses Quartz.NET to automatically download and upload files to various sources (HTTP, FTP, and network paths) based upon a regular expression. Users can select multiple paths for each download and upload operation, so a typical job might download files from an HTTP server, also download from an FTP server, and then upload all of the files to a network path.
Currently, I download all files from all the download sources and store them in a folder (the folder name being a GUID specific to that job). The upload stage then simply reads all files from that directory and uploads them to the path, which works great.
The problem is that, for specific paths, the user may request that the source files be deleted after the upload has completed. So how can I find out where a file in the folder came from? I've been trying to think of ways around this, such as creating a subfolder for each download path, but then I'd need to check for duplicate names on download rather than upload, plus I'd need to merge the subfolders afterwards, etc.!
Can anyone offer any ideas? Many thanks.
Think about this in an object-oriented manner.
Create a class like this:
public class File
{
    public string source;      // where the file was downloaded from
    public string destination; // where the file should be uploaded to
    public bool deleteSource;  // if true, delete the source after the copy
}
Now create a list of File objects, like List<File> files, and keep that as a variable in your app.
Add objects to the list at the start, then traverse the list and copy/upload the files. Check the deleteSource property, and if it is true, delete the file after the copy operation.
This is a basic idea; expand the class as required.
What I want to stress is to think about the problem in an object-oriented way and start designing from there.
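For instance, here is a minimal sketch of the traversal described above, assuming plain local copies via System.IO (how the list gets populated, and any HTTP/FTP specifics, are placeholders):

List<File> files = new List<File>();
// ... add a File entry for each download at the start of the job ...

foreach (File file in files)
{
    // Copy first; overwrite any existing file at the destination.
    System.IO.File.Copy(file.source, file.destination, true);

    // Honor the per-file flag: remove the source only where requested.
    if (file.deleteSource)
    {
        System.IO.File.Delete(file.source);
    }
}

Note the fully qualified System.IO.File calls: since the example class is itself named File, an unqualified File.Copy would resolve to this class and not compile.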
When you download a file, can you create a separate text file that contains the source and destination paths? That way you can read that mapping back later and process the files as necessary based on their source.
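For example, a sketch of that idea (jobFolder, sourcePath, and localPath are hypothetical names):

string mappingFile = System.IO.Path.Combine(jobFolder, "mapping.txt");

// After each download, record where the local copy came from.
System.IO.File.AppendAllText(mappingFile,
    sourcePath + "|" + localPath + Environment.NewLine);

// At upload time, read the mapping back to decide which sources to delete.
foreach (string line in System.IO.File.ReadAllLines(mappingFile))
{
    string[] parts = line.Split('|');
    string source = parts[0];
    string local = parts[1];
    // ... upload "local", then delete "source" if that path requires it ...
}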
I'm trying to develop a simple SFTP file transferring project with the following operations:
Upload
Download
Move files within the remote server
Remove files
While uploading with session.PutFiles(), there is a property, transferOptions.FileMask, to filter the files.
But I didn't see anything of that kind for session.MoveFile() or session.RemoveFiles().
My question is: what should I do if I need to move/remove only selected files?
The Session.RemoveFiles accepts a file mask.
So you can do:
session.RemoveFiles("/home/user/*.txt");
It's the same as with Session.PutFiles. The TransferOptions.FileMask is actually intended for advanced selections, such as when you want to select files recursively, or when you want to exclude certain types of files.
session.PutFiles(@"c:\toupload\*.txt", "/home/user/");
With TransferOptions.FileMask, WinSCP would upload all matching files recursively, while with a simple file mask as the argument to PutFiles, it's not recursive.
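For example, a short sketch of the recursive variant (same hypothetical paths as above):

TransferOptions transferOptions = new TransferOptions();
transferOptions.FileMask = "*.txt"; // matched recursively in subdirectories too

// Upload everything under c:\toupload that matches the mask.
session.PutFiles(@"c:\toupload\*", "/home/user/", false, transferOptions).Check();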
The Session.MoveFile actually supports the file mask in its first argument too, though it's an undocumented feature.
A proper way would be to list the remote directory, select the desired files, and call Session.MoveFile for each.
See Listing files matching wildcard. It's a PowerShell example, but mapping it to C# should be easy.
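A rough C# equivalent of that approach might look like this (assuming an already opened Session and the same hypothetical /home/user paths):

RemoteDirectoryInfo directory = session.ListDirectory("/home/user");

foreach (RemoteFileInfo fileInfo in directory.Files)
{
    // Select the desired files; here, plain *.txt files only.
    if (!fileInfo.IsDirectory &&
        fileInfo.Name.EndsWith(".txt", StringComparison.OrdinalIgnoreCase))
    {
        session.MoveFile(fileInfo.FullName, "/home/user/archive/" + fileInfo.Name);
    }
}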
My C# application downloads a .zip that contains at least one .dcm file.
After decompression I get something like:
download/XXXX/YYYYYY/ZZZZZZ/file.dcm
I don't know the names of these intermediary X, Y, Z folders, or how many of them exist, but I'm certain that at least one .dcm file exists at the end of the path.
How can I get the full path of the folders between download and the folder with the .dcm files? (Assume a Windows filesystem and .NET Framework 4.0.)
This will give you a list of all the files contained within the download folder that match your filename:
Directory.GetFiles("C:\\path_to_download_folder", "file.dcm", SearchOption.AllDirectories);
You could then parse the returned filepaths for whatever parts you need. The System.IO.Path methods will probably give you what you need instead of rolling your own.
Additionally, if your application might download multiple files throughout the day and you always need the path of the very latest matching file, you could pass each filepath to a System.IO.FileInfo, which exposes the file's creation time, and use that to determine which file is the newest.
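For example, a small sketch of both steps, assuming the hypothetical folder from the snippet above (uses System.IO and System.Linq):

string[] matches = Directory.GetFiles(
    @"C:\path_to_download_folder", "file.dcm", SearchOption.AllDirectories);

// Newest match by creation time, via System.IO.FileInfo.
string newest = matches
    .OrderByDescending(path => new FileInfo(path).CreationTime)
    .FirstOrDefault();

// The directory holding the .dcm file, e.g. download\XXXX\YYYYYY\ZZZZZZ.
string dcmFolder = Path.GetDirectoryName(newest);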
I want to bind a file to a folder so that when the file is moved/copied/deleted, the folder is also automatically moved/copied/deleted along with the file (similar to how HTML files are linked to the folder containing their resources).
At the moment this behaviour is activated only for HTML files and their corresponding "htmlFilename_files" folders. Is there a way to register another file extension for such behaviour? Or is there a hack or feature that provides similar behaviour? Thanks.
Even for HTML this behavior is specific to Explorer.
The problem is that there's no "copy" operation at the filesystem level (and the Move operation is different from what you see in user mode).
Technically, you can create a filesystem filter driver and track RenameOrMove and Delete operations on some file, then perform the corresponding operation on the directory. But this won't work for copying, which is a sequence of "read" and "write" operations, and you would have a hard time matching reads to writes (though I can think of some ways to track copying with a filter driver as well). You can write such a driver yourself (read the MSDN article), but this requires C programming and specialized knowledge; alternatively, you can use our CallbackFilter product (it provides a pre-created driver and a .NET API for integration with your software).
If the data file is yours, it makes sense to keep all files in one virtual container - this can be an MHT file (HTML plus supplementary files are combined into MHT by Internet Explorer), a ZIP archive, or SolFS storage (SolFS is our virtual file system product). Then there's only one file for the user to manage, and your application has all the files inside.
If you need to let external applications access the files in the container, then a virtual disk can be created - such a disk will expose the contents of the container. The contents of any container can be exposed as a virtual disk using the now-dead Dokan or our Callback File System product.
As we all know, we cannot get the full path of a file using the FileUpload control, so we follow the process of saving the file into our application by creating a folder and getting that folder's path as follows:
Server.MapPath
But I have a scenario with 1200 Excel files, though not selected all at once: I select each Excel file individually, read the required content from it, and save the information to the database. While doing this, I save the files into the application folder by creating a folder named Excel. Since I have 1200 files, all of them end up saved into this folder after each and every run.
Whether this is the correct method to follow or not, I don't know.
I am looking for an alternative solution rather than saving the file to a folder. I would like to hold the file only temporarily, until the process has executed.
So can anyone tell me the best way to meet this requirement?
Grrbrr404 is correct. You can perfectly well take the byte[] from FileUpload.PostedFile and save it to the database directly, without using the intermediate folder. You could store the file name with extension in a separate column so you know how to stream it back later, in case you need to.
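For instance, a minimal sketch, assuming a FileUpload control named fileUpload, an open SqlConnection named connection (System.Data.SqlClient), and a hypothetical Documents(Name, Content) table:

// Raw bytes of the posted file; no intermediate folder involved.
byte[] content = fileUpload.FileBytes;

using (var command = new SqlCommand(
    "INSERT INTO Documents (Name, Content) VALUES (@name, @content)", connection))
{
    command.Parameters.AddWithValue("@name",
        System.IO.Path.GetFileName(fileUpload.FileName));
    command.Parameters.AddWithValue("@content", content);
    command.ExecuteNonQuery();
}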
The debate over whether it's good or bad to store these things in the database itself or on the filesystem is very heated. I don't think either approach is always best; you'll have to look at your resources and your particular situation and make the appropriate decision. Search for "Store images in database or filesystem" here or on Google and you'll see what I mean.
See this one, for example.
I have a tab-delimited text file that I need to upload to a secure folder for SSIS to import into SQL Server. The file will be uploaded by an external user via a web app.
My challenge is that I need to check this file for some things before I am allowed to let it go to the secure folder. Namely, I need to make sure the file has a specific column.
I can do this if I save the file to a folder in the web app. However, the nature of the data in the file is such that we do not wish to place this file anywhere other than the secured folder. I also cannot place it directly into that folder, because the SSIS package is set to trigger as soon as the file shows up there.
What I need to do is find a way, if there is one, to parse the file in memory and, if it passes all the checks, upload it to the secure folder.
I'm using C#.NET and the FileUpload Control.
My search so far has turned up all kinds of information, but all of it requires saving the file somewhere first and then working with it.
Thank you so much for your time. If anybody can point me to an object or some code I can check out I would be most grateful.
Rather than calling SaveAs, use the FileContent property to access the contents of the file as a Stream, then you can do whatever processing is required before you save it manually.
For example, something like this:
string data;
using (StreamReader reader = new StreamReader(fileUpload.FileContent))
{
    data = reader.ReadToEnd();
}
The data variable now contains the contents of the file as a string. You can do whatever processing you like, then save it (or not) to the appropriate location.
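As a rough sketch of that processing for the header check described in the question (the column name and secure folder path are placeholders; uses System.Linq):

// Validate the header row in memory; write the file out only if it passes.
string[] header = data.Split('\n')[0].TrimEnd('\r').Split('\t');
if (header.Contains("RequiredColumn"))
{
    System.IO.File.WriteAllText(
        System.IO.Path.Combine(@"\\server\secureFolder",
            System.IO.Path.GetFileName(fileUpload.FileName)),
        data);
}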