WP background file transfer, more than 25 files - c#

according to this topic
http://msdn.microsoft.com/en-us/library/windowsphone/develop/hh202959(v=vs.105).aspx
I'm trying to download more than 25 MP3 files from a list, in the background. I've made a lot of different attempts; basically I tried to pass a list, remove each file once it was downloaded, and call the function again, but it doesn't work with the app in the background. Maybe because it's a variable? Should I store it in isolated storage? Here is the latest code:
ObservableCollection<File> remoteFileList = new ObservableCollection<File>();
public void downloadList()
{
if ((remoteFileList.Count > 0) && (BackgroundTransferService.Requests.Count() < 5))
{
File t = remoteFileList.First();
BackgroundTransferRequest transfer = startDownload(t.Name);
transfer.TransferStatusChanged += new EventHandler<BackgroundTransferEventArgs>(transfer_TransferStatusChanged);
remoteFileList.Remove(t);
}
}
public void transfer_TransferStatusChanged(object sender, BackgroundTransferEventArgs e)
{
BackgroundTransferRequest b = e.Request as BackgroundTransferRequest;
System.Diagnostics.Debug.WriteLine(b.TransferStatus);
ProcessTransfer(e.Request);
downloadList();
}

To pop items off the background transfer queue, you need to call the Remove() method on the BackgroundTransferService class. You cannot have more than 25 requests in the queue without removing something from it first.
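For illustration, ProcessTransfer could look roughly like the sketch below: remove the finished request from the service (that is what frees a queue slot), then let downloadList() queue the next file. This is a sketch rather than your exact code, and the success handling is left as a placeholder comment:
private void ProcessTransfer(BackgroundTransferRequest transfer)
{
    switch (transfer.TransferStatus)
    {
        case TransferStatus.Completed:
            // Completed covers success and failure alike; either way the
            // request must be removed, or the 25-request limit fills up.
            BackgroundTransferService.Remove(transfer);
            if (transfer.TransferError == null)
            {
                // Success: the file now sits at transfer.DownloadLocation
                // in isolated storage; move or register it here.
            }
            break;
        default:
            // Paused, waiting for Wi-Fi/power, etc.: leave it in the queue.
            break;
    }
}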

Related

Getting time remaining from DotNetZip packaging

I have this code:
using (var zip = new ZipFile())
{
zip.CompressionLevel = CompressionLevel.None;
zip.AddDirectory(myDirectoryInfo.FullName);
zip.UseZip64WhenSaving = Zip64Option.Always;
zip.SaveProgress += SaveProgress;
zip.Save(outputPackage);
}
private void SaveProgress(object sender, SaveProgressEventArgs e)
{
if (e.EntriesTotal > 0 && e.EntriesSaved > 0)
{
var counts = String.Format("{0} / {1}", e.EntriesSaved, e.EntriesTotal);
var percentcompletion = ((double)e.EntriesSaved / e.EntriesTotal) * 100;
}
}
What I really want to do is estimate the time remaining for the packaging to complete, but in SaveProgress the SaveProgressEventArgs properties BytesTransferred and TotalBytesToTransfer are both 0. I believe I need these to estimate the time accurately?
So first, am I supposed to get values from these? The packaging itself seems to work fine. Second, what's the best way to estimate the time remaining here? And third, is there a way to ensure this is the fastest way to package a large directory? I don't want compression: this is a directory filled with already compressed files that just need to be stuffed into an archive.
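Not an authoritative answer, but one rough way to estimate the remaining time without byte counts is to time the entries as they are written and extrapolate from the average rate. If memory serves, DotNetZip only populates BytesTransferred and TotalBytesToTransfer for the per-entry Saving_EntryBytesRead events, not for the save as a whole, which would explain the zeros. A minimal sketch, assuming the files are similar enough in size that entries per second is a usable proxy:
private readonly Stopwatch _saveTimer = new Stopwatch(); // System.Diagnostics

private void SaveProgress(object sender, SaveProgressEventArgs e)
{
    if (e.EventType == ZipProgressEventType.Saving_Started)
    {
        _saveTimer.Reset();
        _saveTimer.Start();
    }
    else if (e.EventType == ZipProgressEventType.Saving_AfterWriteEntry && e.EntriesSaved > 0)
    {
        // Average seconds per entry so far, extrapolated over what's left.
        double secondsPerEntry = _saveTimer.Elapsed.TotalSeconds / e.EntriesSaved;
        TimeSpan remaining = TimeSpan.FromSeconds(secondsPerEntry * (e.EntriesTotal - e.EntriesSaved));
        Debug.WriteLine("Estimated time remaining: " + remaining);
    }
}
As for speed: with CompressionLevel.None the save is mostly I/O-bound already, so there is little left to tune beyond the disk itself.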

.NET FileSystemWatcher goes into infinite loop when moving file

I have an issue with the FileSystemWatcher. I'm using it in a Windows service to monitor certain folders; when a file is copied, the service processes that file using an SSIS package. Everything works fine, but every now and then the watcher picks up the same file and fires the Created event multiple times in an infinite loop. The code below works as follows:
Firstly, this method is called by the Windows service and creates a watcher:
private void CreateFileWatcherEvent(SSISPackageSetting packageSettings)
{
// Create a new FileSystemWatcher and set its properties.
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.IncludeSubdirectories = false;
watcher.Path = packageSettings.FileWatchPath;
/* Watch for changes in LastAccess and LastWrite times, and
the renaming of files or directories. */
watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
| NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.Size;
//Watch for all files
watcher.Filter = "*.*";
watcher.Created += (s, e) => FileCreated(e, packageSettings);
// Begin watching.
watcher.EnableRaisingEvents = true;
}
Next up, the Watcher.Created event looks something like this:
private void FileCreated(FileSystemEventArgs e, SSISPackageSetting packageSettings)
{
//Bunch of other code not important to the issue
ProcessFile(packageSettings, e.FullPath, fileExtension);
}
The ProcessFile method looks something like this:
private void ProcessFile(SSISPackageSetting packageSetting,string Filename,string fileExtension)
{
//COMPLETE A BUNCH OF SSIS TASKS TO PROCESS THE FILE
//NOW WE NEED TO CREATE THE OUTPUT FILE SO THAT SSIS CAN WRITE TO IT
string errorOutPutfileName = packageSetting.ImportFailurePath + @"\FailedRows" + System.DateTime.Now.ToFileTime() + packageSetting.ErrorRowsFileExtension;
File.Create(errorOutPutfileName).Close();
MoveFileToSuccessPath(Filename, packageSetting);
}
Lastly, the MoveFileToSuccessPath method looks like this:
private void MoveFileToSuccessPath(string filename, SSISPackageSetting ssisPackage)
{
try
{
string newFilename = MakeFilenameUnique(filename);
System.IO.File.Move(filename, ssisPackage.ArchivePath.EndsWith("\\") ? ssisPackage.ArchivePath + newFilename : ssisPackage.ArchivePath + "\\" + newFilename);
}
catch (Exception ex)
{
SaveToApplicationLog(string.Format
("Error occurred while moving a file to the success path. Filename {0}. Archive Path {1}. Error {2}", filename, ssisPackage.ArchivePath, ex.ToString()), EventLogEntryType.Error);
}
}
So somewhere in there we go into an infinite loop, and the FileSystemWatcher keeps picking up the same file. Does anyone have any idea? This happens randomly and intermittently.
When using the FileSystemWatcher I tend to add files to a dictionary when the notification event fires. I then have a separate thread, driven by a timer, which picks files up from this collection once they are more than a few seconds old, somewhere around 5 seconds.
If my processing is also likely to change the last access time, and I watch for that too, then I also compute a checksum, which I keep in a dictionary along with the filename and last processed time for every file, and use it to suppress repeated firings. The checksum doesn't have to be expensive to calculate; I have used MD5 and even CRC32, since you are only trying to prevent duplicate notifications.
EDIT
This example code is very situation-specific and makes lots of assumptions you may need to change. It doesn't repeat all your code, just something like the bits you need to add:
// So, first thing to do is add a dictionary to store file info:
internal class FileWatchInfo
{
public DateTime LatestTime { get; set; }
public bool IsProcessing { get; set; }
public bool Processed { get; set; } // set once ProcessFile completes
public DateTime ProcessedTime { get; set; }
public string FullName { get; set; }
public string Checksum { get; set; }
}
SortedDictionary<string, FileWatchInfo> fileInfos = new SortedDictionary<string, FileWatchInfo>();
private readonly object SyncRoot = new object();
// Now, when you set up the watcher, also set up a System.Threading.Timer to monitor that dictionary.
CreateFileWatcherEvent(new SSISPackageSetting{ FileWatchPath = "H:\\test"});
int processFilesInMilliseconds = 5000;
Timer timer = new Timer(ProcessFiles, null, processFilesInMilliseconds, processFilesInMilliseconds);
// In FileCreated, don't process the file but add it to a list
private void FileCreated(FileSystemEventArgs e) {
var finf = new FileInfo(e.FullPath);
DateTime latest = finf.LastAccessTimeUtc > finf.LastWriteTimeUtc
? finf.LastAccessTimeUtc : finf.LastWriteTimeUtc;
latest = latest > finf.CreationTimeUtc ? latest : finf.CreationTimeUtc;
// Beware of issues if other code sets the file times to crazy times in the past/future
lock (SyncRoot) {
// You need to work out what to do if you actually need to add this file again (i.e. someone
// has edited it in the 5 seconds since it was created, and the time it took you to process it)
if (!this.fileInfos.ContainsKey(e.FullPath)) {
FileWatchInfo info = new FileWatchInfo {
FullName = e.FullPath,
LatestTime = latest,
IsProcessing = false, Processed = false,
Checksum = null
};
this.fileInfos.Add(e.FullPath, info);
}
}
}
And finally, here is the process method as it now is
private void ProcessFiles(object state) {
FileWatchInfo toProcess = null;
List<string> toRemove = new List<string>();
lock (this.SyncRoot) {
foreach (var info in this.fileInfos) {
// You may want to sort your list by latest to avoid files being left in the queue for a long time
if (info.Value.Checksum == null) {
// If this fires the watcher, it doesn't matter, but beware of big files,
// which may mean you need to move this outside the lock
using (var md5 = MD5.Create())
using (var stream = File.OpenRead(info.Value.FullName)) {
info.Value.Checksum =
BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "").ToLower();
}
}
// Data store (myFileInfoStore) is code I haven't included - use a Dictionary which you remove files from
// after a few minutes, or a permanent database to store file checksums
if ((info.Value.Processed && info.Value.ProcessedTime.AddSeconds(5) < DateTime.UtcNow)
|| myFileInfoStore.GetFileInfo(info.Value.FullName).Checksum == info.Value.Checksum) {
toRemove.Add(info.Key);
}
else if (!info.Value.Processed && !info.Value.IsProcessing
&& info.Value.LatestTime.AddSeconds(5) < DateTime.UtcNow) {
info.Value.IsProcessing = true;
toProcess = info.Value;
// This processes one file at a time, you could equally add a bunch to a list for parallel processing
break;
}
}
foreach (var filePath in toRemove) {
this.fileInfos.Remove(filePath);
}
}
if (toProcess != null)
{
ProcessFile(packageSettings, toProcess.FullName, new FileInfo(toProcess.FullName).Extension);
}
}
Finally, ProcessFile needs to process your file, then, once completed, take the lock, mark the entry in the fileInfos dictionary as Processed, set the ProcessedTime, exit the lock, and move the file. You will also want to update the checksum if it changes after an acceptable amount of time has passed.
It is very hard to provide a complete sample as I know nothing about your situation, but this is the general pattern I use. You will need to consider file rates, how frequently they are updated, etc. You can probably bring the time intervals down to sub-second instead of 5 seconds and still be OK.
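For completeness, a minimal sketch of that completion step; the method name and the packageSettings field are hypothetical, adapt them to however your settings are available:
private void FileProcessed(string fullPath)
{
    lock (this.SyncRoot)
    {
        FileWatchInfo info;
        if (this.fileInfos.TryGetValue(fullPath, out info))
        {
            // Mark it done so ProcessFiles removes it after the grace period.
            info.Processed = true;
            info.IsProcessing = false;
            info.ProcessedTime = DateTime.UtcNow;
        }
    }
    // Move the file outside the lock; any watcher events this raises are
    // suppressed by the Processed flag and the stored checksum.
    MoveFileToSuccessPath(fullPath, this.packageSettings);
}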

How to set a dynamic number of threadCounter variables?

I'm not really into multithreading, so the question is probably stupid, but it seems I cannot find a way to solve this problem (especially because I'm using C# and I've only been using it for a month).
I have a dynamic number of directories (I got them from a query in the DB). Inside those directories there are a certain number of files.
For each directory I need a method that transfers its files over FTP concurrently, because I basically have no limit on the maximum number of FTP connections (not my words, it's written in the specs).
But I still need to control the maximum number of files transferred per directory, so I need to count the files I'm transferring (increment/decrement).
How could I do it? Should I use something like an array and the Monitor class?
Edit: Framework 3.5
You can use the Semaphore class to throttle the number of concurrent file transfers per directory. You would probably want one semaphore per directory so that the number of FTP uploads to each directory can be controlled independently.
public class Example
{
private const int NUM_OF_CONCURRENT_OPERATIONS = 5; // pick a limit that suits your requirements
public void ProcessAllFilesAsync()
{
var semaphores = new Dictionary<string, Semaphore>();
foreach (string filePath in GetFiles())
{
string filePathCapture = filePath; // Needed to perform the closure correctly.
string directoryPath = Path.GetDirectoryName(filePath);
if (!semaphores.ContainsKey(directoryPath))
{
int allowed = NUM_OF_CONCURRENT_OPERATIONS;
semaphores.Add(directoryPath, new Semaphore(allowed, allowed));
}
var semaphore = semaphores[directoryPath];
ThreadPool.QueueUserWorkItem(
(state) =>
{
semaphore.WaitOne();
try
{
DoFtpOperation(filePathCapture);
}
finally
{
semaphore.Release();
}
}, null);
}
}
}
var allDirectories = db.GetAllDirectories();
foreach(var directoryPath in allDirectories)
{
DirectoryInfo directories = new DirectoryInfo(directoryPath);
//Loop through every file in that Directory
foreach(var fileInDir in directories.GetFiles()) {
//Check if we have reached our max limit
//Check if we have reached our max limit; keep re-checking after each sleep
while (numberFTPConnections == MAXFTPCONNECTIONS){
Thread.Sleep(1000);
}
//code to copy to FTP
//This can be Aync, when then transfer is completed
//decrement the numberFTPConnections so then next file can be transfered.
}
}
You can try something along the lines above. Note that it's just the basic logic and there are probably better ways to do this.
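If you keep a plain counter like the one sketched above, the check and the increment need to be atomic, otherwise two threads can grab the last slot at the same time. One way to do that on .NET 3.5 is Interlocked; a rough sketch with placeholder names, not a drop-in:
private int _active = 0;                  // files currently transferring
private const int MAXFTPCONNECTIONS = 10; // hypothetical per-directory limit

private void TransferFile(string filePath)
{
    // Claim a slot atomically; back off and retry if we went over the limit.
    while (Interlocked.Increment(ref _active) > MAXFTPCONNECTIONS)
    {
        Interlocked.Decrement(ref _active);
        Thread.Sleep(250);
    }
    try
    {
        // ... actual FTP upload for filePath goes here ...
    }
    finally
    {
        Interlocked.Decrement(ref _active); // free the slot
    }
}
That said, the Semaphore approach in the first answer expresses the same throttling more directly and blocks without polling.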

ASP.NET Schedule deletion of temporary files

Question: I have an ASP.NET application which creates temporary PDF files (for the user to download).
Now, many users over many days can create many PDFs, which take up a lot of disk space.
What's the best way to schedule deletion of files older than 1 day / 8 hours?
Preferably in the ASP.NET application itself...
For each temporary file that you need to create, make a note of the filename in the session:
// create temporary file:
string fileName = System.IO.Path.GetTempFileName();
Session[string.Concat("temporaryFile", Guid.NewGuid().ToString("d"))] = fileName;
// TODO: write to file
Next, add the following cleanup code to global.asax:
<%@ Application Language="C#" %>
<script RunAt="server">
void Session_End(object sender, EventArgs e) {
// Code that runs when a session ends.
// Note: The Session_End event is raised only when the sessionstate mode
// is set to InProc in the Web.config file. If session mode is set to StateServer
// or SQLServer, the event is not raised.
// remove files that have been uploaded, but not actively 'saved' or 'canceled' by the user
foreach (string key in Session.Keys) {
if (key.StartsWith("temporaryFile", StringComparison.OrdinalIgnoreCase)) {
try {
string fileName = (string)Session[key];
Session[key] = string.Empty;
if ((fileName.Length > 0) && (System.IO.File.Exists(fileName))) {
System.IO.File.Delete(fileName);
}
} catch (Exception) { }
}
}
}
</script>
UPDATE: I'm now actually using a new (improved) method compared to the one described above. It involves HttpRuntime.Cache and checks that the files are older than 8 hours. I'll post it here in case anyone is interested. Here's my new global.asax.cs:
using System;
using System.Web;
using System.Text;
using System.IO;
using System.Xml;
using System.Web.Caching;
public partial class global : System.Web.HttpApplication {
protected void Application_Start() {
RemoveTemporaryFiles();
RemoveTemporaryFilesSchedule();
}
public void RemoveTemporaryFiles() {
string pathTemp = "d:\\uploads\\";
if ((pathTemp.Length > 0) && (Directory.Exists(pathTemp))) {
foreach (string file in Directory.GetFiles(pathTemp)) {
try {
FileInfo fi = new FileInfo(file);
if (fi.CreationTime < DateTime.Now.AddHours(-8)) {
File.Delete(file);
}
} catch (Exception) { }
}
}
}
public void RemoveTemporaryFilesSchedule() {
HttpRuntime.Cache.Insert("RemoveTemporaryFiles", string.Empty, null,
DateTime.Now.AddHours(1), Cache.NoSlidingExpiration, CacheItemPriority.NotRemovable,
delegate(string id, object o, CacheItemRemovedReason cirr) {
if (id.Equals("RemoveTemporaryFiles", StringComparison.OrdinalIgnoreCase)) {
RemoveTemporaryFiles();
RemoveTemporaryFilesSchedule();
}
});
}
}
Try using Path.GetTempPath(). It will give you the path to the Windows temp folder, and from there it's up to Windows to clean up :)
You can read more about the method here: http://msdn.microsoft.com/en-us/library/system.io.path.gettemppath.aspx
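For example, combined with a random file name (pdfBytes here is a placeholder for your generated output):
string tempFile = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
File.WriteAllBytes(tempFile, pdfBytes);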
The best way is to create a batch file which will be called by the Windows Task Scheduler at whatever interval you want.
OR
you can create a Windows service with the class below:
public class CleanUpBot
{
public bool KeepAlive;
private Thread _cleanUpThread;
private const int TIME_IN_MILLISECONDS = 5 * 60 * 60 * 1000; // e.g. run every 5 hours
public void Run()
{
_cleanUpThread = new Thread(StartCleanUp);
_cleanUpThread.Start();
}
private void StartCleanUp()
{
do
{
// HERE THE LOGIC FOR DELETING FILES
Thread.Sleep(TIME_IN_MILLISECONDS);
} while (KeepAlive);
}
}
Notice that you can also kick this class off from Page_Load, and it won't affect the request time because the work happens on another thread. Just remove the do-while loop and the Thread.Sleep().
How do you store the files? If possible, you could just go with a simple solution where all files are stored in a folder named after the current date and time.
Then create a simple page or HTTP handler that deletes old folders. You could call this page at intervals using Windows Task Scheduler or another cron job.
Create a timer in Application_Start, schedule it to call a method every hour, and have that method flush files older than 8 hours or 1 day or whatever duration you need.
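A minimal sketch of that idea for global.asax.cs (the folder path and ages are placeholders); the timer lives in a static field so it is not garbage collected:
private static System.Timers.Timer _cleanupTimer;

protected void Application_Start(object sender, EventArgs e)
{
    _cleanupTimer = new System.Timers.Timer(TimeSpan.FromHours(1).TotalMilliseconds);
    _cleanupTimer.Elapsed += delegate
    {
        // Hypothetical temp folder; delete anything older than 8 hours.
        foreach (string file in System.IO.Directory.GetFiles(@"d:\uploads"))
        {
            try
            {
                if (System.IO.File.GetCreationTime(file) < DateTime.Now.AddHours(-8))
                    System.IO.File.Delete(file);
            }
            catch (System.IO.IOException)
            {
                // file still in use; it will be picked up on the next run
            }
        }
    };
    _cleanupTimer.Start();
}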
I sort of agree with what's said in the answer by dirk.
The idea being that the temp folder you drop the files into is a fixed, known location. However, I differ slightly:
Each time a file is created, add the filename to a list in the session object (assuming there aren't thousands; if the list hits a given cap, run the cleanup early, as sketched below).
When the session ends, the Session_End event in global.asax should be raised. Iterate over the files in the list and remove them.
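Something like this (untested; the session key and the cap are arbitrary):
// When a temporary file is created:
List<string> tempFiles = Session["temporaryFiles"] as List<string>;
if (tempFiles == null)
{
    tempFiles = new List<string>();
    Session["temporaryFiles"] = tempFiles;
}
tempFiles.Add(fileName);
// if (tempFiles.Count > 100) { /* cap reached: run the cleanup early */ }

// In global.asax:
void Session_End(object sender, EventArgs e)
{
    List<string> files = Session["temporaryFiles"] as List<string>;
    if (files == null) return;
    foreach (string file in files)
    {
        try
        {
            if (System.IO.File.Exists(file))
                System.IO.File.Delete(file);
        }
        catch (System.IO.IOException)
        {
            // locked; leave it for a scheduled sweep
        }
    }
}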
private const string TEMPDIRPATH = @"C:\mytempdir\";
private const int DELETEAFTERHOURS = 8;
private void cleanTempDir()
{
foreach (string filePath in Directory.GetFiles(TEMPDIRPATH))
{
FileInfo fi = new FileInfo(filePath);
//created or modified less than x hours ago? then skip it and continue to the next file
if (fi.LastWriteTime > DateTime.Now.AddHours(-DELETEAFTERHOURS))
{
continue;
}
try
{
File.Delete(filePath);
}
catch (Exception)
{
//something happened and the file probably wasn't deleted; give it another shot next time
}
}
}
The code above removes files in the temp directory that were last written more than 8 hours ago.
However, I would suggest another approach. As Fredrik Johansson suggested, you can delete the files created by the user when the session ends. Better still, work with an extra directory based on the user's session ID inside your temp directory: when the session ends you simply delete the directory that was created for the user.
private const string TEMPDIRPATH = @"C:\mytempdir\";
string tempDirUserPath = Path.Combine(TEMPDIRPATH, HttpContext.Current.Session.SessionID);
private void removeTempDirUser(string path)
{
try
{
//true: also delete the files and subdirectories inside it
Directory.Delete(path, true);
}
catch (Exception)
{
//an exception occurred while deleting the directory.
}
}
Use the cache expiry notification to trigger file deletion:
private static void DeleteLater(string path)
{
HttpContext.Current.Cache.Add(path, path, null, Cache.NoAbsoluteExpiration, new TimeSpan(0, 8, 0, 0), CacheItemPriority.NotRemovable, UploadedFileCacheCallback);
}
private static void UploadedFileCacheCallback(string key, object value, CacheItemRemovedReason reason)
{
var path = (string) value;
Debug.WriteLine(string.Format("Deleting uploaded file '{0}'", path));
File.Delete(path);
}
ref: MSDN | How to: Notify an Application When an Item Is Removed from the Cache

Delete files from the folder older than 4 days

I would like to run a timer every 5 hours and delete the files older than 4 days from a folder. Could you please help with sample code?
DateTime CutOffDate = DateTime.Now.AddDays(-4);
DirectoryInfo di = new DirectoryInfo(folderPath);
FileInfo[] fi = di.GetFiles();
for (int i = 0; i < fi.Length; i++)
{
if (fi[i].LastWriteTime < CutOffDate)
{
File.Delete(fi[i].FullName);
}
}
You can substitute the LastWriteTime property with something else; that's just what I use when clearing out an image cache in an app I have.
EDIT:
Though this doesn't include the timer part... I'll let you figure that out yourself. A little googling should show you several ways to do it on a schedule.
Since it hasn't been mentioned, I would recommend using a System.Threading.Timer for something like this. Here's an example implementation:
System.Threading.Timer DeleteFileTimer = null;
private void CreateStartTimer()
{
TimeSpan InitialInterval = new TimeSpan(0, 0, 5); // first run 5 seconds after start
TimeSpan RegularInterval = new TimeSpan(5, 0, 0); // then every 5 hours
DeleteFileTimer = new System.Threading.Timer(QueryDeleteFiles, null,
InitialInterval, RegularInterval);
}
private void QueryDeleteFiles(object state)
{
//Delete Files Here... (Fires Every Five Hours).
//Warning: Don't update any UI elements from here without Invoke()ing
System.Diagnostics.Debug.WriteLine("Deleting Files...");
}
private void StopDestroyTimer()
{
DeleteFileTimer.Change(System.Threading.Timeout.Infinite,
System.Threading.Timeout.Infinite);
DeleteFileTimer.Dispose();
}
This way, you can run your file deletion code in a Windows service with minimal hassle.
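To tie the two snippets together, QueryDeleteFiles could simply run the deletion loop from the first answer; a sketch with a placeholder folder path:
private void QueryDeleteFiles(object state)
{
    DateTime cutOffDate = DateTime.Now.AddDays(-4);
    DirectoryInfo di = new DirectoryInfo(@"C:\myfolder"); // placeholder
    foreach (FileInfo fi in di.GetFiles())
    {
        if (fi.LastWriteTime < cutOffDate)
        {
            try { fi.Delete(); }
            catch (IOException) { /* file in use; retried on the next run */ }
        }
    }
}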
