Does anyone know of a .NET library where a file can be copied or moved without changing any of its timestamps? The functionality I am looking for exists in a program called robocopy.exe, but I would like it without having to ship that binary alongside my application.
Thoughts?
public static void CopyFileExactly(string copyFromPath, string copyToPath)
{
    var origin = new FileInfo(copyFromPath);
    origin.CopyTo(copyToPath, true);

    var destination = new FileInfo(copyToPath);
    destination.CreationTime = origin.CreationTime;
    destination.LastWriteTime = origin.LastWriteTime;
    destination.LastAccessTime = origin.LastAccessTime;
}
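A quick usage example (the paths here are placeholders for illustration, not taken from the question):

// Hypothetical paths; copies the file and carries its original timestamps over.
CopyFileExactly(@"C:\source\report.docx", @"D:\backup\report.docx");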
When executed without administrative privileges, Roy's answer will throw an UnauthorizedAccessException when it tries to overwrite an existing read-only file or to set the timestamps on a copied read-only file.
The following solution is based on Roy's answer but extends it to overwrite read-only files and to change the timestamps on copied read-only files, preserving the read-only attribute of the file, all while still executing without admin privileges.
public static void CopyFileExactly(string copyFromPath, string copyToPath)
{
    if (File.Exists(copyToPath))
    {
        var target = new FileInfo(copyToPath);
        if (target.IsReadOnly)
            target.IsReadOnly = false;
    }

    var origin = new FileInfo(copyFromPath);
    origin.CopyTo(copyToPath, true);

    var destination = new FileInfo(copyToPath);
    if (destination.IsReadOnly)
    {
        destination.IsReadOnly = false;
        destination.CreationTime = origin.CreationTime;
        destination.LastWriteTime = origin.LastWriteTime;
        destination.LastAccessTime = origin.LastAccessTime;
        destination.IsReadOnly = true;
    }
    else
    {
        destination.CreationTime = origin.CreationTime;
        destination.LastWriteTime = origin.LastWriteTime;
        destination.LastAccessTime = origin.LastAccessTime;
    }
}
You can read and write all of a file's timestamps using the FileInfo class:
CreationTime
LastAccessTime
LastWriteTime
You should be able to read the values you need, make whatever changes you wish, and then restore the previous values through the same FileInfo properties.
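As a rough sketch of that save-and-restore pattern (the helper name and the doWork callback are mine, purely for illustration):

// Capture the timestamps, let some operation modify the file, then restore them.
public static void WithTimestampsPreserved(string path, Action<string> doWork)
{
    var info = new FileInfo(path);
    DateTime created = info.CreationTime;
    DateTime written = info.LastWriteTime;
    DateTime accessed = info.LastAccessTime;

    doWork(path);

    info.Refresh();
    info.CreationTime = created;
    info.LastWriteTime = written;
    info.LastAccessTime = accessed;
}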
I have a UWP application which captures and processes images from a camera. The project leverages the Microsoft Cognitive Services Face Recognition API, and I have been exploring the application's existing functionality for a while now. My goal is that when a person is identified by the camera (through the Face Recognition API service), the application shows the image associated with that person.
To that end, the captured images are stored in a local directory on my machine. I want to retrieve the image file and render it on screen once the person is identified.
The code below shows the async Task method ProcessCameraCapture:
private async Task ProcessCameraCapture(ImageAnalyzer e)
{
    if (e == null)
    {
        this.UpdateUIForNoFacesDetected();
        this.isProcessingPhoto = false;
        return;
    }

    DateTime start = DateTime.Now;
    await e.DetectFacesAsync();

    if (e.DetectedFaces.Any())
    {
        string names;
        await e.IdentifyFacesAsync();
        this.greetingTextBlock.Text = this.GetGreettingFromFaces(e, out names);

        if (e.IdentifiedPersons.Any())
        {
            this.greetingTextBlock.Foreground = new SolidColorBrush(Windows.UI.Colors.GreenYellow);
            this.greetingSymbol.Foreground = new SolidColorBrush(Windows.UI.Colors.GreenYellow);
            this.greetingSymbol.Symbol = Symbol.Comment;
            GetSavedFilePhoto(names);
        }
        else
        {
            this.greetingTextBlock.Foreground = new SolidColorBrush(Windows.UI.Colors.Yellow);
            this.greetingSymbol.Foreground = new SolidColorBrush(Windows.UI.Colors.Yellow);
            this.greetingSymbol.Symbol = Symbol.View;
        }
    }
    else
    {
        this.UpdateUIForNoFacesDetected();
    }

    TimeSpan latency = DateTime.Now - start;
    this.faceLantencyDebugText.Text = string.Format("Face API latency: {0}ms", (int)latency.TotalMilliseconds);
    this.isProcessingPhoto = false;
}
In GetSavedFilePhoto, I pass the names string once the person is identified.
The code below shows the GetSavedFilePhoto method:
private void GetSavedFilePhoto(string personName)
{
    if (string.IsNullOrWhiteSpace(personName)) return;

    var directoryPath = @"D:\PersonImages";
    var directories = Directory.GetDirectories(directoryPath);
    var filePaths = Directory.GetFiles(directoryPath, "*.jpg", SearchOption.AllDirectories);
}
However, in the GetSavedFilePhoto method the directories variable comes back as an empty array when using the directoryPath string. "D:\PersonImages" is a valid, existing folder on my machine, and it contains subfolders with images inside. I also tried Directory.GetFiles to retrieve the jpg images, but it returned an empty array as well.
I think it should work, because I have used the Directory class several times, just never inside an async Task method. Does using async cause the files not to be returned when doing I/O operations?
Sorry for this stupid question, but I really don't understand.
Any help is greatly appreciated.
The Directory.GetFiles and Directory.GetDirectories methods can read files/folders in the application's local folder with the following code, but they cannot open D:\.
var directories = Directory.GetDirectories(ApplicationData.Current.LocalFolder.Path);
In a UWP app you can only access two locations by default (the application's local folder and its install folder); other locations require a capability declaration or a file/folder picker. For details, see the documentation on file access permissions.
If you need access to all files on D:\, the user must manually pick the D:\ drive (or a folder on it) using the FolderPicker; after that your app has permission to access the files in that location (see the FolderPicker sketch after the snippet below). To let the user pick a single file instead, use a FileOpenPicker:
var picker = new Windows.Storage.Pickers.FileOpenPicker();
picker.ViewMode = Windows.Storage.Pickers.PickerViewMode.Thumbnail;
picker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.ComputerFolder;
picker.FileTypeFilter.Add(".jpg");
picker.FileTypeFilter.Add(".jpeg");
picker.FileTypeFilter.Add(".png");

Windows.Storage.StorageFile file = await picker.PickSingleFileAsync();
if (file != null)
{
    // Application now has read/write access to the picked file
}
else
{
    // do some stuff
}
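And here is a rough sketch of the FolderPicker variant mentioned above; the "PersonImages" token name is made up, and the FutureAccessList call is only needed if you want to keep access to the folder across app launches:

var folderPicker = new Windows.Storage.Pickers.FolderPicker();
folderPicker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.ComputerFolder;
folderPicker.FileTypeFilter.Add("*"); // FolderPicker requires at least one filter entry

Windows.Storage.StorageFolder folder = await folderPicker.PickSingleFolderAsync();
if (folder != null)
{
    // Optional: remember the folder so it stays accessible in later sessions.
    Windows.Storage.AccessCache.StorageApplicationPermissions.FutureAccessList.AddOrReplace("PersonImages", folder);

    // Enumerate the files in the picked folder.
    var files = await folder.GetFilesAsync();
}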
I'm currently working on a program that updates templates on our company's Team Foundation Server. I have the new templates locally on disk and want to replace the existing ones on the server. I have tried different approaches, and this is my newest version. The problem is that either
the new file is "in use" when I access it from C# code (while it is not in use when I replace it at runtime through the normal Explorer), or
the replacement does not show up in the pending changes; the pendingChanges array is empty.
using (var tfs = TeamFoundationServerFactory.GetServer("myserver"))
{
    var versionControlServer = tfs.GetService(typeof(VersionControlServer)) as VersionControlServer;

    // Create a new workspace for the currently authenticated user.
    var workspace = versionControlServer.CreateWorkspace("Temporary Workspace", versionControlServer.AuthorizedUser);

    try
    {
        // Check if a mapping already exists.
        var workingFolder = new WorkingFolder("$serverpath", @"c:\tempFolder");

        // Create the mapping (if it exists already, it just overrides it, that is fine).
        workspace.CreateMapping(workingFolder);
        workspace.Get(VersionSpec.Latest, GetOptions.GetAll);

        string[] paths = new string[1];
        paths[0] = "test.pdf";
        workspace.PendEdit(paths, RecursionType.Full, null, LockLevel.None);

        // Go through the folder structure defined and create it locally, then check in the changes.
        CreateFolderStructure(workspace, workingFolder.LocalItem);

        // Check in the changes made.
        int a = workspace.CheckIn(workspace.GetPendingChanges(), "This is my comment");
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
    finally
    {
        // Cleanup the workspace.
        workspace.Delete();

        // Remove the temp folder used.
        Directory.Delete(@"C:\tempFolder", true);
    }
}

static void CreateFolderStructure(Workspace workspace, string initialPath)
{
    workspace.PendDelete("$serverpath/test.pdf", RecursionType.None);
    File.Copy(@"C:\test\testnew.pdf", @"C:\tempfolder\test", true);
    workspace.PendAdd(@"C:\tempfolder\test.pdf");
}
I found a solution to the problem. The workspace created for the AuthorizedUser was obviously not enough; it turned out I need a TeamFoundationIdentity to do it. Here is a guide on how to fix the issue:
http://blogs.msdn.com/b/taylaf/archive/2010/03/29/using-tfs-impersonation-with-the-version-control-client-apis.aspx
I need to do basic one-way file synchronization from a local machine to a remote server. I tried to use the Microsoft Sync Framework, and it works just fine. However, I need two features which I cannot get at the moment:
If file has been deleted on the destination, next synchronization should recreate it from the source
If file has been changed on the destination, next synchronization should replace it from the source
Is it possible to get that behaviour by using some option of the SyncOrchestrator.Synchronize() function?
The existing code is based on an MSDN article:
public static void SyncFileSystemReplicaOneWay(string sourcePath, string destinationPath)
{
    FileSyncProvider sourceProvider = null;
    FileSyncProvider destinationProvider = null;

    try
    {
        sourceProvider = new FileSyncProvider(sourcePath);
        destinationProvider = new FileSyncProvider(destinationPath);

        var agent = new SyncOrchestrator();
        agent.LocalProvider = sourceProvider;
        agent.RemoteProvider = destinationProvider;
        agent.Direction = SyncDirectionOrder.Upload;

        var stats = agent.Synchronize();
    }
    finally
    {
        if (sourceProvider != null) sourceProvider.Dispose();
        if (destinationProvider != null) destinationProvider.Dispose();
    }
}
Neither scenario will work out of the box without additional code.
When a sync runs, it detects changes on the source and applies them to the destination.
In your case the change was made on the destination, and the source has no way of knowing that you deleted a file on the destination, so that delete is not included in the change enumeration. If the file on the source is modified you will have better luck, because that results in a conflict and gives you the opportunity to override the delete on the destination with the file from the source.
The same applies to your second scenario.
An alternative is to run change detection on the destination, find out which files were deleted or updated, and grab those files again from the source (see the sketch below).
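As a rough illustration of that alternative, here is a minimal sketch that bypasses the Sync Framework entirely and simply mirrors the source onto the destination with System.IO, re-copying any file that is missing on the destination or whose size or last-write time differs; the class and method names are mine, not part of the Sync Framework:

using System;
using System.IO;

public static class OneWayMirror
{
    // Hypothetical helper: re-copy every source file whose destination copy
    // is missing or looks different (size or last-write time).
    public static void MirrorSourceToDestination(string sourcePath, string destinationPath)
    {
        foreach (var sourceFile in Directory.GetFiles(sourcePath, "*", SearchOption.AllDirectories))
        {
            var relative = sourceFile.Substring(sourcePath.Length).TrimStart(Path.DirectorySeparatorChar);
            var destFile = Path.Combine(destinationPath, relative);

            var src = new FileInfo(sourceFile);
            var dst = new FileInfo(destFile);

            if (!dst.Exists || dst.Length != src.Length || dst.LastWriteTimeUtc != src.LastWriteTimeUtc)
            {
                Directory.CreateDirectory(Path.GetDirectoryName(destFile));
                File.Copy(sourceFile, destFile, true);
            }
        }
    }
}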
I have a program with a FileSystemWatcher which watches for itself to be updated to a new version by an external program (which involves renaming the current executable and copying a new one in its place).
The problem is, when the file it's watching is in the Program Files directory, the FileVersionInfo.GetVersionInfo() doesn't get the new version information, it returns the same thing it got the first time. So if it updated from 1.1 to 1.2, it would say "Upgraded from 1.1 to 1.1" instead of "Upgraded from 1.1 to 1.2". It works correctly in the debug directory, but under Program Files, it won't get the correct value.
Here's the essence of what it's doing, without all the exception handling and disposing and logging and thread invoking and such:
string oldVersion;
long oldSize;
DateTime oldLastModified;
FileSystemWatcher fs;
string fullpath;
public void Watch()
{
    fullpath = Assembly.GetEntryAssembly().Location;
    oldVersion = FileVersionInfo.GetVersionInfo(fullpath).ProductVersion;

    var fi = new FileInfo(fullpath);
    oldSize = fi.Length;
    oldLastModified = fi.LastWriteTime;

    fs = new FileSystemWatcher(
        Path.GetDirectoryName(fullpath), Path.GetFileName(fullpath));
    fs.Changed += FileSystemEventHandler;
    fs.Created += FileSystemEventHandler;
    fs.EnableRaisingEvents = true;
}

void FileSystemEventHandler(object sender, FileSystemEventArgs e)
{
    if (string.Equals(e.FullPath, fullpath, StringComparison.OrdinalIgnoreCase))
    {
        var fi = new FileInfo(fullpath);
        if (fi.Length != oldSize
            || fi.LastWriteTime != oldLastModified)
        {
            var newversion = FileVersionInfo.GetVersionInfo(fullpath).ProductVersion;
            NotifyUser(oldVersion, newversion);
        }
    }
}
How do I make GetVersionInfo() refresh to see the new version? Is there something else I should be calling instead?
I'm answering my own question because there doesn't seem to be much interest. If anyone has a better answer, I'll accept that instead...
As far as I can tell, there is no way to make it refresh. Instead I worked around the issue:
return AssemblyName.GetAssemblyName(fullpath).Version.ToString();
Combined with code that makes sure it only gets called once, it seems to work just fine.
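For context, this is roughly how the workaround slots into the handler from the question; the helper name is mine, and note that AssemblyName.GetAssemblyName reads the assembly version rather than FileVersionInfo's ProductVersion, so the two can differ if they are not kept in sync:

// Hypothetical helper wrapping the workaround; reads the assembly version from the file on disk.
static string GetCurrentVersion(string fullpath)
{
    return AssemblyName.GetAssemblyName(fullpath).Version.ToString();
}

// In FileSystemEventHandler, use it instead of FileVersionInfo:
// var newversion = GetCurrentVersion(fullpath);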
I have an application written in C#, and I want to write some information to the hidden ProgramData directory so that the same connection string can be read from both the application's front end and back end.
I am accessing the directory using path variables as follows:
private bool ProgramDataWriteFile(string contentToWrite)
{
    try
    {
        string strProgramDataPath = "%PROGRAMDATA%";
        string directoryPath = Environment.ExpandEnvironmentVariables(strProgramDataPath) + "\\MyApp\\";
        string path = Environment.ExpandEnvironmentVariables(strProgramDataPath) + "\\MyApp\\ConnectionInfo.txt";

        if (Directory.Exists(directoryPath))
        {
            System.IO.StreamWriter file = new System.IO.StreamWriter(path);
            file.Write(contentToWrite);
            file.Close();
        }
        else
        {
            Directory.CreateDirectory(directoryPath);
            System.IO.StreamWriter file = new System.IO.StreamWriter(path);
            file.Write(contentToWrite);
            file.Close();
        }
        return true;
    }
    catch (Exception e)
    {
    }
    return false;
}
This seems to work correctly. However, my question is: when I used the path variable %AllUsersProfile%(%PROGRAMDATA%) instead, it expanded into an illegal (and redundant) file path: C:\ProgramData(C:\ProgramData)\
I thought that the latter was the correct full name. Was I just using it incorrectly? I need to ensure that this connection info is accessible to all users; will just using %PROGRAMDATA% allow that? I am using Windows 7, in case that is relevant.
From here:
FOLDERID_ProgramData / System.Environment.SpecialFolder.CommonApplicationData
The user would never want to browse here in Explorer, and settings changed here should affect every user on the machine. The default location is %systemdrive%\ProgramData, which is a hidden folder, on an installation of Windows Vista. You'll want to create your directory and set the ACLs you need at install time.
So, just use %PROGRAMDATA%, or better still:
Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData)
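Putting that together with the question's scenario, a minimal sketch might look like this (the "MyApp" folder and file name are taken from the question; everything else is standard BCL):

// Build %PROGRAMDATA%\MyApp\ConnectionInfo.txt without hard-coding the drive or folder name.
string commonData = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
string directoryPath = Path.Combine(commonData, "MyApp");
string filePath = Path.Combine(directoryPath, "ConnectionInfo.txt");

Directory.CreateDirectory(directoryPath);      // no-op if it already exists
File.WriteAllText(filePath, contentToWrite);   // contentToWrite comes from the question's method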