In my project I have created a logging system that is basically a shell on top of NLog. I am trying to unit test the archive feature of the logging system. It is currently set up to do rolling archives with a maximum of 5 archive files (NLog is configured in code; no configuration file is used):
var myFileTarget = new FileTarget();
LogConfig.AddTarget("file", myFileTarget);
myFileTarget.FileName = LogFile;
myFileTarget.Layout = LogFileLayout;
myFileTarget.AutoFlush = true;
//Archive specifics
var token = "{#}";
var archiveFileName = $"{Path.GetFileNameWithoutExtension(LogFile)}.{token}{Path.GetExtension(LogFile)}"; // GetExtension already includes the dot
myFileTarget.ArchiveFileName = archiveFileName;
myFileTarget.ArchiveNumbering = ArchiveNumberingMode.Rolling;
myFileTarget.ArchiveEvery = toFileArchivePeriod(LogStyle);
myFileTarget.EnableFileDelete = true;
myFileTarget.MaxArchiveFiles = TTL; //Time to Live
myFileTarget.DeleteOldFileOnStartup = true;
To simulate that a lot of logs already exist, I create a range of log files with the same structure as the ArchiveFileName above:
LogFileName = "LogTTLDailyTest.log";
LogStyle = "daily";
LogTTL = 5; //Time To Live
//Arrange old filelogs
for (int i = 0; i < 100; i++)
{
var filename = $"LogTTLDailyTest.{i}.log";
File.WriteAllText(TestLogsDirectory + filename, "UNITTEST");
var creationTime = DateTime.Now.AddDays((i + 1) * -1);
File.SetCreationTime(TestLogsDirectory + filename, creationTime);
File.SetLastWriteTime(TestLogsDirectory + filename, creationTime);
}
But when I write a log entry through my logging system, NLog does not see the old log files I created and therefore does not delete them. It does, however, clean up the old current log file, so deleting files works:
NLog: 2017-07-20 12:54:20.7434 Info Closing old configuration.
NLog: 2017-07-20 12:54:20.7434 Info Found 36 configuration items
NLog: 2017-07-20 12:54:20.7594 Info Found 36 configuration items
NLog: 2017-07-20 12:54:26.9157 Info Deleting old archive file: 'C:\<projectpath>\bin\Debug\unittestlogsea984b05-3c33-4142-9d1a-c900bad89006\LogTTLDailyTest.log'.
My current theory is that NLog sees the old logs but has some kind of validation process for the contents of the files, which I only fill with "UNITTEST", but I haven't been able to "restart" NLog or force it to see the logs.
Hope you can help me.
There is no issue here, just me not setting the test up right. What I forgot is that rolling does not delete, but merely renames all files to the next rolling number. The last file in this rolling sequence is deleted; any other file is ignored by NLog. I also forgot to set my current log file a day back, which meant NLog saw a current file that was not a day old. But because I had DeleteOldFileOnStartup activated, it deleted that file.
So, to fix my mistake, I made sure I create only as many files as needed (or fewer), made sure the latest file did not have a rolling number and was a day old, and removed the DeleteOldFileOnStartup option. NLog is now doing what I expect it to do, flawlessly.
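For reference, roughly how the corrected arrangement looks. This is only a sketch: it assumes the same TestLogsDirectory, LogTTL and log file name used above, not the exact test code.

// Sketch: create at most LogTTL archive files plus a current log file
// that is one day old, so the next write makes NLog roll and prune.
var yesterday = DateTime.Now.AddDays(-1);

// Current log file without a rolling number, backdated one day
File.WriteAllText(TestLogsDirectory + "LogTTLDailyTest.log", "UNITTEST");
File.SetCreationTime(TestLogsDirectory + "LogTTLDailyTest.log", yesterday);
File.SetLastWriteTime(TestLogsDirectory + "LogTTLDailyTest.log", yesterday);

// Existing rolling archives 0..LogTTL-1, each a day older than the previous one
for (int i = 0; i < LogTTL; i++)
{
    var filename = $"LogTTLDailyTest.{i}.log";
    var stamp = DateTime.Now.AddDays(-(i + 2));
    File.WriteAllText(TestLogsDirectory + filename, "UNITTEST");
    File.SetCreationTime(TestLogsDirectory + filename, stamp);
    File.SetLastWriteTime(TestLogsDirectory + filename, stamp);
}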
I have a solution I created a little over a year ago that uses PnP.Core to upload a file to a specific folder on SharePoint. All was well until a couple of days ago, when that solution started generating an error that says:
To update this folder, go to the channel in Microsoft Teams
I am at a bit of a loss as to why and what is causing this.
Below is a minimal code sample of what I have. I should mention that the folder does get created, but the upload of the file to the folder fails with said error.
Any pointers would be greatly appreciated.
// authenticate using officeDev.PnP.Core.AuthenticationManager
...
Folder Root_Folder = web.GetFolderByServerRelativeUrl(Root_Folder_Relative_Url_Path);
//Create new subFolder to load files into
string Folder_Name = _Folder_Name;
Root_Folder.Folders.Add(Folder_Name);
Root_Folder.Update();
//Add file to new Folder
Folder Subject_Folder = web.GetFolderByServerRelativeUrl(Root_Folder_Relative_Url_Path + "/" + Folder_Name);
FileCreationInformation Subject_Result_File = new FileCreationInformation {
ContentStream = new MemoryStream(_File_To_Upload),
Url = _File_Name,
Overwrite = true
};
Microsoft.SharePoint.Client.File uploadFile = Subject_Folder.Files.Add(Subject_Result_File);
Subject_Folder.Update();
Client_Ctx.ExecuteQuery();
Looks like the Update method was the issue. Removing it and just letting ExecuteQuery handle all the operations fixed it.
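For completeness, a rough sketch of the working version (the same context, variables and authentication are assumed as in the question; this is not the exact code):

// Sketch: the Update() calls are dropped and ExecuteQuery() commits everything.
Folder Root_Folder = web.GetFolderByServerRelativeUrl(Root_Folder_Relative_Url_Path);
Folder Subject_Folder = Root_Folder.Folders.Add(_Folder_Name);

FileCreationInformation Subject_Result_File = new FileCreationInformation {
    ContentStream = new MemoryStream(_File_To_Upload),
    Url = _File_Name,
    Overwrite = true
};
Microsoft.SharePoint.Client.File uploadFile = Subject_Folder.Files.Add(Subject_Result_File);
Client_Ctx.ExecuteQuery();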
I'm using NLog and I need to change the name of the default log file to include information such as a company name. I used this code a long time ago in a console app and it renamed the file as expected.
I'm now trying to use the same code in a new app, and it's creating a new log file instead of just renaming the current one. For example, I now have two files (2019-10-07.log and 2019-10-07_CompanyName.log). The default log has a few initial log entries, and then the remainder of the logs go into the new one.
Looking for any suggestions. I've been searching for fixes but everything points me back to the code I'm already using.
NLog v4.6.7
fileNameOnly = "CompanyName";
FileTarget defaultTarget = FindNLogTargetByName("DefaultTarget");
defaultTarget.FileName = logDirectory + string.Format("{0:yyyy-MM-dd}", DateTime.Now) + "_" + fileNameOnly + ".log";
LogManager.ReconfigExistingLoggers();
NLog doesn't support renaming an existing file. If a new file name is used, all the logs will be appended to the new file.
So to keep the existing entries under the new name, you need to move the file yourself with System.IO.File.Move(path, newPath) and then change the file name in NLog.
Unfortunately it's a bit tricky when doing high volume logging, as NLog will recreate the old log file until the target is changed.
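A rough sketch of that approach, reusing FindNLogTargetByName, logDirectory and fileNameOnly from the question; the path handling is illustrative, not a drop-in fix:

// Illustrative only: move the current file, then point the target at the new name.
string oldPath = logDirectory + string.Format("{0:yyyy-MM-dd}", DateTime.Now) + ".log";
string newPath = logDirectory + string.Format("{0:yyyy-MM-dd}", DateTime.Now) + "_" + fileNameOnly + ".log";

// Move the entries written so far under the new name
if (System.IO.File.Exists(oldPath) && !System.IO.File.Exists(newPath))
    System.IO.File.Move(oldPath, newPath);

// Point the file target at the new name and apply it to existing loggers
FileTarget defaultTarget = FindNLogTargetByName("DefaultTarget");
defaultTarget.FileName = newPath;
LogManager.ReconfigExistingLoggers();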
NLog can load settings (like Company name) from app.config or appsettings.json.
Just update your NLog.config to reference the setting. Ex.
<target type="file" name="myfile" fileName="${appsetting:CompanyName}${shortdate}.log" />
See also: https://github.com/NLog/NLog/wiki/AppSetting-Layout-Renderer (.NET Framework)
See also: https://github.com/NLog/NLog/wiki/ConfigSetting-Layout-Renderer (.NET Core)
I have a situation here. I want to read files based on their creation or last modified time. Initially I used a FileSystemWatcher so that I was notified when a new file arrived, but later I realized that if the system on which my software is running goes down or restarts, files will keep being dropped in the watched location and the watcher will never see them.
To make it easier to understand, I will give an example:
System A - File Server (Files are created every 2 min in a directory on this server)
System B - My Software will run and Monitor files from the Path of System A
If System B restarts and is up again after 10 minutes, the FileSystemWatcher will have missed all the files generated in those 10 minutes.
How can I ensure that the files generated in those 10 minutes are also captured?
Let me know if my question is still not understandable.
If you don't want to split it up into two systems, you have to persist a little bit of information.
You could store the current timestamp in a file every time a new event is fired by the FileSystemWatcher. Every time your service starts, you can read all files from the file system that are newer than the last timestamp. This way you shouldn't miss a file.
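A minimal sketch of that idea (markerPath, watchPath and ProcessFile are hypothetical names; System.IO is assumed to be imported):

// Catch-up on startup: process everything newer than the last persisted timestamp.
string markerPath = @"C:\MyService\lastseen.txt";   // hypothetical marker file
string watchPath = @"\\SystemA\drop";               // monitored directory on System A

DateTime lastSeen = File.Exists(markerPath)
    ? DateTime.Parse(File.ReadAllText(markerPath))
    : DateTime.MinValue;

foreach (string path in Directory.GetFiles(watchPath))
{
    if (File.GetLastWriteTime(path) > lastSeen)
        ProcessFile(path);                           // hypothetical handler
}

// In the FileSystemWatcher event handler, persist the time after processing:
// watcher.Created += (s, e) =>
// {
//     ProcessFile(e.FullPath);
//     File.WriteAllText(markerPath, DateTime.Now.ToString("o"));
// };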
I would split this application into two parts and run a FileSystemWatcher WCF service that buffers the files created in those 10 minutes and sends them to System B when it is restarted. I can't see another way, sorry.
I think the FileSystemWatcher should write information about the file system into a database (or another type of storage). When System B starts, the watcher compares the current file system with this information and raises events for the changes.
Copy all the files from the source machine and paste them into the destination, subject to a condition:
string dirPath = @"C:\A";
string DestinPath = @"C:\B";
if (Directory.Exists(dirPath) && Directory.Exists(DestinPath))
{
    DirectoryInfo di = new DirectoryInfo(dirPath);
    foreach (var file in di.GetFiles())
    {
        string destinFile = DestinPath + "\\" + file.Name;
        // Skip files that already exist in the destination
        if (File.Exists(destinFile))
        {
            continue;
        }
        file.CopyTo(destinFile);
    }
}
Not sure if I understood your question correctly, but based on what I get, and assuming both systems are in sync in terms of time, if for example you want to get files that were modified within the last ten minutes:
DateTime tenMinutesAgo = DateTime.Now.AddMinutes(-10);
string[] systemAFiles = System.IO.Directory.GetFiles(systemAPath);
foreach (string file in systemAFiles)
{
    DateTime lastWriteTime = System.IO.File.GetLastWriteTime(file);
    if (lastWriteTime > tenMinutesAgo) // written within the last ten minutes
    {
        // read file
    }
}
I understood that these files are "generated", so they have been created or modified. If they have simply been moved from one folder to another, this will not work. In that case the best approach is to write a snapshot of the files in that folder (saving it to some sort of file) and compare against it when the service is running again, as in the sketch below.
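A rough sketch of that snapshot idea (snapshotPath and watchPath are illustrative; System.IO and System.Collections.Generic are assumed to be imported):

// Persist the file names seen so far and diff against them on startup.
string snapshotPath = @"C:\MyService\snapshot.txt"; // illustrative
string watchPath = @"\\SystemA\drop";               // monitored directory

var known = File.Exists(snapshotPath)
    ? new HashSet<string>(File.ReadAllLines(snapshotPath))
    : new HashSet<string>();

foreach (string path in Directory.GetFiles(watchPath))
{
    if (known.Add(path))
    {
        // The file appeared (or was moved here) while the service was down: handle it.
    }
}

File.WriteAllLines(snapshotPath, known);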
I am writing a script task in my SSIS package to delete the log file being used by the package if it is older than X number of days. Here is my current code:
if (File.Exists((String)Dts.Variables["ErrorLogLocation"].Value))
{
    DateTime logCreatedDate = File.GetCreationTime((String)Dts.Variables["ErrorLogLocation"].Value);
    if (logCreatedDate < DateTime.Now.AddDays(-3))
    {
        try
        {
            File.Delete((String)Dts.Variables["ErrorLogLocation"].Value);
        }
        catch (Exception e)
        {
            using (StreamWriter sw = File.AppendText((String)Dts.Variables["ErrorLogLocation"].Value))
            {
                sw.WriteLine(DateTime.Now + " : " + e);
            }
        }
        using (StreamWriter sw = File.AppendText((String)Dts.Variables["ErrorLogLocation"].Value))
        {
            sw.WriteLine("New log Creation Date: " + File.GetCreationTime((String)Dts.Variables["ErrorLogLocation"].Value));
        }
    }
}
It seems like it should work, but the issue is that when I delete the file and then write to a new file with the same name, the information inside is wiped as expected, but the file creation date remains the same as before the deletion. This is an issue because I am basing when to delete this file on that datetime. The expected behavior in my mind is that it would delete the file, then write to a completely new file with a new creation date.
Any idea what may be causing this? Is it because I am deleting a file and then immediately appending to a file with the same name (which I would assume just creates a new file if it doesn't exist)? Is the file still being kept in memory or something during that period?
From the documentation:
NTFS-formatted drives may cache information about a file, such as file creation time, for a short period of time. As a result, it may be necessary to explicitly set the creation time of a file if you are overwriting or replacing an existing file.
Based on this information, you must call File.SetCreationTime to make sure it gets the desired timestamp.
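A small sketch of that, based on the variables from the question:

// After deleting and recreating the log, explicitly reset the creation time
// so the age check above works the next time the package runs.
string logPath = (string)Dts.Variables["ErrorLogLocation"].Value;

File.Delete(logPath);
using (StreamWriter sw = File.AppendText(logPath))
{
    sw.WriteLine(DateTime.Now + " : log recreated");
}
File.SetCreationTime(logPath, DateTime.Now);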
I feel kind of stupid posting this, but it seems like a genuine issue that I've made sufficiently simple so as to demonstrate that it should not fail. As part of my work I am responsible for maintaining build systems that take files under version control, and copy them to other locations. Sounds simple, but I've constantly experienced file access violations when attempting to copy files that I've supposedly already set as 'Normal'.
The code sample below simply creates a set of test files, makes them read only, and then copies them over to another folder. If the files already exist in the destination folder, the RO attribute is cleared so that the file copy will not fail.
The code works to a point, but at seemingly random points an exception is thrown when the file copy is attempted. The code is all single threaded, so unless .NET is doing something under the hood that causes a delay on the setting of attributes I can't really explain the problem.
If anyone can explain why this is happening I'd be interested. I'm not looking for a solution unless there is something I am definitely doing wrong, as I've handled the issue already, I'm just curious as no one else seems to have reported anything related to this.
After a few iterations I get something like:
A first chance exception of type 'System.UnauthorizedAccessException' occurred in mscorlib.dll
Additional information: Access to the path 'C:\TempFolderB\TEMPFILENAME8903.txt' is denied.
One other fact: if you get the file attributes BEFORE the file copy, the resulting state says the file attributes are indeed Normal, yet examination of the local file shows it as Read Only.
/// <summary>
/// Test copying multiple files from one folder to another while resetting RO attr
/// </summary>
static void MultiFileCopyTest()
{
    /// Temp folders for our test files
    string folderA = @"C:\TempFolderA";
    string folderB = @"C:\TempFolderB";

    /// Number of files to create
    const int fileCount = 10000;

    /// If the test folders do not exist populate them with some test files
    if (System.IO.Directory.Exists(folderA) == false)
    {
        const int bufferSize = 32768;
        System.IO.Directory.CreateDirectory(folderA);
        System.IO.Directory.CreateDirectory(folderB);
        byte[] tempBuffer = new byte[bufferSize];

        /// Create a bunch of files and make them all Read Only
        for (int i = 0; i < fileCount; i++)
        {
            string filename = folderA + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";
            if (System.IO.File.Exists(filename) == false)
            {
                System.IO.FileStream str = System.IO.File.Create(filename);
                str.Write(tempBuffer, 0, bufferSize);
                str.Close();
            }

            /// Ensure files are Read Only
            System.IO.File.SetAttributes(filename, System.IO.FileAttributes.ReadOnly);
        }
    }

    /// Number of iterations around the folders
    const int maxIterations = 100;
    for (int idx = 0; idx < maxIterations; idx++)
    {
        Console.WriteLine("Iteration {0}", idx);

        /// Loop for copying all files after resetting the RO attribute
        for (int i = 0; i < fileCount; i++)
        {
            string filenameA = folderA + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";
            string filenameB = folderB + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";
            try
            {
                if (System.IO.File.Exists(filenameB) == true)
                {
                    System.IO.File.SetAttributes(filenameB, System.IO.FileAttributes.Normal);
                }
                System.IO.File.Copy(filenameA, filenameB, true);
            }
            catch (System.UnauthorizedAccessException ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
}
(This isn't a full answer, but I don't have enough reputation yet to post comments...)
I don't think you are doing anything wrong, when I run your test code I can reproduce the problem every time. I have never got past 10 iterations without the error occurring.
I did a further test, which might shed some light on the issue:
1. I set all of the files in TempFolderA to hidden.
2. I then ensured all of the files in TempFolderB were NOT hidden.
3. I put a break-point on Console.WriteLine(ex.Message).
4. I ran the code; if it got past iteration 1, I stopped, reset the hidden attributes and ran again.
5. After a couple of tries, I got a failure in the 1st iteration, so I opened Windows Explorer on TempFolderB and scrolled down to the problematic file.
The file was 0 bytes in size, but had RHA attributes set.
Unfortunately I have no idea why this is. Process Monitor doesn't show any other activity which could be relevant.
Well, right in the documentation for the System.IO.File.SetAttributes(string path, System.IO.FileAttributes attributes) method, I found the following:
Exceptions:
System.UnauthorizedAccessException:
path specified a file that is read-only.
-or- This operation is not supported on the current platform.
-or- The caller does not have the required permission.
So, if I had to guess, the file in the destination (e.g. filenameB) did in fact already exist. It was marked Read-Only, and so, the exception was thrown as per the documentation above.
Instead, what you need to do is remove the Read-Only attribute via an inverse bit mask:
if (File.Exists(filenameB))
{
    // Remove the read-only attribute
    FileAttributes attributes = File.GetAttributes(filenameB);
    attributes &= ~FileAttributes.ReadOnly;
    File.SetAttributes(filenameB, attributes);

    // Or reset everything to Normal
    // (you can't OR the Normal attribute with other attributes--see MSDN)
    File.SetAttributes(filenameB, FileAttributes.Normal);
}
To be fair, the documentation on the SetAttributes method isn't really clear about how to go about setting file attributes once a file is marked as ReadOnly. Sure, there's an example (using the Hidden attribute), but it doesn't explicitly say that you need to use an inverted bitmask to remove the Hidden or ReadOnly attributes. One could easily assume it's just how they chose to "unset" the attribute. It's also not clear from the documentation what would happen, for instance, if you marked the file thusly:
File.SetAttributes(pathToFile, FileAttributes.Normal);
File.SetAttributes(pathToFile, FileAttributes.Archive);
Does this result in the file first having Normal attributes set and then only Archive, or does it result in the file having Normal set and then additionally having Archive set, resulting in a Normal but Archived file? I believe it's the former, rather than the latter, based on how attributes are "removed" from a file using the inverted bitmask.
If anyone finds anything contrary, please post a comment and I'll update my answer accordingly.
HTH.
This may be caused by the user not having sufficient permission to access the file system.
Workaround:
1. Try to run the application in administrative mode, or
2. Try to run Visual Studio in administrative mode (if you are using the debugger).