I have a large number of async target wrappers created like this:
var fileTarget = new FileTarget("file_" + unique_id)
{
    FileName = $"{unique_id}.log",
    Layout = layout_string,
    DeleteOldFileOnStartup = false,
    KeepFileOpen = true,
    AutoFlush = false
};
var asyncFileTarget =
new NLog.Targets.Wrappers.AsyncTargetWrapper(fileTarget, 10000, NLog.Targets.Wrappers.AsyncTargetWrapperOverflowAction.Block);
asyncFileTarget.Name = unique_id;
m_job_logger_factory.Configuration.AddTarget(asyncFileTarget);
m_job_logger_factory.Configuration.AddRule(TranslateEnum(logLevel), NLog.LogLevel.Fatal, asyncFileTarget, assignedLogger.Name);
m_job_logger_factory.ReconfigExistingLoggers();
Each target has a separate logger assigned to it.
Then I log a couple of lines to each of the created targets through their assigned loggers.
Nothing is written to the files yet, as the targets are async and their queues are not full.
Now, if I do
Target t = m_job_logger_factory.Configuration.FindTargetByName(unique_id);
t.Flush((e) => { });
I'd expect only the unique_id file to have the new logs; instead, I find all the files updated (as if I had called flush on the LogManager/LogFactory rather than on this specific target).
Is this configuration incorrect or is this a bug with target.Flush(AsyncContinuation)?
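(For completeness, a minimal sketch of waiting on the flush continuation, assuming the target found by name is the async wrapper registered above; the wait is only there so the files can be inspected afterwards:)
// Wait for the flush continuation before checking the files on disk.
Target t = m_job_logger_factory.Configuration.FindTargetByName(unique_id);
using (var flushed = new ManualResetEventSlim(false))
{
    t.Flush(ex => flushed.Set());   // AsyncContinuation receives any flush exception
    flushed.Wait(TimeSpan.FromSeconds(5));
}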
I'm trying to use a PreparationTask to grab my ResourceFiles to be used as input data.
My prep task looks like this:
myJob.JobPreparationTask = new JobPreparationTask { CommandLine = jobPrepCmdLine };
How do I configure my job with a PreparationTask to download ResourceFiles from my AutoStorageContainer to the pool VMs?
I tried:
var inputFiles = new List<ResourceFile> { };
var file = ResourceFile.FromAutoStorageContainer("fgrp-jill2");
inputFiles.Add(file);
myJob.JobPreparationTask.ResourceFiles = inputFiles;
But I get a null object error, even though inputFiles.Add shows at least one file was recognized.
You should use the Storage SDK in conjunction with Batch in this scenario. You can follow this as an example: https://learn.microsoft.com/en-us/azure/batch/quick-run-dotnet#preliminaries
A job preparation task functions very similarly to a regular Start Task in that it runs before the tasks execute. In the example from the link, you'll see that we reference the blob client, container name, and file path. I'll paste the sample here:
List<string> inputFilePaths = new List<string>
{
"taskdata0.txt",
"taskdata1.txt",
"taskdata2.txt"
};
List<ResourceFile> inputFiles = new List<ResourceFile>();
foreach (string filePath in inputFilePaths)
{
inputFiles.Add(UploadFileToContainer(blobClient, inputContainerName, filePath));
}
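Building on that sample, a hedged sketch of wiring the resulting files into the preparation task could look like this (batchClient, jobId, poolInfo and jobPrepCmdLine are placeholders, and UploadFileToContainer is assumed to return a ResourceFile built from a blob SAS URL as in the quickstart):
// Sketch: attach the uploaded ResourceFiles to the job preparation task.
CloudJob myJob = batchClient.JobOperations.CreateJob(jobId, poolInfo);
myJob.JobPreparationTask = new JobPreparationTask { CommandLine = jobPrepCmdLine };
myJob.JobPreparationTask.ResourceFiles = inputFiles;   // the list built in the loop above
myJob.Commit();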
I have two projects, A and B, and both of them use the same NLog library. Now I have an issue:
If A writes log info into the log file first, then B never logs. And if B writes log info into the log file first, then A never logs.
Since A and B use the same NLog library, they use the same NLog config, but they run as two separate processes. Here is the config info.
Does somebody have a good idea about this issue?
//Set NLog Config by:
//https://github.com/nlog/NLog/wiki/Configuration-API
private static Logger GenerateLogInstance()
{
// Step 1. Create configuration object
var config = new LoggingConfiguration();
// Step 2. Create targets
var fileTarget = new FileTarget()
{
FileName = #"C:\Logs\${shortdate}.log",
Layout = #"${longdate} ${uppercase:${level}} ${message}${onexception:${newline}EXCEPTION\: ${exception:format=ToString}}"
};
//var wrapper = new AsyncTargetWrapper(fileTarget, 5000, AsyncTargetWrapperOverflowAction.Discard);
// Step 3. Define rules
config.AddTarget("myprojectLog", fileTarget);
config.LoggingRules.Add(new NLog.Config.LoggingRule("*", NLog.LogLevel.Trace, fileTarget));
// Step 4. Activate the configuration
var factory = new LogFactory(config);
return factory.GetLogger("myprojectLog");
}
I don't use NLog, but take a look at the following. You may need to set concurrentWrites="true".
File target
concurrentWrites - Enables support for optimized concurrent writes to
same log file from multiple processes on the same machine-host, when
using keepFileOpen = true. By using a special technique that lets it
keep the files open from multiple processes. If only single process
(and single AppDomain) application is logging, then it is faster to
set to concurrentWrites = False. Boolean Default: True. Note: in UWP
this setting should be false
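As a hedged sketch, the equivalent in the code-based configuration from the question would be to set the corresponding FileTarget properties (verify the property names against the NLog version in use):
// Sketch: allow two processes to append to the same log file.
var fileTarget = new FileTarget()
{
    FileName = @"C:\Logs\${shortdate}.log",
    Layout = @"${longdate} ${uppercase:${level}} ${message}",
    KeepFileOpen = true,
    ConcurrentWrites = true   // multi-process writes to the same file
};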
Could you try this instead:
private static LogFactory GenerateLogFactory()
{
// Step 0. Create isolated LogFactory
var logFactory = new LogFactory();
// Step 1. Create configuration object
var config = new LoggingConfiguration(logFactory);
// Step 2. Create targets
var fileTarget = new FileTarget()
{
FileName = #"C:\Logs\${shortdate}.log",
Layout = #"${longdate} ${uppercase:${level}} ${message}${onexception:${newline}EXCEPTION\: ${exception:format=ToString}}"
};
// Step 3. Define rules
config.AddTarget("myprojectLog", fileTarget);
config.LoggingRules.Add(new NLog.Config.LoggingRule("*", NLog.LogLevel.Trace, fileTarget));
// Step 4. Activate the configuration
logFactory.Configuration = config;
return logFactory;
}
private static Logger GenerateLogInstance()
{
return GenerateLogFactory().GetLogger("myprojectLog");
}
Btw., if two projects in the same solution are using this same method, then you can consider doing this:
private static readonly Lazy<LogFactory> LazyLogFactory = new Lazy<LogFactory>(() => GenerateLogFactory());
private static Logger GenerateLogInstance(string loggerName = "myprojectLog")
{
return LazyLogFactory.Value.GetLogger(loggerName);
}
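A typical usage sketch in each project (names illustrative) would then cache the logger once per class:
// Illustrative usage: resolve the logger once and reuse it.
private static readonly Logger Log = GenerateLogInstance("ProjectA");
public void DoWork()
{
    Log.Info("Work started");
}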
[BeforeFeature]
public static void BeforeFeature()
{
featureTitle = $"{FeatureContext.Current.FeatureInfo.Title}";
featureRollFileAppender = new RollingFileAppender
{
AppendToFile = true,
StaticLogFileName = true,
Threshold = Level.All,
Name = "FeatureAppender",
File = "test.log",
Layout = new PatternLayout("%date %m%newline%exception"),
};
featureRollFileAppender.ActivateOptions();
log.Info("test");
}
I am attempting to use log4net to output a simple string; however, once the file has been generated, it does not contain any data.
No errors are thrown and the test does complete successfully.
It turns out that the previously selected RollingFileAppender was still open and I needed to select the other RollingFileAppender; this is one of the pitfalls of using multiple log files. Once this was resolved, the Info() method wrote to my desired log file.
I was able to resolve my issue by adding the following code:
BasicConfigurator.Configure(nameRunRollFileAppender);
log = LogManager.GetLogger(typeof(Tracer));
log.Info("Output some data");
The latest Microsoft.Azure.WebJobs.ServiceBus package gives you the ability to receive batches of messages from Event Hubs. I would like to set how many messages I want to receive in a batch.
The core Service Bus library allows you to call an overload of the Receive() function and provide the batch size.
How does one do this in the initial configuration of an Event Hubs receiver, or is something else required?
You can do this in Functions via the eventHub configuration block in host.json as described here. E.g.:
{
"eventHub": {
"maxBatchSize": 500,
"prefetchCount": 100
}
}
We apply those configuration settings to the EventProcessorOptions when we create the EventProcessorHost (see here).
Steph,
The MaxBatchSize can be configured through EventProcessorOptions; you can pass the options as a parameter when creating a new EventHubConfiguration.
var options = EventProcessorOptions.DefaultOptions;
options.MaxBatchSize = 50;
var eventHubConfig = new EventHubConfiguration(options);
string eventHubName = "MyHubName";
eventHubConfig.AddSender(eventHubName, "Endpoint=sb://test.servicebus.windows.net/;SharedAccessKeyName=SendRule;SharedAccessKey=xxxxxxxx");
eventHubConfig.AddReceiver(eventHubName, "Endpoint=sb://test.servicebus.windows.net/;SharedAccessKeyName=ReceiveRule;SharedAccessKey=yyyyyyy");
var config = new JobHostConfiguration();   // the JobHost configuration that the Event Hub settings attach to
config.UseEventHub(eventHubConfig);
JobHost host = new JobHost(config);
As you can see in the source code of EventHubConfiguration.cs, if no EventProcessorOptions is specified, the MaxBatchSize is set to 1000 instead of the EventProcessorOptions default of 10.
public EventHubConfiguration(
EventProcessorOptions options,
PartitionManagerOptions partitionOptions = null)
{
if (options == null)
{
options = EventProcessorOptions.DefaultOptions;
options.MaxBatchSize = 1000;
}
_partitionOptions = partitionOptions;
_options = options;
}
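As a hedged sketch, a function that actually receives a batch would then bind to an array of events (function and hub names are illustrative; assumes the Microsoft.Azure.WebJobs and Microsoft.ServiceBus.Messaging namespaces):
// Illustrative batch-receiving function for the WebJobs SDK Event Hubs binding.
public static void ProcessEvents([EventHubTrigger("MyHubName")] EventData[] messages)
{
    foreach (var message in messages)
    {
        // Up to MaxBatchSize events arrive per invocation.
        Console.WriteLine(Encoding.UTF8.GetString(message.GetBytes()));
    }
}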
I would like to read, modify and write back csproj files.
I've found this code, but unfortunately the Engine class is deprecated.
Engine engine = new Engine();
Project project = new Project(engine);
project.Load("myproject.csproj");
project.SetProperty("SignAssembly", "true");
project.Save("myproject.csproj");
So I've continued based on the hint that I should use Evaluation.ProjectCollection instead of Engine:
var collection = new ProjectCollection();
collection.DefaultToolsVersion = "4.0";
var project = new Project(collection);
// project.Load("myproject.csproj") There is NO Load method :-(
project.FullPath = "myproject.csproj"; // Instead of load? Does nothing...
// ... modify the project
project.Save(); // Interestingly there is a Save() method
There is no Load method anymore. I've tried to set the FullPath property, but the project still seems empty. Did I miss something?
(Please note I do know that the .csproj file is a standard XML file with an XSD schema, and I know that we could read/write it using XDocument or XmlDocument. That's a backup plan. But seeing the .Save() method on the Project class, I think I'm missing something if I cannot load an existing .csproj. Thanks.)
I've actually found the answer; hopefully it will help others:
Instead of creating a new Project(...) and trying to .Load(...) it, we should use a factory method of the ProjectCollection class.
// Instead of:
// var project = new Project(collection);
// project.FullPath = "myproject.csproj"; // Instead of load? Does nothing...
// use this:
var project = collection.LoadProject("myproject.csproj");
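To mirror the original Engine-based snippet, the full round trip with the Evaluation API then looks like this:
// Load, modify and write back the project using Microsoft.Build.Evaluation.
var collection = new ProjectCollection();
var project = collection.LoadProject("myproject.csproj");
project.SetProperty("SignAssembly", "true");   // same change as in the deprecated Engine example
project.Save();                                // writes the modified XML back to myproject.csproj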
Since I can't comment:
This won't work in .NET Core without first setting the MSBUILD_EXE_PATH environment variable. The code to do so can be found here:
https://blog.rsuter.com/missing-sdk-when-using-the-microsoft-build-package-in-net-core/
and is reproduced below:
// Requires: using System; using System.Diagnostics; using System.Linq; using System.Text.RegularExpressions;
private static void SetMsBuildExePath()
{
try
{
var startInfo = new ProcessStartInfo("dotnet", "--list-sdks")
{
RedirectStandardOutput = true
};
var process = Process.Start(startInfo);
process.WaitForExit(1000);
var output = process.StandardOutput.ReadToEnd();
var sdkPaths = Regex.Matches(output, "([0-9]+.[0-9]+.[0-9]+) \\[(.*)\\]")
.OfType<Match>()
.Select(m => System.IO.Path.Combine(m.Groups[2].Value, m.Groups[1].Value, "MSBuild.dll"));
var sdkPath = sdkPaths.Last();
Environment.SetEnvironmentVariable("MSBUILD_EXE_PATH", sdkPath);
}
catch (Exception exception)
{
Console.Write("Could not set MSBUILD_EXE_PATH: " + exception);
}
}
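A hedged usage sketch: call this once before any Microsoft.Build type is touched, e.g.
// The environment variable must be set before MSBuild assemblies are first used.
SetMsBuildExePath();
var project = new ProjectCollection().LoadProject("myproject.csproj");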