How to run .NET and React tasks simultaneously?

I have a multi-root workspace with the following structure:
{
    "folders": [
        {
            "name": "client", // React using WSL2
            "path": "client"
        },
        {
            "name": "server", // .NET MVC
            "path": "server"
        }
    ]
}
Currently, I have to start each task (the client task and the server task) separately. Is there a way to configure VS Code so they both launch at the same time?
Best regards
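A common way to do this is a compound task that depends on both, defined either in each folder's .vscode/tasks.json or directly in the .code-workspace file. Here is a minimal sketch of the workspace-file variant (the task labels and the npm start / dotnet run commands are assumptions about how each project is started):
{
    "folders": [
        // ... the client and server folders shown above ...
    ],
    "tasks": {
        "version": "2.0.0",
        "tasks": [
            {
                "label": "client: start",
                "type": "shell",
                "command": "npm start", // assumed client start command
                "options": { "cwd": "${workspaceFolder:client}" },
                "isBackground": true
            },
            {
                "label": "server: start",
                "type": "shell",
                "command": "dotnet run", // assumed server start command
                "options": { "cwd": "${workspaceFolder:server}" },
                "isBackground": true
            },
            {
                // Running this single task starts both of the above in parallel
                "label": "start client + server",
                "dependsOn": [ "client: start", "server: start" ],
                "dependsOrder": "parallel"
            }
        ]
    }
}
Run the "start client + server" task via Terminal > Run Task. For debugging rather than just running, launch.json also supports a "compounds" section that starts several launch configurations together.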

Related

Steeltoe Serilog Dynamic Logging not working with .NET 7

I am trying to set up Serilog Dynamic Logging with Steeltoe on .NET 7, using the host builder to add the functionality. Code example below using the host builder:
.AddDynamicSerilog((cfg, log) => log.ReadFrom.Configuration(cfg.Configuration))
I have set a breakpoint in the extension here but my code is not hitting this breakpoint:
public static IHostBuilder AddDynamicSerilog(
    this IHostBuilder hostBuilder,
    Action<HostBuilderContext, LoggerConfiguration> configureLogger = null,
    bool preserveStaticLogger = false,
    bool preserveDefaultConsole = false)
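For context, the one-liner from the question chains onto the generic host roughly like this (a sketch: the Startup class and the web-host wiring are assumed, only the AddDynamicSerilog call comes from the question, and the Steeltoe namespace may differ by package version):
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;
using Serilog;
using Steeltoe.Extensions.Logging.DynamicSerilog; // namespace is an assumption; check your Steeltoe version

// Sketch: wire dynamic Serilog into the generic host
var host = Host.CreateDefaultBuilder(args)
    .AddDynamicSerilog((cfg, log) => log.ReadFrom.Configuration(cfg.Configuration))
    .ConfigureWebHostDefaults(web => web.UseStartup<Startup>()) // assumed Startup class
    .Build();

host.Run();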
Steeltoe Configuration:
"Serilog": {
"MinimumLevel": "Information",
"Enrich": [ "FromLogContext", "WithMachineName", "WithThreadId" ],
"WriteTo": [
{
"Name": "Console"
},
{
"Name": "File",
"Args": { "path": "Logs/log.txt" }
}
],
"Properties": {
"Application": "Test"
}
}
My application is logging when I use the logger, but only in the regular format, with no application name, tracing extensions, etc., so I am assuming my configuration is somehow not being read. Is the Serilog Dynamic Logging package compatible with .NET 7? Or is there something else going on?
I tried the given examples. The output does not look like the Serilog configuration is being read or applied from the settings file, and the extension code isn't being executed.
You can try this particular sample here: https://github.com/SteeltoeOSS/Samples/tree/main/Management/src/CloudFoundry
Change the target to <TargetFramework>net7.0</TargetFramework>
I just tried this, and you can see in the picture that the Serilog configuration is being read. Note the versions of Steeltoe etc. in the sample (to help narrow down the issue).
As for debugging: we enable Source Link on our NuGet packages, so you should be able to view and debug Steeltoe code by enabling it in your debug options.
I suspect the problem is that outputTemplate is missing in appsettings. It needs to include "{Properties}", so that scoped values are included in the message.
"Serilog": {
"MinimumLevel": "Information",
"Enrich": [ "FromLogContext", "WithMachineName", "WithThreadId" ],
"WriteTo": [
{
"Name": "Console",
"Args": {
"outputTemplate": "[{Timestamp:HH:mm:ss} {Level:u3}] {SourceContext}: {Properties} {NewLine} {EventId} {Message:lj}{NewLine}{Exception}"
}
},
{
"Name": "File",
"Args": { "path": "Logs/log.txt" }
}
],
"Properties": {
"Application": "Test"
}
}
I took the following steps to reproduce the issue and make it work:
In Visual Studio: File > New > Project > ASP.NET Core Web API
From the Package Manager Console:
install-package Serilog.AspNetCore
install-package Steeltoe.Extensions.Logging.DynamicLogger
install-package Steeltoe.Extensions.Logging.DynamicSerilogCore
install-package Steeltoe.Management.TracingCore
Add to Program.cs:
// Add services to the container.
builder.AddDistributedTracingAspNetCore();
builder.AddDynamicSerilog((cfg, log) =>
    log.ReadFrom.Configuration(cfg.Configuration));
builder.AddDynamicLogging();
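For orientation, here is roughly where those calls sit in the template's minimal-hosting Program.cs (a sketch: everything except the three Steeltoe calls is the standard Web API template boilerplate, and the using directives for Serilog and the Steeltoe extensions are omitted because they vary by package version):
var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllers();

// Steeltoe additions from the steps above
builder.AddDistributedTracingAspNetCore();
builder.AddDynamicSerilog((cfg, log) =>
    log.ReadFrom.Configuration(cfg.Configuration));
builder.AddDynamicLogging();

var app = builder.Build();

app.MapControllers();

app.Run();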
Add the section from above to appsettings.Development.json
Run the application
This prints the following line on my console:
[14:13:08 INF] Microsoft.AspNetCore.Hosting.Diagnostics: {Protocol="HTTP/2", Method="GET", ContentType=null, ContentLength=null, Scheme="https", Host="localhost:7142", PathBase="", Path="/weatherforecast", QueryString="", RequestId="0HMNU1ISSN7T6:00000001", RequestPath="/weatherforecast", ConnectionId="0HMNU1ISSN7T6", Scope=[" [DynamicLoggingWebApi,52a92b64eb0209466ca872311bf309cc,e425a4b06ed50908,0000000000000000,true] "], Application="Test"}
{ Id: 1 } Request starting HTTP/2 GET https://localhost:7142/weatherforecast - -
From the line above, the tracing info is:
[DynamicLoggingWebApi,52a92b64eb0209466ca872311bf309cc,e425a4b06ed50908,0000000000000000,true]
Hope that helps!
By the way, I didn't have any problems stepping into the sources. To set up Visual Studio, follow the instructions at https://devblogs.microsoft.com/dotnet/improving-debug-time-productivity-with-source-link/#enabling-source-link.

ASP.NET Core Razor Pages project in VS Code: localhost:5001 returns "This site can't be reached"

Windows Server 2012,
ASP.NET Core 3.0
I created a new Razor Pages application and tried to run it in debug mode in VS Code.
I open the project at localhost:5001 but get an error in the browser:
This site can’t be reached. The webpage at https://localhost:5001/ might be temporarily down or it may have moved permanently to a new web address.
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": ".NET Core Launch (web)",
            "type": "coreclr",
            "request": "launch",
            "preLaunchTask": "build",
            "program": "${workspaceFolder}/bin/Debug/netcoreapp3.0/WebApplication1.dll",
            "args": [],
            "cwd": "${workspaceFolder}",
            "stopAtEntry": false,
            "serverReadyAction": {
                "action": "openExternally",
                "pattern": "^\\s*Now listening on:\\s+(https?://\\S+)"
            },
            "env": {
                "ASPNETCORE_ENVIRONMENT": "Development"
            },
            "sourceFileMap": {
                "/Pages": "${workspaceFolder}/Pages"
            }
        },
        {
            "name": ".NET Core Attach",
            "type": "coreclr",
            "request": "attach",
            "processId": "${command:pickProcess}"
        }
    ]
}
Additional information:
I tried to run the project in Visual Studio. If I run it with IIS, everything is fine, but if I run it using the project-name profile, I get a similar error.
Do you have the entry below in your Startup file?
app.UseMvc(routes =>
{
    routes.MapRoute("default", "{controller=Home}/{action=Index}/{id?}");
});
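For an ASP.NET Core 3.0 Razor Pages project, the default template uses endpoint routing rather than UseMvc, so the equivalent entry to check for in Startup.Configure looks roughly like this (a sketch of the standard template, not code taken from the question):
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // Standard middleware order from the Razor Pages template
    app.UseHttpsRedirection();
    app.UseStaticFiles();

    app.UseRouting();

    app.UseAuthorization();

    app.UseEndpoints(endpoints =>
    {
        // Maps Pages/*.cshtml files to their routes (e.g. /Index)
        endpoints.MapRazorPages();
    });
}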

How to run an exe file in Azure Data Factory with input parameters?

I have a console application. I built it and uploaded it to Azure Blob Storage, and then I run it from an Azure Data Factory pipeline. That all works fine, but if I want to add new parameters (inputs) to the console application, how can I do that? Is there a specific way to do it?
{
    "name": "samplebatch",
    "type": "Custom",
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false
    },
    "typeProperties": {
        "command": "SampleApp.exe",
        "folderPath": "customactv2/SampleApp",
        "resourceLinkedService": {
            "referenceName": "StorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "linkedServiceName": {
            "referenceName": "dataloadbatchservice",
            "type": "LinkedServiceReference"
        }
    }
}
This is what I have done so far in the Data Factory pipeline code.
Please refer to the extendedProperties property in typeProperties; you can use it. The documentation describes it as:
User-defined properties that can be passed to the custom application in JSON format so that your custom code can reference additional properties.
Doc: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-dotnet-custom-activity#custom-activity
Sample: https://github.com/Azure/Azure-DataFactory/blob/master/Samples/ADFv2CustomActivitySample/MyCustomActivityPipeline.json
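As a sketch of what that looks like in the activity JSON (the extendedProperties keys inputPath and runDate below are made-up examples, not values from the question):
"typeProperties": {
    "command": "SampleApp.exe",
    "folderPath": "customactv2/SampleApp",
    "resourceLinkedService": {
        "referenceName": "StorageLinkedService",
        "type": "LinkedServiceReference"
    },
    "extendedProperties": {
        "inputPath": "raw/2019/01",
        "runDate": "2019-01-31"
    }
}
At run time the Custom activity drops an activity.json file into the working folder on the Batch node, and the console application can read typeProperties.extendedProperties back out of that file. Alternatively, simple arguments can be appended directly to the command string (for example "command": "SampleApp.exe arg1 arg2") and read from args in Main.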

Azure functions - Read all files from SFTP location

I need to read all the text files available from an SFTP location (without specifying a filename) using an Azure Functions trigger.
Currently, I am able to read a particular file when the name of the file is specified, as below:
ExternalFileTrigger
Binding:
{
    "bindings": [
        {
            "type": "apiHubFileTrigger",
            "name": "input",
            "direction": "in",
            "path": "Outbox",
            "connection": "sftp1_SFTP"
        },
        {
            "type": "apiHubFile",
            "name": "output",
            "direction": "out",
            "path": "Inbox/test.txt",
            "connection": "googledrive_GOOGLEDRIVE"
        }
    ],
    "disabled": false
}
Func:
using System;

public static void Run(string input, out string output, TraceWriter log)
{
    log.Info("File found: " + input);
    output = input;
}
Request Body:
Outbox/test.txt
Note: any file pushed to the SFTP location, irrespective of its filename, should be read by the Azure function.
You can use binding parameters instead of hard-coding the file name test.txt. You can also specify patterns for the filename. See here for more details.
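For example, here is a sketch of the same binding using a {name} binding expression in both paths (assuming the apiHubFile bindings accept blob-style name patterns, as the answer above indicates):
{
    "bindings": [
        {
            "type": "apiHubFileTrigger",
            "name": "input",
            "direction": "in",
            "path": "Outbox/{name}",
            "connection": "sftp1_SFTP"
        },
        {
            "type": "apiHubFile",
            "name": "output",
            "direction": "out",
            "path": "Inbox/{name}",
            "connection": "googledrive_GOOGLEDRIVE"
        }
    ],
    "disabled": false
}
With this, any file that lands in Outbox triggers the function, and the output is written to Inbox under the same file name.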
But why go to the trouble of instantiating the SFTP client yourself?
Just use the prebuilt SFTP connector in Logic Apps, and call the Function for any further custom processing. You can probably do everything inside the Logic App if your goal is to store the file in Google Drive.

How to get a C# Arcanist linter working?

Here is what it says when I run arc linters --verbose:
AVAILABLE csharp (C#)
Configuration Options
severity (optional map<string|int, string>)
Provide a map from lint codes to adjusted severity levels: error,
warning, advice, autofix or disabled.
severity.rules (optional map<string, string>)
Provide a map of regular expressions to severity levels. All matching
codes have their severity adjusted.
discovery (map<string, list<string>>)
Provide a discovery map.
binary (string)
Override default binary.
What is a discovery map? It doesn't tell you what it is, but you MUST have it. Unfortunately, the source code did not enlighten me.
binary... I don't want to override the default binary... What is the default binary? I HAVE to override it, so I need to get one? Where do I get a C# linter binary compatible with Arcanist? The only one I could find is https://github.com/hach-que/cstools, but it throws an exception.
Let me know if I can add more info.
cstools is the one you need; you can see this if you look into ArcanistCSharpLinter.php, where it looks for cslint.exe as the default binary.
--- Edit ---
Not sure if you're looking at the right tool; the one being referred to is this one developed by hach-que, here:
https://github.com/hach-que/cstools
I'm not super-familiar with this linter (I don't do any serious C# development myself), but a quick peek at the source code suggests that the discovery map is simply a string map called "discovery" that holds settings for the linter.
See also: https://github.com/hach-que/cstools/issues/1
Copy-pasting the examples here since that seems to be SO policy:
Sample .arcconfig:
{
    "project_id": "Tychaia",
    "conduit_uri": "https://code.redpointsoftware.com.au/",
    "arc.autostash": true,
    "load": [
        "Build/Arcanist"
    ],
    "unit.engine": "XUnitTestEngine",
    "unit.csharp.xunit.binary": "packages/xunit.runners.1.9.1/tools/xunit.console.clr4.exe",
    "unit.csharp.cscover.binary": "cstools/cscover/bin/Debug/cscover.exe",
    "unit.csharp.coverage.match": "/^Tychaia.*\\.(dll|exe)$/",
    "unit.csharp.discovery": {
        "([^/]+)/(.*?)\\.cs": [
            [ "$1.Tests/$1.Tests.Linux.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ],
            [ "$1.Tests/$1.Tests.Windows.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ]
        ],
        "([^\\\\]+)\\\\(.*?)\\.cs": [
            [ "$1.Tests\\$1.Tests.Windows.csproj", "$1.Tests\\bin\\Debug\\$1.Tests.dll" ]
        ],
        "([^/]+)\\.Tests/(.*?)\\.cs": [
            [ "$1.Tests/$1.Tests.Linux.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ],
            [ "$1.Tests/$1.Tests.Windows.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ]
        ],
        "([^\\\\]+)\\.Tests\\\\(.*?)\\.cs": [
            [ "$1.Tests\\$1.Tests.Windows.csproj", "$1.Tests\\bin\\Debug\\$1.Tests.dll" ]
        ]
    }
}
Sample .arclint:
{
    "linters": {
        "csharp": {
            "type": "csharp",
            "include": "(\\.cs$)",
            "exclude": [ "(\\.Designer\\.cs$)", "(Phabricator\\.Conduit(.*+))", "(TychaiaProfilerEntityUtil\\.cs)" ],
            "binary": "cstools/cslint/bin/Debug/cslint.exe",
            "discovery": {
                "([^/]+)/(.*?)\\.cs": [
                    "$1/$1.Linux.csproj"
                ],
                "([^\\\\]+)\\\\(.*?)\\.cs": [
                    "$1\\$1.Windows.csproj"
                ]
            }
        },
        "license": {
            "type": "tychaialicense",
            "include": "(\\.cs$)",
            "exclude": [ "(\\.Designer\\.cs$)", "(Phabricator\\.Conduit(.*+))", "(TychaiaProfilerEntityUtil\\.cs)" ]
        }
    }
}
Anyway, if you're having trouble getting cstools to work, I'd recommend opening an issue on the Github repo; he seems to be pretty responsive. Plus, as an open-source developer, it's always great to hear that others are using your work.
