I need to read all the text files available from an SFTP location (without specifying a filename) using Azure Function triggers.
Currently, I am able to read a particular file when the name of the file is specified, as below:
ExternalFileTrigger
Binding:
{
    "bindings": [
        {
            "type": "apiHubFileTrigger",
            "name": "input",
            "direction": "in",
            "path": "Outbox",
            "connection": "sftp1_SFTP"
        },
        {
            "type": "apiHubFile",
            "name": "output",
            "direction": "out",
            "path": "Inbox/test.txt",
            "connection": "googledrive_GOOGLEDRIVE"
        }
    ],
    "disabled": false
}
Func:
using System;

public static void Run(string input, out string output, TraceWriter log)
{
    log.Info("File found: " + input);
    output = input;
}
Request Body:
Outbox/test.txt
Note: any file pushed to the SFTP location, irrespective of its filename, should be read by the Azure function.
You can use binding parameters instead of hard-coding the file name test.txt. You can also specify patterns for the filename. See here for more details.
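For example, here is a minimal sketch of your binding rewritten with the {name} binding parameter (untested; it assumes the apiHub bindings accept the standard {name} expression), so that whatever file lands in Outbox is written to Inbox under the same name:
{
    "bindings": [
        {
            "type": "apiHubFileTrigger",
            "name": "input",
            "direction": "in",
            "path": "Outbox/{name}",
            "connection": "sftp1_SFTP"
        },
        {
            "type": "apiHubFile",
            "name": "output",
            "direction": "out",
            "path": "Inbox/{name}",
            "connection": "googledrive_GOOGLEDRIVE"
        }
    ],
    "disabled": false
}
The {name} token is bound from the triggering file's name at runtime, so the same function handles every file dropped into Outbox, with no hard-coded filename.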
But why go to the trouble of instantiating the SFTP client yourself?
Just use the prebuilt SFTP connector in Logic Apps, and call the Function to do your custom processing. You can probably do everything inside the Logic App if your goal is to store the file in Google Drive.
I am trying to set up Serilog Dynamic Logging with Steeltoe. I am using .NET 7, and I am using the Host Builder to add the functionality. Code example using the Host Builder:
.AddDynamicSerilog((cfg, log) => log.ReadFrom.Configuration(cfg.Configuration))
I have set a breakpoint in this extension method, but my code is not hitting it:
public static IHostBuilder AddDynamicSerilog(
    this IHostBuilder hostBuilder,
    Action<HostBuilderContext, LoggerConfiguration> configureLogger = null,
    bool preserveStaticLogger = false,
    bool preserveDefaultConsole = false)
Steeltoe Configuration:
"Serilog": {
"MinimumLevel": "Information",
"Enrich": [ "FromLogContext", "WithMachineName", "WithThreadId" ],
"WriteTo": [
{
"Name": "Console"
},
{
"Name": "File",
"Args": { "path": "Logs/log.txt" }
}
],
"Properties": {
"Application": "Test"
}
}
My application is logging when I use the logger, but it logs in just the regular format: no application name, no tracing extensions, etc. So I am assuming my configuration is not being read somehow? Is the Serilog Dynamic Logging package compatible with .NET 7? Or is there something else going on?
I tried the given examples. The logging does not look like the Serilog configuration is being read or applied from the settings file. Also, the extension code isn't executed.
You can try this particular sample here: https://github.com/SteeltoeOSS/Samples/tree/main/Management/src/CloudFoundry
Change the target to <TargetFramework>net7.0</TargetFramework>
I just tried this and you can see in the picture the Serilog configuration being read. Note the versions of Steeltoe etc. in the sample (to help narrow down the issue).
As far as debugging goes: we enable Source Link on our NuGet packages, so you should be able to view and debug the Steeltoe code by enabling it in your debug options.
I suspect the problem is that outputTemplate is missing in appsettings. It needs to include "{Properties}", so that scoped values are included in the message.
"Serilog": {
"MinimumLevel": "Information",
"Enrich": [ "FromLogContext", "WithMachineName", "WithThreadId" ],
"WriteTo": [
{
"Name": "Console",
"Args": {
"outputTemplate": "[{Timestamp:HH:mm:ss} {Level:u3}] {SourceContext}: {Properties} {NewLine} {EventId} {Message:lj}{NewLine}{Exception}"
}
},
{
"Name": "File",
"Args": { "path": "Logs/log.txt" }
}
],
"Properties": {
"Application": "Test"
}
}
I took the following steps to reproduce the issue and make it work:
In Visual Studio: File > New > Project > ASP.NET Core Web API
From the Package Manager Console:
install-package Serilog.AspNetCore
install-package Steeltoe.Extensions.Logging.DynamicLogger
install-package Steeltoe.Extensions.Logging.DynamicSerilogCore
install-package Steeltoe.Management.TracingCore
Add to Program.cs:
// Add services to the container.
builder.AddDistributedTracingAspNetCore();
builder.AddDynamicSerilog((cfg, log) =>
    log.ReadFrom.Configuration(cfg.Configuration));
builder.AddDynamicLogging();
Add the section from above to appsettings.Development.json
Run the application
This prints the following line on my console:
[14:13:08 INF] Microsoft.AspNetCore.Hosting.Diagnostics: {Protocol="HTTP/2", Method="GET", ContentType=null, ContentLength=null, Scheme="https", Host="localhost:7142", PathBase="", Path="/weatherforecast", QueryString="", RequestId="0HMNU1ISSN7T6:00000001", RequestPath="/weatherforecast", ConnectionId="0HMNU1ISSN7T6", Scope=[" [DynamicLoggingWebApi,52a92b64eb0209466ca872311bf309cc,e425a4b06ed50908,0000000000000000,true] "], Application="Test"}
{ Id: 1 } Request starting HTTP/2 GET https://localhost:7142/weatherforecast - -
From the line above, the tracing info is:
[DynamicLoggingWebApi,52a92b64eb0209466ca872311bf309cc,e425a4b06ed50908,0000000000000000,true]
Hope that helps!
By the way, I didn't have any problems stepping into the sources. To set up Visual Studio, follow the instructions at https://devblogs.microsoft.com/dotnet/improving-debug-time-productivity-with-source-link/#enabling-source-link.
I have a multi-root work space with the following structure:
{
    "folders": [
        {
            "name": "client", // React using WSL2
            "path": "client"
        },
        {
            // Docs and release notes
            "name": "server", // .NET MVC
            "path": "server"
        }
    ]
}
Currently, I have to start each task (the client task and the server task) separately. Is there a way to set up VS Code so they both launch at the same time?
Best regards
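One way to do this (a minimal sketch, assuming each folder already defines tasks labelled client and server in its tasks.json) is a workspace-level compound task that depends on both; with dependsOrder set to parallel, they start at the same time:
{
    "version": "2.0.0",
    "tasks": [
        {
            // Hypothetical aggregate task; "client" and "server" must match
            // the labels of your existing tasks.
            "label": "start client and server",
            "dependsOn": [ "client", "server" ],
            "dependsOrder": "parallel",
            "problemMatcher": []
        }
    ]
}
Running that single task then launches both the React client and the .NET server together.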
I have a console application. I built this application and uploaded it to Azure Blob Storage. Then I run this application from an Azure Data Factory pipeline. All is fine, but the problem is: if I want to add new parameters (get input) to the console application, how can I do that? Is there any specific way to do it?
{
    "name": "samplebatch",
    "type": "Custom",
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 0,
        "retryIntervalInSeconds": 30,
        "secureOutput": false
    },
    "typeProperties": {
        "command": "SampleApp.exe",
        "folderPath": "customactv2/SampleApp",
        "resourceLinkedService": {
            "referenceName": "StorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "linkedServiceName": {
            "referenceName": "dataloadbatchservice",
            "type": "LinkedServiceReference"
        }
    }
}
This is what I have done so far in the Data Factory pipeline code.
Please refer to the extendedProperties property in typeProperties; you can use it for this:
User-defined properties that can be passed to the custom application in JSON format so that your custom code can reference additional properties
Doc: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-dotnet-custom-activity#custom-activity
Sample:https://github.com/Azure/Azure-DataFactory/blob/master/Samples/ADFv2CustomActivitySample/MyCustomActivityPipeline.json
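As a sketch (the property names here are invented for illustration), you would add the parameters under typeProperties alongside command and folderPath:
"extendedProperties": {
    "inputFile": "input.csv",
    "batchSize": "100"
}
Per the doc above, the service materializes activity.json (along with linkedServices.json and datasets.json) in the working directory of the Batch task, so your console application can read typeProperties.extendedProperties from that file instead of parsing command-line arguments.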
I am trying to get project information from a TFS server programmatically. I want to know how to access the capacity information. I've searched for it online, and it says that the capacity info is stored in [dbo].[tbl_TeamConfigurationCapacity].
But I am not understanding how to query that table using WIQL. Anyone have any idea about it?
This table is only available in the Project Collection database and querying that table is not supported through SQL nor WIQL. While technically possible through SQL, any direct access of the Project Collection database is unsupported and the underlying structure may change between major versions, updates and even hotfixes.
Instead of directly accessing the capacity in the database, the supported method is to use the REST API to query the capacity.
Example:
GET https://{instance}/DefaultCollection/{project}/{team}/_apis/work/TeamSettings/Iterations/{iterationid}/Capacities?api-version={version}
GET https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/Fabrikam-Fiber/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities?api-version=2.0-preview.1
{
    "values": [
        {
            "teamMember": {
                "id": "8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
                "displayName": "Chuck Reinhart",
                "uniqueName": "fabrikamfiber3@hotmail.com",
                "url": "https://fabrikam-fiber-inc.vssps.visualstudio.com/_apis/Identities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
                "imageUrl": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/_api/_common/identityImage?id=8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
            },
            "activities": [
                {
                    "capacityPerDay": 0,
                    "name": null
                }
            ],
            "daysOff": [],
            "url": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/6d823a47-2d51-4f31-acff-74927f88ee1e/748b18b6-4b3c-425a-bcae-ff9b3e703012/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
        }
    ]
}
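If you are calling this from C#, here is a minimal sketch using HttpClient with a personal access token (the URL and PAT below are placeholders; it assumes a TFS/VSTS version that exposes this API):
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CapacityClient
{
    static async Task Main()
    {
        // Placeholder URL: substitute your collection, project, team and iteration id.
        var url = "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/Fabrikam-Fiber/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities?api-version=2.0-preview.1";

        using (var client = new HttpClient())
        {
            // Basic auth with a personal access token: empty user name, PAT as password.
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(":YOUR_PAT_HERE"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            // Returns the capacities JSON shown above.
            string json = await client.GetStringAsync(url);
            Console.WriteLine(json);
        }
    }
}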
Here is what it says when I run arc linters --verbose:
AVAILABLE csharp (C#)
Configuration Options
severity (optional map<string|int, string>)
Provide a map from lint codes to adjusted severity levels: error,
warning, advice, autofix or disabled.
severity.rules (optional map<string, string>)
Provide a map of regular expressions to severity levels. All matching
codes have their severity adjusted.
discovery (map<string, list<string>>)
Provide a discovery map.
binary (string)
Override default binary.
What is a discovery map? It doesn't tell you what it is, but you MUST have it. Unfortunately, the source code did not enlighten me.
binary... I don't want to override the default binary... What is the default binary? Do I HAVE to override it, and so need to get one? Where do I get a C# linter binary compatible with Arcanist? The only one I could find is https://github.com/hach-que/cstools, but it throws an exception.
Let me know if I can add more info.
cstools is the one you need; you can see this if you look into ArcanistCSharpLinter.php, where it looks for cslint.exe as the default binary.
--- Edit ---
Not sure if you're looking at the right tool; the one being referred to is this one developed by hach-que, here:
https://github.com/hach-que/cstools
I'm not super-familiar with this linter (don't do any serious C# development myself), but a quick peek in the source code suggests that the discovery map is simply a string map called "discovery", that has settings for the linter.
See also: https://github.com/hach-que/cstools/issues/1
Copy-pasting the examples here, since that seems to be SO policy:
Sample .arcconfig:
{
    "project_id": "Tychaia",
    "conduit_uri": "https://code.redpointsoftware.com.au/",
    "arc.autostash": true,
    "load": [
        "Build/Arcanist"
    ],
    "unit.engine": "XUnitTestEngine",
    "unit.csharp.xunit.binary": "packages/xunit.runners.1.9.1/tools/xunit.console.clr4.exe",
    "unit.csharp.cscover.binary": "cstools/cscover/bin/Debug/cscover.exe",
    "unit.csharp.coverage.match": "/^Tychaia.*\\.(dll|exe)$/",
    "unit.csharp.discovery": {
        "([^/]+)/(.*?)\\.cs": [
            [ "$1.Tests/$1.Tests.Linux.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ],
            [ "$1.Tests/$1.Tests.Windows.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ]
        ],
        "([^\\\\]+)\\\\(.*?)\\.cs": [
            [ "$1.Tests\\$1.Tests.Windows.csproj", "$1.Tests\\bin\\Debug\\$1.Tests.dll" ]
        ],
        "([^/]+)\\.Tests/(.*?)\\.cs": [
            [ "$1.Tests/$1.Tests.Linux.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ],
            [ "$1.Tests/$1.Tests.Windows.csproj", "$1.Tests/bin/Debug/$1.Tests.dll" ]
        ],
        "([^\\\\]+)\\.Tests\\\\(.*?)\\.cs": [
            [ "$1.Tests\\$1.Tests.Windows.csproj", "$1.Tests\\bin\\Debug\\$1.Tests.dll" ]
        ]
    }
}
Sample .arclint:
{
    "linters": {
        "csharp": {
            "type": "csharp",
            "include": "(\\.cs$)",
            "exclude": [ "(\\.Designer\\.cs$)", "(Phabricator\\.Conduit(.*+))", "(TychaiaProfilerEntityUtil\\.cs)" ],
            "binary": "cstools/cslint/bin/Debug/cslint.exe",
            "discovery": {
                "([^/]+)/(.*?)\\.cs": [
                    "$1/$1.Linux.csproj"
                ],
                "([^\\\\]+)\\\\(.*?)\\.cs": [
                    "$1\\$1.Windows.csproj"
                ]
            }
        },
        "license": {
            "type": "tychaialicense",
            "include": "(\\.cs$)",
            "exclude": [ "(\\.Designer\\.cs$)", "(Phabricator\\.Conduit(.*+))", "(TychaiaProfilerEntityUtil\\.cs)" ]
        }
    }
}
Anyway, if you're having trouble getting cstools to work, I'd recommend opening an issue on the GitHub repo; he seems to be pretty responsive. Plus, as an open-source developer, it's always great to hear that others are using your work.