I have a console application. I built it and uploaded it to Azure Blob storage, and I run it from an Azure Data Factory pipeline. All of that works, but if I want to add new parameters (inputs) to the console application, how can I do that? Is there a specific way to do it?
{
  "name": "samplebatch",
  "type": "Custom",
  "policy": {
    "timeout": "7.00:00:00",
    "retry": 0,
    "retryIntervalInSeconds": 30,
    "secureOutput": false
  },
  "typeProperties": {
    "command": "SampleApp.exe",
    "folderPath": "customactv2/SampleApp",
    "resourceLinkedService": {
      "referenceName": "StorageLinkedService",
      "type": "LinkedServiceReference"
    }
  },
  "linkedServiceName": {
    "referenceName": "dataloadbatchservice",
    "type": "LinkedServiceReference"
  }
}
This is what I have done so far in the Data Factory pipeline code.
Please refer to the extendedProperties property in typeProperties; you can use it to pass inputs to your application:
User-defined properties that can be passed to the custom application in JSON format so your custom code can reference additional properties
Doc: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-dotnet-custom-activity#custom-activity
Sample: https://github.com/Azure/Azure-DataFactory/blob/master/Samples/ADFv2CustomActivitySample/MyCustomActivityPipeline.json
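As a rough sketch (not the official sample) of how SampleApp.exe could pick those values up: it assumes you add a hypothetical extendedProperties block under typeProperties and read the activity.json file that the Batch custom activity drops into its working directory, parsed here with Newtonsoft.Json.

// Minimal sketch, assuming typeProperties gains a hypothetical block such as
//   "extendedProperties": { "connectionString": "...", "batchSize": "100" }
// The custom activity writes activity.json next to the executable, so the
// console app can read its inputs from there instead of command-line args.
using System;
using System.IO;
using Newtonsoft.Json.Linq;

class Program
{
    static void Main()
    {
        var activity = JObject.Parse(File.ReadAllText("activity.json"));

        // Property names below are illustrative only.
        var props = activity["typeProperties"]?["extendedProperties"];
        string connectionString = (string)props?["connectionString"];
        string batchSize = (string)props?["batchSize"];

        Console.WriteLine($"connectionString={connectionString}, batchSize={batchSize}");
    }
}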
I have a C# desktop app and a django webapp that share a set of common class/model types. The C# app is exporting json files containing instances of these models, and I am trying to import these into the django database.
The complication is that the parent model contained within the json file has properties that may reference the same sub-model in multiple places. For example, the json file looks something like this:
{
  "$id": "1",
  "SubModels": {
    "$id": "2",
    "$values": [
      {
        "$id": "3",
        "name": "Dave",
        "value": "123"
      },
      {
        "$id": "4",
        "name": "John",
        "value": "42"
      }
    ]
  },
  "PreferredSubModel": {
    "$ref": "4"
  }
}
This was created with the System.Text.Json.Serialization C# library using the ReferenceHandler = ReferenceHandler.Preserve serialisation option. In django, I've converted the json file into a dictionary using model_dictionary = JSONParser().parse(json_file).
Are there any existing functions (in the django Python environment) that can handle this $id/$ref system to maintain class instances, or do I need to code my own deserializer? If the latter, does anyone have any suggestions for the best way to handle it?
I'm new to django and json files, so hopefully I've just been googling the wrong terms and something exists...
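For context, here is a rough sketch of the C# side that produces this shape; the Parent/SubModel class names are hypothetical stand-ins for the real models.

using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical types mirroring the JSON above.
class SubModel { public string Name { get; set; } public string Value { get; set; } }
class Parent
{
    public List<SubModel> SubModels { get; set; }
    public SubModel PreferredSubModel { get; set; }
}

class Demo
{
    static void Main()
    {
        var john = new SubModel { Name = "John", Value = "42" };
        var parent = new Parent
        {
            SubModels = new List<SubModel> { new SubModel { Name = "Dave", Value = "123" }, john },
            PreferredSubModel = john // the same instance is referenced twice
        };

        var options = new JsonSerializerOptions
        {
            // Emits $id/$values/$ref instead of duplicating the shared instance.
            ReferenceHandler = ReferenceHandler.Preserve,
            WriteIndented = true
        };
        Console.WriteLine(JsonSerializer.Serialize(parent, options));
    }
}

Whatever deserializer is used on the django side has to walk the dictionary and replace each {"$ref": ...} with the object whose $id matches, so that PreferredSubModel ends up pointing at the same instance as the corresponding entry in SubModels.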
I have to give a role to new users. The problem is that I have to add them to 2 different cubes in 6 different environments, which means adding the user and processing the rights table 12 times, and that amounts to around an hour on my company's rather weak laptop for EVERY new user.
Is there any way to just write some code with a list of users you want to add to a list of cubes, and have it process the table after each addition? It would be a real life saver right now.
In SSIS, you can use the Analysis Services Execute DDL Task. This can take a TMSL script as input, which would look like below.
1) sequence - this command allows you to perform multiple operations
2) createOrReplace - this will refresh the role with the new list of members. Note that every existing member needs to be included in the role or they will be wiped out
3) refresh - processes the table
In SSIS, you might create a connection to each environment and loop through a set of script files, so that you would not need to modify the package to add new members.
However, I would also suggest switching to an AD group instead of adding explicit users to the role. Then you would only need to refresh the table.
{
  "sequence": {
    "operations": [{
      "createOrReplace": {
        "object": {
          "database": "<Your Database>",
          "role": "<Your Role Name>"
        },
        "role": {
          "name": "Reader",
          "modelPermission": "read",
          "members": [
            { "memberName": "<Your Domain>\\<User 1>" },
            { "memberName": "<Your Domain>\\<User 2>" }
            <All the users in the role...>
          ]
        }
      }
    }, {
      "refresh": {
        "type": "full",
        "objects": [{
          "database": "<Your Database>",
          "table": "<Your Table>"
        }]
      }
    }]
  }
}
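If you would rather skip SSIS entirely, a rough console sketch with the Tabular Object Model (Microsoft.AnalysisServices.Tabular NuGet package) could loop a list of users over a list of environments and process the table after each change; the server names, database names, role name and table name below are placeholders.

using System;
using Microsoft.AnalysisServices.Tabular;

class AddRoleMembers
{
    static void Main()
    {
        // Placeholder lists - swap in your six environments, two cubes and the new users.
        string[] servers = { "SSAS-DEV-1", "SSAS-PROD-1" };
        string[] databases = { "<Cube 1>", "<Cube 2>" };
        string[] users = { @"<Your Domain>\<User 1>", @"<Your Domain>\<User 2>" };

        foreach (var serverName in servers)
        {
            var server = new Server();
            server.Connect($"Data Source={serverName}");

            foreach (var databaseName in databases)
            {
                var db = server.Databases.FindByName(databaseName);
                var role = db?.Model.Roles.Find("Reader");
                if (role == null) continue;

                foreach (var user in users)
                    role.Members.Add(new WindowsModelRoleMember { MemberName = user });

                // Process the rights table after the additions, then push the changes.
                db.Model.Tables.Find("<Your Table>")?.RequestRefresh(RefreshType.Full);
                db.Model.SaveChanges();
            }

            server.Disconnect();
        }
    }
}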
I need to read all the text files available at an SFTP location (without specifying a filename) using Azure Function triggers.
Currently, I am able to read a particular file when the name of the file is specified, as below:
ExternalFileTrigger
Binding:
{
  "bindings": [
    {
      "type": "apiHubFileTrigger",
      "name": "input",
      "direction": "in",
      "path": "Outbox",
      "connection": "sftp1_SFTP"
    },
    {
      "type": "apiHubFile",
      "name": "output",
      "direction": "out",
      "path": "Inbox/test.txt",
      "connection": "googledrive_GOOGLEDRIVE"
    }
  ],
  "disabled": false
}
Func:
using System;
public static void Run(string input, out string output, TraceWriter log)
{
    log.Info("File found: " + input);
    output = input;
}
Request Body:
Outbox/test.txt
Note: Any file pushed to the SFTP location, irrespective of its filename, should be read by the Azure function.
You can use binding parameters instead of hard-coding the file name test.txt. You can also specify patterns for the filename. See here for more details.
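As a rough sketch (assuming the apiHub file trigger honors {name} binding expressions in its path the same way the blob trigger does, which I have not verified), the hard-coded test.txt could become:

// Assumed function.json changes (hypothetical):
//   trigger path: "Outbox/{name}"
//   output  path: "Inbox/{name}"
// The {name} captured by the trigger is reused in the output path, so any file
// dropped into Outbox is copied to Inbox under its original file name.
using System;

public static void Run(string input, string name, out string output, TraceWriter log)
{
    log.Info("File found: " + name);
    output = input; // file contents are passed through unchanged
}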
But why go to the trouble of instantiating the SFTP client yourself?
Just use the prebuilt SFTP connector in Logic Apps, and call the Function to further do your custom processing. You can probably do everything inside the Logic App if your goal is to store the file in Google Drive.
I am trying to get project information from a TFS server programmatically. I want to know how to access the capacity information. I've searched for it online and it says that the capacity info is stored in [dbo].[tbl_TeamConfigurationCapacity].
But I don't understand how to query that table using WIQL. Does anyone have any idea about it?
This table is only available in the Project Collection database, and querying it is not supported through either SQL or WIQL. While technically possible through SQL, any direct access of the Project Collection database is unsupported and the underlying structure may change between major versions, updates and even hotfixes.
Instead of directly accessing the capacity in the database, the supported method is to use the REST api to query the capacity.
Example:
GET https://{instance}/DefaultCollection/{project}/{team}/_apis/work/TeamSettings/Iterations/{iterationid}/Capacities?api-version={version}
GET https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/Fabrikam-Fiber/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities?api-version=2.0-preview.1
{
  "values": [
    {
      "teamMember": {
        "id": "8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
        "displayName": "Chuck Reinhart",
        "uniqueName": "fabrikamfiber3@hotmail.com",
        "url": "https://fabrikam-fiber-inc.vssps.visualstudio.com/_apis/Identities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
        "imageUrl": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/_api/_common/identityImage?id=8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
      },
      "activities": [
        {
          "capacityPerDay": 0,
          "name": null
        }
      ],
      "daysOff": [],
      "url": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/6d823a47-2d51-4f31-acff-74927f88ee1e/748b18b6-4b3c-425a-bcae-ff9b3e703012/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
    }
  ]
}
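A bare-bones C# sketch of calling that endpoint (the URL pieces and the personal access token are placeholders; a PAT works on VSTS and TFS 2017 or later, otherwise use your normal Windows credentials):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CapacityQuery
{
    static async Task Main()
    {
        // Placeholders - fill in your collection, project, team, iteration id and PAT.
        var instance = "fabrikam-fiber-inc.visualstudio.com";
        var project = "Fabrikam-Fiber";
        var team = "<team>";
        var iterationId = "2ec76bfe-ba74-4060-970d-4567a3e997ee";
        var pat = "<personal access token>";

        var url = $"https://{instance}/DefaultCollection/{project}/{team}/_apis/work/teamsettings/iterations/{iterationId}/capacities?api-version=2.0-preview.1";

        using (var client = new HttpClient())
        {
            // PATs are sent as basic auth with an empty user name.
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            Console.WriteLine(await client.GetStringAsync(url)); // JSON like the sample above
        }
    }
}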
We are building an application, using the Box .NET SDK, to display the content of a customer's Box account. Our synchronisation tool uses the Box content API to retrieve folders and files and builds a cache from this information. To detect whether changes have happened since the last synchronisation, we compare a folder's modified_at field.
When inserting or updating a file, the parent folder's modified_at field is updated to the correct timestamp.
When deleting a file, the parent folder's timestamp stays the same. Is this a bug or the correct behavior?
Official forum question: https://community.box.com/t5/Developer-Forum/Box-Content-API-Is-modified-at-field-of-parent-folder-updated/td-p/15335
This is a known issue, but we currently do not have a timeline on a fix. Here is a workaround to discover which files have been recently deleted.
(1) Call the Events API with these parameters: "stream_type=admin_logs&event_type=delete". This will return a list of items that have been deleted, along with each item's parent folder id.
Example Request
curl "https://api.box.com/2.0/events?stream_type=admin_logs&event_type=delete" -H "Authorization: Bearer AUTH_TOKEN"
Example Response
{
  "chunk_size": 1,
  "next_stream_position": "0000000000000000000",
  "entries": [
    {
      "source": {
        "item_type": "file",
        "item_id": "00000000000",
        "item_name": "example-file.txt",
        "parent": {
          "type": "folder",
          "name": "Example Folder Name",
          "id": "0000000000"
        }
      },
      "created_by": {
        "type": "user",
        "id": "000000000",
        "name": "Example Name",
        "login": "example@example.com"
      },
      "created_at": "2016-04-15T00:00:00-07:00",
      "event_id": "00000000-0000-0000-0000-000000000000",
      "event_type": "DELETE",
      "ip_address": "Unknown IP",
      "type": "event",
      "session_id": null,
      "additional_details": {
        "version_id": "00000000000"
      }
    }
  ]
}
(2) Use the next_stream_position returned in step 1 on subsequent calls to get the deleted items after that point.
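A small C# sketch of both steps, using plain HttpClient against the same endpoint (the token is a placeholder, and the Box .NET SDK's events support could be used instead of raw HTTP):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

class DeletedItemsPoller
{
    static async Task Main()
    {
        var token = "<AUTH_TOKEN>"; // placeholder admin token

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

            // Step 1: fetch delete events, mirroring the curl request above.
            var url = "https://api.box.com/2.0/events?stream_type=admin_logs&event_type=delete";
            var response = JObject.Parse(await client.GetStringAsync(url));

            foreach (var entry in response["entries"])
            {
                var source = entry["source"];
                Console.WriteLine($"Deleted {source["item_name"]} from folder {source["parent"]["id"]}");
            }

            // Step 2: persist next_stream_position and append it as
            // &stream_position=<value> on the next call to pick up where this one left off.
            Console.WriteLine("Next stream position: " + response["next_stream_position"]);
        }
    }
}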