I am trying to get project information from a TFS server programmatically. I want to know how to access the capacity information. I've searched for it online and it says that capacity info is stored in [dbo].[tbl_TeamConfigurationCapacity].
But I don't understand how to query that table using WIQL. Does anyone have any idea about it?
This table only exists in the Project Collection database, and querying it is not supported through SQL or WIQL. While technically possible through SQL, any direct access to the Project Collection database is unsupported, and the underlying structure may change between major versions, updates and even hotfixes.
Instead of reading the capacity directly from the database, the supported method is to use the REST API to query the capacity.
Example:
GET https://{instance}/DefaultCollection/{project}/{team}/_apis/work/TeamSettings/Iterations/{iterationid}/Capacities?api-version={version}
GET https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/Fabrikam-Fiber/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities?api-version=2.0-preview.1
{
"values": [
{
"teamMember": {
"id": "8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
"displayName": "Chuck Reinhart",
"uniqueName": "fabrikamfiber3#hotmail.com",
"url": "https://fabrikam-fiber-inc.vssps.visualstudio.com/_apis/Identities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
"imageUrl": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/_api/_common/identityImage?id=8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
},
"activities": [
{
"capacityPerDay": 0,
"name": null
}
],
"daysOff": [],
"url": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/6d823a47-2d51-4f31-acff-74927f88ee1e/748b18b6-4b3c-425a-bcae-ff9b3e703012/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
}
]
}
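For example, a minimal C# sketch of calling this endpoint with a personal access token (the collection URL, project, team, iteration id and PAT are placeholders to fill in; System.Net.Http is assumed):

// Sketch: query team capacity through the REST API instead of the collection database.
// All URL parts and the PAT below are placeholders.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CapacityQuery
{
    static async Task Main()
    {
        var pat = "<personal-access-token>";
        using (var client = new HttpClient())
        {
            // TFS/Azure DevOps accepts a PAT as the password of a Basic auth header.
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
                "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat)));

            var url = "https://{instance}/DefaultCollection/{project}/{team}" +
                      "/_apis/work/teamsettings/iterations/{iterationid}/capacities?api-version=2.0-preview.1";
            Console.WriteLine(await client.GetStringAsync(url)); // prints the capacity payload shown above
        }
    }
}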
I have a C# desktop app and a django webapp that share a set of common class/model types. The C# app is exporting json files containing instances of these models, and I am trying to import these into the django database.
The complication is that the parent model contained within the json file has properties that may reference the same sub-model in multiple places. For example, the json file looks something like this:
{
"$id": "1",
"SubModels": {
"$id": "2",
"$values": [
{
"$id": "3",
"name": "Dave",
"value": "123"
},
{
"$id": "4",
"name": "John",
"value": "42"
}
]
},
"PreferredSubModel: {
"$ref": "4"
}
}
This was created using the System.Text.Json.Serialization C# library with the ReferenceHandler = ReferenceHandler.Preserve serialisation option. In django, I've converted the json file into a dictionary using model_dictionary = JSONParser().parse(json_file).
Are there any existing functions (in the django Python environment) that can handle this $id/$ref system to maintain class instances, or do I need to code my own deserializer? If the latter, does anyone have any suggestions for the best way to handle it?
I'm new to django and json files, so hopefully I've just been googling the wrong terms and something exists...
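For context, the $id/$ref/$values metadata above is what System.Text.Json emits when reference preservation is turned on. A minimal sketch of the C# export side (the class and property names here are made up for illustration, but the serializer option is the one named above):

// Hypothetical models; ReferenceHandler.Preserve is what writes $id/$ref/$values.
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

public class SubModel
{
    public string Name { get; set; }
    public string Value { get; set; }
}

public class ParentModel
{
    public List<SubModel> SubModels { get; set; }
    public SubModel PreferredSubModel { get; set; } // points at an object already in SubModels
}

public static class Exporter
{
    public static string Export(ParentModel model) =>
        JsonSerializer.Serialize(model, new JsonSerializerOptions
        {
            ReferenceHandler = ReferenceHandler.Preserve, // repeated instances become {"$ref": "..."}
            WriteIndented = true
        });
}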
I recently encountered the following problem while parsing an adaptive card.
This is the card:
{
"type": "AdaptiveCard",
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"version": "1.4",
"body": [
{
"type": "TextBlock",
"text": "{{DATE(${$root.AdditionalData['DUE-DATE']},COMPACT)}}",
"wrap": true
}
]
}
This is the card-content:
{
"AdditionalData": {
"DUE-DATE": "2021-09-10T16:29:59Z"
}
}
Code:
C# on .NET Framework 4.7.2, where layout is a string holding the above card and content is a string holding the above card-content:
AdaptiveCardTemplate template = new AdaptiveCardTemplate(layout); // build the template from the card layout
string cardJson = template.Expand(content);                       // bind the card-content data
AdaptiveCardParseResult card = AdaptiveCard.FromJson(cardJson);   // this line throws the exception below
And it crashes with:
AdaptiveCards.AdaptiveSerializationException: 'Error reading string. Unexpected token: Undefined. Path 'text', line 1, position 137.'
JsonReaderException: Error reading string. Unexpected token: Undefined. Path 'text', line 1, position 137.
The generated JSON in cardJson looks wrong to me at the text property:
{"type":"AdaptiveCard","$schema":"http://adaptivecards.io/schemas/adaptive-card.json","version":"1.4","body":[{"type":"TextBlock","text":,"wrap":true}]}
I'm using the adaptive cards nuget packages:
AdaptiveCards 2.7.2
AdaptiveCards.Templating 1.2.
Did I encounter a parsing bug? The value for the text property should be 10.9.2021.
In the designer on adaptivecards.io everything works fine for some reason. Does anyone have a fix/workaround?
If you want the literal "text":"10.9.2021" to appear in your cardJson, use "${formatDateTime(AdditionalData['DUE-DATE'], 'd.M.yyyy')}" to generate the required value for your TextBlock:
{
"type": "AdaptiveCard",
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"version": "1.4",
"body": [
{
"type": "TextBlock",
"text": "${formatDateTime(AdditionalData['DUE-DATE'], 'd.M.yyyy')}",
"wrap": true
}
]
}
This causes all date formatting to be performed by the AdaptiveCardTemplate and results in:
{
"type":"AdaptiveCard",
"$schema":"http://adaptivecards.io/schemas/adaptive-card.json",
"version":"1.4",
"body":[
{
"type":"TextBlock",
"text":"10.9.2021",
"wrap":true
}
]
}
Demo fiddle #1 here.
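Plugged back into the code from the question, the corrected template then expands and parses without the exception (a sketch only, using the same packages and .NET Framework version as the question):

// Same flow as in the question; only the template string changes.
using AdaptiveCards;
using AdaptiveCards.Templating;

string layout = @"{
  ""type"": ""AdaptiveCard"",
  ""$schema"": ""http://adaptivecards.io/schemas/adaptive-card.json"",
  ""version"": ""1.4"",
  ""body"": [
    { ""type"": ""TextBlock"",
      ""text"": ""${formatDateTime(AdditionalData['DUE-DATE'], 'd.M.yyyy')}"",
      ""wrap"": true }
  ]
}";
string content = @"{ ""AdditionalData"": { ""DUE-DATE"": ""2021-09-10T16:29:59Z"" } }";

AdaptiveCardTemplate template = new AdaptiveCardTemplate(layout);
string cardJson = template.Expand(content);               // ..."text":"10.9.2021"...
AdaptiveCard card = AdaptiveCard.FromJson(cardJson).Card; // parses without AdaptiveSerializationException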
If you would prefer "{{DATE(2021-09-10T16:29:59Z, COMPACT)}}" in your cardJson which delegates date formatting to the TextBlock, use:
{
"type": "AdaptiveCard",
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"version": "1.4",
"body": [
{
"type": "TextBlock",
"text": "{{DATE(${AdditionalData['DUE-DATE']}, COMPACT)}}",
"wrap": true
}
]
}
Which results in
"text": "{{DATE(2021-09-10T16:29:59Z, COMPACT)}}",
Demo fiddle #2 here.
Notes:
According to the Microsoft documentation:
Use Dot-notation to access sub-objects of an object hierarchy. E.g., ${myParent.myChild}
Use Indexer syntax to retrieve properties by key or items in an array. E.g., ${myArray[0]}
But, when accessing an object property with a hyphen (or some other reserved operator) in its name, it is apparently necessary to use the Indexer syntax ['DUE-DATE'] instead of Dot‑notation to retrieve its value, passing the property name inside a single-quoted string as the indexer.
According to the docs
There are a few reserved keywords to access various binding scopes. ...
"$root": "The root data object. Useful when iterating to escape to parent object",
Thus you do not need to use $root when accessing properties of the currently scoped object (which is, by default, the root) when using Dot-notation. If for whatever reason you need or want to address the root object directly, you may use $root like so:
"text": "${formatDateTime($root.AdditionalData['DUE-DATE'], 'd.M.yyyy')}",
Demo fiddle #3 here.
However, it seems that using $root in combination with {{DATE()}} causes malformed JSON to be generated. I.e.
"text": "{{DATE(${$root.AdditionalData['DUE-DATE']}, COMPACT)}}",
results in "text":, as indicated in your question.
Demo fiddle #4 here.
This looks to be a bug in the framework. Possibly the parser is choking on the sequence of tokens ${$, as your issue somewhat resembles Issue #6026: [Authoring][.NET][Templating] Inconsistency in accessing $root inside a $when property in adaptive card templating which reports a failure to parse "$when": "${$root.UserName != null}" correctly.
You can avoid the problem either by omitting $root entirely, or by wrapping $root.AdditionalData['DUE-DATE'] in an additional formatDateTime() like so:
"text": "{{DATE(${formatDateTime($root.AdditionalData['DUE-DATE'])}, COMPACT)}}",
Resulting in
"text": "{{DATE(2021-09-10T16:29:59.000Z, COMPACT)}}",
Demo fiddle #5 here.
From the documentation page Adaptive Card Templating SDKs: Troubleshooting:
Q. Why date/time in RFC 3389 format e.g "2017-02-14T06:08:00Z" when used with template doesn't works with TIME/DATE functions?
A. .NET sdk nuget version 1.0.0-rc.0 exhibits this behavior. this behavior is corrected in the subsequent releases... Please use formatDateTime() function to format the date/time string to RFC 3389 as seen in this example, or you can bypass TIME/DATE functions, and just use formatDateTime(). For more information on formatDateTime(), please go here.
While this recommendation to use formatDateTime was to fix a problem in 1.0.0-rc.0, the trick also resolves the issue mentioned in note #2 above.
I have to give a role to new users. The problem is that I have to add them to 2 different cubes in 6 different environments, which means adding the user and processing the rights table 12 times, amounting to around an hour on my company's rather weak laptop, for EVERY new user.
Is there any way to just write some code with a list of users you want to add to a list of cubes, and tell it to process the table after each addition? It'd be a real life saver right now.
In SSIS, you can use the Analysis Services Execute DDL Task. This can take a TMSL script as input, which would look like below.
1) sequence - this command allows you to perform multiple operations
2) createOrReplace - this will refresh the role with the new list of members. Note that every existing member needs to be included in the role or they will be wiped out
3) refresh - processes the table
In SSIS, you might create a connection to each environment and loop through a set of script files, so that you would not need to modify the package to add new members.
However, I would also suggest switching to an AD group instead of adding explicit users to the role. Then you would only need to refresh the table.
{
"sequence": {
"operations": [{
"createOrReplace": {
"object": {
"database": "<Your Database>",
"role": "<Your Role Name>"
},
"role": {
"name": "Reader",
"modelPermission": "read",
"members": [{
"memberName": "<Your Domain>\\<User 1>",
"memberName": "<Your Domain>\\<User 2>",
<All the users in the role...>
}
]
}
}
}, {
"refresh": {
"type": "full",
"objects": [{
"database": "<Your Database>",
"table": "<Your Table>"
}
]
}
}
]
}
}
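If you would rather not go through SSIS at all, the same TMSL script can be sent from a small C# program with the AMO/TOM client library. A rough sketch (assuming the Microsoft.AnalysisServices (AMO/TOM) NuGet package; the server names and script path are placeholders):

// Sketch: push the TMSL script above to every environment from C#.
using System;
using System.IO;
using Microsoft.AnalysisServices.Tabular;

class RoleUpdater
{
    static void Main()
    {
        string tmsl = File.ReadAllText(@"C:\scripts\add-members-and-refresh.json");
        string[] environments =
        {
            "Data Source=DEV-SSAS-01",
            "Data Source=TEST-SSAS-01"
            // ...one entry per environment
        };

        foreach (var env in environments)
        {
            var server = new Server();
            server.Connect(env);
            server.Execute(tmsl); // runs the sequence: createOrReplace role + refresh table
            server.Disconnect();
            Console.WriteLine("Updated role and refreshed table on " + env);
        }
    }
}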
I have a console application. I built this application and uploaded it to Azure blob storage, and I run it from an Azure Data Factory pipeline. Everything works fine, but the problem is: if I want to add new parameters (inputs) to the console application, how can I do that? Is there a specific way to do it?
{
"name": "samplebatch",
"type": "Custom",
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false
},
"typeProperties": {
"command": "SampleApp.exe",
"folderPath": "customactv2/SampleApp",
"resourceLinkedService": {
"referenceName": "StorageLinkedService",
"type": "LinkedServiceReference"
},
"linkedServiceName": {
"referenceName": "dataloadbatchservice",
"type": "LinkedServiceReference"
}
}
}
This is what I have done so far in the Data Factory pipeline code.
Please refer to the extendedProperties property in typeProperties; you can use it to pass extra inputs to your console application.
User-defined properties that can be passed to the custom application
in JSON format so your custom code can reference additional properties
Doc: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-dotnet-custom-activity#custom-activity
Sample: https://github.com/Azure/Azure-DataFactory/blob/master/Samples/ADFv2CustomActivitySample/MyCustomActivityPipeline.json
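In practice you add an extendedProperties object under typeProperties in the activity definition, and read it back inside the console application: Data Factory drops an activity.json file into the working folder of the custom activity. A rough sketch of the reading side (assuming Newtonsoft.Json; the property names "connectionString" and "sliceStart" are made-up examples):

// Sketch: read extendedProperties from the activity.json that ADF places in the
// working directory of the custom activity. Property names are hypothetical.
//
// Pipeline side (fragment):
//   "typeProperties": {
//       "command": "SampleApp.exe",
//       ...
//       "extendedProperties": { "connectionString": "...", "sliceStart": "..." }
//   }
using System;
using System.IO;
using Newtonsoft.Json.Linq;

class Program
{
    static void Main()
    {
        JObject activity = JObject.Parse(File.ReadAllText("activity.json"));
        JToken props = activity["typeProperties"]["extendedProperties"];

        string connectionString = (string)props["connectionString"];
        string sliceStart = (string)props["sliceStart"];

        Console.WriteLine($"connectionString={connectionString}, sliceStart={sliceStart}");
    }
}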
We are building an application, using the Box .NET SDK, to display the content of a customer's Box account. Our synchronisation tool uses the Box content API to retrieve folders and files and builds a cache from this information. To detect whether changes have happened since the last synchronisation, we compare a folder's modified_at field.
When inserting or updating a file, the parent folder's modified_at field is updated to the correct timestamp.
When deleting a file, the parent folder's timestamp stays the same. Is this a bug or the correct behavior?
Official forum question : https://community.box.com/t5/Developer-Forum/Box-Content-API-Is-modified-at-field-of-parent-folder-updated/td-p/15335
This is a known issue, but we currently do not have a timeline for a fix. Here is a workaround to discover which files have been recently deleted.
(1) Call the Events API with these parameters: "stream_type=admin_logs&event_type=delete". This will return a list of items that have been deleted, along with each item's parent folder id.
Example Request
curl "https://api.box.com/2.0/events?stream_type=admin_logs&event_type=delete" -H "Authorization: Bearer AUTH_TOKEN"
Example Response
{
"chunk_size": 1,
"next_stream_position": "0000000000000000000",
"entries": [
{
"source": {
"item_type": "file",
"item_id": "00000000000",
"item_name": "example-file.txt",
"parent": {
"type": "folder",
"name": "Example Folder Name",
"id": "0000000000"
}
},
"created_by": {
"type": "user",
"id": "000000000",
"name": "Example Name",
"login": "example#example.com"
},
"created_at": "2016-04-15T00:00:00-07:00",
"event_id": "00000000-0000-0000-0000-000000000000",
"event_type": "DELETE",
"ip_address": "Unknown IP",
"type": "event",
"session_id": null,
"additional_details": {
"version_id": "00000000000"
}
}
]
}
(2) Use the next_stream_position returned in step 1 on subsequent calls to get the deleted items after that point.
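If it helps, the same two-step loop can be written in C# with a plain HttpClient, mirroring the curl call above (a sketch only; the token is a placeholder and Newtonsoft.Json is assumed):

// Sketch of steps (1) and (2): fetch DELETE events, then poll again using the
// returned next_stream_position. <AUTH_TOKEN> is a placeholder.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

class DeletedItemsPoller
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", "<AUTH_TOKEN>");

            string streamPosition = null;
            for (int call = 0; call < 2; call++) // first request, then one follow-up from the stream position
            {
                var url = "https://api.box.com/2.0/events?stream_type=admin_logs&event_type=delete";
                if (streamPosition != null)
                    url += "&stream_position=" + streamPosition;

                var json = JObject.Parse(await client.GetStringAsync(url));
                streamPosition = (string)json["next_stream_position"];

                foreach (var entry in json["entries"])
                {
                    var source = entry["source"];
                    Console.WriteLine((string)source["item_name"] + " deleted from folder " +
                                      (string)source["parent"]["id"]);
                }
            }
        }
    }
}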