I want to attach a work item on TFS to a build. I am reading SVN logs that contain the work item number and try to update the actual build to attach them.
workitems.Add(workItemStore.GetWorkItem(workitemid));
buildDetail.Information.AddAssociatedWorkItems(workitems.ToArray());
When I try to hit buildDetail.Information.Save(); or buildDetail.Save();, I get an AccessDeniedException.
See another post.
So I wanted to try with REST...
After reading loads of pages on MSDN, I concluded there is no .NET client library that takes care of builds. It looks like my only option is to PATCH a JSON document into TFS:
PATCH https://{instance}/DefaultCollection/{project}/_apis/build/builds/{buildId}?api-version={version}
How do I add the workitems the right way?
EDIT: I've found an old post which mentioned a DLL in the TFS namespace with the same capabilities as the call above. Unfortunately, it isn't referenced on MSDN. Same problem here: no way to add work items.
To spread the issue and bring it to Microsoft's attention, Patrick has created a post on UserVoice.
UPDATE:
I managed to link a build in the work item. Here is an approach in C#:
var json = new JsonPatchDocument
{
    new JsonPatchOperation()
    {
        Operation = Operation.Add,
        Path = "/relations/-",
        Value = new WorkItemRelation()
        {
            Rel = "ArtifactLink",
            Url = build.Uri.ToString()
        }
    }
};

var client = new WebClient { UseDefaultCredentials = true };
client.Headers.Add(HttpRequestHeader.ContentType, "application/json-patch+json");
client.UploadData(
    options.CollectionUri + "/_apis/wit/workitems/" + workitemid + "?api-version=1.0",
    "PATCH",
    Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(json)));
But there is still no direct binding in the build:
And the work item says "unknown executable linktype".
According to the message shown in the work item, I assume that I am using the wrong link type. Does anybody have a reference for which link types I am able to, and should, use?
URI UPDATE:
I am already using the mentioned URI:
MINOR SOLUTION:
I had to add the name "Build" to the Attributes of the patch. The build itself still does not recognize it, but for now I can work with the link as a build type.
var json = new JsonPatchDocument
{
    new JsonPatchOperation()
    {
        Operation = Operation.Add,
        Path = "/relations/-",
        Value = new WorkItemRelation()
        {
            Rel = "ArtifactLink",
            Url = build.Uri.ToString(),
            Attributes = new Dictionary<string, object>()
            {
                { "name", "Build" },
                { "comment", build.Result.ToString() }
            }
        }
    }
};
You could add a work item to a build by updating the work item to add a relation link to the build via the REST API. Refer to this link for details: Add a Link.
After you add a link to the build in the work item, the work item will be shown in the build summary.
Here is a sample of the request body:
[
{
"op": "test",
"path": "/rev",
"value": "2"
},
{
"op": "add",
"path": "/relations/-",
"value":
{
"rel": "ArtifactLink",
"url": "vstfs:///Build/Build/12351"
}
}
]
Here is a code sample:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;
using System.Net.Http;
using System.Net.Http.Headers;
namespace AssociateWorkItemToBuild
{
    class Program
    {
        static void Main(string[] args)
        {
            using (HttpClient client = new HttpClient())
            {
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", Convert.ToBase64String(
                    ASCIIEncoding.ASCII.GetBytes(
                        string.Format("{0}:{1}", "username", "password"))));

                string Url = "https://xxxxxx.visualstudio.com/_apis/wit/workitems/149?api-version=1.0";

                StringBuilder body = new StringBuilder();
                body.Append("[");
                body.Append("{");
                body.Append("\"op\":\"add\",");
                body.Append(" \"path\":\"/relations/-\",");
                body.Append("\"value\":");
                body.Append("{");
                body.Append("\"rel\": \"ArtifactLink\",");
                body.Append("\"url\": \"vstfs:///Build/Build/12596\"");
                body.Append("}");
                body.Append("}");
                body.Append("]");

                var method = new HttpMethod("PATCH");
                var request = new HttpRequestMessage(method, Url)
                {
                    Content = new StringContent(body.ToString(), Encoding.UTF8,
                        "application/json-patch+json")
                };

                using (HttpResponseMessage response = client.SendAsync(request).Result)
                {
                    response.EnsureSuccessStatusCode();
                    string responseBody = response.Content.ReadAsStringAsync().Result;
                }
            }
        }
    }
}
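If the newer .NET client libraries (the Microsoft.TeamFoundation.WorkItemTracking.WebApi and Microsoft.VisualStudio.Services.WebApi NuGet packages) are available to you, the same json-patch document can be sent through the typed client instead of hand-rolled HTTP. A minimal sketch, assuming the collection URL, credentials, work item id, and build id are placeholders:

using System;
using System.Threading.Tasks;
using Microsoft.TeamFoundation.WorkItemTracking.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.VisualStudio.Services.WebApi;
using Microsoft.VisualStudio.Services.WebApi.Patch;
using Microsoft.VisualStudio.Services.WebApi.Patch.Json;

class LinkBuildToWorkItem
{
    static async Task Main()
    {
        // Collection URL, credentials, work item id and build id are placeholders.
        var connection = new VssConnection(
            new Uri("https://xxxxxx.visualstudio.com/DefaultCollection"),
            new VssCredentials()); // or new VssBasicCredential(string.Empty, "<PAT>")
        var witClient = connection.GetClient<WorkItemTrackingHttpClient>();

        var patch = new JsonPatchDocument
        {
            new JsonPatchOperation
            {
                Operation = Operation.Add,
                Path = "/relations/-",
                Value = new
                {
                    rel = "ArtifactLink",
                    url = "vstfs:///Build/Build/12596",
                    attributes = new { name = "Build" }
                }
            }
        };

        // Applies the same json-patch document as the raw HTTP call above.
        var updated = await witClient.UpdateWorkItemAsync(patch, 149);
        Console.WriteLine($"Work item {updated.Id} now has {updated.Relations.Count} relations.");
    }
}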
Update
Just as Eddie replied, you could add a work item to a build by updating the work item to add a relation link to the build via the REST API.
Regarding the link type, there is a clear demo in Eddie's answer. You need to use the build URI.
The format needs to be vstfs:///Build/Build/8320
The BuildUri for the TFS Build tasks is a property that needs to be set so that those tasks can communicate with the server about the build they are performing actions for.
You can also use $env:BUILD_BUILDURI in a PowerShell script. For more details and other approaches, you can refer to this blog from MSDN: Build association with work Items in vNext. Finally you will get:
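If you are assembling the link from code running inside the build itself, the same value is available as a predefined environment variable. A tiny sketch (the variable name is the documented one for vNext builds; the output strings are just illustration):

using System;

class BuildUriDemo
{
    static void Main()
    {
        // Predefined build variable, e.g. "vstfs:///Build/Build/8320"; empty when not running inside a build.
        var buildUri = Environment.GetEnvironmentVariable("BUILD_BUILDURI");
        Console.WriteLine(string.IsNullOrEmpty(buildUri)
            ? "Not running inside a TFS/VSTS build."
            : $"Artifact link URL for the work item relation: {buildUri}");
    }
}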
Related
I am sending a request to an API with HttpClient. The request itself works without any problem, but I cannot parse the key values I want from the response. Based on the research I have done, I tried the code below, but the incoming data comes back empty this way. How can I get the values I want from the incoming data?
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;
using System.Linq;

namespace LoggerApi.Methods
{
    public class ApiMethods
    {
        public static async Task<object> GetClientInformations(string authenticationCode, string username = "username")
        {
            var client = new HttpClient();
            var userInformationEndpoint = new Uri("https://myurl.com/url");
            var userInformationPayload = new UserInformationPayload()
            {
                Login = username
            };
            var serializePayload = JsonConvert.SerializeObject(userInformationPayload);
            var payload = new StringContent(serializePayload, Encoding.UTF8, "application/json");

            var response = await client.PostAsync(userInformationEndpoint, payload);
            var res = await response.Content.ReadAsStringAsync();
            var responseResultJson = JsonConvert.DeserializeObject<object>(res);
            return responseResultJson;
        }
    }
}
This code's output is empty and looks like this:
{
"HasError": [],
"AlertType": [],
"AlertMessage": [],
"ModelErrors": [],
"Data": [
[
[]
],
[
[
[
[
[]
],
[
[]
],
[
[]
],
[
[]
]
]
]
]
]
}
But when I return var res directly instead of var responseResultJson from the function, the result looks like this. What I want to do is access values such as Login, FirstName, LastName, and Id from the incoming data. How can I do that?
{"HasError":false,"AlertType":"success","AlertMessage":"Operation has completed successfully","ModelErrors":[],"Data":{"Count":1,"Objects":[{"Id":291031530,"CurrencyId":"TRY","FirstName":"Scott","LastName":"Marshall","MiddleName":"William","Login":"scotty3"}]}}
There are multiple possible solutions. One of them is the following.
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;
using System.Threading.Tasks;

namespace Program
{
    class Program
    {
        // We need this to create a new ToDo
        record TodoDto(int UserId, string Title, string Body);

        // That's what's stored in the backend service and returned by the API
        record TodoModel(int Id, int UserId, string Title, string Body);

        public static async Task Main(string[] args)
        {
            // HTTP client
            var httpClient = new HttpClient();

            // required as properties are PascalCase in C# but JSON properties here are camelCase (that's usually the case)
            var serializerOptions = new JsonSerializerOptions
            {
                PropertyNamingPolicy = JsonNamingPolicy.CamelCase
            };

            // create new todo
            var newTodo = new TodoDto(1, "My new todo", "I need to...");

            // add todo in the backend
            var response = await httpClient.PostAsJsonAsync<TodoDto>("https://jsonplaceholder.typicode.com/posts", newTodo, serializerOptions);

            if (response.IsSuccessStatusCode)
            {
                // if we have a successful request we can deserialize the response
                var newTodoModel = await JsonSerializer.DeserializeAsync<TodoModel>(await response.Content.ReadAsStreamAsync(), serializerOptions);
                Console.WriteLine(newTodoModel);
            }
            else
            {
                Console.WriteLine($"Request failed with status code {(int)response.StatusCode} ({response.StatusCode})");
            }
        }
    }
}
Some remarks:
I am using JsonPlaceholder for some sample API endpoints to provide a working example here
Deserializing the API response to JSON could fail e.g. when the API returns something unexpected, so you should check for that especially if you don't control the API that you're calling.
For this to work you need to include the System.Net.Http.Json namespace, as it contains the PostAsJsonAsync() extension method (see the HttpClientJsonExtensions class on Microsoft Learn for more nice and convenient extension methods).
There is no need for Newtonsoft JSON parser, as you can now use the built-in JSON Parser in the System.Text.Json namespace.
I prefer working with Streams as you can then use async all the way throughout your code which is a best practice (see Async best practices - Async All the Way)
By default a case-sensitive matching between your C# class and the JSON will be attempted. JSON usually uses camelCase and this API does so as well. Therefore we need to tell the (de-)serializer to use camelCase as well using JsonSerializerOptions.
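Applied to the JSON shown in the question, the same approach works once you model the actual response shape. A minimal sketch (the record and method names are assumptions inferred from the question's JSON; since that JSON is PascalCase, the default case-sensitive matching already fits and no naming policy is needed):

using System.Linq;
using System.Text.Json;

// Shapes inferred from the JSON in the question; names are assumptions.
public record ClientRecord(long Id, string CurrencyId, string FirstName, string LastName, string MiddleName, string Login);
public record DataEnvelope(int Count, ClientRecord[] Objects);
public record ApiResponse(bool HasError, string AlertType, string AlertMessage, string[] ModelErrors, DataEnvelope Data);

public static class ResponseParsing
{
    public static string GetFirstLogin(string res)
    {
        // Deserialize into the typed model instead of object, then drill into the fields you need.
        var parsed = JsonSerializer.Deserialize<ApiResponse>(res);
        return parsed?.Data?.Objects?.FirstOrDefault()?.Login;
    }
}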
I am using Microsoft.TeamFoundation.Build.WebApi, Microsoft.TeamFoundation.Core.WebApi, Microsoft.VisualStudio.Services.Common, and Microsoft.VisualStudio.Services.WebApi
to trigger pipelines from my ASP.NET web app.
I can trigger pipelines and get work items and definitions, but I am struggling to set a scheduled trigger for a pipeline using buildClient.QueueBuildAsync(build).
I tried using the abstract class BuildTrigger in Microsoft.TeamFoundation.Build.WebApi, which exposes the DefinitionTriggerType enum property (Schedule has the value 8), but how can I apply it to my pipeline before queuing?
This is how I run pipeline:
var credential = new VssBasicCredential(string.Empty, patToken);
var connection = new VssConnection(new Uri("https://dev.azure.com/myteam5468/"), credential);

var buildClient = connection.GetClient<BuildHttpClient>();
var projectClient = connection.GetClient<ProjectHttpClient>();

projectClient.GetProjects();
var project = projectClient.GetProject(projectName).Result;
Console.WriteLine(project.Name);

var definition = buildClient.GetDefinitionAsync(projectName, userSelectedPipelineID).Result;
Console.WriteLine(definition.Name);

var build = new Build()
{
    Definition = definition,
    Project = project,
};

// Runs the pipeline
buildClient.QueueBuildAsync(build).Wait();
Console.WriteLine(string.Format("Pipeline {0} started running successfully", definition.Name));
How can we define scheduled triggers using TFS libraries?
Update 1:
buildClient.UpdateDefinitionAsync(definition.Triggers[0].TriggerType = DefinitionTriggerType.Schedule);
The code above, as per Daniel's comment, passes the definition to update the existing definition in the project, but I am unable to set the TriggerType because it is read-only.
Update 2:
//Create Trigger
ScheduleTrigger trigger = new ScheduleTrigger();
//Create Schedule
Schedule schedule = new Schedule();
schedule.StartHours = 10;
schedule.StartMinutes = 00;
//Add schedule to trigger in a list
trigger.Schedules = new List<Schedule>();
trigger.Schedules.Add(schedule);
The code above uses TeamFoundation.Build.WebApi to create a trigger, but according to the latest comment we can't set triggers using the same TFS SDK.
What is the REST API structure for creating and scheduling triggers in the DevOps repository?
Thanks in advance
Hi, I checked the doc for the BuildDefinition.Triggers property, and it shows that the triggers can only be listed; the property has no setter.
Property value: List<BuildTrigger>
I also checked this SDK in Visual Studio, and the triggers can only be listed.
So I suppose you could use the REST API to customize your pipeline's trigger property in C#. I captured the REST call made by the UI and found the following:
PUT https://dev.azure.com/<org>/<project>/_apis/build/definitions/<pipelineID>?api-version=5.0-preview.6
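A rough sketch of driving that endpoint from C#: read the existing definition, modify its "triggers" array (see the captured request body further below), and PUT the whole document back. The organization, project, pipeline id, PAT, and api-version here are placeholders, and the JSON modification step is left as an assumption.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class UpdateDefinitionSketch
{
    static async Task Main()
    {
        // Placeholders: organization, project, pipeline id and PAT.
        var definitionUrl = "https://dev.azure.com/{org}/{project}/_apis/build/definitions/{pipelineId}?api-version=5.0-preview.6";
        var pat = "{PAT}";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

        // 1. Read the current definition JSON (it already contains revision, repository, etc.).
        var definitionJson = await client.GetStringAsync(definitionUrl);

        // 2. Edit the "triggers" array in that JSON (e.g. with JObject/JsonNode),
        //    then PUT the whole document back.
        var modifiedJson = definitionJson; // placeholder for the actual modification

        var response = await client.PutAsync(
            definitionUrl,
            new StringContent(modifiedJson, Encoding.UTF8, "application/json"));
        Console.WriteLine(response.StatusCode);
    }
}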
===============================================================
Update 1/6
During further investigation into setting the schedule trigger with the REST API, I found that modifying the request body is complicated, especially setting the schedule days.
The schedule days in the request body are not expressed as "Monday, Tuesday ... Sunday"; instead the daysToBuild parameter is a numeric bit flag, e.g. "only Monday" is 1, "only Friday" is 16, and "both Saturday and Sunday" is 96 (see the sketch after the request body below).
Request body:
"triggers": [
{
"branchFilters": [],
"pathFilters": [],
"settingsSourceType": 2,
"batchChanges": false,
"maxConcurrentBuildsPerBranch": 1,
"triggerType": "continuousIntegration"
},
{
"schedules": [
{
"branchFilters": [
"+refs/heads/main"
],
"timeZoneId": "",
"startHours": 6,
"startMinutes": 0,
"daysToBuild": 31,
"scheduleOnlyWithChanges": true
},
{
"branchFilters": [
"+refs/heads/1"
],
"timeZoneId": "",
"startHours": 3,
"startMinutes": 0,
"daysToBuild": "saturday",
"scheduleOnlyWithChanges": true
}
],
"triggerType": "schedule"
}
],
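For reference, those daysToBuild numbers behave like one bit per weekday, so combined days are sums of the individual values. A small sketch of the mapping (the enum name here is an assumption; the values simply mirror the numbers observed above, and the client library exposes a similar ScheduleDays flags enum):

using System;

// Assumed mapping: each weekday is one bit, so combinations are sums of these values.
[Flags]
enum ScheduleDays
{
    None      = 0,
    Monday    = 1,
    Tuesday   = 2,
    Wednesday = 4,
    Thursday  = 8,
    Friday    = 16,
    Saturday  = 32,
    Sunday    = 64,
    All       = 127
}

class DaysToBuildDemo
{
    static void Main()
    {
        Console.WriteLine((int)(ScheduleDays.Monday | ScheduleDays.Tuesday | ScheduleDays.Wednesday
                                | ScheduleDays.Thursday | ScheduleDays.Friday)); // 31 (Mon-Fri)
        Console.WriteLine((int)(ScheduleDays.Saturday | ScheduleDays.Sunday));   // 96 (weekend)
        Console.WriteLine((int)ScheduleDays.Friday);                             // 16
    }
}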
So I suggest that you set the schedule trigger via the Azure DevOps UI directly, for both classic and YAML pipelines. A YAML pipeline also has a cron definition.
There is also another method to schedule the pipeline programmatically.
You could create a console app that calls the REST API, like below:
Run a pipeline with REST API
Queue a build with REST API
using RestSharp;
using System;
using System.Text;

namespace ConsoleApp18
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");

            var url = "https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=7.0";
            string pat = "{PAT}";

            var client = new RestClient(url);
            var request = new RestRequest(url, Method.Post);

            // Basic auth with a PAT: empty user name, PAT as password, Base64-encoded
            request.AddHeader("Authorization",
                "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat)));
            request.AddHeader("Content-Type", "application/json");

            // Empty body: the run uses the pipeline's default branch and settings
            var body = "{}";
            request.AddParameter("application/json", body, ParameterType.RequestBody);

            var response = client.Execute(request);
            Console.WriteLine(response.Content);
        }
    }
}
After building the app, you could set up a task schedule (e.g. Windows Task Scheduler) for this application. It also supports some special schedules; in my test, I set "on Monday and Tuesday every 3 weeks".
You set schedules on the build definition itself. You don't set a schedule on a queued instance of the build. You'd use the UpdateDefinitionAsync method in the API libraries to accomplish that.
Or if you're using YAML, you define the schedule in the YAML.
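Building on the question's Update 2, a minimal sketch of that approach with the client library is shown below. It assumes the Triggers list on the fetched definition can be appended to, the branch filter value is an example, and the exact UpdateDefinitionAsync overload may differ between package versions.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.TeamFoundation.Build.WebApi;

static class SchedulePipeline
{
    // Appends a schedule trigger to an existing definition and saves it back.
    public static async Task AddScheduleAsync(BuildHttpClient buildClient, string projectName, int pipelineId)
    {
        var definition = await buildClient.GetDefinitionAsync(projectName, pipelineId);

        var schedule = new Schedule
        {
            DaysToBuild = ScheduleDays.Monday | ScheduleDays.Friday, // example days
            StartHours = 10,
            StartMinutes = 0
        };
        schedule.BranchFilters.Add("+refs/heads/main"); // branch filter value is an assumption

        var trigger = new ScheduleTrigger
        {
            Schedules = new List<Schedule> { schedule }
        };
        definition.Triggers.Add(trigger);

        // Saves the modified definition; this is the UpdateDefinitionAsync call mentioned above.
        var updated = await buildClient.UpdateDefinitionAsync(definition, projectName, definition.Id);
        Console.WriteLine($"Definition {updated.Name} updated to revision {updated.Revision}.");
    }
}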
As per the Microsoft community, this feature is not yet possible with the TFS .NET API.
I am currently using SendGrid's Dynamic Template. I just created a new version of a template that I would like to use in development. I couldn't find any documentation on how to do that.
I am using SendGrid's C# library, this is what I have so far.
var dynamicTemplateData = new Dictionary<string, string>
{
{"name", $"{user.FirstName}"},
{ "date", appointment.AppointmentDateTime.ToLocationDateTime(location.TimeZoneInformation).ToString("f") },
{"location", $"{location.Address} {location.Address2}. {location.City}, {location.State}"},
{"receipt_url", $"{_appSettings.SiteUrl}appointment/{appointment.Id}"},
{"order_id", $"{appointment.ExternalOrderId}"}
};
var msg = MailHelper.CreateSingleTemplateEmail(
new EmailAddress("mail#xxxx.com", "xxxx"),
new EmailAddress(user.EMail),
"d-adwadadadadadadadadadadad",
dynamicTemplateData);
I tried, but didn't find a way. So what I tried is to get a version using the template API endpoint, replace the dynamic data, and send an email.
You can get a template version by:
var response = await client.RequestAsync(method: SendGridClient.Method.GET, urlPath: "templates/" + templateId + "/versions/" + version);
The version ID will be in the URL when you browse the template version.
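A rough sketch of that workaround with the SendGrid client is below. The template and version IDs are placeholders, the response is read with Newtonsoft's JObject, and it assumes the versions endpoint returns subject, html_content, and plain_content fields; how you substitute the dynamic data is up to you.

using System.Threading.Tasks;
using Newtonsoft.Json.Linq;
using SendGrid;
using SendGrid.Helpers.Mail;

static class TemplateVersionWorkaround
{
    public static async Task SendSpecificVersionAsync(string apiKey, string templateId, string versionId, EmailAddress from, EmailAddress to)
    {
        var client = new SendGridClient(apiKey);

        // Fetch the specific template version (not just the active one).
        var response = await client.RequestAsync(
            method: SendGridClient.Method.GET,
            urlPath: $"templates/{templateId}/versions/{versionId}");

        var version = JObject.Parse(await response.Body.ReadAsStringAsync());
        var subject = (string)version["subject"];
        var html = (string)version["html_content"];
        var plain = (string)version["plain_content"];

        // Substitute the dynamic data yourself before sending, e.g. html = html.Replace("{{name}}", firstName);

        var msg = MailHelper.CreateSingleEmail(from, to, subject, plain, html);
        await client.SendEmailAsync(msg);
    }
}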
I currently have an Azure Function that I would like to have update a QnAMaker knowledge base every day or so. Currently everything is connected and working fine; however, I can only send QnA objects (QnA pairs) and not URLs to files on a website of mine. So in the example I provided below, while it should populate the KB with two questions and the file from the URL, it only populates the questions.
Currently this is not giving me any kind of error; in fact, the response code from my call to the KB comes back as 204. So it is getting through, but still not adding the file to the KB as it should.
NOTE: The file being imported in this example (alice-I.html) is a random one for this demonstration (not mine, for security), but the issue is the same. If I add this file directly to QnAMaker from the KB site itself it works fine, but it won't update from the Azure Function code.
Any insights into what is happening would be great.
Content Being Sent To Knowledge Base
string replace_kb = #"{
'qnaList': [
{
'id': 0,
'answer': 'A-1',
'source': 'Custom Editorial',
'questions': [
'Q-1'
],
'metadata': []
},
{
'id': 1,
'answer': 'A-2',
'source': 'Custom Editorial',
'questions': [
'Q-2'
],
'metadata': [
{
'name': 'category',
'value': 'api'
}
]
}
],
'files': [
{
'fileName': 'alice-I.html',
'fileUri': 'https://www.cs.cmu.edu/~rgs/alice-I.html'
}
]
}";
Code Sending Content To Knowledge Base
using (var clientF = new HttpClient())
using (var requestF = new HttpRequestMessage())
{
requestF.Method = HttpMethod.Put;
requestF.RequestUri = new Uri(<your-uri>);
requestF.Content = new StringContent(replace_kb, Encoding.UTF8, "application/json");
requestF.Headers.Add("Ocp-Apim-Subscription-Key", <your-key>);
var responseF = await clientF.SendAsync(requestF);
if (responseF.IsSuccessStatusCode)
{
log.LogInformation("{'result' : 'Success.'}");
log.LogInformation($"------------>{responseF}");
}
else
{
await responseF.Content.ReadAsStringAsync();
log.LogInformation($"------------>{responseF}");
}
}
So I still don't know how to get the above working, but I got it to work a different way. Basically I used the UpdateKbOperationDTO Class listed here: class
This still isn't the perfect solution, but it allows me to update my KB with files using code instead of the interface.
Below is my new code:
QnAMakerClient qnaC = new QnAMakerClient(new ApiKeyServiceClientCredentials(<subscription-key>)) { Endpoint = "https://<your-custom-domain>.cognitiveservices.azure.com"};
log.LogInformation("Delete-->Start");
List<string> toDelete = new List<string>();
toDelete.Add("<my-file>");
var updateDelete = await qnaC.Knowledgebase.UpdateAsync(kbId, new UpdateKbOperationDTO
{
// Create JSON of changes ///
Add = null,
Update = null,
Delete = new UpdateKbOperationDTODelete(null, toDelete)
});
log.LogInformation("Delete-->Done");
log.LogInformation("Add-->Start");
List<FileDTO> toAdd = new List<FileDTO>();
toAdd.Add(new FileDTO("<my-file>", "<url-to-file>"));
var updateAdd = await qnaC.Knowledgebase.UpdateAsync(kbId, new UpdateKbOperationDTO
{
// Create JSON of changes ///
Add = new UpdateKbOperationDTOAdd(null, null, toAdd),
Update = null,
Delete = null
});
log.LogInformation("Add-->Done");
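One thing to keep in mind with this approach: Knowledgebase.UpdateAsync returns a long-running operation, and the changes only show up in production after publishing. A hedged sketch of polling and publishing, assuming the same qnaC client, kbId, log, and updateAdd variables as above (operation state names follow the QnAMaker SDK):

// Poll the update operation until it finishes, then publish the KB so the change goes live.
var operation = updateAdd;
while (operation.OperationState == OperationStateType.NotStarted
       || operation.OperationState == OperationStateType.Running)
{
    await Task.Delay(5000);
    operation = await qnaC.Operations.GetDetailsAsync(operation.OperationId);
}
log.LogInformation($"Update finished with state: {operation.OperationState}");

// Publishing pushes the updated KB from test to production.
await qnaC.Knowledgebase.PublishAsync(kbId);
log.LogInformation("Publish-->Done");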
I am trying to post data to a list on SharePoint Online with the C# HttpClient. This is my code:
using (var client = new SPHttpClient(webUri, userName, password))
{
    var listTitle = "HttpClientList";
    var itemPayload = new
    {
        __metadata = new
        {
            type = "SP.Data.HttpClientListListItem"
        },
        Title = "test3",      // column name "Title"
        _x0071_cr5 = "value3" // column name "Value"
    };
    var endpointUrl = string.Format("{0}/_api/web/lists/getbytitle('{1}')/items", webUri, listTitle);
    var data = client.ExecuteJson(endpointUrl, HttpMethod.Post, itemPayload);
    Console.WriteLine("Task item '{0}' has been created", data["d"]["Title"]);
    Console.ReadLine();
}
As of right now, I am getting status "400, BadRequest". My guess is that I am missing something or feeding the post request the wrong data.
I have been following this blogpost Blogpost
I can only execute the verb GET. All other verbs give me "400, BadRequest".
Solved it with various changes:
Added the right ListItem type, HttpClientListListItem.
Created all columns through the List Settings panel.
Replaced client.ExecuteJson() with client.ExecuteJson(endpointUrl, HttpMethod.Post, headers, default(string)).
Thanks for all the help!
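For reference, here is a hedged sketch of the same POST with a plain HttpClient instead of the blog post's SPHttpClient wrapper. The key points are the odata=verbose Accept/Content-Type headers, internal column names that match the list schema, and the X-RequestDigest form digest (obtained from /_api/contextinfo); it assumes the message handler already carries valid SharePoint Online credentials/cookies.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

static class SharePointListPost
{
    // webUri e.g. "https://tenant.sharepoint.com/sites/dev"; the handler must already carry
    // valid SharePoint Online credentials/cookies.
    public static async Task CreateItemAsync(HttpMessageHandler handler, string webUri, string formDigest)
    {
        using var client = new HttpClient(handler);

        // Internal column names ("Title", "_x0071_cr5") must match the list schema exactly;
        // a mismatch is a common cause of "400 Bad Request".
        var json = @"{ ""__metadata"": { ""type"": ""SP.Data.HttpClientListListItem"" }, ""Title"": ""test3"", ""_x0071_cr5"": ""value3"" }";

        var request = new HttpRequestMessage(
            HttpMethod.Post,
            $"{webUri}/_api/web/lists/getbytitle('HttpClientList')/items")
        {
            Content = new StringContent(json, Encoding.UTF8)
        };
        request.Headers.Accept.Add(MediaTypeWithQualityHeaderValue.Parse("application/json;odata=verbose"));
        request.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json;odata=verbose");

        // Writes require a form digest from /_api/contextinfo; without it POST/PUT/DELETE fail.
        request.Headers.Add("X-RequestDigest", formDigest);

        var response = await client.SendAsync(request);
        Console.WriteLine($"{(int)response.StatusCode} {response.StatusCode}");
    }
}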