Deleted tasks are not being retrieved using a query - .NET REST API - C#

I just deleted a task using the Rally website, but when I search for the task using the REST API it isn't returned. I assumed it would be returned with the flag "Recycled".
Can anybody help me?
Regards,
Paulo

This is an inconsistency in the WSAPI. Unfortunately all queries are implicitly scoped (Recycled = false) so nothing that has been deleted will ever be returned from the artifact endpoints. There is also no way to access the contents of the recycle bin through the WSAPI.
I would encourage you to vote for the idea for this functionality at https://ideas.rallydev.com/ideas/D2374.

Although it's not ideal, you can get to the Recycle Bin through this REST endpoint:
https://rally1.rallydev.com/slm/webservice/1.40/recyclebin.js?workspace=/workspace/12345678910&project=/project/12345678911
where the long integers are the Workspace and Project OIDs of interest.
Recycle bin entries look like the following:
{
_rallyAPIMajor: "1",
_rallyAPIMinor: "40",
_ref: "https://rally1.rallydev.com/slm/webservice/1.40/recyclebinentry/12345678910.js",
_refObjectName: "Test Case 3: Load in, run Analysis on Integer Grids",
_type: "RecycleBinEntry"
}
The Recycle Bin OID is unique and different from the OID of the artifact that was deleted, so there is no good way to map a Recycle Bin Entry back to the artifact whose deletion created it. Matching on the Object Name could work, although you run the risk of duplicates. Recycle Bin Entries also come with the same limitations as the Recycle Bin in the UI: child objects are not shown or accessible.
If you want to walk the Recycle Bin from .NET, here's a quick example:
using System;
using Rally.RestApi;

namespace RestExample_QueryRecycleBin
{
    class Program
    {
        static void Main(string[] args)
        {
            // Rally credentials
            String userName = "user@company.com";
            String userPassword = "topsecret";

            // Set Rally parameters
            String rallyURL = "https://rally1.rallydev.com";
            String rallyWSAPIVersion = "1.40";

            // Initialize the REST API
            RallyRestApi restApi = new RallyRestApi(userName,
                                                    userPassword,
                                                    rallyURL,
                                                    rallyWSAPIVersion);

            // Specify workspace and project
            string myWorkspace = "/workspace/12345678910";
            string myProject = "/project/12345678911";

            // Query for recycle bin entries
            Request request = new Request("recyclebinentry");
            request.Workspace = myWorkspace;
            request.Project = myProject;

            QueryResult queryResult = restApi.Query(request);
            foreach (var result in queryResult.Results)
            {
                // Process item
                string itemName = result["_refObjectName"];
                string itemRef = result["_ref"];
                Console.WriteLine(itemRef + ", " + itemName);
            }
            Console.ReadKey();
        }
    }
}


Google DriveService Files.List() not returning results

Edit:
I've tried granting the SA access to my personal drive (within the organization's Workspace) to do some troubleshooting. After granting the SA rights to a particular folder and rewriting the code to examine that folder, it successfully returned information about the files within the test folder. The conclusion is that the SA has been set up correctly by our IT department and does have adequate scope and rights to read files in our organization's Workspace. So the questions remain: why can't it return information about files in a Shared Drive? What other parameters need to be set in order to get it to return those files? Are there entirely other functions that need to be used? I did notice the deprecated TeamDrives.List() function, but the guidance when trying to use it was to use Files.List() as I had written originally.
--- end edit ---
We have a Google Workspace environment. I've been granted a Service Account (SA) by our IT department and am trying to use it to help maintain access rights. The SA has been granted Content Manager rights to a shared drive instance.
I've tried following along with this YouTube tutorial. Stepping through the code execution, it appears to log in correctly, but it does not return any files. I've tried substituting the full URL for the file ID of the root folder I'd like to examine, but then it returns a 404 error, so I think it is finding the correct folder when the file ID is used.
With the file ID, the code runs without errors; it simply returns no files (and there are hundreds of folders and files within the root).
Any suggestions?
using System;
using System.Threading.Tasks;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Drive.v3;
using Google.Apis.Drive.v3.Data;
using Google.Apis.Services;

namespace DriveQuickstart
{
    class Program
    {
        static string[] Scopes = { DriveService.Scope.DriveReadonly };
        private const string PathToServiceAccountKeyFile = @"<path to JSON Service Account key file>";
        private const string ServiceAccountEmail = @"<Service Account email>";

        static void Main(string[] args)
        {
            MainAsync().Wait();
        }

        static async Task MainAsync()
        {
            var credential = GoogleCredential.FromFile(PathToServiceAccountKeyFile)
                .CreateScoped(new[] { DriveService.ScopeConstants.Drive });

            var service = new DriveService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credential
            });

            var request = service.Files.List();
            request.IncludeItemsFromAllDrives = true;
            request.SupportsAllDrives = true;
            request.Q = "parents in '<id of root folder in shared drive>'";

            FileList results = await request.ExecuteAsync();
            foreach (var driveFile in results.Files)
            {
                Console.WriteLine($"{driveFile.Name} {driveFile.MimeType} {driveFile.Id}");
            }
        }
    }
}
OK, it appears the @DAIMTO example is specific to personal drives. The Q parameter syntax in the example is incorrect for Team (shared) drives. To make it work in a Team environment:
IncludeItemsFromAllDrives parameter must be set to true
SupportsAllDrives parameter must be set to true
the Q search parameter syntax for finding specific directories is:
Q = "'folder_ID' in parents and mimeType = 'application/vnd.google-apps.folder'"; -- or mimeType of your choice
(note: this is reversed from the YouTube example of "parents in 'folder_ID'")
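Putting those changes together, a rough sketch of the corrected call, meant to drop into the MainAsync method from the question (the folder ID is a placeholder; the credential and service setup stay the same):
// Corrected query for a shared drive folder; <folder-id> is a placeholder.
var request = service.Files.List();
request.IncludeItemsFromAllDrives = true;  // required to see shared drive content
request.SupportsAllDrives = true;          // required to see shared drive content
request.Q = "'<folder-id>' in parents";    // folder ID first, then "in parents"
request.Fields = "nextPageToken, files(id, name, mimeType)";

// Page through the results; a large folder will span more than one page.
do
{
    FileList page = await request.ExecuteAsync();
    foreach (var file in page.Files)
    {
        Console.WriteLine($"{file.Name} {file.MimeType} {file.Id}");
    }
    request.PageToken = page.NextPageToken;
} while (!string.IsNullOrEmpty(request.PageToken));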

Get folder hierarchy with Google Drive API [C# / .NET]

I am looking for an elegant way to get the folder hierarchy, beginning with my root folder, using the C# Google Drive API V3.
Currently, you can get the root folder and its parents by
var getRequest = driveService.Files.Get("root");
getRequest.Fields = "parents";
var file = getRequest.Execute();
but I am looking for a way to get the children, not the parents, so I can recursively go down the file structure.
Setting getRequest.Fields = 'children' is not a valid field option.
Recursively fetching children is a very time-consuming way to fetch the full hierarchy. Much better is to run a query that fetches all folders in a single GET (well, it might take more than one if you have more than 1,000 folders) and then traverse their parent properties to build up the hierarchy in memory. Bear in mind that (afaik) there is nothing that prevents a folder hierarchy from being cyclic, i.e. folder1 owns folder2 owns folder3 owns folder1, so whichever strategy you follow, check that you aren't in a loop.
If you're new to GDrive, it's important to realise early on that Folders are simply labels, rather than containers. So cyclic relationships and files with multiple parents is quite normal. They were originally called Collections, but got renamed to Folders to appease those members of the community that couldn't get their head around labels.
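As a rough illustration of that single-query strategy (this sketch uses the v3 API rather than the v2 calls in the answer that follows, and assumes an authenticated DriveService named service plus the usual System.Collections.Generic usings):
// Fetch every folder once, then build the tree in memory.
var folders = new Dictionary<string, Google.Apis.Drive.v3.Data.File>();
var listRequest = service.Files.List();
listRequest.Q = "mimeType = 'application/vnd.google-apps.folder' and trashed = false";
listRequest.Fields = "nextPageToken, files(id, name, parents)";
listRequest.PageSize = 1000;

do
{
    var page = listRequest.Execute();
    foreach (var f in page.Files)
    {
        folders[f.Id] = f;
    }
    listRequest.PageToken = page.NextPageToken;
} while (!string.IsNullOrEmpty(listRequest.PageToken));

// Build a path for one folder by walking parent links; the HashSet guards against cycles.
string PathOf(string folderId)
{
    var seen = new HashSet<string>();
    var parts = new List<string>();
    while (folderId != null && folders.TryGetValue(folderId, out var f) && seen.Add(folderId))
    {
        parts.Insert(0, f.Name);
        folderId = (f.Parents != null && f.Parents.Count > 0) ? f.Parents[0] : null;
    }
    return string.Join("/", parts);
}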
I hope this is the answer you were looking for. getHierarchy recursively digs through Google Drive and stores the file titles in a text file.
public System.IO.StreamWriter w = new System.IO.StreamWriter("Hierarchy.txt", false);
string intend = "     "; // 5-space indent per level

private void getHierarchy(Google.Apis.Drive.v2.Data.File Res, DriveService driveService)
{
    if (Res.MimeType == "application/vnd.google-apps.folder")
    {
        w.Write(intend + Res.Title + " :" + Environment.NewLine);
        intend += "     ";
        foreach (var res in ResFromFolder(driveService, Res.Id).ToList())
            getHierarchy(res, driveService);
        intend = intend.Remove(intend.Length - 5);
    }
    else
    {
        w.Write(intend + Res.Title + Environment.NewLine);
    }
}
You can call the function something like:
w.Write("My Drive:" + Environment.NewLine);
foreach (var Res in ResFromFolder(driveService, "root").ToList())
getHierarchy(Res, driveService);
w.Close();
Here, root can be replaced with the ID of any directory to get its structure. As written, this will generate the entire Drive's structure.
The ResFromFolder method returns a list of Google.Apis.Drive.v2.Data.File metadata contained in a directory.
public List<Google.Apis.Drive.v2.Data.File> ResFromFolder(DriveService service, string folderId)
{
    var request = service.Children.List(folderId);
    request.MaxResults = 1000;

    List<Google.Apis.Drive.v2.Data.File> TList = new List<Google.Apis.Drive.v2.Data.File>();
    do
    {
        var children = request.Execute();
        foreach (ChildReference child in children.Items)
        {
            TList.Add(service.Files.Get(child.Id).Execute());
        }
        request.PageToken = children.NextPageToken;
    } while (!String.IsNullOrEmpty(request.PageToken));

    return TList;
}
This code writes an indented tree of the folder and file titles to Hierarchy.txt.
However as pinoyyid mentioned, it does consume a good deal of time if Drive contains a large number of files and folders.
Here is another approach, using the Drive v3 API to list the sub-folders of a given folder:
Google.Apis.Drive.v3.DriveService service = GetService();
List<GoogleDriveFile> folderList = new List<GoogleDriveFile>();

Google.Apis.Drive.v3.FilesResource.ListRequest request = service.Files.List();
// https://developers.google.com/drive/api/v3/search-shareddrives
request.Q = string.Format("mimeType='application/vnd.google-apps.folder' and '{0}' in parents", folderId);
request.Fields = "files(id, name, size, version, createdTime, parents)"; // request the fields used below

Google.Apis.Drive.v3.Data.FileList result = request.Execute();
foreach (var file in result.Files)
{
    GoogleDriveFile googleDriveFile = new GoogleDriveFile
    {
        Id = file.Id,
        Name = file.Name,
        Size = file.Size,
        Version = file.Version,
        CreatedTime = file.CreatedTime,
        Parents = file.Parents
    };
    folderList.Add(googleDriveFile);
}
return folderList;

How do I get build details in a custom workflow activity?

I need to add a custom activity to the default workflow template to increase assembly versions at the earliest point possible in the build process.
What I would like to achieve is to create and map the exact same workspace (that will be created further down in the workflow) inside my custom activity, so that I can check out an XML file, increase the version number held within it, write it back to the XML file and check the XML file back in.
I'm aware that this workspace will be created later on in the workflow, but that will be too late in the build process for what I'm trying to achieve, so rather than moving any of the existing activities, or duplicating them in a position above my custom activity, I would like to create the workspace myself (this should be OK, as the workspace will be deleted and recreated again later).
I think the details I need are the BuildDirectory, WorkspaceName and SourcesDirectory. Can anyone tell me how to achieve the creation of the workspace, or how to obtain this data in code?
The build will be carried out on a build server, and I am using TFS 2010 and C#.
Thanks in advance
I followed the series of blog articles by Ewald Hofman as a primer and created a custom activity that does the check-out, update and check-in of a GlobalAssemblyInfo file from which I parse the current version. My task is inserted at the top of "Update Drop Location", which is right after the "Get the build" portion of the workflow. It just requires the IBuildDetail and a file mask as arguments, from which you can pull out the VersionControlServer to be able to access TFS. My code is below:
protected override string Execute(CodeActivityContext context)
{
    // Obtain the runtime value of the input arguments.
    string assemblyInfoFileMask = context.GetValue(AssemblyInfoFileMask);
    IBuildDetail buildDetail = context.GetValue(BuildDetail);

    var workspace = buildDetail.BuildDefinition.Workspace;
    var versionControl = buildDetail.BuildServer.TeamProjectCollection.GetService<VersionControlServer>();

    Regex regex = new Regex(AttributeKey + VersionRegex);

    // Iterate over the folder mappings in the workspace and find the AssemblyInfo files
    // that match the mask.
    foreach (var folder in workspace.Mappings)
    {
        string path = Path.Combine(folder.ServerItem, assemblyInfoFileMask);
        context.TrackBuildMessage(string.Format("Checking for file: {0}", path));

        ItemSet itemSet = versionControl.GetItems(path, RecursionType.Full);
        foreach (Item item in itemSet.Items)
        {
            context.TrackBuildMessage(string.Format("Download {0}", item.ServerItem));
            string localFile = Path.GetTempFileName();
            try
            {
                // Download the file and try to extract the version.
                item.DownloadFile(localFile);
                string text = File.ReadAllText(localFile);
                Match match = regex.Match(text);
                if (match.Success)
                {
                    string versionNumber = match.Value.Substring(AttributeKey.Length + 2, match.Value.Length - AttributeKey.Length - 4);
                    Version version = new Version(versionNumber);
                    Version newVersion = new Version(version.Major, version.Minor, version.Build + 1, version.Revision);
                    context.TrackBuildMessage(string.Format("Version found {0}", newVersion));
                    return newVersion.ToString();
                }
            }
            finally
            {
                File.Delete(localFile);
            }
        }
    }
    return null;
}
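The snippet above only reads and reports the version; the check-out, update and check-in that the question describes needs a workspace mapping to pend the edit against. A rough sketch of that part (TFS 2010 client object model; the local paths, the replacement string and the check-in comment are placeholders, not the code from this activity):
// Hedged sketch: pend an edit on a version-controlled file, rewrite it, and check it in.
// Assumes `versionControl` is the VersionControlServer obtained above and that a
// workspace mapping already covers the file on this machine.
Workspace ws = versionControl.GetWorkspace(@"C:\Builds\1\Sources");   // placeholder local path
string localPath = @"C:\Builds\1\Sources\GlobalAssemblyInfo.cs";      // placeholder

ws.Get(new GetRequest(new ItemSpec(localPath, RecursionType.None), VersionSpec.Latest), GetOptions.None);
ws.PendEdit(localPath);

string contents = File.ReadAllText(localPath);
File.WriteAllText(localPath, contents.Replace("1.0.0.0", "1.0.1.0")); // placeholder bump; use the regex above in practice

PendingChange[] changes = ws.GetPendingChanges(localPath);
if (changes.Length > 0)
{
    ws.CheckIn(changes, "Automated version increment");               // placeholder comment
}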

How do you get the latest version of source code using the Team Foundation Server SDK?

I'm attempting to pull the latest version of source code out of TFS programmatically using the SDK, and what I've done somehow does not work:
string workspaceName = "MyWorkspace";
string projectPath = "/TestApp";
string workingDirectory = @"C:\Projects\Test\TestApp";

VersionControlServer sourceControl; // actually instantiated before this method...
Workspace[] workspaces = sourceControl.QueryWorkspaces(workspaceName, sourceControl.AuthenticatedUser, Workstation.Current.Name);
if (workspaces.Length > 0)
{
    sourceControl.DeleteWorkspace(workspaceName, sourceControl.AuthenticatedUser);
}

Workspace workspace = sourceControl.CreateWorkspace(workspaceName, sourceControl.AuthenticatedUser, "Temporary Workspace");
try
{
    workspace.Map(projectPath, workingDirectory);
    GetRequest request = new GetRequest(new ItemSpec(projectPath, RecursionType.Full), VersionSpec.Latest);
    GetStatus status = workspace.Get(request, GetOptions.GetAll | GetOptions.Overwrite); // this line doesn't do anything - no failures or errors
}
finally
{
    if (workspace != null)
    {
        workspace.Delete();
    }
}
The approach is basically creating a temporary workspace, using the Get() method to grab all the items for this project, and then removing the workspace. Is this the correct way to do this? Any examples would be helpful.
I ended up using a different approach that seems to work, mainly taking advantage of the Item.DownloadFile() method:
VersionControlServer sourceControl; // actually instantiated...
ItemSet items = sourceControl.GetItems(sourcePath, VersionSpec.Latest, RecursionType.Full);

foreach (Item item in items.Items)
{
    // build relative path
    string relativePath = BuildRelativePath(sourcePath, item.ServerItem);

    switch (item.ItemType)
    {
        case ItemType.Any:
            throw new ArgumentOutOfRangeException("ItemType returned was Any; expected File or Folder.");
        case ItemType.File:
            item.DownloadFile(Path.Combine(targetPath, relativePath));
            break;
        case ItemType.Folder:
            Directory.CreateDirectory(Path.Combine(targetPath, relativePath));
            break;
    }
}
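The snippet above calls a BuildRelativePath helper that isn't shown; a minimal sketch of what it might look like (assuming sourcePath is the server path that was queried, e.g. "$/TestApp"):
// Hedged sketch: strip the queried server path from the front of the item's server path
// and convert the remainder into a local-style relative path.
private static string BuildRelativePath(string sourcePath, string serverItem)
{
    string relative = serverItem.StartsWith(sourcePath, StringComparison.OrdinalIgnoreCase)
        ? serverItem.Substring(sourcePath.Length)
        : serverItem;
    return relative.TrimStart('/').Replace('/', Path.DirectorySeparatorChar);
}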
I completed and implemented the code behind a button in an ASP.NET web solution.
For the project to work, the Microsoft.TeamFoundation.Client and Microsoft.TeamFoundation.VersionControl.Client references should be added, along with the statements using Microsoft.TeamFoundation.Client; and using Microsoft.TeamFoundation.VersionControl.Client; in the code.
protected void Button1_Click(object sender, EventArgs e)
{
    string workspaceName = "MyWorkspace";
    string projectPath = @"$/TeamProject"; // the team project (like a table in SQL, or a folder) containing the project sources, inside a collection (like a database in SQL, or also a folder) on TFS
    string workingDirectory = @"D:\New1";  // local folder where to save the project sources

    // TFS server URL including the collection name -- CollectionName is the name of an existing collection on the TFS server
    TeamFoundationServer tfs = new TeamFoundationServer("http://test-server:8080/tfs/CollectionName", System.Net.CredentialCache.DefaultCredentials);
    tfs.EnsureAuthenticated();

    VersionControlServer sourceControl = (VersionControlServer)tfs.GetService(typeof(VersionControlServer));

    Workspace[] workspaces = sourceControl.QueryWorkspaces(workspaceName, sourceControl.AuthenticatedUser, Workstation.Current.Name);
    if (workspaces.Length > 0)
    {
        sourceControl.DeleteWorkspace(workspaceName, sourceControl.AuthenticatedUser);
    }

    Workspace workspace = sourceControl.CreateWorkspace(workspaceName, sourceControl.AuthenticatedUser, "Temporary Workspace");
    try
    {
        workspace.Map(projectPath, workingDirectory);
        GetRequest request = new GetRequest(new ItemSpec(projectPath, RecursionType.Full), VersionSpec.Latest);
        GetStatus status = workspace.Get(request, GetOptions.GetAll | GetOptions.Overwrite);
    }
    finally
    {
        if (workspace != null)
        {
            workspace.Delete();
            Label1.Text = "The projects have been brought into the folder " + workingDirectory;
        }
    }
}
Your approach is valid.
Your error is in your project path. Use something like this instead:
string projectPath = "$/PathToApp/TestApp";
I agree with Joerage that your server path is probably the culprit. To get more insight into what's happening, you need to wire up some events on the VersionControlServer object. At minimum you'll want Getting, NonFatalError, and Conflict.
Complete list: http://msdn.microsoft.com/en-us/library/microsoft.teamfoundation.versioncontrol.client.versioncontrolserver_events(VS.80).aspx
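For example, before calling workspace.Get you could attach handlers along these lines (a sketch; the event-args properties shown are the commonly used ones and may differ slightly between TFS versions):
// Hedged sketch: surface what the Get is doing and any non-fatal failures.
sourceControl.NonFatalError += (sender, e) =>
{
    // Either Exception or Failure is populated, depending on where the error came from.
    string message = e.Exception != null ? e.Exception.Message : e.Failure.Message;
    Console.WriteLine("Non-fatal error: " + message);
};
sourceControl.Getting += (sender, e) =>
{
    Console.WriteLine("Getting: " + e.ServerItem);
};
// A Conflict handler can be wired up the same way to see overwrite/merge conflicts.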
I had a similar situation where I needed to download the contents of a folder from TFS into an existing workspace, without creating a new workspace. With help from the answers above, I was able to put something together that works fine for me for now. There is, however, a limitation: it works for a folder that contains only files, not one with another folder inside it. I have not tried that case; it would probably need some minor updates. Sharing the code in case someone is searching for this. I really like the fact that this approach does not deal with workspaces (create and delete), since that is not desired.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Configuration;
using Microsoft.TeamFoundation.VersionControl.Client;
using Microsoft.TeamFoundation.Client;
using System.IO;

namespace DownloadFolder
{
    class Program
    {
        static void Main(string[] args)
        {
            // Get the version control server
            string teamProjectCollectionUrl = "http://<YourTFSUrl>:8080/tfs/DefaultCollection";
            TfsTeamProjectCollection teamProjectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(teamProjectCollectionUrl));
            VersionControlServer vcs = teamProjectCollection.GetService<VersionControlServer>();

            String Sourcepath = args[0];      // The folder path in TFS - "$/<TeamProject>/<FirstLevelFolder>/<SecondLevelFolder>"
            String DestinationPath = args[1]; // The folder in local machine - "C:\MyTempFolder"

            ItemSet items = vcs.GetItems(Sourcepath, VersionSpec.Latest, RecursionType.Full);
            String FolderName = null;
            foreach (Item item in items.Items)
            {
                String ItemName = Path.GetFileName(item.ServerItem);
                switch (item.ItemType)
                {
                    case ItemType.File:
                        item.DownloadFile(Path.Combine(DestinationPath, FolderName, ItemName));
                        break;
                    case ItemType.Folder:
                        FolderName = Path.GetFileName(item.ServerItem);
                        Directory.CreateDirectory(Path.Combine(DestinationPath, ItemName));
                        break;
                }
            }
        }
    }
}
When running this from the command prompt, copy all the supporting DLLs along with the exe:
cmd>> DownloadFolder.exe "$/<TeamProject>/<FirstLevelFolder>/<SecondLevelFolder>" "C:\MyTempFolder"

Get User Account Status (Locked/Unlocked) from Active Directory on C-Sharp / C#

I need to find a way to check if an Active Directory UserAccount has his account locked or not.
I've tried the userAccountControl property in a Windows 2000 AD, but that property does not change at all when I force an account to get locked (by trying to log on to a workstation providing the wrong password for that specific user), and I can tell by using the ADExplorer.exe utility made by semi-god Mr. Russinovich.
I've seen that in the 3.5 Framework they use the method .InvokeGet("userLockedOut"), but I'm trying to do this in an Enterprise Application that was written in .NET Framework 1.1 and there's no chance of using newer ones (just in case you thought of suggesting so).
Here is a link with all the info on Active Directory stuff...
http://www.codeproject.com/KB/system/everythingInAD.aspx
Found this; it is a little more than I have done in the past (I can't find my exact snippets), but the key is doing a directory search and limiting based on the lockoutTime for the user(s) that are returned. Additionally, for a particular user, you can limit your search further using additional properties. The CodeProject link above has that particular logic (for limiting the search), I believe.
using System;
using System.DirectoryServices;
using System.DirectoryServices.ActiveDirectory;

class Lockout : IDisposable
{
    DirectoryContext context;
    DirectoryEntry root;
    DomainPolicy policy; // DomainPolicy is a helper class from the same source as this snippet (not shown here)

    public Lockout(string domainName)
    {
        this.context = new DirectoryContext(
            DirectoryContextType.Domain,
            domainName
            );

        // get our current domain policy
        Domain domain = Domain.GetDomain(this.context);
        this.root = domain.GetDirectoryEntry();
        this.policy = new DomainPolicy(this.root);
    }

    public void FindLockedAccounts()
    {
        // default for when accounts stay locked indefinitely
        string qry = "(lockoutTime>=1)";

        TimeSpan duration = this.policy.LockoutDuration;
        if (duration != TimeSpan.MaxValue)
        {
            DateTime lockoutThreshold =
                DateTime.Now.Subtract(duration);

            qry = String.Format(
                "(lockoutTime>={0})",
                lockoutThreshold.ToFileTime()
                );
        }

        DirectorySearcher ds = new DirectorySearcher(
            this.root,
            qry
            );

        using (SearchResultCollection src = ds.FindAll())
        {
            foreach (SearchResult sr in src)
            {
                long ticks =
                    (long)sr.Properties["lockoutTime"][0];

                Console.WriteLine(
                    "{0} locked out at {1}",
                    sr.Properties["name"][0],
                    DateTime.FromFileTime(ticks)
                    );
            }
        }
    }

    public void Dispose()
    {
        if (this.root != null)
        {
            this.root.Dispose();
        }
    }
}
Code was pulled from this post: http://social.msdn.microsoft.com/Forums/en-US/csharpgeneral/thread/5e0fadc2-f27b-48f6-a6ac-644e12256c67/
After seeing the .NET 1.1 requirement, check this thread out: http://forums.asp.net/t/434077.aspx; using lockoutTime in the filter should still do the trick.
Specifically, from the thread (after the larger code post, which provides a lot of the syntax):
(&(objectClass=user)(objectCategory=person)(lockoutTime>=1));
One other thing: it turns out that if you are using .NET 1.1, then S.DS converts the Integer8 to a long integer correctly for you (this does not work with 1.0), which means you can do away with the reflection code (in the post):
// use the filter from above
SearchResultCollection src = ds.FindAll();
foreach (SearchResult sr in src)
{
    DateTime lockoutTime = DateTime.FromFileTime((long)sr.Properties["lockoutTime"][0]);
    Response.Output.Write("Locked Out on: {0}", lockoutTime.ToString());
}
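To check one particular account rather than enumerate everything, the same filter can be narrowed with sAMAccountName (a sketch; the LDAP path and account name are placeholders, and note that lockoutTime may remain set after the lockout duration has expired, which is what the lockoutThreshold logic above accounts for):
// Hedged sketch: does this one account currently have a lockoutTime set?
DirectoryEntry searchRoot = new DirectoryEntry("LDAP://DC=example,DC=com");  // placeholder path
DirectorySearcher searcher = new DirectorySearcher(searchRoot);
searcher.Filter = "(&(objectClass=user)(objectCategory=person)(sAMAccountName=jdoe)(lockoutTime>=1))"; // placeholder account
searcher.PropertiesToLoad.Add("lockoutTime");

SearchResult result = searcher.FindOne();
if (result != null)
{
    long ticks = (long)result.Properties["lockoutTime"][0];
    Console.WriteLine("Locked out at: " + DateTime.FromFileTime(ticks));
}
else
{
    Console.WriteLine("No lockoutTime set for this account.");
}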
