I'm using the Kiln.Net library to connect to a Mercurial repository. I need to get basic information (commits, lines of code changed, ...) and then group it to show progress per author. So far I've had no success.
Code to connect:
var account = "exampleRepo";      // examplerepo.kilnhg.com
var user = "exampleUsername";     // username
var password = "examplePassword"; // password

using (Kiln myAccount = Kiln.AuthenticateOnDemand(account, user, password)) // Here: 404 error
{
    // Returns changeset history for the repository
    // ("repo" would come from the project/repository listing below)
    Changeset[] changesets;
    changesets = myAccount.GetHistory(repo.ID, 100);

    // Returns the list of all available projects
    Project[] projects;
    projects = myAccount.GetProjects();
    projects = myAccount.Call<Project[]>(KilnApiCall.Projects, null);
}
While debugging I can see that the auth URL looks fine. It is like:
https://exampleRepo.kilnhg.com/Kiln/Api/1.0/Auth/Login?sUser=exampleUsername&sPassword=examplePassword
But after executing the request I always get a 404 Not Found error. Thanks in advance for your help.
The problem was fixed. The Kiln.Net library was generating a bad URL. The good one is the same but without "/Kiln":
https://exampleRepo.kilnhg.com/Api/1.0/Auth/Login?sUser=exampleUsername&sPassword=examplePassword
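For anyone hitting the same 404 before the library is patched, a minimal workaround sketch is to call the corrected endpoint directly. The account, user and password values below are the question's placeholders, and the assumption that the login call returns an auth token in the response body should be verified against the Kiln 1.0 API docs for your instance:

// Sketch: hit the corrected login URL (no "/Kiln" segment) directly.
// Placeholders mirror the question; the response handling is an assumption.
using System;
using System.Net;

class KilnLoginSketch
{
    static void Main()
    {
        var account = "exampleRepo";
        var user = "exampleUsername";
        var password = "examplePassword";

        var loginUrl = "https://" + account + ".kilnhg.com/Api/1.0/Auth/Login" +
                       "?sUser=" + Uri.EscapeDataString(user) +
                       "&sPassword=" + Uri.EscapeDataString(password);

        using (var web = new WebClient())
        {
            // A 404 here would mean the URL is still wrong; on success the body
            // is expected (assumption) to contain the authentication token.
            string body = web.DownloadString(loginUrl);
            Console.WriteLine(body);
        }
    }
}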
I cannot open a repository created via Octokit in the GitHub web browser (using the standard URL github.com/user/repo-name).
What does work is the URL github.com/user/repo-name.git.
Additionally, Github shows this message on the page:
Cannot retrieve the latest commit at this time.
I assume there might be some problem in the .git configuration files, but they look pretty much the same as in repos I create in the web browser.
What might be causing the problem?
I can clone this repo using context.CloneUrl which works just fine.
I'm creating the repository via Octokit, which works fine:
var basicAuth = new Octokit.Credentials(Login, Password);
var client = new GitHubClient(new ProductHeaderValue(repoName)) { Credentials = basicAuth };

var repository = new NewRepository(repoName)
{
    AutoInit = false,
    Description = null,
    Private = false
};

var context = await client.Repository.Create(repository);
Additionally, I'm using LibGit2Sharp to create and initialize the local repository, roughly as sketched below.
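The question doesn't show the LibGit2Sharp part, so the following is only a hedged sketch of what creating, initializing and pushing the local repository might look like; the path, file name and identity are placeholders, while Login, Password and context.CloneUrl come from the Octokit code above:

// Hypothetical sketch of the local side (LibGit2Sharp); not the asker's actual code.
string repoPath = Repository.Init(@"C:\work\repo-name");   // placeholder path

using (var localRepo = new Repository(repoPath))
{
    // Commit at least one file so the default branch has a tip to show on GitHub.
    File.WriteAllText(Path.Combine(localRepo.Info.WorkingDirectory, "README.md"), "hello");
    Commands.Stage(localRepo, "README.md");

    var identity = new Signature("placeholder name", "placeholder@example.com", DateTimeOffset.Now);
    localRepo.Commit("Initial commit", identity, identity);

    // Point the local repository at the one created through Octokit and push.
    Remote remote = localRepo.Network.Remotes.Add("origin", context.CloneUrl);
    var pushOptions = new PushOptions
    {
        CredentialsProvider = (url, usernameFromUrl, types) =>
            new UsernamePasswordCredentials { Username = Login, Password = Password }
    };
    localRepo.Network.Push(remote, "refs/heads/master:refs/heads/master", pushOptions);
}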
GitHub has been experiencing difficulties since last night; check out https://status.github.com/messages . It might be back up now (10 hrs later).
Requirement:
Using LibGit2Sharp, I want to pull (fetch + merge) the latest from a specific Git remote branch into my currently checked-out local branch, without having to pass any other arguments, like user credentials etc. Basically I am trying to replicate git pull origin my-remote-branch.
Details:
I want to automate certain Git operations from C#. I can simply do what I want by invoking git.exe (if I know the path), like git.exe --git-dir=my-repo-directory pull origin my-remote-branch. Notice that here the only external parameters I have to supply are my-repo-directory and my-remote-branch. Git gets everything right, like the name, password, email, current working branch (even if it doesn't have a remote attached), and git pull simply works. I don't have to pass any of those parameters manually. I assume Git gets them from the current Git settings for the repo (from the %HOME% folder?).
Is there a way to simulate that in LibGit2Sharp?
What I tried:
using (var repo = new Repository("my-repo-directory"))
{
    PullOptions pullOptions = new PullOptions()
    {
        MergeOptions = new MergeOptions()
        {
            FastForwardStrategy = FastForwardStrategy.Default
        }
    };

    MergeResult mergeResult = Commands.Pull(
        repo,
        new Signature("my name", "my email", DateTimeOffset.Now), // I don't want to provide these
        pullOptions
    );
}
This fails because it says there is no tracking branch. I don't necessarily need a tracking remote branch; I just want to fetch the latest from a specific remote branch and perform an auto-merge if possible.
Just to see if it works I tried:
using (var repo = new Repository("my-repo-directory"))
{
    var trackingBranch = repo.Branches["remotes/origin/my-remote-branch"];
    if (trackingBranch.IsRemote) // even though I don't want to set a tracking branch like this
    {
        var branch = repo.Head;
        repo.Branches.Update(branch, b => b.TrackedBranch = trackingBranch.CanonicalName);
    }

    PullOptions pullOptions = new PullOptions()
    {
        MergeOptions = new MergeOptions()
        {
            FastForwardStrategy = FastForwardStrategy.Default
        }
    };

    MergeResult mergeResult = Commands.Pull(
        repo,
        new Signature("my name", "my email", DateTimeOffset.Now),
        pullOptions
    );
}
This fails with
request failed with status code: 401
Additional info:
I don't want to invoke git.exe directly because I can't hardcode the git.exe path. Also, since I can't pass username, email etc. at runtime, is there a way LibGit2Sharp can get them by itself from the repository settings, like git.exe does?
I assume Git gets them from current Git settings for the repo (from %HOME% folder?).
It depends entirely on what the remote "origin" is:
an ssh URL, in which case Git relies on %HOME%\.ssh. SSH support for libgit2sharp is tracked by issue 7 and the (rejected) PR 1072: you might have to use leobuskin/libgit2sharp-ssh
an https URL, in which case Git relies on a Git credential helper (see Git Storage), which is not supported directly by libgit2, as mentioned here.
You will have to code that helper yourself in C#: "Retrieve Credentials from Windows Credentials Store using C#".
Or, as commented by Edward Thomson:
Pass a CredentialsHandler in your FetchOptions (a minimal sketch follows after these pointers).
Return either UsernamePasswordCredentials or DefaultCredentials from your handler, as appropriate.
See here for a UsernamePasswordCredentials example.
See also LibGit2Sharp.Tests/TestHelpers/Constants.cs and other occurrences.
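A minimal sketch of those two suggestions together (the literal username and password are placeholders; with Windows integrated authentication you would return DefaultCredentials instead):

// Sketch: a CredentialsHandler wired into FetchOptions; placeholder values only.
var fetchOptions = new FetchOptions
{
    CredentialsProvider = (url, usernameFromUrl, types) =>
        new UsernamePasswordCredentials
        {
            Username = "my-user",               // placeholder
            Password = "my-password-or-token"   // placeholder
        }
    // or, for integrated authentication:
    // CredentialsProvider = (url, usernameFromUrl, types) => new DefaultCredentials()
};

// The same options can be attached to a pull:
var pullOptions = new PullOptions { FetchOptions = fetchOptions };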
Regarding the pull operation, it involves Commands.Fetch, which takes a refspec. As in "Git pull/fetch with refspec differences", you can pass the source:destination branch names for your pull, even if there is no tracking information (a sketch combining this fetch with the merge follows the snippet below).
That is what is used in LibGit2Sharp.Tests/FetchFixture.cs.
string refSpec = string.Format("refs/heads/{2}:refs/remotes/{0}/{1}", remoteName, localBranchName, remoteBranchName);
Commands.Fetch(repo, remoteName, new string[] { refSpec }, new FetchOptions {
    TagFetchMode = TagFetchMode.None,
    OnUpdateTips = expectedFetchState.RemoteUpdateTipsHandler
}, null);
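Putting the two parts together, here is a hedged sketch of the git pull origin my-remote-branch equivalent without any tracking configuration; the directory and branch names reuse the question's placeholders, and Config.BuildSignature is used so the name and email come from the repository's own settings rather than being hard-coded (it may return null if user.name/user.email are not configured):

// Sketch only: fetch a specific remote branch, then merge it into the current HEAD.
using (var repo = new Repository("my-repo-directory"))
{
    var fetchOptions = new FetchOptions();   // add a CredentialsProvider as sketched above if the remote requires auth

    // Explicit refspec, so no tracking branch is needed.
    var refSpec = "refs/heads/my-remote-branch:refs/remotes/origin/my-remote-branch";
    Commands.Fetch(repo, "origin", new[] { refSpec }, fetchOptions, null);

    // Take the committer identity from the repository configuration (user.name / user.email),
    // the way git.exe would, instead of hard-coding a Signature.
    Signature merger = repo.Config.BuildSignature(DateTimeOffset.Now);

    // Merge the fetched remote-tracking branch into the current branch.
    MergeResult result = repo.Merge(
        repo.Branches["origin/my-remote-branch"],
        merger,
        new MergeOptions { FastForwardStrategy = FastForwardStrategy.Default });
}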
Given the following code, the last line results in an "Operation returned an invalid status code 'BadRequest'" exception and I don't understand why:
var tenantDomain = ConfigurationManager.AppSettings["TenantDomain"];
var clientId = ConfigurationManager.AppSettings["ClientID"];
var secret = ConfigurationManager.AppSettings["ClientSecret"];
var subscriptionId = ConfigurationManager.AppSettings["SubscriptionID"];
var serviceCreds = await ApplicationTokenProvider.LoginSilentAsync(tenantDomain, clientId, secret);
var bmc = new BillingManagementClient(serviceCreds);
bmc.SubscriptionId = subscriptionId;
List<Invoice> allInvoices = bmc.Invoices.List().ToList();
Any suggestions? Should I specify a date period explicitly? How?
Any suggestions? Should I specify a date period explicitly? How?
To access Billing, you need to assign the Billing Reader role to whoever needs access to the subscription's billing data. The detailed steps are in the official Azure tutorials. I also tested the code you mentioned; there is no issue with the code itself, provided the subscription is supported. The following is a snippet from the official tutorial:
The Billing Reader feature is in preview, and does not yet support enterprise (EA) subscriptions or non-global clouds.
Please try logging in to the Azure Portal to check whether you have "Access to invoice" enabled. If "Access to invoice" is disabled, the subscription type is probably not supported.
If you still have further questions, you could contact support to get your issue resolved quickly.
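Independently of the role assignment, it can also help to look at the actual error payload behind the generic "invalid status code 'BadRequest'" message. The sketch below is an assumption about how the error surfaces: if the thrown exception is (or derives from) Microsoft.Rest.HttpOperationException, its response content usually carries the concrete error returned by the Billing API:

// Sketch: inspect the underlying response body for the concrete error code/message.
try
{
    List<Invoice> allInvoices = bmc.Invoices.List().ToList();
}
catch (Exception ex)
{
    // The generic message hides the real error; generated exception types often carry the body.
    Console.WriteLine(ex);

    var httpEx = ex as Microsoft.Rest.HttpOperationException;
    if (httpEx != null && httpEx.Response != null)
    {
        Console.WriteLine(httpEx.Response.Content);   // raw response body with the specific error
    }
}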
Update
Thanks to a comment by @IvanL, it turns out that the problem is Google specific. I have since tried other providers, and for those everything works as expected. Google just doesn't seem to send the claims information. I haven't yet been able to figure out why, or what I need to do differently to get Google to send it.
A wild stab in the dark: it may be related to the realm being defaulted to http://:/ , as I have seen an answer by Andrew Arnott saying that Google changes the claimed identifier for the same account based on the realm passed with the authentication request.
Another possibly important tidbit of information: unlike many of the examples that can be found around the web for using dotnetopenauth, I am not using a "simple" textbox and composing the openIdIdentifier myself, but I am using the openID selector and that is providing the openIdIdentifier passed to the ValidateAtOpenIdProvider. (As per the Adding OpenID authentication to your ASP.NET MVC 4 application article.)
The question is: why does IAuthenticationResponse.GetExtension() always return null when using Google as the OpenID provider, when all relevant gotchas with regard to Google (Email requested as required, AXFetchAsSregTransform, etc.) have otherwise been addressed?
Original
I am struggling to get DotNetOpenAuth to parse the response returned from the provider. I followed the instructions of Adding OpenID authentication to your ASP.NET MVC 4 application up to the point where the login should be working and a login results in a return to the home page with the user's name (nickname) displayed at the top right. (That is up to "The user should at this point see the following:" just over halfway down the article.)
I am using Visual Studio Web Developer 2010 Express with C#. DotNetOpenAuth version is 4.0.3.12153 (according to the packages.config, 4.0.3.12163 according to Windows Explorer).
My web.config was modified following the instructions in Activating AXFetchAsSregTransform, which was the solution for "DotNetOpenId - Open Id get some data".
Unfortunately it wasn't enough to get it working for me.
The openid-selector is working fine and resulting in a correct selection of the openid provider. The authentication request is created as follows:
public IAuthenticationRequest ValidateAtOpenIdProvider(string openIdIdentifier)
{
    IAuthenticationRequest openIdRequest = openId.CreateRequest(Identifier.Parse(openIdIdentifier));

    var fields = new ClaimsRequest()
    {
        Email = DemandLevel.Require,
        FullName = DemandLevel.Require,
        Nickname = DemandLevel.Require
    };
    openIdRequest.AddExtension(fields);

    return openIdRequest;
}
This all works. I can login and authorize the page to receive my information, which then results in a call to GetUser:
public OpenIdUser GetUser()
{
    OpenIdUser user = null;
    IAuthenticationResponse openIdResponse = openId.GetResponse();

    if (openIdResponse.IsSuccessful())
    {
        user = ResponseIntoUser(openIdResponse);
    }

    return user;
}
openIdResponse.IsSuccessful is implemented as an extension method (see linked article):
return response != null && response.Status == AuthenticationStatus.Authenticated;
and always is successful as the ResponseIntoUser method is entered:
private OpenIdUser ResponseIntoUser(IAuthenticationResponse response)
{
    OpenIdUser user = null;

    var claimResponseUntrusted = response.GetUntrustedExtension<ClaimsResponse>();
    var claimResponse = response.GetExtension<ClaimsResponse>();

    // For this to work with the newer/est version of DotNetOpenAuth, make sure the web.config
    // file contains the required settings. See link for more details.
    // http://www.dotnetopenauth.net/developers/help/the-axfetchassregtransform-behavior/
    if (claimResponse != null)
    {
        user = new OpenIdUser(claimResponse, response.ClaimedIdentifier);
    }
    else if (claimResponseUntrusted != null)
    {
        user = new OpenIdUser(claimResponseUntrusted, response.ClaimedIdentifier);
    }
    else
    {
        user = new OpenIdUser("ikke#gmail.com;ikke van ikkenstein;ikke nick;ikkeclaimedid");
    }

    return user;
}
My version above only differs from the code in the linked article by my addition of the final else block to ensure that I always get the home page with a user name and a logoff link displayed (which helps when trying to do this several times in succession).
I have tried both Google and Yahoo. Both authenticate fine, and both return an identity assertion as logged by the WebDev server. However, GetUntrustedExtension and GetExtension always return null. I always get to see "ikke nick" from the last else, never the name I actually used to authenticate.
I am at a loss on how to continue to try and get this to work. It probably is some oversight on my part (I am an experienced developer but just started dipping my toes in C# and web front-end development), and I can't see it.
Any and all suggestions on how to proceed / debug this are very much welcome.
Are you using Google as the OpenID provider to test your solution against? Google has/had the habit of including the claims only the first time you authenticate the application, so perhaps try using a fresh Google account and see if that works?
Sorry for the slow response, doing a big migration at a client this week :-) Glad that this little comment resolved your issue.
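For readers who still get a null ClaimsResponse from Google even on a first authentication, one avenue sometimes suggested alongside AXFetchAsSregTransform is to request the data via Attribute Exchange explicitly as well. This is only a hedged sketch, not a confirmed fix; the attribute choices are illustrative and the types live in DotNetOpenAuth.OpenId.Extensions.AttributeExchange:

// Sketch: request the same fields through Attribute Exchange next to the ClaimsRequest,
// since Google natively speaks AX rather than sreg. Attribute selection is illustrative.
var fetch = new FetchRequest();
fetch.Attributes.AddRequired(WellKnownAttributes.Contact.Email);
fetch.Attributes.AddRequired(WellKnownAttributes.Name.FullName);
fetch.Attributes.AddRequired(WellKnownAttributes.Name.Alias);
openIdRequest.AddExtension(fetch);

// When the assertion comes back, check the AX response next to the ClaimsResponse:
var fetchResponse = response.GetExtension<FetchResponse>();
string email = fetchResponse != null
    ? fetchResponse.GetAttributeValue(WellKnownAttributes.Contact.Email)
    : null;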
I'm trying to connect my website to the PayPal Sandbox in order to use the Express Checkout feature. I've used this link as a reference, but I keep getting the 10002 error "Security header is not valid".
According to the documentation this has to be an invalid-credentials problem, but if I make the request manually through soapUI it returns "Success", and if I use the curl command it also works as expected.
Scenario: an ASP.NET page with two Web References, one to https://www.sandbox.paypal.com/wsdl/PayPalSvc.wsdl and another to https://www.paypalobjects.com/wsdl/PayPalSvc.wsdl. The given credentials are Username, Password and Signature, as you can see in the following code snippet:
using CloudShop.com.paypal.sandbox.www;

namespace CloudShop
{
    // Wrapper class added so the snippet compiles; the original post showed only the method.
    public static class PayPalService
    {
        public static PayPalAPIAASoapBinding BuildPayPalWebservice()
        {
            UserIdPasswordType credentials = new UserIdPasswordType()
            {
                Username = CloudShopConf.PayPalAPIUsername,
                Password = CloudShopConf.PayPalAPIPassword,
                Signature = CloudShopConf.PayPalAPISignature
            };

            PayPalAPIAASoapBinding paypal = new PayPalAPIAASoapBinding();
            paypal.RequesterCredentials = new CustomSecurityHeaderType()
            {
                Credentials = credentials
            };

            return paypal;
        }
    }
}
Right now I would like to know how to proceed with debugging. What could be wrong?
Some ideas:
Check whether you are using your live credentials with the sandbox account.
Are you using https://api-3t.sandbox.paypal.com/2.0/ (especially the -3t part) as the endpoint? You should, as you are using Signature authentication (a sketch for setting it on the generated proxy follows below).
As usual, you should step through every setting you are using: protocol, API endpoint, version, credentials etc., and compare your manual SoapUI call with the information stored in your shop configuration.
I also found a blog article on this error that might help resolve this issue.
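If the endpoint turns out to be the problem, the proxy generated from a Web Reference derives from SoapHttpClientProtocol, so its Url can be overridden at runtime. A small sketch building on the BuildPayPalWebservice method from the question (the URL is the Signature sandbox endpoint mentioned above):

// Sketch: force the Signature-authentication sandbox endpoint on the generated proxy.
PayPalAPIAASoapBinding paypal = PayPalService.BuildPayPalWebservice();   // the method from the question
paypal.Url = "https://api-3t.sandbox.paypal.com/2.0/";
// ...then make the Express Checkout calls as before and compare against the working SoapUI request.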