I cannot open a repository created via Octokit from a web browser using the standard URL github.com/user/repo-name,
but the URL github.com/user/repo-name.git works.
Additionally, GitHub shows this message on the page:
Cannot retrieve the latest commit at this time.
I assume there might be some problem in the .git configuration files, but they look pretty much the same as in repos I create in the web browser.
What might be causing the problem?
I can clone this repo using context.CloneUrl, which works just fine.
I'm creating the repository via Octokit like this:
var basicAuth = new Octokit.Credentials(Login, Password);
var client = new GitHubClient(new ProductHeaderValue(repoName)) {Credentials = basicAuth};
var repository = new NewRepository(repoName)
{
AutoInit = false,
Description = null,
Private = false
};
var context = await client.Repository.Create(repository);
Additionally, I'm using LibGit2Sharp to create and initialize the local repository.
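For reference, a hedged sketch of that local step, reusing context, Login, and Password from the Octokit code above; the local path and file name are hypothetical. Note that an empty remote with no pushed commits can also display the "Cannot retrieve the latest commit at this time" banner:

```csharp
using System;
using LibGit2Sharp;

// Hypothetical sketch: initialize the local repo with LibGit2Sharp and push
// an initial commit to the repository created via Octokit.
var repoPath = Repository.Init(@"C:\work\my-new-repo");
using (var repo = new Repository(repoPath))
{
    // Commit at least one file so the default branch has a commit to show.
    Commands.Stage(repo, "README.md");
    var author = new Signature("my name", "my@email.com", DateTimeOffset.Now);
    repo.Commit("Initial commit", author, author);

    var remote = repo.Network.Remotes.Add("origin", context.CloneUrl);
    repo.Network.Push(remote, "refs/heads/master", new PushOptions
    {
        CredentialsProvider = (url, user, types) =>
            new UsernamePasswordCredentials { Username = Login, Password = Password }
    });
}
```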
GitHub has been experiencing difficulties since last night; check out https://status.github.com/messages. It might be back up now (10 hours later).
Our team has been creating API tests with SpecFlow, using RestSharp as the API client. The API I am testing simply creates an asset (that has a few properties) with a POST method, and then I use a GET method to fetch the data for the new asset so I can deserialize it and verify a few properties to ensure that the new asset has been created.
For example, I pass the new asset with a name property where Name="Asset15" (where a new asset of ID=15 is created from the sequence) and then I get the info passing the ID=15 to verify that the new asset with Name="Asset15" exists. Everything seemed to be working until recently.
Without changing any code, the test now creates the new asset with Name="Asset20" and ID=20, for example, but the GET method seems to return the record with Name="Asset19" and ID=19 instead, causing my test to fail even though I can see manually that the asset with Name="Asset20" and ID=20 has been created.
There seems to be some caching issue and I was wondering what would be a way to clear this cache.
I have seen an article somewhere in which the person fixed this by merely restarting Visual Studio. I tried that, only to get results 2 records behind instead.
This is how I set up my test:
_settings.BaseUrl = new Uri(ConfigurationManager.AppSettings["baseUrl"].ToString());
_settings.RestClient.BaseUrl = _settings.BaseUrl;
Execute the POST method to create the asset:
_settings.PostRequest = new RestRequest("CreateAsset", Method.POST);
_settings.PostRequest.RequestFormat = DataFormat.Json;
_settings.PostRequest.AddJsonBody(testData);
_settings.PostResponse = _settings.RestClient.Execute(_settings.PostRequest);
Later I execute the GET method, deserialize, and validate that the information is correct:
_settings.GetRequest = new RestRequest("GetAsset?id=20", Method.GET);
_settings.GetResponse = _settings.RestClient.Execute(_settings.GetRequest);
var deserial = new JsonDeserializer();
var output = deserial.Deserialize<Dictionary<string, string>>(_settings.GetResponse);
var result = output["Name"];
Assert.That(result.Equals(testData.Name), $"Error: ...");
The test was passing and now it seems to be 1 or 2 records behind. Can someone help and let me know what I am doing wrong? Thanks in advance!
The issue here was with the API and not the test code. This can be closed since it is a non-issue.
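For anyone hitting a similar symptom where the cause is not the API itself: one way to rule out client-side or intermediary HTTP caching is to send standard no-cache headers on the GET request. A diagnostic sketch based on the request setup above:

```csharp
// Diagnostic sketch (not the root cause in this case): send standard HTTP
// no-cache headers so any intermediary cache is bypassed for this GET.
_settings.GetRequest = new RestRequest("GetAsset?id=20", Method.GET);
_settings.GetRequest.AddHeader("Cache-Control", "no-cache");
_settings.GetRequest.AddHeader("Pragma", "no-cache");
_settings.GetResponse = _settings.RestClient.Execute(_settings.GetRequest);
```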
I was watching a tutorial on how to script a bot in C#, and the instructor used (to my knowledge) an old TwitchClient constructor that takes credentials and references. That is no longer the case: the constructor currently takes a WebSocket and a logger, but I suspect you still need to supply credentials somewhere. I'm wondering what a good way to work around this might be.
Any help will be appreciated.
Here's the video with the timestamp: https://youtu.be/5f1T9hQqJps?t=8m3s
Instead of the single line in the video, these two lines should now achieve mostly the same effect:
client = new TwitchClient();
client.Initialize(credentials, "channel");
If you want to also enable logging (like in the video), then you will need to provide an instance of ILogger to the first call like so:
client = new TwitchClient(null, myLoggingInstance);
The WebSocket parameter is used for testing (so you can generate your own traffic to test your bot); the docs advise against setting it.
It's quite simple, actually; even the GitHub page shows a simple example:
ConnectionCredentials credentials = new ConnectionCredentials("twitch_username", "access_token");
var clientOptions = new ClientOptions
{
MessagesAllowedInPeriod = 750,
ThrottlingPeriod = TimeSpan.FromSeconds(30)
};
WebSocketClient customClient = new WebSocketClient(clientOptions);
client = new TwitchClient(customClient);
client.Initialize(credentials, "channel");
client.OnLog += Client_OnLog;
client.Connect();
Then, elsewhere, declare this handler:
private void Client_OnLog(object sender, OnLogArgs e)
{
Console.WriteLine($"{e.DateTime.ToString()}: {e.BotUsername} - {e.Data}");
}
Requirement:
Using LibGit2Sharp, I want to pull (fetch + merge) the latest from a specific remote branch into my currently checked-out local branch, without having to pass any other arguments like user credentials. Basically, I am trying to replicate git pull origin my-remote-branch.
Details:
I want to automate certain Git operations from C#. I can do what I want by invoking git.exe (if I know the path), like git.exe --git-dir=my-repo-directory pull origin my-remote-branch. Notice that the only external parameters I have to supply are my-repo-directory and my-remote-branch. Git gets everything else right, like the name, password, email, and current working branch (even if it doesn't have a remote attached), and git pull simply works. I don't have to pass any of those parameters manually; I assume Git gets them from the repo's current settings (from the %HOME% folder?).
Is there a way to simulate that in LibGit2Sharp?
What I tried:
using (var repo = new Repository("my-repo-directory"))
{
PullOptions pullOptions = new PullOptions()
{
MergeOptions = new MergeOptions()
{
FastForwardStrategy = FastForwardStrategy.Default
}
};
MergeResult mergeResult = Commands.Pull(
repo,
new Signature("my name", "my email", DateTimeOffset.Now), // I don't want to provide these
pullOptions
);
}
This fails because it says there is no tracking branch. I don't necessarily need a remote tracking branch; I just want to fetch the latest from a specific remote branch and auto-merge if possible.
Just to see if it works I tried:
using (var repo = new Repository("my-repo-directory"))
{
var trackingBranch = repo.Branches["remotes/origin/my-remote-branch"];
if (trackingBranch.IsRemote) // even though I don't want to set a tracking branch like this
{
var branch = repo.Head;
repo.Branches.Update(branch, b => b.TrackedBranch = trackingBranch.CanonicalName);
}
PullOptions pullOptions = new PullOptions()
{
MergeOptions = new MergeOptions()
{
FastForwardStrategy = FastForwardStrategy.Default
}
};
MergeResult mergeResult = Commands.Pull(
repo,
new Signature("my name", "my email", DateTimeOffset.Now),
pullOptions
);
}
This fails with
request failed with status code: 401
Additional info:
I don't want to invoke git.exe directly because I can't hardcode the path to the executable. Also, since I can't pass the username, email, etc. at runtime, is there a way LibGit2Sharp can get them by itself from the repository settings, the way git.exe does?
I assume Git gets them from current Git settings for the repo (from %HOME% folder?).
It depends entirely on what the remote "origin" is:
an ssh URL, in which case Git relies on %HOME%\.ssh. SSH support for libgit2sharp is tracked by issue 7 and the (rejected) PR 1072; you might have to use leobuskin/libgit2sharp-ssh.
an https URL, in which case Git relies on a Git credential helper (see Git Storage), which is not supported directly by libgit2, as mentioned here.
You will have to code that helper yourself in C#: "Retrieve Credentials from Windows Credentials Store using C#".
Or, as commented by Edward Thomson:
Pass a CredentialsHandler in your FetchOptions.
Return either UsernamePasswordCredentials or DefaultCredentials from your handler, as appropriate.
See here for a UsernamePasswordCredentials example.
See also LibGit2Sharp.Tests/TestHelpers/Constants.cs and other occurrences.
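Put together, a hedged sketch of that handler-based approach; the username/password values are placeholders, and for https remotes the password is typically a personal access token:

```csharp
// Sketch: supply a CredentialsHandler via FetchOptions.CredentialsProvider,
// returning UsernamePasswordCredentials (or DefaultCredentials) as appropriate.
var fetchOptions = new FetchOptions
{
    CredentialsProvider = (url, usernameFromUrl, types) =>
        new UsernamePasswordCredentials
        {
            Username = "my-username",          // placeholder
            Password = "my-password-or-token"  // placeholder
        }
};
// Fetch the branch from "origin" using the options above.
Commands.Fetch(repo, "origin", new[] { "my-remote-branch" }, fetchOptions, null);
```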
Regarding the pull operation, it involves Commands.Fetch, which takes a refspec. As in "Git pull/fetch with refspec differences", you can pass the source:destination branch names for your pull (even if there is no tracking information).
That is what is used in LibGit2Sharp.Tests/FetchFixture.cs.
string refSpec = string.Format("refs/heads/{2}:refs/remotes/{0}/{1}", remoteName, localBranchName, remoteBranchName);
Commands.Fetch(repo, remoteName, new string[] { refSpec }, new FetchOptions {
TagFetchMode = TagFetchMode.None,
OnUpdateTips = expectedFetchState.RemoteUpdateTipsHandler
}, null);
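As for avoiding the hard-coded Signature: LibGit2Sharp can build one from the repository's effective configuration (user.name / user.email), much like git.exe does. A minimal sketch:

```csharp
using System;
using LibGit2Sharp;

using (var repo = new Repository("my-repo-directory"))
{
    // Builds a Signature from user.name / user.email in the repo's local or
    // global Git configuration; returns null if those keys are not set.
    Signature merger = repo.Config.BuildSignature(DateTimeOffset.Now);

    MergeResult mergeResult = Commands.Pull(repo, merger, new PullOptions());
}
```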
I have a Console Application project written in C# which I've added Application Insights to with the following NuGet packages.
Microsoft.ApplicationInsights
Microsoft.ApplicationInsights.Agent.Intercept
Microsoft.ApplicationInsights.DependencyCollector
Microsoft.ApplicationInsights.NLogTarget
Microsoft.ApplicationInsights.PerfCounterCollector
Microsoft.ApplicationInsights.Web
Microsoft.ApplicationInsights.WindowsServer
Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel
I've configured my InstrumentationKey in the config file and I'm firing up a TelemetryClient on startup with the following code:
var telemetryClient = new TelemetryClient();
telemetryClient.Context.User.Id = Environment.UserName;
telemetryClient.Context.Session.Id = Guid.NewGuid().ToString();
telemetryClient.Context.Device.OperatingSystem = Environment.OSVersion.ToString();
Everything is working well, except that AI is not capturing any requests sent to Mongo; I can see requests going off to SQL Server in the 'Application map', but there is no sign of any other external requests. Is there any way I can see telemetry for requests made to Mongo?
EDIT - Thanks to Peter Bons, I ended up with pretty much the following, which works like a charm and allows me to distinguish between success and failure:
var telemetryClient = new TelemetryClient();
var connectionString = connectionStringSettings.ConnectionString;
var mongoUrl = new MongoUrl(connectionString);
var mongoClientSettings = MongoClientSettings.FromUrl(mongoUrl);
mongoClientSettings.ClusterConfigurator = clusterConfigurator =>
{
clusterConfigurator.Subscribe<CommandSucceededEvent>(e =>
{
telemetryClient.TrackDependency("MongoDB", e.CommandName, DateTime.Now.Subtract(e.Duration), e.Duration, true);
});
clusterConfigurator.Subscribe<CommandFailedEvent>(e =>
{
telemetryClient.TrackDependency("MongoDB", $"{e.CommandName} - {e.ToString()}", DateTime.Now.Subtract(e.Duration), e.Duration, false);
});
};
var mongoClient = new MongoClient(mongoClientSettings);
I am not familiar with MongoDB but as far as I can tell there is no default support for it when it comes to Application Insights. But that does not mean you cannot do this, it will just involve some more code.
Again, I am not familiar with MongoDB but according to http://www.mattburkedev.com/logging-queries-from-mongodb-c-number-driver/ there is built-in support for logging the generated queries. Now, we only need to hook this up to Application Insights.
Since you already know how to use the TelemetryClient we can use the custom tracking methods provided by that class. See https://learn.microsoft.com/nl-nl/azure/application-insights/app-insights-api-custom-events-metrics for the available custom tracking methods.
All you need to do is to insert some code like this:
telemetryClient.TrackDependency(
"MongoDB", // The name of the dependency
query, // Text of the query
DateTime.Now, // Time that query is executed
TimeSpan.FromSeconds(0), // Time taken to execute query
true); // Indicates success
The TelemetryClient class is thread-safe, so you can reuse a single instance.
Now, according to the referenced blogpost you should be able to do something like this:
var client = new MongoClient(new MongoClientSettings()
{
Server = new MongoServerAddress("localhost"),
ClusterConfigurator = cb =>
{
cb.Subscribe<CommandStartedEvent>(e =>
{
telemetryClient.TrackDependency(
"MongoDB", // The name of the dependency
e.Command.ToJson(), // Text of the query
DateTime.Now, // Time that query is executed
TimeSpan.FromSeconds(0), // Time taken to execute query
true); // Indicates success
});
}
});
Again, I am not familiar with MongoDB but I hope this is a starting point for your imagination on how to adapt it to your needs using your knowledge of MongoDB.
EDIT:
If there is also a CommandCompletedEvent or similar event, as opposed to the CommandStartedEvent, you should probably track the dependency there instead, because then you should be able to calculate (or simply read) the time spent and perhaps get the actual value for the success indicator.
I use the Kiln.Net library to connect to a Mercurial repository. I need to get basic information (commits, lines of code changed, ...), and then group that info to show progress for each author. But so far I have had no success.
Code to connect:
var account = "exampleRepo"; // examplerepo.kilnhg.com
var user = "exampleUsername"; // username
var password = "examplePassword"; // password
using (Kiln myAccount = Kiln.AuthenticateOnDemand(account, user, password)) // Here 404 error
{
// Returns changeset history for the repository
Changeset[] changesets;
changesets = myAccount.GetHistory(repo.ID, 100);
// Returns the list of all available projects
Project[] projects;
projects = myAccount.GetProjects();
projects = myAccount.Call<Project[]>(KilnApiCall.Projects, null);
}
While debugging I saw that the auth URL seems fine. It looks like:
https://exampleRepo.kilnhg.com/Kiln/Api/1.0/Auth/Login?sUser=exampleUsername&sPassword=examplePassword
But after executing the request I always get a 404 Not Found error. Thanks in advance for your help.
The problem was fixed. The Kiln.Net library was generating a bad URL; the good one is just without "/Kiln":
https://exampleRepo.kilnhg.com/Api/1.0/Auth/Login?sUser=exampleUsername&sPassword=examplePassword