Apache Ignite cache viewer like Redis Desktop Manager - C#

I am loving Apache Ignite, particularly as a distributed cache. However, I have realised that the tooling is not as good.
I am looking for a simple desktop tool to view and search the cache for values, something similar to Redis Desktop Manager.
I am in a Windows environment. My Google searches have returned "DBeaver", which I have downloaded and configured, but it doesn't show my cache key values. The other one has been "Web Console", though this is web based and I would prefer a desktop tool - not sure if I can install it locally?
Anything else around?
Much appreciated.

I think the closest you can get is LINQPad + the .NET Thin Client.
The Ignite NuGet package actually includes a LINQPad sample that gets the first 5 items from every cache in the cluster and displays them; you can modify it to your needs.
This approach requires some coding, but it is quite flexible, with LINQ capabilities and a rich API at your disposal, plus LINQPad's data display features.
Sample code:
var cfg = new IgniteClientConfiguration { Host = "127.0.0.1" };

using (var client = Ignition.StartClient(cfg))
{
    // Create a cache for demo purposes.
    var fooCache = client.GetOrCreateCache<int, object>("thin-client-test")
        .WithKeepBinary<int, IBinaryObject>();

    fooCache[1] = client.GetBinary().GetBuilder("foo")
        .SetStringField("Name", "John")
        .SetTimestampField("Birthday", new DateTime(2001, 5, 15).ToUniversalTime())
        .Build();

    var cacheNames = client.GetCacheNames();

    "Displaying first 5 items from each cache:".Dump();

    foreach (var name in cacheNames)
    {
        var cache = client.GetCache<object, object>(name).WithKeepBinary<object, object>();

        var items = cache.Query(new ScanQuery<object, object>()).Take(5)
            .ToDictionary(x => x.Key.ToString(), x => x.Value.ToString());

        items.Dump(name);
    }
}

GridGain has a GUI tool which allows you to connect to your grid, peek into caches, and do many more things.
It is part of a commercial offering, but it will connect to Apache Ignite grids.

Related

Microsoft.Azure.Cosmos Most Efficient way to Export Large Data

I need to export thousands of files from a Cosmos DB, and I am wondering if there may be a more efficient way to get all these documents (but I haven't been able to figure one out by browsing the documentation and searching).
Right now I am using the FeedIterator to get my results:
Database database = m_cosmosClient.GetDatabase(m_databaseId);
DatabaseResponse databaseResponse = await database.ReadAsync();

// The response from Azure Cosmos
DatabaseProperties properties = databaseResponse;

Container container = databaseResponse.Database.GetContainer(m_cosmosDbContainer);

QueryDefinition query = new QueryDefinition(queryString);
QueryRequestOptions queryOptions = new QueryRequestOptions { MaxItemCount = 10000, MaxBufferedItemCount = 10000 };

List<Article> results = new List<Article>();
FeedIterator<Article> resultSetIterator = container.GetItemQueryIterator<Article>(query, null, queryOptions);

while (resultSetIterator.HasMoreResults)
{
    FeedResponse<Article> response = await resultSetIterator.ReadNextAsync();
    results.AddRange(response);

    if (response.Diagnostics != null)
    {
        Console.WriteLine($"\nQueryWithSqlParameters Diagnostics: {response.Diagnostics.ToString()}");
    }
}
I am worried that without some form of multi-tasking I could run out of memory, and it is always nice to have a faster run time anyway.
The Cosmos DB Data Migration Tool is a good (and simple) option if you want to run the extract from a workstation. It can be run interactively or automated using scripts.
Creating a job in Azure Data Factory is a bit more complex but also offers a lot more flexibility.
This article discusses the various options for data migration in and out of Cosmos DB.
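If you do stay with the SDK approach from the question, one way to keep memory bounded is to write each page to disk as it arrives instead of accumulating everything in a list. A minimal sketch, reusing the Article type from the question; the output path, page size and JSON-lines format are just assumptions:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Newtonsoft.Json;

// Sketch only: stream query pages straight to a file so memory usage stays
// at roughly one page at a time. "Article" and the output path are placeholders.
public static async Task ExportToFileAsync(Container container, string queryString, string outputPath)
{
    var query = new QueryDefinition(queryString);
    var options = new QueryRequestOptions { MaxItemCount = 1000 };

    using (var writer = new StreamWriter(outputPath))
    using (FeedIterator<Article> iterator = container.GetItemQueryIterator<Article>(query, requestOptions: options))
    {
        while (iterator.HasMoreResults)
        {
            FeedResponse<Article> page = await iterator.ReadNextAsync();
            foreach (Article article in page)
            {
                // One JSON document per line; nothing is retained after it is written.
                await writer.WriteLineAsync(JsonConvert.SerializeObject(article));
            }
        }
    }
}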

Octopus client, getting version from project name in C#

First off, I am completely new to the Octopus client; I used it for the first time just before posting this.
So, I've been landed with this project to update the version number on a webpage monitoring some of our Octopus-deployed projects. I have been looking around the Octopus client and not really gotten anywhere. The best I have so far is:
OctopusServerEndpoint endPoint = new OctopusServerEndpoint(server, apiKey);
OctopusRepository repo = new OctopusRepository(endPoint);
var releases = repo.Releases.FindAll();
From these releases I can get the ProjectId and even the Version; the issue is that releases is 600 strong and I am only looking for 15 of them.
The existing code I have to work from used to parse the version from local files, so that is all out the window. Also, the existing code only deals with the actual names of the projects, like "AWOBridge", not their ProjectId, which is "Projects-27".
Right now my only option is to manually write up a key list or map to correlate the names I have with the IDs in the Octopus client, which I would of course rather not do, since it is not very extensible or good code practice in my opinion.
So if anyone has any idea on how to use the names directly with the Octopus client and get the version number from that, I would very much appreciate it.
I'll be getting down into octopus client while waiting. Let's see if I beat you to it!
Guess I beat you to it!
I'll just leave an answer here if anyone ever has the same problem.
I ended up using the dashboard to get what I needed:
OctopusServerEndpoint endPoint = new OctopusServerEndpoint(server, apiKey);
OctopusRepository repo = new OctopusRepository(endPoint);
DashboardResource dash = repo.Dashboards.GetDashboard();
List<DashboardItemResource> items = dash.Items;
DashboardItemResource item = new DashboardItemResource();
List<DashboardProjectResource> projs = dash.Projects;
var projID = projs.Find(x => x.Name == projectName).Id;
item = items.Find(x => x.ProjectId == projID && x.IsCurrent == true);
The dashboard is great since it contains all the info that the web dashboard shows. So you can use Project, Release, Deployment and Environment with all the information they contain.
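As a small follow-up, once you have the matching item, the deployed version should be readable straight off it. A minimal sketch; the property name is taken from the Octopus.Client resource model, so double-check it against your client version:

// item is the DashboardItemResource found above; ReleaseVersion is the version
// string the Octopus web dashboard shows for that project/environment.
string version = item != null ? item.ReleaseVersion : null;
Console.WriteLine(projectName + " is currently at version " + version);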
Hope this helps someone in the future!
I'm using LINQPad to run C# snippets for Octopus automation using the Octopus Client library, and I have come up with the following to get any version of a project, making use of a regular expression pattern. It works quite well if you use pre-release semantic versioning.
For example to get latest release for a project:
var project = Repo.Projects.FindByName("MyProjectName");
var release = GetReleaseForProject(project);
To get a specific release that has 'rc1' in the version, for example (also useful if you use the source code branch name in the version published to Octopus):
var release = GetReleaseForProject(project, "rc1");
public ReleaseResource GetReleaseForProject(ProjectResource project, string versionPattern = "")
{
    // Create a compiled regex to use for the version search.
    var regex = new Regex(versionPattern, RegexOptions.Compiled | RegexOptions.CultureInvariant | RegexOptions.IgnoreCase);

    var releases = Repo.Projects.GetReleases(project);

    if (!string.IsNullOrWhiteSpace(versionPattern) && !releases.Items.Any(r => regex.IsMatch(r.Version)))
    {
        return null;
    }

    return !string.IsNullOrWhiteSpace(versionPattern)
        ? releases.Items.Where(r => regex.IsMatch(r.Version)).First()
        : releases.Items.First();
}

How to find in VSPackage which version control system a solution uses

I'm new to extending Visual Studio and I'm trying to find a way to determine which source control system is used by the current solution.
I created a VSPackage project and I am able to obtain a reference to the solution via IVsSolution and to hook up to solution events via IVsSolutionEvents.
Inside OnAfterSolutionOpen (or possibly some other event, if there's an alternative) I would like to act differently based on whether the solution uses TFS or Git or something else. How can I obtain this information?
I plan to support as many Visual Studio versions as possible, but if that isn't possible I would like to support at least VS2012 and higher.
Ok, after several hours of digging I've found a solution to this. Thanks to the article by Mark Rendle and the source code for his NoGit extension, I've found that the list of registered source control plugins is located in the registry: HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0_Config\SourceControlProviders (in the case of VS 2013).
So now we can have both the plugin GUID and the name of the provider. This sample code can fetch those values:
var key = @"Software\Microsoft\VisualStudio\" + "12.0" + @"_Config\SourceControlProviders";
var subkey = Microsoft.Win32.Registry.CurrentUser.OpenSubKey(key);
var providerNames = subkey.GetSubKeyNames().Dump();

var dict = new Dictionary<Guid, String>();

foreach (var provGuidString in subkey.GetSubKeyNames())
{
    var provName = (string)subkey.OpenSubKey(provGuidString).GetValue("");
    dict.Add(Guid.Parse(provGuidString), provName);
}
Now, there are two ways I've found to obtain the GUID of the currently active provider.
Important update: apparently the second way of obtaining the currently active plugin does not work as expected. I strongly advise using the first solution.
This is the way based on the extension mentioned earlier:
var getProvider = GetService(typeof(IVsRegisterScciProvider)) as IVsGetScciProviderInterface;
Guid pGuid;
getProvider.GetSourceControlProviderID(out pGuid);
Or we can just go to HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0\CurrentSourceControlProvider and get the default value of this key:
var key2 = @"Software\Microsoft\VisualStudio\12.0\CurrentSourceControlProvider";
var guidString = (string)Microsoft.Win32.Registry.CurrentUser.OpenSubKey(key2).GetValue("");
var currentGuid = Guid.Parse(guidString);
Now we just take var activeProviderName = dict[currentGuid]; and that's all.

Barcode lookups for games, books, CDs

I am looking to write my own media library and was wondering: are there any free .NET APIs out there to identify a product based on a given barcode? As a secondary point, are there .NET APIs to return cover art for books, CDs, games etc. based on a barcode?
You can also use Amazon Web Services for this. You'll need to have an account and an API key (free) and download the toolkit for Visual Studio.
To answer your specific question, a barcode scan returns a UPC. You can use the ItemLookup operation of the Amazon web service for this lookup, setting the IdType to "UPC" and the ItemId to the appropriate UPC (the barcode scan).
For more info, navigate to the "API reference" in Amazon's developer guide.
Here is a very basic C# code sample:
com.amazon.webservices.ItemLookup itemLookup = new com.amazon.webservices.ItemLookup();
itemLookup.AWSAccessKeyId = "XXXXXXXXXXXXXXXXXXXXXX";

com.amazon.webservices.ItemLookupRequest request = new com.amazon.webservices.ItemLookupRequest();
request.IdType = com.amazon.webservices.ItemLookupRequestIdType.UPC;
request.ItemId = new String[] { "00028000133177" };
request.ResponseGroup = new String[] { "Small", "AlternateVersions" };

itemLookup.Request = new com.amazon.webservices.ItemLookupRequest[] { request };

try
{
    com.amazon.webservices.ItemLookupResponse itemLookupResponse = com.amazon.webservices.AWSECommerceService.ItemLookup(itemLookup);
    com.amazon.webservices.Item item = itemLookupResponse.Items[0].Item[0];
    System.Console.WriteLine(item.ItemAttributes.Title);
}
catch (Exception e)
{
    System.Console.Error.WriteLine(e);
}
Of note, I've written a program to do exactly this. I've found that Amazon does not always have a match for every UPC (which is expected), but a majority of my items are found in Amazon. Once you do find a match, you may want to store the ASIN / UPC relationship somewhere, so that you can reference the item by the ASIN (Amazon ID) going forward.
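As a sketch of that last point, the stored UPC-to-ASIN relationship can be as simple as a small persisted dictionary. A rough illustration; the class shape and tab-separated file format are just assumptions, not part of the Amazon API:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Minimal sketch of a persisted UPC -> ASIN lookup so repeat scans can skip
// the Amazon call. The tab-separated file is only an illustration.
public class UpcAsinCache
{
    private readonly string _path;
    private readonly Dictionary<string, string> _map;

    public UpcAsinCache(string path)
    {
        _path = path;
        _map = File.Exists(path)
            ? File.ReadAllLines(path)
                  .Select(line => line.Split('\t'))
                  .Where(parts => parts.Length == 2)
                  .ToDictionary(parts => parts[0], parts => parts[1])
            : new Dictionary<string, string>();
    }

    public bool TryGetAsin(string upc, out string asin) => _map.TryGetValue(upc, out asin);

    public void Add(string upc, string asin)
    {
        _map[upc] = asin;
        File.AppendAllText(_path, upc + "\t" + asin + Environment.NewLine);
    }
}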
Try this: http://www.ozgrid.com/barcodes/barcode-reader.htm
EDIT1:
An API for the www.upcdatabase.com Barcode Query: http://www.upcdatabase.com/xmlrpc.asp
I cannot provide much information about this, but it might help you. This page has C# and VB APIs for querying with a barcode.
Hooked In Motion has one: http://www.hookedinmotion.com/?page_id=246

What is the Fastest way to read event log on remote machine?

I am working on an application which reads event logs (Application) from remote machines. I am making use of the EventLog class in .NET and then iterating over the log entries, but this is very slow. In some cases, machines have 40,000+ log entries and it takes hours to iterate through them.
What is the best way to accomplish this task? Are there any other classes in .NET which are faster, or any other technology?
Man, I feel your pain. We had the exact same issue in our app.
Your solution branches depending on which Windows version you're running on and which version your "target" machine is running.
If you're both on Vista or Windows Server 2008, you're in luck. You should look at System.Diagnostics.Eventing.Reader.EventLogQuery and System.Diagnostics.Eventing.Reader.EventLogReader. These are new in .net 3.5.
Basically, you can build a query in XML and ship it over to run on the remote computer. Maybe you're just searching for events of a specific type, or maybe just new events from a specific point in time. The search runs on the remote machine, and then you just get back the matching events. The new classes are much faster than the old .net 2.0 way, but again, they are only supported on Vista or Windows Server 2008.
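For illustration, a remote query with those classes looks roughly like this; the machine name, log name and the Level=2 filter are placeholders:

using System;
using System.Diagnostics.Eventing.Reader;

// Rough sketch: run an XPath query against a remote machine's Application log.
// Requires Vista/Server 2008 or later on both ends and suitable rights.
var session = new EventLogSession("RemoteMachineName");
var query = new EventLogQuery("Application", PathType.LogName, "*[System/Level=2]")
{
    Session = session
};

using (var reader = new EventLogReader(query))
{
    for (EventRecord record = reader.ReadEvent(); record != null; record = reader.ReadEvent())
    {
        Console.WriteLine("{0}  {1}  {2}", record.TimeCreated, record.ProviderName, record.Id);
    }
}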
For our app when the target is NOT on Vista/Win2008, we downloaded the raw .evt file from the remote system, and then parsed the file using its binary format. There are several sources of data about the event log format for .evt files (pre-Vista), including link text and an article I recall on codeproject.com that had some c# code.
Vista and Windows Server 2008 machines use a new .evtx format, so you can't use the same binary parsing approach across all versions. But the new EventLogQuery and EventLogReader classes are so fast that you won't need to; it's now perfectly speedy to just use the built-in classes.
Event Log Reader is horribly slow... too slow. WTF Microsoft?
Use LogParser 2.2 - Search for C# and LogParser on the Internet (or you can use the log parser commands from the command line). I don't want to duplicate the work already contributed by others.
I pull the log from the remote system by having the log exported as an EVTX file. I then copy the file from the remote system. This process is really quick - even with a network that spans the planet (I had issues with having the log exported to a network resource). Once you have it locally, you can do your searches and processing.
There are multiple reasons for having the EVTX - I won't get into the reasons why we do this.
The following is a working example of the code to save a copy of the log as an EVTX:
(Notes: "device" is the network host name or IP. "LogName" is the name of the log desired: "System", "Security", or "Application". outputPathOnRemoteSystem is the path on the remote computer, such as "c:\temp\%hostname%.%LogName%.%YYYYMMDD_HH.MM%.evtx".)
static public bool DumpLog(string device, string LogName, string outputPathOnRemoteSystem, out string errMessage)
{
    bool wasExported = false;
    string errorMessage = "";

    try
    {
        System.Diagnostics.Eventing.Reader.EventLogSession els = new System.Diagnostics.Eventing.Reader.EventLogSession(device);
        els.ExportLogAndMessages(LogName, PathType.LogName, "*", outputPathOnRemoteSystem);
        wasExported = true;
    }
    catch (UnauthorizedAccessException e)
    {
        errorMessage = "Unauthorized - Access Denied: " + e.Message;
    }
    catch (EventLogNotFoundException e)
    {
        errorMessage = "Event Log Not Found: " + e.Message;
    }
    catch (EventLogException e)
    {
        errorMessage = "Export Failed: " + e.Message + ", Log: " + LogName + ", Device: " + device;
    }

    errMessage = errorMessage;
    return wasExported;
}
A good Explanation/Example can be found on MSDN.
EventLogSession session = new EventLogSession(Environment.MachineName);

// "*[System/Level=2]" selects only error-level events.
// "Log" is the name of the log you want to get data from.
EventLogQuery query = new EventLogQuery("Log", PathType.LogName, "*[System/Level=2]");
EventLogReader reader = new EventLogReader(query);

for (EventRecord eventInstance = reader.ReadEvent();
     null != eventInstance;
     eventInstance = reader.ReadEvent())
{
    // Output or save your event data here.
}
Where the old code took 5-20 minutes, this one does it in less than 10 seconds.
Maybe WMI can help you:
WMI with C#
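For reference, that approach usually goes through System.Management and the Win32_NTLogEvent class. A rough sketch; the machine name and filter are placeholders, and WMI tends to be slower than EventLogReader on large logs:

using System;
using System.Management;

// Rough WMI sketch: pull error entries from the Application log of a remote
// machine. Needs a reference to System.Management and appropriate rights.
var scope = new ManagementScope(@"\\RemoteMachineName\root\cimv2");
scope.Connect();

var query = new ObjectQuery(
    "SELECT * FROM Win32_NTLogEvent WHERE Logfile = 'Application' AND EventType = 1");

using (var searcher = new ManagementObjectSearcher(scope, query))
{
    foreach (ManagementObject entry in searcher.Get())
    {
        Console.WriteLine("{0}  {1}", entry["TimeGenerated"], entry["Message"]);
    }
}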
Have you tried using the remoting features in PowerShell 2.0? They allow you to execute cmdlets (like the ones to read event logs) on remote machines and return the results (as objects, of course) to the calling session.
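If you want to drive that from C# rather than a console, here is a rough sketch using System.Management.Automation; WinRM must be enabled on the target, and the machine name, log name and -Newest count are placeholders:

using System;
using System.Management.Automation;

// Rough sketch: run Get-EventLog on a remote machine via PowerShell remoting
// and read the results back as objects in the calling process.
using (PowerShell ps = PowerShell.Create())
{
    ps.AddScript(
        "Invoke-Command -ComputerName RemoteMachineName " +
        "-ScriptBlock { Get-EventLog -LogName Application -Newest 100 }");

    foreach (PSObject result in ps.Invoke())
    {
        Console.WriteLine("{0}  {1}",
            result.Properties["TimeGenerated"].Value,
            result.Properties["Message"].Value);
    }
}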
You could place a program on those machines that saves the log to a file and sends it to your web application. I think that would be a lot faster, as you can do the looping locally, but I'm not sure how to do it so I can't give you any code :(
I recently did such a thing via a WCF callback interface. However, my clients already interacted with the server through WCF, so adding a WCF callback was easy in my project; full code with examples is available here
Just had the same issue and want to share my solution. It makes a search through the Application, System and Security event logs go from 260 seconds (using EventLog) to about 100 times faster (using EventLogQuery).
And this in a way where it is possible to check whether the event message contains a pattern, or run any other check, without requiring FormatDescription().
My trick is to use the same mechanism as PowerShell's Get-WinEvent does and then pass it through the result check.
Here is my code to find all events within the last 4 days where the event message contains a filter pattern.
string[] eventLogSources = { "Application", "System", "Security" };
var messagePattern = "*Your Message Search Pattern*";
var timeStamp = DateTime.Now.AddDays(-4);
var matchingEvents = new List<EventRecord>();

foreach (var eventLogSource in eventLogSources)
{
    var i = 0;
    var query = string.Format("*[System[TimeCreated[@SystemTime >= '{0}']]]",
        timeStamp.ToUniversalTime().ToString("o"));

    var elq = new EventLogQuery(eventLogSource, PathType.LogName, query);
    var elr = new EventLogReader(elq);

    EventRecord entryEventRecord;
    while ((entryEventRecord = elr.ReadEvent()) != null)
    {
        if ((entryEventRecord.Properties)
            .FirstOrDefault(x => (x.Value.ToString()).Contains(messagePattern)) != null)
        {
            matchingEvents.Add(entryEventRecord);
            i++;
        }
    }
}
Maybe the remote computers could do a little bit of the computing. That way your server would only deal with relevant information. It would be a kind of cluster, using the remote computers to do some light filtering while the server does the analysis part.
