Get ElasticSearch bulk queue status with NEST - c#

I have a program that performs several bulk index operations on an ElasticSearch cluster. At some point, I start getting errors like this one (snipped):
RemoteTransportException[...][indices:data/write/bulk[s]]]; nested: EsRejectedExecutionException[rejected execution (queue capacity 100) ...];
Is there a way I can verify the status of the bulk upload queue, ideally using NEST, so that I can slow down the client application in case I see that the queue on the server is getting full?
The NodesInfo method looks interesting, but I don't see how to access the information I need:
using Nest;
using System;

class Program {
    static void Main(string[] args) {
        ElasticClient client = new ElasticClient(new ConnectionSettings(new Uri("http://whatever:9200/")));
        var nodesInfoResponse = client.NodesInfo();
        if (nodesInfoResponse.IsValid) {
            foreach (var n in nodesInfoResponse.Nodes) {
                Console.WriteLine($"Node: {n.Key}");
                var bulk = n.Value.ThreadPool["bulk"];
                // ???
            }
        }
    }
}

You need to use NodesStats() and not NodesInfo().
var nodesStatsResponse = client.NodesStats();
if (nodesStatsResponse.IsValid)
{
    foreach (var node in nodesStatsResponse.Nodes)
    {
        long bulkThreadPoolQueueSize = node.Value.ThreadPool["bulk"].Queue;
    }
}
UPDATE:
The query above brings back far more information than required. A much leaner way to get the same information is the _cat/thread_pool API. See below:
var catThreadPoolResponse = client.CatThreadPool(d => d.H("host", "bulk.queue"));
if (catThreadPoolResponse.IsValid)
{
    foreach (var record in catThreadPoolResponse.Records)
    {
        string nodeName = record.Host;
        long bulkThreadPoolQueueSize = int.Parse(record.Bulk.Queue);
        Console.WriteLine($"Node [{nodeName}] : BulkThreadPoolQueueSize [{bulkThreadPoolQueueSize}]");
    }
}
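To actually slow the client down, one option is to poll the bulk queue size before each batch and back off while it looks full. This is a minimal sketch building on the NodesStats() call above; the threshold value is an assumption you would tune for your cluster (the error message reports a queue capacity of 100, so it starts backing off well before that):

using System;
using System.Linq;
using System.Threading;
using Nest;

static class BulkThrottle
{
    // Assumed threshold, not from the question: back off before the queue (capacity 100) fills up.
    private const int QueueThreshold = 80;

    // Largest bulk queue across all nodes, using the same NodesStats data as above.
    private static long MaxBulkQueueSize(IElasticClient client)
    {
        var stats = client.NodesStats();
        if (!stats.IsValid) return 0; // fail open if the stats call itself fails
        return stats.Nodes.Max(n => n.Value.ThreadPool["bulk"].Queue);
    }

    // Call this before each bulk request to crudely back off while the queue is busy.
    public static void WaitForCapacity(IElasticClient client)
    {
        while (MaxBulkQueueSize(client) > QueueThreshold)
        {
            Thread.Sleep(TimeSpan.FromSeconds(1));
        }
    }
}

A production version would probably use exponential backoff and also watch the rejected count, but this illustrates the idea.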

Related

CosmosDb - returns null properties .NET

I have a problem with downloading CosmosDb data, even when following the tutorial.
So in the beginning, my CosmosDb looks like this:
I tried to simply add a new class:
public class CaseModel
{
    [JsonProperty("_id")]
    public string Id { get; set; }

    [JsonProperty("vin")]
    public string Vin { get; set; }
}
and then just do as mentioned in the documentation:
using (FeedIterator<CaseModel> iterator = collection.GetItemLinqQueryable<CaseModel>(true).ToFeedIterator())
{
    while (iterator.HasMoreResults)
    {
        foreach (var item in await iterator.ReadNextAsync())
        {
            var x = item;
        }
    }
}
This way, the code iterates over many elements (so it seems to be working), but the properties are always null, as if the mapping did not work:
Then I tried something like this
using (FeedIterator<CaseModel> feedIterator = collection.GetItemQueryIterator<CaseModel>(
    "select * from cases",
    null,
    new QueryRequestOptions() { PartitionKey = new PartitionKey("shardedKey") }))
{
    while (feedIterator.HasMoreResults)
    {
        foreach (var item in await feedIterator.ReadNextAsync())
        {
            var x = item;
        }
    }
}
But this query returns no results.
I have no idea what is wrong.
I last worked with CosmosDb on Azure about a year ago and was doing similar things.
The only thing that strikes me as strange is that the elements are marked as 'documents'.
In the end, my code, which should work, looks like this:
var dbClient = new CosmosClient(info.ConnectionString);
var db = dbClient.GetDatabase(info.DatabaseName);
var collection = db.GetContainer(info.Collection);

using (FeedIterator<CaseModel> iterator = collection.GetItemLinqQueryable<CaseModel>(true)
    .ToFeedIterator())
{
    while (iterator.HasMoreResults)
    {
        foreach (var item in await iterator.ReadNextAsync())
        {
            var x = item;
        }
    }
}
In the debug window, I can see that the first three steps (connect with the connection string, get the database, get the container) work.
You are mixing APIs. The SDK you are referencing (Microsoft.Azure.Cosmos) is the SQL API SDK: https://learn.microsoft.com/azure/cosmos-db/sql/sql-api-sdk-dotnet-standard
The screenshot in your question is from a Mongo API account.
Either you use a SQL API account with that SDK or you use the C# Mongo driver to interact with your Mongo API account.
SQL API accounts use id as the property for Ids/document identifier, not _id.
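If the account really were a SQL API account, a minimal model sketch for the same document might look like this (the vin property is taken from the question; the key point is the id mapping):

using Newtonsoft.Json;

// Sketch for a SQL API account: the document identifier property is "id", not "_id".
public class CaseModel
{
    [JsonProperty("id")]
    public string Id { get; set; }

    [JsonProperty("vin")]
    public string Vin { get; set; }
}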

ServiceStack.Redis unable to connect sPort

I've been trying to figure out for a few days now why I am getting exceptions such as the one at http://i.imgur.com/cfCBWRS.png when running code like this:
public virtual bool CreateOrUpdateValueById<T>(TQuery query, TResult value)
{
    using (var redisClient = Connection.RedisManager.GetClient())
    {
        var redis = redisClient.As<TResult>();
        var key = query.GetKeyWithId();
        redis.SetEntry(key, value);
        return true;
    }
}
which runs in a loop over several hundred items:
foreach (var playlistItem in playlistItems)
{
    var query = new PlaylistItemsQuery(playlistItem.Id, playlistItem.PlaylistId);
    _playlistItemsQueryHandler.CreateOrUpdateValueById<PlaylistItemDto>(query, playlistItem);
}
It also happens for any get query:
public virtual IEnumerable<TResult> GetAllValues(TQuery query)
{
    using (var redisClient = Connection.RedisManager.GetReadOnlyClient())
    {
        var keys = redisClient.ScanAllKeys(query.GetKeyWithAllIds()).ToList();
        return redisClient.GetValues<TResult>(keys);
    }
}
I use a singleton class for the Redis pool:
public static IRedisClientsManager RedisManager { get; } = new PooledRedisClientManager
{
    ConnectTimeout = 60000
};
I am hosting Redis on localhost on Windows, which is not officially supported; can this really be the cause?
The error message suggests the Redis Client is unable to make a TCP Connection with the Remote Redis Server. If you're not using a licensed version of ServiceStack.Redis v4 then it could mean you've exceeded the ServiceStack.Redis Free Quota Limits.
Otherwise, confirm that you can connect to it from redis-cli.exe; if you can't, try restarting the redis-server.
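As a quick sanity check from code, you can also ping the server directly with a bare RedisClient. This is a minimal sketch assuming Redis is on the default localhost:6379:

using System;
using ServiceStack.Redis;

// Minimal connectivity check; host and port are assumed defaults.
using (var client = new RedisClient("localhost", 6379))
{
    Console.WriteLine(client.Ping() ? "Redis reachable" : "Ping failed");
}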

Emails being sent twice

We have an email queue table in the database. It holds the subject, HTML body, to address, from address etc.
In Global.asax, the Process() function is called at a set interval and despatches a batch of emails. Here's the code:
namespace v2.Email.Queue
{
    public class Settings
    {
        // How often process() should be called in seconds
        public const int PROCESS_BATCH_EVERY_SECONDS = 1;

        // How many emails should be sent in each batch. Consult SES send rates.
        public const int EMAILS_PER_BATCH = 20;
    }

    public class Functions
    {
        private static Object QueueLock = new Object();

        /// <summary>
        /// Process the queue
        /// </summary>
        public static void Process()
        {
            lock (QueueLock)
            {
                using (var db = new MainContext())
                {
                    var emails = db.v2EmailQueues.OrderBy(c => c.ID).Take(Settings.EMAILS_PER_BATCH);
                    foreach (var email in emails)
                    {
                        var sent = Amazon.Emailer.SendEmail(email.FromAddress, email.ToAddress, email.Subject,
                            email.HTML);
                        if (sent)
                            db.ExecuteCommand("DELETE FROM v2EmailQueue WHERE ID = " + email.ID);
                        else
                            db.ExecuteCommand("UPDATE v2EmailQueue Set FailCount = FailCount + 1 WHERE ID = " + email.ID);
                    }
                }
            }
        }
    }
}
The problem is that every now and then it's sending one email twice.
Is there any reason from the code above that could explain this double sending?
Small test as per Matthew's suggestion:
const int testRecordID = 8296;

using (var db = new MainContext())
{
    context.Response.Write(db.tblLogs.SingleOrDefault(c => c.ID == testRecordID) == null ? "Not Found\n\n" : "Found\n\n");
    db.ExecuteCommand("DELETE FROM tblLogs WHERE ID = " + testRecordID);
    context.Response.Write(db.tblLogs.SingleOrDefault(c => c.ID == testRecordID) == null ? "Not Found\n\n" : "Found\n\n");
}

using (var db = new MainContext())
{
    context.Response.Write(db.tblLogs.SingleOrDefault(c => c.ID == testRecordID) == null ? "Not Found\n\n" : "Found\n\n");
}
When there is a record, it returns:
Found
Found
Not Found
If I use this method to clear the context cache after the DELETE SQL query, it returns:
Found
Not Found
Not Found
However, I'm still not sure whether it's the root cause of the problem. I would have thought the locking would definitely stop double sends.
The issue that you're having is due to the way Entity Framework does its internal caching.
In order to increase performance, Entity Framework will cache entities to avoid doing a database hit.
Entity Framework will update its cache when you are doing certain operations on DbSet.
Entity Framework does not understand that your "DELETE FROM ... WHERE ..." statement should invalidate the cache because EF is not an SQL engine (and does not know the meaning of the statement you wrote). Thus, to allow EF to do its job, you should use the DbSet methods that EF understands.
foreach (var email in db.v2EmailQueues.OrderBy(c => c.ID).Take(Settings.EMAILS_PER_BATCH))
{
    // whatever your Amazon code was...
    var sent = Amazon.Emailer.SendEmail(email.FromAddress, email.ToAddress, email.Subject, email.HTML);

    if (sent)
    {
        db.v2EmailQueues.Remove(email);
    }
    else
    {
        email.FailCount++;
    }
}

// this will update the database, and its internal cache.
db.SaveChanges();
On a side note, you should leverage the ORM as much as possible; not only will it save time debugging, it also makes your code easier to understand.

Webservice Performance

I have a couple of load-balanced application servers running on IIS 7. I need to check how many web service calls are made from each of the servers, and I need to check this at a particular instant. Is there something in .NET that communicates with both servers and gives me a snapshot at a particular instant?
Thanks
You could use Perfmon to add statistics regarding the number of calls. Once you're doing that, you could add timing data as well... You can then use Perfmon on the local box or hook up to it remotely with any number of tools.
Sorry I can't point you to specifics -- I've only seen it done, not done it myself :) But I think it is pretty straightforward.
And some sample code, showing how you could implement performance counters:
using System;
using System.Configuration;
using System.Diagnostics;

namespace TEST
{
    // sample implementation
    public static class PerformanceHelper
    {
        // update a performance counter value
        public static void UpdateCounter(string WebMethodName, int count)
        {
            // to be able to turn the monitoring on or off
            if (ConfigurationManager.AppSettings["PerformanceMonitor"].ToUpper() == "TRUE")
            {
                PerformanceCounter counter;
                if (!PerformanceCounterCategory.Exists("SAMPLE"))
                {
                    CounterCreationDataCollection listCounters = new CounterCreationDataCollection();
                    CounterCreationData newCounter = new CounterCreationData(WebMethodName, WebMethodName, PerformanceCounterType.NumberOfItems64);
                    listCounters.Add(newCounter);
                    PerformanceCounterCategory.Create("SAMPLE", "DESCRIPTION", PerformanceCounterCategoryType.SingleInstance, listCounters);
                }
                else
                {
                    if (!PerformanceCounterCategory.CounterExists(WebMethodName, "SAMPLE"))
                    {
                        CounterCreationDataCollection rebuildCounterList = new CounterCreationDataCollection();
                        CounterCreationData newCounter = new CounterCreationData(WebMethodName, WebMethodName, PerformanceCounterType.NumberOfItems64);
                        rebuildCounterList.Add(newCounter);
                        PerformanceCounterCategory category = new PerformanceCounterCategory("SAMPLE");
                        foreach (var item in category.GetCounters())
                        {
                            CounterCreationData existingCounter = new CounterCreationData(item.CounterName, item.CounterName, item.CounterType);
                            rebuildCounterList.Add(existingCounter);
                        }
                        PerformanceCounterCategory.Delete("SAMPLE");
                        PerformanceCounterCategory.Create("SAMPLE", "DESCRIPTION", PerformanceCounterCategoryType.SingleInstance, rebuildCounterList);
                    }
                }
                counter = new PerformanceCounter("SAMPLE", WebMethodName, false);
                if (count == -1)
                    counter.IncrementBy(-1);
                else
                    counter.IncrementBy(count);
            }
        }
    }
}
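A hypothetical call site, assuming a classic ASMX-style web method (the service and method names here are illustrative, not from the question):

using System.Web.Services;

public class CustomerService : WebService
{
    [WebMethod]
    public string GetCustomer(int id)
    {
        // bump the per-method counter by one call
        PerformanceHelper.UpdateCounter("GetCustomer", 1);

        // ... actual work ...
        return "customer " + id;
    }
}

You can then watch the SAMPLE category in Perfmon on each box, locally or remotely.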

Examples of Quickbase API in C#

I am fairly new to the use of APIs and haven't touched Quickbase until today. I was researching the Quickbase API and it seemed as if all the examples I saw were written in XML or some similar variant. Is there a way to write code in C# that will do the same things that I saw could be done on the Quickbase website's API documentation? If you know of any code examples, please let me know.
There is a QuickBase C# SDK that might help get you started.
using System;
using Intuit.QuickBase.Client;

namespace MyProgram.QB.Interaction
{
    class MyApplication
    {
        static void Main(string[] args)
        {
            var client = QuickBase.Client.QuickBase.Login("your_QB_username", "your_QB_password");
            var application = client.Connect("your_app_dbid", "your_app_token");
            var table = application.GetTable("your_table_dbid");
            table.Query();
            foreach (var record in table.Records)
            {
                Console.WriteLine(record["your_column_heading"]);
            }
            client.Logout();
        }
    }
}
There is also a QuickBase API Wrapper example.
Back in 2009 I wrote a .NET API for QuickBase which makes working with the platform easy; it also supports uploading and downloading of attached files.
IQuickBaseService svc = new QuickBaseService("user", "pass", "URL", "token");

Schema schema = svc.GetSchema("DBID");
Console.WriteLine("Schema : {0}", schema.Name);
Console.WriteLine("Variables - ");
foreach (KeyValuePair<string, string> ent in schema.Variables.OrderBy(en => en.Key)) {
    Console.WriteLine("Var: {0} = {1}", ent.Key, ent.Value);
}
foreach (Query q in schema.Queries) {
    // Work with queries.
}
// schema.Children
// schema.Fields
// ...
svc.SignOut();
Performing a query is simple.
QueryResult res;
res = svc.Query("tableid", 1); // Execute query number 1
res = svc.Query("tableid", "{{140.EX.'1'}}"); // execute QB query text

foreach (QueryRow row in res.Rows) {
    // Do something with row, use Get<type>; not all shown here.
    // row.GetBool(1);
    // row.GetInt(1);
    // row.GetLong(1);
    // row.GetFloat(1);
    // row.GetDouble(1);
    // row.GetDecimal(1);
    // row.GetString(1);
    // row.GetDate(1);
    // row.GetDateTime(1);
    // row.GetObject(1);
}
The QuickBase SDK code has now moved to GitHub: https://github.com/QuickbaseAdmirer/QuickBase-C-Sharp-SDK
