ConcurrentDictionary - AddOrUpdate issue - C#

I'm using this code below to try to update the values in a dictionary object depending on its key.
public static ConcurrentDictionary<string, SingleUserStatisticsViewModel> UsersViewModel =
    new ConcurrentDictionary<string, SingleUserStatisticsViewModel>();

var userSession = new UserSessionStatistic()
{
    Id = "12345",
    Browser = "Netscape"
};
var userViewModel = new SingleUserStatisticsViewModel()
{
    UserSessionStatistic = userSession,
    StartTime = DateTime.Now
};
//Add first time
MyStaticClass.UsersViewModel.AddOrUpdate(userViewModel.UserSessionStatistic.Id, userViewModel, (key, model) => model);
//try to Update
var userSession2 = new UserSessionStatistic()
{
    Id = "12345",
    Browser = "not getting updated????"
};
var userViewModel2 = new SingleUserStatisticsViewModel()
{
    UserSessionStatistic = userSession2,
    StartTime = DateTime.Now
};
MyStaticClass.UsersViewModel.AddOrUpdate(userViewModel2.UserSessionStatistic.Id, userViewModel2, (key, model) => model);
But the UserSessionStatistic object in userViewModel2 is not getting updated in the ConcurrentDictionary (its Browser property still says "Netscape"). What am I doing wrong?

About the value factory, the docs say:

updateValueFactory
Type: System.Func<TKey, TValue, TValue>
The function used to generate a new value for an existing key based on the key's existing value.

Which means you're passing it the existing value. Your factory (key, model) => model simply returns that existing value, so the entry never changes. Return the new value instead:
MyStaticClass.UsersViewModel.AddOrUpdate(
    userViewModel2.UserSessionStatistic.Id,
    userViewModel2,
    (key, oldModel) => userViewModel2);
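The factory can also combine the old and new values rather than replace wholesale, since it receives both the key and the current value. A minimal sketch of such a merge (my illustration, assuming the same view model types as above):

// Hypothetical merge: take the new session data but preserve the original StartTime.
MyStaticClass.UsersViewModel.AddOrUpdate(
    userViewModel2.UserSessionStatistic.Id,
    userViewModel2, // added as-is if the key is absent
    (key, oldModel) => new SingleUserStatisticsViewModel
    {
        UserSessionStatistic = userViewModel2.UserSessionStatistic,
        StartTime = oldModel.StartTime
    });

Note that the update factory may be invoked more than once under contention, so it should be side-effect free.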

How to use different ChangeTracker instances for each inMemory Database?

I'm trying to use InMemory Database to help test out my methods.
I'm having trouble running multiple DataRows in a test because all but one give me an error when trying to add a default record to the context. If I run each DataRow individually they all pass, but as soon as I run the whole test, all but one fail when they reach mCntx.Brokers.Add(new() ... with the following error.
System.InvalidOperationException: 'The instance of entity type 'Broker' cannot be tracked because another instance with the same key value for {'Id'} is already being tracked. When attaching existing entities, ensure that only one entity instance with a given key value is attached. Consider using 'DbContextOptionsBuilder.EnableSensitiveDataLogging' to see the conflicting key values.'
I've tried setting up a new Db name as is recommended but it still seems to draw from the same ChangeTracker.
Test Code
[TestMethod()]
[DataRow(null, _ValidBrokerNameNotInDb, _ValidBrokerageIdInDb, _ValidBrokerageNameInDb)]
[DataRow(null, _ValidBrokerNameNotInDb, null, _ValidBrokerageNameInDb)]
[DataRow(null, _ValidBrokerNameNotInDb, _ValidBrokerageIdInDb, null)]
[DataRow(null, _ValidBrokerNameNotInDb, null, _ValidBrokerageNameNotInDb)]
public void BrokerValidationTest_CreateInDb(int? brokerId, string brokerName, int? brokerageId, string brokerageName)
{
    DataTable dataTable = GetDataTable(brokerId, brokerName, brokerageId, brokerageName);
    List<UploadIssue> uploadIssues = new();
    var rand = new Random().NextDouble();
    using (SqlDboDbContext cntx = GetContext($"BrokerValidationTest_CreateInDb{rand}"))
    {
        InsertUpdateDatabaseService.ImportBroker(dataTable, cntx, ref uploadIssues);
        cntx.ChangeTracker.Clear();
    }
    Assert.IsFalse(uploadIssues.Any(x => x.UploadIssueType == UploadIssueTypes.Error));
    Assert.IsFalse(uploadIssues.Any(x => x.UploadIssueType == UploadIssueTypes.DependencyError));
    Assert.IsFalse(uploadIssues.Any(x => x.UploadIssueType == UploadIssueTypes.Prompt));
}
public static SqlDboDbContext GetContext(string dbName)
{
    var rand = new Random().NextDouble();
    var options = new DbContextOptionsBuilder<SqlDboDbContext>()
        .UseInMemoryDatabase(databaseName: dbName + rand.ToString())
        .Options;
    SqlDboDbContext mCntx = new SqlDboDbContext(options);
    mCntx.Brokers.Add(new()
    {
        Id = _ValidBrokerIdInDb,
        Name = _ValidBrokerNameInDb,
        BrokerageId = _ValidBrokerageIdInDb,
        Brokerage = DbBrokerage
    });
    mCntx.SaveChanges();
    return mCntx;
}
The issue seems to be with Brokerage = DbBrokerage:
private static Brokerage DbBrokerage = new()
{
    Id = _ValidBrokerageIdInDb,
    Name = _ValidBrokerageNameInDb
};
Once I changed it to create a new instance of Brokerage instead of reusing the static instance, it started working.
From
mCntx.Brokers.Add(new()
{
    Id = _ValidBrokerIdInDb,
    Name = _ValidBrokerNameInDb,
    BrokerageId = _ValidBrokerageIdInDb,
    Brokerage = DbBrokerage
});
To
mCntx.Brokers.Add(new()
{
    Id = _ValidBrokerIdInDb,
    Name = _ValidBrokerNameInDb,
    BrokerageId = _ValidBrokerageIdInDb,
    Brokerage = new()
    {
        Id = _ValidBrokerageIdInDb,
        Name = _ValidBrokerageNameInDb
    }
});
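A plausible explanation (my reading, not stated in the thread): EF Core's navigation fix-up mutates the shared static DbBrokerage instance. If Brokerage exposes a Brokers collection, that collection still references the first test's Broker after the first context saves; adding DbBrokerage in the next context then drags that stale Broker into the graph alongside the new one with the same Id, triggering the tracking conflict. You can keep the shared definition without sharing the instance via a factory method; a minimal sketch:

// Hypothetical factory: every call returns a fresh, untracked Brokerage,
// so no single instance is ever shared between DbContext change trackers.
private static Brokerage CreateDbBrokerage() => new()
{
    Id = _ValidBrokerageIdInDb,
    Name = _ValidBrokerageNameInDb
};

// In GetContext: Brokerage = CreateDbBrokerage()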

DynamoDB query doesn't return all attributes

I have a table in DynamoDB.
Table name: test-vod
Primary partition key: guid (String)
Primary sort key: - (none)
There are additional non-key attributes on each item.
The goal is to query the table using srcVideo, a column that is not part of the primary key. To accomplish that we created a secondary (local) index.
And using the low-level API from the DynamoDB SDK NuGet package, we query with the code below (open to other options instead of the low-level API).
var queryRequest = new QueryRequest
{
    TableName = $"{_environmentName}-vod",
    IndexName = "srcVideo-index",
    ScanIndexForward = true,
    KeyConditionExpression = "srcVideo = :v_srcVideo",
    ExpressionAttributeValues = new Dictionary<string, AttributeValue>()
    {
        {":v_srcVideo", new AttributeValue {S = inputMediaKey}}
    }
};
var response = await _client.QueryAsync(queryRequest, cancellationToken);
// Does not exist
var hlsUrl = response.Items
    .SelectMany(p => p)
    .SingleOrDefault(p => p.Key.Equals("hlsUrl"));
I want to retrieve 3 attributes (fields) from the response: hlsUrl, dashUrl, and workflowsStatus. All 3 are missing; the response contains a Dictionary with 27 keys, only 27 of the 35 available columns.
I have tried using ProjectionExpression and other query combinations with no success.
You don't show the CREATE TABLE you've used, but it sounds like your index wasn't created with the Projection you really want. The default is KEYS_ONLY; it sounds like you want ALL, or perhaps INCLUDE with just selected attributes (see GlobalSecondaryIndex - Projection). Local secondary indexes have the same Projection setting.
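For illustration, a hedged sketch of creating such an index with full projection using the AWS SDK for .NET low-level API. It assumes the index is a global secondary index (an LSI can only be defined at table creation) and that srcVideo is a string attribute; names mirror the question:

// Assumption: adding a GSI named "srcVideo-index" with ProjectionType.ALL,
// so every attribute (hlsUrl, dashUrl, ...) is copied into the index.
// ProvisionedThroughput is omitted, which is valid for on-demand tables.
var updateRequest = new UpdateTableRequest
{
    TableName = "test-vod",
    AttributeDefinitions = new List<AttributeDefinition>
    {
        new AttributeDefinition("srcVideo", ScalarAttributeType.S)
    },
    GlobalSecondaryIndexUpdates = new List<GlobalSecondaryIndexUpdate>
    {
        new GlobalSecondaryIndexUpdate
        {
            Create = new CreateGlobalSecondaryIndexAction
            {
                IndexName = "srcVideo-index",
                KeySchema = new List<KeySchemaElement>
                {
                    new KeySchemaElement("srcVideo", KeyType.HASH)
                },
                Projection = new Projection { ProjectionType = ProjectionType.ALL }
            }
        }
    }
};
await _client.UpdateTableAsync(updateRequest);

Note that an existing GSI's projection cannot be changed in place; you would delete the index and recreate it.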
Interestingly, I made it work with the code below: even though the key/value pair was not visible in the dictionary when inspecting it in the debugger, it can still be retrieved.
var queryRequest = new QueryRequest
{
    TableName = tableName,
    IndexName = "srcVideo-index",
    ScanIndexForward = true,
    KeyConditionExpression = "srcVideo = :v_srcVideo",
    ExpressionAttributeValues = new Dictionary<string, AttributeValue>()
    {
        {":v_srcVideo", new AttributeValue {S = inputMediaKey}}
    }
};
var response = await _client.QueryAsync(queryRequest, cancellationToken);
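// Note: AnyAndNotNull() looks like a custom extension method from the author's
// codebase, not part of the AWS SDK.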
if (response.Items.AnyAndNotNull())
{
    var dictionary = response.Items.First().ToDictionary(p => p.Key, x => x.Value.S);
    return Result.Ok(new VodDataInfo(
        dictionary["srcBucket"],
        dictionary["srcVideo"],
        dictionary["destBucket"],
        dictionary.ContainsKey("dashUrl")
            ? dictionary["dashUrl"]
            : default,
        dictionary.ContainsKey("hlsUrl")
            ? dictionary["hlsUrl"]
            : default,
        dictionary["workflowStatus"]));
}

Elasticsearch infer _id on dynamic type when using IndexMany

I am trying to get my head around a problem. I am building an application where we index assets in Elasticsearch. The nature of the assets is very dynamic, because they contain client metadata, which differs from client to client.
Because of this, the index is built from a List of dynamics in C#. This actually works like a charm. The problem is that I cannot control the _id property in Elasticsearch when using the C# client. This means that when I update the documents, instead of updating the correct one, a new duplicate is created.
My code looks like this:
List<dynamic> assets = new List<dynamic>();
var settings1 = new ConnectionSettings(
    new Uri("http://localhost:9200")
).DefaultIndex("assets");
var client = new ElasticClient(settings1);
// assets is built here
var indexResponse = client.Indices.Create("assets");
var BulkResponse = client.IndexMany(assets);
This actually works and the index is built as I expect it to, almost. Even though I have a property called Id on the dynamic, it is not inferred correctly, which means the document is given an _id decided by Elasticsearch. Thus the next time I run this code using the same Id, a new document is created rather than updated.
I have been searching high and low, but cannot seem to find a good solution. One thing I have tried is the following:
var bulkResponse = client.Bulk(bd => bd.IndexMany(assets, (descriptor, s) => descriptor.Id(s.Id)));
But this throws an error I cannot catch in the .NET runtime. This actually works on lower versions of Elasticsearch, but seems to have broken with 7.2 of Elasticsearch and 7.0.1 of the C# client.
Any help is much appreciated.
To allow the following to work
var bulkResponse = client.Bulk(bd => bd.IndexMany(assets, (descriptor, s) => descriptor.Id(s.Id)));
You just need to cast the Id to its actual type. For example, if it's a string:
var client = new ElasticClient();
var assets = new dynamic[]
{
    new { Id = "1", Name = "foo" },
    new { Id = "2", Name = "bar" },
    new { Id = "3", Name = "baz" },
};
var bulkResponse = client.Bulk(bd => bd.IndexMany(assets, (descriptor, s) => descriptor.Id((string)s.Id)));
This is a runtime limitation: with a dynamic argument, the call is bound at runtime and needs the concrete type to resolve correctly.
Instead of using the dynamic type, you could create a dictionary-based custom type like:
public class DynamicDocument : Dictionary<string, object>
{
    public string Id => this["id"]?.ToString();
}
and use it as follows:
class Program
{
    public class DynamicDocument : Dictionary<string, object>
    {
        public string Id => this["id"]?.ToString();
    }

    static async Task Main(string[] args)
    {
        var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
        var connectionSettings = new ConnectionSettings(pool);
        connectionSettings.DefaultIndex("documents");
        var client = new ElasticClient(connectionSettings);

        await client.Indices.DeleteAsync("documents");
        await client.Indices.CreateAsync("documents");

        var response = await client.IndexAsync(
            new DynamicDocument
            {
                {"id", "1"},
                {"field1", "value"},
                {"field2", 1}
            }, descriptor => descriptor);

        // will update the document with id 1, as it already exists
        await client.IndexManyAsync(new[]
        {
            new DynamicDocument
            {
                {"id", "1"},
                {"field1", "value2"},
                {"field2", 2}
            }
        });

        await client.Indices.RefreshAsync();

        var found = await client.GetAsync<DynamicDocument>("1");

        Console.WriteLine($"Id: {found.Source.Id}");
        Console.WriteLine($"field1: {found.Source["field1"]}");
        Console.WriteLine($"field2: {found.Source["field2"]}");
    }
}
Output:
Id: 1
field1: value2
field2: 2
Tested with elasticsearch 7.2.0 and NEST 7.0.1.
Hope that helps.

How to avoid posting duplicates into elasticsearch using Nest .NET 6.x?

When data from a device goes into Elasticsearch, there are duplicates. I'd like to avoid these duplicates. I'm using an IElasticClient object, .NET, and NEST to put data.
I searched for a method like ElasticClient.SetDocumentId(), but can't find one.
_doc doc = (_doc)obj;
HashObject hashObject = new HashObject { DataRecordId = doc.DataRecordId, TimeStamp = doc.Timestamp };
// hashId should be the document ID.
int hashId = hashObject.GetHashCode();
ElasticClient.IndexDocumentAsync(doc);
I would like to update the existing document in Elasticsearch instead of adding another identical object.
Assuming the following setup:
var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
var settings = new ConnectionSettings(pool)
    .DefaultIndex("example")
    .DefaultTypeName("_doc");
var client = new ElasticClient(settings);

public class HashObject
{
    public int DataRecordId { get; set; }
    public DateTime TimeStamp { get; set; }
}
If you want to set the Id for a document explicitly on the request, you can do so with
Fluent syntax
var indexResponse = client.Index(new HashObject(), i => i.Id("your_id"));
Object initializer syntax
var indexRequest = new IndexRequest<HashObject>(new HashObject(), id: "your_id");
var indexResponse = client.Index(indexRequest);
both result in a request
PUT http://localhost:9200/example/_doc/your_id
{
"dataRecordId": 0,
"timeStamp": "0001-01-01T00:00:00"
}
As Rob pointed out in the question comments, NEST has a convention whereby it can infer the Id from the document itself, by looking for a property on the CLR POCO named Id. If it finds one, it will use that as the Id for the document. This does mean that an Id value ends up being stored in _source (and indexed, but you can disable this in the mappings), but it is useful because the Id value is automatically associated with the document and used when needed.
If HashObject is updated to have an Id property, we can now just do
Fluent syntax
var indexResponse = client.IndexDocument(new HashObject { Id = 1 });
Object initializer syntax
var indexRequest = new IndexRequest<HashObject>(new HashObject { Id = 1});
var indexResponse = client.Index(indexRequest);
which will send the request
PUT http://localhost:9200/example/_doc/1
{
"id": 1,
"dataRecordId": 0,
"timeStamp": "0001-01-01T00:00:00"
}
If your documents do not have an id field in the _source, you'll need to read the _id value from each hit's metadata yourself. For example:
var searchResponse = client.Search<HashObject>(s => s
    .MatchAll()
);

foreach (var hit in searchResponse.Hits)
{
    var id = hit.Id;
    var document = hit.Source;
    // do something with them
}
Thank you very much Russ for this detailed and easy to understand description! :-)
The HashObject was just meant as a helper to get a unique ID from my real _doc object. I have now added an Id property to my _doc class, and the rest I will show in my code below. I no longer get duplicates in Elasticsearch.
public void Create(object obj)
{
    _doc doc = (_doc)obj;
    string idAsString = doc.DataRecordId.ToString() + doc.Timestamp.ToString();
    int hashId = idAsString.GetHashCode();
    doc.Id = hashId;
    ElasticClient.IndexDocumentAsync(doc);
}
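Two caveats worth noting here (my observations, not from the original thread): string.GetHashCode() is randomized per process on .NET Core and later, so the same record can produce a different ID after an application restart, and the unawaited IndexDocumentAsync call is fire-and-forget, so failures go unobserved. A deterministic hash keeps updates idempotent across restarts; a minimal sketch, assuming .NET 5 or later for SHA256.HashData and Convert.ToHexString:

using System.Security.Cryptography;
using System.Text;

// Hypothetical helper: derives a stable document ID from the same fields.
static string GetStableId(string dataRecordId, DateTime timestamp)
{
    // ISO-8601 round-trip format keeps the timestamp unambiguous.
    string composite = $"{dataRecordId}_{timestamp:O}";
    byte[] hash = SHA256.HashData(Encoding.UTF8.GetBytes(composite));
    return Convert.ToHexString(hash);
}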

Insert collection into List from MongoDB

I'm trying to get all data from a collection on a MongoDB server using the C# driver.
The idea is to connect to the server, fetch the whole collection, and insert it into a list of a class.
List<WatchTblCls> wts;
List<UserCls> users;
List<SymboleCls> syms;

public WatchTbl()
{
    InitializeComponent();
    wts = new List<WatchTblCls>();
    users = new List<UserCls>();
    syms = new List<SymboleCls>();
}

public async void getAllData()
{
    client = new MongoClient("mongodb://servername:27017");
    database = client.GetDatabase("WatchTblDB");
    collectionWatchtbl = database.GetCollection<WatchTbl>("Watchtbl");
    collectionUser = database.GetCollection<UserCls>("Users");
    collectionSymbole = database.GetCollection<SymboleCls>("Users");
    var filter = new BsonDocument();
    using (var cursor = await collectionWatchtbl.FindAsync(filter))
    {
        while (await cursor.MoveNextAsync())
        {
            var batch = cursor.Current;
            foreach (var document in batch)
            {
                wts.Add(new WatchTblCls(document["_id"], document["userId"], document["wid"], document["name"], document["Symboles"]));
            }
        }
    }
}
I get this error on this line:
wts.Add(new WatchTblCls(document["_id"], document["userId"], document["wid"], document["name"], document["Symboles"]));
Cannot apply indexing with [] to an expression of type 'WatchTbl'
I don't understand the reason behind using WatchTbl and WatchTblCls together. Is WatchTblCls a model for the entity WatchTbl here? I'm not sure.
In any case, if you go the aggregation route and want to convert the WatchTbl collection to a WatchTblCls list, your solution might look like the following. I don't know the definitions of the classes, so I'm assuming:
var client = new MongoClient("mongodb://servername:27017");
var database = client.GetDatabase("WatchTblDB");
var collectionWatchtbl = database.GetCollection<WatchTbl>("Watchtbl");
var collectionUser = database.GetCollection<UserCls>("Users");
var collectionSymbole = database.GetCollection<SymboleCls>("Users");
var list = collectionWatchtbl.AsQueryable().Select(x => new WatchTblCls() {
    id = x.id,
    userId = x.userId,
    .....
});
If you can use the same WatchTbl class and still want to load the full collection to a local List (which is definitely not a good idea):
List<WatchTbl> list = await collectionWatchtbl.Find(x => true).ToListAsync();
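As for the original error itself (my explanation, not part of the accepted answer): the collection is typed as IMongoCollection<WatchTbl>, so each document the cursor yields is a WatchTbl object, not a BsonDocument, and the ["field"] indexer simply doesn't exist on it. If you want indexer-style access, ask the driver for raw BsonDocuments instead; a minimal sketch, with field names taken from the question:

// Hypothetical: read the same collection as raw BsonDocuments so that
// document["field"] works; values still need converting to CLR types.
var rawCollection = database.GetCollection<BsonDocument>("Watchtbl");
var docs = await rawCollection.Find(new BsonDocument()).ToListAsync();
foreach (var document in docs)
{
    var userId = document["userId"].AsString; // throws if the field is missing
    // map into WatchTblCls here
}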
