I'm trying to consume a MongoDB collection, but I suspect there is a better way to do this than my implementation.
This is my current approach:
IMongoClient client = new MongoClient("mongodb://localhost");
IMongoDatabase database = client.GetDatabase("example");
while (true)
{
await Task.Delay(1000);
var collection = database.GetCollection<BsonDocument>("SyncQueue");
var documents = collection.Find(new BsonDocument()).ToList();
foreach (var doc in documents)
{
ProcessDocAndRemoveFromCollection(doc);
}
}
Would it be possible to do something like the pseudocode below?
PSEUDOCODE:
IMongoClient client = new MongoClient("mongodb://localhost");
IMongoDatabase database = client.GetDatabase("example");
var subject = new Subject<BsonDocument>();
subject.Subscribe(doc => ProcessDocAndRemoveFromCollection(doc));
>>>>>>>>>>>>>>>>>>>>>
when new doc in MongoDB Collection ----> subject.OnNext(doc);
<<<<<<<<<<<<<<<<<<<<<
The key point is that I want to avoid an infinite polling loop just to consume the collection.
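One way to get that push model without the polling loop, assuming a newer driver (2.5 or later) and a MongoDB 3.6+ replica set so change streams are available, is to watch the collection and feed every insert into the subject. A rough sketch:

using System;
using System.Reactive.Subjects;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

async Task WatchSyncQueue()
{
    IMongoClient client = new MongoClient("mongodb://localhost");
    IMongoDatabase database = client.GetDatabase("example");
    var collection = database.GetCollection<BsonDocument>("SyncQueue");

    var subject = new Subject<BsonDocument>();
    subject.Subscribe(doc => ProcessDocAndRemoveFromCollection(doc));

    // The change stream blocks until something happens in the collection,
    // so there is no Task.Delay polling.
    using (var cursor = await collection.WatchAsync())
    {
        await cursor.ForEachAsync(change =>
        {
            if (change.OperationType == ChangeStreamOperationType.Insert)
                subject.OnNext(change.FullDocument);
        });
    }
}

On older servers, the usual alternative is a tailable cursor over a capped collection, which gives a similar blocking "wait for the next document" behavior.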
I can't see the names of the tables that have already been created. I'm working on a project in which I have access to a DynamoDB database through an IAM client. I create the AmazonDynamoDBClient using the credentials and configuration that were made available to me, but I can't see the tables that already exist in the database.
I have already created the client and connected it to the database. I am trying to get the number of tables as follows, but the result is always 0:
My code:
List<string> currentTables = client.ListTablesAsync().Result.TableNames;
MessageBox.Show(currentTables.Count.ToString());
Try awaiting the API call (and drop the blocking .Result):
List<string> currentTables = (await client.ListTablesAsync()).TableNames;
MessageBox.Show(currentTables.Count.ToString());
Try this sync code instead (with the newer SDK the table names are on the response itself, so there is no separate ListTablesResult to read):
AmazonDynamoDBClient client = new AmazonDynamoDBClient();

// Initial value for the first page of table names.
string lastEvaluatedTableName = null;
do
{
    // Create a request object to specify optional parameters.
    var request = new ListTablesRequest
    {
        Limit = 10, // Page size.
        ExclusiveStartTableName = lastEvaluatedTableName
    };

    var response = client.ListTables(request);
    foreach (string name in response.TableNames)
        Console.WriteLine(name);

    lastEvaluatedTableName = response.LastEvaluatedTableName;
} while (lastEvaluatedTableName != null);
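If you're on an async-only build of the SDK (for example .NET Core), the same pagination can be done with ListTablesAsync. A rough sketch, assuming AWS SDK for .NET v3:

using System;
using System.Threading.Tasks;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

static async Task ListAllTablesAsync(IAmazonDynamoDB client)
{
    string lastEvaluatedTableName = null;
    do
    {
        var request = new ListTablesRequest
        {
            Limit = 10, // Page size.
            ExclusiveStartTableName = lastEvaluatedTableName
        };

        // Await the call instead of blocking on .Result.
        var response = await client.ListTablesAsync(request);
        foreach (string name in response.TableNames)
            Console.WriteLine(name);

        lastEvaluatedTableName = response.LastEvaluatedTableName;
    } while (lastEvaluatedTableName != null);
}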
I'm migrating an SQL database to CouchDB. I'm having a problem when I post multiple documents, say around 8K document ids. Code below:
MyClass cl = new MyClass();
foreach (DataRow row in dteqEvent.Rows)
{
NewSnDocument pn = new NewSnDocument();
pn.id = row[1].ToString(); //this is the document id
pn.val = row[2].ToString();
string json = JsonConvert.SerializeObject(pn);
cl.PostToCouch(json); //method under MyClass to post documents
}
Then under MyClass I have the method below:
public async void PostToCouch(string json)
{
using (var client = new MyCouchClient(HostServer, Database))
{
var resp = await client.Documents.PostAsync(json);
Console.WriteLine(resp.StatusCode);
}
}
The first 2K ids are POSTed successfully, then it gives me an error. The error says: "Unable to connect to the remote server." The InnerException states: "No connection could be made because the target machine actively refused it." Is this something to do with my CouchDB configuration?
Is there an alternative way of POSTing multiple documents? I saw a bulk operation in MyCouch, but it is not clear to me: https://github.com/danielwertheim/mycouch/wiki/documentation#bulk-operations
Thanks in advance!
UPDATE:
Alright I managed to solve my problem by tweaking the code a little bit:
MyClass cl = new MyClass();
List<NewSnDocument> pnList = new List<NewSnDocument>();
foreach (DataRow row in dteqEvent.Rows)
{
NewSnDocument pn = new NewSnDocument();
pn.id = row[1].ToString(); //this is the document id
pn.val = row[2].ToString();
pnList.Add(pn);
}
cl.PostToCouch(pnList);
Then the method under MyClass:
public async void PostToCouch(List<NewSnDocument> obj)
{
    int r = obj.Count;
    using (var client = new MyCouchClient(HostServer, Database))
    {
        for (int i = 0; i < r; i++)
        {
            string json = JsonConvert.SerializeObject(obj[i]);
            var resp = await client.Documents.PostAsync(json);
            Console.WriteLine(resp.StatusCode);
        }
    }
}
I think even your updated code doesn't look right. I'm not sure; please take a look at the comments/modifications I made to your code:
MyClass cl = new MyClass();
List<NewSnDocument> pnList = new List<NewSnDocument>(); // List of documents
foreach (DataRow row in dteqEvent.Rows)
{
    NewSnDocument pn = new NewSnDocument();
    pn.id = row[1].ToString();
    pn.val = row[2].ToString();

    // Don't post inside the loop; just add each document to the list.
    pnList.Add(pn);
}
cl.PostToCouch(pnList); // "pnList" now holds all the documents, so it is posted
                        // to CouchDB once, outside the foreach loop, after every
                        // document has been added to it.
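On the bulk-operations question: MyCouch also lets you send many documents in one request instead of one PostAsync per document. A minimal sketch based on the wiki page you linked (verify BulkRequest/BulkAsync against the MyCouch version you're using):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using MyCouch;
using MyCouch.Requests;
using Newtonsoft.Json;

public async Task PostToCouchBulk(List<NewSnDocument> docs)
{
    using (var client = new MyCouchClient(HostServer, Database))
    {
        // Serialize every document and send them all in a single HTTP request.
        var jsonDocs = docs.Select(d => JsonConvert.SerializeObject(d)).ToArray();
        var request = new BulkRequest().Include(jsonDocs);

        var response = await client.Documents.BulkAsync(request);
        Console.WriteLine(response.StatusCode);
    }
}

For a large migration you may still want to split the 8K documents into batches of a few hundred per BulkRequest rather than sending them all at once.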
I am currently running into an issue where I am unable to replace a Mongo document that's supposed to be updated with the following function:
public async void updateSelectedDocument(BsonDocument document)
{
var collection = mongoClient.GetDatabase("db").GetCollection<BsonDocument>("collection");
var id = document.GetElement("_id").ToString().Remove(0, 4);
var filter = Builders<BsonDocument>.Filter.Eq("_id", id);
var result = await collection.ReplaceOneAsync(filter, document, new UpdateOptions { IsUpsert = true });
}
There is no error to go off of, and none of the variables are null. My connection to the Mongo instance and collection works because I can perform a find and an insert. Please let me know if you need additional information.
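In case it helps while debugging: a minimal sketch of the same replace, but filtering on the _id value taken straight from the document (so an ObjectId _id keeps its type instead of being compared as a string) and returning Task so the result can be awaited and inspected. The database/collection names are placeholders:

using System;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

public async Task UpdateSelectedDocumentAsync(IMongoClient mongoClient, BsonDocument document)
{
    var collection = mongoClient.GetDatabase("db").GetCollection<BsonDocument>("collection");

    // Filter on the BsonValue itself, not its string representation,
    // so the filter type matches what is stored in the collection.
    var filter = Builders<BsonDocument>.Filter.Eq("_id", document["_id"]);

    var result = await collection.ReplaceOneAsync(filter, document,
        new UpdateOptions { IsUpsert = true });

    Console.WriteLine(result.MatchedCount); // 0 means the filter matched nothing.
}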
In Microsoft Azure DocumentDb, how can I retrieve a list of all databases? Not documents in a particular database, but all the databases for a particular account. Preferably using the standard DocumentClient class.
You could do something like the following:
using (var documentClient = new DocumentClient(new Uri("<endpoint>"), "<accountkey>"))
{
var listDatabasesOperationResult = await documentClient.ReadDatabaseFeedAsync();
foreach (var item in listDatabasesOperationResult)
{
Console.WriteLine(item.Id);
}
}
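An alternative with the same DocumentClient is the query form, which you can simply enumerate (a sketch; the endpoint and key are placeholders as above):

using (var documentClient = new DocumentClient(new Uri("<endpoint>"), "<accountkey>"))
{
    // CreateDatabaseQuery enumerates the account's databases as Database resources.
    foreach (var database in documentClient.CreateDatabaseQuery())
    {
        Console.WriteLine(database.Id);
    }
}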
What's the new way to build indexes with the new driver 2.0?
There's no documentation whatsoever about this.
Apparently this now works with the new IndexKeysDefinitionBuilder<> interface, but that's all I've got so far.
You need to call and await CreateOneAsync with an IndexKeysDefinition you get by using Builders.IndexKeys:
static async Task CreateIndex()
{
var client = new MongoClient();
var database = client.GetDatabase("db");
var collection = database.GetCollection<Hamster>("collection");
await collection.Indexes.CreateOneAsync(Builders<Hamster>.IndexKeys.Ascending(_ => _.Name));
}
If you don't have a Hamster class, you can also create the index in a non-strongly-typed way by specifying the index's JSON representation:
await collection.Indexes.CreateOneAsync("{ Name: 1 }");
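If you need more than one key, the IndexKeys builder calls can be chained and options passed alongside. A sketch assuming a unique compound index and a hypothetical Age property on Hamster:

var keys = Builders<Hamster>.IndexKeys
    .Ascending(h => h.Name)
    .Descending(h => h.Age); // Age is a hypothetical second field on Hamster.

var options = new CreateIndexOptions { Name = "name_age", Unique = true };

await collection.Indexes.CreateOneAsync(keys, options);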