I'm using the MongoDB driver (C#) and I'm fairly new to it. Is there a way, or a best practice, to delete multiple collections in one db with one command?
I have a lot of collections that are no longer needed, so I'm looking for a way to delete many collections without dropping the whole db (because there are some collections I still need).
I found DeleteMany, but that applies to a collection; in my case I need something like that, but applied to the db.
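As far as I know there is no single drop-many command in the driver; the usual pattern is simply to loop over the unwanted names and drop each collection individually. A minimal sketch, assuming the official MongoDB.Driver package (the database and collection names here are made up):

    using MongoDB.Driver;

    var client = new MongoClient("mongodb://localhost:27017");
    var db = client.GetDatabase("mydb");                        // hypothetical database name

    // Hypothetical list of collections that are no longer needed.
    var obsolete = new[] { "oldLogs", "tempImports", "staging2019" };

    foreach (var name in obsolete)
    {
        // Drops one collection (data and indexes) without touching the rest of the database.
        await db.DropCollectionAsync(name);
    }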
Related
I see that Mongo has bulk insert, but I see nowhere the capability to do bulk inserts across multiple collections.
Since I do not see it anywhere, I'm assuming it's not available in Mongo.
Any specific reason for that?
You are correct in that the bulk API operates on single collections only.
There is no specific reason but the APIs in general are collection-scoped so a "cross-collection bulk insert" would be a design deviation.
You can of course set up multiple bulk API objects in a program, each on a different collection. Keep in mind that while this wouldn't be transactional (in the startTrans-commit-rollback sense), neither is bulk insert.
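To make that workaround concrete, here is a rough C# sketch, assuming the official MongoDB.Driver package and made-up orders/customers collections, that issues one bulk write per collection:

    using MongoDB.Bson;
    using MongoDB.Driver;

    var db = new MongoClient("mongodb://localhost:27017").GetDatabase("shop"); // hypothetical db
    var orders = db.GetCollection<BsonDocument>("orders");                     // hypothetical collection
    var customers = db.GetCollection<BsonDocument>("customers");               // hypothetical collection

    // One bulk operation per collection; each batch succeeds or fails on its own,
    // so this is no more (and no less) transactional than a single bulk insert.
    await orders.BulkWriteAsync(new[]
    {
        new InsertOneModel<BsonDocument>(new BsonDocument("item", "book")),
        new InsertOneModel<BsonDocument>(new BsonDocument("item", "pen"))
    });

    await customers.BulkWriteAsync(new[]
    {
        new InsertOneModel<BsonDocument>(new BsonDocument("name", "Ann")),
        new InsertOneModel<BsonDocument>(new BsonDocument("name", "Bob"))
    });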
So, I have been using AutoMapper with the IQueryable extensions to select some really simple viewmodels for list views. This allows me to not load up an entire Entity Framework object, but I have run into a less-than-ideal situation where I need to pull a simple viewmodel for a single complex object.
userQuery.Where(u => u.Id == id).ProjectTo<SimpleUserViewModel>().FirstOrDefault();
I could do a normal AutoMapper.Map, but this pulls in the whole object and child objects, when I may only need a single property off of the child and I don't want to eat the database retrieval cost.
Is there a better way of approaching this for getting a single entity and emitting a select through entity framework for only grabbing the necessary objects?
It does look inefficient, but it isn't.
Just like many other LINQ methods, most notably the Select which it replaces, ProjectTo<> relies on deferred execution. It won't pull the data until it reaches the point of having to present (or act on) the data.
Common ways to trigger this execution are ToList, First, Single (including the OrDefault and Async variants for all of them). Essentially, any action that requires actual knowledge of the data set itself.
I know the feeling, it feels less elegant to not be able to do something like ProjectToSingle<SimpleUserViewModel>(x => x.Id == id). If it really bothers you, you can write this wrapper method yourself, essentially translating it to a Where/ProjectTo/Single chain.
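For what it's worth, such a wrapper is only a few lines. A rough sketch (the method name is made up, and it assumes the AutoMapper.QueryableExtensions overload that takes an IConfigurationProvider):

    using System;
    using System.Linq;
    using System.Linq.Expressions;
    using AutoMapper;
    using AutoMapper.QueryableExtensions;

    public static class ProjectionExtensions
    {
        // Hypothetical helper: filter, project to the view model, and materialize a single result.
        public static TDest ProjectToFirstOrDefault<TSource, TDest>(
            this IQueryable<TSource> source,
            Expression<Func<TSource, bool>> predicate,
            IConfigurationProvider config)
        {
            return source.Where(predicate)
                         .ProjectTo<TDest>(config)
                         .FirstOrDefault();
        }
    }

    // Usage (assuming 'mapper' is your IMapper instance):
    // var vm = userQuery.ProjectToFirstOrDefault<User, SimpleUserViewModel>(u => u.Id == id, mapper.ConfigurationProvider);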
I feel the same way, but I've gotten used to writing Where/ProjectTo/Single and it doesn't feel wrong anymore. It's still a lot better than having to write the include statements.
Also, as an aside: even if you weren't using AutoMapper but still wanted to cut down on the columns you fetch (because you know you won't need all of them) instead of loading the whole entity, you'd still need to use a Where/Select/Single method chain.
So AutoMapper didn't make the syntax any less elegant than it already was with regular LINQ/EF.
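For comparison, the plain LINQ/EF version of the same query would look something like this (the view model's properties are made up):

    var vm = userQuery
        .Where(u => u.Id == id)
        .Select(u => new SimpleUserViewModel { Id = u.Id, Name = u.Name }) // only these columns are fetched
        .FirstOrDefault();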
When using LINQ to SQL to update something like a cross-reference table, there could already be records in the table that just need to stay there, records that will change, and records that should be removed. What is the best practice for handling this? I am thinking delete all and recreate. Is that bad?
Should I delete the reference records and repopulate all of them, naturally removing what is no longer needed and creating what is needed?
or
Should I attempt to perform some type of check, removing what is old and adding what is new?
or
Is there a better way?
LINQ to SQL is not the best tool for scenarios like this.
Basically you will have to write the updates, inserts, and deletes all by yourself. You can use exists(), Any(), etc. to build the sets, but the resulting SQL will all be individual inserts and updates.
It is a query language, after all.
In your case, I would do a merge through a stored procedure and call that.
A MERGE statement (StackOverFlow Merge Example) might work out best in this situation. It will allow for multiple rows to be manipulated at the same time at your discretion.
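To make the stored-procedure suggestion concrete, here is a rough sketch of calling it from a LINQ to SQL DataContext. dbo.MergeUserRoles is a hypothetical procedure that wraps a MERGE over the cross-reference table (insert the new pairs, keep the matching ones, delete the ones no longer present):

    using System.Data.Linq;

    int userId = 42;                        // hypothetical parent key
    var roleIds = new[] { 1, 2, 5 };        // hypothetical set of child keys to keep

    using (var db = new DataContext("your-connection-string"))
    {
        // The MERGE itself lives in the stored procedure, so the whole
        // reconcile happens server-side in one set-based statement.
        db.ExecuteCommand(
            "EXEC dbo.MergeUserRoles @UserId = {0}, @RoleIds = {1}",
            userId,
            string.Join(",", roleIds));
    }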
I come from an Objective-C programming background and, as such, am used to using Core Data for storing my data. Core Data allows you to define fields (attributes) which you can then refer to as if they were objects in code, for each item in the database.
I wondered if there is a similar way of doing things for C#.
A bit more background
I have a treeview on my winform. The treeview allows people to add and remove new nodes. I've created a subclass of TreeNode so I can store a little more information against each node but I'd like to have all of the database transactions for adding / deleting done as part of this subclass. For instance, when I delete a node from the tree that has subnodes, I can easily remove this from the tree by calling Remove (knowing it will remove all subnodes too), but I also need the database to keep those changes too.
So, as above - is there a way to treat DB records as objects?
Hope that's clear enough!
Entity Framework, LLBLGen, and NHibernate, to name a few ORMs.
I would suggest looking at Beginner's Guide to ADO.NET Entity Framework
EntityFramework (Nuget Package) (Nuget is the preferred way to install packages such as EF)
Also, take a look at Sam Saffron's How I learned to stop worrying and write my own ORM and Small is Beautiful - .NET Micro ORMs
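To give a feel for what the Entity Framework route looks like, here is a minimal code-first sketch (all names are made up, assuming EF 6): each row is just a C# object, and deleting a node can take its subnodes with it, mirroring what the TreeView does.

    using System.Collections.Generic;
    using System.Data.Entity;   // EF 6
    using System.Linq;

    public class NodeRecord
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public int? ParentId { get; set; }
        public virtual NodeRecord Parent { get; set; }
        public virtual ICollection<NodeRecord> Children { get; set; }
    }

    public class TreeContext : DbContext
    {
        public DbSet<NodeRecord> Nodes { get; set; }
    }

    public static class NodeStore
    {
        // Remove a record and all of its descendants, then persist with SaveChanges().
        // (A cascade rule on the database side could do this instead.)
        public static void DeleteWithChildren(TreeContext db, NodeRecord node)
        {
            foreach (var child in node.Children.ToList())
                DeleteWithChildren(db, child);

            db.Nodes.Remove(node);
        }
    }

The TreeNode subclass could then hold the record's Id and call a helper like this, followed by SaveChanges(), whenever a node is removed from the tree.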
I am on a big project that uses WebForms and I wish to use LINQ to SQL, but because it's going to get slow in VS if I add everything to that model (furthermore, I still use standard SQL), is it okay to have multiple contexts?
EDIT:
What I was saying is: I have almost 100 tables in the database and I can't add them all to one context, since I am still using raw SQL a lot. So I thought I could separate the whole thing and create many contexts based on operations. So if I have to add evaluation forms, then that context will interact with all the tables related to evaluation forms.
Thanks.
If you have logical separation in db schema, it will make sense to have multiple contexts.
Also, L2S does not prevent you from using raw SQL; moreover, you can use the context's connection and utilize its methods to execute SQL queries.
You can also map stored procedures with L2S.
The other caveat of having multiple contexts is that you will not easily be able to pass objects between contexts. In your case I would prefer to use EF with POCO (probably even the newly arrived code-first approach); using it, you'll be able to pass objects from one context to another, otherwise you'd need to use some kind of object-to-object mapping.
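To illustrate the POCO point (all names are made up, assuming EF code-first): since the entity is a plain class with no context-specific base type, an instance loaded through one context can be attached to another.

    using System.Data.Entity;

    public class EvaluationForm              // plain POCO, no persistence base class
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    public class EvaluationContext : DbContext
    {
        public DbSet<EvaluationForm> EvaluationForms { get; set; }
    }

    public class ReportingContext : DbContext
    {
        public DbSet<EvaluationForm> EvaluationForms { get; set; }
    }

    public static class Demo
    {
        public static void MoveBetweenContexts()
        {
            EvaluationForm form;
            using (var read = new EvaluationContext())
                form = read.EvaluationForms.Find(1);    // load with the "evaluation" context

            using (var write = new ReportingContext())  // update through a different context
            {
                write.EvaluationForms.Attach(form);
                form.Title = "Updated title";
                write.Entry(form).State = EntityState.Modified;
                write.SaveChanges();
            }
        }
    }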
UPD: this is how you can use LINQ to SQL with raw SQL:
db.ExecuteQuery<Customer>("select * from dbo.Customers where City = {0}", "London");
db.ExecuteCommand("UPDATE Products SET QuantityPerUnit = {0} WHERE ProductID = {1}", "24 boxes", 5);
Read this article: DataContext.ExecuteCommand Method.
I'm not sure what you're asking, but generally, you don't need more than one context.