How does one clear a RavenDB database of all data while keeping its structure? I have little experience with RavenDB and NoSQL databases, so I must ask for assistance. Do I have to create a .NET interface for managing the database, or can this operation be performed from the web interface?
Raven Studio http://localhost:8080/raven/studio.html
If I have understood the structure correctly, there are documents that need to be removed? Can they be removed without damaging the database structure and/or involving .NET integration?
Thank you.
A RavenDB database doesn't have a "database structure". All documents in RavenDB are stored as JSON with a metadata element that describes the name of the corresponding CLR type in .NET.
You can just delete all document collections, or you could even recreate the database. The latter would require you to recreate all indexes. All of this can be done from the web interface.
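If you'd rather script it from .NET, here is a minimal sketch, assuming the classic RavenDB client API of that era (method signatures vary between client versions); it deletes every document through the built-in Raven/DocumentsByEntityName index while leaving the index definitions intact:

```csharp
// A sketch, assuming the classic RavenDB .NET client (the 2.x-era API that
// matches the Studio URL above); "MyDatabase" is a placeholder name.
using Raven.Abstractions.Data;
using Raven.Client.Document;

using (var store = new DocumentStore
{
    Url = "http://localhost:8080",
    DefaultDatabase = "MyDatabase"
}.Initialize())
{
    // Raven/DocumentsByEntityName covers every document in every collection,
    // so an empty query deletes all data but keeps the index definitions.
    store.DatabaseCommands.DeleteByIndex(
        "Raven/DocumentsByEntityName",
        new IndexQuery(),
        allowStale: false);
}
```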
Related
I'm doing some research for a new project and I'm trying to determine if it's possible (and advisable) to load a SQLite database into memory, perform CRUD operations against it, and persist it back out.
I've seen many examples of utilizing SQLite databases (in memory) for unit testing, and in all of those examples, the data is just trashed in the end - this is NOT what I wish to do.
I'm likely going to use Microsoft SQL Server to manage the overall site data and act as a storage engine (users and credentials and their associated SQLite databases, etc.).
When a user selects a SQLite database in the UI, I would like to load it into memory on the server, allow the user to operate against it, and then persist it back to the storage engine (SQL Server) without needing to save a .db file to the filesystem.
I'm comfortable with aspects of EF Core + SQL Server and SQLite (against the filesystem). But what's new to me is the idea of operating against a SQLite database in memory.
So my questions are:
Is this possible with EF Core?
If so, how would I configure my SQLite DbContext class to accomplish this?
Are there any major downsides to this?
Thank you
Is this possible with EF Core?
Yes. Why not? SQLite essentially treats an in-memory database the same as any other. There are particular considerations, but you insert and query data in the same way.
If so, how would I configure my SQLite DbContext class to accomplish this?
Once again, refer to the documentation. You would need to provide a specific connection string. (Not to discourage SO questions, but if you're going to research and test this you really should look at the information available on sqlite.org. It has great, thorough documentation--at least compared to many open-source projects... sometimes a bit scattered, but still accessible.)
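For example, a minimal sketch of wiring up the context, assuming EF Core with the Microsoft.Data.Sqlite provider (MyDbContext stands in for your own context type):

```csharp
// A minimal sketch, assuming EF Core with the Microsoft.Data.Sqlite provider.
// The in-memory database lives only as long as the connection stays open,
// so the connection is created and opened up front and reused by the context.
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;

var connection = new SqliteConnection("Data Source=:memory:");
connection.Open(); // keep open: closing it destroys the in-memory database

// MyDbContext is your own context type with a constructor taking options.
var options = new DbContextOptionsBuilder<MyDbContext>()
    .UseSqlite(connection)
    .Options;

using var context = new MyDbContext(options);
context.Database.EnsureCreated(); // create the schema in the empty in-memory db
```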
Perhaps more complicated than specifying an appropriate connection string is actually loading an existing database file into memory. The default, basic behavior is to only create an empty database in memory. There are multiple ways to load the data, and this question has some useful answers.
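One of those ways is SQLite's online backup API, which Microsoft.Data.Sqlite exposes as BackupDatabase; a sketch (the file name is illustrative):

```csharp
// A sketch of one way to load an existing SQLite file into memory, using the
// SQLite online backup API as exposed by Microsoft.Data.Sqlite.
using Microsoft.Data.Sqlite;

using var fileDb = new SqliteConnection("Data Source=user-data.db");
var memoryDb = new SqliteConnection("Data Source=:memory:");

fileDb.Open();
memoryDb.Open();

// Copies the entire contents of the file database into the in-memory one.
fileDb.BackupDatabase(memoryDb);

// memoryDb can now be handed to EF Core via UseSqlite(memoryDb); persisting
// back out is the same call in the other direction: memoryDb.BackupDatabase(fileDb).
```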
Are there any major downsides to this?
You have apparently already identified some of the downsides, but probably no more than any project which needs to merge/synchronize databases. There is no short answer to that question and it is much too broad for Stack Overflow.
You specifically mention syncing data to an SQL Server without saving the data to a disk file. You will certainly have to perform a series of queries from SQLite, massage the data into a corresponding update statement for SQL Server, then execute that on the server. Perhaps there are third-party tools to do that same thing for file-based databases, but I suspect you'd end up performing the same operation on a disk file anyway.
Hello and thanks for looking.
I have a DAL question for an application I'm working on. The app is going to extract some data from 5-6 tables in a production RDBMS that serves a much more critical role in the org. What the app has to do is use the data in these tables, analyze it, apply some business logic/rules, and then present it.
The restrictions are that, since the storage model is critical in nature to the org, I need to restrict how the app will request the data. Since the tables are relatively small, I created my data access to use DataTables that load the DB tables in their entirety on a fixed interval using a timer.
My questions are really around my current design and the potential use of EF or LINQ to SQL.
Can EF/LINQ to SQL work within the restrictions of the RDBMS? In most tutorials I've seen, the storage exists solely for the application. Can access to the storage be controlled, and/or can EF use DataTables rather than an RDBMS?
Since the tables are going to be loaded in their entirety, is there a best practice for creating classes to consume the data within these tables? I will have to do in-memory joins and querying/logic to get at the actual data I need.
Sorry if I'm being generic. I'm more just looking for thoughts and opinions as opposed to a solution to my problem. Please don't hesitate to share your thoughts. Thanks.
For your first question: yes, Entity Framework can use an existing DB as its source. The term to search for when looking for Entity Framework tutorials on this topic is "Database First".
For your second question, let me first preface it with a warning: many ORMs are not designed for loading entire tables and doing bulk operations on them, especially if you will be modifying the result set and pushing the data back to the server in large quantities. The updates will be row based, not set based, because you did the modifications in C# code, not in a T-SQL query. Most ORMs are built around the expectation that you will be doing CRUD operations at the row level, not ETL operations or set-level CRUD operations (except for Read, which most ORMs will do as a set operation).
If you will not be updating the data, only pulling it out using Entity Framework and building reports and whatnot off of it, you should be fine. If you are bulk inserting into the database, things get more problematic. See this SO question for more information.
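To illustrate the read-only case, here is a rough sketch of the load-then-join pattern, assuming a classic Entity Framework 6 "Database First" model (the entity and context names are made-up placeholders):

```csharp
// A rough sketch of the read-only, load-then-join pattern discussed above.
using System.Data.Entity; // classic Entity Framework 6
using System.Linq;

using (var db = new ProductionModelContext()) // generated Database First context
{
    // Pull the small tables fully into memory on the timer tick; AsNoTracking
    // keeps EF from holding change-tracking state you don't need for reads.
    var orders    = db.Orders.AsNoTracking().ToList();
    var customers = db.Customers.AsNoTracking().ToList();

    // From here on everything is LINQ to Objects: joins, filters, and business
    // rules run in memory and never touch the production database again.
    var report =
        from o in orders
        join c in customers on o.CustomerId equals c.Id
        select new { c.Name, o.Total };
}
```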
My WinForms C#/.NET application requires a table/grid control to display records to the end user. The records will be simple, containing only two fields: a string and a date/time field. I need to persist the data, and I am wondering which control and storage back end would be the most efficient to use. The data is non-critical (i.e., not health or financial records, or anything sensitive requiring extensive safety or any encryption).
One solution I have found so far is the DataGridView control in conjunction with SQL Server Compact Edition. I learned about this solution from this tutorial:
http://www.dotnetperls.com/datagridview-tutorial
It seems though that this may be overkill for my application. In addition, I am worried about the complexities of installing SQL Server CE, especially when it comes to admin vs. user account privilege issues during installation:
http://msdn.microsoft.com/en-us/library/aa983326(v=vs.80).aspx
Is there a table or grid control with built-in file load/save capabilities that uses a simple disk file as the storage method, perhaps a comma-delimited ASCII file? I'd like something that I can still use SQL (via LINQ) to interface with. Also, I am hoping that this can be done transparently. That is, if I want to upgrade to an SQL database engine solution later, the code on my end that interfaces with the data would not change (except perhaps for the database open/create code, of course).
Or am I better off simply biting the bullet and going with SQL Server CE or perhaps SQLite:
Good embedded database solution (like SQLite) for .Net
If you have any caveats or anecdotes regarding installation issues and ease of use, they would be appreciated.
In my projects, we use object data sources. Grids can be bound to collections of objects just as easily as they can to DataTables. You can store/restore the data using a simple serialization engine (XmlSerializer is rather easy to implement). Make a basic object, use List or BindingList as the data set, and serialize/deserialize it in the back end when you need it.
List and BindingList both support LINQ queries.
Adding a database save later is as simple as writing code that saves the object to the database in place of the serialization code, with no change to the front end at all.
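As a concrete sketch of that approach (type, field, and file names are illustrative):

```csharp
// A minimal sketch of the object data source approach described above: a simple
// record type, a BindingList the grid binds to, and XmlSerializer persistence.
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.IO;
using System.Xml.Serialization;

public class LogEntry
{
    public string Message { get; set; }
    public DateTime Timestamp { get; set; }
}

public static class LogEntryStore
{
    static readonly XmlSerializer Serializer = new XmlSerializer(typeof(List<LogEntry>));

    // Save: serialize the bound list to a plain disk file.
    public static void Save(BindingList<LogEntry> entries, string path)
    {
        using (var stream = File.Create(path))
            Serializer.Serialize(stream, new List<LogEntry>(entries));
    }

    // Load: deserialize the file and rebuild a BindingList the grid can bind to,
    // e.g. dataGridView1.DataSource = LogEntryStore.Load("entries.xml");
    public static BindingList<LogEntry> Load(string path)
    {
        using (var stream = File.OpenRead(path))
            return new BindingList<LogEntry>((List<LogEntry>)Serializer.Deserialize(stream));
    }
}
```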
As far as a "Correct" solution is concerned...there are so many different ways to do it that it boils down to personal preference, and possibly actual requirements and expected future development. I find it easier to code using objects because the data manipulation is easier, but if you are going for straight record entry, no data manipulation required, going direct to a database is easier. It just depends on the data and what you plan on doing with it.
I strongly recommend using an embedded database, because it will make moving to a full database easier in the near future. SQL Server CE is a good option, and if you want to go big you can simply move to a full SQL Server database with minimal changes to your code. The only downsides of SQL Server CE are that you need to install it and that it requires .NET Framework 4; aside from that, I don't see a big problem with it.
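For reference, a small sketch of what the SQL Server CE side looks like, assuming its ADO.NET provider and a made-up Records table; moving to full SQL Server later is largely a matter of swapping the connection type and connection string:

```csharp
// A small sketch of the "start embedded, scale up later" point above, using
// SQL Server Compact's ADO.NET provider. The file and table names are made up.
using System;
using System.Data.SqlServerCe;

using (var connection = new SqlCeConnection("Data Source=records.sdf"))
{
    connection.Open();
    using (var command = new SqlCeCommand(
        "SELECT Name, CreatedOn FROM Records", connection))
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0} {1}", reader.GetString(0), reader.GetDateTime(1));
    }
}
```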
Imagine you are writing a large-scale application using NHibernate, and you want to have two separate schemas (using SQL Server, by the way):
Application_System (all the tables relating to the system, config tables, user tables etc)
Application_Data (all the actual data that is stored/retrieved when the user interacts with the system)
Now I've been trying to find a simple clean way to do this in NHibernate and thought I'd found a solution by using the Catalog and Schema properties so for example:
Catalog("Application_System");
Schema("dbo");
Table("SystemSettings")
would generate SQL for Application_System.dbo.SystemSettings. And this kinda works, but if I have two catalogs defined then the create/delete tables functionality of hbm2ddl.auto stops working. Now I've come to the conclusion that I am probably abusing the Catalog and Schema properties for something they weren't intended for. However, I can't seem to find a simple way of achieving the same thing that doesn't involve some convoluted scaffolding.
Any help would be appreciated. I can't believe NHibernate wouldn't support this out of the box; I mean, it's a fairly basic requirement.
SchemaExport does not support creating schemas/catalogs out of the box, but you can add the CREATE SCHEMA/CATALOG DDL yourself using auxiliary database objects in XML, FluentNHibernate, or MappingByCode. Note that the auxiliary object has to be added first.
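For example, the code route might look like this minimal sketch, using NHibernate's SimpleAuxiliaryDatabaseObject (the schema name is illustrative):

```csharp
// A sketch using NHibernate's SimpleAuxiliaryDatabaseObject; the schema name
// is illustrative. SchemaExport runs the create DDL before the mapped tables
// (and the drop DDL after), which is why the object must be added first.
using NHibernate.Cfg;
using NHibernate.Mapping;

var configuration = new Configuration();
configuration.Configure(); // loads hibernate.cfg.xml or equivalent

configuration.AddAuxiliaryDatabaseObject(
    new SimpleAuxiliaryDatabaseObject(
        "CREATE SCHEMA SystemSchema",  // run before the tables are created
        "DROP SCHEMA SystemSchema"));  // run after the tables are dropped
```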
OK, well, I kind of found a halfway house that I'm reasonably satisfied with. The ISession has a Connection property that exposes a ChangeDatabase(string databaseName) method that allows you to change the database the session is pointing to.
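For example (a small sketch, using the catalog name from the question):

```csharp
// A small sketch of the halfway house: repoint the session's underlying
// ADO.NET connection at another catalog before querying.
using NHibernate;

static void UseSystemCatalog(ISession session)
{
    // Subsequent statements issued through this session now run
    // against Application_System instead of the configured database.
    session.Connection.ChangeDatabase("Application_System");
}
```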
My schema export is still knackered, because ultimately it doesn't know which object is for which database, so it will attempt to save it all to the database defined in the configuration.
You win some you lose some.
Apologies for the previous post; sometimes writing the question actually solves it too ;) as in "the answer is in the question".
So I'm trying to interface with an old, primitive database system that is accessed via a DLL entry point. However, some work has been done on object rational mapping, where one can create objects for each table and access the database that way. But for viewing the entire database, it seems impossible to parse so many tables (1,000 or so objects).
However, if I can create some sort of schema mapping to a C# DataSet class, then that would make it accessible.
Hope that gives some insight into what I'm trying to attempt.
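Something like this rough sketch is the kind of mapping I have in mind: reflect over one of the generated table objects and build a DataTable from its rows (the generic helper below is just an illustration):

```csharp
// A rough sketch of the schema mapping I have in mind: reflect over one of the
// generated table objects and turn its rows into a DataTable that can live in
// a DataSet. The generic T stands in for any of the generated row types.
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

static class LegacyDataSetMapper
{
    public static DataTable ToDataTable<T>(string tableName, IEnumerable<T> rows)
    {
        var table = new DataTable(tableName);
        PropertyInfo[] props = typeof(T).GetProperties();

        // One column per public property of the row object.
        foreach (var p in props)
            table.Columns.Add(p.Name,
                Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);

        // One DataRow per row object; nulls become DBNull.
        foreach (var row in rows)
            table.Rows.Add(Array.ConvertAll(props,
                p => p.GetValue(row, null) ?? DBNull.Value));

        return table;
    }
}
```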
I don't know what kind of DBMS you're trying to access, but if your legacy system has some kind of query language, you could write an interface to query the database (an SQL-to-legacy-language class, an SQL driver, whatever).
I'm not sure why you would do an object-relational mapping for this. Why not write an interface that would let you query it in a simple yet effective manner?
Edit: I see you wrote "object rational mapping"; I suppose you mean object-relational mapping.