var keyspace = "mydb";
var datacentersReplicationFactors = new Dictionary<string, int>(0);
var replication = ReplicationStrategies.CreateNetworkTopologyStrategyReplicationProperty(datacentersReplicationFactors);
using (var cluster = Cluster.Builder().AddContactPoints("my_ip").Build())
using (var session = cluster.Connect())
{
    session.CreateKeyspaceIfNotExists(keyspace, replication, true);
    session.ChangeKeyspace(keyspace);

    var entityTable = new Table<Models.Entity>(session);
    var attributeTable = new Table<Models.Attribute>(session);

    entityTable.CreateIfNotExists();    // Worked
    attributeTable.CreateIfNotExists(); // Worked

    entityTable.Delete();    // Does nothing
    attributeTable.Delete(); // Does nothing
}
EDIT: Using a raw query, session.Execute("DROP TABLE entities;"); works fine.
The Delete() method is not intended for dropping tables. It returns a representation of a DELETE CQL statement; calling it just gives you {DELETE FROM entities}.
If you need to drop a table, the easiest way is just to execute a DROP statement:
session.Execute("DROP TABLE entities;");
Unless there is already a method for dropping tables that I'm not aware of, you can use these extensions.
public static class DataStaxTableExtensions
{
    public static void Drop<T>(this Table<T> table)
    {
        table.GetSession().Execute($"DROP TABLE {table.Name};");
    }

    public static void DropIfExists<T>(this Table<T> table)
    {
        table.GetSession().Execute($"DROP TABLE IF EXISTS {table.Name};");
    }
}
Then you can use it like this:
entityTable.Drop();
attributeTable.DropIfExists();
I'm writing a unit test in C# for a function that is responsible for using System.Data.SqlClient.SqlBulkCopy to copy a DataTable to a database server.
I use SQLite for unit tests, and wanted to connect to my SQLite in-memory database with SqlBulkCopy, and then bulk copy the test data into the SQLite db.
However, I can't seem to get the connection string right.
I originally tried
var bcp = new SqlBulkCopy("FullUri=file::memory:?cache=shared")
Then
var bcp = new SqlBulkCopy("Data Source=:memory:;Cache=Shared")
which didn't recognize Cache.
So then I tried
var bcp = new SqlBulkCopy("Data Source=:memory:")
out of desperation, which simply timed out when attempting to connect to the database.
Is what I'm trying to accomplish here possible? If it is, can someone please help me with the connection string?
The answer to this was that you cannot connect SqlBulkCopy to a SQLite instance.
What I did to solve my problem (unit testing a part of the code that used SqlBulkCopy) was to create a wrapper around SqlBulkCopy, implemented with SqlBulkCopy in production code and with a mock bulk copy in test code, effectively decoupling the dependency on SqlBulkCopy itself.
Specifically, I created
public interface IBulkCopy : IDisposable {
    string DestinationTableName { get; set; }
    void CreateColumnMapping(string from, string to);
    Task WriteToServerAsync(DataTable dt);
}
Then, I implemented this as
public class SQLBulkCopy : IBulkCopy {
    private SqlBulkCopy _sbc;

    public string DestinationTableName {
        get { return _sbc.DestinationTableName; }
        set { _sbc.DestinationTableName = value; }
    }

    public SQLBulkCopy(IDBContext ctx) {
        _sbc = new SqlBulkCopy((SqlConnection)ctx.GetConnection());
    }

    public void CreateColumnMapping(string from, string to) {
        _sbc.ColumnMappings.Add(new SqlBulkCopyColumnMapping(from, to));
    }

    public Task WriteToServerAsync(DataTable dt) {
        return _sbc.WriteToServerAsync(dt);
    }

    public void Dispose() {
        // SqlBulkCopy implements IDisposable explicitly; needed to satisfy IBulkCopy.
        ((IDisposable)_sbc).Dispose();
    }
}
And in my test utilities I mocked out "bulk copy" with just inserts:
class MockBulkCopy : IBulkCopy {
    private IDBContext _context;

    public MockBulkCopy(IDBContext context) {
        _context = context;
    }

    public string DestinationTableName { get; set; }

    public void CreateColumnMapping(string fromName, string toName) {
        // We don't need a column mapping for raw SQL INSERT statements.
        return;
    }

    public virtual Task WriteToServerAsync(DataTable dt) {
        return Task.Run(() => {
            using (var cn = _context.GetConnection()) {
                using (var cmd = cn.CreateCommand()) {
                    cmd.CommandText = $"INSERT INTO {DestinationTableName}({GetCsvColumnList(dt)}) VALUES {GetCsvValueList(dt)}";
                    cmd.ExecuteNonQuery();
                }
            }
        });
    }

    public void Dispose() { }
}
where GetCsvColumnList and GetCsvValueList are helper functions I implemented.
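Those two helpers aren't shown in the answer; a minimal sketch of what they might look like (only the names come from the text above, the bodies are my assumption):
// Hypothetical implementations of the helpers referenced above.
private static string GetCsvColumnList(DataTable dt)
{
    // e.g. "FileNameId, FileName"
    return string.Join(", ", dt.Columns.Cast<DataColumn>().Select(c => c.ColumnName));
}

private static string GetCsvValueList(DataTable dt)
{
    // e.g. "('a', 'b'), ('c', 'd')" -- acceptable in a test double; real code should use parameters
    var rows = dt.Rows.Cast<DataRow>()
        .Select(r => "(" + string.Join(", ", r.ItemArray.Select(v => $"'{v}'")) + ")");
    return string.Join(", ", rows);
}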
You cannot use SqlBulkCopy with SQLite; SqlBulkCopy was built for SQL Server.
Normally the trick to dramatically improve insert performance with SQLite is making sure a transaction is used, as sketched below.
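A minimal sketch of that trick, assuming the Microsoft.Data.Sqlite provider and a hypothetical items table (both are my assumptions, not from the answer):
using Microsoft.Data.Sqlite;

// Wrapping many inserts in a single transaction avoids one journal sync per
// INSERT, which is what usually makes naive SQLite inserts slow.
using (var cn = new SqliteConnection("Data Source=test.db"))
{
    cn.Open();
    using (var tx = cn.BeginTransaction())
    using (var cmd = cn.CreateCommand())
    {
        cmd.Transaction = tx;
        cmd.CommandText = "INSERT INTO items (name) VALUES ($name)";
        var p = cmd.CreateParameter();
        p.ParameterName = "$name";
        cmd.Parameters.Add(p);

        foreach (var name in names) // names: an IEnumerable<string> to insert
        {
            p.Value = name;
            cmd.ExecuteNonQuery();
        }

        tx.Commit();
    }
}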
Disclaimer: I'm the owner of .NET Bulk Operations
This library is not free but allows you to easily perform and customize all bulk operations:
- Bulk Insert
- Bulk Delete
- Bulk Update
- Bulk Merge
Example
// Easy to use
var bulk = new BulkOperation(connection);
bulk.BulkInsert(dt);
bulk.BulkUpdate(dt);
bulk.BulkDelete(dt);
bulk.BulkMerge(dt);
// Easy to customize
var bulk = new BulkOperation<Customer>(connection);
bulk.BatchSize = 1000;
bulk.ColumnInputExpression = c => new { c.Name, c.FirstName };
bulk.ColumnOutputExpression = c => c.CustomerID;
bulk.ColumnPrimaryKeyExpression = c => c.Code;
bulk.BulkMerge(customers);
EDIT: Answering a comment:
I want to load a data table from SQLite then "bulk copy" it into other databases
This scenario is possible, but it requires two connections:
DbConnection sourceConnection = // connection from the source
DbConnection destinationConnection = // connection from the destination
// Fill the DataTable using the sourceConnection
dt = ...;
// BulkInsert using the destinationConnection
var bulk = new BulkOperation(destinationConnection);
bulk.BulkInsert(dt);
I'm trying to insert a list of file names into a simple SQL Server table.
I'm trying to leverage SqlBulkCopy and Marc Gravell's FastMember library, as suggested by other SO answers.
public async Task AddFileNamesAsync(string[] fileNames)
{
    fileNames.ShouldNotBeNull();
    using (var bulkCopy = new SqlBulkCopy(ConnectionString))
    {
        using (var reader = ObjectReader.Create(fileNames))
        {
            bulkCopy.DestinationTableName = "FileNames";
            bulkCopy.ColumnMappings.Add("value", "FileName");
            await bulkCopy.WriteToServerAsync(reader)
                .ConfigureAwait(false);
        }
    }
}
CREATE TABLE [dbo].[FileNames](
    [FileNameId] [int] IDENTITY(1,1) NOT NULL,
    [FileName] [varchar](500) NOT NULL
)
So I feel like it's a mapping issue, either with:
- FastMember not being able to map to some internal backing collection, or
- the FastMember backing collection not having the same name as the DB column, so it can't map.
I have never used this library before; however, reviewing the FastMember source on GitHub, it requires a property to read from each source object. A plain string has no value property; about the only property available is Length. This looks like a limitation of the FastMember library, which creates a call-site accessor function to capture the property from the target object.
I have had a play and cannot get access to any property that will work. At first glance there is a Chars property being returned in the TypeAccessor results, but it does not appear to work.
My suggestion is not really an answer to the question but a way to make it work: if you create a type that has a property holding the string, you can effectively get around this.
public async Task AddFileNamesAsync(string[] fileNames)
{
    fileNames.ShouldNotBeNull();
    var list = fileNames.Select(f => new { value = f });
    using (var bulkCopy = new SqlBulkCopy(ConnectionString))
    {
        using (var reader = ObjectReader.Create(list))
        {
            bulkCopy.DestinationTableName = "FileNames";
            bulkCopy.ColumnMappings.Add("value", "FileName");
            try
            {
                await bulkCopy.WriteToServerAsync(reader)
                    .ConfigureAwait(false);
            }
            catch (Exception ex)
            {
                // Swallowed here only for testing; see the note below.
            }
        }
    }
}
This works because we project each file name into a new type with a value property, so the execution proceeds as expected. (Note: the try...catch was just for testing.)
I have developed a WCF API which uses NHibernate. I am new to this. I have used session.Update to take care of the transaction. I have a for loop in which, based on a select condition, I update a record, i.e., if A is present in table1 then I update the table, else I insert a new entry.
I am getting "could not execute query." when trying to execute a select query on a table which was previously updated by adding a new entry.
What I think is: because I am using session.Save(table1) and then trying to select entries from that table, I am getting an error. Since session.Save temporarily locks the table, I am not able to execute a select query on it.
What could be the solution to this?
Update:
This is the for loop I am using to check the database for some field:
using (ITransaction tranx = session.BeginTransaction())
{
    savefunction();
    tranx.Commit();
}
Save function:
public void savefunction()
{
    for (int i = 0; i < dictionary.Count; i++)
    {
        ICandidateAttachmentManager candidateAttach = new ManagerFactory().GetCandidateAttachmentManager();
        CandidateAttachment attach = new CandidateAttachment();
        attach = checkCV();
        if (attach == null)
        {
            // insert new entry into table attach
            session.Save(attach);
        }
    }
}
checkCV function:
public void checkCV()
{
    using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
    {
        IList<CandidateAttachment> lstCandidateAttachment = CandidateAttachmentManager.GetByfkCandidateId(CandidateId);
        if (lstCandidateAttachment.Count > 0)
        {
            CandidateAttachment attach = lstCandidateAttachment.Where(x => x.CandidateAttachementType.Id.Equals(FileType)).FirstOrDefault();
            if (attach != null)
            {
                return null;
            }
            else
            {
                return "some string";
            }
        }
    }
}
What happens here: in the for loop, say for i=2, the attach value comes back null, so I insert a new entry into the attach table. Then for i=3, when it enters the checkCV function, I get an error at this line:
IList<CandidateAttachment> lstCandidateAttachment = CandidateAttachmentManager.GetByfkCandidateId(CandidateId);
I think it is because I am using session.Save and then trying to read the table contents; I am unable to execute the query and the table is locked until I commit my session. Between BeginTransaction and Commit, the table associated with the object is locked. How can I get around this? Any ideas?
Update:
I read up on some of the posts. It looks like I need to set the isolation level for the transaction, but even after adding it, it doesn't seem to work. Here is how I tried to implement it:
using (ITransaction tranx = session.BeginTransaction(IsolationLevel.ReadUncommitted))
{
    saveDocument();
}
Something I don't understand in your code is where you get your NHibernate session.
Indeed you use
new ManagerFactory().GetCandidateAttachmentManager();
and
using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
so your ManagerFactory class provides you with the ISession?
then you do:
CandidateAttachment attach = new CandidateAttachment();
attach = checkCV();
but
checkCV() returns either a null or a string?
Finally you should never do
Save()
but instead
SaveOrUpdate()
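For reference, a minimal sketch of that pattern, reusing the session and attach names from the question's code:
// SaveOrUpdate lets NHibernate decide between INSERT and UPDATE based on the
// entity's identifier, so the same call works for new and existing records.
using (ITransaction tranx = session.BeginTransaction())
{
    session.SaveOrUpdate(attach);
    tranx.Commit();
}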
Hope that helps you resolve your issue.
Feel free to give more details
I have two tables in SQL that both have a field called Selection_ID. I want to delete all rows with Selection_ID = inpSelectionID using Linq-to-SQL.
My tables: Lloyds_Selection and Lloyds_Selection_Vessels.
My C# function:
void buttonDeleteSelectionList_Click(object sender, RoutedEventArgs e, Canvas canvasAdvancedSearchFieldResults, int inpSelectionID)
{
    PositionServiceReference.PositionServiceClient service = new PositionServiceReference.PositionServiceClient();
    service.DeleteSelectionCompleted += new EventHandler<System.ComponentModel.AsyncCompletedEventArgs>(service_DeleteSelectionCompleted);
    service.DeleteSelectionAsync(inpSelectionID);
}
My service code (PositionService.svc.cs):
[OperationContract]
void DeleteSelection(int inpSelectionID)
{
    PositionDataClassesDataContext context = new PositionDataClassesDataContext();

    context.Lloyds_Selection.DeleteOnSubmit(inpSelectionID);
    context.SubmitChanges();

    context.Lloyds_Selection_Vessels.DeleteOnSubmit(inpSelectionID);
    context.SubmitChanges();
}
DeleteOnSubmit requires an entity object as its parameter, so you can't delete an item without selecting it from the database first. There is a DeleteAllOnSubmit method too, but it requires an IEnumerable of entities as well. You can use it like this:
context.Lloyds_Selection.DeleteAllOnSubmit(context.Lloyds_Selection.Where(l => l.Selection_ID == inpSelectionID));
However, you can use DataContext.ExecuteCommand to execute raw SQL against your database:
string command = string.Format("DELETE FROM Lloyds_Selection WHERE Selection_ID = {0}", inpSelectionID);
context.ExecuteCommand(command);
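Note that ExecuteCommand can also parameterize the statement itself: curly-brace placeholders in the command text are turned into SQL parameters, so you don't need to format the SQL by hand:
// {0} becomes a SQL parameter here, not a string substitution.
context.ExecuteCommand("DELETE FROM Lloyds_Selection WHERE Selection_ID = {0}", inpSelectionID);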
I'm new to Entity Framework (I worked mostly with NHibernate and ActiveRecord before) and I'm stuck with something that I think should be easy...
I have a User entity, and created a partial User class so I can add some methods (like with NHibernate). I added GetByID to make fetching a user easier:
public static User GetByID(int userID)
{
    using (var context = new MyEntities())
    {
        return context.Users.Where(qq => qq.UserID == userID).Single();
    }
}
Now in the same class I want to log the moment of logging in, so I try:
public static void LogLoginInfo(int userID)
{
    using (var context = new MyEntities())
    {
        var user = User.GetByID(userID);
        var log = new LoginLog { Date = DateTime.Now };
        user.LoginLogs.Add(log);
        context.SaveChanges();
    }
}
The problem is I can't access user.LoginLogs because the user's context is already disposed... Most likely I'm missing something obvious here, but always writing full queries like:
context.Users.Where(qq => qq.UserID == userID).Single().LoginLogs.Add(log);
doesn't seem like a good option...
I've read about the Repository pattern, but it seems like too big a gun for such a task. Please explain what I am doing wrong. Thanks in advance!
EDIT
To picture what I'd like to do:
// somewhere in business logic
var user = User.GetByID(userID);
var posts = user.GetAllPostsForThisMonth();
foreach (var post in posts)
{
    Console.WriteLine(post.Answers.Count);
}
Normally I'm not allowed to do this because I can't get post.Answers without a context...
You are closing the object context and then trying to add a log to a user that is detached. You need to load (or attach) the user inside the context so the ObjectContext knows what has been changed or added.
public static void LogLoginInfo(int userID)
{
    using (var context = new MyEntities())
    {
        var user = context.Users.Single(p => p.UserID == userID); // <= the context now knows about the user and can track changes
        var log = new LoginLog { Date = DateTime.Now };
        user.LoginLogs.Add(log);
        context.SaveChanges();
    }
}
Update
You can also attach the object.
public static void LogLoginInfo(int userID)
{
    using (var context = new MyEntities())
    {
        var user = User.GetByID(userID);
        context.Users.Attach(user); // attach before modifying so the context tracks the new log
        var log = new LoginLog { Date = DateTime.Now };
        user.LoginLogs.Add(log);
        context.SaveChanges();
    }
}
Update
var getFirstLogin = User.GetByID(userId).LoginLogs.FirstOrDefault();
N.B. if LoginLogs is a different table, you will need to use Include:
public static User GetByID(int userID)
{
    using (var context = new MyEntities())
    {
        return context.Users.Include("LoginLogs").Where(qq => qq.UserID == userID).FirstOrDefault();
    }
}
If you are open to using stored procedures (and they work nicely with EF), you can return the user object and simultaneously add to the log table with a single call to the database.
I used to do everything with SPs in my pre-EF/ORM days. When I went to EF I tried very hard to avoid stored procedures so as not to fall back into my old habits, but I have since found that with the selective use of stored procedures you can have the benefits of both: the EF way of doing things, and the functionality/performance that a well-written SP can provide.
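For illustration, a sketch of that approach, assuming a hypothetical stored procedure dbo.GetUserAndLogLogin that inserts the log row and then selects the user back (the procedure name and the ExecuteStoreQuery call are my assumptions, not from the original answer):
// With the EF4-era ObjectContext implied by the question, ExecuteStoreQuery can
// materialize the returned row as a User; a function import would work as well.
using (var context = new MyEntities())
{
    var user = context.ExecuteStoreQuery<User>(
        "EXEC dbo.GetUserAndLogLogin @UserID = {0}", userID).Single();
}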