I'm developing a Windows Store App with SQLite as the database, using SQLite for Windows Runtime and sqlite-net. I'm trying to get a list of Guids from a table, but the list contains only empty Guids ({00000000-0000-0000-0000-000000000000}). However, when querying for a single Guid it works as it should.
My code is the following:
async void SyncSurveys()
{
    SQLiteConnection _db = new SQLiteConnection(Path.Combine(ApplicationData.Current.LocalFolder.Path, "SurveyDB"));
    var localSurveys = (from s in _db.Table<Survey>()
                        select s.SurveyGuid).ToList();
    ...
}
I've also tried the query in the following format, but it doesn't work either:
var localSurveys = _db.Table<Survey>().Select(s => s.SurveyGuid).ToList();
But if I use the following query, to get only one Guid just for debugging purposes, it works well:
var localSurvey = _db.Table<Survey>().FirstOrDefault().SurveyGuid;
In the failing scenario the list's count matches the table's row count. Does anyone have any idea why this isn't working with the list version?
This isn't an answer, but it might put you down the right path. I noticed there is an option that can be added to a connection string that sets whether GUIDs are stored in binary form:
Data Source=c:\mydb.db;Version=3;BinaryGUID=False;
More info https://www.connectionstrings.com/sqlite/
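Note that connection strings apply to System.Data.SQLite rather than sqlite-net. If you stay on sqlite-net and its Guid handling turns out to be the culprit, one workaround sketch (assuming the Survey class is yours to change; property names here are illustrative) is to persist the Guid as TEXT through a string property:

public class Survey
{
    // Stored column: sqlite-net maps string to TEXT reliably.
    [PrimaryKey]
    public string SurveyGuidText { get; set; }

    // Convenience wrapper; [Ignore] keeps it out of the table mapping.
    [Ignore]
    public Guid SurveyGuid
    {
        get { return Guid.Parse(SurveyGuidText); }
        set { SurveyGuidText = value.ToString(); }
    }
}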
I think the problem is whether the basic script commands are properly written. It may also be a bug in the database.
I have a SQLite database for which I want to populate a new field based on an existing one. I want to derive the new field value using a C# function.
In pseudocode, it would be something like:
foreach (record in the SQLite database)
{
    my_new_field[record_num] = my_C#_function(existing_field_value[record_num]);
}
Having looked at other suggestions on StackOverflow, I'm using a SqliteDataReader to read each record, then running a SQLite UPDATE command keyed on the specific RowId to set the new field value for the same record.
It works .... but it's REALLY slow and thrashes the hard drive like crazy. Is there really no better way to do this?
Some of the databases I need to update might be millions of records.
Thanks in advance for any help.
Edit:
In response to the comment, here's some real code in a legacy language called Concordance CPL. The important point to note is that you can read and write changes to the current record in one go:
int db;
cycle(db)
{
    db->FIRSTFIELD = myfunction(db->SECONDFIELD);
}

myfunction(text input)
{
    text output;
    /// code in here to derive output from input
    return output;
}
I have a feeling there's no equivalent way to do this in SQLite as SQL is inherently transactional, whereas Concordance allowed you to traverse and update the database sequentially.
The answer to this is to wrap all of the updates into a single transaction.
There is an example here that does it for bulk inserts:
https://www.jokecamp.com/blog/make-your-sqlite-bulk-inserts-very-fast-in-c/
In my case, it would be bulk updates based on RowID wrapped into a single transaction.
It's now working, and performance is many orders of magnitude better.
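For the record, here is roughly what the transaction-wrapped version looks like with System.Data.SQLite. This is a sketch: the table and column names (docs, RowId, FirstField, SecondField) are illustrative, and for millions of rows you would read and update in batches rather than loading everything into memory:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SQLite;

static void BulkUpdate(string dbPath, Func<string, string> myFunction)
{
    using (var conn = new SQLiteConnection("Data Source=" + dbPath))
    {
        conn.Open();

        // Read the source values first, so we are not updating the table
        // while a reader is still walking it.
        var rows = new List<(long Id, string Value)>();
        using (var select = new SQLiteCommand("SELECT RowId, SecondField FROM docs", conn))
        using (var reader = select.ExecuteReader())
        {
            while (reader.Read())
                rows.Add((reader.GetInt64(0), reader.GetString(1)));
        }

        // One transaction around all the updates: without it, SQLite
        // journals and syncs to disk once per UPDATE, which is the slow part.
        using (var tx = conn.BeginTransaction())
        using (var update = new SQLiteCommand("UPDATE docs SET FirstField = @val WHERE RowId = @id", conn, tx))
        {
            var val = update.Parameters.Add("@val", DbType.String);
            var id = update.Parameters.Add("@id", DbType.Int64);
            foreach (var row in rows)
            {
                id.Value = row.Id;
                val.Value = myFunction(row.Value);
                update.ExecuteNonQuery();
            }
            tx.Commit();
        }
    }
}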
EDIT: per the helpful comment above, defining a custom C# function and then referencing it in a single UPDATE command also works well, and in some ways is better than the above, as you don't have to loop through the records in C# itself. See e.g. Create/Use User-defined functions in System.Data.SQLite?
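For reference, the UDF route from that linked question looks roughly like this with System.Data.SQLite (function, table and column names are illustrative):

using System.Data.SQLite;

// Register once per process; after that the function is callable from SQL.
[SQLiteFunction(Name = "MyFunction", Arguments = 1, FuncType = FunctionType.Scalar)]
public class MyFunction : SQLiteFunction
{
    public override object Invoke(object[] args)
    {
        var input = args[0] as string ?? "";
        // derive the output from the input here
        return input.Trim();
    }
}

// Usage:
//   SQLiteFunction.RegisterFunction(typeof(MyFunction));
//   then run: UPDATE docs SET FirstField = MyFunction(SecondField);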
This question has probably been asked correctly before, and I'll gladly accept an answer pointing me to the right spot. The problem is that I don't know how to phrase the question to get anything returned in a search.
I'm trying to pull data from a third-party API (ADP) and store it in my database using ASP.NET Core.
I want to take the users returned from the API and store them in my database, where I have an ADP ancillary table seeded with the majority of the data from the API.
I would then like to update or add any missing or altered records in my database FROM the API.
I'm thinking about using an AJAX call to the API to retrieve the records, then either storing the data in another table and using SQL to look for records that differ between the two tables and making any necessary changes (this would be manually activated via a button), or using some kind of scheduled background task to do this through methods in my C# code instead of AJAX.
The question I have is:
Is it a better fit to do this as a stored procedure in SQL, or to have a method in my web app perform the data transformation?
I'm looking for any examples of iterating through the returned data and updating/creating records in my database.
I've only seen vague, not-quite-what-I'm-looking-for examples and nothing definitive on the best way to accomplish this. If I can find any reference material or examples, I'll gladly research, but I don't even know where to start or the correct terms to search for. I've looked into model binding, AJAX calls, and JSON serialization & deserialization. I'm probably overthinking this.
Any suggestions or tech I should look at would be appreciated. Thanks for your time in advance.
My app is written in ASP.NET Core 2.2 using EF Core.
* EDIT *
For anyone looking - https://learn.microsoft.com/en-us/dotnet/csharp/tutorials/console-webapiclient
This with John Wu's Answer helped me achieve what I was looking for.
If this were my project this is how I would break down the tasks, in this order.
First, create an empty console application.
Next, write a method that gets the list of users from the API. You didn't tell us anything at all about the API, so here is a dummy example that uses an HTTP client.
public async Task<List<User>> GetUsers()
{
    var client = new HttpClient();
    var response = await client.GetAsync("https://SomeApi.com/Users");
    var users = await ParseResponse(response);
    return users.ToList();
}
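ParseResponse above is a stand-in. A minimal sketch of it, assuming the API returns a JSON array of users and that Json.NET is available:

private async Task<IEnumerable<User>> ParseResponse(HttpResponseMessage response)
{
    response.EnsureSuccessStatusCode();
    var json = await response.Content.ReadAsStringAsync();
    // The User class must match the shape of the API's payload.
    return JsonConvert.DeserializeObject<List<User>>(json);
}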
Test the above (e.g. write a little shoestring code to run it and dump the results, or something) to ensure that it works independently. You want to make sure it is solid before moving on.
Next, create a temporary table (or tables) that matches the schema of the data objects that are returned from the API. For now you will just want to store it exactly the way you retrieve it.
Next, write some code to insert records into the table(s). Again, test this independently, and review the data in the table to make sure it all worked correctly. It might look a little like this:
public async Task InsertUser(User user)
{
    using (var conn = new SqlConnection(Configuration.ConnectionString))
    {
        var cmd = new SqlCommand();
        //etc.
        await cmd.ExecuteNonQueryAsync();
    }
}
Once you know how to pull the data and store it, you can finish the code to extract the data from the API and insert it. It might look a little like this:
public async Task DoTheMigration()
{
    var users = await GetUsers();
    var tasks = users.Select
    (
        u => InsertUser(u)
    );
    await Task.WhenAll(tasks.ToArray());
}
As a final step, write a series of stored procedures or a DTS package to move the data from the temp tables to their final resting place. If you are using MS Access, you can write a series of queries and execute them in order with some VBA. At a high level it would do the following (a sketch of the SQL follows the list):
Check for any records that exist in the temp table but not in the final table and insert them into the final table.
Check for any records that exist in the final table but not the temp table and remove them or mark them as deleted.
Check for any records in common that have different column values and update the final table.
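A sketch of those three steps as T-SQL run from C#, assuming a staging table UsersStaging and a final table Users sharing a key column Id (all names illustrative):

public async Task MergeUsers()
{
    const string mergeSql = @"
        -- 1) rows in staging but not in the final table: insert them
        INSERT INTO Users (Id, Name, Email)
        SELECT s.Id, s.Name, s.Email
        FROM UsersStaging s
        WHERE NOT EXISTS (SELECT 1 FROM Users u WHERE u.Id = s.Id);

        -- 2) rows in the final table but not in staging: mark them deleted
        UPDATE u SET u.IsDeleted = 1
        FROM Users u
        WHERE NOT EXISTS (SELECT 1 FROM UsersStaging s WHERE s.Id = u.Id);

        -- 3) rows in common with different values: update the final table
        UPDATE u SET u.Name = s.Name, u.Email = s.Email
        FROM Users u
        JOIN UsersStaging s ON s.Id = u.Id
        WHERE u.Name <> s.Name OR u.Email <> s.Email;";

    using (var conn = new SqlConnection(Configuration.ConnectionString))
    {
        await conn.OpenAsync();
        using (var cmd = new SqlCommand(mergeSql, conn))
        {
            await cmd.ExecuteNonQueryAsync();
        }
    }
}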
Each of these development activities raises its own set of questions, of course, which you can post back to StackOverflow with details. As it is, your question doesn't have enough specificity for a more in-depth answer.
So, I have this lambda expression and it works just fine
list = list.Where(x => x.ListaDocumentoCaixa.Any(d => d.Observacao.Contains(term.Trim())));
I must add that this column is a varchar(6000) field. So far this has been working just fine, as I mentioned, but just recently I've run into an issue. It seems that if the search term occurs from position 4001 of the string onward, the query fails to return anything.
After some debugging I found this comment on the query produced by Entity Framework:
-- p__linq__0: 'maria stela gonsa' (Type = String, Size = 4000)
Then after some research I found this to be Entity Framework's standard behaviour; however, I can't have this kind of limitation in the application. My question is: is there any way to change this behaviour? I would very much like to avoid having to write this query as plain text and run it with ExecuteQuery, if possible.
Thanks in advance for the help!
I would recommend you follow the article below, assuming you are using SQL Server, about how to create a full-text search index and use it in Entity Framework with C#.
Running LIKE statements (which is what Contains() maps to) is HIGHLY inefficient on large varchar fields.
https://www.mikesdotnetting.com/article/298/implementing-sql-server-full-text-search-in-an-asp-net-mvc-web-application-with-entity-framework
EDIT: The summary of the link is:
1.) Create a full text index on the field using SQL server's wizard. That full text field will allow CONTAINS and FREETEXT searches on the whole field, and be much more efficient.
2.) Write a stored procedure that joins the table in question to results from the free text index.
3.) Make an Entity Framework class to represent results from that stored procedure, and use EF to call it and return a list of those results (a sketch follows).
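Step 3 might look like this with EF6's raw-query support, assuming a stored procedure named SearchDocumentos and an entity or DTO matching its result set (names illustrative):

// Requires: using System.Data.SqlClient; using System.Linq;
var term = "maria stela gonsa";
var results = context.Database
    .SqlQuery<DocumentoCaixa>(
        "EXEC SearchDocumentos @term",
        new SqlParameter("@term", term))
    .ToList();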
Given the following code (which is mostly irrelevant except for the last two lines), how would you get the value of the identity field for the record that was just created? Would you make a second call to the database to retrieve it based on the primary key of the object (problematic if there isn't one), or based on the last inserted record (problematic in multithreaded apps)? Or is there a more clever way to get the new value back at the same time you make the insert?
Seems like there should be a way to get an Identity back based on the insert operation that was just made rather than having to query for it based on other means.
public void Insert(O obj)
{
    var sqlCmd = new SqlCommand() { Connection = con.Conn };
    var sqlParams = new SqlParameters(sqlCmd.Parameters, obj);
    var props = obj.Properties.Where(o => !o.IsIdentity);

    InsertQuery qry = new InsertQuery(this.TableAlias);
    qry.FieldValuePairs = props.Select(o => new SqlValuePair(o.Alias, sqlParams.Add(o))).ToList();

    sqlCmd.CommandText = qry.ToString();
    sqlCmd.ExecuteNonQuery();
}
EDIT: While this question isn't a duplicate in the strictest manner, it's almost identical to this one which has some really good answers: Best way to get identity of inserted row?
It strongly depends on your database server. For example, with Microsoft SQL Server you can read the @@IDENTITY variable, which contains the last identity value assigned.
To prevent race conditions you must keep the insert query and the variable read inside one transaction.
Another solution could be to create a stored procedure for every type of insert you have to do, and have it accept the insert arguments and return the identity value.
Otherwise, inside a transaction you can implement whatever ID assignment logic you want and be preserved from concurrency problems.
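A common variant of the same idea (a sketch, not tied to the poster's InsertQuery helper) appends SELECT SCOPE_IDENTITY() to the INSERT batch; unlike @@IDENTITY, SCOPE_IDENTITY() is not affected by identities generated in triggers:

public int InsertAndGetId(SqlConnection conn, string name)
{
    // One round trip: the batch inserts the row, then returns the
    // identity value generated in the current scope.
    const string sql = @"
        INSERT INTO MyTable (Name) VALUES (@name);
        SELECT CAST(SCOPE_IDENTITY() AS int);";

    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@name", name);
        return (int)cmd.ExecuteScalar();
    }
}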
AFAIK there is no ready-made way.
I solved it by using client-generated IDs (Guids), so that my method generates the ID and returns it to the caller.
Perhaps you could analyse some SQL Server system tables to see what last changed, but you would get concurrency issues (what if someone else inserts a very similar record?).
So I would recommend a change of strategy: generate the IDs on the client.
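A sketch of the client-generated-ID approach, assuming the key column is a uniqueidentifier (table and column names illustrative):

public Guid InsertUser(SqlConnection conn, string name)
{
    // The client invents the key, so no round trip is needed to learn it.
    var id = Guid.NewGuid();
    using (var cmd = new SqlCommand(
        "INSERT INTO Users (Id, Name) VALUES (@id, @name)", conn))
    {
        cmd.Parameters.AddWithValue("@id", id);
        cmd.Parameters.AddWithValue("@name", name);
        cmd.ExecuteNonQuery();
    }
    return id;
}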
You can take a look at this link.
I might add that, to avoid problems when multiple matching rows can exist, you can use transactions: run the INSERT and the SELECT inside the same transaction.
Good luck.
The proper approach is to learn SQL. You can send an INSERT followed by a SELECT in one batch, so the command both inserts the row and returns the assigned identity.
I am using LINQ to SQL with SQL Server Compact Edition 3.5 and VS2008.
I have a very simple table (Tokens) with a uniqueidentifier primary key (TokenID) and two other nullable fields (UsedBy and UsedOn). I am trying to use LINQ to insert new rows into the table but for some reason they are not persisting.
Here is my code:
var connectionstring = "Data Source=|DataDirectory|\\MyData.sdf";
MyData db = new MyData(connectionstring) { Log = Console.Out };

Tokens token = new Tokens { TokenID = Guid.NewGuid() };
db.Tokens.InsertOnSubmit(token);
//db.GetChangeSet();
db.SubmitChanges();

var tokens = from t in db.Tokens
             select t;
foreach (var t in tokens)
{
    Debug.Print(t.TokenID.ToString());
}
If I uncomment db.GetChangeSet(); I can see the pending insert, and when I iterate and print the tokens to the debug window the number of tokens grows each run. But if I query the table in VS (via Show Table Data) it is empty. Viewing the data like this also "resets" the tokens returned by LINQ to their original state.
I am pretty sure I am making some simple mistake, but I can't see it. Any ideas?
Check whether your DB file is being copied to the output directory at each build (the "Copy to Output Directory" setting in the property page of the project item); a fresh copy on every build will silently overwrite the database your app has been writing to.
Of course, 5 minutes after posting my question I had a huge DUH! moment. Turns out I was refreshing the database in my project to see if the inserts were persisting, when I should have been checking \bin\Debug\MyData.sdf instead.
Trust the code Sean, trust the code. Oh, and remember you're an idiot sometimes.