MySQL insert based on datetime - C#

This is my first post ever, so I apologize if I unknowingly break any rules regarding post etiquette.
I am migrating db entries using C# and the Redmine API. I need to pull the time entries in XML and insert into MySQL if and only if the time entries are newer than what exists currently in the database.
Right now I am prototyping this by simply rerouting the original MySQL db entries back to MySQL and into a new table. Later I will be using ERP software similar to SAP.
I am not sure if the program logic needs to be primarily in the C# code, or in a MySQL statement. Here is where I am stuck:
foreach (TimeEntry t in timeEntries.list)
{
    comm.CommandText =
        "INSERT INTO time_entries(time_entry_id,project_id,issue_id,user_id,activity_id,hours,comments,"
        + "spent_on,created_on,updated_on,customer_number) "
        + "VALUES(?time_entry_id,?project_id,?issue_id,?user_id,?activity_id,?hours,?comments,?spent_on,"
        + "?created_on,?updated_on,?customer_number)";
    comm.Parameters.AddWithValue("?time_entry_id", t.id);
    comm.Parameters.AddWithValue("?project_id", t.project.Id);
    comm.Parameters.AddWithValue("?issue_id", t.issue.id);
    comm.Parameters.AddWithValue("?user_id", t.user.id);
    comm.Parameters.AddWithValue("?activity_id", t.activity.id);
    comm.Parameters.AddWithValue("?hours", t.hours);
    comm.Parameters.AddWithValue("?comments", t.comments);
    comm.Parameters.AddWithValue("?spent_on", t.spent_on);
    comm.Parameters.AddWithValue("?created_on", t.created_on);
    comm.Parameters.AddWithValue("?updated_on", t.updated_on);
    comm.Parameters.AddWithValue("?customer_number", t.custom_fields.list[0].value);
    comm.ExecuteNonQuery();
    comm.Parameters.Clear();
}
Of course, everything inserts just fine, and subsequent runs of the program will simply insert duplicate records. When this is finished, the idea is that it will run as a daily cron job that will update MySQL with new time entries only.
As I mentioned, at a later point I will be sending the XML data directly to an ERP program similar to SAP software, but right now I want to make sure the MySQL to MySQL prototype works first.
I am a relatively new programmer, and a first time poster, so if there are flaws with my code logic or if my question is not specific enough then please let me know. Thanks in advance.

Actually, I just figured out a working solution. Not sure if it is the best way to do it, but it works, so I am happy.
All I did was change updated_on to be the primary key in MySQL, and then handle the exception that is thrown for duplicate entries:
try
{
    comm.ExecuteNonQuery();
}
catch (MySql.Data.MySqlClient.MySqlException mySqlEx)
{
    // Error 1062 is MySQL's duplicate-key error; anything else should not be swallowed here.
    if (mySqlEx.Number == 1062)
        Console.WriteLine("Duplicate entry found");
    else
        throw;
}
I wanted to post the code in case it helps anyone else searching for an answer to this same issue. As far as I am concerned, this problem is now solved.
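If you would rather not rely on the duplicate-key exception for control flow, another option is MySQL's INSERT ... ON DUPLICATE KEY UPDATE (or INSERT IGNORE). This is only a sketch and assumes time_entry_id carries a UNIQUE or PRIMARY KEY index, which is an assumption about the schema rather than something stated above:
comm.CommandText =
    "INSERT INTO time_entries(time_entry_id,project_id,issue_id,user_id,activity_id,hours,comments,"
    + "spent_on,created_on,updated_on,customer_number) "
    + "VALUES(?time_entry_id,?project_id,?issue_id,?user_id,?activity_id,?hours,?comments,?spent_on,"
    + "?created_on,?updated_on,?customer_number) "
    + "ON DUPLICATE KEY UPDATE hours = VALUES(hours), comments = VALUES(comments), updated_on = VALUES(updated_on)";
// Parameters are added exactly as in the loop above. Rerunning the job then refreshes
// rows that already exist instead of throwing a duplicate-entry error; with INSERT IGNORE
// duplicates would simply be skipped.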

You can try using DUAL
INSERT INTO time_entries (time_entry_id, project_id, issue_id, user_id,
                          activity_id, hours, comments, spent_on,
                          created_on, updated_on, customer_number)
SELECT ?time_entry_id, ?project_id, ?issue_id, ?user_id, ?activity_id,
       ?hours, ?comments, ?spent_on, ?created_on, ?updated_on,
       ?customer_number
FROM DUAL
WHERE '2015-11-20' NOT IN (SELECT YourTimeField FROM time_entries)
I tried to write a clear example (I hope you understand the logic).
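To tie that back to the C# loop above, the hard-coded date would become one of the command's own parameters. A sketch only, assuming updated_on is the field being compared (the same named parameter can be referenced more than once in the command text):
comm.CommandText =
    "INSERT INTO time_entries(time_entry_id,project_id,issue_id,user_id,activity_id,hours,comments,"
    + "spent_on,created_on,updated_on,customer_number) "
    + "SELECT ?time_entry_id,?project_id,?issue_id,?user_id,?activity_id,?hours,?comments,?spent_on,"
    + "?created_on,?updated_on,?customer_number "
    + "FROM DUAL "
    + "WHERE ?updated_on NOT IN (SELECT updated_on FROM time_entries)";
// This mirrors the NOT IN check above: the row is inserted only when that exact
// updated_on value is not already present in time_entries.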

Related

C# Windows form not updating values to SQL database

I have a problem with my program that's supposed to store projects given by the user in a database. I'm stuck on the edit project button. After entering new values and hitting the button to save them, everything runs successfully with no errors: the message box that says "project edited" appears, I hit OK, but the database stays the same. There is no error in the code, and the SQL that gets sent to update the database values is also correct, but it doesn't work. Can anyone help with this, because I am lost?
Here is the method that creates and executes the SQL code to update the database (it was posted only as a screenshot and is not reproduced here).
Wow man, that code is wrong in so many ways according to the most popular coding standards and principles :) But that is not directly what the question is about; to get you past being lost, we have to start with the basics, tbh:
Suggestions
When you catch that exception, show it in a MessageBox too; you can even add an error icon as part of the .Show call, it's built in.
Move the connection.Close to a finally block instead of having it duplicated (see the sketch after this list).
Consider making a SQL stored procedure instead and just pass the parameters into it; the code as posted is prone to SQL injection.
Or skip the procedure and familiarize yourself with Entity Framework; it's going to make your life so much easier.
Do not concatenate strings like that; use interpolation or a StringBuilder. Every + copies both operands into a new string, which is super inefficient.
When you write that the code works and the SQL is correct, consider that the outcome is not the desired one, so technically it doesn't ;) The best and the worst thing about computers is that they do exactly what you ask.
Don't write your DAL code in the form at all.
Consider checking your parameters for default values.
You do not have the data to say 'project was updated', only 'values were saved'; you never check the old values in the code.
Still, besides that, I do not see why what you wrote wouldn't work, provided the resulting SQL is valid for the database you use; but I suppose that if you do some of these things, the error will present itself.
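Since the edit method itself is only visible in the screenshot, here is a hypothetical sketch of a few of those points in practice. The method signature and the name column are invented; the b1c.projects table and the projectid/finishdate columns are taken from the FinishProject function below:
public static void EditProject(int projID, string name, string finishdate) {
    try {
        connection.Open();
        using (var cmd = connection.CreateCommand()) {
            cmd.CommandType = CommandType.Text;
            cmd.CommandText = "UPDATE `b1c`.`projects` SET `name` = @name, `finishdate` = @finishdate " +
                              "WHERE `projectid` = @projID;";
            cmd.Parameters.AddWithValue("@name", name);
            cmd.Parameters.AddWithValue("@finishdate", finishdate);
            cmd.Parameters.AddWithValue("@projID", projID);

            int rows = cmd.ExecuteNonQuery();
            // Report what actually happened instead of an unconditional "project edited".
            MessageBox.Show(rows > 0 ? "Values were saved." : "No matching project was found.",
                            "Edit project", MessageBoxButtons.OK,
                            rows > 0 ? MessageBoxIcon.Information : MessageBoxIcon.Warning);
        }
    } catch (Exception ex) {
        // Show the real error, with the built-in error icon, rather than a generic "Catch".
        MessageBox.Show(ex.Message, "Edit project", MessageBoxButtons.OK, MessageBoxIcon.Error);
    } finally {
        connection.Close();   // closed exactly once, whether or not the update succeeded
    }
}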
I don't think it's a connection problem because I have a function that updates only the finishdate of the project and that works completely fine.
Here is the function:
public static MySqlCommand FinishProject(int projID, string finishdate) {
    try {
        if (connection != null) {
            connection.Open();
            cmd = connection.CreateCommand();
            cmd.CommandType = CommandType.Text;
            cmd.Parameters.AddWithValue("@value", projID);
            cmd.Parameters.AddWithValue("@finishdate", finishdate);
            cmd.CommandText = "UPDATE `b1c`.`projects` SET `finishdate` = (@finishdate) WHERE (`projectid` = (@value));";
            int i = cmd.ExecuteNonQuery();
            connection.Close();
            if (i != 0) {
                MessageBox.Show("Project finalized.");
                i = 0;
            }
        }
    } catch (Exception ex) {
        MessageBox.Show("Catch");
        connection.Close();
    }
    return cmd;
}
You can see it's basically the same; the only difference is the values.
So it shouldn't be a connection thing, because this one works fine, I think.
I also don't think it's a problem in the SQL database, because all the problems
I've had up until now that had anything to do with the database have shown up as errors in Visual Studio.
If anyone can help, I will provide screenshots of anything you need, and thank you all once again for trying to help.

Update Table based on updated information

C#
I am currently working on a project that relies on downloading a table of roughly 100 entries.
I first download the table and store it in a local variable, then link the variable to a DataGridView where the user can edit values.
Once done, the user pushes save, and it must update the table in the SQL DB with the changed information.
I am asking for a best practice here: is it advisable to delete the rows you have changed and bulk upload the changes, or to update based on one or even multiple parameters?
I know that when working with SQL exclusively you can use commands like UPDATE FROM and use tables as the source, but I do not know how this would work using C#.
Thanks for the help in advance.
public DataTable GetSingleTable(string sTableName, string sGetConnString)
{
    DataTable dtTabletoReturn = new DataTable();

    // sBranchID is assumed to be a class-level field. The branch filter is parameterized to avoid
    // SQL injection; the table name cannot be parameterized, so it must come from a trusted source.
    string sCommand = "SELECT * FROM " + sTableName + " WHERE BranchID = @BranchID";

    using (SqlConnection sqlConnection = new SqlConnection(sGetConnString))
    using (SqlDataAdapter sqlOilAdapter = new SqlDataAdapter(sCommand, sqlConnection))
    {
        sqlOilAdapter.SelectCommand.Parameters.AddWithValue("@BranchID", sBranchID);
        sqlOilAdapter.Fill(dtTabletoReturn);   // Fill opens and closes the connection as needed
    }

    return dtTabletoReturn;
}
Entity Framework will be the best practice for you. You can start with the basics here:
https://www.entityframeworktutorial.net/what-is-entityframework.aspx
As others have mentioned, if it's not impossible in your project, try EF Core or Dapper; both should simplify your struggles (not without adding some others later in a few peculiar scenarios).
If you go with EF Core, take a look at connected vs. disconnected scenarios.
In any case, when getting data with EF in, let's say for simplicity, the connected scenario, the EF Core context tracks the entities (your data).
It will detect changes made to those entities, so in the end just calling SaveChanges() on the EF Core DbContext will save and transfer only the modified data.
Mind that this is a very basic explanation; you will have to read about it yourself if you choose to go that way.
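A minimal sketch of that connected scenario (the context, entity, and property names here are invented; the real model depends on your tables):
// Hypothetical DbContext and entity - adjust to your own model.
using (var context = new StockContext())
{
    // Entities loaded through the context are tracked automatically.
    var rows = context.StockItems.Where(s => s.BranchID == branchId).ToList();

    // ...the user edits values, e.g. via a DataGridView bound to 'rows'...
    rows[0].Quantity = 42;

    // Only the entities whose properties actually changed are written back as UPDATEs.
    context.SaveChanges();
}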
So after fiddling around, I'd rate the best procedure to be the DataAdapter Update command (I was looking for best practices here). Unfortunately Entity Framework, as far as I can tell, works best when building an application from scratch. https://www.youtube.com/watch?v=Y-aGjF6_Ptc&t=166s <- this was the best resource so far.
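For reference, the DataAdapter route can look roughly like this: a sketch that assumes the edited DataTable still carries its row-state information (from GetSingleTable above) and that the table has a primary key, which the command builder needs:
public void SaveSingleTable(DataTable dtEdited, string sTableName, string sConnString)
{
    using (SqlConnection conn = new SqlConnection(sConnString))
    using (SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM " + sTableName, conn))
    using (SqlCommandBuilder builder = new SqlCommandBuilder(adapter))
    {
        // The builder derives INSERT/UPDATE/DELETE commands from the SELECT, and
        // Update() applies only the rows whose RowState changed in the grid.
        adapter.Update(dtEdited);
    }
}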

Populate new field in SQLite database using existing field value and a C# function

I have a SQLite database for which I want to populate a new field based on an existing one. I want to derive the new field value using a C# function.
In pseudocode, it would be something like:
foreach ( record in the SQLite database)
{
my_new_field[record_num] = my_C#_function(existing_field_value[record_num]);
}
Having looked at other suggestions on Stack Overflow, I'm using a SqliteDataReader to read each record, and then running a SQLite "UPDATE" command based on the specific RowId to update the new field value for the same record.
It works... but it's REALLY slow and thrashes the hard drive like crazy. Is there really no better way to do this?
Some of the databases I need to update might be millions of records.
Thanks in advance for any help.
Edit:
In response to the comment, here's some real code in a legacy language called Concordance CPL. The important point to note is that you can read and write changes to the current record in one go:
int db;
cycle(db)
{
    db->FIRSTFIELD = myfunction(db->SECONDFIELD);
}

myfunction(text input)
{
    text output;
    /// code in here to derive output from input
    return output;
}
I have a feeling there's no equivalent way to do this in SQLite as SQL is inherently transactional, whereas Concordance allowed you to traverse and update the database sequentially.
The answer to this is to wrap all of the updates into a single transaction.
There is an example here that does it for bulk inserts:
https://www.jokecamp.com/blog/make-your-sqlite-bulk-inserts-very-fast-in-c/
In my case, it would be bulk updates based on RowID wrapped into a single transaction.
It's now working, and performance is many orders of magnitude better.
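A rough sketch of that pattern using the Microsoft.Data.Sqlite package (a different package from the System.Data.SQLite one in the linked post, so the API differs slightly; the table and column names are invented, and ReadExistingValues / MyCSharpFunction stand in for your own reader and derivation function):
// Sketch: one transaction around all UPDATEs, one reused parameterized command.
using (var conn = new SqliteConnection("Data Source=mydb.sqlite"))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        var cmd = conn.CreateCommand();
        cmd.Transaction = tx;
        cmd.CommandText = "UPDATE mytable SET my_new_field = $newValue WHERE rowid = $rowid";
        var pNew = cmd.Parameters.Add("$newValue", SqliteType.Text);
        var pRow = cmd.Parameters.Add("$rowid", SqliteType.Integer);

        // ReadExistingValues is assumed to have materialized (rowid, existing value) pairs into a list.
        foreach (var (rowId, existingValue) in ReadExistingValues(conn))
        {
            pNew.Value = MyCSharpFunction(existingValue);
            pRow.Value = rowId;
            cmd.ExecuteNonQuery();
        }

        tx.Commit();   // all updates are written in one go instead of one disk sync per row
    }
}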
EDIT: per the helpful comment above, defining a custom C# function and then referencing it in a single UPDATE command also works well, and in some ways is better than the above, as you don't have to loop through the rows in C# at all. See e.g. Create/Use User-defined functions in System.Data.SQLite?
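With Microsoft.Data.Sqlite the user-defined-function idea looks roughly like this (again only a sketch; with System.Data.SQLite, as in the linked question, you would subclass SQLiteFunction instead, and the names here are the same invented ones as above):
// Sketch: register the C# derivation as a SQLite scalar function, then let one UPDATE do the looping.
conn.CreateFunction("myfunction", (string existingValue) => MyCSharpFunction(existingValue));

var cmd = conn.CreateCommand();
cmd.CommandText = "UPDATE mytable SET my_new_field = myfunction(existing_field)";
cmd.ExecuteNonQuery();   // a single statement touches every row, no C# loop required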

Trying to sync data from third party api

This question has probably been asked correctly before, and I'll gladly accept an answer pointing me to the right spot. The problem is I don't know how to ask the question correctly to get anything returned in a search.
I'm trying to pull data from a 3rd party api (ADP) and store data in my database using asp.net core.
I want to take the users returned from the API and store them in my database, where I have an ADP ancillary table seeded with the majority of the data from the API.
I would then like to update or add any missing or altered records in my database FROM the API.
I'm thinking about using an AJAX call to the API to retrieve the records, then either storing the data in another table and using SQL to look for records that changed between the two tables and make any necessary changes (this would be manually activated via a button), or using some kind of scheduled background task to perform this through methods in my C# code instead of AJAX.
The question I have is:
Is it a better fit to do this as a stored procedure in SQL, or to have a method in my web app perform the data transformation?
I'm looking for any examples of iterating through the returned data and updating/creating records in my database.
I've only seen vague, not-quite-what-I'm-looking-for examples and nothing definitive on the best way to accomplish this. If I can find any reference material or examples, I'll gladly research them, but I don't even know where to start or the correct terms to search for. I've looked into model binding, AJAX calls, and JSON serialization & deserialization. I'm probably overthinking this.
Any suggestions or tech I should look at would be appreciated. Thanks for your time in advance.
My app is written in ASP.NET Core 2.2 using EF Core.
* EDIT *
For anyone looking - https://learn.microsoft.com/en-us/dotnet/csharp/tutorials/console-webapiclient
This, together with John Wu's answer below, helped me achieve what I was looking for.
If this were my project this is how I would break down the tasks, in this order.
First, start an empty console application.
Next, write a method that gets the list of users from the API. You didn't tell us anything at all about the API, so here is a dummy example that uses an HTTP client.
public async Task<List<User>> GetUsers()
{
    var client = new HttpClient();
    var response = await client.GetAsync("https://SomeApi.com/Users");
    var users = await ParseResponse(response);   // ParseResponse = your own deserialization of the API payload
    return users.ToList();
}
Test the above (e.g. write a little shoestring code to run it and dump the results, or something) to ensure that it works independently. You want to make sure it is solid before moving on.
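A throwaway harness for that test might look like this (ApiClient is a stand-in for whatever class ends up holding GetUsers, and Id/Name are placeholders for your User properties):
// Quick-and-dirty check that the API call and parsing work before going any further.
public static async Task Main()
{
    var users = await new ApiClient().GetUsers();
    foreach (var u in users)
    {
        Console.WriteLine($"{u.Id}: {u.Name}");
    }
}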
Next, create a temporary table (or tables) that matches the schema of the data objects that are returned from the API. For now you will just want to store it exactly the way you retrieve it.
Next, write some code to insert records into the table(s). Again, test this independently, and review the data in the table to make sure it all worked correctly. It might look a little like this:
public async Task InsertUser(User user)
{
    using (var conn = new SqlConnection(Configuration.ConnectionString))
    {
        var cmd = new SqlCommand();
        //etc.
        await cmd.ExecuteNonQueryAsync();
    }
}
Once you know how to pull the data and store it, you can finish the code to extract the data from the API and insert it. It might look a little like this:
public async Task DoTheMigration()
{
    var users = await GetUsers();
    var tasks = users.Select
    (
        u => InsertUser(u)
    );
    await Task.WhenAll(tasks.ToArray());
}
As a final step, write a series of stored procedures or a DTS package to move the data from the temp tables to their final resting place. If you are using MS Access, you can write a series of queries and execute them in order with some VBA. At a high level it would do the following (see the sketch after this list):
Check for any records that exist in the temp table but not in the final table and insert them into the final table.
Check for any records that exist in the final table but not the temp table and remove them or mark them as deleted.
Check for any records in common that have different column values and update the final table.
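A hedged sketch of those three steps as plain T-SQL run from C#; the staging and final table names (AdpUsersTemp, AdpUsers) and the EmployeeId/FirstName/LastName columns are invented for illustration:
// Hypothetical schema: AdpUsersTemp (staging) and AdpUsers (final), keyed on EmployeeId.
string[] syncSteps =
{
    // 1. Insert records that exist in the temp table but not in the final table.
    @"INSERT INTO AdpUsers (EmployeeId, FirstName, LastName)
      SELECT t.EmployeeId, t.FirstName, t.LastName
      FROM AdpUsersTemp t
      WHERE NOT EXISTS (SELECT 1 FROM AdpUsers f WHERE f.EmployeeId = t.EmployeeId);",

    // 2. Remove (or flag) records that the API no longer returns.
    @"DELETE FROM AdpUsers
      WHERE NOT EXISTS (SELECT 1 FROM AdpUsersTemp t WHERE t.EmployeeId = AdpUsers.EmployeeId);",

    // 3. Update records in common whose column values differ.
    @"UPDATE f
      SET f.FirstName = t.FirstName, f.LastName = t.LastName
      FROM AdpUsers f
      INNER JOIN AdpUsersTemp t ON t.EmployeeId = f.EmployeeId
      WHERE f.FirstName <> t.FirstName OR f.LastName <> t.LastName;"
};

using (var conn = new SqlConnection(Configuration.ConnectionString))
{
    await conn.OpenAsync();
    foreach (var sql in syncSteps)
    {
        await new SqlCommand(sql, conn).ExecuteNonQueryAsync();
    }
}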
Each of these development activities raises its own set of questions, of course, which you can post back to Stack Overflow with details. As it is, your question doesn't have enough specificity for a more in-depth answer.

How to split synchronization process in sync framework

I'm using Sync Framework to synchronize a SQL Server 2008 database with SQL CE on a mobile device. Everything looks fine apart from some problems. One of them is:
If I want to sync 1000 or more rows, I get an OutOfMemory exception on the mobile device (though the sync completes well, because afterwards I check the data of some rows and it looks synced). I thought that maybe too-large XMLs are rotating between the mobile device and the server (for 100 rows everything works just fine)... That's why I asked about how to split the sent data. But maybe I'm wrong. I didn't find any resources on this, so I don't know exactly WHAT can eat so much memory just to add 60 KB to the compact database.
You'll need to implement some sort of batching.
A quite naive version of it is shown here:
http://msdn.microsoft.com/en-us/library/bb902828.aspx
I've seen that you're interested in some filtering. If this will filter out some, or rather a lot, of rows, I would recommend writing your own batch logic. The one we're currently using sets @sync_new_received_anchor to the anchor of the @sync_batch_size:th row to be synced.
In a quite simplified way, the logic looks like this:
SELECT @sync_new_received_anchor = MAX(ThisBatch.ChangeVersion)
FROM (SELECT TOP (@sync_batch_size) CT.SYS_CHANGE_VERSION AS ChangeVersion
      FROM TabletoSync
      INNER JOIN CHANGETABLE(CHANGES [TabletoSync],
                             @sync_last_received_anchor) AS CT
              ON TabletoSync.TabletoSyncID = CT.TabletoSyncID
      WHERE TabletoSync.FilterColumn = @ToClient
      ORDER BY CT.SYS_CHANGE_VERSION ASC) AS ThisBatch
