Linq To Sql updates DataContext but not the database - c#

I can't update the database: while I'm debugging, the Production records are created and updated in the context, but when I stop debugging the database contains none of the new data.
I enjoyed coding until I got stuck on this SubmitChanges() method that updates the context but not the database. It really kills me; I've been stuck for a few hours searching the web for a solution.
By the way, I set 'id' as the Production primary key with identity increment in the PK property, and I also set auto increment in the .dbml file. Below is my code:
qmsDBDataContext context = new qmsDBDataContext();

public void AddProduction(int quails, int eggs, int feeds, int id_box)
{
    Production production = new Production();
    production.quails = quails;
    production.eggs = eggs;
    production.id_box = id_box;
    production.feeds = feeds;

    context.Productions.InsertOnSubmit(production);
    context.SubmitChanges();
}

I found a fix that exactly answers my problem; it is explained here.
The .mdf file I was working with was the copy in my project's Debug output folder, while the .mdf file that Server Explorer was looking at was the one in the project folder. Thanks to Matt Warren - MSFT for answering this question. Cheers.
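For anyone hitting the same thing, one quick way to confirm which .mdf the DataContext is actually writing to is to print its resolved connection string at runtime. This is just a diagnostic sketch using the qmsDBDataContext from the question:

// Print the connection string the DataContext resolved at runtime.
// If it contains |DataDirectory|, it usually expands to bin\Debug,
// which is a different copy of the .mdf than the one Server Explorer opens.
using (qmsDBDataContext context = new qmsDBDataContext())
{
    Console.WriteLine(context.Connection.ConnectionString);
}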
@sgmoore Thanks for giving me ideas about the TransactionScope thing.

Related

How To DEALLOCATE PREPARE Statement In C# Using Mysql.Data Client?

Hope you guys are fine?
OK, I am using the MySql.Data client/library to access and use a MySQL database. I have been happily using it for some time on quite a few projects, but suddenly I am facing a new issue that is putting my current project on hold. :(
The current project makes a lot of DB queries, and I am facing the following exception:
Can't create more than max_prepared_stmt_count statements (current value: 16382)
I am closing and disposing the DB engine/connection every time I am done with it, but I am confused as to why I am still getting this error.
Here is some sample code, just to give you an idea (unnecessary parts trimmed out):
// This loop calls an API with pagination and gets the API response
while (ContinueSalesOrderPage(apiClient, ref pageNum, days, out string response, window) == true)
{
    // This handles the API data for the current page; it's normally 500 entries per page,
    // and it throws the error on the 4th page
    KeyValueTag error = HandleSalesOrderPageData(response, pageNum, out int numOrders, window);
}

private KeyValueTag HandleSalesOrderPageData(string response, int pageNum, out int numOrders, WaitWindow window)
{
    numOrders = json.ArrayOf("List").Size;

    // Init db
    DatabaseWriter dbEngine = new DatabaseWriter()
    {
        Host = dbHost,
        Name = dbName,
        User = dbUser,
        Password = dbPass,
    };

    // Connect to the database
    bool pass = dbEngine.Connect();

    // Loop through all the entries for the page; generally it's 500 entries
    for (int orderLoop = 0; orderLoop < numOrders; orderLoop++)
    {
        // This actually handles the queries; per loop there could be 3 to 10+
        // insert/update queries using prepared statements
        KeyValueTag error = InsertOrUpdateSalesOrder(dbEngine, item, config, pageNum, orderLoop, numOrders, window);
    }

    // Here, as you can see, I disconnect from the db engine; the following method
    // also closes the db connection beforehand
    dbEngine.Disconnect();
}

// Code from the DatabaseWriter class; as you can see, this method closes and disposes the database connection properly
public void Disconnect()
{
    _CMD.Dispose();
    _engine.Close();
    _engine.Dispose();
}
So, as you can see, I close/dispose the database connection for each page processed, but it still shows that error on the 4th page. FYI, the 4th page's data is not the problem; I checked that. If I skip the other pages and only process the 4th page, it processes successfully.
After some more digging on Google, I found that prepared statements are stored on the database server and need to be closed/deallocated, but I can't find any way to do that using the MySql.Data client. :(
The following page says:
https://dev.mysql.com/doc/refman/8.0/en/sql-prepared-statements.html
A prepared statement is specific to the session in which it was created. If you terminate a session without deallocating a previously prepared statement, the server deallocates it automatically.
But that seems incorrect, as I am facing the error even after closing the connection on each loop.
So I am at a dead end and looking for some help here.
Thanks in advance.
Best regards
From the official docs, the role of max_prepared_stmt_count is:
This variable limits the total number of prepared statements in the server.
Therefore, you need to increase the value of this variable to raise the maximum number of prepared statements allowed by your MySQL server's configuration:
Open the my.cnf file.
Under the mysqld section there is a variable max_prepared_stmt_count. Edit the value accordingly (remember that the upper limit of this value is 1048576).
Save and close the file, then restart the MySQL service for the changes to take effect.
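If editing my.cnf is not convenient, the same server variable can usually be changed at runtime from the client, provided the connecting user has the privilege to set global variables. This is a hedged sketch using MySql.Data; the connection string is a placeholder, and the change still reverts when the server restarts:

using MySql.Data.MySqlClient;

// Raise the server-wide prepared-statement limit without restarting MySQL.
string connectionString = "Server=dbHost;Database=dbName;Uid=dbUser;Pwd=dbPass;";
using (var conn = new MySqlConnection(connectionString))
{
    conn.Open();
    using (var cmd = new MySqlCommand("SET GLOBAL max_prepared_stmt_count = 1048576;", conn))
    {
        cmd.ExecuteNonQuery();
    }
}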
You're probably running into bug 77421 in MySql.Data: by default, it doesn't reset connections.
This means that temporary tables, user-declared variables, and prepared statements are never cleared on the server.
You can fix this by adding Connection Reset = True; to your connection string.
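For illustration, that might look like this in code (the host/database/user values are placeholders taken from the question's variable names):

// Connection Reset=True makes MySql.Data reset the session when a pooled
// connection is reused, so server-side prepared statements, temporary tables
// and user variables get cleaned up instead of piling up.
string connectionString = "Server=dbHost;Database=dbName;Uid=dbUser;Pwd=dbPass;Connection Reset=True;";
using (var conn = new MySql.Data.MySqlClient.MySqlConnection(connectionString))
{
    conn.Open();
    // ... run the prepared statements as before ...
}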
Another fix would be to switch to MySqlConnector, an alternative ADO.NET provider for MySQL that fixes this and other bugs. (Disclaimer: I'm the lead author.)

Uncheckout tfs file programmatically if not modified using C#

After searching Google for a couple of hours, I found an answer to my question. I know the post Undo checkout TFS answers my question, but it doesn't answer all the questions I have. I want to achieve the same objective the post asked about: how do I revert only the files that were checked out but not modified? The answer to my question shouldn't be too hard.
What I'm doing is copying files from a server and overwriting them in my local workspace. I check out all the files being copied. However, if a copied file was not modified in any way (the server file and the destination file are exactly the same), I'd like to undo the checkout of that file.
I know I'm supposed to use the workspace.Undo() method, and the gentleman said it worked for him, but he didn't show how he implemented it.
Here is the code I have, with help from the link:
public static void CheckOutFromTFS(string filepath)
{
    var workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(filepath);
    if (workspaceInfo == null)
    {
        return;
    }

    var server = new TfsTeamProjectCollection(workspaceInfo.ServerUri);
    var workspace = workspaceInfo.GetWorkspace(server);
    workspace.PendEdit(filepath);
}
The answer given was to use the workspace.Undo() method. Do I add this method as the last line in CheckOutFromTFS() like so?
public static void CheckOutFromTFS(string filepath)
{
    var workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(filepath);
    if (workspaceInfo == null)
    {
        return;
    }

    var server = new TfsTeamProjectCollection(workspaceInfo.ServerUri);
    var workspace = workspaceInfo.GetWorkspace(server);
    workspace.PendEdit(filepath);
    workspace.Undo();
}
Or is it done differently? I'm not sure if this Undo() will only revert files if there are no changes or just revert the checkout entirely and render the PendEdit() useless. Can someone help clarify this for me?
If you use a local workspace then all files that have no changes will automatically revert to not checked out. You don't need to do anything at all. This works with VS 2012 or later against TFS 2012 or later. You'll need to convert your workspace to a local workspace first, like this.
So I found the answer to my question in various posts. I took bits and pieces and combined them to get a working solution. Calling the Undo() function with the file path does undo the checkout of the file regardless of whether it was modified or not. My workspace was also local, but VS and TFS couldn't automatically revert those unmodified files for me, so I took the approach below.
What I decided to do was use the Team Foundation Power Tools "uu" command to undo the changes to unchanged files in the workspace. I created a batch file and entered the following command: echo y | tfpt uu . /noget /recursive. Since we do not show the shell during execution, I used "echo y" to automatically answer the prompt "Do you wish to undo these redundant pending changes? (Y/N)". Including /noget is highly recommended, since it prevents a forced 'get latest' of all of your project's files, which, depending on the total number, can take an extremely long time.
var startInfo = new System.Diagnostics.ProcessStartInfo
{
    WorkingDirectory = projectRoot,
    FileName = projectRoot + @"\undoUnchanged.bat",
    UseShellExecute = false,
    CreateNoWindow = true
};

Process process = Process.Start(startInfo);
process.WaitForExit();
process.Close();
After the script runs and process.Close() executes, you can double-check that your unmodified files were actually unchecked out by hitting the refresh button on the Team Explorer window in your project. Hope someone else can find some use in this.
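If you would rather stay in C# instead of shelling out to tfpt, a rough sketch of the same idea is shown below. It reuses the Microsoft.TeamFoundation.VersionControl.Client types from the question; CopyWithCheckout and sourcePath are hypothetical names introduced purely for illustration:

// Hypothetical helper: check the file out, copy the new version over it,
// and undo the checkout again if the incoming file was identical anyway.
public static void CopyWithCheckout(Workspace workspace, string filepath, string sourcePath)
{
    byte[] incoming = System.IO.File.ReadAllBytes(sourcePath);
    byte[] current = System.IO.File.ReadAllBytes(filepath);
    bool unchanged = System.Linq.Enumerable.SequenceEqual(incoming, current);

    workspace.PendEdit(filepath);
    System.IO.File.Copy(sourcePath, filepath, true);

    if (unchanged)
    {
        // Undo(path) removes the pending edit for just this item,
        // as described in the answer above.
        workspace.Undo(filepath);
    }
}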
If I understand the question well, and you actually need to undo through C# code-behind, I believe this should help you:
Undo checkout TFS

SQL Server CE 4.0 - DbUpdate Exception was unhandled

I have been struggling with SQL Server CE 4.0 almost all day. I get this error when I try to save a new record (entity) to my database.
First a little background: I installed SQL Server CE 4.0 via NuGet and created a database for my project. Then I created the Entity Framework model layer and started working with that.
When I get this error, the inner exception says:
The column cannot be modified. [ Column name = Id ]
I did a little research and found out that this may be caused by my settings for the Id property, so I changed them like this:
I already have 17 records that I inserted manually for testing purposes, but I doubt that causes any problem.
So what I am trying to do is this:
public override void Save()
{
    using (RalBaseEntities ctx = new RalBaseEntities())
    {
        System.Data.Entity.DbSet<MainInfo> mainInfoEntity = ctx.Set<MainInfo>();

        MainInfo entity = new MainInfo();
        entity.Manager = txtManager.Text;
        entity.Broker = txtBroker.Text;

        mainInfoEntity.Add(entity);
        ctx.SaveChanges();
    }
}
So when I try to execute the Save() method I get the error above. I read that in previous versions one had to create the Id manually, but that this was fixed in version 4.0, and judging by the settings shown in the image I don't see why I shouldn't simply get a new record with an auto-generated unique Id.
The word Update in the DbUpdateException I get worries me a bit. Maybe I'm trying to save the data the wrong way, but I have spent a lot of time googling and this seems to be the right way.
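In case it helps anyone with the same error: when the Id column is supposed to be generated by SQL Server CE, the Entity Framework side also has to know the value is store-generated, otherwise EF tries to write the column and the update fails. With a designer-generated model that means setting StoreGeneratedPattern to Identity on the Id property; in a code-first style model the equivalent would look roughly like the sketch below (illustrative only, not the asker's actual model):

using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class MainInfo
{
    // The database assigns this value; EF should never try to insert or update it.
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int Id { get; set; }

    public string Manager { get; set; }
    public string Broker { get; set; }
}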

DataGridView to MS Access Database

I'm trying to update a database after changes made in a DataGridView.
After changing the code several times, I've come to this one-liner:
primeTableAdapter.Update((PrimeDataSet.PrimeDataTable)(primeDataSet.Tables["Prime"]));
The method returns the correct number of rows added/edited, and when I enter the form again without closing the application everything seems fine: the new rows are kept in the DataGridView. However, the Access file is not changed at all, so when I relaunch the application they disappear.
I've bound the DataGridView to the table using the "Add New Data Source" wizard, and I'm using Visual Studio 2010.
I've also found the code at Inserting data from a DataGridView to a database, and it seems similar to what I have to do, but I wasn't able to translate the VB.NET code into something that would compile. However, I'm fairly sure it is really, really close to what I have to do.
Later edit: I had added the database to the project. By setting its Copy to Output Directory property to Copy if Newer, the changes became persistent between sessions.
If you're willing to pay, you can use a third-party library (Spire.DataExport); it makes things easier. To save a DataGridView into an Access database:

private void btnExportToAccess_Click(object sender, EventArgs e)
{
    Spire.DataExport.Access.AccessExport accessExport = new Spire.DataExport.Access.AccessExport();
    accessExport.DataSource = Spire.DataExport.Common.ExportSource.DataTable;
    accessExport.DataTable = this.dataGridView1.DataSource as DataTable;
    accessExport.DatabaseName = @"..\..\ToMdb.mdb";
    accessExport.TableName = "ExportFromDatatable";
    accessExport.SaveToFile();
}
And here is a link where you can find more clarification.

Use all backup sets to restore database with SMO

My problem is really simple.
I have a .bak file that contains one or more backup sets.
When I use SMO to restore the database from this .bak file, it only uses the first backup set to do its work. It seems to ignore the remaining sets.
Why is that?
See my code:
// Set the restore configuration
Restore restore = new Restore()
{
    Action = RestoreActionType.Database,
    Database = _databaseToRestore.DatabaseName,
    ReplaceDatabase = true
};
restore.Devices.Add(new BackupDeviceItem(_backupFilePath, DeviceType.File));

Server server = new Server(_databaseToRestore.ServerName);
DataTable fileList = restore.ReadFileList(server);

string serverDataFolder = server.Settings.DefaultFile;
if (string.IsNullOrEmpty(serverDataFolder))
    serverDataFolder = server.Information.MasterDBPath;

foreach (DataRow file in fileList.Rows)
{
    restore.RelocateFiles.Add(
        new RelocateFile((string)file["LogicalName"],
            Path.Combine(serverDataFolder, _databaseToRestore.DatabaseName + Path.GetExtension((string)file["PhysicalName"]))));
}

// Get exclusive access to the database
server.KillAllProcesses(_databaseToRestore.DatabaseName);
restore.Wait();
restore.SqlRestore(server);
I thought the BackupDeviceItem could give me feedback on how many backup sets it contains, so that I could warn the user, but it doesn't.
Does anyone have a clue about this?
Thanks for your time.
OK, I fixed my problem.
The important property is FileNumber on the Restore object.
Its default value is 1, which is why it always restored my first backup set.
I just had to set this property to the number of backup sets in the file, and now it restores the most recent backup.
Note: no differential backups are involved in this scenario.
I also just found out that I can easily determine how many backup sets the file contains:
DataTable backupSets = restore.ReadBackupHeader(server);
Now a simple backupSets.Rows.Count can help me warn the user.
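Putting the two together, a minimal sketch based on the Restore object set up above: count the backup sets with ReadBackupHeader, then point FileNumber at the last (most recent) one before calling SqlRestore.

// Count the backup sets in the .bak file, then restore the most recent one.
// FileNumber defaults to 1, which is the oldest set in the file.
DataTable backupSets = restore.ReadBackupHeader(server);
restore.FileNumber = backupSets.Rows.Count;
restore.SqlRestore(server);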
