Comparing tables from different databases - C#

I am using C# and ASP.NET.
I have two tables from two different databases. Both have one field in common that I am interested in, say CustId. I have loaded both into DataTables. What I would like to do is compare the two DataTables (maybe with a DataView, which I have never used before) to see which CustIds are missing from either table, so I can insert or delete rows in the first table. I want the first table to end up with the same CustIds as the second table.
Table 1   Table 2
1         1
2                   <- want to delete from Table 1
3         3
          4         <- want to add to Table 1
Any help would be appreciated.

As I didn't want to display the data, just manipulate it, what I did in the end was:
string[] serverCustomers = new string[serverCustomerTable.Rows.Count];
for (int i = 0; i < serverCustomerTable.Rows.Count; i++)
{
    // Collect every CustId from the server table.
    serverCustomers[i] = serverCustomerTable.Rows[i]["CustId"].ToString();
}
// Check through the local table.
for (int i = 0; i < localCustomerTable.Rows.Count; i++)
{
    bool recordExists = false;
    // Check through the array to see if the CustId is there.
    for (int j = 0; j < serverCustomers.Length; j++)
    {
        if (serverCustomers[j] == localCustomerTable.Rows[i]["CustId"].ToString())
        {
            recordExists = true;
            break;
        }
    }
    if (!recordExists)
    {
        // Delete the record from the local table;
        // if it exists, update it instead.
    }
}
Thanks for all your help
Rachael
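For the record, the nested scan above can also be expressed with set operations, which avoids the quadratic comparison. A minimal sketch (the ids are placeholder data, not taken from the original tables) that computes which CustIds to add and which to delete using HashSet<string>:

```csharp
using System;
using System.Collections.Generic;

class CustIdDiff
{
    static void Main()
    {
        // Placeholder id sets standing in for the two tables' CustId columns.
        var serverIds = new HashSet<string> { "1", "3", "4" };
        var localIds  = new HashSet<string> { "1", "2", "3" };

        // Ids present on the server but missing locally -> insert into table 1.
        var toAdd = new HashSet<string>(serverIds);
        toAdd.ExceptWith(localIds);

        // Ids present locally but not on the server -> delete from table 1.
        var toDelete = new HashSet<string>(localIds);
        toDelete.ExceptWith(serverIds);

        Console.WriteLine("Add: " + string.Join(", ", toAdd));       // 4
        Console.WriteLine("Delete: " + string.Join(", ", toDelete)); // 2
    }
}
```

In the real code the sets would be filled from `serverCustomerTable.Rows` and `localCustomerTable.Rows` instead of literals.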

Related

How to insert a datatable into an existing Database and overwrite duplicates?

I currently have a data table which adds new rows based on the Access table rows. In the data table, I change the value of certain columns and update them back to the database. I want to write code that overwrites the data if the primary key already exists. I already looked into a query like INSERT INTO table (id, name, age) VALUES (1, "A", 19) ON DUPLICATE KEY UPDATE name="A", age=19, but I can't find a way to implement this since I'm using a data adapter.
I have made the following code:
for (int i = 1; i < 19; i++)
{
    Normen.Clear();
    Normen.Add(-1);
    for (int j = 9; j < 18; j++)
    {
        Normen.Add(Convert.ToInt32(NormDb.Rows[j].Field<double>(i)));
    }
    newRow[i] = Convert.ToInt32(GlobalMethods.LeftSegmentIndex(Normen, Convert.ToInt32(newRow[i + 18])));
}
schoolDb.Rows.Add(newRow);
Normen is, in this case, a list of integers I use to calculate test scores, which are then inserted into the empty columns I fetched from the database table.
When this code runs, I have a DataTable whose rows and columns are identical to the database table's. When it is done, another method is triggered to update the data into the DataTable like this:
using (var dbcommand = new OleDbCommand(ZoekQuery, connection))
{
    using (var SchoolAdapter = new OleDbDataAdapter(ZoekQuery, connection))
    {
        OleDbCommandBuilder cb = new OleDbCommandBuilder(SchoolAdapter);
        cb.GetInsertCommand();
        SchoolAdapter.FillSchema(schoolDb, SchemaType.Source);
        SchoolAdapter.Fill(schoolDb);
        Normeringen.Normeer(exportDb.Rows.Count, schoolDb, exportDb, NormDb, normcode);
        SchoolAdapter.Update(schoolDb);
        MessageBox.Show(Success);
    }
}
But when the primary key already exists, I want to overwrite the record, because in this use case it is possible to grade a test with two different values. Is there any way I can implement this?
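One common workaround (ON DUPLICATE KEY UPDATE is MySQL syntax; Access/Jet SQL has no equivalent): try an UPDATE first and INSERT only when no row was affected. A hedged sketch, with table and column names invented for illustration:

```csharp
using System.Data.OleDb;

static void Upsert(OleDbConnection connection, int id, string name, int age)
{
    // Try to update an existing row first.
    using (var update = new OleDbCommand(
        "UPDATE Students SET [Name] = ?, [Age] = ? WHERE [Id] = ?", connection))
    {
        // OleDb parameters are positional, so the order must match the ?s.
        update.Parameters.AddWithValue("@Name", name);
        update.Parameters.AddWithValue("@Age", age);
        update.Parameters.AddWithValue("@Id", id);

        // ExecuteNonQuery returns the number of rows affected;
        // zero means the primary key does not exist yet.
        if (update.ExecuteNonQuery() == 0)
        {
            using (var insert = new OleDbCommand(
                "INSERT INTO Students ([Id], [Name], [Age]) VALUES (?, ?, ?)", connection))
            {
                insert.Parameters.AddWithValue("@Id", id);
                insert.Parameters.AddWithValue("@Name", name);
                insert.Parameters.AddWithValue("@Age", age);
                insert.ExecuteNonQuery();
            }
        }
    }
}
```

This sidesteps the command builder entirely for the upsert path; the adapter can still be used for plain reads.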

Unreliable primary key information returned

I'm writing an upgrade program for moving Access 2.0 .mdb files to Access 2003 .mdb. We are staying with the .mdb file structure for a couple of reasons: this code is deployed at several customer locations, and the .mdb format lets us reuse existing code.
The main problem:
I go through JET to read the tables from the file.mdb, reading each table into a C# DataTable. I then do several checks and duplicate the table in the 2003 .mdb. I use the DataTable.PrimaryKey property to gather the primary key columns, but I don't get reliable results.
Access 2.0 shows a primary key (single column) on several tables where the DataTable does not, but not always.
I have verified that I do get the primary key(s) on some tables, just not all.
DataColumn[] dcPrimaryKeyCols = OrgTbl.PrimaryKey;
// Read the Ordinal so we can order the columns correctly.
for (int m = 0; m < NumCols; m++)
{
    ColumnOrder[m] = OrgTbl.Columns[m].Ordinal;
    if (ColumnOrder[m] != m)
        MessageBox.Show("In table " + nm + " out of order ordinal on column: " + OrgTbl.Columns[m].ColumnName);
}
lblStatus.Text = "Creating Table";
pbTableProgress.Value = 0;
pbTableProgress.Maximum = NumCols;
for (int col = 0; col < NumCols; col++)
{
    pbTableProgress.Value = col;
    Application.DoEvents();
    sColNm = OrgTbl.Columns[col].ColumnName.Trim();
    bPrimaryKey = false;
    // Determine if this column is part of a primary key group.
    for (int k = 0; k < dcPrimaryKeyCols.Length; k++)
    {
        if (dcPrimaryKeyCols[k].ColumnName.Trim().Equals(sColNm))
        {
            bPrimaryKey = true;
            break;
        }
    }
I have set a breakpoint on the bPrimaryKey = true line, and it is hit sometimes, but not for all the tables where a primary key is defined.
One thing I noted: in Access 2.0, the column information for some of the primary keys shows required = no, unique = no. I don't know whether this causes JET or the C# DataTable to unmark a primary key column, or whether something else is at work. But the end result is that I am not able to correctly detect ALL primary key columns.
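One thing worth trying (a sketch, not tested against Access 2.0 files): ask JET for the key columns directly via the OLE DB schema rowset, instead of relying on what the adapter infers into DataTable.PrimaryKey. The table name parameter is a placeholder:

```csharp
using System.Data;
using System.Data.OleDb;

static string[] GetPrimaryKeyColumns(OleDbConnection connection, string tableName)
{
    // Ask JET/OLE DB for the PRIMARY_KEYS schema rowset of one table.
    // The restrictions array is { TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME }.
    DataTable schema = connection.GetOleDbSchemaTable(
        OleDbSchemaGuid.Primary_Keys,
        new object[] { null, null, tableName });

    var cols = new string[schema.Rows.Count];
    for (int i = 0; i < schema.Rows.Count; i++)
        cols[i] = schema.Rows[i]["COLUMN_NAME"].ToString();
    return cols;
}
```

Separately, setting `adapter.MissingSchemaAction = MissingSchemaAction.AddWithKey` before calling Fill makes the adapter retrieve key information along with the data, which may make DataTable.PrimaryKey more complete; whether it fixes the Access 2.0 cases would need testing.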

C# Comparing values in 2 DataTables

I'm processing 2 DataTables:
SSFE: Contains the values I want to find
FFE: Is larger than, smaller than, or the same size as SSFE, but does not necessarily contain every value of SSFE
The values I need to match between these tables are integers; both tables are sorted from small to large.
My idea was to start with the first item in FFE, loop through SSFE, and when I find a match: remember the current index, save the match, select the next item from FFE, and continue from the previous index.
Also, FFE can contain integers but can also contain strings; that is why I cast the values to strings and compare those.
I made some code, but it takes too much time: about a minute to compare SSFE (1,000 items) to FFE (127,000 items).
int whereami = 0;
bool firstiteration = true;
for (int i = 0; i < FFEData.Rows.Count; i++)
{
    for (int j = 0; j < SSFEData.Rows.Count; j++)
    {
        if (firstiteration)
        {
            j = whereami;
            firstiteration = false;
        }
        if (SSFEData.Rows[j][0].ToString() == FFEData.Rows[i][0].ToString())
        {
            found++;
            whereami = j;
            firstiteration = true;
            break;
        }
    }
}
I'm only storing how many occurrences I found, for testing. In this example it finds 490 matches, not that this is relevant.
Any suggestions would be great!
You could try the DataRelation class. It creates a foreign-key-like join between two DataTables in a DataSet.
using System.Data;

public int GetMatches(DataTable table1, DataTable table2)
{
    // Wrap the tables in a DataSet.
    DataSet set = new DataSet();
    set.Tables.Add(table1);
    set.Tables.Add(table2);

    // Creates a foreign-key-like join between the two tables.
    // Table1 will be the parent; Table2 will be the child.
    DataRelation relation = new DataRelation("IdJoin", table1.Columns[0], table2.Columns[0], false);

    // Have the DataSet perform the join.
    set.Relations.Add(relation);

    int found = 0;
    for (int i = 0; i < table1.Rows.Count; i++)
    {
        // If any rows in Table2 have the same id as the current row in Table1...
        if (table1.Rows[i].GetChildRows(relation).Length > 0)
        {
            found++;
            // For debugging, proof of the match: get the ids that matched.
            string id1 = table1.Rows[i][0].ToString();
            string id2 = table1.Rows[i].GetChildRows(relation)[0][0].ToString();
        }
    }
    return found;
}
I randomly populated two non-indexed tables with nvarchar(2) strings, 10,000 rows each. The match took under 1 second, including the time spent populating the tables, and found between 3,500 and 4,000 matches per run on average.
However, the major caveat is that the DataColumns being matched must be of the same data type. So if both columns are strings, or at least integers stored as strings, this will work.
But if one column is an integer, you will have to add a new column and store the integers as strings in that column first. The string conversions will add a hefty amount of time.
Another option is uploading the tables to a database and running a query there. That large an upload will probably take a few seconds, but the query will run in under a second as well, so still better than 60+ seconds.
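Since the values are compared as strings anyway, a plain hash lookup also brings the search down to linear time without a DataRelation. A minimal sketch with placeholder arrays standing in for the two DataTables' first columns:

```csharp
using System;
using System.Collections.Generic;

class MatchCount
{
    static void Main()
    {
        // Placeholder data standing in for SSFE (values to find) and FFE.
        string[] ssfe = { "2", "5", "9" };
        string[] ffe  = { "1", "2", "3", "5", "5", "8" };

        // Build a set of SSFE values once: O(n).
        var wanted = new HashSet<string>(ssfe);

        // One pass over FFE: O(m) lookups instead of O(n*m) comparisons.
        int found = 0;
        foreach (string value in ffe)
            if (wanted.Contains(value))
                found++;

        Console.WriteLine(found); // counts "2", "5", "5" -> prints 3
    }
}
```

With the real tables, the set would be filled from `SSFEData.Rows[j][0].ToString()` and the pass would run over `FFEData.Rows`; 1,000 + 127,000 operations instead of up to 127 million.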

Column added to .net DataTable does not get added to database Table

I'm new to this site, and some time ago I started programming an application.
Right now I'm stuck on a problem and have no idea how to solve it.
My problem: I want to register every user in a new column, because I have to save a large amount of data for each of them (up to 1000 rows). So I add a new column to the already existing table. But when I then try to retrieve this data, I get an error saying there is no column with that name.
programDataSet.User.Columns.Add(textBoxUser.Text, typeof(Int32));
for (int i = 0; i < 4 && i < alchemieDataSet.User.Count; i++)
{
    programDataSet.User[i][textBoxUser.Text] = "1";
}
for (int i = 0; i < programDataSet.User.Count; i++)
{
    alchemieDataSet.User[i][textBoxUser.Text] = "0";
}
this.Validate();
this.userBindingSource.EndEdit();
this.tableAdapterManager.UpdateAll(this.programDataSet);
MessageBox.Show("Registration completed");
Does anyone have an idea what I did wrong?
I agree with everyone that there are big red flags in this data architecture (a column per user does not scale; a row per user would be the usual design).
But to your specific question: DataSets and adapters do not change table columns in a target database. They only change rows.
You have to add the column to the database separately.
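A hedged sketch of what "add the column separately" might look like with OleDb against an Access table (the table name and column type are placeholders); this would run before filling the new column and calling UpdateAll:

```csharp
using System.Data.OleDb;

static void AddUserColumn(OleDbConnection connection, string columnName)
{
    // DDL must be issued directly; a DataAdapter will not create columns.
    // Note: in the original code the column name comes from user input
    // (textBoxUser.Text), so it must be validated before being put into DDL.
    string sql = "ALTER TABLE [User] ADD COLUMN [" + columnName + "] INTEGER";
    using (var command = new OleDbCommand(sql, connection))
    {
        command.ExecuteNonQuery();
    }
}
```

After this, refilling the DataTable (or calling FillSchema again) picks up the new column so the subsequent update no longer fails.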

Autodetecting Oracle data types

I'm trying to fetch data from multiple tables in an Oracle db and insert it into a SQL Server db. The problem I am running into is that I am fetching almost 50 columns of data, all of different data types. I then insert these individual column values into a SQL statement, which inserts the data into the SQL Server db. So the algorithm looks something like this:
Fetch row data:
    create a variable for each individual column value (int value = reader.GetInt32(0);)
    add a SqlParameter for it (command.Parameters.Add(new SqlParameter("value", value));)
    once all 50 or so variables have been created, build a SQL statement:
    INSERT INTO asdf VALUES (value, ...)
Doing it this way for a table with fewer than 10 columns seems OK, but beyond that the process becomes tedious. I was wondering if there is a simpler way: fetch the row data, automatically determine each column's data type, and automatically build the variables and the insert statement. I would appreciate it if anyone could point me in the right direction.
The data reader has a type-neutral GetValue method returning an object, and the command's Parameters collection has an AddWithValue method that does not require you to specify a parameter type.
for (int i = 0; i < reader.VisibleFieldCount; i++)
{
    object value = reader.GetValue(i);
    command.Parameters.AddWithValue("@" + i, value);
}
You could also build the SQL command automatically:
var columns = new StringBuilder();
var values = new StringBuilder();
for (int i = 0; i < reader.VisibleFieldCount; i++)
{
    values.Append("@").Append(i).Append(", ");
    columns.Append("[").Append(reader.GetName(i)).Append("], ");
}
values.Length -= 2; // Remove the trailing ", "
columns.Length -= 2;
string insert = String.Format("INSERT INTO myTable ({0}) VALUES ({1})",
    columns.ToString(), values.ToString());
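When moving many rows between servers it may also be worth knowing that SqlBulkCopy can consume a data reader directly, which skips per-row INSERT statements entirely. A sketch, assuming the source reader's column order matches the destination table (the table name is a placeholder):

```csharp
using System.Data;
using System.Data.SqlClient;

static void CopyRows(IDataReader oracleReader, SqlConnection sqlConnection)
{
    using (var bulkCopy = new SqlBulkCopy(sqlConnection))
    {
        // Destination table on the SQL Server side.
        bulkCopy.DestinationTableName = "myTable";

        // Streams every row from the source reader into SQL Server;
        // data types are mapped per column, not per variable.
        bulkCopy.WriteToServer(oracleReader);
    }
}
```

If the column orders differ, `bulkCopy.ColumnMappings` can map source names to destination names explicitly.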
