I am new to JSON and SqlBulkCopy. I have JSON-formatted POST data that I want to bulk copy/insert into Microsoft SQL Server using C#.
JSON Format:
{
"URLs": [{
"url_name": "Google",
"url_address": "http://www.google.com/"
},
{
"url_name": "Yahoo",
"url_address": "http://www.yahoo.com/"
},
{
"url_name": "FB",
"url_address": "http://www.fb.com/"
},
{
"url_name": "MegaSearches",
"url_address": "http://www.megasearches.com/"
}]
}
Classes:
public class UrlData
{
public List<Url> URLs {get;set;}
}
public class Url
{
public string url_address {get;set;}
public string url_name {get;set;}
}
How can I do that efficiently?
TL;DR If you have your data already represented as a DataTable, you can insert it into the destination table on the server with SqlBulkCopy:
string csDestination = "put here a connection string to the database";
using (SqlConnection connection = new SqlConnection(csDestination))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    connection.Open();
    bulkCopy.DestinationTableName = "TUrls";
    bulkCopy.WriteToServer(dataTableOfUrls);
}
If you want to load just "from 10 to 50 urls" there's no need to use SqlBulkCopy - its general purpose is to eliminate thousands of separate inserts.
So, inserting without SqlBulkCopy [and without EntityFramework] can be done one by one:
string insertQuery = "insert into TUrls(address, name) values(@address, @name)";
foreach (Url url in listOfUrls)
{
    SqlCommand cmd = new SqlCommand(insertQuery);
    cmd.Parameters.AddWithValue("@name", url.url_name);
    cmd.Parameters.AddWithValue("@address", url.url_address);
    // Remember to take care of the connection! I omit this part for clarity
    cmd.ExecuteNonQuery();
}
To insert data with SqlBulkCopy you need to convert your data (e.g. a list of custom class objects) to a DataTable. Below is a quote from Marc Gravell's answer as an example of a generic solution for such conversion:
Here's a nice 2013 update using
FastMember from NuGet:
IEnumerable<SomeType> data = ...
DataTable table = new DataTable();
using(var reader = ObjectReader.Create(data)) {
table.Load(reader);
}
Yes, this is pretty much the exact opposite of this one;
reflection would suffice - or if you need quicker,
HyperDescriptor in 2.0, or maybe Expression in 3.5. Actually,
HyperDescriptor should be more than adequate.
For example:
// remove "this" if not on C# 3.0 / .NET 3.5
public static DataTable ToDataTable<T>(this IList<T> data)
{
PropertyDescriptorCollection props =
TypeDescriptor.GetProperties(typeof(T));
DataTable table = new DataTable();
for(int i = 0 ; i < props.Count ; i++)
{
PropertyDescriptor prop = props[i];
table.Columns.Add(prop.Name, prop.PropertyType);
}
object[] values = new object[props.Count];
foreach (T item in data)
{
for (int i = 0; i < values.Length; i++)
{
values[i] = props[i].GetValue(item);
}
table.Rows.Add(values);
}
return table;
}
Now, having your data represented as a DataTable, you're ready to write it to the destination table on the server:
string csDestination = "put here a connection string to the database";
using (SqlConnection connection = new SqlConnection(csDestination))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
connection.Open();
bulkCopy.DestinationTableName = "TUrls";
bulkCopy.WriteToServer(dataTableOfUrls);
}
Hope it helps.
UPDATE:
Answer to @pseudonym27's question: "Hello can I use BulkCopy class to append data to existing table in SQL database?"
Yes, you can - SqlBulkCopy works just like an insert command in that it appends data.
Also, consider using an intermediate (staging) table when there's a high probability of the operation going wrong (e.g. a long insert time and connection issues) and you want to keep the destination table busy/locked for as little time as possible. Another use case for an intermediate table is, of course, the need to do some data transformations before the insert.
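To make the intermediate-table idea concrete, here is a hedged sketch; the TUrls_Staging table name and the two-column schema are assumptions invented for this example:

using System.Data;
using System.Data.SqlClient;

public static void BulkCopyViaStaging(string connectionString, DataTable dataTableOfUrls)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();

        // 1) Bulk copy into the staging table; the destination stays untouched.
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "TUrls_Staging";
            bulkCopy.WriteToServer(dataTableOfUrls);
        }

        // 2) Move the rows in one set-based statement, so the destination
        //    table is busy/locked for as little time as possible.
        using (SqlCommand cmd = new SqlCommand(
            "insert into TUrls(address, name) select address, name from TUrls_Staging; " +
            "truncate table TUrls_Staging;", connection))
        {
            cmd.ExecuteNonQuery();
        }
    }
}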
Using the code below, you can convert a List<YourClass> to a DataTable:
List<YourClass> objlist = alldata;
string json = Newtonsoft.Json.JsonConvert.SerializeObject(objlist);
DataTable dt = Newtonsoft.Json.JsonConvert.DeserializeObject<DataTable>(json);
SaveDataInTables(dt, "Table_Name_Of_SQL");
Here I'm assuming that alldata contains a List<YourClass> object (you can also add items with objlist.Add(objYourClass)). Then pass the SQL table name and the data table to the SaveDataInTables method, which will insert all the data into the SQL table.
public void SaveDataInTables(DataTable dataTable, string tablename)
{
if (dataTable.Rows.Count > 0)
{
using (SqlConnection con = new SqlConnection("Your_ConnectionString"))
{
using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(con))
{
sqlBulkCopy.DestinationTableName = tablename;
con.Open();
sqlBulkCopy.WriteToServer(dataTable);
con.Close();
}
}
}
}
Hope this code helps!
You should use table-valued parameters if you are on a version newer than SQL Server 2005. You can find an example here.
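For reference, a minimal hedged sketch of the table-valued-parameter approach; the dbo.UrlTableType type and the usp_InsertUrls procedure are illustrative names, not something from the original answer:

using System.Data;
using System.Data.SqlClient;

// Assumed to exist on the server (illustrative definitions):
//   CREATE TYPE dbo.UrlTableType AS TABLE (url_name nvarchar(255), url_address nvarchar(255));
//   CREATE PROCEDURE usp_InsertUrls @urls dbo.UrlTableType READONLY
//   AS INSERT INTO TUrls(name, address) SELECT url_name, url_address FROM @urls;

public static void InsertWithTvp(string connectionString, DataTable urls)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("usp_InsertUrls", connection))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        SqlParameter p = cmd.Parameters.AddWithValue("@urls", urls);
        p.SqlDbType = SqlDbType.Structured;   // marks the parameter as a TVP
        p.TypeName = "dbo.UrlTableType";      // must match the server-side type
        connection.Open();
        cmd.ExecuteNonQuery();
    }
}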
If it's only 10-50 urls, being inserted infrequently, you can just fire off insert statements. It's simple, less hassle, and you can use something easy and quick like Dapper, as in the sketch below.
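As a hedged illustration of the Dapper route, reusing the Url class from the question (the TUrls schema is an assumption carried over from the earlier answer):

using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper; // NuGet package: Dapper

public static void InsertUrlsWithDapper(string connectionString, List<Url> urls)
{
    using (var connection = new SqlConnection(connectionString))
    {
        // Dapper executes the statement once per element, binding @url_name
        // and @url_address from the matching properties of each Url object.
        connection.Execute(
            "insert into TUrls(name, address) values (@url_name, @url_address)",
            urls);
    }
}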
Otherwise, if you want the bulk copy, you will need to create and fill an ADO.NET DataTable from your JSON first - preferably matching the schema of your destination SQL table. It's your choice.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
bulkCopy.DestinationTableName = "dbo.LogData";
try
{
// Write from the source to the destination.
connection.Open();
bulkCopy.WriteToServer(dataTable1);
connection.Close();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
}
I am trying to write a method that simply accepts a string containing an SQL command and runs it against the predefined database/server ... This is what I have so far:
public class TSqlConnector : IDbConnector
{
private readonly string _connectionString;
public TSqlConnector(string conn)
{
_connectionString = conn;
}
public IEnumerable<object[]> ExecuteCommand(string query)
{
var res = new List<object[]>();
try
{
using (SqlConnection sql = new SqlConnection(_connectionString))
{
sql.Open();
SqlCommand cmd = new SqlCommand(query, sql);
var reader = cmd.ExecuteReader();
DataTable tbl = new DataTable();
while (reader.Read())
{
var dr = tbl.NewRow();
dr.ItemArray = new object[reader.FieldCount];
reader.GetValues(dr.ItemArray);
res.Add(dr.ItemArray);
}
}
return res;
}
catch (Exception ex)
{
Console.WriteLine(ex.ToString());
throw;
}
}
}
This code, however, gives me an error saying that
Input array is longer than the number of columns in this table.
I googled the error message; apparently I first have to define the DataTable's columns, using tbl.Columns.Add("ColumnName", typeof(type));
This, however, completely undermines what I was trying to do - writing a generic version. All I wanted was some kind of construct containing the information I would get from SQL Server if I typed the same command into SSMS. I don't really care what hoops I have to jump through to read the data in C#; using an object array for each row and manually casting each object into a string, int, or whatever is perfectly acceptable - even a CSV-like string would be just fine.
The only thing I don't want to do is add a definition for the table or a fixed number of rows. Each method that uses ExecuteCommand() will have to know what kind of object array is returned, and that's fine, but adding some complex data structure containing types and column names in addition to the SQL commands seems like overkill.
Is there some easier way to achieve this?
What you have is an IDataReader from your cmd.ExecuteReader().
To load the results into a DataTable, you can use the Load method as follows:
var reader = cmd.ExecuteReader();
DataTable tbl = new DataTable();
tbl.Load(reader);
// now tbl contains the corresponding columns and rows from your sql command.
// Then you can return the ItemArrays from each row;
return tbl.Rows.Cast<DataRow>().Select(row => row.ItemArray);
I've used code like this to take a generic SQL query and return the results as a DataTable. Of course, you'll have to parse out the results however you need them.
private DataTable QueryToTable(string sql, string cs)
{
var ds = new DataSet();
using (var adapter = new SqlDataAdapter(sql, cs))
{
adapter.Fill(ds);
}
return ds.Tables[0];
}
I'm trying to port some old VB6 code to C# and .NET.
There are a number of places where the old code uses a RecordSet to execute a SQL query and then loop through the results. No problem so far, but inside the loop the code makes changes to the current row, updating columns and even deleting the current row altogether.
In .NET, I can easily use a SqlDataReader to loop through SQL query results, but updates are not supported.
So I've been playing with using a SqlDataAdapter to populate a DataSet, and then loop through the rows in a DataSet table. But the DataSet doesn't seem very smart compared to the VB6's old RecordSet. For one thing, I need to provide update queries for each type of edit I have. Another concern is that a DataSet seems to hold everything in memory at once, which might be a problem if there are many results.
What is the best way to duplicate this behavior in .NET? The code below shows what I have so far. Is this the best approach, or is there another option?
using (SqlConnection connection = new SqlConnection(connectionString))
{
DataSet dataset = new DataSet();
using (SqlDataAdapter adapter = new SqlDataAdapter(new SqlCommand(query, connection)))
{
adapter.Fill(dataset);
DataTable table = dataset.Tables[0];
foreach (DataRow row in table.Rows)
{
if ((int)row["Id"] == 4)
{
if ((int)row["Value1"] > 0)
row["Value2"] = 12345;
else
row["Value3"] = 12345;
}
else if ((int)row["Id"] == 5)
{
row.Delete();
}
}
// TODO:
adapter.UpdateCommand = new SqlCommand("?", connection);
adapter.DeleteCommand = new SqlCommand("?", connection);
adapter.Update(table);
}
}
Note: I'm new to the company and can't very well tell them they have to change their connection strings or must switch to Entity Framework, which would be my choice. I'm really looking for a code-only solution.
ADO.NET DataTable and DataAdapter provide the closest equivalent of the ADO Recordset, while applying the separation of concerns principle. DataTable contains the data and provides the change tracking information (similar to EF's internal entity tracking), while DataAdapter provides a standard way to populate it from the database (the Fill method) and apply the changes back to the database (the Update method).
With that being said, what you are doing is the intended way to port an ADO Recordset to ADO.NET. The only thing you've missed is that you are not always required to specify the Insert, Update and Delete commands. As long as your query targets a single table (which I think was a requirement for an updateable Recordset anyway), you can use another ADO.NET player called DbCommandBuilder:
Automatically generates single-table commands used to reconcile changes made to a DataSet with the associated database.
Every database provider supplies an implementation of this abstract class. The MSDN example for SqlCommandBuilder is almost identical to your sample, so all you need before calling Update is (a bit counterintuitively):
var builder = new SqlCommandBuilder(adapter);
and that's it.
Behind the scenes,
The DbCommandBuilder registers itself as a listener for RowUpdating events that are generated by the DbDataAdapter specified in this property.
and dynamically generates the commands if they are not specifically set in the data adapter by you.
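Put together with the question's code, the flow looks roughly like this (a hedged sketch; the single-table query and the Id/Value columns follow the question's example):

using System.Data;
using System.Data.SqlClient;

public static void UpdateWithCommandBuilder(string connectionString, string query)
{
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter(query, connection))
    using (var builder = new SqlCommandBuilder(adapter)) // listens to the adapter
    {
        var table = new DataTable();
        adapter.Fill(table);

        foreach (DataRow row in table.Rows)
        {
            if ((int)row["Id"] == 5)
                row.Delete(); // tracked by the DataTable as a pending delete
        }

        // The builder generates UpdateCommand/DeleteCommand on the fly,
        // as long as the query targets a single table with a primary key.
        adapter.Update(table);
    }
}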
I came up with an (untested) solution for a data table.
It does require you to do some work, but it should generate update and delete commands for each row you change or delete automatically, by hooking into the RowChanged and RowDeleted events of the DataTable.
Each row will get its own command, equivalent to the ADODB.RecordSet update / delete methods.
However, unlike the ADODB.RecordSet methods, this class will not change the underlying database, but only create the SqlCommands to do it. Of course, you can change it to simply execute them once they are created, but as I said, I didn't test it, so I'll leave that up to you if you want to do it. However, please note I'm not sure how the RowChanged event will behave for multiple changes to the same row. Worst case, it will be fired for each change in the row.
The class constructor takes three arguments:
The instance of the DataTable class you are working with.
A Dictionary<string, SqlDbType> that provides mapping between column names and SqlDataTypes
An optional string to represent table name. If omitted, the TableName property of the DataTable will be used.
Once you have the mapping dictionary, all you have to do is instantiate the CommandGenerator class and iterate the rows in the data table just like in the question. From that point forward everything is automated.
Once you have completed your iteration, all you have to do is get the SQL commands from the Commands property and run them (see the usage sketch after the class).
public class CommandGenerator
{
private Dictionary<string, SqlDbType> _columnToDbType;
private string _tableName;
private List<SqlCommand> _commands;
public CommandGenerator(DataTable table, Dictionary<string, SqlDbType> columnToDbType, string tableName = null)
{
_commands = new List<SqlCommand>();
_columnToDbType = columnToDbType;
_tableName = string.IsNullOrEmpty(tableName) ? table.TableName : tableName;
table.RowDeleted += table_RowDeleted;
table.RowChanged += table_RowChanged;
}
public IEnumerable<SqlCommand> Commands { get { return _commands; } }
private void table_RowChanged(object sender, DataRowChangeEventArgs e)
{
_commands.Add(GenerateUpdate(e.Row));
}
private void table_RowDeleted(object sender, DataRowChangeEventArgs e)
{
_commands.Add(GenerateDelete(e.Row));
}
private SqlCommand GenerateUpdate(DataRow row)
{
var table = row.Table;
var cmd = new SqlCommand();
var sb = new StringBuilder();
sb.Append("UPDATE ").Append(_tableName).Append(" SET ");
var valueColumns = table.Columns.OfType<DataColumn>().Where(c => !table.PrimaryKey.Contains(c));
AppendColumns(cmd, sb, valueColumns, row, ", ");
sb.Append(" WHERE ");
AppendColumns(cmd, sb, table.PrimaryKey, row, " AND ");
cmd.CommandText = sb.ToString();
return cmd;
}
private SqlCommand GenerateDelete(DataRow row)
{
var table = row.Table;
var cmd = new SqlCommand();
var sb = new StringBuilder();
sb.Append("DELETE FROM ").Append(_tableName).Append(" WHERE ");
AppendColumns(cmd, sb, table.PrimaryKey, row, " AND ");
cmd.CommandText = sb.ToString();
return cmd;
}
private void AppendColumns(SqlCommand cmd, StringBuilder sb, IEnumerable<DataColumn> columns, DataRow row, string separator)
{
    // The separator keeps multi-column SET/WHERE clauses valid SQL.
    bool first = true;
    foreach (var column in columns)
    {
        if (!first) sb.Append(separator);
        first = false;
        sb.Append(column.ColumnName).Append(" = @").Append(column.ColumnName);
        cmd.Parameters.Add("@" + column.ColumnName, _columnToDbType[column.ColumnName]).Value = row[column];
    }
}
}
As I wrote, this is completely untested, but I think it should be enough to at least show the general idea.
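A hedged usage sketch for the class above; the column-to-type mapping and the table name mirror the question's example and are otherwise assumptions:

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static void ApplyChanges(DataTable table, string connectionString)
{
    // Map each column to its SqlDbType so the generated parameters are typed.
    var mapping = new Dictionary<string, SqlDbType>
    {
        { "Id", SqlDbType.Int },
        { "Value1", SqlDbType.Int },
        { "Value2", SqlDbType.Int },
        { "Value3", SqlDbType.Int },
    };

    // Subscribe before making changes so every edit/delete is captured.
    var generator = new CommandGenerator(table, mapping, "mytable");

    foreach (DataRow row in table.Rows)
    {
        if ((int)row["Id"] == 5)
            row.Delete(); // fires RowDeleted -> a DELETE command is generated
    }

    // Run the generated commands against an open connection.
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        foreach (SqlCommand cmd in generator.Commands)
        {
            cmd.Connection = connection;
            cmd.ExecuteNonQuery();
        }
    }
}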
Your constraints:
- not using Entity Framework
- "DataSet seems to hold everything in memory at once, which might be a problem if there are many results"
- a code-only solution (no external libraries)
Plus: the maximum number of rows that a DataTable can store is 16,777,216 (MSDN).
To get high performance:
//the main class to update/delete sql batches without using DataSet/DataTable.
public class SqlBatchUpdate
{
string ConnectionString { get; set; }
public SqlBatchUpdate(string connstring)
{
ConnectionString = connstring;
}
public int RunSql(string sql)
{
using (SqlConnection con = new SqlConnection(ConnectionString))
using (SqlCommand cmd = new SqlCommand(sql, con))
{
cmd.CommandType = CommandType.Text;
con.Open();
int rowsAffected = cmd.ExecuteNonQuery();
return rowsAffected;
}
}
}
//------------------------
// using the class to run predefined batches
public class SqlBatchUpdateDemo
{
private string connstring = "myconnstring";
//run batches in sequence
public void RunBatchesInSequence()
{
var sqlBatchUpdate = new SqlBatchUpdate(connstring);
//batch1
var sql1 = @"update mytable set value2 =1234 where id =4 and Value1>0;";
var nrows = sqlBatchUpdate.RunSql(sql1);
Console.WriteLine("batch1: {0}", nrows);
//batch2
var sql2 = @"update mytable set value3 =1234 where id =4 and Value1 =0";
nrows = sqlBatchUpdate.RunSql(sql2);
Console.WriteLine("batch2: {0}", nrows);
//batch3
var sql3 = @"delete from mytable where id =5;";
nrows = sqlBatchUpdate.RunSql(sql3);
Console.WriteLine("batch3: {0}", nrows);
}
// Alternative: you can run all batches as one
public void RunAllBatches()
{
var sqlBatchUpdate = new SqlBatchUpdate(connstring );
StringBuilder sb = new StringBuilder();
var sql1 = @"update mytable set value2 =1234 where id =4 and Value1>0;";
sb.AppendLine(sql1);
//batch2
var sql2 = @"update mytable set value3 =1234 where id =4 and Value1 =0";
sb.AppendLine(sql2);
//batch3
var sql3 = @"delete from mytable where id =5;";
sb.AppendLine(sql3);
//run all batches
var nrows = sqlBatchUpdate.RunSql(sb.ToString());
Console.WriteLine("all batches: {0}", nrows);
}
}
I simulated this solution and it works fine with high performance, because all the updates/deletes run as batches.
My goal is to copy generic tables from one database to another. I would like to have it copy the data as-is, and it would be fine to either delete whatever is in the table or to add to it with new columns if there are new columns. The only thing I may want to change is to add something for versioning, which can be done in a separate part of the query.
Opening the data is no problem, but when I try a bulk copy it fails. I have gone through several posts, and the closest thing is this one:
SqlBulkCopy Insert with Identity Column
I removed the SqlBulkCopyOptions.KeepIdentity from my code, but it still throws the
"The given ColumnMapping does not match up with any column in the source or destination" error.
I have tried playing with the SqlBulkCopyOptions but so far no luck.
Ideas?
public void BatchBulkCopy(string connectionString, DataTable dataTable, string DestinationTbl, int batchSize)
{
// Get the DataTable
DataTable dtInsertRows = dataTable;
using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString))
{
sbc.DestinationTableName = DestinationTbl;
// Number of records to be processed in one go
sbc.BatchSize = batchSize;
// Finally write to server
sbc.WriteToServer(dtInsertRows);
}
}
If I could suggest another approach, I would have a look at the SMO (SQL Server Management Objects) library to perform such tasks.
You can find an interesting article here.
Using SMO, you can perform tasks in SQL Server, such as bulk copy, treating tables, columns and databases as objects.
Some time ago, I used SMO in a small open source application I developed, named SQLServerDatabaseCopy.
To copy the data from table to table, I created this code (the complete code is here):
foreach (Table table in Tables)
{
string columnsTable = GetListOfColumnsOfTable(table);
string bulkCopyStatement = "SELECT {3} FROM [{0}].[{1}].[{2}]";
bulkCopyStatement = String.Format(bulkCopyStatement, SourceDatabase.Name, table.Schema, table.Name, columnsTable);
using (SqlCommand selectCommand = new SqlCommand(bulkCopyStatement, connection))
{
LogFileManager.WriteToLogFile(bulkCopyStatement);
SqlDataReader dataReader = selectCommand.ExecuteReader();
using (SqlConnection destinationDatabaseConnection = new SqlConnection(destDatabaseConnString))
{
if (destinationDatabaseConnection.State == System.Data.ConnectionState.Closed)
{
destinationDatabaseConnection.Open();
}
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationDatabaseConnection))
{
bulkCopy.DestinationTableName = String.Format("[{0}].[{1}]", table.Schema, table.Name);
foreach (Column column in table.Columns)
{
//it's not needed to perfom a mapping for computed columns!
if (!column.Computed)
{
bulkCopy.ColumnMappings.Add(column.Name, column.Name);
}
}
try
{
bulkCopy.WriteToServer(dataReader);
LogFileManager.WriteToLogFile(String.Format("Bulk copy successful for table [{0}].[{1}]", table.Schema, table.Name));
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
Console.WriteLine(ex.StackTrace);
}
finally
{
//closing reader
dataReader.Close();
}
}
}
}
}
As you can see, you have to add a ColumnMapping to the SqlBulkCopy object for each column, because you have to define which column of the source table must be mapped to which column of the destination table. This is the reason for your error saying: The given ColumnMapping does not match up with any column in the source or destination.
I would add some validation to this to check what columns your source and destination tables have in common.
This essentially queries the system views (I have assumed SQL Server, but it will be easily adaptable for other DBMSs) to get the column names in the destination table (excluding identity columns), iterates over these and, if there is a match in the source table, adds the column mapping.
public void BatchBulkCopy(string connectionString, DataTable dataTable, string DestinationTbl, int batchSize)
{
using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString))
{
sbc.DestinationTableName = DestinationTbl;
string sql = "SELECT name FROM sys.columns WHERE is_identity = 0 AND object_id = OBJECT_ID(@table)";
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(sql, connection))
{
command.Parameters.AddWithValue("@table", DestinationTbl);
connection.Open();
using (var reader = command.ExecuteReader())
{
while (reader.Read())
{
var column = reader.GetString(0);
if (dataTable.Columns.Contains(column))
{
sbc.ColumnMappings.Add(column, column);
}
}
}
}
// Number of records to be processed in one go
sbc.BatchSize = batchSize;
// Finally write to server
sbc.WriteToServer(dataTable);
}
}
This could still get invalid cast errors as there is no data type check, but should get you started for a generic method.
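A possible call site, assuming dataTable was already filled from the source elsewhere (the table name and batch size here are just placeholders):

// Hypothetical usage of the method above:
BatchBulkCopy(destinationConnectionString, dataTable, "dbo.TUrls", 1000);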
You can add ordinal (index-based) column mappings:
sbc.ColumnMappings.Add(0, 0);
sbc.ColumnMappings.Add(1, 1);
sbc.ColumnMappings.Add(2, 2);
sbc.ColumnMappings.Add(3, 3);
sbc.ColumnMappings.Add(4, 4);
before executing
sbc.WriteToServer(dataTable);
I'm trying to do some manual SQL queries against my SQLite database using the ExecuteStoreQuery method on my ObjectContext.
The catch is that I don't always know how many columns are in the table that I'm querying. Ideally, I would like each fetched row to simply be a string[] object.
I've looked at Example 2 here: http://msdn.microsoft.com/en-us/library/vstudio/dd487208(v=vs.100).aspx
It's close to what I want to do, except that I don't know the structure of the TElement I'm fetching, so I can't define a struct as they do in the example.
Below is some of my code (not compiling due to the ???? TElement). The code below is trying to fetch the table info, so in this case I do know the structure of the rows, but in general I don't.
Is there a way to do this with ExecuteStoreQuery? Or is there a different way of doing it, while still using the existing connection of my ObjectContext (rather than opening a new SQL connection to the DB)?
public void PrintColumnHeaders(NWRevalDatabaseEntities entities, string tableName)
{
string columnListQuery = string.Format("PRAGMA table_info({0})", tableName);
var result = entities.ExecuteStoreQuery<????>(columnListQuery);
foreach (string[] row in result)
{
string columnHeader = row[1]; // Column header is in second column of table
Console.WriteLine("Column Header: {0}", columnHeader);
}
}
I got this working based on Gert Arnold's comment. Also, it took me some effort to figure out that I needed a SQLiteConnection, not the EntityConnection that I could get directly from the ObjectContext. The answer to this question helped me with that.
The working code is below:
public static void PrintColumnHeaders(NWRevalDatabaseEntities entities, string tableName)
{
var sc = ((System.Data.EntityClient.EntityConnection)entities.Connection).StoreConnection;
System.Data.SQLite.SQLiteConnection sqliteConnection = (System.Data.SQLite.SQLiteConnection)sc;
sqliteConnection.Open();
System.Data.Common.DbCommand cmd = sc.CreateCommand();
cmd.CommandType = System.Data.CommandType.Text;
cmd.CommandText = string.Format("PRAGMA table_info('{0}');", tableName);
System.Data.Common.DbDataReader reader = cmd.ExecuteReader();
if (reader.HasRows)
{
object[] values = new object[reader.FieldCount];
while (reader.Read())
{
int result = reader.GetValues(values);
string columnHeader = (string)values[1]; // table_info returns a row for each column, with the column header in the second column.
Console.WriteLine("Column Header: {0}", columnHeader);
}
}
sqliteConnection.Close();
}
I have a DataTable that is filled in from an SQL query to a local database, but I don't know how to extract data from it.
Main method (in test program):
static void Main(string[] args)
{
const string connectionString = "server=localhost\\SQLExpress;database=master;integrated Security=SSPI;";
DataTable table = new DataTable("allPrograms");
using (var conn = new SqlConnection(connectionString))
{
Console.WriteLine("connection created successfuly");
string command = "SELECT * FROM Programs";
using (var cmd = new SqlCommand(command, conn))
{
Console.WriteLine("command created successfuly");
SqlDataAdapter adapt = new SqlDataAdapter(cmd);
conn.Open();
Console.WriteLine("connection opened successfuly");
adapt.Fill(table);
conn.Close();
Console.WriteLine("connection closed successfuly");
}
}
Console.Read();
}
The command I used to create the tables in my database:
create table programs
(
progid int primary key identity(1,1),
name nvarchar(255),
description nvarchar(500),
iconFile nvarchar(255),
installScript nvarchar(255)
)
How can I extract data from the DataTable into a form meaningful to use?
The DataTable has a collection .Rows of DataRow elements.
Each DataRow corresponds to one row in your database, and contains a collection of columns.
In order to access a single value, do something like this:
foreach(DataRow row in YourDataTable.Rows)
{
string name = row["name"].ToString();
string description = row["description"].ToString();
string icoFileName = row["iconFile"].ToString();
string installScript = row["installScript"].ToString();
}
You can set the DataTable as a data source for many elements, e.g.:
- GridView
- Repeater
- DataList
- etc.
If you need to extract data from each row, you can use
table.Rows[rowIndex][columnIndex]
or, if you know the column name,
table.Rows[rowIndex][columnName]
If you need to iterate over the table, you can use either a for loop or a foreach loop, like:
for (int i = 0; i < table.Rows.Count; i++)
{
    string name = table.Rows[i]["columnname"].ToString();
}
foreach ( DataRow dr in table.Rows )
{
string name = dr["columnname"].ToString();
}
The simplest way to extract data from a DataTable when you have multiple data types (not just strings) is to use the Field<T> extension method available in the System.Data.DataSetExtensions assembly.
var id = row.Field<int>("ID"); // extract and parse int
var name = row.Field<string>("Name"); // extract string
From MSDN, the Field<T> method:
Provides strongly-typed access to each of the column values in the
DataRow.
This means that when you specify the type it will validate and unbox the object.
For example:
// iterate over the rows of the datatable
foreach (var row in table.AsEnumerable()) // AsEnumerable() returns IEnumerable<DataRow>
{
var id = row.Field<int>("ID"); // int
var name = row.Field<string>("Name"); // string
var orderValue = row.Field<decimal>("OrderValue"); // decimal
var interestRate = row.Field<double>("InterestRate"); // double
var isActive = row.Field<bool>("Active"); // bool
var orderDate = row.Field<DateTime>("OrderDate"); // DateTime
}
It also supports nullable types:
DateTime? date = row.Field<DateTime?>("DateColumn");
This can simplify extracting data from DataTable as it removes the need to explicitly convert or parse the object into the correct types.
Please consider using some code like this:
SqlDataReader reader = command.ExecuteReader();
int numRows = 0;
DataTable dt = new DataTable();
dt.Load(reader);
numRows = dt.Rows.Count;
string attended_type = "";
for (int index = 0; index < numRows; index++)
{
attended_type = dt.Rows[index]["columnname"].ToString();
}
reader.Close();
Unless you have a specific reason to do raw ADO.NET, I would have a look at using an ORM (object-relational mapper) like NHibernate or LINQ to SQL. That way you can query the database and retrieve objects to work with, which are strongly typed and easier to work with IMHO.
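For a flavor of the ORM route, here is a hedged LINQ to SQL sketch against the Programs table from the question; the ProgramEntity class and its mapping attributes are invented for illustration:

using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "programs")]
public class ProgramEntity
{
    [Column(Name = "progid", IsPrimaryKey = true, IsDbGenerated = true)]
    public int ProgId { get; set; }

    [Column(Name = "name")]
    public string Name { get; set; }

    [Column(Name = "description")]
    public string Description { get; set; }
}

// Usage: a strongly typed query instead of a DataTable.
// using (var db = new DataContext(connectionString))
// {
//     var programs = db.GetTable<ProgramEntity>()
//                      .Where(p => p.Name.StartsWith("A"))
//                      .ToList();
// }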
var table = dataSet.Tables[0]; // get the first table from the DataSet
foreach (DataRow row in table.Rows)
{
foreach (var item in row.ItemArray)
{
Console.Write("Value: " + item);
}
}
Please note that opening and closing the connection is not necessary when using a DataAdapter.
So I suggest updating this code to remove the Open and Close calls on the connection:
SqlDataAdapter adapt = new SqlDataAdapter(cmd);
conn.Open(); // this line of code is unnecessary
Console.WriteLine("connection opened successfuly");
adapt.Fill(table);
conn.Close(); // this line of code is unnecessary
Console.WriteLine("connection closed successfuly");
Reference Documentation
The code shown in this example does not explicitly open and close the
Connection. The Fill method implicitly opens the Connection that the
DataAdapter is using if it finds that the connection is not already
open. If Fill opened the connection, it also closes the connection
when Fill is finished. This can simplify your code when you deal with
a single operation such as a Fill or an Update. However, if you are
performing multiple operations that require an open connection, you
can improve the performance of your application by explicitly calling
the Open method of the Connection, performing the operations against
the data source, and then calling the Close method of the Connection.
You should try to keep connections to the data source open as briefly
as possible to free resources for use by other client applications.