I'm completely new to .NET and I'm trying to find a way to write data from C# into a database through a generic DB provider. The data comes from JSON files, which hold all the necessary information for the insertion process (table names, column names, types (as strings), and the data itself). Since the amount of data can get quite big, I'd like to avoid creating a new insert statement for every row. Is there an approach that works with all DB providers?
From the deserialized JSON file I first create a DbType array, which holds the data type of each column.
With this I tried to get a satisfactory insert approach by:
Building the INSERT statement using a StringBuilder
Building a DbCommand and supplying it with all the information
Executing the created DbCommand
using (IDbConnection connection = dbFactory.CreateConnection())
{
    foreach (TableDTO table in dbTables)
    {
        //build the insert statement
        StringBuilder insertSQLBuilder = new StringBuilder();
        insertSQLBuilder.Append("INSERT INTO " + table.Name + "(");
        foreach (ColumnDTO column in table.Columns)
        {
            insertSQLBuilder.Append(column.Name + ", ");
        }
        insertSQLBuilder.Length -= 2;
        insertSQLBuilder.Append(") VALUES (");
        for (int i = 0; i < table.Columns.Length; i++)
        {
            insertSQLBuilder.Append("#param" + i + ", "); // placeholder prefix is provider-specific
        }
        insertSQLBuilder.Length -= 2;
        insertSQLBuilder.Append(")");

        //prepare the insert command
        using (IDbCommand dbCommand = connection.CreateCommand())
        {
            dbCommand.CommandText = insertSQLBuilder.ToString();
            IDbDataParameter[] dbParameters = new IDbDataParameter[table.Columns.Length];
            for (int i = 0; i < table.Columns.Length; i++)
            {
                IDbDataParameter dbParameter = dbCommand.CreateParameter();
                dbParameter.DbType = typeArray[i]; //DbType array, which holds the type of each column
                dbParameter.ParameterName = "param" + i;
                dbParameters[i] = dbParameter;
            }
            while (dataDeserializer.MoveNext())
            {
                // get a new row from the json file; each element of columnData holds the value
                // plus extra information which is not needed here
                ColumnData[] columnData = dataDeserializer.Current;
                for (int i = 0; i < dbParameters.Length; i++)
                {
                    bool isNotGuid = typeArray[i] != DbType.Guid;
                    object value = null;
                    //TODO stupid conversion workaround
                    if (isNotGuid)
                    {
                        value = columnData[i].Value;
                    }
                    else
                    {
                        value = Guid.Parse(columnData[i].Value);
                    }
                    dbParameters[i].Value = value ?? DBNull.Value;
                    dbCommand.Parameters.Add(dbParameters[i]);
                }
                //execute the statement and close the connection
                dbCommand.Connection.Open();
                dbCommand.ExecuteNonQuery();
                dbCommand.Connection.Close();
                dbCommand.Parameters.Clear();
            }
        }
    }
}
What I'd like to do is avoid single insert calls to the database without tying myself to a specific database provider. Is there a library which supports bulk inserts for multiple providers? A perfect scenario would be a library which lets me just change the value of a parameter, and also set a maximum count of rows to be inserted at once (to keep memory usage in check).
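For illustration, a provider-agnostic middle ground (short of a true bulk-copy API, which is always provider-specific) is to batch several rows into a single multi-row VALUES list and reuse the prepared parameters for each batch. A minimal sketch, assuming the provider accepts multi-row VALUES and '@'-prefixed named parameters (both vary per provider); the DbType array is carried over from the code above:

using System.Data;
using System.Data.Common;
using System.Text;

static class BatchInsertHelper
{
    // Builds one INSERT that writes batchRows rows per round trip.
    // Reuse the returned command across batches: reassign the parameter
    // values and call ExecuteNonQuery again.
    public static DbCommand BuildBatchInsert(DbConnection connection, string tableName,
        string[] columnNames, DbType[] columnTypes, int batchRows)
    {
        var sql = new StringBuilder();
        sql.Append("INSERT INTO ").Append(tableName)
           .Append(" (").Append(string.Join(", ", columnNames)).Append(") VALUES ");

        DbCommand command = connection.CreateCommand();
        for (int r = 0; r < batchRows; r++)
        {
            sql.Append(r > 0 ? ", (" : "(");
            for (int c = 0; c < columnNames.Length; c++)
            {
                string name = "p" + r + "_" + c;
                sql.Append('@').Append(name);
                sql.Append(c < columnNames.Length - 1 ? ", " : ")");

                DbParameter parameter = command.CreateParameter();
                parameter.ParameterName = name;
                parameter.DbType = columnTypes[c];
                command.Parameters.Add(parameter);
            }
        }
        command.CommandText = sql.ToString();
        return command;
    }
}

A leftover partial batch still needs a smaller command (or a per-row fallback), and the batch size doubles as the memory cap asked about above.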
Related
I attempted to pass values from an object params array to each parameter of a stored procedure in a loop, but I get this error:
'Procedure or function Proc_GetPaging_Product has too many arguments specified.'
The problem is that cmd.Parameters is already populated after calling SqlCommandBuilder.DeriveParameters(cmd), so there is no need to add the parameters again, only to assign their values.
public ServiceResult<T> GetPaging<T>(params object[] values)
{
    var result = new ServiceResult<T>();
    using (_conn)
    {
        _conn.Open();
        using (SqlCommand cmd = _conn.CreateCommand())
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandText = "[SalesLT].[Proc_GetPaging_Product]";
            SqlCommandBuilder.DeriveParameters(cmd);
            int lengthParams = cmd.Parameters.Count;
            for (int i = 1; i < lengthParams; i++)
            {
                cmd.Parameters.AddWithValue(cmd.Parameters[i].ParameterName, values[i - 1]);
            }
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    var item = Activator.CreateInstance<T>();
                    var properties = typeof(T).GetProperties();
                    foreach (var property in properties)
                    {
                        //Type convertTo = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType;
                        property.SetValue(item, reader[property.Name], null);
                    }
                    result.ListResult.Add(item);
                }
                reader.NextResult();
            }
        }
    }
    return result;
}
This is the error I got in the debugger.
You're adding the parameters twice.
Once in this row:
SqlCommandBuilder.DeriveParameters(cmd);
and once for each parameter in the loop:
for (int i = 1; i < lengthParams; i++)
{
    cmd.Parameters.AddWithValue(cmd.Parameters[i].ParameterName, values[i - 1]);
}
Instead of using AddWithValue, simply do cmd.Parameters[i].Value = values[i - 1].
Note that for a stored procedure, DeriveParameters also adds the return value parameter at index 0 of the collection, which is why the loop starts at 1 while the values array starts at 0.
Your loop should look like this:
for (int i = 1; i < lengthParams; i++)
{
    cmd.Parameters[i].Value = values[i - 1];
}
Also, you should add a check before the loop to make sure the number of values you have matches the number of parameters the stored procedure has.
And one more note: you should consider setting the parameters in code instead of using DeriveParameters. The official documentation clearly states:
DeriveParameters requires an additional call to the database to obtain the information. If the parameter information is known in advance, it is more efficient to populate the parameters collection by setting the information explicitly.
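For example, a minimal sketch of populating the collection explicitly instead of deriving it (the parameter names and types below are assumptions, since the procedure's real signature isn't shown in the question):

using (SqlCommand cmd = _conn.CreateCommand())
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandText = "[SalesLT].[Proc_GetPaging_Product]";
    // hypothetical parameters -- replace with the procedure's actual signature
    cmd.Parameters.Add("@PageNumber", SqlDbType.Int).Value = values[0];
    cmd.Parameters.Add("@PageSize", SqlDbType.Int).Value = values[1];
    // ...then ExecuteReader as before, with no extra round trip to the database
}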
I'd like to know if it's possible to capture a query result as an array using a Script Task (C#). I found code that uses a Script Task, but it does not assign each value of the query result to a variable.
//Read list of tables with schema from database
string query = "Select * From " + TableName;
SqlCommand cmd = new SqlCommand(query, myADONETConnection);
//myADONETConnection.Open();
DataTable d_table = new DataTable();
d_table.Load(cmd.ExecuteReader());
myADONETConnection.Close();

string FileFullPath = DestinationFolder + "\\" + FileNamePart + "_" + datetime + FileExtension;
StreamWriter sw = null;
sw = new StreamWriter(FileFullPath, false);

// Write the header row to the file
int ColumnCount = d_table.Columns.Count;
for (int ic = 0; ic < ColumnCount; ic++)
{
    sw.Write(d_table.Columns[ic]);
    if (ic < ColumnCount - 1)
    {
        sw.Write(FileDelimiter);
    }
}
sw.Write(sw.NewLine);

// Write all rows to the file
foreach (DataRow dr in d_table.Rows)
{
    for (int ir = 0; ir < ColumnCount; ir++)
    {
        if (!Convert.IsDBNull(dr[ir]))
        {
            sw.Write(dr[ir].ToString());
        }
        if (ir < ColumnCount - 1)
        {
            sw.Write(FileDelimiter);
        }
    }
    sw.Write(sw.NewLine);
}
Thanks.
Based on the following comment:
Basically, I'll make a query that returns multiple columns and rows and I'd like to assign each piece of data to a variable, like a Foreach Loop Container. This code is an example.
I will assume that you are looking to store a query result inside a table, loop over the result row by row, and assign the column values to variables. In that case you should use an Execute SQL Task to execute the query and store the result set inside a variable of type System.Object, then loop over the result set (the result table) using a ForEach Loop Container with an ADO enumerator.
To get a step-by-step guide, you can refer to one of the following links:
Looping Through a Result Set with the ForEach Loop
SSIS Basics: Using the Execute SQL Task to Generate Result Sets
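If you then need to consume the stored result set from a Script Task as well, a minimal sketch of the usual pattern, written as the Main method of an SSIS Script Task, looks like this (the variable name User::ResultSet is an assumption and must match the variable your Execute SQL Task writes to):

using System.Data;
using System.Data.OleDb;

public void Main()
{
    // The Execute SQL Task stores an ADO recordset in the System.Object variable;
    // OleDbDataAdapter.Fill can copy that recordset into a DataTable.
    var table = new DataTable();
    new OleDbDataAdapter().Fill(table, Dts.Variables["User::ResultSet"].Value);

    foreach (DataRow row in table.Rows)
    {
        // each column is now addressable by name or ordinal, e.g. row[0] or row["Name"]
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}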
I am executing a stored procedure using QueryMultiple to return multiple sets of data.
var gridReader = db.QueryMultiple("sp",
    parameters,
    commandType: CommandType.StoredProcedure);
I can very easily get each set given I know the order they will come back in.
SELECT * FROM dbo.Set1;
SELECT * FROM dbo.Set2;
SELECT * FROM dbo.Set3;
var set1 = gridReader.Read<Set1>();
var set2 = gridReader.Read<Set2>();
var set3 = gridReader.Read<Set3>();
However, I am in a situation where the order they will come back in may change. Another developer could decide to change the order for whatever reason. The stored procedure now becomes this:
SELECT * FROM dbo.Set1;
SELECT * FROM dbo.Set3;
SELECT * FROM dbo.Set2;
How can I handle this?
My initial attempt was to iterate each grid, checking the column names. This seemed to work well at first, but I wasn't able to figure out how to then project the grid into a class, besides manually setting each field. The main reason I'm using Dapper is so it can do this for me.
while (true)
{
    var grid = gridReader.Read();
    IDictionary<string, object> row = grid.FirstOrDefault();
    if (row == null)
        break;

    if (row.Keys.Contains("Set1_UniqueColumnName"))
    {
        // Need something like grid.Read<Set1>();
    }
    else if (row.Keys.Contains("Set2_UniqueColumnName")) { }
    else if (row.Keys.Contains("Set3_UniqueColumnName")) { }
}
My second idea was to read each grid into a class, check the unique fields of the class for nulls/default values, and try the next class if the test failed. This obviously won't work, though: .Read() returns the next grid of results, and this solution would require me to read the same grid over and over.
Dapper provides an IDataReader.GetRowParser extension method that enables type switching per row. From the Dapper docs:
Usually you'll want to treat all rows from a given table as the same data type. However, there are some circumstances where it's useful to be able to parse different rows as different data types. This is where IDataReader.GetRowParser comes in handy.
Imagine you have a database table named "Shapes" with the columns Id, Type, and Data, and you want to parse its rows into Circle, Square, or Triangle objects based on the value of the Type column.
var shapes = new List<IShape>();
using (var reader = connection.ExecuteReader("select * from Shapes"))
{
    // Generate a row parser for each type you expect.
    // The generic type <IShape> is what the parser will return.
    // The argument (typeof(*)) is the concrete type to parse.
    var circleParser = reader.GetRowParser<IShape>(typeof(Circle));
    var squareParser = reader.GetRowParser<IShape>(typeof(Square));
    var triangleParser = reader.GetRowParser<IShape>(typeof(Triangle));

    var typeColumnIndex = reader.GetOrdinal("Type");

    while (reader.Read())
    {
        IShape shape;
        var type = (ShapeType)reader.GetInt32(typeColumnIndex);
        switch (type)
        {
            case ShapeType.Circle:
                shape = circleParser(reader);
                break;
            case ShapeType.Square:
                shape = squareParser(reader);
                break;
            case ShapeType.Triangle:
                shape = triangleParser(reader);
                break;
            default:
                throw new NotImplementedException();
        }
        shapes.Add(shape);
    }
}
You'll need to get access to the IDataReader that the GridReader wraps or change your code to use the good old-fashioned ADO.NET SqlConnection & SqlCommand objects like this...
using (var command = new SqlCommand("sp", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddRange(parameters);
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // read row columns
        }
    }
}
Davmos's answer pointed me in the right direction. I needed to use a combination of ADO.NET and Dapper: ADO.NET to retrieve and iterate through the data, but Dapper to parse the rows into my objects. Note the use of FieldCount in the outer while loop, in case a result set actually returns 0 rows; we want to move on to the next result set, not break out of the loop.
Set1 set1 = null;
var set2 = new List<Set2>();
Set3 set3 = null;

using (var command = new SqlCommand("sp", conn))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddRange(parameters);
    command.Connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.FieldCount > 0)
        {
            var set1Parser = reader.GetRowParser<Set1>();
            var set2Parser = reader.GetRowParser<Set2>();
            var set3Parser = reader.GetRowParser<Set3>();

            var isSet1 = HasColumn(reader, "Set1_UniqueColumnName");
            var isSet2 = HasColumn(reader, "Set2_UniqueColumnName");
            var isSet3 = HasColumn(reader, "Set3_UniqueColumnName");

            while (reader.Read())
            {
                if (isSet1)
                {
                    set1 = set1Parser(reader);
                }
                else if (isSet2)
                {
                    set2.Add(set2Parser(reader));
                }
                else if (isSet3)
                {
                    set3 = set3Parser(reader);
                }
            }
            reader.NextResult();
        }
    }
}
public static bool HasColumn(IDataReader reader, string columnName)
{
    for (var i = 0; i < reader.FieldCount; i++)
    {
        if (reader.GetName(i).Equals(columnName, StringComparison.InvariantCultureIgnoreCase))
        {
            return true;
        }
    }
    return false;
}
Is there a way to export a whole table with a nested schema from Google BigQuery as a CSV, using the REST API?
There is an example of doing this (https://cloud.google.com/bigquery/docs/exporting-data) with a non-nested schema, and it works fine on the non-nested columns of my table. Here is the code for that part:
PagedEnumerable<TableDataList, BigQueryRow> result2 = client.ListRows(datasetId, result.Reference.TableId);
StringBuilder sb = new StringBuilder();
foreach (var row in result2)
{
    sb.Append($"{row["visitorId"]}, {row["visitNumber"]}, {row["totals.hits"]}{Environment.NewLine}");
}
using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(sb.ToString())))
{
    var obj = gcsClient.UploadObject(bucketName, fileName, contentType, stream);
}
In BigQuery there are columns like totals.hits, totals.visits... If I try to address them, I get an error message saying that no such column exists. If I address "totals" instead, each row of my CSV contains the object name "System.Collections.Generic.Dictionary`2[System.String,System.Object]".
Is there any possibility to do something like that? In the end I want my GA table from BigQuery as a CSV somewhere else.
It is possible. Select every column you need, as in the following schema, and flatten everything that needs to be flattened.
string query = $@"
    #legacySQL
    SELECT
        visitorId,
        visitNumber,
        visitId,
        visitStartTime,
        date,
        hits.hitNumber as hitNumber,
        hits.product.productSKU as product.productSKU
    FROM
        FLATTEN(FLATTEN({tableName}, hits), hits.product)";
//Creating a job for the query and activating legacy SQL
BigQueryJob job = client.CreateQueryJob(query,
    new CreateQueryJobOptions { UseLegacySql = true });
BigQueryResults queryResult = client.GetQueryResults(job.Reference.JobId,
    new GetQueryResultsOptions());
StringBuilder sb = new StringBuilder();

//Getting the headers from the GA table and writing them into the first row of the new table
int count = 0;
for (int i = 0; i <= queryResult.Schema.Fields.Count() - 1; i++)
{
    string columnName;
    if (i + 1 >= queryResult.Schema.Fields.Count)
        columnName = queryResult.Schema.Fields[i].Name;
    else
        columnName = queryResult.Schema.Fields[i].Name + ",";
    sb.Append(columnName);
}

//Getting the data from the GA table and writing it row by row into the new table
sb.Append(Environment.NewLine);
foreach (var row in queryResult.GetRows())
{
    count++;
    if (count % 1000 == 0)
        Console.WriteLine($"item {count} finished");

    int fieldCount = queryResult.Schema.Fields.Count;
    for (int j = 0; j < fieldCount; j++)
    {
        try
        {
            if (row.RawRow.F[j] != null)
                sb.Append(row.RawRow.F[j].V + ",");
        }
        catch (Exception)
        {
            // cells that cannot be read are simply skipped
        }
    }
    sb.Append(Environment.NewLine);
}
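The assembled StringBuilder can then be uploaded the same way as in the snippet from the question, for example:

// reusing gcsClient, bucketName and fileName from the question's snippet
using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(sb.ToString())))
{
    gcsClient.UploadObject(bucketName, fileName, "text/csv", stream);
}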
I have tried to search for examples of my approach, but none of the questions I found were close enough to what I am trying to achieve.
For the TL;DR crowd, the question is: how do I make this work as it would in a plain SQL query?
Using C# WinForms with SQL Server Compact 4 and LINQ to SQL.
My scenario involves a form with all the relevant DB table columns available as filters to query, and on the text-changed event of each filter TextBox the data source of the grid view updates accordingly. Because I allow filtered search via many of these columns, I was trying to avoid some extra lines of code.
So let's say we only concentrate on 4 columns:
custID, name, email, cellPhone
each of which has its corresponding TextBox.
I am trying to build the query as follows: first I systematically collect all the TextBoxes into a list
var AllFormsSearchFiltersTBXLst = new List<TextBox>();
// code that collects all the TextBoxes on the current form
var AllFormsSearchFiltersTBXLst = [currentFormHere].Controls.OfType<TextBox>();
So now I have all of the TextBoxes as filters, regardless of whether they hold any value. Then I check which ones have some value in them: for each TextBox in this filter collection, if its text length is greater than zero, the filter is active. A second list, AllFormsACTIVESearchfiltersTBXLst, will then contain only the active filters.
What I was trying to achieve was that, in the same way I didn't have to specify each TextBox object there (I just looped through them all as a collection, without referring to each one by its ID), I now want to filter on a DbContext using only those active filters, so I won't have to check whether the current TextBox name is email and write
query = db.Where(db => db.email.Contains(TbxEmail.Text));
again and again for each of 10 to 15 columns.
What I have got so far is nothing that implements what I was heading for.
using (SqlCeConnection ClientsConn = new SqlCeConnection(ConfigurationManager.ConnectionStrings["Conn_DB_RCL_CRM2014"].ConnectionString))
{
    System.Data.Linq.Table<ContactsClients> db = null;
    // get all column names from the context
    var x = (System.Reflection.MemberInfo[])typeof(ContactsClients).GetProperties();
    using (DB_RCL_CRM2014Context Context = new DB_RCL_CRM2014Context(ClientsConn))
    {
        if (!Filtered)
            db = Context.ContactsClients; //.Where(client => client.Name.Contains("fler"));
        else
        {
            db = Context.ContactsClients;
            // the filters dictionary contains the name of each TextBox and its value;
            // I've named the TextBoxes after the columns specially, so I can match them
            // to the column names when automating
            foreach (KeyValuePair<string, string> CurFltrKVP in FiltersDict)
            {
                foreach (var memberInfo in x)
                {
                    // couldn't find out how to build the query
                }
            }
        }
        BindingSource BS_Clients = new BindingSource();
        BS_Clients.DataSource = db;
        GV_ClientInfo_Search.DataSource = BS_Clients;
    }
}
What I normally do when working with plain SQL is take each TextBox value and append it to a string as a filter:
var q = "where ";
foreach (tbx CurTBX in ALLFILTERTBX)
{
    q += CurTBX.Name + " LIKE '%" + CurTBX.Text + "%'";
    // and some checking of the last element in the list, of course
}
Then I pass this string as a filter to the main SELECT query... that simple.
How do I make it work like that plain SQL query?
I think that you're trying to get the property of db dynamically, like db.email, according to the name of the TextBox being looped over (here 'email'). However, I recommend doing it another way: switch on the property name, like email, name, etc. Something like this:
// Create a list for the results
var results = new List<YourDBResultTypeHere>();
foreach (tbx CurTBX in ALLFILTERTBX)
{
    switch (CurTBX.Name)
    {
        case "email":
            results.AddRange(db.Where(c => c.email.Contains(CurTBX.Text)).ToList());
            break;
        case "name":
            results.AddRange(db.Where(c => c.name.Contains(CurTBX.Text)).ToList());
            break;
    }
}
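If you'd rather avoid a per-column switch entirely, an alternative sketch builds the Contains predicate with expression trees, matching each active filter name to a property. This assumes every filtered column is a string property named exactly like its TextBox (FiltersDict as in the question):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

static class FilterBuilder
{
    // Appends one Where(c => c.<Key>.Contains(<Value>)) per filter;
    // LINQ to SQL translates string.Contains into LIKE '%value%'.
    public static IQueryable<T> ApplyLikeFilters<T>(
        IQueryable<T> source, IDictionary<string, string> filters)
    {
        foreach (var filter in filters)
        {
            var param = Expression.Parameter(typeof(T), "c");
            var property = Expression.Property(param, filter.Key);
            var contains = Expression.Call(
                property,
                typeof(string).GetMethod("Contains", new[] { typeof(string) }),
                Expression.Constant(filter.Value));
            source = source.Where(Expression.Lambda<Func<T, bool>>(contains, param));
        }
        return source;
    }
}

Usage would be something like FilterBuilder.ApplyLikeFilters(Context.ContactsClients, FiltersDict), which also avoids the SQL injection risk of concatenating the filter string by hand.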
Try this:
void UpdateGridViewData(bool Filtered = false, Dictionary<string, string> FiltersDict = null)
{
    using (SqlCeConnection ClientsConn = new SqlCeConnection(ConfigurationManager.ConnectionStrings["Conn_DB_RCL_CRM2014"].ConnectionString))
    {
        System.Data.Linq.Table<ContactsClients> db = null;
        IEnumerable<ContactsClients> IDB = null;
        BindingSource BS_Clients = new BindingSource();
        System.Reflection.MemberInfo[] AllDbTblClientsColumns = (System.Reflection.MemberInfo[])typeof(ContactsClients).GetProperties();
        using (DB_RCL_CRM2014Context Context = new DB_RCL_CRM2014Context(ClientsConn))
        {
            if (!Filtered)
            {
                db = Context.ContactsClients;
                BS_Clients.DataSource = db;
            }
            else
            {
                string fltr = "";
                var and = "";
                if (FiltersDict.Count > 1) and = "AND";
                for (int i = 0; i < FiltersDict.Count; i++)
                {
                    KeyValuePair<string, string> CurFltrKVP = FiltersDict.ElementAt(i);
                    if (i >= FiltersDict.Count - 1) and = "";
                    for (int j = 0; j < AllDbTblClientsColumns.Length; j++)
                    {
                        if (AllDbTblClientsColumns[j].Name.Equals(CurFltrKVP.Key))
                        {
                            fltr += string.Format("{0} Like '%{1}%' {2} ", AllDbTblClientsColumns[j].Name, CurFltrKVP.Value, and);
                        }
                    }
                }
                try
                {
                    IDB = Context.ExecuteQuery<ContactsClients>(
                        "SELECT * " +
                        "FROM ContactsCosmeticsClients " +
                        "WHERE " + fltr
                    );
                    BS_Clients.DataSource = IDB;
                }
                catch (Exception ex)
                {
                    MessageBox.Show(ex.Message);
                }
            }
            GV_ClientInfo_Search.DataSource = BS_Clients;
        }
    }
}