I'd like to know if it's possible to read a query result into an array using a Script Task (C#)? I found code that uses a Script Task, but it does not assign each value of the query result to a variable.
// Read the list of tables with schema from the database
string query = "SELECT * FROM " + TableName;
SqlCommand cmd = new SqlCommand(query, myADONETConnection);
//myADONETConnection.Open();
DataTable d_table = new DataTable();
d_table.Load(cmd.ExecuteReader());
myADONETConnection.Close();

string FileFullPath = DestinationFolder + "\\" + FileNamePart + "_" + datetime + FileExtension;

// Wrap the writer in a using block so the file is flushed and closed
using (StreamWriter sw = new StreamWriter(FileFullPath, false))
{
    // Write the header row to the file
    int ColumnCount = d_table.Columns.Count;
    for (int ic = 0; ic < ColumnCount; ic++)
    {
        sw.Write(d_table.Columns[ic].ColumnName);
        if (ic < ColumnCount - 1)
        {
            sw.Write(FileDelimiter);
        }
    }
    sw.Write(sw.NewLine);

    // Write all rows to the file
    foreach (DataRow dr in d_table.Rows)
    {
        for (int ir = 0; ir < ColumnCount; ir++)
        {
            if (!Convert.IsDBNull(dr[ir]))
            {
                sw.Write(dr[ir].ToString());
            }
            if (ir < ColumnCount - 1)
            {
                sw.Write(FileDelimiter);
            }
        }
        sw.Write(sw.NewLine);
    }
}
Thanks.
Based on the following comment:
Basically, I'll make a query that returns multiple columns and rows, and I'd like to assign each value to a variable, like a Foreach Loop container. This code is an example.
I will assume that you are looking to store the query result in a table, loop over the result row by row, and assign the column values to variables. In that case, use an Execute SQL Task to execute the query and store the result set inside a variable of type System.Object, then loop over the result set using a Foreach Loop Container with an ADO enumerator.
For a step-by-step guide, you can refer to one of the following links:
Looping Through a Result Set with the ForEach Loop
SSIS Basics: Using the Execute SQL Task to Generate Result Sets
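If you would rather stay inside a Script Task, the stored System.Object variable can also be read back into a DataTable there. A minimal sketch, assuming an Execute SQL Task has already filled a variable named User::ResultSet, and that User::Col1 and User::Col2 are hypothetical variables of your own:
// Script Task sketch: load the result-set variable into a DataTable.
// User::ResultSet, User::Col1 and User::Col2 must be listed in the task's
// ReadOnlyVariables/ReadWriteVariables; the names here are assumptions.
using System.Data;
using System.Data.OleDb;

DataTable dt = new DataTable();
OleDbDataAdapter adapter = new OleDbDataAdapter();
adapter.Fill(dt, Dts.Variables["User::ResultSet"].Value);

foreach (DataRow row in dt.Rows)
{
    // Assign the current row's column values to package variables
    Dts.Variables["User::Col1"].Value = row[0].ToString();
    Dts.Variables["User::Col2"].Value = row[1].ToString();
}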
I'm completely new to .NET and I'm trying to find a way to write data from C# to a database through a generic DB provider. The data stems from JSON files, which hold all the information needed for the insertion (table names, column names, types (as strings), and data). Since the amount of data can get quite big, I'd like to avoid creating a new INSERT statement for every row. Is there an approach that works with all DB providers?
From the deserialized JSON file I first create a DbType array, which holds the data type of each column.
With this I tried to get a satisfactory insert approach by
Building the INSERT statement using a StringBuilder
Building a DbCommand and supplying it with all the information
Executing the created DbCommand
using (IDbConnection connection = dbFactory.CreateConnection())
{
    connection.Open();
    foreach (TableDTO table in dbTables)
    {
        // Build the insert statement
        StringBuilder insertSQLBuilder = new StringBuilder();
        insertSQLBuilder.Append("INSERT INTO " + table.Name + " (");
        foreach (ColumnDTO column in table.Columns)
        {
            insertSQLBuilder.Append(column.Name + ", ");
        }
        insertSQLBuilder.Length -= 2;
        insertSQLBuilder.Append(") VALUES (");
        for (int i = 0; i < table.Columns.Length; i++)
        {
            // Note: the parameter prefix ('@' here) is provider-specific
            insertSQLBuilder.Append("@param" + i + ", ");
        }
        insertSQLBuilder.Length -= 2;
        insertSQLBuilder.Append(")");

        // Prepare the insert command; the parameters are created once and reused
        using (IDbCommand dbCommand = connection.CreateCommand())
        {
            dbCommand.CommandText = insertSQLBuilder.ToString();
            IDbDataParameter[] dbParameters = new IDbDataParameter[table.Columns.Length];
            for (int i = 0; i < table.Columns.Length; i++)
            {
                IDbDataParameter dbParameter = dbCommand.CreateParameter();
                dbParameter.DbType = typeArray[i]; // DbType array, which holds the types for each column
                dbParameter.ParameterName = "@param" + i;
                dbParameters[i] = dbParameter;
                dbCommand.Parameters.Add(dbParameter);
            }
            while (dataDeserializer.MoveNext())
            {
                // Get a new row from the JSON file; each element of columnData holds
                // the value plus extra information which is not needed here
                ColumnData[] columnData = dataDeserializer.Current;
                for (int i = 0; i < dbParameters.Length; i++)
                {
                    // TODO: crude conversion workaround for GUID columns
                    object value = typeArray[i] != DbType.Guid
                        ? (object)columnData[i].Value
                        : Guid.Parse(columnData[i].Value);
                    dbParameters[i].Value = value ?? DBNull.Value;
                }
                // One round trip per row: exactly what I'd like to avoid
                dbCommand.ExecuteNonQuery();
            }
        }
    }
}
What I'd like to do is avoid single insert calls to the database without committing to a specific database provider. Is there a library that supports bulk inserts for multiple providers? A perfect scenario would be a library which lets me just change the values of the parameters, and also set a maximum number of rows to be inserted at once (to keep memory usage in check).
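For illustration, one provider-agnostic way to cut down the round trips with plain ADO.NET is to batch several rows into a single multi-row VALUES statement. This is only a sketch, not a library recommendation; it assumes the provider accepts '@' parameter prefixes and multi-row VALUES syntax, and the helper name and arguments are made up for the example:
// Sketch: insert a batch of rows with one INSERT ... VALUES (...), (...), ...
// Requires System.Data, System.Text and System.Collections.Generic.
// Watch provider limits on parameter counts (e.g. 2100 for SQL Server).
void InsertBatch(IDbConnection connection, string tableName,
                 string[] columnNames, List<object[]> rows)
{
    var sql = new StringBuilder();
    sql.Append("INSERT INTO " + tableName + " (" + string.Join(", ", columnNames) + ") VALUES ");

    using (IDbCommand command = connection.CreateCommand())
    {
        for (int r = 0; r < rows.Count; r++)
        {
            sql.Append(r > 0 ? ", (" : "(");
            for (int c = 0; c < columnNames.Length; c++)
            {
                string name = "@p" + r + "_" + c; // one parameter per cell
                sql.Append(c > 0 ? ", " + name : name);

                IDbDataParameter p = command.CreateParameter();
                p.ParameterName = name;
                p.Value = rows[r][c] ?? DBNull.Value;
                command.Parameters.Add(p);
            }
            sql.Append(")");
        }
        command.CommandText = sql.ToString();
        command.ExecuteNonQuery(); // one round trip for the whole batch
    }
}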
I'm working on a Windows Forms application that reads data from CSV files and adds it to a DataGridView. I ran into an issue with getting all of the rows added to the DataTable and displayed in the DataGridView. The grid displays the rows from the first two if conditions and from the oneRow if condition only; it will not add the rows from the twoRow if condition when the DataTable and DataGridView are already populated with the oneRow rows. But I want the rows from both oneRow and twoRow to be displayed. The twoRow rows do populate the DataTable and DataGridView when I comment out (/**/) the oneRow if condition, but I need both to populate the table. Thanks in advance!
Construct.MainDataTable.Columns.Add("Date", typeof(DateTime));
Construct.MainDataTable.Columns.Add("Time");
Construct.MainDataTable.Columns.Add("Serial");
Construct.MainDataTable.Columns.Add("Type");
Construct.MainDataTable.Columns.Add("level");
Construct.MainDataTable.Columns.Add("price");
Construct.MainDataTable.Columns.Add(" Limit");
Construct.MainDataTable.Columns.Add("last Limit");
Construct.MainDataTable.Columns.Add("Data");
..........................
...............................................
DataRow oneRow = Construct.MainDataTable.NewRow();
DataRow twoRow = Construct.MainDataTable.NewRow();
dataGridView2.AllowUserToAddRows = false;

if (line.Split(',')[2].Equals("Time"))
{
    time = line.Split(',')[3];
    date = line.Split(',')[1];
}
if (line.Split(',')[2].Equals("Level"))
{
    level = line.Split(',')[3];
}
// oneRow (if condition)
if ((Convert.ToDecimal(line.Split(',')[8])) < (Convert.ToDecimal(line.Split(',')[12])))
{
    type = line.Split(',')[1];
    serial = line.Split(',')[7];
    price = line.Split(',')[3];
    Limit = line.Split(',')[8];
    lastLimit = line.Split(',')[10];
    Data = line.Split(',')[12];
    oneRow["Date"] = date;
    oneRow["Time"] = time;
    oneRow["Serial"] = serial;
    oneRow["Type"] = type;
    oneRow["level"] = level;
    oneRow["price"] = price;
    oneRow[" Limit"] = Limit;
    oneRow["last Limit"] = lastlimit;
    oneRow["Data"] = Data;
    Construct.MainDataTable.Rows.Add(oneRow);
}
// twoRow (if condition)
if ((line.Contains('"')) && (line.Contains("NG")))
{
    price = line.Split(',')[3];
    type = line.Split(',')[1];
    serial = line.Split(',')[7];
    Limit = line.Split('"')[7];
    var valLimit = Limit.Split(',').Select(a => Convert.ToInt32(a, 16));
    var limitJoin = String.Join(",", valLimit);
    lastlimit = line.Split('"')[1];
    var vallastLimit = lastlimit.Split(',').Select(d => Convert.ToInt32(d, 16));
    var lastJoin = String.Join(",", vallastLimit);
    Data = line.Split('"')[5];
    var valDatas = Data.Split(',').Select(s => Convert.ToInt32(s, 16));
    var dataJoin = String.Join(",", valDatas);
    twoRow["Date"] = date;
    twoRow["Time"] = time;
    twoRow["Serial"] = serial;
    twoRow["Type"] = type;
    twoRow["level"] = level;
    twoRow["price"] = price;
    twoRow["Limit"] = limitJoin;
    twoRow["last Limit"] = lastJoin;
    twoRow["Data"] = dataJoin;
    Construct.MainDataTable.Rows.Add(twoRow);
}
dataGridView2.DataSource = Construct.MainDataTable;
I can't add a comment because I don't have enough karma, so I'll ask my questions here: if I understood your problem, you can't add data from one .csv file if it has more than one row? Why are you using two different if conditions for rows in the .csv file?
If a row has empty data, never mind: you can still place it in your DataTable columns, so you can use a loop to add the data from the .csv to your DataTable. Try something like this:
public static DataTable CsvToDataTable(string csv)
{
    DataTable dt = new DataTable();
    string[] lines = csv.Split(new[] { "\r\n", "\r", "\n" }, StringSplitOptions.RemoveEmptyEntries);
    // Split on commas only when they are outside of quotes
    Regex onlyDelimiterComma = new Regex(",(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))");
    for (int i = 0; i < lines.Length; i++)
    {
        string[] cells = onlyDelimiterComma.Split(lines[i]);
        if (i == 0)
        {
            // The first line holds the column headers, so add columns instead of a row
            for (int j = 0; j < cells.Length; j++)
            {
                if (j == 0)
                    dt.Columns.Add(cells[j], typeof(DateTime));
                else
                    dt.Columns.Add(cells[j]);
            }
        }
        else
        {
            DataRow row = dt.NewRow();
            for (int j = 0; j < cells.Length; j++)
            {
                row[j] = cells[j];
            }
            dt.Rows.Add(row);
        }
    }
    return dt;
}
Just call this method anywhere in your code and pass it the string read from your .csv file.
You can try to compile this code here and see how it works on .csv data with different contents (empty columns, quoted text, quoted commas).
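A usage sketch (the file path is illustrative):
// Read the whole .csv file and bind the parsed table to the grid
string csv = File.ReadAllText(@"C:\data\input.csv"); // hypothetical path
DataTable table = CsvToDataTable(csv);
dataGridView2.DataSource = table;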
UPD: If you need to fill a DataTable from two different .csv files, you can still use the code above. Just call it twice, once for each file, and then merge the two DataTables, like this:
DataTable dt = CsvToDataTable(csvFileOne);
DataTable dtTwo = CsvToDataTable(csvFileTwo);
dt.Merge(dtTwo);
I'm new to programming (1st year of learning at college) and I'm working on a small application.
I have a window where the user can retrieve data from SQL into a DataGrid, and a Button for exporting some of the DataGrid data to a text file.
This is the code I've used to get data from SQL:
SqlConnection con = new SqlConnection("Server = localhost; Database = autoser; Integrated Security = true");
SqlCommand cmd = new SqlCommand("selectproduct", con); // Using a stored procedure.
cmd.CommandType = CommandType.StoredProcedure;
DataTable dt = new DataTable("dtList");
cmd.Parameters.AddWithValue("@Code", txtbarcode.Text);
SqlDataAdapter da = new SqlDataAdapter(cmd);
da.Fill(dt);
data.ItemsSource = dt.DefaultView;

// A second adapter/DataSet is used only to get the row count
SqlDataAdapter adapt = new SqlDataAdapter(cmd);
DataSet ds = new DataSet();
adapt.Fill(ds);
con.Close();

int count = ds.Tables[0].Rows.Count;
if (count == 0)
{
    MessageBox.Show("This product doesn't exist");
    SystemSounds.Hand.Play();
}
else if (count == 1)
{
    lblinfo.Visibility = Visibility.Visible;
    SystemSounds.Asterisk.Play();
}
And this is the code I used to write the text file:
using (StreamWriter writer = new StreamWriter("D:\\test.txt", true))
{
    writer.WriteLine("Welcome");
    writer.WriteLine("E N T E R N E T");
}

using (StreamWriter writer = new StreamWriter("D:\\test.txt", true))
{
    writer.WriteLine(data.Items);
}

// Append lines to the file.
using (StreamWriter writer = new StreamWriter("D:\\test.txt", true))
{
    writer.WriteLine("---------------------------------------");
    writer.WriteLine("               Thank You!               ");
    writer.WriteLine("           " + DateTime.Now + "         ");
}
When I open the text file I get this data:
Welcome
E N T E R N E T
System.Windows.Controls.ItemCollection  <- why isn't the DataGrid data shown?
---------------------------------------
Thank You!
7/26/2018 12:38:37 PM
My question is: where is my mistake that causes the data from the DataGrid not to be shown in the correct way?
Thanks in advance
You are currently using the following overload of the WriteLine method:
public virtual void WriteLine(object value)
If you look at the documentation of StreamWriter.WriteLine(object) it says that it:
Writes the text representation of an object by calling the ToString method on that object, followed by a line terminator to the text string or stream.
This is the reason why you get the following nice line in your file:
System.Windows.Controls.ItemCollection
The documentation of Object.ToString() method reveals that the
default implementations of the Object.ToString method return the fully qualified name of the object's type.
You would need to iterate through the collection and write each entry separately into the file. I would also suggest writing directly from the data source instead of from the DataGrid:
foreach (DataRow row in dt.Rows)
{
    object[] array = row.ItemArray;
    writer.WriteLine(string.Join(" | ", array));
}
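For completeness, a minimal sketch with the writer included, reusing the file path from the question:
// Append every row of the data source to the file, one line per row
using (StreamWriter writer = new StreamWriter("D:\\test.txt", true))
{
    foreach (DataRow row in dt.Rows)
    {
        writer.WriteLine(string.Join(" | ", row.ItemArray));
    }
}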
This is because data.Items is an ItemCollection and not a string.
All objects return the output of their ToString method when asked to represent their contents as a string. Normally you would override this method, but in this case you can't.
So you need to tell the compiler how to retrieve the representative information from that collection. You can use any of these queries to fetch the desired information out of the data grid:
var items = data.Items.AsQueryable().Cast<MyItemDataType>().Select(x => x.MyProperty);
var items = data.ItemsSource.Cast<MyItemDataType>().Select(x => x.MyProperty);
var items = data.Items.SourceCollection.AsQueryable().Cast<MyItemDataType>().Select(x => x.MyProperty);
items is a collection so you need to convert it to a string:
var text = items.Aggregate((x,y)=> x+", "+y);
MyItemDataType differs in each query, and you have to find out yourself which data type is being used; MyProperty is the property in that class which represents the text of a row.
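As a side note, string.Join does the same concatenation with less ceremony (just an alternative):
var text = string.Join(", ", items);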
Edit
You can use this code too. It does the same thing:
string text = "";
for (int i = 0; i < data.Items.Count; i++)
{
    text += data.Items[i].ToString();
    if (i < data.Items.Count - 1)
        text += ", ";
}
writer.WriteLine(text);
But pay attention to the data type of each item in data.Items[i].ToString(). For example, if each item is of type int, then data.Items[i].ToString() returns a string representing the value of that integer (e.g. 1 turns into "1"), but if the items are of other types (such as Customer or MyDataGridItem) you need to override the ToString() method of that class to look something like this:
public class Customer
{
    //...
    public override string ToString()
    {
        return this.Id + " " + this.Name;
    }
}
So if you cannot override this method for any reason, you need to take the other approach:
string text = "";
for (int i = 0; i < data.Items.Count; i++)
{
    // The cast is required since the type of Items[i] is object
    Customer customer = data.Items[i] as Customer;
    text += (customer.Id + " " + customer.Name);
    if (i < data.Items.Count - 1)
        text += ", ";
}
writer.WriteLine(text);
Furthermore, you can use a StringBuilder to speed up the string concatenation, because += is slow on strings.
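A minimal sketch of the same loop with a StringBuilder (same Customer assumption as above; needs System.Text):
var sb = new StringBuilder();
for (int i = 0; i < data.Items.Count; i++)
{
    Customer customer = data.Items[i] as Customer;
    if (i > 0)
        sb.Append(", ");
    sb.Append(customer.Id).Append(' ').Append(customer.Name);
}
writer.WriteLine(sb.ToString());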
Look at the sample code here. This will do what you want.
public static void WriteDataToFile(DataTable submittedDataTable, string submittedFilePath)
{
    int i = 0;
    StreamWriter sw = new StreamWriter(submittedFilePath, false);

    // Write the column headers, separated by ';'
    for (i = 0; i < submittedDataTable.Columns.Count - 1; i++)
    {
        sw.Write(submittedDataTable.Columns[i].ColumnName + ";");
    }
    sw.Write(submittedDataTable.Columns[i].ColumnName);
    sw.WriteLine();

    // Write each row, separated by ';'
    foreach (DataRow row in submittedDataTable.Rows)
    {
        object[] array = row.ItemArray;
        for (i = 0; i < array.Length - 1; i++)
        {
            sw.Write(array[i].ToString() + ";");
        }
        sw.Write(array[i].ToString());
        sw.WriteLine();
    }
    sw.Close();
}
Also, take a look at this.
using System;
using System.Web;
using System.IO;
using System.Data;

namespace WebApplication1
{
    public partial class WebForm1 : System.Web.UI.Page
    {
        protected void Button1_Click(object sender, EventArgs e)
        {
            StreamWriter swExtLogFile = new StreamWriter("D:/Log/log.txt", true);
            DataTable dt = new DataTable();
            // Add data to the DataTable
            dt.Columns.Add("ID");
            dt.Columns.Add("Name");
            dt.Columns.Add("Address");
            dt.Rows.Add(1, "venki", "Chennai");
            dt.Rows.Add(2, "Hanu", "London");
            dt.Rows.Add(3, "john", "Swiss");
            int i;
            swExtLogFile.Write(Environment.NewLine);
            foreach (DataRow row in dt.Rows)
            {
                object[] array = row.ItemArray;
                for (i = 0; i < array.Length - 1; i++)
                {
                    swExtLogFile.Write(array[i].ToString() + " | ");
                }
                swExtLogFile.WriteLine(array[i].ToString());
            }
            swExtLogFile.Write("*****END OF DATA****" + DateTime.Now.ToString());
            swExtLogFile.Flush();
            swExtLogFile.Close();
        }
    }
}
Is there a way to export a whole table with a nested schema from Google BigQuery using the REST API as a CSV?
There is an example for doing this (https://cloud.google.com/bigquery/docs/exporting-data) with a non-nested schema. This works fine for the non-nested columns in my table. Here is the code for this part:
PagedEnumerable<TableDataList, BigQueryRow> result2 = client.ListRows(datasetId, result.Reference.TableId);
StringBuilder sb = new StringBuilder();
foreach (var row in result2)
{
    sb.Append($"{row["visitorId"]}, {row["visitNumber"]}, {row["totals.hits"]}{Environment.NewLine}");
}
using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(sb.ToString())))
{
    var obj = gcsClient.UploadObject(bucketName, fileName, contentType, stream);
}
In BQ there are columns like totals.hits, totals.visits... If I try to address them, I get the error message that there is no such column. If I address "totals", I get the object name "System.Collections.Generic.Dictionary`2[System.String,System.Object]" in the rows of my CSV.
Is there any possibility to do something like that? In the end I want my GA table from BQ as a CSV somewhere else.
It is possible. Select every column you need, as in the following schema, and flatten everything that needs to be flattened.
string query = $#"
#legacySQL
SELECT
visitorId,
visitNumber,
visitId,
visitStartTime,
date,
hits.hitNumber as hitNumber,
hits.product.productSKU as product.productSKU
FROM
FLATTEN(FLATTEN({tableName},hits),hits.product)";
//Creating a job for the query and activating legacy sql
BigQueryJob job = client.CreateQueryJob(query,
new CreateQueryJobOptions { UseLegacySql = true });
BigQueryResults queryResult = client.GetQueryResults(job.Reference.JobId,
new GetQueryResultsOptions());
StringBuilder sb = new StringBuilder();
//Getting the headers from the GA table and write them into the first row of the new table
int count = 0;
for (int i = 0; i <= queryResult.Schema.Fields.Count() - 1; i++)
{
string columenname = "";
var header = queryResult.Schema.Fields[0].Name;
if (i + 1 >= queryResult.Schema.Fields.Count)
columenname = queryResult.Schema.Fields[i].Name;
else
columenname = queryResult.Schema.Fields[i].Name + ",";
sb.Append(columenname);
}
//Getting the data from the GA table and write them row by row into the new table
sb.Append(Environment.NewLine);
foreach (var row in queryResult.GetRows())
{
count++;
if (count % 1000 == 0)
Console.WriteLine($"item {count} finished");
int blub = queryResult.Schema.Fields.Count;
for (Int64 j = 0; j < Convert.ToInt64(blub); j++)
{
try
{
if (row.RawRow.F[Convert.ToInt32(j)] != null)
sb.Append(row.RawRow.F[Convert.ToInt32(j)].V + ",");
}
catch (Exception)
{
}
}
sb.Append(Environment.NewLine);
}
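The resulting StringBuilder can then be uploaded the same way as in the question's snippet (bucketName, fileName and contentType as before):
// Upload the assembled CSV to Cloud Storage
using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(sb.ToString())))
{
    var obj = gcsClient.UploadObject(bucketName, fileName, contentType, stream);
}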
In my XML file there are many elements and many child/sub-child elements, so I have decided to load it as a generic list and display it in the DataGridView; the columns are already created/customized in the DataGridView.
I have to edit the values in the DataGridView and serialize them back to the file. I would like to know how I can get the values from the DataGridView and serialize them back to the file.
I tried this and got a NullReferenceException:
DataSet ds = new DataSet();
ds = (DataSet)(dataGridView2.DataSource);
ds.WriteXml("XML_File.xml");
As far as I know, the DataSet ds is null, and that's why I'm getting this error.
I don't want to use a DataSet for binding; I want to bind the XML file directly to the DataGridView. Is that possible with my approach?
This approach is good, but it's not saving the XML file in the same shape as the original XML file:
DataTable dt = new DataTable("Rules");
for (int i = 0; i < dataGridView4.ColumnCount; i++)
{
    dt.Columns.Add(dataGridView4.Columns[i].Name, typeof(System.String));
}
DataRow myrow;
int icols = dataGridView4.Columns.Count;
foreach (DataGridViewRow drow in this.dataGridView4.Rows)
{
    myrow = dt.NewRow();
    for (int i = 0; i <= icols - 1; i++)
    {
        myrow[i] = drow.Cells[i].Value;
    }
    dt.Rows.Add(myrow);
}
dt.WriteXml(@"C:\test\items.xml");
Any help with serializing/writing the values from the DataGridView would be appreciated.
I have adapted this approach for my problem and it works.
List<LaneConfig> laneConfigs = new List<LaneConfig>(); // from a class
foreach (DataGridViewRow dr in dataGridView1.Rows)
{
    int bBorder = 0;
    Int32.TryParse(dr.Cells["BBor"].Value.ToString(), out bBorder);
    int eBorder = 0;
    Int32.TryParse(dr.Cells["EBor"].Value.ToString(), out eBorder);
    LaneConfig laneConfig = new LaneConfig(
        dr.Cells["ID"].Value.ToString(),
        (TrafficLaneType)Enum.Parse(typeof(TrafficLaneType), dr.Cells["Type"].Value.ToString()),
        new ValueWithUnit<int>(bBorder, "mm"),
        new ValueWithUnit<int>(eBorder, "mm"));
    laneConfigs.Add(laneConfig);
}
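To complete the round trip, the list can then be serialized back to the file, for example with XmlSerializer. A minimal sketch; the path is reused from above, and it assumes LaneConfig is public with a parameterless constructor and settable properties:
// Serialize the edited list back to XML
// (requires System.IO and System.Xml.Serialization)
var serializer = new XmlSerializer(typeof(List<LaneConfig>));
using (var stream = File.Create(@"C:\test\items.xml"))
{
    serializer.Serialize(stream, laneConfigs);
}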