I've been developing a C# WPF project in VS2015, using SQL Server Express LocalDB with Entity Framework. I built a custom seeder for the database that reads test data from an Excel file, combines the Excel data into a command string, and inserts it using context.Database.ExecuteSqlCommand.
Now I was thinking of launching the project with SQL Server Compact Edition 4.0, but I find this command no longer works. Do I have to rewrite my uploader using SqlCeConnection and SqlCeCommand, or am I missing something?
Also, I had somewhere picked up the idea that with EF you can switch the SQL provider and the code would not need any other changes. Am I in for more surprises down the road?
Example of the uploader command:
string cmd = "INSERT INTO Venues(Name, City, Telephone) Values ('X','Y','Z')";
context.Database.ExecuteSqlCommand(cmd);
The error:
There was an error parsing the query. [ Token line number = 2,Token line offset = 1,Token in error = INSERT ]
This is not just a testing issue, as I want to include this uploader in the production version too, for quickly inserting master data (e.g. an employee list).
EDIT: Uploader code. If this can be done without resorting to raw SQL, that would be a good solution, too.
This loops through the Excel sheets (named after the entities), the columns (the first row holds the property names), and rows 2 to n (the data), so it handles the upload of basically any amount of data within Excel's limitations. The point is that the code has no knowledge of the entities (it might have been possible to parameterize the DataContext too). The code may not be optimal, as I'm just a beginner, but it has worked for me, except not with SQL CE. Editing it to suit CE is not a big issue, but I wanted to ask whether there are better ways.
public static class ExcelUploader
{
static ArrayList data;
static List<string> tableNames;
public static string Upload(string filePath)
{
string result = "";
data = new ArrayList();
tableNames = new List<string>();
ArrayList upLoadData = ReadFile(filePath);
List<string> dataList = ArrayListToStringList(upLoadData);
using (var db = new DataContext())
{
using (var trans = db.Database.BeginTransaction())
{
try
{
foreach (var cmd in dataList)
{
Console.WriteLine(cmd);
db.Database.ExecuteSqlCommand(cmd);
}
db.SaveChanges();
trans.Commit();
}
catch (Exception e)
{
trans.Rollback();
result = e.Message;
MessageBox.Show(result);
}
}
}
return result;
}
private static ArrayList ReadFile(string fileName)
{
List<string> commands = new List<string>();
var xlApp = new Microsoft.Office.Interop.Excel.Application();
var wb = xlApp.Workbooks.Open(fileName, ReadOnly: true);
xlApp.Visible = false;
foreach (Worksheet ws in wb.Worksheets)
{
var r = ws.UsedRange;
var array = r.Value;
data.Add(array);
tableNames.Add(ws.Name);
}
wb.Close(SaveChanges: false);
xlApp.Quit();
return data;
}
private static List<string> ArrayListToStringList(ArrayList arrList)
{
List<string> result = new List<string>();
for(int tableAmount = 0;tableAmount<data.Count;tableAmount++)
{
result.Add(ArrayToSqlCommand(arrList[tableAmount] as Array, tableNames[tableAmount]));
}
return result;
}
private static string ArrayToSqlCommand(Array arr, string tableName)
{
int propertyRow = 1;
int firstDataRow = 2;
string command = "";
// loop rows
for (int rowIndex = firstDataRow; rowIndex <= arr.GetUpperBound(0); rowIndex++)
{
command += "INSERT INTO " + tableName + "(";
//add column names
for (int colIndex = 1; colIndex <= arr.GetUpperBound(1); colIndex++)
{
//get property name
command += arr.GetValue(propertyRow, colIndex);
//add comma if not last column, otherwise close bracket
if (colIndex == arr.GetUpperBound(1))
{
command += ") Values (";
}
else
{
command += ", ";
}
}
//add values
for (int colIndex = 1; colIndex <= arr.GetUpperBound(1); colIndex++)
{
//get property value
command += "'" + arr.GetValue(rowIndex, colIndex) + "'";
//add comma if not last column, otherwise close bracket
if (colIndex == arr.GetUpperBound(1))
{
command += ");";
}
else
{
command += ", ";
}
}
command += "\n";
}
return command;
}
}
There are two ways I'd offer to run raw SQL queries.
Initial data
1) Excel table
+=======+=======+===========+
| Name | City | Telephone |
|===========================|
| Adam | Addr1 | 111-11-11 |
|-------|-------|-----------|
| Peter | Addr2 | 222-22-22 |
+-------+-------+-----------+
2) SQL Server CE table
CREATE TABLE Venues
(
Id int identity primary key,
[Name] nvarchar(100) null,
City nvarchar(100) null,
Telephone nvarchar(100) null
);
3) Getting data from Excel
Here we're interested in getting the array from the Excel sheet. As soon as we have it, we can safely close Excel. The code assumes the file "Employees.xlsx" is located next to the executable.
private object[,] GetExcelData()
{
var xlApp = new Excel.Application { Visible = false };
var xlBook =
xlApp.Workbooks.Open(System.IO.Path.Combine(
Environment.CurrentDirectory,
"Employees.xlsx"));
var xlSheet = xlBook.Sheets[1] as Excel.Worksheet;
// For process termination
var xlHwnd = new IntPtr(xlApp.Hwnd);
var xlProc = Process.GetProcesses()
.Where(p => p.MainWindowHandle == xlHwnd)
.First();
// Get Excel data: it's 2-D array with lower bounds as 1.
object[,] arr = xlSheet.Range["A1"].CurrentRegion.Value;
// Shutdown Excel
xlBook.Close();
xlApp.Quit();
xlProc.Kill();
GC.Collect();
GC.WaitForFullGCComplete();
return arr;
}
Now you can use one of the ways to generate query.
Option 1. Use ExecuteSqlCommand
When using ExecuteSqlCommand, it's advisable to use parameterized queries to avoid parsing errors and SQL injection. You can pass explicitly created SqlCeParameter objects or just pass plain values.
private void UseExecuteSqlCommand()
{
object[,] arr = GetExcelData();
using (var db = new EmpContext())
{
db.Database.Initialize(true);
int count = 0;
string sql = "INSERT INTO Venues (Name, City, Telephone) " +
"VALUES (@name, @city, @phone);";
// Start from 2-nd row since we need to skip header
for (int r = 2; r <= arr.GetUpperBound(0); ++r)
{
db.Database.ExecuteSqlCommand(
sql,
new SqlCeParameter("#name", (string)arr[r, 1]),
new SqlCeParameter("#city", (string)arr[r, 2]),
new SqlCeParameter("#phone", (string)arr[r, 3])
);
++count;
}
MessageBox.Show($"{count} records were saved.");
}
}
Option 2. Use DbConnection
If you want your code to be more generic, you can create a method that accepts a DbConnection. This allows you to pass either a SqlConnection or a SqlCeConnection. The code becomes more verbose, though, because we can't use constructors for the DbConnection, DbCommand, and DbParameter types since these classes are abstract.
private void UseDbConnection()
{
object[,] arr = GetExcelData();
using (var db = new EmpContext())
{
db.Database.Initialize(true);
int count = 0;
string sql = "INSERT INTO Venues (Name, City, Telephone) " +
"VALUES (@name, @city, @phone);";
DbParameter param = null;
DbConnection conn = db.Database.Connection;
conn.Open();
DbCommand command = conn.CreateCommand();
command.CommandText = sql;
command.CommandType = CommandType.Text;
// Create parameters
// Name
param = command.CreateParameter();
param.ParameterName = "#name";
command.Parameters.Add(param);
// City
param = command.CreateParameter();
param.ParameterName = "#city";
command.Parameters.Add(param);
// Telephone
param = command.CreateParameter();
param.ParameterName = "#phone";
command.Parameters.Add(param);
// Start from 2-nd row since we need to skip header
for (int r = 2; r <= arr.GetUpperBound(0); ++r)
{
command.Parameters["#name"].Value = (string)arr[r, 1];
command.Parameters["#city"].Value = (string)arr[r, 2];
command.Parameters["#phone"].Value = (string)arr[r, 3];
command.ExecuteNonQuery();
++count;
}
conn.Close();
MessageBox.Show($"{count} records were saved.");
}
}
You can also use ordinal positions for the parameters, which eliminates creating parameter names and makes the code much shorter:
private void UseDbConnection()
{
object[,] arr = GetExcelData();
using (var db = new EmpContext())
{
db.Database.Initialize(true);
int count = 0;
// Note: use '?' as the parameter placeholders
string sql = "INSERT INTO Venues (Name, City, Telephone) " +
"VALUES (?, ?, ?);";
DbConnection conn = db.Database.Connection;
conn.Open();
DbCommand command = conn.CreateCommand();
command.CommandText = sql;
command.CommandType = CommandType.Text;
// Create parameters
command.Parameters.Add(command.CreateParameter());
command.Parameters.Add(command.CreateParameter());
command.Parameters.Add(command.CreateParameter());
for (int r = 2; r <= arr.GetUpperBound(0); ++r)
{
// Access parameters by position
command.Parameters[0].Value = (string)arr[r, 1];
command.Parameters[1].Value = (string)arr[r, 2];
command.Parameters[2].Value = (string)arr[r, 3];
command.ExecuteNonQuery();
++count;
}
conn.Close();
MessageBox.Show($"{count} records were saved.");
}
}
P.S.
I didn't add a check for whether the underlying connection is already open, but it's a good idea to do so.
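A minimal sketch of such a guard (assuming a DbConnection obtained from the context, e.g. db.Database.Connection) might look like this:
using System.Data;
using System.Data.Common;

// Sketch only: open the connection if it is not already open.
static void EnsureOpen(DbConnection conn)
{
    if (conn.State != ConnectionState.Open)
    {
        conn.Open();
    }
}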
Based on JohnyL's excellent input, I was able to modify my code so that it works with either SQL Server Express or SQL Server CE. I'll post my new code as an answer, since I had to parameterize it further: I couldn't hard-code the property names either. That was a simple step, though, once I got the idea from JohnyL. I'm not sure whether the database write should be wrapped inside a DbTransaction, but this worked for now.
public static class ExcelUploader
{
static ArrayList data;
static List<string> tableNames;
static List<DbCommand> cmdList = new List<DbCommand>();
static DbConnection conn;
public static void Upload(string filePath)
{
data = new ArrayList();
tableNames = new List<string>();
//get Excel data to array list
ArrayList upLoadData = ReadFile(filePath);
using (var db = new DataContext())
{
conn = db.Database.Connection;
//transform arraylist into a list of DbCommands
ArrayListToCommandList(upLoadData);
conn.Open();
try
{
foreach (var cmd in cmdList)
{
//Console.WriteLine(cmd.CommandText);
cmd.ExecuteNonQuery();
}
}
catch (Exception e)
{
var result = e.Message;
MessageBox.Show(result);
}
}
}
//opens Excel file and reads worksheets to arraylist
private static ArrayList ReadFile(string fileName)
{
List<string> commands = new List<string>();
var xlApp = new Microsoft.Office.Interop.Excel.Application();
var wb = xlApp.Workbooks.Open(fileName, ReadOnly: true);
xlApp.Visible = false;
foreach (Worksheet ws in wb.Worksheets)
{
var r = ws.UsedRange;
var array = r.Value;
data.Add(array);
tableNames.Add(ws.Name);
}
wb.Close(SaveChanges: false);
xlApp.Quit();
return data;
}
//transforms arraylist to a list of DbCommands
private static void ArrayListToCommandList(ArrayList arrList)
{
List<DbCommand> result = new List<DbCommand>();
for (int tableAmount = 0; tableAmount < data.Count; tableAmount++)
{
ArrayToSqlCommands(arrList[tableAmount] as Array, tableNames[tableAmount]);
}
}
private static void ArrayToSqlCommands(Array arr, string tableName)
{
//Excel row which holds property names
int propertyRow = 1;
//First Excel row with values
int firstDataRow = 2;
string sql = "";
DbCommand cmd = conn.CreateCommand();
sql += "INSERT INTO " + tableName + "(";
//add column names to command text
for (int colIndex = 1; colIndex <= arr.GetUpperBound(1); colIndex++)
{
//get property name
sql += arr.GetValue(propertyRow, colIndex);
//add comma if not last column, otherwise close bracket
if (colIndex == arr.GetUpperBound(1))
{
sql += ") Values (";
}
else
{
sql += ", ";
}
}
//add value parameter names to command text
for (int colIndex = 1; colIndex <= arr.GetUpperBound(1); colIndex++)
{
//get property name
sql += "#" + arr.GetValue(propertyRow, colIndex);
//add comma if not last column, otherwise close bracket
if (colIndex == arr.GetUpperBound(1))
{
sql += ");";
}
else
{
sql += ", ";
}
}
//add data elements as command parameter values
for (int rowIndex = firstDataRow; rowIndex <= arr.GetUpperBound(0); rowIndex++)
{
//initialize command
cmd = conn.CreateCommand();
cmd.CommandText = sql;
cmd.CommandType = CommandType.Text;
for (int colIndex = 1; colIndex <= arr.GetUpperBound(1); colIndex++)
{
//set parameter values
DbParameter param = null;
param = cmd.CreateParameter();
param.ParameterName = "#" + (string)arr.GetValue(propertyRow, colIndex);
cmd.Parameters.Add(param);
cmd.Parameters[param.ParameterName].Value = arr.GetValue(rowIndex, colIndex);
}
//add command to command list
cmdList.Add(cmd);
}
}
}
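Regarding the open question about wrapping the writes in a DbTransaction: a sketch of one way to do it (untested against SQL CE; it assumes conn is already open and cmdList holds the prepared commands) would be:
// Sketch: run the prepared commands inside a single transaction.
using (DbTransaction trans = conn.BeginTransaction())
{
    try
    {
        foreach (DbCommand cmd in cmdList)
        {
            cmd.Transaction = trans;   // each command must reference the transaction
            cmd.ExecuteNonQuery();
        }
        trans.Commit();
    }
    catch
    {
        trans.Rollback();
        throw;
    }
}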
Related
My code isn't returning any rows from a test database table when I pass a string version of a list, but it does return rows if I pass the list members in directly.
When I use a message box to show the string joinedSerialsList, it appears to be formatted properly.
// Create comma delimited list of serials:
int currentSerial = beginning;
List<string> serialsList = new List<string>();
for (int i = 0; i < count; i++)
{
serialsList.Add(currentSerial.ToString());
currentSerial++;
}
string joinedSerialsList = string.Format("({0})", string.Join(", ", serialsList));
OleDbConnection connection = BadgeDatabaseDB.GetConnection();
string checkStatement
= "SELECT SerialNumber, OrderNumber "
+ "FROM SerialNumbersMFG "
+ "WHERE SerialNumber IN (#List)";
OleDbCommand command = new OleDbCommand(checkStatement, connection);
command.Parameters.AddWithValue("@List", joinedSerialsList);
string duplicateSerials = "";
try
{
connection.Open();
OleDbDataReader dataReader = command.ExecuteReader();
if (dataReader.Read())
{
duplicateSerials += dataReader["OrderNumber"].ToString() + "\n";
}
}
catch (OleDbException ex)
{
throw ex;
}
finally
{
connection.Close();
}
return duplicateSerials;
I rewrote your sample; this works:
private IEnumerable<string> getData()
{
// Create comma delimited list of serials:
int currentSerial = 4452; // your constant
var serialsList = new List<int>();
var count = 100;
for (int i = 0; i < count; i++)
serialsList.Add(currentSerial++);
var connString = getConnectionString();
var results = new List<string>();
string sqlSelect = $"SELECT SerialNumber, OrderNumber FROM SerialNumbersMFG WHERE SerialNumber IN ({string.Join(",", serialsList)})";
using (var connection = new SqlConnection(connString)) // BadgeDatabaseDB.GetConnection();
{
using (var command = new SqlCommand(sqlSelect, connection))
{
connection.Open();
var dataReader = command.ExecuteReader();
while (dataReader.Read())
results.Add(dataReader["OrderNumber"].ToString());
}
}
return results;
}
I am trying to delete some rows from a table in an Access database file via C#.
The attempt fails without an error, which leads me to conclude that I have a valid query with incorrect data.
I tried querying the data with a SELECT statement from my code and was able to narrow the problem down to the parameters.
The statement should look as follows
SELECT * FROM tbIndex where pguid in ('4a651816-e15b-4c6a-85c4-74033ca6c423', '0add7bff-a22f-4238-9c7f-e1ff4ed3c7e2', '742fae8b-2692-4a6f-802c-848fad570696', '5e6b65de-2403-4800-a47d-e57c7bd8e0a6')
I tried two different ways (dbCmd2 and dbCmd3), of which the first (dbCmd2) works but, due to injection concerns, is not my preferred solution.
using (OleDbCommand dbCmd2 = new OleDbCommand { Connection = m_Connection })
{
dbCmd2.CommandText = "SELECT * FROM tbIndex where pguid in ("+pguid+")";
using (DbDataReader reader = dbCmd2.ExecuteReader())
{
List<object[]> readValuesFromIndex = new List<object[]>();
while (reader.Read())
{
//Point reached
object[] arr = new object[reader.VisibleFieldCount];
reader.GetValues(arr);
//...
}
reader.Close();
}
using (OleDbCommand dbCmd3 = new OleDbCommand { Connection = m_Connection })
{
dbCmd3.CommandText = "SELECT * FROM tbIndex where pguid in (@pguid)";
dbCmd3.Parameters.Add("@pguid", OleDbType.VarChar).Value = pguid;
using (DbDataReader reader = dbCmd3.ExecuteReader())
{
List<object[]> readValuesFromIndex = new List<object[]>();
while (reader.Read())
{
//Point not reached
object[] arr = new object[reader.VisibleFieldCount];
reader.GetValues(arr);
//...
}
reader.Close();
}
}
}
Note that pguid is set to "'4a651816-e15b-4c6a-85c4-74033ca6c423', '0add7bff-a22f-4238-9c7f-e1ff4ed3c7e2', '742fae8b-2692-4a6f-802c-848fad570696', '5e6b65de-2403-4800-a47d-e57c7bd8e0a6'".
I always thought that the second option would simply substitute the parameter in a safe manner, but this is obviously not the case.
My question is:
Why doesn't the second option return any values?
A parameter is always a single value.
An IN clause requires multiple values, separated by commas.
You can do something like the following to pass them like separate parameters:
string[] guids = pguid.Split(',');
string sqlin = "";
for (int i = 0; i < guids.Length; i++)
{
sqlin = sqlin + "@Param" + i + ",";
}
dbCmd3.CommandText = "SELECT * FROM tbIndex where pguid in (" + sqlin.Substring(0, sqlin.Length - 1) + ")";
for (int i = 0; i < guids.Length; i++)
{
dbCmd3.Parameters.Add("@Param" + i, OleDbType.VarChar).Value = guids[i].Trim().Replace("'", "");
}
I am trying to bulk copy from one table to another by mapping the column names, as the source and destination may not always have the same columns.
The source can have 8 columns and the destination 10; I need to map the columns and bulk copy.
I tried the code below, but it didn't work; I get the error: The given ColumnName 'moduleid' does not match up with any column in data source.
Source: existingtablecolumnsPresent has [collection time],[logtime],[moduleid],[node],[reason],[time],[timestamp],[usecaseid]
Destination: dataTable.Columns has [Node],[Time],[Reason],[Moduleid],[Usecaseid]
Please advise
public static void BatchBulkCopy(DataTable dataTable, string DestinationTbl, List<string> columnMapping,string filename)
{
var program = new Program();
// Get the DataTable
DataTable dtInsertRows = dataTable;
using (SqlBulkCopy sbc = new SqlBulkCopy(program.connectionStr.ToString()))
{
try {
sbc.DestinationTableName = DestinationTbl.ToLower();
string sourceTableQuery = "Select top 1 * from " + "[" + dataTable.TableName + "]";
DataTable dtSource = SqlHelper.ExecuteDataset(program.connectionStr.ToString(), CommandType.Text, sourceTableQuery).Tables[0];
for (int i = 0; i < dataTable.Columns.Count; i++)
{ //check if destination Column Exists in Source table
if (dtSource.Columns.Cast<DataColumn>().Select(a => "[" + a.ColumnName.ToLower() + "]").Contains(dataTable.Columns[i].ToString().ToLower()))//contain method is not case sensitive
{
List<string> existingtablecolumnsPresent = dtSource.Columns.Cast<DataColumn>().Select(a => "[" + a.ColumnName.ToLower() + "]").Distinct().OrderBy(t => t).ToList();
int sourceColumnIndex = existingtablecolumnsPresent.IndexOf(dataTable.Columns[i].ToString().ToLower());//Once column matched get its index
sbc.ColumnMappings.Add(dtSource.Columns[sourceColumnIndex].ToString(), dtSource.Columns[sourceColumnIndex].ToString());//give coluns name of source table rather then destination table so that it would avoid case sensitivity
}
}
sbc.WriteToServer(dtInsertRows);
sbc.Close();
}
catch (Exception ex)
{
Log.WriteLog("BatchBulkCopy" + " - " + filename, dataTable.TableName, ex.Message.ToString());
// To move a file or folder to a new location:
//if (File.Exists(program.sourceFile + filename))
// System.IO.File.Move(program.sourceFile + filename, program.failedfiles + filename);
}
}
}
As requested: create a DataTable with only the columns you want to insert and leave the others out. Make sure any columns you leave out are nullable in the table or have a DEFAULT VALUE constraint (I can't show you exactly how to do that for your table unless you show me your table; a generic example follows).
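As a rough example only (the table and column names here are made up, not taken from your schema), a DEFAULT constraint can be added like this:
-- Hypothetical table and column names, for illustration only.
ALTER TABLE MyTable
ADD CONSTRAINT DF_MyTable_CreatedOn DEFAULT (GETDATE()) FOR CreatedOn;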
//This first method is pseudocode to explain how to create your DataTable. You need to do it in the way that makes sense for you.
public DataTable createDataTable(){
List<string> excludedColumns = new List<string>();
excludedColumns.Add("FieldToExclude");
//...
DataTable dt = new DataTable();
foreach(string col in getColumns(myTable)){
if(!excludedColumns.Contains(col)){
DataColumn dC = new DataColumn(col); //add the column type here if you need it
dt.Columns.Add(dC);
}
}
return dt;
}
public List<string> getColumns(string tableName)
{
List<string> ret = new List<string>();
using (SqlConnection conn = getConn())
{
conn.Open();
using (SqlCommand com = conn.CreateCommand())
{
com.CommandText = "select column_Name from information_schema.COLUMNS where table_name = @tab";
com.Parameters.AddWithValue("@tab", tableName);
SqlDataReader read = com.ExecuteReader();
while (read.Read()){
ret.Add(Convert.ToString(read[0]));
}
}
conn.Close();
}
return ret;
}
//Now, you have a DataTable that has all the columns you want to insert. Map them yourself in code by adding to the appropriate column in your datatable.
public bool isCopyInProgess = false;//not necessary - just part of my code
public void saveDataTable(string tableName, DataTable table)
{
using (SqlConnection conn = getConn())
{
conn.Open();
using (var bulkCopy = new SqlBulkCopy(conn))//, SqlBulkCopyOptions.KeepIdentity))//un-comment if you want to use your own identity column
{
// my DataTable column names match my SQL Column names, so I simply made this loop. However if your column names don't match, just pass in which datatable name matches the SQL column name in Column Mappings
foreach (DataColumn col in table.Columns)
{
//Console.WriteLine("mapping " + col.ColumnName+" ("+does_Column_Exist(col.ColumnName,"Item").ToString()+")");
bulkCopy.ColumnMappings.Add(col.ColumnName, "["+col.ColumnName+"]");
// Console.WriteLine("ok\n");
}
bulkCopy.BulkCopyTimeout = 8000;
bulkCopy.DestinationTableName = tableName;
bulkCopy.BatchSize = 10000;
bulkCopy.EnableStreaming = true;
//bulkCopy.SqlRowsCopied += BulkCopy_SqlRowsCopied;
//bulkCopy.NotifyAfter = 10000;
isCopyInProgess = true;
bulkCopy.WriteToServer(table);
}
conn.Close();
}
}
Also, use this as your column checker:
public bool does_Column_Exist(string colName,string tableName)
{
bool ret = false;
using (SqlConnection conn = getConn())
{
conn.Open();
using (SqlCommand com = conn.CreateCommand())
{
com.CommandText = "select count(*) from information_schema.COLUMNS where column_name = #col and table_name = #tab";
com.Parameters.AddWithValue("#tab", tableName);
com.Parameters.AddWithValue("#col", colName);
ret = Convert.ToInt32(com.ExecuteScalar()) == 0 ? false : true;
}
conn.Close();
}
return ret;
}
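For example (just a sketch reusing the names already in the code above), the mapping loop inside saveDataTable could guard each mapping with that check, assuming the destination table name is available in a tableName variable:
// Sketch: only map columns that actually exist in the destination table.
foreach (DataColumn col in table.Columns)
{
    if (does_Column_Exist(col.ColumnName, tableName))
    {
        bulkCopy.ColumnMappings.Add(col.ColumnName, "[" + col.ColumnName + "]");
    }
}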
Is there a specific reason you need C# for this? It seems like the path of least resistance would be to use SQL to do the job.
INSERT INTO table2
(column_name(s))
SELECT column_name(s)
FROM table1;
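For instance, a sketch with made-up table names, listing only the columns that both tables share according to your description:
-- Table names are hypothetical; only the shared columns are listed.
INSERT INTO DestinationTable (Node, [Time], Reason, Moduleid, Usecaseid)
SELECT [node], [time], [reason], [moduleid], [usecaseid]
FROM SourceTable;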
I already have this code that builds a StringBuilder for every employee; I get all the employee IDs from another table. But if I get more than 1000 employees, I get the error ORA-01795, which I know is related to the maximum number of expressions in a list. So how can I send the employees to my query in the Data Access Object in batches of 500?
public List<GraphModel> countRequestCreatedByTypeDefaulPage(int year, int month, String employeeID)
{
int count = 0;
int countEmployees = 0;
string employeesid = "";
DataView dv = _employeeOverrideBO.getRelatedEmployees(year, month, employeeID);
StringBuilder listEmployees = new StringBuilder();
for (int i = 0; i < countEmployees; i += 500)
{
foreach (DataRowView rowView in dv)
{
DataRow row = rowView.Row;
String employee = row["EMPLOYEE_ID"].ToString();
if (count > 0)
listEmployees.Append(",");
listEmployees.Append("'").Append(employee).Append("'");
count++;
}
}
countEmployees++;
employeesid = listEmployees.ToString();
return _requestDAO.countRequestCreatedByTypeDefaulPage(employeesid);
}
Also, this is my query in the Data Access Object:
public List<GraphModel> countRequestCreatedByTypeDefaulPage(string employeesIds)
{
String sql = " select NVL(TO_CHAR(RR.REASON_NM_NEW), 'Total') as SERIES1, count(*) AS VAL" +
" from REQUEST R, REQUEST_PERSON RP, REQUEST_REASON RR " +
" WHERE R.STATUS IN ('CREATED', 'PENDING APPROVAL', 'APPROVED BY MANAGER', 'APPROVED', 'IN PROCESS') " +
" AND R.REQUEST_ID = RP.REQUEST_ID" +
" AND RP.REQUEST_ROLE = 'REQUESTOR' " +
" AND RR.REASON_ID = R.REASON_ID" +
" AND RP.EMPLOYEE_ID IN (" + employeesIds + ") " +
" group by rollup (RR.REASON_NM_NEW) " +
" ORDER BY count(*) DESC";
OracleCommand cmd = new OracleCommand(sql);
try
{
DataTable dataTable = Data_base_Access.executeSQL(cmd, ConfigurationManager.ConnectionStrings["stage"].ToString());
return (GraphModel.convertToList(dataTable));
}
catch (Exception ex)
{
Log.writeError("Request DAO", ex);
throw new DataAccessException("There was an error counting the open requests");
}
}
Also, the query result is converted to a list of GraphModel here:
public static List<GraphModel> convertToList(System.Data.DataTable dataTable)
{
List<GraphModel> list = new List<GraphModel>();
foreach (DataRow dtRow in dataTable.Rows)
{
list.Add(convertToGraphModel(dtRow));
}
return list;
}
public static GraphModel convertToGraphModel(DataRow dtRow)
{
GraphModel graphModel = new GraphModel();
if (dtRow.Table.Columns.Contains("SERIES1") && dtRow["SERIES1"] != DBNull.Value)
{
graphModel.SERIES1 = Convert.ToString(dtRow["SERIES1"]);
}
if (dtRow.Table.Columns.Contains("SERIES2") && dtRow["SERIES2"] != DBNull.Value)
{
graphModel.SERIES2 = Convert.ToString(dtRow["SERIES2"]);
}
if (dtRow.Table.Columns.Contains("VAL") && dtRow["VAL"] != DBNull.Value)
{
graphModel.VAL = Convert.ToInt32(dtRow["VAL"]);
}
return graphModel;
}
}
I really appreciate your help; I have researched a lot and I don't know what else I can do.
Split the list into 1000-item lists and change the query into this:
" AND (RP.EMPLOYEE_ID IN (" + ids_1_1000 + ") OR RP.EMPLOYEE_ID IN (" + ids_1001_2000 + "))" +
One of the features I love most about Oracle is the Oracle Call Interface (OCI), which lets you access some of the more powerful features of Oracle from programming languages. In particular, for this example, the ability to do bulk inserts should prove very helpful.
If, instead of the approach you have above, which is trying to insert thousands of literals into a single SQL statement, you put those values into a table and do a join, I think you will:
Spare the shared pool from having to compile a really nasty SQL statement
Eliminate the need to escape strings or worry about SQL Injection
Have a lightning fast query that substitutes a join (bread and butter for a database) for a giant in-list
Step 1: Create a GTT:
create global temporary table employee_list (
employee_id varchar2(100) not null
) on commit preserve rows;
The GTT is based on a session, so even if you have this code running in multiple instances, each GTT will act as a blank slate for each instance -- no possibility of collisions of data.
Step 2: Within your code, create a transaction to handle the fact that you need the insert to the table and the select on the data to occur as part of the same transaction:
OracleTransaction trans = conn.BeginTransaction(IsolationLevel.ReadCommitted);
Step 3: Use ODP.NET's bulk insert capabilities to insert all of your employee IDs at once. I encourage you to benchmark this versus inserting them one at a time; you'll be amazed. If you have more than 50,000, then maybe you need to break them up into chunks, but with a single field, I think this should be more than adequate:
// string[] employeesIds
OracleCommand cmd = new OracleCommand("insert into employee_list values (:EMPLOYEE)",
conn);
cmd.Transaction = trans;
cmd.Parameters.Add(new OracleParameter("EMPLOYEE", OracleDbType.Varchar2));
cmd.Parameters[0].Value = employeesIds;
cmd.ArrayBindCount = employeesIds.Length;
cmd.ExecuteNonQuery();
Note employeeIds should be an array.
Step 4: Change your SQL from an in-list to a join:
select NVL(TO_CHAR(RR.REASON_NM_NEW), 'Total') as SERIES1, count(*) AS VAL
from
REQUEST R,
REQUEST_PERSON RP,
REQUEST_REASON RR,
employee_list e -- added this
WHERE R.STATUS IN ('CREATED', 'PENDING APPROVAL', 'APPROVED BY MANAGER',
'APPROVED', 'IN PROCESS')
AND R.REQUEST_ID = RP.REQUEST_ID
AND RP.REQUEST_ROLE = 'REQUESTOR'
AND RR.REASON_ID = R.REASON_ID
AND RP.EMPLOYEE_ID = e.employee_id -- changed this
group by rollup (RR.REASON_NM_NEW)
ORDER BY count(*) DESC
And here's what it would all look like together:
public List<GraphModel> countRequestCreatedByTypeDefaulPage(string[] employeesIds)
{
OracleTransaction trans = conn.BeginTransaction(IsolationLevel.ReadCommitted);
OracleCommand cmd = new OracleCommand("insert into employee_list values (:EMPLOYEE)",
conn);
cmd.Transaction = trans;
cmd.Parameters.Add(new OracleParameter("EMPLOYEE", OracleDbType.Varchar2));
cmd.Parameters[0].Value = employeesIds;
cmd.ArrayBindCount = employeesIds.Length;
cmd.ExecuteNonQuery();
String sql = ""; // code from above goes here
cmd = new OracleCommand(sql, conn);
cmd.Transaction = trans;
DataTable dataTable = null;
try
{
dataTable = Data_base_Access.executeSQL(cmd,
ConfigurationManager.ConnectionStrings["stage"].ToString());
return (GraphModel.convertToList(dataTable));
}
catch (Exception ex)
{
Log.writeError("Request DAO", ex);
throw new DataAccessException("There was an error counting the open requests");
}
finally
{
trans.Rollback();
}
}
I resolved my problem with this code:
public List<GraphModel> countRequestCreatedByTypeDefaulPage(int year, int month, String employeeID)
{
int count = 0;
int countEmployees = 0;
Dictionary<string, int> dataChart = new Dictionary<string, int>();
DataView dv = _employeeOverrideBO.getRelatedEmployeesRequests(year, month, employeeID);
StringBuilder listEmployees = new StringBuilder();
foreach (DataRowView rowView in dv)
{
if (countEmployees == 500)
{
List<GraphModel> listReturn = _requestDAO.countRequestCreatedByTypeDefaulPage(listEmployees.ToString());
foreach(GraphModel model in listReturn){
if (dataChart.ContainsKey(model.SERIES1))
{
dataChart[model.SERIES1] = dataChart[model.SERIES1] + model.VAL;
}
else
{
dataChart[model.SERIES1] = model.VAL;
}
}
listEmployees = new StringBuilder();
count = 0;
countEmployees = 0;
}
DataRow row = rowView.Row;
String employee = row["EMPLOYEE_ID"].ToString();
if (count > 0)
listEmployees.Append(",");
listEmployees.Append("'").Append(employee).Append("'");
count++;
countEmployees++;
}
//Last Call
List<GraphModel> listReturnLast = _requestDAO.countRequestCreatedByTypeDefaulPage(listEmployees.ToString());
foreach (GraphModel model in listReturnLast) {
if (dataChart.ContainsKey(model.SERIES1))
{
dataChart[model.SERIES1] = dataChart[model.SERIES1] + model.VAL;
}
else
{
dataChart[model.SERIES1] = model.VAL;
}
}
List<GraphModel> list = new List<GraphModel>();
foreach (KeyValuePair<string, int> entry in dataChart)
{
GraphModel model = new GraphModel();
model.SERIES1 = entry.Key;
model.VAL = entry.Value;
list.Add(model);
}
return list;
}
This is a bit weird to me.
I have the following code:
public static object[,] dbMultipleSingleQueries(string strDBPath, object[] oSQL)
{
//This is needed for attaching a DB.
object[,] objOut;
objOut = new Object[1, 1];
try
{
//Prepare DB connection string
strDBPath = "Data Source=" +
strDBPath +
";Version=3";
//SQL Objs
SQLiteConnection dbConnection = new SQLiteConnection(strDBPath);
SQLiteCommand dbCommand = null;
SQLiteDataAdapter dbAdapter = null;
DataTable table = new DataTable();
//For looping
int j = 0;
//Conn str
dbConnection.Open();
dbCommand = new SQLiteCommand(dbConnection);
while (j < oSQL.Length)
{
try
{
dbCommand.CommandText = oSQL[j].ToString();
dbCommand.CommandType = System.Data.CommandType.Text;
if (j == (oSQL.Length - 1))
{
dbAdapter = new SQLiteDataAdapter(dbCommand);
dbAdapter.Fill(table);
objOut = table2Array(table);
}
else
{
dbCommand.ExecuteNonQuery();
objOut[0, 0] = "Success!";
}
}
catch (Exception e2)
{
System.Console.WriteLine(e2.StackTrace.ToString());
objOut[0, 0] = e2.StackTrace.ToString();
}
j++;
};
dbConnection.Close();
dbConnection = null;
}
catch (Exception e)
{
objOut[0, 0] = e.StackTrace.ToString();
}
return objOut;
}
which works perfectly if I feed my arguments from a spreadsheet like this:
o = Application.Run("dbMultipleSingleQueries", "20130201.db", [test])
where [test] is a range in my workbook with queries that work and look something like this:
attach database 'a.db' as d1
select * from d1.main
Now, if I do the following in VBA:
Option Explicit
Sub Main()
Dim o
Dim sql(1 To 2, 1 To 1)
sql(1, 1) = "attach database 'a.db' as d1"
sql(2, 1) = "select * from d1.main"
o = Application.Run("dbMultipleSingleQueries", "20130201.db", sql)
End Sub
This does not work; I get a "Type Mismatch" error. Any input on why would be a huge help, as this is driving me nuts!
Your function prototype specifies that the second argument needs to be of type Object[], but you are passing a Variant that, based on the assignments, probably marshals as a two-dimensional string array, hence the type mismatch.
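One way to cope with that on the C# side, as a sketch only (whether Application.Run then resolves to this overload depends on how the function is registered), is to accept the two-dimensional Variant array and flatten it before calling the existing method:
using System.Collections.Generic;

// Sketch: accept the 2-D array VBA sends (1-based bounds) and flatten its
// first column before calling the existing object[] overload.
public static object[,] dbMultipleSingleQueries(string strDBPath, object[,] oSQL)
{
    var queries = new List<object>();
    for (int r = oSQL.GetLowerBound(0); r <= oSQL.GetUpperBound(0); r++)
    {
        queries.Add(oSQL[r, oSQL.GetLowerBound(1)]);
    }
    return dbMultipleSingleQueries(strDBPath, queries.ToArray());
}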