I am trying to create a function that will connect to an external database and run a query. When I run my function I am getting this error:
Data access is not allowed in this context. Either the context is a function or method not marked with DataAccessKind.Read or SystemDataAccessKind.Read, is a callback to obtain data from FillRow method of a Table Valued Function, or is a UDT validation method.
I don't think that I am doing anything strange, but here is my code. Please let me know if you spot something odd. I really don't know what else to try.
[Microsoft.SqlServer.Server.SqlFunction]
public static SqlString createFile()
{
string theQuery = "SELECT * FROM A_TABLE;";
string theConnection = "Data Source=serverName;Initial Catalog=DatabaseBane;Persist Security Info=True;User ID=login;Password=thePassword";
SqlConnection DBConnect = new SqlConnection(theConnection);
try
{
//My code is breaking here************************************
DBConnect.Open();
}
catch (Exception e)
{
return "Happening in the connect: " + e.Message;
}
SqlDataAdapter dataAdapter = new SqlDataAdapter(theQuery, DBConnect);
DataTable HRNdata = new DataTable();
dataAdapter.Fill(HRNdata);
FileStream stream = new FileStream(@"C:\TestFiles\demo.xls", FileMode.OpenOrCreate);
ExcelWriter writer = new ExcelWriter(stream);
writer.BeginWrite();
Dictionary<string, int> noteDict = new Dictionary<string, int>();
foreach (DataRow r in HRNdata.Rows)
{
try
{
noteDict.Add(r["Note"].ToString(), 1);
}
catch
{
noteDict[r["Note"].ToString()] += 1;
}
}
int counter = 1;
foreach (KeyValuePair<string, int> pair in noteDict)
{
writer.WriteCell(1, counter, pair.Key);
writer.WriteCell(2, counter, pair.Value);
counter++;
}
writer.EndWrite();
stream.Close();
try
{
DBConnect.Close();
}
catch (Exception e)
{
return e.Message;
}
return "";
}
You will need to add an attribute to your method along the lines of DataAccessKind.Read.
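A minimal sketch of what that looks like (only the attribute changes; the body stays as you have it). Note that if the target really is another server, as your connection string suggests, the assembly may additionally need to be cataloged with EXTERNAL_ACCESS permission, which is a separate requirement from the attribute:

[Microsoft.SqlServer.Server.SqlFunction(DataAccessKind = DataAccessKind.Read)]
public static SqlString createFile()
{
    // ... existing body unchanged. DataAccessKind.Read tells SQL Server this
    // function performs data access, which is what the error is complaining about.
}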
So this is a little bit code-ception-like.
I have a function that checks the last ID in a table, and this function is called within another function. At the end of that outer function, another function opens a second data reader.
Error:
There is already an open DataReader associated with this connection which must be closed first.
getLastIdFromDB()
public string getLastIdFromDB()
{
int lastIndex;
string lastID ="";
var dbCon = DB_connect.Instance();
if (dbCon.IsConnect())
{
MySqlCommand cmd2 = new MySqlCommand("SELECT ID FROM `competitor`", dbCon.Connection);
try
{
MySqlDataReader reader = cmd2.ExecuteReader();
while (reader.Read())
{
string item = reader["ID"].ToString();
lastIndex = int.Parse(item);
lastIndex++;
lastID = lastIndex.ToString();
}
}
catch (Exception ex)
{
MessageBox.Show("Error:" + ex.Message);
}
}
return lastID;
}
This function is later-on used in this function:
private void addPlayerBtn_Click(object sender, EventArgs e)
{
ListViewItem lvi = new ListViewItem(getLastIdFromDB());
.........................................^
... HERE
...
... irrelevant code removed
.........................................
var dbCon = DB_connect.Instance();
if (dbCon.IsConnect())
{
MySqlCommand cmd = new MySqlCommand("INSERT INTO `competitor`(`ID`, `Name`, `Age`) VALUES(#idSql,#NameSql,#AgeSql)", dbCon.Connection);
cmd.Parameters.AddWithValue("#idSql", getLastIdFromDB());
cmd.Parameters.AddWithValue("#NameSql", playerName.Text);
cmd.Parameters.AddWithValue("#AgeSql", playerAge.Text);
try
{
cmd.ExecuteNonQuery();
listView1.Items.Clear();
}
catch (Exception ex)
{
MessageBox.Show("Error:" + ex.Message);
dbCon.Connection.Close();
}
finally
{
updateListView();
}
}
}
What would be the best way for me to solve this problem and in the future be sure to close my connections properly?
UPDATE: (per request, included DB_connect)
class DB_connect
{
private DB_connect()
{
}
private string databaseName = "simhopp";
public string DatabaseName
{
get { return databaseName; }
set { databaseName = value; }
}
public string Password { get; set; }
private MySqlConnection connection = null;
public MySqlConnection Connection
{
get { return connection; }
}
private static DB_connect _instance = null;
public static DB_connect Instance()
{
if (_instance == null)
_instance = new DB_connect();
return _instance;
}
public bool IsConnect()
{
bool result = true;
try
{
if (Connection == null)
{
if (String.IsNullOrEmpty(databaseName))
result = false;
string connstring = string.Format("Server=localhost; database={0}; UID=root;", databaseName);
connection = new MySqlConnection(connstring);
connection.Open();
result = true;
}
}
catch (Exception ex)
{
Console.Write("Error: " + ex.Message);
}
return result;
}
public void Close()
{
connection.Close();
}
}
You are trying to have multiple open readers on the same connection. This is commonly called "MARS" (multiple active result sets). MySql seems to have no support for it.
You will have to either limit yourself to one open reader at a time, or use more than one connection, so you can have one connection for each reader.
My suggestion would be to throw away that singleton-like thingy and instead use connection pooling and proper using blocks.
As suggested by Pikoh in the comments, using the using clause indeed solved it for me.
Working code-snippet:
getLastIdFromDB
using (MySqlDataReader reader2 = cmd2.ExecuteReader()) {
while (reader2.Read())
{
string item = reader2["ID"].ToString();
lastIndex = int.Parse(item);
lastIndex++;
lastID = lastIndex.ToString();
}
}
Your connection handling here is not good. You need to ditch the DB_connect. No need to maintain a single connection - just open and close the connection each time you need it. Under the covers, ADO.NET will "pool" the connection for you, so that you don't actually have to wait to reconnect.
For any object that implements IDisposable you need to either call .Dispose() on it in a finally block, or wrap it in a using statement. That ensures your resources are properly disposed of. I recommend the using statement, because it helps keep the scope clear.
Your naming conventions should conform to C# standards. Methods that return a boolean should be named like IsConnected, not IsConnect. addPlayerBtn_Click should be AddPlayerButton_Click. getLastIdFromDB should be GetLastIdFromDb or GetLastIdFromDatabase.
public string GetLastIdFromDatabase()
{
int lastIndex;
string lastID ="";
using (var connection = new MySqlConnection(Configuration.ConnectionString))
using (var command = new MySqlCommand("query", connection))
{
connection.Open();
MySqlDataReader reader = command.ExecuteReader();
while (reader.Read())
{
string item = reader["ID"].ToString();
lastIndex = int.Parse(item);
lastIndex++;
lastID = lastIndex.ToString();
}
}
return lastID;
}
Note, your query is bad too. I suspect you're using a string data type instead of a number, even though your IDs are number-based. You should switch your column to a numeric data type, then select the max() value. Or use an auto-incrementing column or sequence to get the next ID. Reading every single row to determine the next ID and incrementing a counter is not good.
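For illustration, a sketch of the max()-based approach (Configuration.ConnectionString and the table name are reused from the code above, so treat them as assumptions; this also assumes the ID column has been switched to a numeric type):

public long GetNextIdFromDatabase()
{
    using (var connection = new MySqlConnection(Configuration.ConnectionString))
    using (var command = new MySqlCommand("SELECT COALESCE(MAX(ID), 0) + 1 FROM competitor", connection))
    {
        connection.Open();
        // ExecuteScalar returns the single value the query produces
        return Convert.ToInt64(command.ExecuteScalar());
    }
}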
My database is SQL Server and I use LINQ to SQL. I call a stored procedure (SaveCards).
I put a breakpoint in my code; when it arrives at rdr = cmm.ExecuteReader(); I get an exception.
private void btnSave_Click(object sender, EventArgs e)
{
PersianCalendar jc = new PersianCalendar();
string SaveDate = jc.GetYear(DateTime.Now).ToString();
int from = Convert.ToInt32(txt_barcode_f.Text);
int to = Convert.ToInt32(txt_barcode_t.Text);
int quantity = Convert.ToInt32(to - from);
int card_Type_ID = Convert.ToInt32(cmb_BracodeType.SelectedValue);
int[] arrCardNum = new int[quantity];
arrCardNum[0]=from;
for (int i = from; i < to;i++ )
{
for(int j=0; j<quantity ;j++)
{
arrCardNum[j]=from+j;
int r = arrCardNum[j];
sp.SaveCards(r, 2, card_Type_ID, SaveDate, 2);
}
}
}
public void SaveCards(int Barcode_Num, int Card_Status_ID, int Card_Type_ID, string Save_Date, int Save_User_ID)
{
IDbCommand cmm;
cmm = Linq.Connection.CreateCommand();
try
{
cmm.Parameters.Add(new SqlParameter("#Barcode_Num", Barcode_Num));
cmm.Parameters.Add(new SqlParameter("#Card_Status_ID", 2));
cmm.Parameters.Add(new SqlParameter("#Card_Type_ID", Card_Type_ID));
cmm.Parameters.Add(new SqlParameter("#SaveDate", Save_Date));
cmm.Parameters.Add(new SqlParameter("#Save_User_ID", Save_User_ID));
cmm.CommandText = "SaveCards";
cmm.Connection.Open();
cmm.Connection = Linq.Connection;
cmm.CommandType = CommandType.StoredProcedure;
IDataReader rdr = null;
rdr = cmm.ExecuteReader(); // <-- exception is thrown here
while (rdr.Read())
{
Console.Write(" All Insert ! " + "\n");
}
}
catch (SqlException ex)
{
SqlExceptionHandler(ex, Save_User_ID);
}
catch (Exception ex)
{
PopularEexceptionHandler(ex, Save_User_ID);
}
finally
{ cmm.Connection.Close(); }
}
When I execute the SP, it shows no result and displays this:
The INSERT statement conflicted with the CHECK constraint "CK_BarCode_Num". The conflict occurred in database "Parking", table "dbo.TBL_Cards", column 'BarCode_Num'. The statement has been terminated. No rows affected. (0 row(s) returned) @RETURN_VALUE = -6
You're a bit chaotic in how you set up your connection and command, e.g. you open the connection before you even assign it! How is that going to work?
My recommendation would be this order (based on the principle of doing all the setup first, opening the connection as late as possible, and closing it as quickly as possible):
IDbCommand cmm = Linq.Connection.CreateCommand();
try
{
// define name and type of command
cmm.CommandText = "SaveCards";
cmm.CommandType = CommandType.StoredProcedure;
// assign connection
cmm.Connection = Linq.Connection;
// define parameters
cmm.Parameters.Add(new SqlParameter("#Barcode_Num", Barcode_Num));
cmm.Parameters.Add(new SqlParameter("#Card_Status_ID", 2));
cmm.Parameters.Add(new SqlParameter("#Card_Type_ID", Card_Type_ID));
cmm.Parameters.Add(new SqlParameter("#SaveDate", Save_Date));
cmm.Parameters.Add(new SqlParameter("#Save_User_ID", Save_User_ID));
// only now - after all the setup - open the connection, read the data
cmm.Connection.Open();
IDataReader rdr = cmm.ExecuteReader();
while (rdr.Read())
{
....
}
}
To catch the errors that the execution throws, replace your catch block for the SqlException with this:
catch (SqlException ex)
{
StringBuilder sbErrors = new StringBuilder();
foreach (SqlError error in ex.Errors)
{
sbErrors.AppendLine(error.Message);
}
string allErrors = sbErrors.ToString();
}
and debug into that catch block - what is the allErrors string in the end??
Update: after a chat session, we finally know what the message in the SQL exception is:
The INSERT statement conflicted with the CHECK constraint "CK_BarCode_Num". The conflict occurred in database "Parking", table "dbo.TBL_Cards", column 'BarCode_Num'.
Now we're trying to find out what that constraint is / does and why it gets violated....
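If it helps, here is a small sketch for pulling the constraint's definition so you can see which barcode values it allows (the connection string is an assumption for illustration; the constraint name is taken from the error message):

using (var connection = new SqlConnection("Data Source=.;Initial Catalog=Parking;Integrated Security=True")) // assumed
using (var command = new SqlCommand(
    "SELECT definition FROM sys.check_constraints WHERE name = 'CK_BarCode_Num'", connection))
{
    connection.Open();
    // Prints the CHECK expression, e.g. something like ([BarCode_Num] >= ... AND [BarCode_Num] <= ...)
    Console.WriteLine((string)command.ExecuteScalar());
}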
I think it could be that you have opened the connection and are then assigning another connection before executing the reader.
cmm.CommandText = "SaveCards";
//cmm.Connection.Open(); You should open the connection after assigning it
cmm.Connection = Linq.Connection;
cmm.CommandType = CommandType.StoredProcedure;
cmm.Connection.Open(); //Open it here
IDataReader rdr = cmm.ExecuteReader();
I have an Access DB, and one of my functions (C#.NET) needs to execute a SQL statement more than 4000 times within a transaction.
It seems that after execution the DB file stays opened exclusively, because there is a *.ldb file and it stays there for a long time.
Is that caused by disposing of resources incorrectly?
private int AmendUniqueData(Trans trn)
{
int reslt = 0;
foreach (DataRow dr in _dt.Rows)
{
OleDbParameter[] _params = {
new OleDbParameter("@templateId", dr["Id"].ToString()),
new OleDbParameter("@templateNumber", dr["templateNumber"].ToString())
};
string sqlUpdateUnique = "UPDATE " + dr["proformaNo"].ToString().Substring(0,2) + "_unique SET templateId = @templateId WHERE templateNumber=@templateNumber";
reslt = OleDBHelper.ExecSqlWithTran(sqlUpdateUnique, trn, _params);
if (reslt < 0)
{
throw new Exception(dr["id"].ToString());
}
}
return reslt;
}
the transaction:
using (Trans trn = new Trans())
{
try
{
int reslt=AmendUniqueData(trn);
trn.Commit();
return reslt;
}
catch
{
trn.RollBack();
throw;
}
finally
{
trn.Close();
}
}
You are forgetting to close the database connection.
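A minimal sketch of how that could look with the connection wrapped in a using block (the provider, file path, table, and query are assumptions for illustration, not taken from your helper classes):

string accessConnectionString =
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb"; // assumed path/provider

using (var connection = new OleDbConnection(accessConnectionString))
{
    connection.Open();
    using (var transaction = connection.BeginTransaction())
    using (var command = new OleDbCommand(
        "UPDATE AB_unique SET templateId = ? WHERE templateNumber = ?", connection, transaction))
    {
        // OleDb binds parameters by position; the names are only for readability
        command.Parameters.AddWithValue("@templateId", "T1");
        command.Parameters.AddWithValue("@templateNumber", "N1");
        command.ExecuteNonQuery();
        transaction.Commit();
    }
} // disposing the connection closes it, and Access then removes the .ldb lock file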
I've written a small console app that I point to a folder containing DBF/FoxPro files.
It then creates a table in SQL based on each DBF table, then does a bulk copy to insert the data into SQL. It works quite well for the most part, except for a few snags..
1) Some of the FoxPro tables contain 5,000,000+ records and the connection expires before the insert completes..
Here is my connection string:
<add name="SQL" connectionString="data source=source_source;persist security info=True;user id=DBFToSQL;password=DBFToSQL;Connection Timeout=20000;Max Pool Size=200" providerName="System.Data.SqlClient" />
Error message:
"Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
CODE:
using (SqlConnection SQLConn = new SqlConnection(SQLString))
using (OleDbConnection FPConn = new OleDbConnection(FoxString))
{
ServerConnection srvConn = new Microsoft.SqlServer.Management.Common.ServerConnection(SQLConn);
try
{
FPConn.Open();
string dataString = String.Format("Select * from {0}", tableName);
using (OleDbCommand Command = new OleDbCommand(dataString, FPConn))
using (OleDbDataReader Reader = Command.ExecuteReader(CommandBehavior.SequentialAccess))
{
tbl = new Table(database, tableName, "schema");
for (int i = 0; i < Reader.FieldCount; i++)
{
col = new Column(tbl, Reader.GetName(i), ConvertTypeToDataType(Reader.GetFieldType(i)));
col.Nullable = true;
tbl.Columns.Add(col);
}
tbl.Create();
BulkCopy(Reader, tableName);
}
}
catch (Exception ex)
{
// LogText(ex, @"C:\LoadTable_Errors.txt", tableName);
throw ex;
}
finally
{
SQLConn.Close();
srvConn.Disconnect();
}
}
private DataType ConvertTypeToDataType(Type type)
{
switch (type.ToString())
{
case "System.Decimal":
return DataType.Decimal(18, 38);
case "System.String":
return DataType.NVarCharMax;
case "System.Int32":
return DataType.Int;
case "System.DateTime":
return DataType.DateTime;
case "System.Boolean":
return DataType.Bit;
default:
throw new NotImplementedException("ConvertTypeToDataType Not implemented for type : " + type.ToString());
}
}
private void BulkCopy(OleDbDataReader reader, string tableName)
{
using (SqlConnection SQLConn = new SqlConnection(SQLString))
{
SQLConn.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(SQLConn);
bulkCopy.DestinationTableName = "schema." + tableName;
try
{
bulkCopy.WriteToServer(reader);
}
catch (Exception ex)
{
//LogText(ex, @"C:\BulkCopy_Errors.txt", tableName);
}
finally
{
SQLConn.Close();
reader.Close();
}
}
}
My 2nd & 3rd errors are the following:
I understand what the issues are, but how to rectify them i'm not so sure
2) "The provider could not determine the Decimal value. For example, the row was just created, the default for the Decimal column was not available, and the consumer had not yet set a new Decimal value."
3) SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM.
I found a result on Google that indicated what the issue is: [A]... and a possible workaround: [B] (but I'd like to keep my decimal values as decimal and dates as date, as I'll be doing further calculations against the data).
What I'm wanting to do as a solution:
1) Either increase the connection time (but I don't think I can increase it any more than I have), or alternatively, is it possible to split the OleDbDataReader's results and do an incremental bulk insert?
2) Is it possible to have the bulk copy ignore results with errors, or to have the records that error out logged to a CSV file or something to that extent?
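For reference on the timeout part of 1): the connection-string Connection Timeout only covers establishing the connection; the copy itself is governed by SqlBulkCopy's own properties. A sketch, reusing SQLString and tableName from the code above (so treat those names as assumptions):

private void BulkCopy(OleDbDataReader reader, string tableName)
{
    using (SqlConnection SQLConn = new SqlConnection(SQLString))
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(SQLConn))
    {
        SQLConn.Open();
        bulkCopy.DestinationTableName = "schema." + tableName;
        bulkCopy.BulkCopyTimeout = 0;   // 0 = no time limit on the copy itself
        bulkCopy.BatchSize = 10000;     // send rows to the server in chunks
        bulkCopy.NotifyAfter = 10000;   // optional progress reporting
        bulkCopy.SqlRowsCopied += (s, e) => Console.WriteLine(e.RowsCopied + " rows copied");
        bulkCopy.WriteToServer(reader); // streams straight from the reader, no DataTable in memory
    }
}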
So where you do the "for" statement I would probably break it up to take so many at a time :
int i = 0;
int MaxCount = 1000;
while (i < Reader.FieldCount)
{
var tbl = new Table(database, tableName, "schema");
for (int j = i; j < MaxCount; j++)
{
col = new Column(tbl, Reader.GetName(j), ConvertTypeToDataType(Reader.GetFieldType(j)));
col.Nullable = true;
tbl.Columns.Add(col);
i++;
}
tbl.Create();
BulkCopy(Reader, tableName);
}
So, "i" keeps track of the overall count, "j" keeps track of the incremental count (ie your max at one time count) and when you have created your 'batch', you create the table and Bulk Copy it.
Does that look like what you would expect?
Cheers,
Chris.
This is my current attempt at the bulk copy method. It works for about 90% of the tables, but I get an OutOfMemory exception with the bigger tables... I'd like to split the reader's data into smaller sections, without having to pass it into a DataTable and store it in memory first (which is the cause of the OutOfMemory exception on the bigger result sets).
UPDATE
I modified the code below to show how it looks in my solution. It ain't pretty, but it works. I'll definitely do some refactoring, and update my answer again.
private void BulkCopy(OleDbDataReader reader, string tableName, Table table)
{
Console.WriteLine(tableName + " BulkCopy Started.");
try
{
DataTable tbl = new DataTable();
List<Type> typeList = new List<Type>();
foreach (Column col in table.Columns)
{
tbl.Columns.Add(col.Name, ConvertDataTypeToType(col.DataType));
typeList.Add(ConvertDataTypeToType(col.DataType));
}
int batch = 1;
int counter = 0;
DataRow tblRow = tbl.NewRow();
while (reader.Read())
{
counter++;
int colcounter = 0;
foreach (Column col in table.Columns)
{
try
{
tblRow[colcounter] = reader[colcounter];
}
catch (Exception)
{
tblRow[colcounter] = GetDefault(typeList[colcounter]);
}
colcounter++;
}
tbl.LoadDataRow(tblRow.ItemArray, true);
if (counter == BulkInsertIncrement)
{
Console.WriteLine(tableName + " :: Batch >> " + batch);
counter = PerformInsert(tableName, tbl, batch);
batch++;
}
}
if (counter > 0)
{
Console.WriteLine(tableName + " :: Batch >> " + batch);
PerformInsert(tableName, tbl, counter);
}
tbl = null;
Console.WriteLine("BulkCopy Success!");
}
catch (Exception ex)
{
Console.WriteLine("BulkCopy Fail!");
SharedLogger.Write(ex, #"C:\BulkCopy_Errors.txt", tableName);
Console.WriteLine(ex.Message);
}
finally
{
reader.Close();
reader.Dispose();
}
Console.WriteLine(tableName + " BulkCopy Ended.");
Console.WriteLine("*****");
Console.WriteLine("");
}
I am getting data from a MySQL database and inserting it into another system. If a column has data in an incorrect format, I log this and go to the next row in the DataTable. That works as expected, but now, if a search function in my method that fetches some additional data fails, I want to immediately log this and go to the next row. As it is now I just log it, but the row still gets inserted (without the value that didn't meet the search criteria).
My code:
private void updateCustomer()
{
MySqlConnection connection = new MySqlConnection("server=myServer;database=myDatabase;uid=myID;password=myPass");
MySqlCommand command = new MySqlCommand(#"mySelectCommand", connection);
DataTable customerTbl = new DataTable();
MySqlDataReader reader;
try
{
connection.Open();
reader = command.ExecuteReader();
if (reader.HasRows)
{
customerTbl.Load(reader);
}
reader.Close();
}
catch (Exception ex)
{
_out.error("Could not connect to mySql database");
}
finally
{
connection.Close();
}
foreach (DataRow row in customerTbl.Rows)
{
// Declare the customer variables
string customerID = Encoding.ASCII.GetString((byte[])row["Customer ID"]);
string ChildOf = row["Child of"].ToString();
// Create the customer object
Customer customer = new Customer();
customer.entityId = customerID;
if (ChildOf != "")
{
RecordRef parentRef = new RecordRef();
try
{
parentRef.internalId = searchCustomer(ChildOf);
}
catch
{
// If it fails here I want to log the customerID and then go to the next row in the datatable (could not find the internalid for ChildOf
_out.error(customerID + " was not updated. Error: invalid format Parent string");
}
finally
{
parentRef.typeSpecified = false;
customer.parent = parentRef;
}
}
// Invoke update() operation
WriteResponse response = _service.update(customer);
// Process the response
if (response.status.isSuccess)
{
}
else
{
_out.error(customerID + " was not updated. Error: " + getStatusDetails(response.status));
}
}
}
You need to remove the row in the catch block, and change the foreach loop to a backwards for loop to handle the removals.
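A sketch of that suggestion (variable names reused from the code above; the inner logic is elided):

// Iterate backwards so removing a row does not shift the indexes still to be visited.
for (int i = customerTbl.Rows.Count - 1; i >= 0; i--)
{
    DataRow row = customerTbl.Rows[i];
    string customerID = Encoding.ASCII.GetString((byte[])row["Customer ID"]);
    try
    {
        // ... build the Customer object and call searchCustomer(ChildOf) here
    }
    catch
    {
        _out.error(customerID + " was not updated. Error: invalid format Parent string");
        customerTbl.Rows.RemoveAt(i);   // drop the failed row so it is never sent to update()
        continue;
    }
    // ... _service.update(customer) for the rows that survived
}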
I realized that I want to log the other failed fields as well. Maybe it's an inefficient way but I did something like:
bool findParent = true;
if (ChildOf != "")
{
try
{
RecordRef parentRef = new RecordRef();
parentRef.internalId = searchCustomer(ChildOf);
parentRef.typeSpecified = false;
customer.parent = parentRef;
}
catch
{
findParent = false;
_out.error(customerID + " was not inserted. Error: invalid format Parent string");
}
}
And then an if statement before trying to insert:
if (findPartner == true && findParent == true)
{
response = _service.add(customer);
// Process the response
if (response.status.isSuccess)
{
}
else
{
_out.error(customerID + " was not inserted. Error: " + getStatusDetails(response.status));
}
}
else
{
//_out.error(customerID + " was not updated. Error: " + getStatusDetails(response.status));
}
Use the row.HasErrors property.
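A minimal sketch of that approach, assuming the loop from the question: flag the row when the lookup fails, then skip flagged rows before calling update().

catch
{
    // SetColumnError marks the row, which makes row.HasErrors return true
    row.SetColumnError("Child of", "invalid format Parent string");
    _out.error(customerID + " was not updated. Error: invalid format Parent string");
}
// ... later in the same loop iteration, before the update call:
if (row.HasErrors)
{
    continue;   // skip the update for this row and move on to the next one
}
WriteResponse response = _service.update(customer);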