I have two parts to my application which do a massive amount of inserts and updates respectively, and because of poor management, there are deadlocks.
I am using entity framework to do my insert and update.
The following is my code for the TestSpool program. Its purpose is to insert x records at a given interval.
using System;
using System.Linq;
using System.Threading;
using System.Transactions;
namespace TestSpool
{
class Program
{
static void Main(string[] args)
{
using (var db = new TestEntities())
{
decimal start = 700001;
while (true)
{
using (TransactionScope scope = new TransactionScope())
{
//Random ir = new Random();
//int i = ir.Next(1, 50);
var objs = db.BidItems.Where(m => m.BidItem_Close == false);
foreach (BidItem bi in objs)
{
for (int j = 0; j <= 10; j++)
{
Transaction t = new Transaction();
t.Item_Id = bi.BidItemId;
t.User_Id = "Ghost";
t.Trans_Price = start;
t.Trans_TimeStamp = DateTime.Now;
start += 10;
db.Transactions.AddObject(t);
}
Console.WriteLine("Test Spooled for item " + bi.BidItemId.ToString() + " of " + 11 + " bids");
db.SaveChanges();
}
scope.Complete();
Thread.Sleep(5000);
}
}
}
}
}
}
The second part of the program is TestServerClass. The server class is supposed to process a huge number of transactions from TestSpool, determine the highest transaction amount, and update another table.
using System;
using System.Linq;
using System.Transactions;
public class TestServerClass
{
public void Start()
{
try
{
using (var db = new TestServer.TestEntities())
{
while (true)
{
using (TransactionScope scope = new TransactionScope())
{
var objsItems = db.BidItems.Where(m => m.BidItem_Close == false);
foreach (TestServer.BidItem bi in objsItems)
{
var trans = db.Transactions.Where(m => m.Trans_Proceesed == null && m.Item_Id == bi.BidItemId).OrderBy(m => m.Trans_TimeStamp).Take(100);
if (trans.Count() > 0)
{
var tran = trans.OrderByDescending(m => m.Trans_Price).FirstOrDefault();
// TestServer.BidItem bi = db.BidItems.FirstOrDefault(m => m.BidItemId == itemid);
if (bi != null)
{
bi.BidMinBid_LastBid_TimeStamp = tran.Trans_TimeStamp;
bi.BidMinBid_LastBidAmount = tran.Trans_Price;
bi.BidMinBid_LastBidBy = tran.User_Id;
}
foreach (var t in trans)
{
t.Trans_Proceesed = "1";
db.Transactions.ApplyCurrentValues(t);
}
db.BidItems.ApplyCurrentValues(bi);
Console.WriteLine("Processed " + trans.Count() + " bids for Item " + bi.BidItemId);
db.SaveChanges();
}
}
scope.Complete();
}
}
}
}
catch (Exception e)
{
// swallow the error and retry from the top (note: unbounded recursion)
Start();
}
}
}
However, when both applications run concurrently, a deadlock occurs pretty quickly, raised at random from either the test or the server application. How do I optimize the code on both sides to prevent deadlocks? I am expecting a huge number of inserts from the TestSpool application.
Since they work on the same data and get in each other's way, I believe the cleanest way to handle this would be to avoid executing the two at the same time.
Define a global static variable, a mutex, or a flag of some kind, perhaps in the database. Whoever starts executing raises the flag; the other one waits for the flag to come down. When the flag comes down, the other one raises it and starts executing.
To avoid long wait times in each class, you can alter both classes to process only a limited number of records per turn. You should also introduce a maximum wait time for the flag, as in the sketch below. Choose the maximum record limit carefully to ensure that each class finishes its job in less time than the maximum wait.
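For example, a minimal sketch of that flag using a system-wide named mutex (the mutex name, the 30-second timeout, and the helper itself are my assumptions, not part of the original code):
using System;
using System.Threading;
// Both TestSpool and TestServerClass wrap one cycle of work in this helper.
static void RunExclusive(Action work)
{
    // The "Global\" prefix makes the mutex visible across processes.
    using (var mutex = new Mutex(false, @"Global\BidProcessingLock"))
    {
        // Maximum wait time; tune it so each side finishes a batch well within it.
        if (!mutex.WaitOne(TimeSpan.FromSeconds(30)))
            return; // the other side is still busy: skip this cycle and retry later
        try
        {
            work(); // e.g. one TransactionScope worth of inserts or updates
        }
        finally
        {
            mutex.ReleaseMutex();
        }
    }
}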
I have the following method to show a customized selection of items in a gridview (dgJEARequests), as you can see from the screenshots. The user selects the items, which send the parameters to the whereStatement query. But when many checkboxes (items) are selected, it takes some time to load all the data into dgJEARequests, say 5,000 records or fewer. So I decided to add WaitProcessing (a form with a progress bar). When I click the search button btnSearchSelection, I get this error:
An exception of type 'System.InvalidOperationException' occurred in System.Windows.Forms.dll but was not handled in user code
Additional information: Cross-thread operation not valid: Control 'dgJEARequests' accessed from a thread other than the thread it was created on
private void ShowOnlyCustomSelection()
{
String whereStatement = "";
List<String> fieldList = new List<String>();
string getStatusName = "";
string _getStatusName = "";
//loop through all the checkboxes
for (int i = 0; i < count; i++)
{
if (_cbStatus[i].Checked)
{
//getStatusName should give all selected options like this: 'New', 'Started', 'Accepted', etc., then pass it to whereStatement
_getStatusName += ("'" + _cbStatus[i].Text + "'" + ",").TrimEnd();
}
}
//trims the last comma (,)
getStatusName = _getStatusName.TrimEnd(',');
//textBox1.Text = _getStatusName.TrimEnd(','); //--->>this is for testing
////////////
if (getStatusName == "" || getStatusName == null)
{
{
MessageBox.Show("You have not selected your filter(s)!", "Filter Result", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
}
}
else
{
//Build WHERE Statement
fieldList.Add("RequestStatus.StatusName IN (" + getStatusName + ")");
if (fieldList.Count > 0)
{
for (int x = 0; x < fieldList.Count; x++)
{
if (x == 0)
{
whereStatement = fieldList[x];
}
else
{
whereStatement = whereStatement + " AND " + fieldList[x];
}
}
//Seach for Requests
jeaRequests = itServices.getRequestsBySQLStatement(whereStatement);
//dgJEARequests.DataSource = jeaRequests;
requestList = new List<ITWebService.Requests>(jeaRequests.ToList());
dgJEARequests.DataSource = requestList;
dgJEARequests.ClearSelection();
lblCountOfRequests.Text = requestList.Count.ToString().ToString() + " requests loaded";
}
else
{
if (chkMine.Checked)
{
jeaRequests = itServices.getRequestsByAssignedToAndStatusID(JEAUser.UserName, "0");
//dgJEARequests.DataSource = jeaRequests;
requestList = new List<ITWebService.Requests>(jeaRequests.ToList());
dgJEARequests.DataSource = requestList;
dgJEARequests.ClearSelection();
}
else
{
jeaRequests = itServices.getRequestsBySQLStatement("Requests.ID > 0");
//dgJEARequests.DataSource = jeaRequests;
requestList = new List<ITWebService.Requests>(jeaRequests.ToList());
dgJEARequests.DataSource = requestList;
dgJEARequests.ClearSelection();
}
}
}
}
This is the code for the progressbar:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;
namespace JEAProjectManager
{
public partial class WaitProcessing : Form
{
public Action Worker
{
get;
set;
}
public WaitProcessing(Action worker)
{
InitializeComponent();
if (worker == null)
throw new ArgumentNullException();
Worker = worker;
}
protected override void OnLoad(EventArgs e)
{
base.OnLoad(e);
Task.Factory.StartNew(Worker).ContinueWith(t =>
{
this.Close();
},
TaskScheduler.FromCurrentSynchronizationContext());
}
}
}
This other is for when I click the search button:
private void btnSearchSelection_CheckedChanged(object sender, EventArgs e)
{
//ShowOnlyCustomSelection();
//WaitProcessing processing = new WaitProcessing();
using (WaitProcessing processing = new WaitProcessing(ShowOnlyCustomSelection))
{
processing.ShowDialog(this);
}
}
The query parameters are sent to a web service. What can I do to solve this problem? I am customizing this application; I am not the original developer, but it is my task to make it better. I know there are similar errors out there, but in different scenarios.
Screenshots (not reproduced here): doing my selection, passing parameters for the query, the ProgressBar, and the error message.
possible duplicate :-/
How to deal with cross-thread access exceptions?
you need to ensure that the accessing thread is the one the control was created on
It looks like the answer was easy, but it took me a while to figure out. I am not sure if there are better solutions, but I had to enclose the control accesses in Invoke calls with a lambda expression, like:
lblCountOfRequests.Invoke(new Action(() => lblCountOfRequests.Text = requestList.Count.ToString() + " requests loaded"));
Or
dgJEARequests.Invoke(new Action(() => dgJEARequests.DataSource = requestList));
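More generally, the standard WinForms pattern is to check InvokeRequired and marshal the whole update back to the UI thread. A sketch using the names from the question (the wrapper method itself is mine, not from the original code):
private void BindResults(List<ITWebService.Requests> requestList)
{
    if (dgJEARequests.InvokeRequired)
    {
        // We are on the worker thread: re-run this method on the UI thread.
        dgJEARequests.Invoke(new Action(() => BindResults(requestList)));
        return;
    }
    // From here on it is safe to touch the controls.
    dgJEARequests.DataSource = requestList;
    dgJEARequests.ClearSelection();
    lblCountOfRequests.Text = requestList.Count.ToString() + " requests loaded";
}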
I'm working on a utility to read through a JSON file I've been given and to transform it into SQL Server. My weapon of choice is a .NET Core console app (I'm trying to do all of my new work with .NET Core unless there is a compelling reason not to). I have the whole thing "working", but there is clearly a problem somewhere because the performance is truly horrifying, almost to the point of being unusable.
The JSON file is approximately 27MB and contains a main array of 214 elements; each of those contains a couple of fields along with an array of 150-350 records (that array has several fields and potentially a small <5-record array or two). Total records are approximately 35,000.
In the code below I've changed some names and stripped out a few of the fields to keep it more readable but all of the logic and code that does actual work is unchanged.
Keep in mind, I've done a lot of testing with the placement and number of calls to SaveChanges(), thinking initially that the number of trips to the Db was the problem. Although the version below is calling SaveChanges() once for each iteration of the 214-record loop, I've tried moving it outside of the entire looping structure and there is no discernible change in performance. In other words, with zero trips to the Db, this is still SLOW. How slow, you ask? How does > 24 hours to run hit you? I'm willing to try anything at this point and am even considering moving the whole process into SQL Server, but I would much rather work in C# than T-SQL.
static void Main(string[] args)
{
string statusMsg = String.Empty;
JArray sets = JArray.Parse(File.ReadAllText(@"C:\Users\Public\Downloads\ImportFile.json"));
try
{
using (var _db = new WidgetDb())
{
for (int s = 0; s < sets.Count; s++)
{
Console.WriteLine($"{s.ToString()}: {sets[s]["name"]}");
// First we create the Set
Set eSet = new Set()
{
SetCode = (string)sets[s]["code"],
SetName = (string)sets[s]["name"],
Type = (string)sets[s]["type"],
Block = (string)sets[s]["block"] ?? ""
};
_db.Entry(eSet).State = Microsoft.EntityFrameworkCore.EntityState.Added;
JArray widgets = sets[s]["widgets"].ToObject<JArray>();
for (int c = 0; c < widgets.Count; c++)
{
Widget eWidget = new Widget()
{
WidgetId = (string)widgets[c]["id"],
Layout = (string)widgets[c]["layout"] ?? "",
WidgetName = (string)widgets[c]["name"],
WidgetNames = "",
ReleaseDate = releaseDate, // declared in code stripped out above (see note about removed fields)
SetCode = (string)sets[s]["code"]
};
// WidgetColors
if (widgets[c]["colors"] != null)
{
JArray widgetColors = widgets[c]["colors"].ToObject<JArray>();
for (int cc = 0; cc < widgetColors.Count; cc++)
{
WidgetColor eWidgetColor = new WidgetColor()
{
WidgetId = eWidget.WidgetId,
Color = (string)widgets[c]["colors"][cc]
};
_db.Entry(eWidgetColor).State = Microsoft.EntityFrameworkCore.EntityState.Added;
}
}
// WidgetTypes
if (widgets[c]["types"] != null)
{
JArray widgetTypes = widgets[c]["types"].ToObject<JArray>();
for (int ct = 0; ct < widgetTypes.Count; ct++)
{
WidgetType eWidgetType = new WidgetType()
{
WidgetId = eWidget.WidgetId,
Type = (string)widgets[c]["types"][ct]
};
_db.Entry(eWidgetType).State = Microsoft.EntityFrameworkCore.EntityState.Added;
}
}
// WidgetVariations
if (widgets[c]["variations"] != null)
{
JArray widgetVariations = widgets[c]["variations"].ToObject<JArray>();
for (int cv = 0; cv < widgetVariations.Count; cv++)
{
WidgetVariation eWidgetVariation = new WidgetVariation()
{
WidgetId = eWidget.WidgetId,
Variation = (string)widgets[c]["variations"][cv]
};
_db.Entry(eWidgetVariation).State = Microsoft.EntityFrameworkCore.EntityState.Added;
}
}
}
_db.SaveChanges();
}
}
statusMsg = "Import Complete";
}
catch (Exception ex)
{
statusMsg = ex.Message + " (" + ex.InnerException + ")";
}
Console.WriteLine(statusMsg);
Console.ReadKey();
}
I had an issue with that kind of code: lots of loops and tons of state changes.
Every change or manipulation you make in the _db context generates change-tracking overhead, and that makes the context slower each time. Read more in the EF change-tracking documentation.
The fix for me was to create a new EF context (_db) at some key points. It saved me a few hours per run!
You could try to create a new instance of _db each iteration in this loop
contains a main array of 214 elements
If that makes no difference, try adding some Stopwatch timings to get a better idea of what/where is taking so long, as in the sketch below.
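Combining both suggestions, a rough sketch (the batch size of 20 sets is an arbitrary assumption):
var sw = System.Diagnostics.Stopwatch.StartNew();
var _db = new WidgetDb();
try
{
    for (int s = 0; s < sets.Count; s++)
    {
        // ... build and add the Set/Widget entities for sets[s] as before ...
        _db.SaveChanges();
        // Recreate the context periodically so the change tracker stays small.
        if (s % 20 == 19)
        {
            _db.Dispose();
            _db = new WidgetDb();
        }
        Console.WriteLine($"Set {s} done after {sw.ElapsedMilliseconds} ms");
    }
}
finally
{
    _db.Dispose();
}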
If you're making thousands of updates then EF is not really the way to go. Something like SqlBulkCopy will do the trick.
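A minimal SqlBulkCopy sketch (connectionString, the dbo.Widgets table name, and the column list are assumptions based on the entities above, not a drop-in replacement):
using System.Data;
using System.Data.SqlClient;
// Build rows in memory (e.g. while walking the JSON) instead of EF entities.
var table = new DataTable();
table.Columns.Add("WidgetId", typeof(string));
table.Columns.Add("WidgetName", typeof(string));
table.Columns.Add("SetCode", typeof(string));
table.Rows.Add("w-001", "Example widget", "SET1"); // sample row
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.Widgets";
    bulkCopy.BatchSize = 5000; // send rows in batches rather than one by one
    bulkCopy.WriteToServer(table);
}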
You could try the BulkWriter library.
IEnumerable<string> ReadFile(string path)
{
using (var stream = File.OpenRead(path))
using (var reader = new StreamReader(stream))
{
while (reader.Peek() >= 0)
{
yield return reader.ReadLine();
}
}
}
var items =
from line in ReadFile(@"C:\products.csv")
let values = line.Split(',')
select new Product {Sku = values[0], Name = values[1]};
then
using (var bulkWriter = new BulkWriter<Product>(connectionString)) {
bulkWriter.WriteToDatabase(items);
}
I thought I was trying to do something very simple. I just want to report a running number on the screen so the user gets the idea that the SQL Stored Procedure that I'm executing is working and that they don't get impatient and start clicking buttons.
The problem is that I can't figure out how to actually run the progress reporter alongside the ExecuteNonQueryAsync command. It gets stuck in my reporting loop and never executes the command; but if I put the loop after the async command, the command gets executed and result is then never zero, so the loop never runs.
Any thoughts, comments, ideas would be appreciated. Thank you so much!
int i = 0;
lblProcessing.Text = "Transactions " + i.ToString();
int result = 0;
while (result==0)
{
i++;
if (i % 500 == 0)
{
lblProcessing.Text = "Transactions " + i.ToString();
lblProcessing.Refresh();
}
}
// Yes - I know - the code never gets here - that is the problem!
result = await cmd.ExecuteNonQueryAsync();
The simplest way to do this is to use a second connection to monitor the progress, and report on it. Here's a little sample to get you started:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Text;
using System.Threading.Tasks;
namespace Microsoft.Samples.SqlServer
{
public class SessionStats
{
public long Reads { get; set; }
public long Writes { get; set; }
public long CpuTime { get; set; }
public long RowCount { get; set; }
public long WaitTime { get; set; }
public string LastWaitType { get; set; }
public string Status { get; set; }
public override string ToString()
{
return $"Reads {Reads}, Writes {Writes}, CPU {CpuTime}, RowCount {RowCount}, WaitTime {WaitTime}, LastWaitType {LastWaitType}, Status {Status}";
}
}
public class SqlCommandWithProgress
{
public static async Task ExecuteNonQuery(string ConnectionString, string Query, Action<SessionStats> OnProgress)
{
using (var rdr = await ExecuteReader(ConnectionString, Query, OnProgress))
{
rdr.Dispose();
}
}
public static async Task<DataTable> ExecuteDataTable(string ConnectionString, string Query, Action<SessionStats> OnProgress)
{
using (var rdr = await ExecuteReader(ConnectionString, Query, OnProgress))
{
var dt = new DataTable();
dt.Load(rdr);
return dt;
}
}
public static async Task<SqlDataReader> ExecuteReader(string ConnectionString, string Query, Action<SessionStats> OnProgress)
{
var mainCon = new SqlConnection(ConnectionString);
using (var monitorCon = new SqlConnection(ConnectionString))
{
mainCon.Open();
monitorCon.Open();
var cmd = new SqlCommand("select @@spid session_id", mainCon);
var spid = Convert.ToInt32(cmd.ExecuteScalar());
cmd = new SqlCommand(Query, mainCon);
var monitorQuery = @"
select s.reads, s.writes, r.cpu_time, s.row_count, r.wait_time, r.last_wait_type, r.status
from sys.dm_exec_requests r
join sys.dm_exec_sessions s
on r.session_id = s.session_id
where r.session_id = @session_id";
var monitorCmd = new SqlCommand(monitorQuery, monitorCon);
monitorCmd.Parameters.Add(new SqlParameter("@session_id", spid));
var queryTask = cmd.ExecuteReaderAsync( CommandBehavior.CloseConnection );
var cols = new { reads = 0, writes = 1, cpu_time = 2, row_count = 3, wait_time = 4, last_wait_type = 5, status = 6 };
while (!queryTask.IsCompleted)
{
var firstTask = await Task.WhenAny(queryTask, Task.Delay(1000));
if (firstTask == queryTask)
{
break;
}
using (var rdr = await monitorCmd.ExecuteReaderAsync())
{
await rdr.ReadAsync();
var result = new SessionStats()
{
Reads = Convert.ToInt64(rdr[cols.reads]),
Writes = Convert.ToInt64(rdr[cols.writes]),
RowCount = Convert.ToInt64(rdr[cols.row_count]),
CpuTime = Convert.ToInt64(rdr[cols.cpu_time]),
WaitTime = Convert.ToInt64(rdr[cols.wait_time]),
LastWaitType = Convert.ToString(rdr[cols.last_wait_type]),
Status = Convert.ToString(rdr[cols.status]),
};
OnProgress(result);
}
}
return queryTask.Result;
}
}
}
}
Which you would call something like this:
class Program
{
static void Main(string[] args)
{
Run().Wait();
}
static async Task Run()
{
var constr = "server=localhost;database=tempdb;integrated security=true";
var sql = @"
set nocount on;
select newid() d
into #foo
from sys.objects, sys.objects o2, sys.columns
order by newid();
select count(*) from #foo;
";
using (var rdr = await SqlCommandWithProgress.ExecuteReader(constr, sql, s => Console.WriteLine(s)))
{
if (!rdr.IsClosed)
{
while (rdr.Read())
{
Console.WriteLine("Row read");
}
}
}
Console.WriteLine("Hit any key to exit.");
Console.ReadKey();
}
}
Which outputs:
Reads 0, Writes 0, CPU 1061, RowCount 0, WaitTime 0, LastWaitType SOS_SCHEDULER_YIELD, Status running
Reads 0, Writes 0, CPU 2096, RowCount 0, WaitTime 0, LastWaitType SOS_SCHEDULER_YIELD, Status running
Reads 0, Writes 0, CPU 4553, RowCount 11043136, WaitTime 198, LastWaitType CXPACKET, Status suspended
Row read
Hit any key to exit.
You're not going to be able to get ExecuteNonQueryAsync to do what you want here. For that to work, the method would have to return its result row by row, or in chunks incremented during the SQL call, but that's not how submitting a query batch to SQL Server works, nor really how you would want it to work from an overhead perspective. You hand a SQL statement to the server, and after it has finished processing the statement, it returns the total number of rows affected.
Do you just want to let the user know that something is happening, and you don't actually need to display current progress?
If so, you could just display a ProgressBar with its Style set to Marquee.
If you want this to be a "self-contained" method, you could display the progress bar on a modal form, and include the form code in the method itself.
E.g.
public void ExecuteNonQueryWithProgress(SqlCommand cmd) {
Form f = new Form() {
Text = "Please wait...",
Size = new Size(400, 100),
StartPosition = FormStartPosition.CenterScreen,
FormBorderStyle = FormBorderStyle.FixedDialog,
MaximizeBox = false,
ControlBox = false
};
f.Controls.Add(new ProgressBar() {
Style = ProgressBarStyle.Marquee,
Dock = DockStyle.Fill
});
f.Shown += async (sender, e) => {
await cmd.ExecuteNonQueryAsync();
f.Close();
};
f.ShowDialog();
}
That is an interesting question. I have had to implement similar things in the past. In our case the priorities were to:
Keep client side responsive in case the user doesn't want to stick around and wait.
Update the user of action and progress.
What I would do is use threading to run the process in the background like:
HostingEnvironment.QueueBackgroundWorkItem(ct => FunctionThatCallsSQLandTakesTime(p, q, s));
Then, using some way to estimate the work time, I would increment a progress bar client-side on a clock. For this, query your data for a variable that has a linear relationship to the work time needed by FunctionThatCallsSQLandTakesTime.
For example: the number of active users this month drives the time FunctionThatCallsSQLandTakesTime takes; for every 10,000 users it takes 5 minutes. So you can update your progress bar accordingly, as sketched below.
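Adapting that idea to the WinForms context of the question, a rough sketch (GetActiveUserCountThisMonth and the control names are hypothetical; the 10,000-users-per-5-minutes ratio is the example figure above):
// Estimate the total duration from a driver variable, then advance on a clock.
int activeUsers = GetActiveUserCountThisMonth(); // hypothetical helper
double estimatedSeconds = activeUsers / 10000.0 * 5 * 60; // 5 min per 10,000 users
var sw = System.Diagnostics.Stopwatch.StartNew();
var timer = new System.Windows.Forms.Timer { Interval = 500 };
timer.Tick += (s, e) =>
{
    // Never show 100% until the background work actually completes.
    int percent = (int)Math.Min(99, sw.Elapsed.TotalSeconds / estimatedSeconds * 100);
    progressBar1.Value = percent;
};
timer.Start();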
I'm wondering if this might be a reasonable approach:
IAsyncResult result = cmd2.BeginExecuteNonQuery();
int count = 0;
while (!result.IsCompleted)
{
count++;
if (count % 500 == 0)
{
lblProcessing.Text = "Transactions " + count.ToString();
lblProcessing.Refresh();
}
// Wait for 1/10 second, so the counter
// does not consume all available resources
// on the main thread.
System.Threading.Thread.Sleep(100);
}
// After the loop exits, finish the call: int rows = cmd2.EndExecuteNonQuery(result);
I have a DbContext with a dataset of >20M records, that has to be converted to a different data format. Therefore, I read the data into memory, perform some tasks and then dispose the DbContext. The code works fine, but after a while I get OutOfMemoryExceptions. I have been able to narrow it down to the following piece of code, where I retrieve 2M records, then release them and fetch them again. The first retrieval works just fine, the second one throws an exception.
// first call runs fine
using (var dbContext = new CustomDbContext())
{
var list = dbContext.Items.Take(2000000).ToArray();
foreach (var item in list)
{
// perform conversion tasks...
item.Converted = true;
}
}
// second call throws exception
using (var dbContext = new CustomDbContext())
{
var list = dbContext.Items.Take(2000000).ToArray();
foreach (var item in list)
{
// perform conversion tasks...
item.Converted = true;
}
}
Shouldn't the GC automatically release all the memory allocated in the first using block, so that the second block runs as well as the first one?
In my actual code, I do not retrieve 2 million records at once, but something between 0 and 30K in each iteration. However, after about 15 minutes, I run out of memory, although all objects should have been released.
I suspect you've hit the LOH (Large Object Heap). If your objects are bigger than the threshold (85,000 bytes), they are allocated there, and the LOH is not compacted by default, so the GC doesn't help.
Try this: https://www.simple-talk.com/dotnet/.net-framework/large-object-heap-compaction-should-you-use-it/
and see if your exception goes away.
i.e. add this between the first and second parts (GCSettings lives in the System.Runtime namespace):
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();
IEnumerable has GetEnumerator(), so you could try this to avoid the .ToArray() or .ToList() calls, which aren't necessary if you just want to read:
// first call
using (var dbContext = new CustomDbContext())
{
foreach (var item in dbContext.Items.Take(2000000))
{
// perform conversion tasks...
item.Converted = true;
}
}
// second call
using (var dbContext = new CustomDbContext())
{
foreach (var item in dbContext.Items.Take(2000000))
{
// perform conversion tasks...
item.Converted = true;
}
}
Running the GC will not help you; you have to run each iteration in a different context, and dispose of each context.
// ID is your primary key
long startID = 0;
while(true){
using(var db = new CustomDbContext()){
var slice = db.Items.Where(x=>x.ID > startID)
.OrderBy(x=>x.ID)
.Take(1000).ToList();
// stop if there is nothing to process
if(!slice.Any())
break;
foreach(var item in slice){
// your logic...
item.Converted = true;
}
startID = slice.Last().ID;
}
}
If you want to process these things faster, an alternate approach would be to run the slices in parallel.
Alternate Approach
I would recommend dividing the work into slices of 100 items and processing 100 slices of 100 items in parallel.
You can always customize the slicing to meet your speed needs.
// extension method, so src.Slice(100) below compiles; it must live in a static class
public static IEnumerable<IEnumerable<T>> Slice<T>(this IEnumerable<T> src, int size){
while(src.Any()){
var s = src.Take(size);
src = src.Skip(size);
yield return s;
}
}
long startID = 0;
while(true){
using(var db = new CustomDbContext()){
var src = db.Items.Where(x=>x.ID > startID)
.OrderBy(x=>x.ID)
.Take(10000).Select(x=>x.ID).ToList();
// stop if there is nothing to process
if(!src.Any())
break;
Parallel.ForEach(src.Slice(100), slice => {
using(var sdb = new CustomDbContext()){
foreach(var item in sdb.Items.Where(x => slice.Contains(x.ID))){
item.Converted = true;
}
}
} );
startID = src.Last();
}
}
After refactoring, memory gets released. I don't know why, but it works.
private static void Debug()
{
var iteration = 0;
while(true)
{
Console.WriteLine("Iteration {0}", iteration++);
Convert();
}
}
private static void Convert()
{
using (var dbContext = new CustomDbContext(args[0]))
{
var list = dbContext.Items.Take(2000000).ToList();
foreach (var item in list)
{
item.Converted = true;
}
}
}
When I move the content of Convert() to the while loop in Debug(), the OutOfMemoryExceptions is thrown.
private static void Debug()
{
var iteration = 0;
while(true)
{
Console.WriteLine("Iteration {0}", iteration++);
using (var dbContext = new CustomDbContext(args[0]))
{
// OutOfMemoryException in second iteration
var list = dbContext.Items.Take(2000000).ToList();
foreach (var item in list)
{
item.Converted = true;
}
}
}
}
I am trying to read the messages that SQL Server normally returns in SSMS on the Messages tab, specifically the information from SET STATISTICS IO and SET STATISTICS TIME. The code below runs, but does not actually give this output. Any help is greatly appreciated. Thank you.
using System;
using System.Collections;
using System.Data.SqlClient;
using System.Data;
namespace ConsoleApp
{
class Program
{
static void Main(string[] args)
{
var cn2 = new MedusaPerf.ConnectionStringBuilder().GetTrustedConnectionString("localhost", "AdventureWorks2012", false);
//var cn2 = new MedusaPerf.ConnectionStringBuilder().GetStandardConnectionString("localhost", "AdventureWorks2012", "testuser", "pass", false);
string infoMessageText = "";
try
{
var cmd = "SET STATISTICS IO ON; SET STATISTICS TIME ON; SELECT TOP(5) DatabaseLogID, PostTime, Event FROM [dbo].[DatabaseLog];";
cn2.StatisticsEnabled = true;
//cn2.InfoMessage += new SqlInfoMessageEventHandler(InfoMessageHandler);
cn2.InfoMessage += delegate(object sender, SqlInfoMessageEventArgs e)
{
infoMessageText += e.Message.ToString();
};
var daDataOutput = new SqlDataAdapter(cmd, cn2);
DataTable dtOutput = new DataTable();
daDataOutput.Fill(dtOutput);
foreach (DataRow i in dtOutput.Rows)
{
string dataRowOutput = "";
for (int j = 0; j < dtOutput.Columns.Count; j++)
{
dataRowOutput = dataRowOutput + i[j].ToString();
}
Console.WriteLine(dataRowOutput);
}
IDictionary d = cn2.RetrieveStatistics();
string[] keys = new string[d.Count];
d.Keys.CopyTo(keys,0);
for (int x = 0; x < d.Count; x++)
{
Console.WriteLine("{0}\t{1}",keys[x], (long)d[keys[x]]);
}
Console.WriteLine("Success ");
}
catch (Exception)
{
throw;
}
Console.WriteLine(infoMessageText);
Console.WriteLine("Hit Enter to Continue");
System.Console.ReadKey();
}
static void InfoMessageHandler(object sender, SqlInfoMessageEventArgs e)
{
string myMsg = e.Message;
Console.WriteLine(e.Message);
}
}
}
Here is the output:
13/14/2012 1:14:18 PMCREATE_TABLE
23/14/2012 1:14:18 PMALTER_TABLE
53/14/2012 1:14:18 PMCREATE_TYPE
63/14/2012 1:14:18 PMCREATE_TYPE
213/14/2012 1:14:19 PMCREATE_XML_SCHEMA_COLLECTION
ExecutionTime 46
UnpreparedExecs 1
SelectRows 5
Prepares 0
BuffersSent 1
PreparedExecs 0
SelectCount 2
IduRows 0
BytesReceived 911
Transactions 0
IduCount 0
ServerRoundtrips 1
CursorOpens 0
SumResultSets 1
NetworkServerTime 0
ConnectionTime 0
BytesSent 262
BuffersReceived 1
Success
Hit Enter to Continue
The resolution ultimately was that the Fill method does not trigger the InfoMessage event, which is why I was unable to capture the messages. I found another post on Stack Overflow that addresses this: https://stackoverflow.com/a/2914981/62511. Below is the working code.
using System;
using System.Collections;
using System.Data.SqlClient;
using System.Data;
namespace ConsoleApp
{
class Program
{
static void Main(string[] args)
{
var cn2 = new MedusaPerf.ConnectionStringBuilder().GetTrustedConnectionString("localhost", "AdventureWorks2012", false);
//var cn2 = new MedusaPerf.ConnectionStringBuilder().GetStandardConnectionString("localhost", "AdventureWorks2012", "testuser", "pass", false);
string infoMessageText = "";
var cmd = "SET STATISTICS IO ON; SET STATISTICS TIME ON; SELECT DatabaseLogID, PostTime, Event FROM [dbo].[DatabaseLog];";
cn2.StatisticsEnabled = true;
cn2.InfoMessage += delegate(object sender, SqlInfoMessageEventArgs e)
{
infoMessageText += e.Message.ToString();
};
cn2.Open();
try
{
SqlCommand comm = new SqlCommand(cmd, cn2);
comm.ExecuteNonQuery();
IDictionary d = cn2.RetrieveStatistics();
string[] keys = new string[d.Count];
d.Keys.CopyTo(keys, 0);
for (int x = 0; x < d.Count; x++)
{
Console.WriteLine("{0}\t{1}", keys[x], (long)d[keys[x]]);
}
Console.WriteLine("Success ");
}
catch (Exception)
{
throw;
}
//Console.Write(conn_InfoMessage());
cn2.Close();
Console.WriteLine(infoMessageText);
Console.WriteLine("Hit Enter to Continue");
System.Console.ReadKey();
}
}
}
first of all I don't think the
SET STATISTICS IO ON
is needed ... IO statistics seem to be controlled by the StatisticsEnabled property of the SqlConnection class ...
the thing with the time statistics is really strange ... I had the same problem ... I found out that when you insert a print statement between SET STATISTICS TIME ON and SELECT ... the handler for InfoMessage gets called like it should be ... once for the print statement and once for the statistics ...
ps: tried to put the complete statement here but could not submit ("an error occurred submitting the answer") ... hope you can find out yourself ...
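Based on that description, the workaround would look something like this (query taken from the question; a sketch, not verified):
var cmd = "SET STATISTICS TIME ON; " +
          "PRINT ''; " + // dummy print so InfoMessage fires for the statistics that follow
          "SELECT DatabaseLogID, PostTime, Event FROM [dbo].[DatabaseLog];";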