Debugging a Visual Studio LINQ query in LINQPad - C#

I have a very large Visual Studio project and am in the process of debugging a complicated LINQ query.
The program gets its data from a stored procedure in SQL, converts it to data tables and then does some background magic on it.
Somehow, all the questions I find here involve how to debug a LINQPad query in Visual Studio, not the other way round.
Is there a short and easy way to serialize an EnumerableRowCollection in Visual Studio, and then deserialize it in LINQPad so I can play around with it there? Or maybe there is another (better) way to debug?
So let's say I have this:
var table = processManager.getDataTable();
var filteredRows = from row in table.AsEnumerable()
                   where row.Field<DateTime>("DateFrom") <= currentDate &&
                         row.Field<DateTime>("DateTo") >= currentDate
                   select new
                   {
                       ....
                   };
The first line (processManager.getDataTable()) MUST run in Visual Studio. How can I debug this in LINQPad?

Based on your comment, you essentially want to export the datatable from your solution and import it into LINQPad for further processing.
The easiest solution would be to export it to a CSV file. Let's take vc 74's solution posted here for the export:
StringBuilder sb = new StringBuilder();

// assuming processManager.getDataTable() returns a DataTable object
DataTable dt = processManager.getDataTable();

IEnumerable<string> columnNames = dt.Columns.Cast<DataColumn>()
    .Select(column => column.ColumnName);
sb.AppendLine(string.Join(",", columnNames));

foreach (DataRow row in dt.Rows)
{
    IEnumerable<string> fields = row.ItemArray.Select(field => field.ToString());
    sb.AppendLine(string.Join(",", fields));
}

// write the result to disk so LINQPad can pick it up
File.WriteAllText(@"path/to/csv/file", sb.ToString());
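Note that this simple export assumes no field contains the delimiter itself. If your data can contain commas, quotes or line breaks, a small escaping helper along these lines helps (a hypothetical method, not part of the original answer - and the Split(',') on the LINQPad side would then also need to understand quoted fields):
// hypothetical helper: quote fields that contain the delimiter, quotes or line breaks
static string EscapeCsvField(object field)
{
    string s = Convert.ToString(field);
    if (s.Contains(",") || s.Contains("\"") || s.Contains("\n"))
        return "\"" + s.Replace("\"", "\"\"") + "\"";
    return s;
}

// and in the loop above:
// IEnumerable<string> fields = row.ItemArray.Select(EscapeCsvField);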
Now that we have a CSV file, let's re-import it in LINQPad:
//Skip(1) drops the header row written by the export above.
//The Where condition is optional; it ensures no blank lines are processed,
//which prevents an index out of range error on Split.
var csvData = from row in File.ReadLines(@"path/to/csv/file")
                              .Skip(1)
                              .Where(line => !string.IsNullOrWhiteSpace(line))
              //Delimiter - must match the one used in the export (',' here)
              let column = row.Split(',')
              select new
              {
                  Prop1 = column[0],
                  Prop2 = column[1],
                  Prop3 = column[2],
                  Prop4 = column[3],
                  Prop5 = column[4]
              };
You can also strongly type it by defining all the columns in a row as a separate class in LINQPad and then calling select new MyClass { ... }, as sketched below.
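For example, a minimal sketch; the class name, property names and types are assumptions you would adapt to your actual columns (in LINQPad, declare the class in a "C# Program" query, or at the end of a statements query in recent versions):
// hypothetical row class defined alongside the query
class MyRow
{
    public string Prop1 { get; set; }
    public DateTime DateFrom { get; set; }
    public DateTime DateTo { get; set; }
}

var typedData = from row in File.ReadLines(@"path/to/csv/file")
                                .Skip(1)
                                .Where(line => !string.IsNullOrWhiteSpace(line))
                let column = row.Split(',')
                select new MyRow
                {
                    Prop1 = column[0],
                    DateFrom = DateTime.Parse(column[1]),
                    DateTo = DateTime.Parse(column[2])
                };

typedData.Dump(); // LINQPad's built-in result visualizer
That gives you roughly the same DateFrom/DateTo filtering playground you have in Visual Studio, but inside LINQPad.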
Kind regards

There is an extension that, once installed and enabled on a solution, exports your debugging variables to LINQPad directly. Check it out; it's called LINQBridgeVS.
It basically maps a custom debugger visualizer to all of the public classes and structs of a solution, so when you run your app, the magnifying glass on the data tip is available on almost any type.
Disclaimer: I'm the author of the extension.

Related

Fastest way to update all rows to have same value in one column in datatable, without loop in C#

I have a DataTable "users" with a column "is_published" in it. I have about 100k rows.
What is the fastest way to update the value in that column so that all rows have the same value = 1?
I tried a classic foreach loop and it's slow, and I also tried LINQ:
dsData.Tables["users"].Select().ToList().ForEach(x => x["is_published"] = 1);
and it still isn't fast enough.
The variant with an Expression column doesn't work for me either, because after that the field is ReadOnly and I can't change the value again.
This is C#.
When you create your table you can simply give your column a default value:
DataTable dt = new DataTable();
dt.Columns.Add("is_published", typeof(int));
dt.Columns["is_published"].DefaultValue = 1;
Then, when you need to reset a row back to the default value (if you ever need to):
// Say your user selects the row at index 2
// and the "is_published" column is at index 5.
// Index the row directly; note that row.ItemArray returns a copy,
// so assigning to ItemArray[5] would not change the underlying row.
dt.Rows[2][5] = dt.Columns["is_published"].DefaultValue;
or, by column name:
dt.Rows[2]["is_published"] = dt.Columns["is_published"].DefaultValue;
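Keep in mind that DefaultValue only affects rows added after it is set; it does not retroactively update the 100k rows already in the table. A small sketch to illustrate:
DataTable dt = new DataTable();
dt.Columns.Add("is_published", typeof(int));
dt.Rows.Add(0);                                // row added before the default is set

dt.Columns["is_published"].DefaultValue = 1;
dt.Rows.Add(dt.NewRow());                      // new row picks up the default

Console.WriteLine(dt.Rows[0]["is_published"]); // 0 - unchanged
Console.WriteLine(dt.Rows[1]["is_published"]); // 1 - default applied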
Separate the select and the update into two operations. Skip the ToList() operation and instead iterate over the collection afterwards using foreach and update the value:
var rows = dsData.Tables["users"].Select();
foreach (var row in rows)
{
    row["is_published"] = 1;
}
The ToList forces an immediate query evaluation, which in this case acts as a copy of all items from the IEnumerable collection, so you can gain some speed here. I ran some tests and the result in this case is (using your code and the modification): ToList is 3 times slower than iterating over the IEnumerable and updating in a single pass!
IMO 40 seconds is an awful lot for 100K items. If your DataTable is bound to a DataGridView or some other UI control, I believe it is the update of the GUI that is taking so long, not the update of the values itself. In my tests the update using ToList took fractions of a second (on my simple Lenovo netbook with an AMD E-450 processor, and I assume you are not using a 386 machine). Try suspending the UI before updating and refreshing the values, then enable it again - see the example in this SO post and the sketch below.
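A minimal sketch of that idea, assuming a WinForms grid named dataGridView1 that is bound directly to the table (the control and table names are assumptions):
// detach the table from the grid so the UI is not refreshed on every cell change
dataGridView1.DataSource = null;

var table = dsData.Tables["users"];
table.BeginLoadData();                 // turn off notifications and index maintenance
foreach (DataRow row in table.Rows)
{
    row["is_published"] = 1;
}
table.EndLoadData();

// rebind once, so the grid repaints a single time
dataGridView1.DataSource = table;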
My original post (as I can see you gained some speed using the code - interesting):
More of an experiment on my part, but it is possible to:
convert the table to XML
fetch all elements that should be changed
change them
write the changed XML back to the table
The code:
// temp table
var dataTable = new DataTable("Table 1");
dataTable.Columns.Add("title", typeof(string));
dataTable.Columns.Add("number", typeof(int));
dataTable.Columns.Add("subnum1", typeof(int));
dataTable.Columns.Add("subnum2", typeof(int));
// add temp data
Enumerable.Range(1, 100000).ToList().ForEach(e =>
{
    dataTable.Rows.Add(new object[] { "A", 1, 2, 3 });
});
// "bulk update"!
var sb = new StringBuilder();
var xmlWriter = XmlWriter.Create(sb);
dataTable.WriteXml(xmlWriter);
var xml = XDocument.Parse(sb.ToString());
// take column to change
var elementsToChange = xml.Descendants("title").ToList();
// the list is referenced to the XML, so the XML is changed too!
elementsToChange.ForEach(e => e.Value = "Z");
// clear current table
dataTable.Clear();
// write changed data back to table
dataTable.ReadXml(xml.CreateReader());
The table is updated. IMO the parts that make this solution slow are the
conversion from and to XML
and the filling of the StringBuilder
The pure update of the element list itself, on the other hand, is probably faster than updating the table directly.
Finally! I sped up the update so it takes 2-3 seconds. I added BeginLoadData() and EndLoadData():
DataTable dt = ToDataSet().Tables["users"];
var sb = new StringBuilder();
var xmlWriter = XmlWriter.Create(sb);
dt.WriteXml(xmlWriter);
var xml = XDocument.Parse(sb.ToString());
xml.Descendants("is_published").ToList().ForEach(e => e.Value = "1");
dt.Clear();
dt.BeginLoadData();
dt.ReadXml(xml.CreateReader());
dt.EndLoadData();

Querying 2 datatables in a dataset

I have 2 datatables named 'dst' and 'dst2'. They are located in the dataset 'urenmat'.
The majority of the data is in 'dst'. It contains a column named 'werknemer', whose value corresponds to a certain row in 'dst2'; the matching column there is named 'nummer'.
What I need is a way to left outer join both datatables where dst.werknemer and dst2.nummer are linked, so that a new datatable is created which contains 'dst2.naam' linked to 'dst.werknemer', along with all the other columns from 'dst'.
I have looked everywhere and still can't seem to find the right answer to my question. Several sites provide a way using LINQ in this situation. I have tried using LINQ but I am not so skilled at it.
I tried using the 101 LINQ Samples:
http://code.msdn.microsoft.com/101-LINQ-Samples-3fb9811b
urenmat = dataset.
dst = a, b, c, d, werknemer.
dst2 = nummer, naam.
I used the following code from '101'.
var query =
    from contact in dst.AsEnumerable()
    join order in dst2.AsEnumerable()
        on contact.Field<string>("werknemer") equals
           order.Field<string>("nummer")
    select new
    {
        a = order.Field<string>("a"),
        b = order.Field<string>("b"),
        c = order.Field<string>("c"),
        d = order.Field<string>("d"),
        naam = contact.Field<decimal>("naam")
    };
However, I don't know what to change 'contact' and 'order' to, and I can't seem to find out how to save the result to a DataTable again.
I am very sorry if these are stupid questions, but I have tried to solve it myself and it appears I'm stupid :P. Thanks for the help in advance!
PS: I am using C#; the dataset and datatables are typed.
If you want to produce a projected dataset of dst left outer joined to dst2, you can use this LINQ expression (sorry, I don't really work in LINQ query syntax, so you'll have to use this lambda syntax instead).
var query = dst.AsEnumerable()
    .GroupJoin(dst2.AsEnumerable(),
        x => x.Field<string>("werknemer"),
        x => x.Field<string>("nummer"),
        (contact, orders) => new { contact, orders })
    .SelectMany(x => x.orders.DefaultIfEmpty(), (x, order) => new
    {
        // a, b, c, d come from dst (the contact row)
        a = x.contact.Field<string>("a"),
        b = x.contact.Field<string>("b"),
        c = x.contact.Field<string>("c"),
        d = x.contact.Field<string>("d"),
        // order is null when no dst2 row matches (left outer join);
        // use the Field<T> type that matches your typed 'naam' column
        naam = order == null ? null : order.Field<string>("naam")
    });
Because this is a projected dataset you cannot simply save it back to the datatable. If saving is desired then you would want to load the affected row, update the desired fields, then accept the changes.
// untyped
var row = dst.Rows.Find(keyValue);
// typed
var row = dst.FindBy...(keyValue);
// update the field
row.SetField("a", "some value");
// accept only this row's changes
row.AcceptChanges();
// or, after all changes to the table have been made, accept them for the whole table
dst.AcceptChanges();
Normally, if you need to perform loading and saving of (projected) data, an ORM (like Entity Framework or LINQ to SQL) would be the best tool. However, you are using DataTables in this case and I'm not sure if you can link an ORM to these (though it seems like it would probably be possible).
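If the goal is simply to get the joined projection back into a DataTable (for example to bind it to a grid), one option is to copy the anonymous results into a new table by hand. A rough sketch using the column names from the question; adjust the types to your typed dataset:
// build a result table that mirrors the projection above
var result = new DataTable("joined");
result.Columns.Add("a", typeof(string));
result.Columns.Add("b", typeof(string));
result.Columns.Add("c", typeof(string));
result.Columns.Add("d", typeof(string));
result.Columns.Add("naam", typeof(string));

foreach (var item in query)
{
    result.Rows.Add(item.a, item.b, item.c, item.d, item.naam);
}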

How to retrieve checkbox options and add to database using linq

I am attempting to add some checkbox options to my database via LINQ entities as one item. However, the roadblock I run into is that I get an error when trying to combine these into one variable to add to the db. Here is what I have so far:
public string GetSelectedItems(CheckBoxList control)
{
    var items = new StringBuilder();
    // iterate the control that was passed in
    foreach (ListItem item in control.Items)
    {
        if (item.Selected)
            items.Append(string.Format("{0},", item.Text));
    }
    return items.ToString().TrimEnd(',');
}
adding to db:
var choices = GetSelectedItems(chbxRoomChange);
rc.preference = choices;
Based on the exception details you have posted, I suspect the column you are inserting into in the database is too small to hold the data you are passing from your application.
Debug your app and see what the length of the string returned by GetSelectedItems is, then compare that with the size of the column you are inserting into. I suspect you will find your string just won't fit and you need to increase its size.
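For example, a quick check while debugging (the 50 here is a hypothetical column size such as nvarchar(50); substitute your actual column length):
var choices = GetSelectedItems(chbxRoomChange);

// fails loudly in debug builds when the string no longer fits the column
System.Diagnostics.Debug.Assert(choices.Length <= 50,
    "preference is too long for the database column: " + choices.Length);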
As a side note, you could rewrite your method as a single expression:
var choices = string.Join(",", chbxRoomChange.Items
.Cast<ListItem>()
.Where(li => li.Selected)
.Select(li => li.Text)
.ToArray());

Getting all column names / column values from datarow when debugging

I'm creating a mock instance of one of my datarows for testing.
The row I'm trying to duplicate from our database contains 37 columns with different variables.
Is there any way, when debugging, to get the information out as clean text for simple editing of my mock objects?
I had to adjust Nikhil's Cast:
var colNames = dr.Table.Columns.Cast<DataColumn>().Select(x => x.ColumnName).ToList();
How about
var colNames = dr.Table.Columns.Cast(Of DataColumn)
.Select(x => x.ColumnName).ToList()
Solved it like this since I couldn't get Nikhil's code to work.
int i = 0;
while (i < looprow.Table.Columns.Count)
{
    Debug.WriteLine(looprow.Table.Columns[i].ColumnName);
    i++;
}
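Since the question asks for the column values as well as the names, a small sketch that dumps name/value pairs from the DataRow may be closer to what you need (looprow is assumed to be the DataRow being inspected):
foreach (DataColumn col in looprow.Table.Columns)
{
    // prints one "ColumnName = value" line per column
    Debug.WriteLine("{0} = {1}", col.ColumnName, looprow[col]);
}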

Modify CSV Parser to work with TSV files C#

I have this code for parsing a CSV file.
var query = from line in File.ReadAllLines("E:/test/sales/" + filename)
            let customerRecord = line.Split(',')
            select new FTPSalesDetails
            {
                retailerName = "Example",
            };

foreach (var item in query)
{
    //sales details table
    ItemSale ts = new ItemSale
    {
        RetailerID = GetRetailerID(item.retailerName)
    };
}
Obviously there will be more data in the above code; I am just awaiting the test information file details/structure.
In the meantime I thought I'd ask if this could be modified to parse TSV files?
All help is appreciated,
thanks :)
Assuming TSV means tab-separated values, you can use
line.Split('\t')
If you are using .NET 4.0, I would recommend that you use File.ReadLines for large files, in order to use LINQ without loading all the lines into memory at once.
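Putting both suggestions together, a sketch of the adjusted query (the file path and FTPSalesDetails type are carried over from the question):
// File.ReadLines streams the file lazily; Split('\t') handles the tab delimiter
var query = from line in File.ReadLines("E:/test/sales/" + filename)
            let customerRecord = line.Split('\t')
            select new FTPSalesDetails
            {
                retailerName = "Example",
            };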
