This is what I am working with to get back to the web dev world
ASP.Net with VS2008
Subsonic as Data Access Layer
SqlServer DB
Home Project description:
I have a student registration system. I have a web page which should display the student records.
At present I have a gridview control which shows the records
The user logs on and gets to the view students page. The gridview shows the students in the system, where one of the columns is the registration status (open, pending, or complete).
I want the user to be able to apply dynamic sorting or filters to the returned results so as to obtain a more refined result. I envisioned allowing the user to filter the results by applying a where clause or like clause on the result returned, via a dataset interface from a Subsonic method. I do not want to query the database again to apply the filter.
example: initial query
Select * from studentrecords where date = convert(varchar(32), getdate(), 101)
The user then should be able to apply a filter on the resultset returned so that they can do a last name LIKE '%Souza%'.
Is this even possible, and is binding a datasource to a gridview control the best approach, or should I create a custom collection inheriting from CollectionBase and then bind that to the gridview control?
PS: Sorry about the typos. My machine is under the influence of a tea spill on my laptop.
I use LINQ-to-SQL, not Subsonic, so YMMV, but my approach to filtering has been to supply an OnSelecting handler to the data source. In LINQ-to-SQL, I'm able to replace the result with a reference to a DataContext method that returns the result of applying a table-valued function. You might want to investigate something similar with Subsonic.
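If you end up doing this with a LinqDataSource, the handler might look roughly like the sketch below - the DataContext, table-valued function, and control names are made up for illustration, not taken from the post:
protected void StudentsDataSource_Selecting(object sender, LinqDataSourceSelectEventArgs e)
{
    // Hypothetical DataContext and composable TVF method; ToList() runs the
    // query now, so the context can be disposed before the grid binds.
    using (var ctx = new SchoolDataContext())
    {
        e.Result = ctx.GetStudentsRegisteredOn(DateTime.Today).ToList();
    }
}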
As tvanfosson said, LINQ is very well suited to making composable queries; you can do this either with fully dynamic TSQL that the base library generates, or via a UDF that you mark with [FunctionAttribute(..., IsComposable=true)] in the data-context.
I'm not familiar with Subsonic, so I can't advise there; but one other thought: in your "date = " code, you might consider declaring a datetime variable and assigning it first... that way the optimiser can usually do a better job of optimising it (the query is simpler, and there is no question whether it is converting the datetime (per row) to a varchar, or the varchar to a datetime). The most efficient way of getting just the date portion of something is to cast/floor/cast:
DECLARE @today datetime
SET @today = GETDATE()
SET @today = CAST(FLOOR(CAST(@today as float)) as datetime)
[update re comment]
Re composable - I mean that this allows you to build up a query such that only the final query is executed at the database. For example:
var query = from row in ctx.SomeComplexUdf(someArg)
where row.IsOpen && row.Value > 0
select row;
might go down to the server as the TSQL:
SELECT u1.*
FROM dbo.SomeComplexUdf(@p1) u1
WHERE u1.IsOpen = 1 -- might end up parameterized
AND u1.Value > 0 -- might end up parameterized
The point here being that only the suitable data is returned from the server, rather than lots of data being returned, then thrown away. LINQ-to-SQL can do all sorts of things via composable queries, including paging, sorting etc. By minimising the amount of data you load from the database you can make significant improvements in performance.
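For instance, sorting and paging fold onto the same query before anything executes - a sketch reusing the query variable from above:
// Nothing hits the database until the result is enumerated; the ORDER BY
// and the paging (Skip/Take) end up in the single generated TSQL statement.
var page = query.OrderByDescending(row => row.Value)
                .Skip(20)
                .Take(10)
                .ToList();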
The alternative without composability is that it simply does:
SELECT u1.*
FROM dbo.SomeComplexUdf(@p1) u1
And then throws away the other rows in your web app... obviously, if you are expecting 20 open records and 10000 closed records, this is a huge difference.
How about something like this?
Rather than assigning a data source / table to your grid control, instead attach a 'DataView' to it.
Here's sort of a pseudocode example:
DataTable myDataTable = GetDataTableFromSomewhere();
DataGridView dgv = new DataGridView();
DataView dv = new DataView(myDataTable);
//Then you can specify things like:
dv.Sort = "StudentStatus DESC";
dv.RowFilter = "StudentName LIKE '%" + searchName + "%'";
dgv.DataSource = dv;
...and so on.
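Since the original question is ASP.NET, the same idea with a web GridView would look roughly like this sketch (the control ID and column names are assumptions):
// <asp:GridView ID="gvStudents" runat="server" /> declared in the page
DataView dv = new DataView(GetDataTableFromSomewhere());
dv.Sort = "StudentStatus DESC";
dv.RowFilter = "StudentName LIKE '%" + searchName.Replace("'", "''") + "%'"; // escape quotes in user input
gvStudents.DataSource = dv;
gvStudents.DataBind();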
I have a DataGridView which is filled from a SQL table. When I search for a value with a textbox and then update/insert some values and use the fill function again, I get this error:
"System.NullReferenceException HResult=0x80004003 Message=Object reference not set to an instance of an object."
Note: there is no problem with the update, insert, and fill functions when I don't use the search textbox.
This is my fill function:
DataTable dt1 = new DataTable();
void fill()
{
try
{
//txtSearch.Clear();
dt1.Rows.Clear();
dt1.Columns.Clear();
SqlDataAdapter da = new SqlDataAdapter("Select * From Bilgisayar_Zimmet", bgl.baglanti());
da.Fill(dt1);
dataGridView1.DataSource = dt1;
}
catch (Exception ex)
{
MessageBox.Show(ex.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
}
}
Here is my search code:
private void txtSearch_TextChanged(object sender, EventArgs e)
{
(dataGridView1.DataSource as DataTable).DefaultView.RowFilter = String.Format("Name LIKE '%{0}%' OR Surname LIKE '%{1}%'", txtSearch.Text, txtSearch.Text);
}
I think my problem is the txtSearch_TextChanged method. It interferes with the fill function because the DGV still has a filtered view applied.
So, if I were hanging my hat on the DataTable peg, I would do the following in a .NET Framework project. Large parts of this design-time process are switched off in .NET Core+ projects because of bugs/incomplete support in VS; it does work in Core as a runtime thing, just not at design time (so you could do this design in Framework and import the code into Core, or even have a Core and a Framework project that edit the same files):
I would...
Have my table in my DB:
Make a new DataSet type of file:
Right click the surface and add a tableadapter
Add a connection string, and choose SELECT
Write a query - personally I recommend putting something like SELECT * FROM table WHERE primarykeycolumn = @id here rather than selecting all, but you apparently want to have all 2000 rows in the client. Have a look in Advanced and check that all 3 boxes are ticked
Give the query a decent name - if you're selecting by @id then call it FillById, etc.
You're done
Flip to the forms designer for your form, and open the data sources panel (View menu, Other windows). Drag the node representing the table onto the form:
A load of stuff appears at the bottom, a grid appears, the toolbar appears. Add a textbox too (the asterisk) - we'll use it for the filter
Double click the textbox; you'll see the minimal code VS has written to load the data. The rest of the code is in the Form.Designer.cs file and the DataSet.Designer.cs file if you want to have a look. In our form we just need this code:
I'd paste it as text, but genuinely, the only line of it I actually wrote was the one line in the TextChanged:
bilgisayarZimmetBindingSource.Filter = String.Format("FirstName LIKE '%{0}%' OR LastName LIKE '%{0}%'", txtSearch.Text);
You can reuse placeholders in a string format, by the way. Note that the filtering is done on the BindingSource - a device that VS has put between the table and the grid. BindingSources are very useful; they maintain position/current-row knowledge and can transmit updates up and down between the grid and the datatable. When related data is shown, they automatically filter child datatables to show only the children of the current parent.
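For reference, the code VS generates in the form looks roughly like the sketch below - the exact member names depend on your dataset, table, and form, so treat them as placeholders:
private void Form1_Load(object sender, EventArgs e)
{
    // Generated by the designer: fills the typed datatable from the DB.
    this.bilgisayarZimmetTableAdapter.Fill(this.myDataSet.Bilgisayar_Zimmet);
}

private void bilgisayarZimmetBindingNavigatorSaveItem_Click(object sender, EventArgs e)
{
    // Generated by the designer: pushes pending edits back to the database.
    this.Validate();
    this.bilgisayarZimmetBindingSource.EndEdit();
    this.tableAdapterManager.UpdateAll(this.myDataSet);
}

private void txtSearch_TextChanged(object sender, EventArgs e)
{
    // The only hand-written line (from above).
    bilgisayarZimmetBindingSource.Filter = String.Format(
        "FirstName LIKE '%{0}%' OR LastName LIKE '%{0}%'", txtSearch.Text);
}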
We can now run the app and load data, save data and filter data, all off that one line of code. I add a bunch of people:
See how their IDs are all negative? The datatable has an auto-increment that steps by -1; the DB will calculate the real values when we hit save, and they will be retrieved automatically, patched into the datatable, and the grid will update, all automatically. If these rows were the parent of other rows, the child rows' ParentId would be updated too via a DataRelation in the dataset:
We haven't had to refill anything; VS just got the single IDs out of the DB.
I can type a filter:
No errors; these people are editable, saveable, I can clear the textbox and see everyone etc.
So what is this voodoo? How does it work? Well.. It's not much different to yours. If we go rummaging in DataSet.Designer.cs we can find the queries that pull and push data:
The read and write queries are right there; the tableadapter wraps a dataadapter and sets up the relevant queries, and the datarow state drives whether an insert, update, etc. is fired. All the code in your question, and more, is written by VS and tucked away. The dataset has strongly typed tables in it that inherit from DataTable, and rows that have proper properties rather than just an array you access via string column names.
We would just load data:
var dt = tableAdapter.GetData();
var dt2 = new BlahBlahDataTable();
tableAdapter.Fill(dt2);
//if we put parameters in the dataset query like SELECT * FROM person WHERE name = @n
var dt3 = new PersonDataTable();
ta.FillByName(dt3, "John");
The strongly typed datatables are nicer to LINQ with
//weakly typed datatable
dt.Rows.Cast<DataRow>()
.Where(r => r.Field<string>("FirstName").Contains("John") || r["LastName"].ToString().Contains("John"))
//strongly typed datatable
dt.Where(r => r.FirstName.Contains("John") || r.LastName.Contains("John"))
etc..
Final point: I personally wouldn't download 2000 rows into the client and hold them there. I'd download what I needed as and when. The longer you have something, the more out of date it's likely to be, the more memory it consumes, and the more your server disk/network gets battered serving up volumes of data no one uses. I understand the trade-off between loading all the rows and then burning the client's CPU to search them rather than the server's, and that "string contains" searches are hard to search/index optimally, but still - I'd look at full-text indexing and leaving them on the server.
When you're making your query above, don't put SELECT * FROM BilgisayarZimmet - put SELECT * FROM BilgisayarZimmet WHERE ID = @id and call it FillById. Then when you're done with that one, add another query, SELECT * FROM BilgisayarZimmet WHERE FirstName LIKE @n OR LastName LIKE @n, and call it FillByNameLike.
In code, call it like this:
tableAdapter.FillByNameLike(theDatasetX.BilgisayarZimmet, "%" + txtSearch.Text + "%")
Or even better, let the user supply the wildcard, though people are perhaps used to writing *, so:
tableAdapter.FillByNameLike(theDatasetX.BilgisayarZimmet, txtSearch.Text.Replace("*", "%"))
When the TableAdapter fills, it will clear the datatable first, so there is no need to clear it yourself. To turn this behaviour off, set tableAdapter.ClearBeforeFill = false. This way your grid shows only the data downloaded from the DB, and that's a smaller, more recent dataset than querying it all at app start.
All in, VS writes a lot of tedious code for you and makes life a lot nicer. I use these nearly exclusively when operating in DataTable mode.
I have a DataGridView where I am showing data read from the database:
DataSet ds = new DataSet();
sqlDa.Fill(ds);
dgView.DataSource = ds.Tables[0];
After adding all of the rows in the UI, I need to do a SQL UPDATE for rows that were previously read from the database, and an INSERT for new rows, by clicking a Save button (I don't save rows one by one when adding, just all of them when I click the Save button):
foreach (DataGridViewRow dgvRow in dgView.Rows)
{
// do insert for new rows, and update for existing ones from database
}
How can I know which rows are newly added and which are not? Can I add some type of attribute to every row that is read from the database so that I can know they need to be updated?
How can I know what rows are newly added and what are not?
You don't need to; the datatable the DGV is showing is already tracking this. If you make a SqlDataAdapter and plug a SqlCommandBuilder into it (see the example code in the docs) so that it gains queries in its InsertCommand/UpdateCommand/DeleteCommand properties (or you can put these commands in yourself, but there isn't much point given that a command builder can make them automatically), then you just say:
theDataAdapter.Update(theDataTable);
If you didn't save it anywhere else you can get it from the DataSource of the DGV:
theDataAdapter.Update(dgView.DataSource as DataTable);
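For completeness, a minimal sketch of that wiring - the table name, connection string, and grid are assumptions, not from the question:
// Build an adapter around the original SELECT; the command builder derives
// the INSERT/UPDATE/DELETE commands from it (the table needs a primary key).
var adapter = new SqlDataAdapter("SELECT * FROM SomeTable", connectionString);
var builder = new SqlCommandBuilder(adapter);

var table = new DataTable();
adapter.Fill(table);
dgView.DataSource = table;

// Later, in the Save button click: one call persists all added, modified
// and deleted rows, based on each row's RowState.
adapter.Update(table);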
By the way, the word "Update" here has nothing to do with an update query; Microsoft should have called it SaveChanges. It runs all kinds of modification queries (INSERT/UPDATE/DELETE), not just UPDATE.
If you really want to know, and have a burning desire to reinvent this wheel, you can check a DataRow's RowState property, and it will tell you if it's Added, Modified or Deleted, so you can fire the appropriate query (but genuinely you'd be reimplementing functionality that a SqlDataAdapter already has built in)
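If you do go that route, the check is just a look at each row's RowState - a sketch only; as said, the adapter already does this for you:
foreach (DataRow row in table.Rows)
{
    switch (row.RowState)
    {
        case DataRowState.Added:
            // fire your INSERT for this row
            break;
        case DataRowState.Modified:
            // fire your UPDATE for this row
            break;
        case DataRowState.Deleted:
            // fire your DELETE (deleted rows stay in the table until AcceptChanges)
            break;
    }
}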
All this said, you might not be aware that you can make your life massively easier by:
Add a new DataSet type of file to your project (like you would add a class). Open it
Right-click in the surface of it, choose add TableAdapter
Design your connection string in (once)
Enter your query as a "select that produces rows", like SELECT * FROM SomeTable WHERE ID = @Id (it's advisable to use a where clause that selects on the ID; you can add more queries later to do other things, like SELECT * FROM SomeTable WHERE SomeColumn LIKE @someValue, but for now selecting on ID gives you a base query that is handy for loading related data). You can also use existing or new stored procs if you want
Give it a sensible name pair like FillById, GetDataById - FillBy fills an existing table, Get gets a new one
Finish
You'll now have objects available in your code that are wrappers around data adapters and datatables - the same functionality, but more nicely strongly typed
e.g. you can fill your grid with:
var ta = new SomeTableAdapter();
dgView.DataSource = ta.GetDataByFirstName("John%"); //does select * from table where firstname like 'john%' into a datatable
The datatables are strongly typed, so you don't access them like this:
//no
foreach(DataRow row in someTable.Rows){
if((row["someColumn"] as string) == "hello" && row.IsNull("otherColumn"))
row["otherColumn"] = "goodbye";
}
You have named properties:
//yes
foreach(var row in someTable){
if(row.SomeColumn == "hello" && row.IsOtherColumnNull())
row.OtherColumn = "goodbye";
}
Much nicer. LINQ works on them too, without AsEnumerable or Cast and endless casting the values.
It's not magic; VS writes boatloads of code behind the scenes for you - check in the YourDataSet.Designer.cs file - hundreds of SqlCommands, fully parameterized, for all the table operations (Select/Insert/Update/Delete), all based on typing a SELECT command into a tool pane. It's quite nice to use really, even all these years later.
Oh, but the designer doesn't work very nicely in .NET Core. They're really lagging behind on fixing up the bugs that .NET Core brings (other priorities).
I have a DataTable containing 10000+ rows, resulting from a SQL Server query. The DataTable is used as the DataSource for a DataGridView, like so (simplified code):
MyBindingSource = New BindingSource(MyDataTable, Nothing)
MyDataGridView.DataSource = MyBindingSource
As this takes a looong time to load, I would like to limit the number of rows displayed in the DataGridView, somehow.
I can't use TOP in my query, because I need all the data present in the DataTable for filtering later on, without re-setting the DataSource (MyBindingSource.Filter = MyFilter).
Also, I can't use the Filter property for limiting the number of rows, because there's no relevant data in the query result that I can use for this. To get around this, I've thought about adding TSQL's ROW_NUMBER to the query result (MyBindingSource.Filter = "[RowNumberField] < 100"), but this would only work when no other fields are used in the filter.
Any ideas?
Here are two options:
I would simply implement pagination on all of your views (filtered or unfiltered) using the technique of a BindingNavigator GUI control backed by a BindingSource object to identify page breaks.
You can also filter by more than one criterion by using an OR operator, but I don't see how that helps you with your current approach, because your row numbers will have to be recalculated after each filter - e.g. [RowNumberField] < 100 might return 100 rows with no filter but only 10 after a filter.
What you could do is move the filtering logic to your SQL query and then always show only the first X rows based on the row number (which I assume you are dynamically adding each time using TSQL's ROW_NUMBER()).
The advantage of this approach is that you can perform much more powerful filtering in TSQL, and it keeps your data source smaller.
If you do take this approach, be careful about mixing your queries in with your view logic - I would recommend the Repository Pattern.
I have a dropdown list in my aspx page. The dropdown list's datasource is a DataTable. The backend is MySQL, and records get into the DataTable by using a stored procedure.
I want to display records in the dropdown menu in ascending order.
I can achieve this in two ways.
1) dt is the DataTable, and I am using a DataView to sort the records.
dt = objTest_BLL.Get_Names();
dataView = dt.DefaultView;
dataView.Sort = "name ASC";
dt = dataView.ToTable();
ddown.DataSource = dt;
ddown.DataTextField = dt.Columns[1].ToString();
ddown.DataValueField = dt.Columns[0].ToString();
ddown.DataBind();
2) Or in the select query I can simply say:
SELECT
`id`,
`name`
FROM `test`.`type_names`
ORDER BY `name` ASC ;
If I use the 2nd method I can simply eliminate the DataView part. Assume this type_names table has 50 records, and my page is viewed by 100,000 users a minute. Which is the best method, considering efficiency and memory handling: get unsorted records into the DataTable and sort them in the code-behind, or sort them inside the database?
Note - Only real performance tests can tell you real numbers. Theoretical options are below (which is why I use the word "guess" a lot in this answer).
You have at least 3 (instead of 2) options -
Sort in the database - If the column being sorted on is indexed, then this may make the most sense, because the overhead of sorting on your database server may be negligible. The SQL server's own data caches may make this a super-fast operation... but at 100k queries per minute, measure whether SQL gives noticeably faster results without the sort.
Sort in the code-behind / middle layer - You likely won't have your own equivalent of an index; you'd be sorting a list of 50 records 100k times per minute, which would be slower than SQL, I would guess.
A big benefit would apply only if the data is relatively static or very slowly changing, and the sorted values can be cached in memory for a few seconds, minutes, or hours.
The option not in your list - send the data unsorted all the way to the client and sort it on the client side using JavaScript. This solution may scale the best... sorting 50 records in the browser should not have a noticeable impact on your UX.
The SQL purists will no doubt tell you that it’s better to let SQL do the sorting rather than C#. That said, unless you are dealing with massive record sets or doing many queries per second it’s unlikely you’d notice any real difference.
For my own projects, these days I tend to do the sorting in C# unless I'm running some sort of aggregate on the statement. The reason is that it's quick, and if you are running any sort of stored proc or function on the SQL server it means you don't need to find ways of passing ORDER BYs into the stored proc.
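If you do sort in C#, a LINQ-based sort is a compact alternative to the DataView in option 1 - a sketch assuming the same dt and the id/name columns from the query above (needs a reference to System.Data.DataSetExtensions):
// Sort the rows in memory and bind the copy; the binding code stays the same.
DataTable sorted = dt.AsEnumerable()
                     .OrderBy(r => r.Field<string>("name"))
                     .CopyToDataTable();
ddown.DataSource = sorted;
ddown.DataTextField = "name";
ddown.DataValueField = "id";
ddown.DataBind();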
I currently have a datagrid bound to a table with tens of thousands of records. I display the datagrid using the existing ASP.NET 2.0 controls, showing ten records at a time.
My problem is that whenever I try to access the next page, it gets all the records again from the database and then displays the ones that are required. This is slowing down the app. Is there a feature within .NET 2.0 which will help me optimize this? I can't use third-party controls or AJAX.
UPDATE
I can't use SQL to get 10 records at a time either. I want to do this without changing any existing business logic or data retrieval.
You can do paging in your SQL and/or stored procedure.
Basically, you pass the sproc the number of items on the page and what page you are on. For example, page 3 and 20 records per page.
If you can use SQL Server 2005 or higher, you can use some newer keywords that make the query easier.
Here is a simple version of such a query:
SELECT ClientName, RowNumber
FROM (SELECT ClientName, ROW_NUMBER() OVER (ORDER BY ClientName) AS RowNumber
FROM Clients) AS cl
WHERE RowNumber BETWEEN 12 AND 30
You would make the "12" and "30" values above into input parameters.
This way, you are only returning the rows you are going to display.
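A sketch of calling such a query from code and binding the page - the stored procedure name, parameter names, connection string, and grid are assumptions, not from this answer:
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.GetClientsPage", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // page 3 at 20 rows per page -> rows 41..60
    int page = 3, pageSize = 20;
    cmd.Parameters.AddWithValue("@StartRow", (page - 1) * pageSize + 1);
    cmd.Parameters.AddWithValue("@EndRow", page * pageSize);

    var dt = new DataTable();
    new SqlDataAdapter(cmd).Fill(dt); // Fill opens and closes the connection itself

    grdClients.DataSource = dt;
    grdClients.DataBind();
}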
You can stick your datasource in the session.
Then check whether your data source from the session is null, and if it is, go and get it.
This way you only get your data one time and if you need to get fresh data (say because you updated something) just empty the session variable that holds your data source.
EDIT:
Here's a half attempt at an example (by this I mean I'm not going into how to change the page, because I assume you already know how to do that):
protected void PageChanging(object sender, GridViewPageEventArgs e)
{
if(Session["YourSessionVariableForData"] == null)
Session["YourSessionVariableForData"] = YourDataCall();
YourGridView.PageIndex = e.NewPageIndex;
YourGridView.DataSource = ((YourDataType)Session["YourSessionVariableForData"]);
YourGridView.DataBind();
}
Do the Paging in a Stored Procedure. Look into using the ROW_NUMBER() function for this. Create a class that will call the stored procedure and wire it to the grid using an ObjectDataSource.
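A rough sketch of the shape such a class might take - the class, method, stored procedure, and parameter names are illustrative; the ObjectDataSource would point at it with EnablePaging="true", SelectMethod="GetClientsPage" and SelectCountMethod="GetClientCount", so the GridView only ever asks for one page of rows:
public class ClientRepository
{
    // Called by the ObjectDataSource for each page; startRowIndex and maximumRows
    // are the default parameter names supplied when EnablePaging is on.
    public DataTable GetClientsPage(int startRowIndex, int maximumRows)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.GetClientsPage", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@StartRow", startRowIndex + 1);
            cmd.Parameters.AddWithValue("@EndRow", startRowIndex + maximumRows);

            var dt = new DataTable();
            new SqlDataAdapter(cmd).Fill(dt);
            return dt;
        }
    }

    // Lets the grid work out the total number of pages.
    public int GetClientCount()
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Clients", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}
connectionString here would be a field or config value in the real class.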