LINQ to Objects performance and best implementation - C#

I work on a web application that modifies an XML document stored in a database. It is meant to mimic exactly what any program does when you click the "Preferences" option in the menu. The XML document is read by a local client application, and any changes are reflected in that client.
My web app has three tiers, I guess you would say, with the aspx page being the "view", a service layer where I validate and process all user input, and a data layer where I update the XML document.
For each page of settings, I had been recursively traversing all the controls on the page, and if I came to one that I wanted to work on (checkbox, textbox, ...) I added it to an IList and then sent that IList to the service layer, where I had to pull the control back out of the list so I could work on it.
I noticed this seemed a bit slow, so I profiled the page, and it turned out the LINQ to Objects queries I had been using eat up a ton of time:
(CheckBox)lstControls.Where(x => x.ID == "some_id").SingleOrDefault();
I then switched to manually adding the controls on the page to an IList and pulling them out in the service layer by indexer, in the order they were put in. This is fugly and completely dependent on you not screwing up the index of the control you are looking for.
Finally, this breaks the rule against mixing elements of the view into the service layer or data layer. I know I should be working with data only, but I am at a loss as to how to do this efficiently.
Each settings page has from one to thirty controls that need to be processed. How do I get all the data from the controls to the service layer without sending the actual controls?
Thanks for the help....

You might get into a better situation if you gather the data (the stuff you are actually interested in) into some object structure with semantics that help you shape what is passed on to the next layer. The keyword "databinding" should also help you get at just the values that were entered. There is no need to pass any controls back to the next layer.
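As a minimal sketch of that idea (the setting, control, and service names here are made up), the code-behind maps each control's value into a plain DTO, so only data ever crosses the layer boundary:
public class PreferencesModel
{
    public bool EnableLogging { get; set; }
    public string ServerName { get; set; }
}

// In the page's code-behind: read each control directly, no list scans.
protected void btnSave_Click(object sender, EventArgs e)
{
    var model = new PreferencesModel
    {
        EnableLogging = chkEnableLogging.Checked,
        ServerName = txtServerName.Text
    };
    _preferencesService.Save(model); // the service layer never sees a Control
}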

Related

How do I use multiple user controls on an aspx page without making the same database calls in more than one control?

I'm currently on a project where each .aspx page is made up of multiple user controls (.ascx files). In some cases, more than one user control for a single page is making a call to the database for the same data. Two user controls might call the database for the same customer object to perform the tasks necessary to their individual circumstances. After all is said and done, some of the web pages end up making the same database calls 10 or more times because each user control needs the data for something it is doing. This does not seem efficient.
What are the best practices for handling this situation using ASP.NET Webforms? We have tried caching the database calls, but it just doesn't feel like a solid solution to the problem, although it has helped with performance. When using MVC, I can pass in a strongly-typed object containing all the data needed for a particular page, and the query for the data is only made one time. How do I achieve something similar using WebForms?
The best practices are the same as for any OO programming: Don't Repeat Yourself.
In this case, rather than having the user controls query the database to get their data, have the data passed to the controls, through properties. You can even use data binding, just like a "real" server control.
BTW, there's nothing at all preventing you from passing a strongly-typed object to the user controls, just like you would in MVC:
In MyControl.ascx.cs:
public MyControlModel Model { get; set; }
Then just have the calling page set Model before the control data binds.
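For example, in the calling page (the control and repository names here are hypothetical):
protected void Page_Load(object sender, EventArgs e)
{
    // One database call, shared by every control that needs the data.
    MyControlModel model = _repository.GetModel();
    HeaderControl.Model = model;
    DetailsControl.Model = model;
    DataBind(); // each control reads its Model property while binding
}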

Designing app to load, edit, and save hierarchical data

I am writing a GUI that will be integrated with SAP Business One. I'm having a difficult time determining how to load, edit, and save the data in the best way possible (reliable, fast, easy).
I have two tables, called Structures and StructureRows (not great names). A structure can contain other structures, generics, and specifics. The StructureRows table holds all of these items, each with a type associated with it. The generics are placeholders for specifics, and the specifics are actual items in inventory.
A job will contain job metadata as well as n structures. On the screen where you edit the job, you can add structures and delete structures as well as edit the rows underneath them. For example, if you added Structure 1 to Job 1 and Structure 1 contains Generic 1, the user would be able to swap Generic 1 for a Specific.
I understand how to store the data, but I don't know the best way to load and save it...
I see a few different options:
When someone adds a structure to a job, load the structure, and then recursively load any structures beneath it (the generics and specifics will already be loaded). I would put this all into an object model such as a List<Structure>, where each Structure object would have a List<Generic> and a List<Specific>. When I save the changes back to the database, I would have to manually loop through the data and persist the changes.
Somehow load the data into a view in SQL and then group and order the datatable/dataset on the client side. Bind the data to a GridView so changes are automatically reflected in the dataset. When you go to save, SQL / ADO.NET could handle this automatically? This seems like the ideal solution, but I don't know how to actually implement it...
The part that throws me off is being able to add a structure to a structure. If it wasn't for this, I would select the Specifics and Generics from the StructureRows table, and group them in the GUI based on the Structure they belong to. I would have them in a DataTable and bind that to the GridView so any changes were persisted automatically to the DataTable and then I could turn around and push them to SQL very easily...
Is loading and saving the data manually via an object model the only option I have? If not, how would you do it? I'm not sure if I'm just making it more complicated than it needs to be, or if this is actually difficult to do with C#, ADO.NET, and MS SQL.
The HierarchyID datatype was introduced in SQL Server 2008 to handle this kind of thing. I haven't done it myself, but here's a place to start that gives a fair example of how to use it.
That being said, if you aren't wedded to your current tables, and don't need to query out the individual elements (in other words, you are always dealing with the job as a whole), I'd be tempted to store the data for each job as XML. (If you were doing a web-based app, you could also go with JSON.) It preserves the hierarchy, and there are many tools in .NET for working with XML. There's also a built-in TreeView class for WinForms, and doubtless other third-party controls available.
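If you go the XML route, here's a rough sketch (the Structure class is hypothetical); XmlSerializer handles the nesting for free, since the object graph is a tree:
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Structure
{
    public string Name;
    public List<Structure> Structures = new List<Structure>();
    public List<string> Generics = new List<string>();
    public List<string> Specifics = new List<string>();
}

public static class JobXml
{
    static readonly XmlSerializer Serializer = new XmlSerializer(typeof(Structure));

    // Serialize the whole tree to a string, e.g. for an XML column on the job row.
    public static string Save(Structure root)
    {
        using (var writer = new StringWriter())
        {
            Serializer.Serialize(writer, root);
            return writer.ToString();
        }
    }

    public static Structure Load(string xml)
    {
        using (var reader = new StringReader(xml))
        {
            return (Structure)Serializer.Deserialize(reader);
        }
    }
}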

Web application to allow users to pick and choose objects used as building blocks?

I'm currently developing an application using ASP.NET MVC, and now I need to create an interface (web page) that will allow users to pick and choose, from a set of different objects, the ones they'd like to use as building blocks for constructing a more complex object.
My question is meant to be generic, but to give a concrete example, let's say the application will allow users to design pieces of furniture, like wardrobes, kitchen cabinets, etc. So, I've created C# classes representing the basic building blocks of furniture design, like basic shapes (pieces of wood that added together form a box, etc.), doors, doorknobs, drawers, and so on. Each of these classes has some common properties (width, height, length) and some specific properties, but all descend from a base class called FurnitureItem, so there are ways for them to be 'connected' together and interchanged. For instance, there are different types of doors that can be used in a wardrobe, like SimpleDoor, SlidingDoor, and so on. The user designing the furniture would have to choose which type of Door object to apply to the current piece. There are also other items, like dividing panels, shelves, drawers, etc. The resulting model would of course be a completely customized, modularly designed wardrobe or kitchen cabinet, for example.
The problem is that while I can easily instantiate all the objects I need and connect them together using C#, forming a complete furniture item, I need to provide a way for users to do this through a web interface. That means they would probably have a toolbox or toolbar of some sort and would select (maybe drag and drop) items onto a design panel in the web interface. So, in the browser I cannot have my C# class implementation... and if I post the selected item to the server (either a form post or using Ajax), I need to reconstruct the whole collection of objects the user has already chosen, so I can fit in the newly added item, calculate its dimensions, etc., and then finally return the complete modified set of objects...
I'm trying to think of different ways of caching or persisting these objects while the user is still designing (adding and deleting items), since there may be many round trips to the server, because the proper calculation of dimensions (width, height, etc. of contained objects) is done on the server by methods of my C# classes. It might be nice to store the objects for the current piece of furniture being designed in a session or cache object per user. Even then, I need to be able to provide some kind of ID for the object being added and the one it is being added to, in a parent/owner kind of way, so I can properly identify the object instances back on the server where the new instance will be connected.
I know it's somewhat confusing... but I hope this gives an idea of the problem I'm facing. In other words, I need to keep a set of interconnected objects on the server, because they are responsible for calculations and applying constraints, while allowing users to manipulate each of these objects and how they are connected, adding and deleting them, through a web interface. At the end, the whole thing can be persisted to a database. Ideally I even want to give the user a visual representation or feedback, so they can see what they are designing as they go along...
Finally, the question is more about what approach I should take to this problem. Are C# classes on the server enough (encapsulating the calculations and maybe generating each object's own graphical representation to send back to the client)? Will I need to create similar classes in JavaScript to allow a slicker user experience? Will it be easier if I manage to keep the objects alive in a session or cache object between requests? Or should I just re-instantiate all the objects that form the whole piece of furniture on each user interaction (for calculation)? In that case, would I have to post all the objects, with all their already-customized properties, every time?
Any thoughts or ideas on how to best approach this problem are greatly appreciated...
Thanks!
From the way you've described it, here is what I'm envisioning:
It sounds like you do want a slick-looking UI, so yes, you'll want to divide your logic into two sets: a client-side set for building and a server-side set for validation. I would go heavy on the JavaScript so that the user can happily build their widget disconnected, and then validate everything once it's posted to the server.
Saving to a session opens a whole can of web-farm worms. If these widgets can be recreated in less than a minute (once the user has decided what they like), I would avoid saving partials altogether. If it's absolutely necessary, though, I would save them to the database.
If the number of objects to construct a widget is reasonable, it could all come down at once. But if there are hundreds of types of 'doors' you're going to want to consider asynchronous calls to load them, with possible paging/sorting.
I'm confused about your last part, about instantiating/posting all the objects that form the whole piece of furniture. This shouldn't be necessary. I imagine the user would do the construction on the client and then pass up a single widget object to the server for validation.
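As a rough sketch of that round trip (the model and service names are made up), the client would post the finished design once and the server would validate and recalculate in one shot:
using System.Web.Mvc;

public class DesignController : Controller
{
    [HttpPost]
    public ActionResult Validate(Wardrobe design) // bound from the posted fields
    {
        // Recalculate dimensions and constraints server-side;
        // never trust values computed in the browser.
        ValidationResult result = _designService.Validate(design);
        return Json(result);
    }
}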
That's what I'm thinking anyway... by the way, hello StackOverflow, this is my first post.
You might want to take a look at Backbone.js for this kind of project. It allows you to create client-side models, collections, views, and controllers that would be well suited to your problem domain. It includes built-in Ajax code for loading/saving those models/collections to/from the server.
As far as storing objects before the complete object is sent to the server, you could utilize localStorage, and store your object data as a JSON string.

How to improve performance of XtraReports using a web service to get the data?

I'm facing a potential performance problem using the XtraReports tool with a web service, in a Windows Forms app.
I know XtraReports handles a large data set (by which I mean 10,000+ rows) by loading the first pages and then continuing to load the rest of the pages in the background, but all of this is done with the data source in hand. So what happens if that data source has to pass through a web service, which will need to serialize the data in order to send it to the client?
The scenario is the following:
I have a thin Windows Forms client that makes calls to a web service, which takes the call and, by reflection, instantiates the corresponding class and calls the required method. (Please note that this architecture is inherited; I have almost no choice about it, I have to use it.) So I'll have a class that gets the data from the database and sends it to the client through the web service interface. This data can be a DataSet, SqlDataReader (also note that we're using SQL Server 2000, though it could be 2008 by the end of the year), DataTable, XML, etc.
If the result set is large, the serialization plus transfer time can be considerable, and then rendering the report can add still more time, degrading the overall performance.
I know there is such a thing as streaming video; is there something similar for streaming data through a web service? I have no leads on how to even try something along those lines.
What do you think about this? Please let me know any questions you may have or if I need to write more info for better statement of the problem.
Thanks!
I'll give you an answer you probably don't want to hear.
Set expectations.
Reports are typically slow because they have to churn through a lot of data. There just isn't a good way to get around it. But barring that, I'd do the following:
Serialize the data to a binary state, convert it to something transferable via SOAP (Base64, for instance), and transfer that; see the sketch after this list. This way you'll avoid a lot of useless angle brackets.
Precache as much data on the client as possible.
Focus on the perceived performance of the application. For instance, throw the report data gathering onto a background thread, so the user can go and do other work, then show the user a notification when the report is available.
Sometimes, it is possible to generate the report for the most often used criteria ahead of time and provide that when asked.
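Here's the sketch of the binary-plus-Base64 idea from the first point (this uses the DataSet's binary remoting format; BinaryFormatter was the era-appropriate choice, though it is considered obsolete in current .NET):
using System;
using System.Data;
using System.IO;
using System.IO.Compression;
using System.Runtime.Serialization.Formatters.Binary;

public static string ToBase64(DataSet data)
{
    // Binary remoting format avoids serializing the DataSet as XML.
    data.RemotingFormat = SerializationFormat.Binary;
    var formatter = new BinaryFormatter();
    using (var buffer = new MemoryStream())
    {
        using (var zip = new GZipStream(buffer, CompressionMode.Compress))
        {
            formatter.Serialize(zip, data);
        }
        // MemoryStream.ToArray() still works after the stream is closed.
        return Convert.ToBase64String(buffer.ToArray());
    }
}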
Is there a way you can partition the data, i.e. return a small subset of it?
If there's no absolute requirement to return all the data at the same time, then dividing up the data into smaller pieces will make a huge difference when reading from the database, serializing, and transporting through the webservice.
Edit: I had deleted this answer since you already mentioned partitioning. I'm undeleting it now, maybe it will serve some use as the starting point of a discussion...
As for your point about how to work with paging: I think you're on the right track with the "Show next 100 results" approach. I don't know how XtraReport works or what its requirements for a data source are. There are three issues I see:
Server support for partitioned data (e.g. your web service should support returning just page 3's data); a sketch follows this list.
Summary Data - does your report have a row of totals or averages? Are those calculated by the XtraReport control? Does it require a complete dataset to display those results? Can you provide the summaries to the control on your own (and find a more efficient way to calculate them without returning the entire data set?)
XtraReport support for a datasource that uses paging.
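For the first of those, the web service end might look something like this (ReportRow and the repository are hypothetical):
using System.Collections.Generic;
using System.Web.Services;

[WebMethod]
public List<ReportRow> GetReportPage(int pageIndex, int pageSize)
{
    // SQL Server 2000 has no OFFSET/FETCH, so the repository would page
    // with a TOP query keyed on the last row of the previous page.
    return _repository.GetPage(pageIndex, pageSize);
}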
Transferring DataSets is a bad idea. A DataSet carries a lot of useless data. Use simple objects, and use an ORM on the server side.
You can also precalculate some data. Reference data can be cached on the client and then joined with the server data.

Adding "Export to XML" to Dynamic Data Site

I am working on a business layer for a complex web application and am temporarily using the Dynamic Data Site functionality to allow data to be entered into the many tables that I need to maintain. I don't want to spend too much time on this DDS, since the business layer needs to be finished first. Once the business layer is done, it gets shipped to someone else to add a better user interface.
However, while the DDS offers a lot of functionality in a very easy way, I just want to extend it with an "Export to XML" button or link. (And I'll probably add an "Export to Excel" button later.)
So, has anyone done something like this already? What would be the easiest way to implement this in .NET, without rewriting the DDS?
(I use an Entity model for the database connection and much of the business layer is built upon this entity model. Once the business layer is finished, the real GUI interface will be developed for this web application but for now I just need a good way to input/output this data.)
I have no problems converting an entity set to XML. That's the easy part. My problem lies in expanding "ListDetails.aspx" with an additional button which the user can click. Once clicked, it should export the dataset to XML. To make it interesting, if the user has set one or more filters, it should only export those filtered records.
I think I'll have to look into the "GridDataSource" object that's on this page and experiment with it. Will it return the whole table or just the filtered dataset? Or just the records that are on the current page?
Now, with export, I just want a dump of the dataset to XML. Basically, what you see should end up in the final XML. If I have access to the filtered dataset then creating the XML would be easy. (And creating the Excel sheet on top of that is a piece of cake too.) In general, the export is just used to help develop the business layer of a project I'm working on. Most of the code is business logic that will be used in other (web/desktop) client applications but while the project is still in progress, the DDS is needed for easier data entry for the project. Once it's finished (a gazillion years from now, I guess) then the DDS won't be used anymore. Nor would we use the XML exports or the export sheets. But for now, those exports are useful to evaluate the data. (Since I still have to develop the more complex analysis tools.)
This is fairly straightforward; you've got to address a few issues:
Providing a means to trigger the export
Generating the XML
Making the XML available (as a link) for download - assuming that that's what you want to do.
There's a slightly less straightforward alternative which is to create a service to generate and return the XML.
In terms of the first, there's nothing to stop you editing either the master page or the default page to add your own functionality, i.e. a button or a link to an XML generation page.
In terms of the second, LINQ makes it almost trivial to generate XML from your Entity model.
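For example, with LINQ to XML (the Customers entity set here is hypothetical):
using System.Linq;
using System.Xml.Linq;

var doc = new XDocument(
    new XElement("Customers",
        from c in context.Customers.AsEnumerable() // materialize first; XElement can't be built in SQL
        select new XElement("Customer",
            new XAttribute("Id", c.Id),
            new XElement("Name", c.Name))));
doc.Save(Server.MapPath("~/Exports/customers.xml"));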
Once you've got your XML, you've got various options; the key here is that you can add your own pages to the site if you want. The magic in Dynamic Data is simply a starting point, not the final product (although if it does all you need, then you can walk away with a smile on your face).
I appreciate that these are generic answers, but it's a fairly generic question, and the details of implementation would be better addressed by more specific questions.
In terms of specifics: I have a Dynamic Data site which needs to generate XML. The first iteration was simply a button on the default page that saved a file to disk (one file name, one file format; click, gen, save, done). The reason for the XML was to act as the source data for another site, so I then added a WCF service which exposes the same XML. Total time spent (less a bit for getting my head around WCF) was probably less than half a day, most of which went on fiddling with the XML output.
Maybe you can do something with FileHelpers.
