We have a very old Windows Forms application that communicates with the server using .NET Remoting.
Can anyone recommend a method or a tool to performance test this?
Last time I looked there were no good tools for performance testing a Windows application with many users.
You can use a profiler to see what is going on if the single-user case is too slow.
If you only care about a handful of users, most UI test tools could be used to record a script that you then play back on a few machines.
Otherwise you need to write a command-line application that talks to the server in the same way as the Windows Forms app, and then run many copies of that command-line app.
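A minimal sketch of such a load driver, assuming a hypothetical IOrderService remoting contract and endpoint URL (none of these names come from the original app):

```csharp
using System;
using System.Diagnostics;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;
using System.Threading.Tasks;

// Mirrors the server's remoting contract; the interface and its
// method are hypothetical stand-ins for the real one.
interface IOrderService
{
    void GetOpenOrders(int customerId);
}

class LoadDriver
{
    static void Main(string[] args)
    {
        int clients = args.Length > 0 ? int.Parse(args[0]) : 10;

        // Register a TCP channel so Activator.GetObject can reach the server.
        ChannelServices.RegisterChannel(new TcpChannel(), false);

        // Spin up N simulated clients in parallel and time one call each.
        Parallel.For(0, clients, i =>
        {
            var proxy = (IOrderService)Activator.GetObject(
                typeof(IOrderService),
                "tcp://appserver:8080/OrderService"); // assumed endpoint

            var sw = Stopwatch.StartNew();
            proxy.GetOpenOrders(i); // assumed call on the contract
            sw.Stop();
            Console.WriteLine("client {0}: {1} ms", i, sw.ElapsedMilliseconds);
        });
    }
}
```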
Or get the server to log all calls in such a way that the calls can be played back.
In the past I have thought about getting each client to log its calls to the server in the form of C# statements that could then be compiled to create a performance test program.
There are some systems that claim to record the communication between the client and the server at the network level, and then play it back many times. This may work if it is sensible for lots of clients to send the same requests with the same parameters to the server; otherwise there will be lots of scripting within the tool to create custom requests for each client.
To do performance testing on the application, you can use a profiler, such as the ANTS Performance Profiler. I'm not sure whether it will profile the network communication as well.
Alternatively, you can set up performance counters and record information that way when you run your app.
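For example, a small sampler using the standard Windows counters (the counter names here are the built-in ones; you can also publish custom counters from your app):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CounterSampler
{
    static void Main()
    {
        var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
        var mem = new PerformanceCounter("Memory", "Available MBytes");

        // Sample once a second for a minute while the app is under load.
        // Note: the first CPU reading is a baseline and comes back as 0.
        for (int i = 0; i < 60; i++)
        {
            Thread.Sleep(1000);
            Console.WriteLine("CPU: {0:F1}%  Free RAM: {1} MB",
                cpu.NextValue(), mem.NextValue());
        }
    }
}
```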
I'd like to know my options for the following scenario:
I have a C# WinForms application (developed in VS 2010) distributed to a number of offices within the country. The application communicates with a C# web service which sits on a main server at a separate location, and there is one database (SQL Server 2012) at a further location. (All servers run Windows Server 2008.)
Head Office (where we are) utilizes the same front-end to manage certain information on the database, which needs to be readily available to all offices in real time. At the same time, any data they change needs to be readily available to us at Head Office, as we have a real-time dashboard web application that monitors site-wide statistics.
Currently, the users are complaining about the speed at which the application operates. They say it is really slow. We work in a business-critical environment where every minute waiting may mean losing a client.
I have researched the following options, but I do not come from a DB background, so I'm not too sure which route is best for my scenario.
Terminal Services/Sessions (which I've just implemented at Head Office, and they say it's a great improvement, although there's a terrible lag - like remoting onto someone's desktop - which is not nice to work on.)
Transactional Replication (Sounds quite plausible for my scenario, but it would require all offices to have their own SQL Server database on their individual servers, and they have a tendency to "fiddle" and break everything they're left in charge of! Wish we could take over all their servers, but they are franchises, so they have their own IT people on site.)
I've currently got a whole lot of the look-up data being cached on start-up of the application, but this too takes 2-3 minutes to complete, which is just not acceptable!
Does anyone have any ideas?
With everything running through the web service, there is no need for additional SQL Servers to be deployed local to the client. The WS wouldn't be able to communicate with these databases unless the WS was also deployed locally.
Before suggesting any specific improvements, you need to benchmark where your bottlenecks are occurring. What is the latency between the various clients and the web service, and then between the web service and the database? Does the database show any waits? Once you know the worst-case scenario, improve that, and then work your way down.
Some general thoughts, though:
Move the WS closer to the database
Cache the data at the web service level to save on DB calls
Find the expensive WS calls, and try to optimize the throughput
If the lookup data doesn't change all that often, use a local copy of SQL CE to cache that data, and use the MS Sync Framework to keep the data synchronized to the SQL Server (a rough sketch follows this list)
Use SQL CE for everything on the client computer, and use a background process to sync between the client and WS
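To make the SQL CE suggestion concrete, here is a rough sketch of refreshing a local lookup cache; the table and column names are made up, and the Sync Framework plumbing is omitted:

```csharp
using System.Collections.Generic;
using System.Data.SqlServerCe;

class LookupRow { public int Id; public string Name; }

class LookupCache
{
    const string LocalDb = "Data Source=lookup.sdf"; // assumed local cache file

    // Replace the cached copy of a lookup table with fresh rows from the WS.
    public static void Refresh(IEnumerable<LookupRow> rows)
    {
        using (var conn = new SqlCeConnection(LocalDb))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                using (var clear = new SqlCeCommand("DELETE FROM Lookup", conn, tx))
                    clear.ExecuteNonQuery();

                using (var insert = new SqlCeCommand(
                    "INSERT INTO Lookup (Id, Name) VALUES (@id, @name)", conn, tx))
                {
                    foreach (var row in rows)
                    {
                        insert.Parameters.Clear();
                        insert.Parameters.AddWithValue("@id", row.Id);
                        insert.Parameters.AddWithValue("@name", row.Name);
                        insert.ExecuteNonQuery();
                    }
                }
                tx.Commit();
            }
        }
    }
}
```

On startup the app would then read from lookup.sdf immediately instead of waiting 2-3 minutes for the WS, and refresh the cache in the background.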
UPDATE
After your comment, two additional thoughts. If your web service payload(s) is/are large, you can try adding compression on the web service (if it hasn't already been implemented).
You can also update your client to make the WS calls asynchronously, either in a thread or, if you are using .NET 4.5, with async/await. This would at least allow the client to keep using the UI, but wouldn't necessarily fix any issues with data load times.
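For example, a hedged sketch of the async approach inside a WinForms form (serviceClient, refreshButton, and customerGrid are assumed members, and GetCustomers stands in for your real WS call):

```csharp
// Fragment of a Form class; requires using System.Threading.Tasks;
private async void RefreshButton_Click(object sender, EventArgs e)
{
    refreshButton.Enabled = false;
    try
    {
        // Run the slow WS call off the UI thread; await resumes on the UI thread,
        // so the grid assignment below is safe.
        var customers = await Task.Run(() => serviceClient.GetCustomers());
        customerGrid.DataSource = customers;
    }
    finally
    {
        refreshButton.Enabled = true;
    }
}
```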
I have an application developed in ASP.NET MVC 3 which uses a SQL Server database.
Apart from this, I have a console application which calls an external web service and updates the same database according to the business rules. (Basically, we iterate over the records from the web service, apply the business rules, and update the same database.) We have configured the console application with the Windows scheduler to run periodically.
The problem is, when my console application runs, it uses 100% of the CPU (because we're getting more than 2,000 records from the web service), and because of that my MVC application hangs or sometimes runs very, very slowly, since both applications are hosted on the same Windows server.
Could anybody please let me know how I could resolve this problem? I want to keep both things on the same server because I have a central database used by both applications.
Thanks in advance.
You haven't given enough detail for anyone to really provide a resolution, so I'll simply suggest how I would approach it.
First, I would review the database schema with a DBA to make sure there aren't things like table locks (or if there are, come up with strategies to compensate for them). I would then use the SQL Server Profiler to see where (or if) there are any bottlenecks in SQL Server while these things are running. I would then profile the console application to make sure it's not doing something it doesn't need to be doing. I might even consider profiling the web site to see if there's anything in there that might be contributing to slowness.
After that, I would figure out how to get rid of the console application and work its functionality into the site. Spawning another application on a given web request is not scalable. More than a couple of those come in at once and you've got the potential to bog the server down very easily.
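If you do fold the console app's work into the site, one rough sketch is a timer started in Application_Start (all names here are made up; be aware that IIS app-pool recycles can silently stop a timer like this, so it only suits work where an occasional missed run is acceptable):

```csharp
using System;
using System.Threading;

public class Global : System.Web.HttpApplication
{
    static Timer importTimer; // keep a reference so it isn't collected

    protected void Application_Start()
    {
        importTimer = new Timer(
            _ => ProcessWebServiceRecords(),   // the old console app's work
            null,
            TimeSpan.FromMinutes(1),           // initial delay
            TimeSpan.FromMinutes(30));         // run every 30 minutes
    }

    static void ProcessWebServiceRecords()
    {
        // Fetch records from the external web service, apply the business
        // rules, and update the database -- ideally in small batches so
        // the site isn't starved of CPU.
    }
}
```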
I have a question that is perhaps slightly towards architecture and design rather than a syntax problem...
I have a C# WinForms app on my desktop which I have built, which is similar to a CRM. I make updates to customers' records as they submit orders/invoices etc., and what I would like to do is build (or buy) a module that will update a remote database that sits behind a website onto which registered clients log in. I want the clients to be able to see the status of their invoice/purchase etc. as I have updated it in the WinForms app.
I can think of a couple of options off the top of my head, but would like to know more if you have done something similar.
Things I am considering are:
> Replication - I think this is overkill as the updates are not huge, are one-way only, and it is not critical that they are in real time; also, I am running SQL Express on the winforms app. This can be changed, but I'd rather not.
> Create a text/xml file that gets uploaded to the web server, to a location that is monitored every 5 minutes and then updates the web database. - I am not hosting the website myself, so I do not have complete control over what I can install, but I suspect I can install a .NET 'filewatcher'.
Anyway, I would appreciate your thoughts on my 'problem'.
thanks
I think your best bet is to create a web service of some kind (I like using ServiceStack.net to create simple REST ones, much cleaner imo than WCF). This will sit on the server and be responsible for the server-side sync piece.
On the client, you could either have the winforms app fire off the call to the web service based on some threshold of activity, or you could have a windows service that you install with the winforms app which does it in a scheduled job or on a timer.
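As a rough sketch of the client-side trigger (the SyncAgent name, endpoint URL, and payload shape are all made up for illustration):

```csharp
using System;
using System.Net;
using System.Timers;

class SyncAgent
{
    readonly Timer timer = new Timer(5 * 60 * 1000); // every 5 minutes

    public void Start()
    {
        timer.Elapsed += (s, e) => PushChanges();
        timer.Start();
    }

    void PushChanges()
    {
        // Serialize pending local changes however you like (JSON/XML);
        // this hard-coded payload just stands in for the real thing.
        string payload = "<invoiceStatus id=\"123\" status=\"Paid\" />";

        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.ContentType] = "text/xml";
            client.UploadString("https://example.com/api/invoice-status", payload);
        }
    }
}
```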
You'll want to be sure that you're doing all your calls over SSL of course, and make sure you're authenticating the clients, but that's the basic architectural approach I'd take.
I'm going to develop a POS system for a medium-scale company,
and the requirement is to keep data up to date across all of their branches.
In my mind, moving the server from local to the web would solve this problem,
but I have never set up an online server for a Windows application.
May I know what the best option is for a secure database?
Can SQL Server handle this well?
I tried to Google this, but none of the results were what I wanted.
May I know what you would do when facing this problem?
My knowledge of coding is just VB and C#,
plus SQL for databases.
I would like to learn something new if there is a better option.
I want it to be impossible for anonymous users to access, with the data stored securely on the back-end only.
What you probably want to do is create a series of services exposed on the internet and accessed by your application. All database access would be mediated by these services. For security you would probably want to build them in WCF and expose them through IIS. Then your Windows application would just call these services for most of its processing.
If you design it properly you could also have it work with a local database as well so that it could work in a disconnected manner if, for example, your servers go down.
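A minimal sketch of what such a service contract might look like, assuming made-up operation and DTO names:

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

[ServiceContract]
public interface IPosService
{
    [OperationContract]
    void SubmitSale(SaleDto sale);

    [OperationContract]
    ProductDto[] GetProducts(string branchId);
}

[DataContract]
public class SaleDto { [DataMember] public decimal Total { get; set; } }

[DataContract]
public class ProductDto { [DataMember] public string Name { get; set; } }

// Hosted in IIS over HTTPS; the POS client calls these operations
// instead of opening a direct SQL connection across the internet.
```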
Typically you don't move the server off of the site premises.
The problem is that they will go completely down in the event your remote server is inaccessible. Things that can cause this are internet service interruptions (pretty common) and remote server overload (common enough) - basically anything that can stop the traffic between the store location and your remote server will bring them to their knees. The first time this happens they'll scream. The second time, they'll want your head due to the lost sales.
Instead, leave a SQL Server at each location. Set up a master SQL Server somewhere. Then set up a VPN connection between the stores and this central office. Finally, have the store SQL boxes do merge replication with the central office. Incidentally, don't use the built-in replication, but an off-the-shelf product which specializes in replicating SQL Server. The built-in one can be difficult to learn.
In the event their internet connection goes dark, the individual stores will still be able to function. It will also remain performant, as all of the desktop app traffic goes purely to the local SQL box.
Solving replication errors is much easier than dealing with a flaky ISP.
I would recommend you check out the Viravis Platform.
It is an application platform that can also be used just as an online database for any .NET client with the provided SDK. It has its own generic windows and web clients and some custom web solutions for some specific applications.
You can use it as a complete solution or as a secure online database back-end.
I am making a medium-sized standard LOB application. Currently it's a web application, but I am formulating a proposal to revamp it into a desktop remote application. By this I mean that the database and the application server will be hosted in a remote location. The client application will communicate with the server via the internet (through either WCF / web services / remoting).
My question is this: the only reason I am shifting this from a web platform is due to the constraints of the web (I don't want to do AJAX or JavaScript to minimize those constraints, so please no JS/AJAX recommendations). I have made traditional desktop applications and they are considerably fast, but I have never made a remote or distributed application. I am not sure whether the application will be faster than the web version or not.
As I understand it, the remote desktop application would be much faster. For one, there won't be any postbacks involved (I hate them so much). The data will obviously come via the internet, so in that respect, is it better to shift to the remote desktop just for sheer speed and power?
Any help in the right direction would be gratefully received. Many thanks.
Zeeshan
I think the biggest advantage of desktop clients over web applications is freedom in UI design, and you don't have to worry about any inconsistencies in the client environment, although those are not an issue if you are using a client that runs on Silverlight.
Personally I don't like web applications that require a lot of user interaction. There are some that are a pleasure to use, but I think it is very easy to do it the wrong way and end up with a buggy or unresponsive application (probably because of incompatibilities between browsers; I have IE, Firefox and Chrome installed on my computer, and I use one for some websites because they run faster on it, and the others for other sites because the pages show up correctly only on them). Though this might not be an issue for a Silverlight client.
In terms of network speed, depending on what goes on the wire, even with binary serialization remoting can have quite a bit of overhead. For example, along with the data it writes full class names, library names and their versions, so it can get pretty big and slow even for small amounts of data (although it should still be smaller than HTTP). It also has the same problems that HTTP has over unreliable connections, because it uses a similar protocol. For one project we had to write a custom serializer for some objects because binary serialization alone was generating 200K, while our custom serializer for those objects was generating 50K. Then we ended up writing our own network protocol, because the one that comes with the runtime was frequently stalling over unreliable wireless networks, and remoting doesn't give any control over the socket it creates (which makes sense in terms of encapsulation, but you can't close it and force it to open a new one).
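To illustrate the kind of custom serializer being described (this is not the poster's actual code, just a sketch of the field-by-field approach), writing only the fields you need with BinaryWriter avoids the type and assembly metadata that BinaryFormatter emits for every object:

```csharp
using System.IO;

class OrderLine
{
    public int ProductId;
    public short Quantity;
    public decimal Price;

    // Writes exactly 18 bytes per line: no class names, no assembly info.
    public void WriteTo(BinaryWriter w)
    {
        w.Write(ProductId);
        w.Write(Quantity);
        w.Write(Price);
    }

    // Fields must be read back in the same order they were written.
    public static OrderLine ReadFrom(BinaryReader r)
    {
        return new OrderLine
        {
            ProductId = r.ReadInt32(),
            Quantity = r.ReadInt16(),
            Price = r.ReadDecimal()
        };
    }
}
```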
(I am assuming that you are asking about remoting vs. web apps, not remote desktop vs. web apps; given your note about postbacks, you can't avoid them with a remote desktop session.)
Rewriting an application just for sheer speed? No, because the user probably won't see much difference in response time.
You are somewhat ambiguous with your terminology - do you want a client app that runs on the user's machine, or do you want an app that runs on the server and the user connects via remote desktop (RDP)?
If you are talking about a client app that communicates with the server via WCF etc., then yes, it will be faster than a standard web app, although it will still be slower than a native desktop app. It will be faster than the web app not just because of the lack of postbacks, but also because you will be sending pure data through the wire, not a massive amount of HTML/JavaScript combined with your data. With a client app you have several options, so consider them carefully - do you want Silverlight, WPF, or a native WinForms app? Each has its positives and negatives.
If you are talking about having a client app running on the server which the user then accesses via RDP, then you have other considerations to think of. For any more than two concurrent users you will need to consider buying CALs so the users can connect to the server. At this point you should also be considering whether you should be running a terminal server or a Citrix-type setup instead of using remote desktop.
Edit
When using WCF over a WAN (the internet) you will certainly have to consider how you will secure it. WCF makes it trivial to secure the channel, but you need to consider how you will do authentication - there are a couple of different ways, but you can easily google that stuff yourself. The method you choose will be important due to the limited resources or skill sets of the users.
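As a sketch of one such option, here is transport security combined with username authentication on a WCF client, configured in code (the same can be done in app.config; the contract and address below are placeholders):

```csharp
using System.ServiceModel;

[ServiceContract]
public interface IMyService
{
    [OperationContract]
    string Ping();
}

class SecureClientFactory
{
    public static IMyService Create()
    {
        // HTTPS for the channel, username/password for authentication.
        var binding = new WSHttpBinding(SecurityMode.TransportWithMessageCredential);
        binding.Security.Message.ClientCredentialType = MessageCredentialType.UserName;

        var factory = new ChannelFactory<IMyService>(
            binding, new EndpointAddress("https://server.example.com/MyService"));
        factory.Credentials.UserName.UserName = "user";   // e.g. from a login prompt
        factory.Credentials.UserName.Password = "secret"; // never hard-code in real use

        return factory.CreateChannel();
    }
}
```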
As for what you write it in, you can't argue with WinForms if that is where your experience is. Personally, I would never again use ASP.NET/Ajax/etc. for a web-type application; it would be WPF or Silverlight all the way (I would only use ASP.NET for simple web sites). You can use the Express (free) versions of Visual Studio to write it in; you don't need Expression (it's just a nice-to-have, and is more aimed at the design side than the actual coding side). Deploying the app need not be difficult - Silverlight or WPF XBAPs are delivered via the web, so the user has to do nothing (except for the simple install of the Silverlight plugin or installing the right .NET Framework for WPF - check this link). WinForms or stand-alone WPF require slightly more work, but you can avoid most issues by writing a good installer.
Whichever you choose, make sure you don't underestimate the time for development (because you will have a bit of a learning curve), and also make sure you budget enough time for testing it - especially the security side of it :)
I have been in a similar situation, although started with a Winforms LOB application.
Here's what we found with WinForms...
It's going to be harder to deploy to all client machines in your release cycle.
WinForms can't easily be run on other operating systems (with the exception of Mono).
WCF endpoints can get complicated, and you need to manage an endpoint for each release/version of your application.
Authentication, authorization and security can be tricky to get right!
Here's why you should stick to an HTML web application.
It's going to be easier to deploy, as you just need to copy one set of DLLs into the bin folder. This can be scripted from a continuous integration or staging server.
Security is going to be easy, by using an SSL certificate.
Silverlight/Flash should fill in the gaps that HTML leaves out.
Microsoft has also consolidated its connected-systems technologies (ASMX/Remoting/etc.) in .NET 3.5 into what is now called WCF. It's got quite a learning curve - 4-5 weeks.