Is there a way for me to determine the amount of memory and processor power my application needs? I recently had a very unpleasant experience when one of my applications kept freezing the computers it was running on. This is obviously related to the lack of hardware power, because the application runs perfectly on the stronger computers I used for testing. So my question is: is there a way to calculate the amount of hardware power needed to run the application smoothly?
Almost all of my applications are written in C#, so I would need a method that works with that kind of application.
Thanks
This is obviously related to the lack of hardware power
This entirely depends on what your application is doing. If you are solving problems in a "not so time efficient way", then you can optimize the code.
I would suggest that you analyze your code with a profiler.
This will tell you:
Which parts of your code are using the most RAM/CPU
How much RAM your application needed in total at its peak
Information about CPU consumption
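If you can't run a full profiler on the target machines, you can at least collect rough numbers from within the application itself. Here is a minimal sketch (DoWork is just a hypothetical stand-in for your own code) that uses System.Diagnostics to report peak memory and total CPU time:

using System;
using System.Diagnostics;

class ResourceSnapshot
{
    static void Main()
    {
        DoWork(); // stand-in for the application code you want to measure

        Process current = Process.GetCurrentProcess();
        current.Refresh();

        // Peak working set: the most physical memory the process has used so far.
        Console.WriteLine("Peak working set: {0} MB", current.PeakWorkingSet64 / (1024 * 1024));

        // Total CPU time consumed by all threads of the process.
        Console.WriteLine("Total CPU time:   {0}", current.TotalProcessorTime);
    }

    static void DoWork()
    {
        // Placeholder for the code under test.
    }
}

Run this on one of the weaker machines and compare the numbers against your development box; it won't replace a profiler, but it will tell you whether memory or CPU is the bottleneck.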
This is obviously related to the lack of hardware power, because it works perfectly on the
stronger computers that I used for testing purposes,
Whoever set up testing should be fired.
You have to have a set of test computers similar to the ones the application will actually run on. That was accepted practice 20 years ago - it seems modern times do not care about that.
Seriously, you NEED to have a test set that is representative of your lowest accepted hardware level.
Otherwise - no, sorry, no magic button. Profilers do NOT necessarily help (under a debugger or profiler the application may use more memory). Try a profiler. Optimize the code. But in the end... you need to have a decent testbed.
I'd argue that this should be checked during software installation. Later, if the user was prompted to upgrade their hardware and dismissed the warning, that's no longer your concern.
If you're using Windows Installer (MSI), you can play with a custom action and use System.Management classes to detect whatever you want.
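As a rough sketch of what such a check could look like (the WMI classes are standard, but failing the install below some threshold, and the exact values, are assumptions on my part, not something MSI gives you for free):

using System;
using System.Management; // add a reference to System.Management.dll

class HardwareCheck
{
    static void Main()
    {
        // Total physical memory, reported by WMI in bytes.
        var memQuery = new ManagementObjectSearcher(
            "SELECT TotalPhysicalMemory FROM Win32_ComputerSystem");
        foreach (ManagementObject mo in memQuery.Get())
        {
            ulong bytes = (ulong)mo["TotalPhysicalMemory"];
            Console.WriteLine("RAM: {0} MB", bytes / (1024 * 1024));
        }

        // Processor name and maximum clock speed.
        var cpuQuery = new ManagementObjectSearcher(
            "SELECT Name, MaxClockSpeed FROM Win32_Processor");
        foreach (ManagementObject mo in cpuQuery.Get())
        {
            Console.WriteLine("CPU: {0} @ {1} MHz", mo["Name"], mo["MaxClockSpeed"]);
        }
    }
}

From a custom action you would evaluate these values against your minimum requirements and warn the user (or abort the install) if they fall short.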
Possible Duplicate:
How to specify the hardware your software needs?
How do you determine the system requirements of a user's PC in order for them to install and run your software?
I am aware of the obvious, such as Windows and .NET Framework [version number]. But how do you come up with the correct RAM, processor and all of that?
Is this just something that you observe while debugging your app? Do you just open Resource Monitor and watch how much disk your app is using, or how much memory it is taking up?
Are there any tools you would recommend to help determine system requirements for my applications?
I've searched for this but I have not been able to find much information.
More importantly, what about the Windows Experience Index? I've seen a few boxed apps in the shop say you need a Windows Experience Index of N, but are there tools that determine what index is required for my app to run?
Until you start doing stress testing and load testing, using or carefully simulating production volumes and diversity of data, you do not really have a high quality build ready for mass deployment.
And when you do, experience (measurements and, if necessary, projection) from this testing will give you RAM, CPU and similar requirements for your customers.
Sure, Resource Monitor is a good way to see how much CPU and RAM the app consumes. But it all depends on the app you're making, and as the developer you know approximately how much power is needed under the hood.
If you're just developing standard WinForms / VCL apps that use standard native controls, you really shouldn't worry too much - 256 MB RAM and a 1 GHz processor should be enough; that is usually what I tend to put on my system requirements page.
For heavy 3D games you should probably look into it more closely; how you do that I can't tell you.
If you REALLY want exact hertz and bytes, you could use a VM and alter the specs and see how your app behaves.
All I know about performance testing is what its name suggests!
But I have some problems, especially with database querying techniques and how they will affect my application's performance under normal conditions and under stress!
So can performance tests measure a certain page's performance for me?
Can I do that on the development machine (my own PC / localhost)?
Or do I have to test it on the hosting server? Do I need to own a server, or is shared hosting okay?
What books/articles are available, and what good free tools can I use?
I know I asked a lot of questions, but they all add up to help anyone who has the same questions spinning in their head when trying to decide which technique to use and can't get a definitive opinion from more experienced people!
Thanks in advance for your time and effort =)
First, if you know you have problems with your db architecture, then it sounds like you don't really need to do load testing at this time; you'd be better served figuring out what your db issues are.
As for the overall question, "how can I load test, and what are some good directions to go?" - it depends on a couple of things. First, you could test in your dev environment, though unless it's the same setup as the production environment (server setup / CPU / memory / etc.), it is only going to be an estimate. In general I prefer to use a staging / test environment that mimics the production environment as closely as possible.
If you think you're going to have an application with high usage, you'll want to know what your performance is, period, whether you're on dedicated or shared hosting. I will say, however, that if you are expecting a high-traffic site / application, you'll probably have a number of reasons to have a dedicated hosting environment (or a cloud-based solution).
There are some decent free tools available; specifically there is http://jmeter.apache.org/, which can plug into a bunch of stuff. The catch is that, while the GUI is better than it was years ago, it's not as good as some of the commercial options available.
You'll ultimately run into an issue where you can only bang on something so much from a single client computer, even with one of these packages, and you'll need to start distributing that load. That is where the commercial packages start to really provide some good benefits.
For C# specifically, and .NET projects in general, Visual Studio (depending on your version) should have something like Test Projects, which you can read more about here: http://msdn.microsoft.com/en-us/library/ms182605(v=vs.80).aspx That may be closer, specifically, to what you were asking in the first place.
The most basic approach, without access to the server, is:
Console.WriteLine("Starting at " + DateTime.Now);
// code under test
Console.WriteLine("Ending at " + DateTime.Now);
Then you can measure which query takes more time.
But you need to test with more scenarios; one approach can be better than another in certain cases, and vice versa in others.
It's a tricky subject, and you will need more than just Stack Overflow to work through this - though I'm not aware of any books or web sites. This is just my experience talking...
In general, you want to know 2 things:
how many visitors can my site handle?
what do I need to do to increase that number?
You usually need to manage these concurrently.
My approach is to include performance testing into the development lifecycle, by creating a test environment (a dev machine is usually okay) on which I can control all the variables.
I use JMeter to run performance tests mimicking the common user journeys, and establish the number of users where the system starts to exceed maximum allowed response times (I typically use 1 second as the limit). Once I know where that point is, I will use analysis tools to understand what is causing the system to exceed its response time - is it the database? Should I introduce caching? Tools like PAL make this easy; at a more detailed level, you should use profilers (Redgate do a great one).
I run this process for an afternoon, once every two weeks, so there's no nasty surprise at the end of the project. By doing this, I have a high degree of confidence in my application's performance, and I know what to expect on "production" hardware.
On production, it's much harder to get access to the data which allows you to analyze a bottleneck - and once the site is live, it's usually harder to get permission to run performance tests which can bring the site down. On anything other than a start-up site, the infrastructure requirements mean it's usually too expensive to have a test environment that reflects live.
Therefore, I usually don't run a performance test on production which drives the app to the breaking point - but I do run "smoke tests", and collect log files which allow the PAL reports to be generated. The smoke test pushes the environment to a level which I expect to be around 50% of the breaking point - so if I think we've got a capacity of 100 concurrent users, the smoke test will go to 50 concurrent users.
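If you just want to fire a fixed number of concurrent requests at a page from C# before reaching for JMeter, a very small sketch looks like this (the URL and user count are placeholders, and it assumes a recent .NET version with HttpClient; a real test tool adds ramp-up, think times and proper reporting):

using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class SmokeTest
{
    static async Task Main()
    {
        const int concurrentUsers = 50;                         // e.g. ~50% of the expected breaking point
        const string targetUrl = "http://localhost/your-page";  // placeholder URL

        using (var client = new HttpClient())
        {
            var stopwatch = Stopwatch.StartNew();

            // Fire all requests at once to simulate concurrent users.
            var requests = new Task<HttpResponseMessage>[concurrentUsers];
            for (int i = 0; i < concurrentUsers; i++)
                requests[i] = client.GetAsync(targetUrl);

            await Task.WhenAll(requests);
            stopwatch.Stop();

            Console.WriteLine("{0} concurrent requests completed in {1} ms",
                              concurrentUsers, stopwatch.ElapsedMilliseconds);
        }
    }
}

Running this from several machines (or a cloud instance) gets you closer to the distributed load that the commercial tools provide out of the box.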
What are some good conventions to follow if I want to make my application harder to crack?
As long as your entire application is client side, it's completely impossible to protect it from being cracked. The only way to protect an application from being cracked is to make it have to connect to a server to function (like an online game, for example).
And even then, I have seen some cracks that simulate a server and send a dummy confirmation to the program so it thinks it's talking to a real, legit server (in this case I'm talking about a "call home" verification strategy, not a game).
Also, keep in mind that where there is a will, there's a way. If someone wants your product badly, they will get it. And in the end you will implement protection that can cause complications for your honest customers and is just seen as a challenge to crackers.
Also, see this thread for a very thorough discussion on this topic.
A lot of the answers seem to miss the point that the question was how to make it harder, not how to make it impossible.
Obfuscation is the first critical step in that process. Anything further will be too easy to work out if the code is not obfuscated.
After that, it does depend a bit on what you are trying to avoid. Installation without a license? The timed trial blowing up? Increased usage of the software (e.g. on more CPUs) without paying additional fees?
In today's world of virtual machines, a long-term anti-cracking strategy has to involve some form of calling home. The environment is just too easy to make pristine. That being said, some types of software are useless if you have to go back to a pristine state to use them. If that is your type of software, then there are rather obscure places to put things in the registry to track timed trials. And in general, use a license key scheme that is hard to forge.
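For the "hard to forge" part, one common pattern is to sign the license data with a private key that never leaves you and verify the signature inside the application. A minimal sketch (the key material is a placeholder and this assumes a crypto provider that supports SHA-256; it is not a complete licensing scheme):

using System;
using System.Security.Cryptography;
using System.Text;

static class LicenseCheck
{
    // Public key only; the matching private key stays with you.
    const string PublicKeyXml = "<RSAKeyValue>...</RSAKeyValue>"; // placeholder

    public static bool IsValid(string licenseData, byte[] signature)
    {
        using (var rsa = new RSACryptoServiceProvider())
        {
            rsa.FromXmlString(PublicKeyXml);
            byte[] data = Encoding.UTF8.GetBytes(licenseData);

            // Forging a valid signature requires the private key.
            return rsa.VerifyData(data, "SHA256", signature);
        }
    }
}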
One thing to be aware of though - don't get too fancy. Quite often the licensing scheme gets the least amount of QA, and hits serious problems in production where legitimate customers get locked out. Don't drive away real paying customers out of fear of copying by people who most likely wouldn't have paid you a dime anyway.
Book: Writing Secure Code 2
There are 3rd party tools to obfuscate your code. Visual Studio comes with one.
BUT, first, you should seriously think about why you'd bother. If your app is good enough and popular enough to desire being cracked, it will be, despite all of your efforts.
Here are some tips, not perfect but maybe could help:
update your software frequently
if your software connects to some server somewhere, change the protocol now and then; you can even have a number of protocols and alternate between them depending on some algorithm
store part of your software on a server and have it downloaded every time the software runs
when you start your program, do a CRC check of the DLLs you load, i.e. keep a list of CRCs for approved DLLs (see the sketch below)
have a service that watches over your main application, doing CRC checks once in a while and monitoring your other dependent DLLs/assemblies
Unfortunately, the more you spend on copy-protecting your software, the less you have to spend on functionality; it's all about balance.
Another approach is to sell your software cheap but to do frequent, cheap upgrades/updates; that way it will not be profitable to crack.
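A minimal sketch of the CRC/hash idea from the list above (the file name and hash value are placeholders; use whatever checksum you prefer, and remember that a determined attacker can still patch the check itself):

using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;

static class IntegrityCheck
{
    // Expected SHA-256 hashes of approved assemblies (placeholder values).
    static readonly Dictionary<string, string> Approved = new Dictionary<string, string>
    {
        { "MyApp.Core.dll", "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b" }
    };

    public static bool Verify(string directory)
    {
        using (var sha = SHA256.Create())
        {
            foreach (var entry in Approved)
            {
                string path = Path.Combine(directory, entry.Key);
                if (!File.Exists(path))
                    return false;

                byte[] hash = sha.ComputeHash(File.ReadAllBytes(path));
                string hex = BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();

                if (hex != entry.Value)
                    return false; // assembly was modified or replaced
            }
        }
        return true;
    }
}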
The thing with .NET code is it is relatively easy to reverse engineer using tools like .NET Reflector. Obfuscation of code can help but it's still possible to work out.
If you want a fast solution (but of course, there's no promise that you won't be cracked - it's just some "protection"), you can search for some tools like Themida or Star Force. These are both famous protection shells.
It's impossible, really. Just release patches often and change the salt in your encryption. However, if your software gets cracked, be proud - it must be really good :-)
This is almost mission impossible, unless you have very few customers.
Just consider - have you ever seen a version of Windows that has not been cracked?
If you invent a way to protect it, someone can invent a way to crack it. Spend enough effort so that when people use it in an "illegal" way, they are aware of it. Most things beyond that risk being a waste of time ;o)
I want to develop a Windows application. If I use native C++ and MFC for the user interface, the application will be very fast and tiny, but using MFC is very complicated. If I use C#, the application will be slower than native code and it requires the .NET Framework to run, but developing a GUI is very easy using WinForms. Which one do you prefer?
"fast" and "slow" are subjective, especially with today's PC's. I'm not saying deliberately make the thing slow, but there isn't nearly as much overhead in writing a managed application as you might think. The JIT etc work very well to make the code execute very fast. And you can also NGEN for extra start-up speed if you really need.
Actually, if you have time to learn it, you might want to consider WPF rather than WinForms - this is a different skill-set, but it allows you to make very good use of graphics hardware etc.
Also - .NET framework comes with new OS installs, and is still very common on those that pre-date it. So for me it would be a fairly clear choice to develop with C#/.NET. The time to develop a robust and fully tested C++ app (with no leaks, etc) is (for me at least) much greater than the same with C#.
Be careful not to optimize too early; you mention the speed of the code but for most Windows user interface operations, the difference is not noticeable as the main bottlenecks of drawing and disk access are no different for either approach.
My recommendation is that you use C# and WPF or WinForms for your user interface. If you encounter some slowdowns, use a profiler to determine where, and then consider replacing some business logic with native code, but only if there is a benefit.
There are a great many possible Windows applications, and each has its own requirements.
If your application needs to be fast (and what I work on does), then native C++ is a good way to go.
If your application needs to be small (perhaps for really fast transmission over slow lines), then use whatever gets it small.
If your application is likely to be downloaded a lot, you probably want to be leery of later versions of .NET, which your users might not have yet.
If, like most, it will be fast and small enough anyway on the systems it's likely to be used on, use what will allow you to develop fastest and best.
In almost all cases, the right thing to optimize is developer effort. Use whatever gets a high-quality job done fastest and best.
First.. (though I'm a die-hard C++ coder) I have to admit C# is in most cases perfectly fine where speed and size are concerned. In some cases the application is smaller, because the interpreted part is already on the target system. (Don't spam me on that one; an app with a DLL is smaller than the app all in one. Windows just happens to ship with the "DLL" already there.)
As to coding.. I honestly don't think there is a significant difference. I don't spend a lot of my time typing code. Most of it is thinking out a problem. The code part is quite small. Saving a few lines here and there.. Blah, it's not an argument for me.. If it were, I'd be working in APL. Learning the STL, MFC and what have you is likely just as intensive as learning the C# libraries. In the end they're all the same.
C# does have one thing going for it.. A market. It's the latest "hot" skill and so there's a market for it. Jobs are easy to find. Now keep in mind Java was a "hot" skill a few years back and now every Tom, Dick and Harry has it on their resume. That makes it harder to niche yourself.
OK.. all that said.. I LOVE C++.. There's nothing like getting dirty when I really need to. When the MFC libs don't do the job, I take a look at what they're sitting on, and so on and so on.. It's a perennial language and I believe it's still at or near the most used language in the world. Yay C++, yay!
Note also that most Windows computers already have .NET installed on them, so that really shouldn't be a concern.
Also, aside from the .NET installation, .NET applications tend to be quite small.
And for most application with a UI, the speed of the User is the really limiting time factor.
C# applications are slower to start than MFC applications, but you might not notice a speed difference between the two once the application is loaded.
Having no information on the application you plan to develop, I vote for WPF.
In my opinion, the requirements should help you decide the platform. What is more important: having an application that is easily maintainable, or one that must be extremely fast and small?
A large class of applications nowadays can be written using .NET and managed code and this is in general beneficial to the development in the long term. From my experience, .NET applications are usually fast enough for most use cases and they are simpler to create.
Native C++ still has its uses, but "faster and smaller" alone does not sound like enough of a justification when "fast enough and small enough" is sufficient.
The speed argument between native and managed code is largely a non-issue at this point. Each release of the .NET Framework makes performance improvements over the previous ones and application performance is always a very high priority for the .NET development teams.
Starting with Windows Vista and Windows Server 2008, the .NET Framework is installed as part of the operating system. It is also part of Windows Update so almost any Windows XP system will also have it installed. If the requirement that the framework be installed on the target machine is really that much of a problem there are also compilers that will essentially embed the required runtime portions into your application to generate a single exe, but they are expensive (and in my opinion, not really worth the cost).
MFC isn't hard to learn; actually it is very easy.
Almost the same as C#.
The choice of a language or tool should be dictated by the functional and performance requirements of your project and by your expertise. If performance is a real consideration for you and you have done some analysis to prefer C++ over C#, then you have a decision already. Note, though, that an MFC-based application is not terribly efficient either. On the other hand, the overheads in .NET applications are over-stated.
Performance is really a function of how well you write your code and what scalability requirements exist. If you only have to work with one client and a maximum of 1K database records, then we should not be talking about performance.
If ease of development and maintainability is more important, certainly C# would be the choice.
So I am not sure this question can be answered as a simple choice of A or B with the data you have provided. You need to do the analysis of functional and non-functional requirements and decide.
Also if I use C# then the application will be slower than native code and it requires the .NET Framework to run
An MFC app requires the MFC DLLs to run (and probably the VC runtime as well!), so they might need to be installed; or, if they are statically linked, you add to the size of the exe.
.NET is easier to work with. Unless you'll lose users by using it or will have trouble with code migration, you should probably use .NET. It is highly unlikely that speed will be an issue for this. Size probably doesn't matter that much, either.
Which technology are you more familiar with?
The information you gave does not include anything that would help decide. Yes, MFC apps tend to be smaller (if you include the runtime size, which isn't a suitable measure in the long run), more responsive, and more costly to develop. So what?
I've not been coding long, so I'm not familiar with which technique is quickest, and I was wondering if there is a way to do this in VS or with a 3rd-party tool?
Thanks
Profilers are great for measuring.
But your question was "How can I determine where the slow parts of my code are?".
That is a different problem. It is diagnosis, not measurement.
I know this is not a popular view, but it's true.
It is like a business that is trying to cut costs.
One approach (top down) is to measure the overall finances, then break it down by categories and departments, and try to guess what could be eliminated. That is measurement.
Another approach (bottom up) is to walk into an office at random, pick someone at random, and ask them what they are doing at that moment and (importantly) why, in detail.
Do this more than once.
That is what Harry Truman did at the outbreak of WW2, in the US defense industry, and immediately uncovered massive fraud and waste, by visiting several sites. That is diagnosis.
In code you can do this in a very simple way: "Pause" it and ask it why it is spending that particular cycle. Usually the call stack tells you why, in detail.
Do this more than once.
This is sampling. Some profilers sample the call stack. But then for some reason they insist on summarizing time spent in each function, inclusive and exclusive. That is like summarizing by department in business, inclusive and exclusive.
It loses the information you need, which is the fine-grain detail that tells if the cycles are necessary.
To answer your question:
Just pause your program several times, and capture the call stack each time. If your code is very slow, the wasteful function calls will be on nearly every stack. They will point with precision to the "slow parts of your code".
ADDED: RedGate ANTS is getting there. It can give you cost-by-line, and it is quite spiffy. So if you're in .NET, and can spare three figures, and don't mind waiting around to install and learn it, it can tell you much of what your Pause key can tell you, and be much prettier about it.
Profiling.
RedGate has a product.
JetBrains has a product.
I've used ANTS Profiler and I can join the others in recommending it.
The price is NEGLIGIBLE when you compare it with the number of dev hours it will save you.
If you're a developer for a living and your company won't buy it for you, either change companies or buy it for yourself.
For profiling large, complex UI applications you often need a set of tools and approaches. I'll outline the approach and tools I used recently on a project to improve the performance of a .NET 2.0 UI application.
First of all, I interviewed users and worked through the use cases myself to come up with a list of target use cases that highlighted the system's worst-performing areas. I.e. I didn't want to spend n man-days optimising a feature that was hardly ever used but very slow. I would want to spend time, however, optimising a feature that was a little bit sluggish but invoked 1000 times a day, etc.
Once the candidate use cases were identified, I instrumented my code with my own lightweight logging class (I used some high-performance timers and a custom logging solution because I needed sub-millisecond accuracy). You might, however, be able to get away with log4net and timestamps. The reason I instrumented code is that it is sometimes easier to read your own logs rather than the profiler's output. I needed both for a variety of reasons (e.g. measuring .NET user control layouts is not always straightforward using the profiler).
I then ran my instrumented code with the ANTS profiler and profiled the use case. By combining the ANTS profile and my own log files I was very quickly able to discover problems with our application.
We also profiled the server as well as the UI and were able to work out breakdowns for time spent in the UI, time spent on the wire, time spent on the server etc.
Also worth noting is that one run isn't enough, and the first run is usually worth throwing away. Let me explain: PC load, network traffic, JIT compilation status etc. can all affect the time a particular operation takes. A simple strategy is to measure an operation n times (say 5), throw away the slowest and fastest runs, then analyse the remaining profiles.
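A small helper that applies that strategy (the action being timed is whatever operation you are measuring; five runs with the fastest and slowest discarded simply follows the convention above, and as noted you may want an extra warm-up call first so JIT compilation doesn't skew the result):

using System;
using System.Diagnostics;
using System.Linq;

static class Timing
{
    // Runs the action several times, drops the fastest and slowest runs,
    // and returns the average of the remaining runs in milliseconds.
    // Needs at least three runs for the trimming to leave anything.
    public static double TrimmedAverageMs(Action action, int runs = 5)
    {
        var samples = new double[runs];
        for (int i = 0; i < runs; i++)
        {
            var sw = Stopwatch.StartNew();
            action();
            sw.Stop();
            samples[i] = sw.Elapsed.TotalMilliseconds;
        }
        return samples.OrderBy(ms => ms).Skip(1).Take(runs - 2).Average();
    }
}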
Eqatec profiler is a cute small profiler that is free and easy to use. It probably won't come anywhere near the "wow" factor of Ants profiler in terms of features but it still is very cool IMO and worth a look.
Use a profiler. ANTS costs money but is very nice.
I just set breakpoints; Visual Studio will tell you how many milliseconds have passed between breakpoints, so you can find it manually.
ANTS Profiler is very good.
If you don't want to pay, the newer VS versions come with a profiler, but to be honest it doesn't seem very good. ATI/AMD make a free profiler... but it's not very user friendly (to me, at least; I couldn't get any useful info out of it).
The advice I would give is to time the function calls yourself with code. If they are fast and you do not have a high-precision timer, or the calls vary in slowness for a number of reasons (e.g. every x calls building some kind of cache), try running each one 10,000 times or something, then dividing the result accordingly. This may not be perfect for some sections of code, but if you are unable to find a good, free, 3rd-party solution, it's pretty much what's left unless you want to pay.
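A rough sketch of that "run it many times and divide" idea (the iteration count and MethodUnderTest are placeholders; the warm-up call keeps JIT compilation out of the measurement):

using System;
using System.Diagnostics;

class MicroBench
{
    static void Main()
    {
        const int iterations = 10000;

        MethodUnderTest(); // warm-up so JIT cost is excluded

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            MethodUnderTest();
        sw.Stop();

        Console.WriteLine("Average: {0:F4} ms per call",
                          sw.Elapsed.TotalMilliseconds / iterations);
    }

    static void MethodUnderTest()
    {
        // Placeholder for the code whose speed you want to compare.
    }
}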
Yet another option is Intel's VTune.