Wednesday, 14 May 2008

Having a baseline

While the technical part of database performance tuning is different in almost every case, on the business side of things I run into the same conversations almost every time. My favorite one is the conversation with potential customers. It follows more or less the same structure, and it always begins with the same sentences:

-Customer: our application has performance problems; we're looking for someone to improve it.
-Consultuning: have you already identified the specific parts of the application that are not performing as you need?
-Customer: well, it's this screen/process/job that is perceived as slow.
-Consultuning: OK, let's have a look at it. What exactly do you mean by "slow"?
-Customer: I mean awfully slow. Our business cannot operate with these response times.

Ironically, you'd think that at this point the customer's answer would be "it takes xxxx seconds", but usually that's not the case. It's unusual to meet a customer that has actually timed it, even to the second. For interactive applications ("dialog steps" in the SAP world) this is understandable, as usability studies proved long ago that anything longer than three seconds is perceived as "slow". For batch processes, since they run unattended, the execution time is usually not monitored even though it's captured in the system logs.

But remember: if you don't measure it, you cannot tune it. Any performance improvement effort is a balance of cost, time to implement, and resource consumption, and these have to be weighed against the benefits you're getting. Unless you have some tangible numbers, you simply cannot do that, and you'll have no basis to accept or reject any proposed change. One of my rules of thumb is to reject any proposed change that is not backed by measurements proving the performance gain is worth the cost. This is also one of the points of friction when dealing with people (or automated tools) that try to improve performance by applying "rules of thumb", as the cost and impact of changes is often ignored. But it's worth the discussion.
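Getting that baseline doesn't have to be elaborate. As a minimal sketch (the job here is a stand-in; in practice it would be your actual query, screen action, or batch process), a few timed runs give you numbers to weigh proposed changes against:

```python
import time

def measure_baseline(run_job, runs=5):
    """Time a job several times and return basic statistics in seconds.

    run_job is a placeholder for whatever you want to baseline:
    a database query, a report, a batch process invocation, etc.
    """
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        run_job()
        timings.append(time.perf_counter() - start)
    return {
        "min": min(timings),
        "max": max(timings),
        "avg": sum(timings) / len(timings),
    }

# Illustrative only: baseline a dummy job that sleeps for ~50 ms.
stats = measure_baseline(lambda: time.sleep(0.05), runs=3)
print(stats)
```

Record the numbers before any change, repeat the measurement after, and you have the evidence that a rule of thumb alone can't give you.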

And what about the targets?


Yes, ideally you should have a performance target. The problem is that the very nature of performance tuning is not deterministic: if you already knew your target and knew you could reach it, you would probably have solved the tuning problem already. So the conversation continues...

-Consultuning: well, how fast do you want it to be?
-Customer: I want it to be fast enough, of course.

My advice is to try to define "fast enough", but be aware that no tuning expert worth the title will commit to reaching that performance level, except in the "been there, done that" cases, and those are very rare with bespoke applications. So we have two cases:

-You have a performance target: stop tuning when you reach it, and save your tuning budget for later rounds.
-You don't have a target: then keep this in mind instead: each tuning step yields less performance benefit than the previous one. At some point the gains will be minimal and the cost of implementing them too high. That is the moment to stop tuning.
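That diminishing-returns stopping rule can be sketched as a tiny decision helper. Everything here is illustrative: the gain and cost figures, the units, and the threshold are assumptions you'd replace with your own measurements and economics:

```python
def should_stop(gains, costs, min_ratio=2.0):
    """Decide whether to stop after the latest tuning round.

    gains: measured improvement per round (e.g. seconds saved per run)
    costs: implementation cost per round (e.g. person-hours)
    Stop when the latest gain-per-cost ratio falls below min_ratio
    (the threshold is an assumed business choice, not a universal rule).
    """
    if not gains:
        return False  # nothing measured yet, keep going
    return gains[-1] / costs[-1] < min_ratio

# Typical diminishing returns: each round saves less for similar cost.
gains = [120.0, 30.0, 4.0]   # seconds saved per batch run, per round
costs = [10.0, 10.0, 10.0]   # person-hours per round
print(should_stop(gains, costs))  # prints True: 0.4 s saved per hour
```

The point is not the arithmetic but the discipline: each round gets a measured gain and a cost, and the decision to continue is made against numbers rather than gut feeling.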
