When I've worked with clients to devise CM practices, tools, and processes, I've found that the "mix" of tools depends on the environment. If you have one or a small number of highly integrated tiers serving up some core business applications, then a dedicated modelling tool such as those made by TeamQuest or BMC may be appropriate.
There are products on the market that can be used to shoe-horn in application monitoring as well: AmberPoint (now owned by Oracle) and Correlsense, for example.
If your organization is a large enterprise with a heterogeneous mix of platforms, then you face a different set of problems. What I've seen clients do is buy "best-of-breed" host monitoring software such as HP's OpenView, Microsoft's SCOM/MOM, or IBM's Tivoli, not to mention SAN performance data collection tools. The problem there is that clients end up with a huge amount of performance data and a set of static alarms that help with performance monitoring - but they under-leverage the potential of the dataset for capacity planning. Also, these tools do a terrible job of dealing with huge datasets from thousands of entities when it comes to spotting capacity-relevant (as opposed to real-time performance) issues. Usually, clients in this situation end up developing a witches' brew of custom analysis and reporting tools, using SAS, Excel, R, etc.
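To make the point concrete, here is a minimal, hypothetical sketch (in Python, though clients often do this in R or Excel) of the kind of capacity-trend analysis that the raw monitoring data supports but the alarm-centric tools don't surface: fit a linear trend to daily utilization samples and project when a host will cross a capacity threshold. The function name, threshold, and synthetic data are all illustrative assumptions, not any vendor's API.

```python
# Illustrative sketch: the trend/projection analysis clients often
# rebuild by hand on top of raw host monitoring data.
import numpy as np

def days_until_threshold(days, cpu_pct, threshold=80.0):
    """Fit a linear trend to daily CPU utilization samples and
    estimate how many days remain until the trend crosses the
    threshold. Returns None if utilization is flat or declining."""
    slope, intercept = np.polyfit(days, cpu_pct, 1)
    if slope <= 0:
        return None  # no growth trend; no projected exhaustion
    current = slope * days[-1] + intercept
    return (threshold - current) / slope

# Synthetic example: utilization growing ~0.5 points/day from 50%.
days = np.arange(60)
cpu = 50.0 + 0.5 * days
print(days_until_threshold(days, cpu))  # roughly 1.0 day remaining
```

Static per-host alarms would only fire once the threshold is breached; even a crude projection like this turns the same dataset into a forward-looking capacity signal, which is the under-leveraged value mentioned above.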
I've developed a software solution aimed at pulling together, analyzing, and reporting on huge performance datasets for such clients - but I won't turn this into a marketing email.
Director - Capacity Management