VMware: Why Two Approaches

Oracle Tips by Burleson Consulting

I believe it often helps to understand why we do things a certain way, and tuning databases is no exception. So I asked myself: why are there just two primary approaches to database tuning?

I believe I found a scientific explanation. The human brain is known to consist of two distinct hemispheres, with the characteristics of each hemisphere listed in Figure 1 below.

Figure 1:  Characteristics of Right and Left Hemispheres of the Brain

Look at the three highlighted characteristics in the figure. The first group (especially those embracing colorful GUI dashboards) attacks its performance issues using the right hemisphere of the brain, whereas the second group, employing instrumentation analysis such as trace files, very clearly focuses on what happened and when. The second group is therefore attacking the problem using the left hemisphere of the brain. This seems to explain why there are only two ways to tune databases.

Yet a Third Dimension

Now we know that DBAs generally fall into one of two camps, and why. But I am going to add a third dimension: the 50,000-foot view. One thing humans can do better than computers is to see the forest for the trees. Some would call this capability abstract reasoning, defined as the ability to analyze information and solve problems on a complex, thought-based level. Computers are really nothing more than super-fast calculators: they can run millions of well-defined transactions per second. People may not be that quick, but we can think, and that allows us to readily solve problems computers cannot.

That is the basis for my addition of a third dimension to the practice of database tuning. While the two camps defined in the prior section tend to work predominantly one way or the other, I generally prefer to do both, and to add my own additional level of tests. I refer to these extra checks as falling into one or more of the following categories:

  • Low Hanging Fruit

  • Obvious yet Overlooked

  • Subtle yet Highly Critical

So my database optimization and tuning efforts follow a simple three-step process, which is really nothing more than a combination of all of these techniques, since I do not expect any one of them to find all the answers:

1. Ask a lot of high-level, dumb questions, and verify the basics

2. Perform Method-R-like database application trace file analysis

3. Perform a final database health check using diagnostic software
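Step 2 typically means enabling Oracle's extended SQL trace (event 10046) for the sessions under study. As a minimal sketch using the DBMS_MONITOR package available in 10g, where the SID and SERIAL# values 123 and 456 are placeholders for a real session:

```sql
-- Enable extended SQL trace (wait events and bind values) for one
-- session; 123 and 456 are hypothetical SID and SERIAL# values.
BEGIN
  DBMS_MONITOR.SESSION_TRACE_ENABLE(
    session_id => 123, serial_num => 456,
    waits => TRUE, binds => TRUE);
END;
/

-- ... reproduce the slow workload, then turn tracing back off:
BEGIN
  DBMS_MONITOR.SESSION_TRACE_DISABLE(session_id => 123, serial_num => 456);
END;
/
```

The resulting trace file can then be profiled with tkprof, or with Method-R-style response-time analysis tools.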

The following is a real-world example of applying this comprehensive optimization and tuning process, with the positive results obtained and the relative cost of doing so. Once you have examined a successful implementation of this technique, the next step will be to define a generic recipe for its successful execution. Don't be too surprised by its simplicity!

Customer X had an OLTP application deployed on an Oracle 10g RAC database on Linux. Performance was substandard, and they were seriously contemplating returning to large SMP boxes running Solaris, their historic deployment platform and comfort zone. Their initial RAC-on-Linux deployment had been unsuccessful, and they had even paid post-deployment database tuning experts (of the right-hemisphere variety, using scripts and dashboards) to try to improve the situation. As Method-R predicts, the results were unimpressive: while the tuning experts could run reports showing that the ratios and metrics had improved by impressive percentages, the response time was actually no better. Since the cost of going backwards was deemed far too expensive in terms of reputation, they obtained a quote from experts who would employ Method-R software and techniques, with the promise that payment would be based only upon achieving results. This was attractive to them, since they had already been burned once.

However, even that reasonable estimate and its promise meant that the overall budget, including the failed tuning efforts, was going to far exceed the initially planned savings. So they were stuck between a rock and a hard place: either go backward and suffer the loss of end-user faith, or pay for a second round of tuning to obtain acceptable results, with no money having been saved. Hence, they were ripe to hear yet another alternative. I offered a simple solution: let me come in for one day, ask some high-level dumb questions and, based upon the answers, try a very limited round of recommended changes. The fee would be just one day's consulting engagement, and I too would accept payment based upon results. Thus, for a very small fixed cost and one additional day's effort, they could possibly salvage the project budget and, at worst, pay nothing yet be even better prepared for the Method-R tuning attempt should they proceed.

So I sat down with their DBAs and asked some questions about their application. Was it OLTP or data warehouse? What was the nature of the application's queries? Were the queries predominantly pre-canned or more ad hoc? What did the daily workload profile look like? I asked these and a host of other seemingly basic and innocuous questions. I then asked to see the configuration settings for their SAN, LUNs, operating system, kernel and database. They became quite skeptical, since these were exactly the kinds of things the prior tuning effort had worked with. But I explained to them that I was doing this devoid of any observed performance values; I was simply doing a brain-dead verification of the settings based upon their answers to my questions. By changing just five basic database configuration parameters, we reduced the average response time under maximum concurrent load by 99.4%!
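On the database side, this kind of brain-dead verification needs nothing more than the data dictionary. A sketch of such a check, querying v$parameter for the sort of basic settings under discussion (this assumes a session with the privilege to read v$parameter):

```sql
-- Show current values for a handful of basic settings, and whether
-- each one has been changed from its default.
SELECT name, value, isdefault
  FROM v$parameter
 WHERE name IN ('db_file_multiblock_read_count',
                'db_block_size',
                'cursor_sharing',
                'cursor_space_for_time',
                'optimizer_index_caching',
                'optimizer_index_cost_adj')
 ORDER BY name;
```

Comparing the isdefault column against the answers to the workload questions is often enough to spot a mismatch between the application's nature and the database's configuration.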

Figure 2:  Chart of Customer X's Adjusted Parameters

So what great black magic was employed? Absolutely none. I simply recommended adjusting five basic database parameters (see Figure 2 above) based upon the answers to my dumb questions:

  • Run 1: reduce the multi-block read count to 2

  • Run 2: reduce the db block size from 8K to 4K

  • Run 3: set cursor_space_for_time=true and cursor_sharing=similar

  • Run 4: set optimizer_index_caching=80 and optimizer_index_cost_adj=20

  • Run 5: enable jumbo frames on the RAC private network
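For reference, most of these adjustments are ordinary initialization-parameter changes. A sketch of how they might be applied, not a tested change script; the exact scopes and restart requirements should be verified against your Oracle release:

```sql
-- Runs 1, 3 and 4: ordinary initialization parameters.
ALTER SYSTEM SET db_file_multiblock_read_count = 2 SCOPE=BOTH;
ALTER SYSTEM SET cursor_space_for_time = TRUE SCOPE=SPFILE;  -- static: takes effect after restart
ALTER SYSTEM SET cursor_sharing = SIMILAR SCOPE=BOTH;
ALTER SYSTEM SET optimizer_index_caching = 80 SCOPE=BOTH;
ALTER SYSTEM SET optimizer_index_cost_adj = 20 SCOPE=BOTH;

-- Run 2 (8K -> 4K block size) is not a parameter change on an existing
-- database: it means recreating the database, or moving objects into
-- tablespaces created with BLOCKSIZE 4K backed by a db_4k_cache_size pool.

-- Run 5 (jumbo frames) is an OS/network change, e.g. setting MTU 9000
-- on the private interconnect NICs and the switch ports between them.
```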

In a nutshell, their RAC interconnect traffic and delays were killing them. Since the application was a fairly standard third-party OLTP application with lots of screens and reports, from my 50,000-foot vantage point the settings above made more sense as a starting point than the database defaults. Common sense, or wisdom, was my guide, not some off-kilter database ratio or metric.

So in reality, it is obvious that all I did was apply experience-based, pre-emptive Method-R via abstract reasoning upon their answers. But this step is often skipped by many DBAs. In this case, we obtained a satisfactory answer for one twentieth the cost and in one day's time. Customer X was actually able to skip the Method-R consulting engagement and successfully deploy the application.

The key point here is that you can stop tuning whenever the SLAs are met, and you might actually stop anywhere along the three-step process.

This is an excerpt from
Oracle on VMWare: Expert tips for database virtualization by Rampant TechPress.






Copyright © 1996 - 2014

All rights reserved by Burleson

Oracle® is the registered trademark of Oracle Corporation.