Oracle SQL Hashing Tips
Oracle Tips by Burleson Consulting
A Matter of Hashing
We have discussed hashing in prior lessons. Essentially, each SQL
statement is hashed, and this hash value is then compared against the
hash values of already stored SQL areas; if a matching hash is found,
the statements themselves are compared. The hash is calculated from
only the first 200 or so characters of the SQL statement, so extremely
long SQL statements can produce identical hash values even though the
stored SQL is different (whenever the first 200 or so characters of
each statement are identical). This is another argument for using
stored procedures and functions to perform operations, and for the use
of bind variables. There is hope: in 8i (or 8.1 if you prefer)
the hash value will be calculated on the first 100 and last 100
characters, reducing the chances of identical hash values for
different SQL statements.
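For example (the orders table and column names here are hypothetical,
invented purely for illustration), the two statements below are
identical through their first 200-plus characters and differ only in
the trailing literal, so under the scheme just described they receive
the same hash value and the parser must fall back on a full text
comparison:

SELECT ord.order_number, ord.order_date, ord.customer_id, ord.ship_to_address,
       ord.bill_to_address, ord.order_status, ord.order_total, ord.sales_rep_id
FROM orders ord
WHERE ord.order_status = 'OPEN' AND ord.order_number = 100123;

SELECT ord.order_number, ord.order_date, ord.customer_id, ord.ship_to_address,
       ord.bill_to_address, ord.order_status, ord.order_total, ord.sales_rep_id
FROM orders ord
WHERE ord.order_status = 'OPEN' AND ord.order_number = 100456;

Had a bind variable been used (WHERE ord.order_number = :ord_num), the
two would collapse into a single shared SQL area.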
If the number of large, nearly identical statements is high, the
number of times the parser must compare a new SQL statement against
existing SQL statements with the same hash value increases. This
results in higher per-statement overhead and poorer performance. You
should identify these large statements and encourage users to rewrite
them using bind variables or to proceduralize them using PL/SQL. The
report in Source 24 will show whether you have a problem with multiple
statements hashing to the same value.
rem
rem FUNCTION: Shows by user who has possible SQL reuse problems
rem
COLUMN total_hash   HEADING 'Total Hash|Values'
COLUMN same_hash    HEADING 'SQL With|Same Hash'
COLUMN u_hash_ratio FORMAT 999.999 HEADING 'SQL Sharing|Hash'
START title80 'Shared Hash Value Report'
SPOOL rep_out\&&db\shared_hash.lst
BREAK ON REPORT
COMPUTE SUM OF total_hash ON REPORT
COMPUTE SUM OF same_hash ON REPORT
SELECT
  a.username,
  COUNT(b.hash_value) total_hash,
  COUNT(b.hash_value) - COUNT(UNIQUE(b.hash_value)) same_hash,
  (COUNT(UNIQUE(b.hash_value)) / COUNT(b.hash_value)) * 100 u_hash_ratio
FROM
  dba_users a,
  v$sqlarea b
WHERE
  a.user_id = b.parsing_user_id
GROUP BY
  a.username;
SPOOL OFF
CLEAR COMPUTES
Source 24: Example Script to Report on Hashing Problems
The script in Source 24 produces a report
similar to that shown in Listing 23. The report in Listing 23 shows
which users are generating SQL that hashes to the same values. Once
you have isolated a user, you can then run the script in Source 12 to
find the bad SQL statements.
Date: 11/20/98                                              Page:   1
Time: 11:40 AM           Shared Hash Value Report               AULTM
                              DCARS database

                               Total Hash  SQL With SQL Sharing
USERNAME                           Values Same Hash        Hash
------------------------------ ---------- --------- -----------
AULTM                                 129         0     100.000
DCARS                                6484        58      99.105
QDBA                                  109         0     100.000
RCAPS                                 270         0     100.000
RCOM                                  342         7      97.953
SECURITY_ADMIN                         46         0     100.000
SYS                                   134         0     100.000
                               ---------- ---------
sum                                  7514        65
Listing 23: Hash Report
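To read the columns, take DCARS as a worked example: 6,484 statements
were hashed; the same_hash column is the total count minus the number
of distinct hash values (6,484 - 6,426 = 58); and the sharing ratio is
(6,426 / 6,484) * 100 = 99.105.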
A quick glance at the report in Listing 23
shows that I need to look at the DCARS user to correct any
hashing problems and improve the reuse of SQL in the shared pool.
However, look at the number of hash areas this user has accumulated:
6,484. If I run the report from Source 12 in the first half of this
lesson, the output will outweigh the paper version of the Oracle
documentation set. A faster way to find the hash values is to do a
self-join and filter out the hash values that are duplicated. Sounds
easy enough, but remember that the V$ tables have no ROWIDs, so you
can't use the classic duplicate-row techniques; you have to find
another column that will be different when the HASH_VALUE column in
V$SQLAREA is the same. Look at the select in Source 25.
select distinct a.hash_value
from v$sqlarea a, v$sqlarea b, dba_users c
where a.hash_value = b.hash_value
and a.parsing_user_id = c.user_id
and c.username = 'DCARS'   -- change to the user you are concerned about
and a.FIRST_LOAD_TIME != b.FIRST_LOAD_TIME;
Source 25: Example Select to Determine Duplicate Hash Values
Listing 24 shows example output from the select in Source 25.
DCARS: column hash_value format 99999999999
DCARS: set echo on
DCARS: select distinct a.hash_value from v$sqlarea a, v$sqlarea b,
  2  dba_users c
  3  where a.hash_value=b.hash_value and
  4  a.parsing_user_id = c.user_id
  5  and c.username='DCARS' and
  6* a.FIRST_LOAD_TIME != b.FIRST_LOAD_TIME
HASH_VALUE
------------
-1595172473
-1478772040
-1344554312
-941902153
-807684425
-507978165
-270812489
441376718
784076104
979296206
1765990350
1945885214
Listing 24: Example Hash Select Output
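Note that the select in Source 25 depends on the duplicated statements
having different FIRST_LOAD_TIME values, so duplicates first loaded in
the same second would be missed. As an alternative sketch, relying on
the assumption that each parent cursor in V$SQLAREA carries its own
distinct ADDRESS, you can join on ADDRESS instead:

select distinct a.hash_value
from v$sqlarea a, v$sqlarea b, dba_users c
where a.hash_value = b.hash_value
and a.parsing_user_id = c.user_id
and c.username = 'DCARS'   -- change to the user you are concerned about
and a.address != b.address;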
Once you have the hash values, you can easily pull the
problem SQL statements from either V$SQLAREA or V$SQLTEXT;
look at Listing 25.
DCARS: select sql_text from v$sqlarea where hash_value='441376718';
SQL_TEXT
-----------------------------------------------------------------------
SELECT region_code, region_dealer_num,
consolidated_dealer_num,
dealer_name, dealer_status_code,
dealer_type_code, mach_credit_code,
parts_credit_code FROM dealer WHERE
region_code = '32' AND reg_dealer_num = '6433'
SELECT region_code, region_dealer_num,
consolidated_dealer_num,
dealer_name, dealer_status_code,
dealer_type_code, mach_credit_code,
parts_credit_code FROM dealer WHERE
region_code = '56' AND reg_dealer_num = '6273'
Listing 25: Example of Statements With Identical Hash Values But Different SQL
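The SQL_TEXT column in V$SQLAREA holds only the leading portion of a
long statement (the first 1,000 characters on the releases covered
here), so for the truly long statements this lesson is concerned with,
use V$SQLTEXT, which stores the full text in 64-character pieces keyed
by ADDRESS and HASH_VALUE. A minimal sketch that pulls both Listing 25
statements, one per ADDRESS:

select address, piece, sql_text
from v$sqltext
where hash_value = 441376718
order by address, piece;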
Long statements require special care to ensure
that bind variables are used to prevent this problem with hashing.
Another help for long statements is to use views to store values at an
intermediate state, thus reducing the size of the variable portion of
the SQL. Notice in the example select in Listing 25 that the
only difference between the two identically hashed statements is that
the 'region_code' and 'reg_dealer_num' comparison values are
different; had bind variables been used in these statements, there
would have been only one hash entry instead of two.
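As a sketch of the fix, using the dealer table and columns from
Listing 25 (the variable names here are my own), the same query can be
issued through SQL*Plus bind variables:

VARIABLE v_region VARCHAR2(10)
VARIABLE v_dealer_num VARCHAR2(10)

EXECUTE :v_region := '32'
EXECUTE :v_dealer_num := '6433'

SELECT region_code, region_dealer_num, consolidated_dealer_num,
       dealer_name, dealer_status_code, dealer_type_code,
       mach_credit_code, parts_credit_code
FROM dealer
WHERE region_code = :v_region
AND reg_dealer_num = :v_dealer_num;

However the variables are set, the statement text never changes, so
the statement hashes to a single shared SQL area no matter how many
dealers are queried.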
Guideline 6: Use bind variables, PL/SQL (procedures or functions), and
views to reduce the size of large SQL statements and prevent hashing
problems.
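For the PL/SQL half of the guideline, a minimal sketch of
proceduralizing the dealer lookup appears below; the procedure name
and parameter list are hypothetical, but the principle is general:
PL/SQL turns procedure parameters into bind variables automatically,
so every caller shares one cursor.

CREATE OR REPLACE PROCEDURE get_dealer_credit (
  p_region      IN  dealer.region_code%TYPE,
  p_dealer_num  IN  dealer.reg_dealer_num%TYPE,
  p_dealer_name OUT dealer.dealer_name%TYPE,
  p_status      OUT dealer.dealer_status_code%TYPE
) AS
BEGIN
  -- The parameters are bound, not concatenated, so this SELECT hashes
  -- to one shared SQL area regardless of which dealer is requested.
  SELECT dealer_name, dealer_status_code
  INTO p_dealer_name, p_status
  FROM dealer
  WHERE region_code = p_region
  AND reg_dealer_num = p_dealer_num;
END get_dealer_credit;
/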