Technological feasibility and Oracle

Oracle Database Tips by Donald Burleson, December 17, 2015

Technological feasibility becomes a problem because the public at large grossly misunderstands the capability of modern computers.  Back in the 1980s, many people assumed that computers like the HAL 9000 (from the movie 2001: A Space Odyssey) really existed, and it was very difficult for practicing computer scientists to explain the real limitations of the hardware of the day.

In the 1980s, before Oracle ruled the database world, Cullinet's IDMS was the top database, and I remember serving on the IDMS Large User Advisory Committee.  We were astonished when we learned that the CIA was building a giant database that was to hold three terabytes of data, an almost unimaginable amount of storage in a day when a gigabyte of disk cost a quarter million dollars.  But time marches on, and the technology continues to change the way that we process information.

Most of the technological limitations today are driven by the limitations of the hardware, not the software.  As of 2015, there are still many limits to our ability to process information:

  • Drinking from the garden hose - With today's memory speed limitations, processing more than ten million rows per second from a single feed is not technologically feasible.

  • Searching - Even with the fastest supercomputers, complex searching through petabytes of information cannot be done with sub-second response time. Today, some American intelligence sources must process more than four petabytes a month of data, and the bar is constantly being raised on what constitutes technological feasibility.

  • Storage density - As of 2015, it takes 37 megabytes just to store the genome of a single human being, and it could be a century before humans can match the storage density of a mammalian brain.

  • Artificial intelligence - Even with the advances in computing power, scientists still cannot replicate the intelligence of a fly's brain.

  • Networking - Even with "dark fiber" high-speed links, real limits exist on the speed at which we can transfer large volumes of information.

To understand technological feasibility, we need to remember that the computer market is not driven solely by the technology itself; we must consider other factors.  There are two rules that you must grasp to appreciate the cycles of technological advances:

  • Hardware advances always precede software advances - It's not until the hardware vendors deliver new technology that the software vendors figure out ways to exploit it.  If you want to read the future of Oracle software, look no further than the advances being made in hardware technology.

  • Business information systems are driven by economics - Technological changes are driven largely by shifts in economics.  For example, as hardware becomes cheaper, human experts become relatively more costly, and IT managers seek to reduce their manpower needs.

For example, consider the birth and death of client-server computing.  Client-server was introduced in the early 1990s solely for economic reasons, and IT managers were thrilled with cheap new $40,000 minicomputers, a hundred times cheaper than their multi-million-dollar mainframes.

In hindsight, client-server was a disaster because of the massive system deconsolidation.  The age of client-server meant cheaper hardware, but it ruined the main benefit of centralized computing: easier management and sharing of computing resources.  I was at a shop in 1998 that paid over a half million dollars to upgrade to Oracle9i on over 400 mini-servers, a task that could have been done in a few days if the systems were on a single computer.

The technology is constantly changing, and hardware is always becoming cheaper and faster, but there is a very important exception.  RAM has not gotten significantly faster in the past 40 years, an anomaly that is driving the way that computers are designed today.

Moore's Law

Back in the mid-1960s, Gordon Moore, the director of the research and development labs at Fairchild Semiconductor, published a paper titled "Cramming More Components onto Integrated Circuits."  In his paper, Moore plotted the rate of change in processing capacity and cost and noted an exponential growth in processing power and an exponential reduction in processing costs.

But unlike CPUs, RAM has not made many significant gains in speed since the mid-1970s.  This is due to the limitations of silicon and the fact that, at access times measured in nanoseconds, signal propagation delays approach the limit imposed by the speed of light.  The only way to further improve the speed of RAM would be to employ a radically new medium such as gallium arsenide.


What's old is new again

While technological feasibility has a focus on hardware, we must remember that with software, older is often better.  I've been a full-time IT professional since 1983, and there is nothing new under the sun; it's all the same old stuff, re-packaged and dusted off every decade or so.  In some areas, older technology is a real advantage:

  • Databases - Even though being a new player has advantages, elderly databases have the advantage of decades of tuning and enhancements.  IMS Fast Path, an elderly database from the 1960s, is still widely used today, and it sometimes beats Oracle in its ability to process millions of transactions per second on IBM mainframes.

  • Languages - Third- and fourth-generation languages have not changed in 40 years, and it was a great shock to the IT industry when, instead of moving on to 5th-generation languages, we went backwards, using 3GLs such as C and Java.  Even though COBOL is now over 50 years old, it remains a workhorse for today's business applications.  The average person in America is thought to interact with a COBOL-powered system an average of 13 times per day.

  • Resource sharing - The IT industry dusts off old concepts such as resource sharing and gives them new names, such as cloud computing, a new name for a technology that's been around since the 1980s.  The same is true for virtualization and VMware technology.  The concepts behind VMware are not new; products such as IBM's PR/SM ("Prism") were virtualizing mainframe resources back in the 1980s.

While some solutions are highly dependent on the changing technology, some tried-and-true solutions remain in use for decades.  Oracle PL/SQL is very much the same 4GL as it was when it was introduced decades ago.  I was at a live space shuttle launch, and the space shuttle simulator flashes COBOL code:


Move spaces to places: COBOL is alive and well in science
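As an illustration of that stability, a PL/SQL block written in the style of the early 1990s still runs unchanged on current Oracle releases.  This is a minimal sketch; the variable names and the bonus rule are hypothetical:

```sql
-- A classic anonymous PL/SQL block: this syntax has been valid since
-- PL/SQL's earliest releases and still runs unchanged today.
DECLARE
   v_salary  NUMBER := 50000;      -- hypothetical input value
   v_bonus   NUMBER := 0;
BEGIN
   IF v_salary > 40000 THEN
      v_bonus := v_salary * 0.10;  -- simple 10% bonus rule
   END IF;
   dbms_output.put_line('Bonus: ' || v_bonus);
END;
/
```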

Oracle anticipates the future of technology

Every computer science student is taught that hardware advances always precede advances in software, and Oracle is no exception.

However, Oracle has had far more vision of emerging technologies than other database vendors, and Oracle is often years ahead of the technology.

  • Virtualization - Back in the 1990s, Oracle promoted "network computers," small diskless PCs containing only a Java-enabled web browser.  This was based on Oracle's belief that the future of computing was the return of the "dumb terminal," where the PC of the future would have everything managed in cyberspace.

  • Parallelism - Oracle parallel query was introduced back in the 1990s, years before symmetric multiprocessing (SMP) servers were widely deployed.
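A parallel full scan can be requested directly with the PARALLEL hint.  The sketch below is illustrative; the emp table and the degree of 8 are assumed values:

```sql
-- Request a parallel full-table scan using eight parallel slaves.
-- The table name (emp) and the degree (8) are illustrative.
SELECT /*+ FULL(e) PARALLEL(e, 8) */
       COUNT(*)
  FROM emp e;
```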

  • SSD - Oracle saw solid-state storage coming and was ready in 2015 with technologies such as the 11g Database Smart Flash Cache feature, which allows the Oracle professional to leverage the super-fast solid-state storage that will eventually replace platter-based disks.
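In 11g Release 2, the Database Smart Flash Cache is enabled with two initialization parameters.  The file path and size below are illustrative values, and the feature requires an instance restart:

```sql
-- Point the flash cache at a flash device and size it;
-- both values are illustrative.  Takes effect after restart.
ALTER SYSTEM SET db_flash_cache_file = '/flash/oracle_flash.dat' SCOPE=SPFILE;
ALTER SYSTEM SET db_flash_cache_size = 32G SCOPE=SPFILE;
```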

  • Database consolidation - Due to falling prices on large servers with 32 and 64 CPUs, shops all across America are undertaking Oracle server consolidation, moving dozens or even hundreds of instances onto a single server for easier management and resource sharing.

     

    Similar to the "fencing" tools of 1980s mainframes (e.g., the PR/SM tool), Oracle instance caging runs inside the Database Resource Manager (DRM) and works with RAC One Node to facilitate fast instance relocation in cases of server failure.

    Oracle has recognized that the hardware companies are returning to the good old days of single monolithic computers, and is developing tools such as instance caging for virtualizing large servers.
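Instance caging itself takes only two settings: cap the instance's visible CPU count and enable a Resource Manager plan to enforce the cap.  The values below are illustrative:

```sql
-- Cage this instance to four CPUs ...
ALTER SYSTEM SET cpu_count = 4 SCOPE=BOTH;

-- ... and enable a Resource Manager plan so the cap is enforced.
ALTER SYSTEM SET resource_manager_plan = 'default_plan' SCOPE=BOTH;
```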

One reason that Oracle continues to dominate the database market is its forward-thinking vision and its recognition that its software must be tested and ready to use as soon as new hardware technology becomes available.

 




 

Copyright © 1996 -  2020

All rights reserved by Burleson

Oracle ® is the registered trademark of Oracle Corporation.