Just a short blog post today to promote my upcoming speaking engagement at the Midwest DB2 User Group, in Chicago. If you are in or around the Chicagoland area on December 4th, 2015 I invite you to stop by and participate in the meeting!
The meeting starts at Noon and a free lunch is provided. I will be giving one of the 3 presentations that day. My presentation is titled Database and DB2 Trends circa 2015 - An overview of an industry in transition.... This is an ever-changing presentation that I have delivered on several occasions in the past, but not in exactly the same way. This pitch provides an overview of the transformation of data management over the course of the past few years. I discuss Big Data, analytics, NoSQL, and their impact on the modern data architecture and DB2 for z/OS in particular.
But that is not the only highlight of this event. Sheryl M. Larsen, now with BMC Software, will regale the group with the results of BMC's Annual Mainframe Research Survey. BMC started their mainframe survey ten years ago as a way to gain insight into the issues and challenges facing mainframe customers. And it always contains a lot of useful information and details for those of us in the business of mainframe computing.
The third speaker is Tim Lenahan, who I've been told will be presenting something a little bit different this time around. And having heard Tim speak in the past, I'm looking forward to what he has to say now!
So if you are going to be near Chicago in early December, register and attend the MWDUG meeting. I'm sure it will be worth your time!
Hope to see you there!
Wednesday, November 18, 2015
Friday, January 09, 2015
New Skills Required of Internet DBAs
When DBAs transition from supporting only internal applications to supporting Internet-exposed applications, many additional skills are required to ensure success. Some of these skills are obvious, like the need to understand the protocols of the Internet including HTTP and FTP and how data is transferred across the Internet. Others are more business-focused, such as the impact of clients accessing your data over the Web from anywhere at any time. Still others are a bit more esoteric, like knowledge of the basic tools used for web development including CGI, SSL (Secure Sockets Layer), and how URLs are structured. You should also bone up on Web-development methods and technologies like JSON (JavaScript Object Notation), XML, and the like.
Several additional skills will make you a better Internet-using DBA, and they should be learned before you try to support Web-enabled database applications. Compliance and security should top that list. Learn about SQL injection attacks and how to prevent them. Learn about protecting your data using all of the capabilities of your DBMS including trusted context, data masking, row permissions, views, and label-based access control. Learn about the encryption capabilities of your DBMS, O/S and hardware -- and what encrypting data might mean regarding efficient data access. Also, knowledge of Internet security technologies like SSL, firewalls, and network/OS security will prove to be useful.
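The SQL injection point deserves a concrete illustration. This minimal sketch uses Python's built-in sqlite3 driver rather than DB2 (purely so the example is self-contained); the principle of binding user input through parameter markers applies the same way to DB2 applications, whatever the driver:

```python
import sqlite3

# Hypothetical malicious input that would wreak havoc if it were
# naively concatenated into a SQL string (a "SQL injection" attack).
user_input = "'; DROP TABLE accounts; --"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")
conn.execute("INSERT INTO accounts VALUES (1, 'alice')")

# A parameter marker (?) tells the driver to treat the input strictly
# as data, never as executable SQL, so the attack is neutralized.
rows = conn.execute(
    "SELECT id, name FROM accounts WHERE name = ?",
    (user_input,),
).fetchall()

print(rows)  # the malicious string simply matches no row
print(conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0])
```

The table survives and the query returns nothing, which is exactly what you want: the input was compared as a value, not executed as SQL.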
You must also gain knowledge of the connections from DB2 to the Internet. This requires knowing how connections are configured and how TCP/IP is set up, configured, and administered.
Another good idea is to gain a working knowledge of the operating system for the server on which your Web server is running. This facilitates easier monitoring and tuning for performance problems. If you are a z/OS DB2 DBA, this will likely require you to get out of your comfort zone and dig into Linux or some variant of Unix.
It can also be worthwhile to develop a better understanding of highly available RAID storage technologies in use for most 24x7 Internet applications.
Of course, having a clear understanding of the business functionality that is being achieved via the web-enabled databases in your organization is also vitally important. Only by understanding the business impact of database downtime can the appropriate administrative techniques be deployed to maintain constant availability.
I don't imagine that this is a 100 percent complete list of things-you-need-to-know, but it is probably a good place to start your learning adventure. Let me know what I might have missed by adding your comments below!
And if you already support Internet access to your databases and you don't feel up-to-date on these items, then it is definitely time to get cracking!
Wednesday, December 10, 2014
An Extra DBA Rule of Thumb
Last year on the blog I posted a series of 12 DBA Rules of Thumb. As a quick reminder, these Rules of Thumb - or ROTS, are some general rules of the road that apply to the management discipline of Database Administration that I have collected over the years. These ROTs are broadly applicable to all DBAs, even though this is a DB2-focused blog.
The purpose of today's blog post is to suggest an additional Rule of Thumb... and that is to Diversify! A good DBA is a Jack-of-All-Trades.
Please click on the link in the paragraph above if you need a refresher on the DBA ROTs from last year.
You can’t just master one thing and be successful in this day-and-age. The DBA maintains production, QA and test environments, monitors application development projects, attends strategy and design meetings, selects and evaluates new products, and connects legacy systems to the Web.
And if all of that is not enough, to add to the chaos, DBAs are expected to know everything about everything. From technical and business jargon to the latest management and technology fads and trends, the DBA is expected to be “in the know.” For example, the DBA must be up on trends like Big Data and Analytics.
And do not expect any private time: A DBA must be prepared for interruptions at any time to answer any type of question… and not just about databases, either.
When application problems occur, the database environment is frequently the first thing blamed. The database is “guilty until proven innocent.” And the DBA is expected to be there to fix things. That means the DBA is often forced to prove that the database is not the source of the problem. The DBA must know enough about all aspects of IT to track down errors and exonerate the DBMS and database structures he has designed. So he must be an expert in database technology, but also have semi-expert knowledge of the IT components with which the DBMS interacts: application programming languages, operating systems, network protocols and products, transaction processors, every type of computer hardware imaginable, and more. The need to understand such diverse elements makes the DBA a very valuable resource. It also makes the job interesting and challenging.
To summarize, the DBA must be a Jack-of-all-Trades... and a master of several!!!
Thursday, October 09, 2014
Database System Performance Tools
System performance tools examine the database server, its
configuration, and usage. The most commonly used system performance tool is the
performance monitor. Database performance monitoring and
analysis tools support many types of performance-oriented requests in many
ways. For example, system performance tools can operate:
- In the background mode as a batch job that reports on performance statistics written by the DBMS trace facility
- In the foreground mode as an online monitor that either traps trace information or captures information from the DBMS control blocks as applications execute
- By sampling the database kernel and user address spaces as the program runs and by capturing information about the performance of the job, independent of database traces
- By capturing database trace information and maintaining it in a history file (or table) for producing historical performance reports and for predicting performance trends
- As a capacity planning device that gives statistical information about an application and the environment in which it will operate
- As an after-the-fact analysis tool on a workstation, that analyzes and graphs all aspects of application performance and system-wide performance
Each database performance monitor supports one or more of
these features. The evaluation of database performance monitors is a complex
task. Sometimes more than one performance monitor is used at a single
site—perhaps one for batch reporting and another for online event monitoring.
Maybe an enterprise-wide monitoring solution has been implemented and one
component of that solution is a database module that monitors your DBMS, but it
lacks the details of a more sophisticated DBMS monitor. So, another performance
monitor is purchased for daily DBA usage, while the module of the enterprise-wide
monitoring solution is used for integrated monitoring by system administrators.
Modern database performance tools can set performance
thresholds that, once reached, will alert the DBA, perform another task to
report on, or actually fix the problem. These tools are typically agent-based.
An agent is a piece of independent code that runs on the
database server looking for problems. It interacts with, but does not rely on,
a console running on another machine that is viewed by the DBA. This agent
architecture enables efficient database monitoring because the agent is not
tied to a workstation and can act independently. The agent sends information to
the DBA only when required.
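The threshold-and-alert behavior described above can be sketched in a few lines of Python. Everything here is invented for illustration (the metric names, values, and thresholds are not from any real monitor); it just shows the shape of the idea: sample a metric, compare it to a threshold, and surface an alert -- or apply a fix -- only when needed:

```python
def check_threshold(metric_name, value, threshold, fix=None):
    """Return an alert message when value exceeds threshold.

    If a `fix` callable is supplied, the agent attempts the corrective
    action itself, mirroring tools that 'actually fix the problem'.
    """
    if value <= threshold:
        return None  # nothing to report; the console stays quiet
    if fix is not None:
        fix()
        return f"{metric_name}={value} exceeded {threshold}; fix applied"
    return f"{metric_name}={value} exceeded {threshold}; DBA alerted"

# Purely illustrative sampled metrics and their thresholds.
samples = {"sync_reads": 5, "lock_waits": 42}
thresholds = {"sync_reads": 10, "lock_waits": 25}

alerts = []
for name, value in samples.items():
    msg = check_threshold(name, value, thresholds[name])
    if msg:
        alerts.append(msg)

print(alerts)
```

Only the metric that crossed its threshold generates a message, which is the efficiency argument for agents: the console hears from the agent only when there is something worth hearing.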
Additionally, some system performance tools are available
that focus on a specific component of the DBMS such as the buffer pools (data cache). Such a
tool can be used to model the memory requirements for database caching, to
capture data cache utilization statistics, and perhaps even to make
recommendations for improving the performance of the buffers.
Another type of performance optimization tool enables
database configuration parameters to be changed without recycling the DBMS
instance, subsystem, or server. These tools are useful when the changes require
the DBMS to be stopped and restarted. Such tools can dramatically improve
availability, especially if configuration parameters need to be changed
frequently and the DBMS does not support dynamic parameter modification.
A few ISVs provide invasive system
performance tools that enhance the performance of databases by adding
functionality directly to the DBMS and interacting with the database kernel.
Typically, these products take advantage of known DBMS shortcomings.
For example, products are available that enhance the
performance of reading a database page or block or that optimize data caching
by providing additional storage and control over buffers and their processing.
Care must be taken when evaluating invasive performance tools. New releases of
the DBMS may negate the need for these tools because functionality has been
added or known shortcomings have been corrected. However, this does not mean
that you should not consider invasive database performance tools. They can pay for themselves after only a short period of time.
Discarding the tool when the DBMS supports its functionality is not a problem
if the tool has already paid for itself in terms of better performance.
One final caution: Because invasive performance tools can
interact very closely with the database kernel, be careful when migrating to a
new DBMS release or a new release of the tool. Extra testing should be
performed with these tools because of their intrusive nature.
Friday, July 25, 2014
Happy DBA Day!
Hey everybody, time to celebrate... today, July 25, 2014 is SysAdmin day! For the past 15 years, the last Friday in July has been set aside to recognize the hard work done by System Administrators. This is known as System Administrator Appreciation Day.
As a DBA, I have regularly co-opted the day to include DBAs because, after all, we are a special type of system administrator -- the system we administer is the DBMS!
So if you are a SysAdmin, DBA, Network Admin, etc. have an extra cup of Joe and a donut or two. Hang up a sign on your cubicle telling people it is SysAdmin Day. And hopefully get a little respect and appreciation for all you do every day of the year!
Thursday, July 17, 2014
DB2 Health Checks - Part 1
Left to their own devices, DB2 databases and applications
will accumulate problems over time. Things that used to work, stop working.
This can happen for various reasons including the addition of more data, a
reduction in some aspect of business data, different types of data, more users,
changes in busy periods, business shifts, software changes, hardware changes…
you get the idea.
And there is always the possibility of remnants from the
past causing issues with your DB2 environment. Some things may have been
implemented sub-optimally from the start, perhaps many years ago… or perhaps
more recently. Furthermore, DB2 is not a static piece of software; it changes
over time with new versions, features and functionality. As new capabilities
are introduced, older means of performing similar functionality become
suboptimal, and in some cases, even obsolete. Identifying these artifacts can
be troublesome and is not likely to be something that a DBA will do on a daily
basis.
Nonetheless, the performance and availability of your DB2
environment – and therefore the business systems that rely on DB2 – can suffer
if you do not pay attention to the health and welfare of your DB2 databases and
applications.
Health Checking Your DB2
The general notion of a health check is well known in the IT
world, especially within the realm of DB2 for z/OS. The purpose of a DB2 health
check is to assess the stability, performance, and availability of your DB2
environment. Health checks are conducted by gathering together all of the
pertinent details about your DB2-based systems and reviewing them to ascertain
their appropriateness and effectiveness. You may narrow down a health check to
focus on specific aspects of your infrastructure, for example, concentrating on
just availability and performance, or on other aspects such as recoverability,
security, and so on.
At any rate, scheduling regular independent reviews of your
DB2 environment is an important aspect of assuring the viability and robustness
of your implementation. Simply migrating DB2 applications to production and
then neglecting to review them until or unless there are complaints from the
end users is not a best practice for delivering good service to your business.
Just like a car requires regular maintenance, so too does your DB2 environment.
Regular analysis and health checks, with the overall goal of identifying
weaknesses and targeting inefficiencies, can save your organization time and money,
as well as reduce the daily effort involved in implementing and maintaining
your DB2 applications.
Think about the health of your DB2 system the same way you
think about your health. A regular health check helps to identify and eliminate
problems. And it helps you to perform the daily operational tasks on your DB2
databases and applications with the peace of mind that only regular, in-depth,
knowledgeable analysis can deliver.
Check Back Soon
Later in this series we'll uncover more aspects of health checking and look at some software that might be able to assist. So stay tuned...
Monday, June 16, 2014
Don't Forget the Humble DB2 DISPLAY Command
Although robust performance and administration tools are probably the best solution for gathering information about your DB2 subsystems and databases, you can gain significant insight into your DB2 environment simply using the DISPLAY command. There are multiple variations of the DISPLAY command depending on the type of information you are looking for.
DISPLAY DATABASE is probably the most often-used variation of the DISPLAY command. The output of the basic command shows the status of the database objects specified, along with any exception states. For example, issuing -DISPLAY DATABASE(DBNAME) shows details on the DBNAME database, including information about its tablespaces and indexes. With one simple command you can easily find all of the tablespaces and indexes within any database — pretty powerful stuff. But you also get status information for each space, too. When a status other than RO or RW is encountered, the object is in an indeterminate state or is being processed by a DB2 utility.
There are additional options that can be used with DISPLAY DATABASE. For partitioned page sets, you can specify which partition, or range of partitions, to show. And you can choose to display only objects in restricted or advisory status using either the ADVISORY or RESTRICT key word.
You can control the amount of output generated by DISPLAY DATABASE using the LIMIT parameter. The default number of lines returned by the DISPLAY command is 50, but the LIMIT parameter can be used to set the maximum number of lines returned to any numeric value; or you can use an asterisk (*) to indicate no limit.
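Pulling the options above together, the basic variations look like this (DBNAME is a placeholder for your database name):

```
-DISPLAY DATABASE(DBNAME)             spaces in DBNAME with their statuses
-DISPLAY DATABASE(DBNAME) RESTRICT    only objects in a restricted status
-DISPLAY DATABASE(DBNAME) ADVISORY    only objects in an advisory status
-DISPLAY DATABASE(DBNAME) LIMIT(*)    lift the default 50-line output limit
```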
Moving on, the DISPLAY BUFFERPOOL command provides the current status and allocation information for each buffer pool. The output includes the number of pages assigned to each pool, whether the pages have been allocated, and the current settings for the sequential steal and deferred write thresholds. For additional information on buffer pools you can specify the DETAIL parameter to return usage information such as number of GETPAGEs, prefetch usage, and synchronous reads. You can use this data for rudimentary buffer pool tuning.
You can gather even more information about your buffer pools using the LIST and LSTATS parameters. The LIST parameter shows open table spaces and indexes within the specified buffer pools; the LSTATS parameter shows statistics for the table spaces and indexes. Statistical information is reset each time DISPLAY with LSTATS is issued, so the statistics are as of the last time LSTATS was issued.
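As a quick sketch (BP0 is a placeholder for the buffer pool you want to examine), the commands discussed above take this form:

```
-DISPLAY BUFFERPOOL(BP0) DETAIL              usage details: GETPAGEs, prefetch, sync reads
-DISPLAY BUFFERPOOL(BP0) LIST(*) LSTATS(*)   open spaces/indexes plus their statistics
```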
If you are charged with running (IBM) DB2 utilities, another useful command is DISPLAY UTILITY. Issuing this command causes DB2 to display the status of all active, stopped, or terminating utilities. So, if you are in over the weekend running REORGs, RUNSTATS, or image copies, you can issue occasional DISPLAY UTILITY commands to keep up-to-date on the status of your jobs. By monitoring the current phase of the utility and matching it against the list of phases for that utility, you can determine the relative progress of the utility as it processes. The COUNT specified for each phase lists the number of pages that have been loaded, unloaded, copied, or read.
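For example (UTILID is a placeholder for a specific utility identifier):

```
-DISPLAY UTILITY(*)        status of all active, stopped, or terminating utilities
-DISPLAY UTILITY(UTILID)   status of the single utility identified by UTILID
```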
You can use the DISPLAY LOG command to display information about the number of active logs, their current capacity, and the setting of the LOGLOAD parameter. For archive logs, use the DISPLAY ARCHIVE command.
DISPLAY is helpful, too, if your organization uses stored procedures or user-defined functions (UDFs). DISPLAY PROCEDURE monitors whether procedures are currently started or stopped, how many requests are currently executing, the high-water mark for requests, how many requests are queued, how many times a request has timed out, and the WLM environment in which the stored procedure executes. And you can use the DISPLAY FUNCTION SPECIFIC command to monitor UDF statistics.
DISPLAY also returns a status indicating the state of each procedure or UDF. A procedure or UDF can be in one of four potential states: STARTED, STOPQUE (requests are queued), STOPREJ (requests are rejected), or STOPABN (requests are rejected because of abnormal termination).
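A sketch of those two commands (the schema and procedure names are placeholders):

```
-DISPLAY PROCEDURE(*)               status and statistics for stored procedures
-DISPLAY PROCEDURE(SCHEMA.PROC1)    a single stored procedure
-DISPLAY FUNCTION SPECIFIC(*)       statistics for user-defined functions
```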
And there remains a wealth of additional information that the DISPLAY command can uncover. For distributed environments, DISPLAY DDF shows configuration and status information, as well as statistical details on distributed connections and threads; DISPLAY LOCATION shows distributed thread details; DISPLAY PROFILE shows whether profiling is active or inactive; DISPLAY GROUP provides details of data-sharing groups (including the version of DB2 for each member) and DISPLAY GROUPBUFFERPOOL shows information about the status of DB2 group buffer pools; DISPLAY RLIMIT provides the status of the resource limit facility; DISPLAY THREAD displays active and in-doubt connections to DB2; and DISPLAY TRACE lists your active trace types and classes along with the specified destinations for each.
If you are looking for some additional, more in-depth details on the DISPLAY command, take a look at this series of blog posts I wrote last year:
- Part 1 of the series focused on using DISPLAY to monitor details about your database objects;
- Part 2 focused on using DISPLAY to monitor your DB2 buffer pools;
- Part 3 covered utility execution and log information;
- And Part 4 examined using the DISPLAY command to monitor DB2 stored procedures and user-defined functions.
Summary
The DB2 DISPLAY command is indeed a powerful and simple tool that can be used to gather a wide variety of details about your DB2 subsystems and databases. Every DBA should know how to use DISPLAY and its many options to simplify their day-to-day duties and job tasks.
Saturday, February 01, 2014
The Twelve DBA Rules of Thumb... a summary
Over the past couple of months this blog has offered up some rules of thumb for DBAs to follow that can help you to build a successful and satisfying career as a DBA. These twelve rules of thumb worked well for me as I worked my way through my career and I have shared them with you, my faithful readers, so that you can benefit from my experiences. I hope you find them useful... and if I have missed anything, please post a comment with your thoughts and experiences on being a good DBA.
As a reminder of what we have discussed, I am posting a short synopsis of the Twelve DBA Rules of Thumb here, along with links to each blog post.
1. Write Down Everything
2. Automate Routine Tasks
3. Share Your Knowledge
4. Analyze, Simplify and Focus
5. Don't Panic!
6. Be Prepared
7. Don't Be a Hermit
8. Understand the Business, Not Just the Technology
9. Ask for Help When You Need It
10. Keep Up-to-Date
11. Invest in Yourself
12. Be a Packrat
Good luck with your career as a DBA...
Saturday, January 25, 2014
DBA Rules of Thumb - Part 12 (Be a Packrat)
Today's post in the DBA Rules of Thumb series is short and sweet. It can be simply stated as "Keep Everything!"
Database administration is the perfect job for you if you are a pack rat.
It is a good practice to keep everything you come across during the course of performing your job. When you slip up and throw something away, it always seems like you come across a task the very next day where that stuff would have come in handy... but you threw it out!
I still own some printed manuals for DB2 Version 2. They are packed up in a plastic tub in my garage, but I have them in case I need them.
Tuesday, January 21, 2014
DBA Rules of Thumb - Part 11 (Invest in Yourself)
Most IT professionals continually look for their company
to invest money in their ongoing education. Who among us does not want to learn
something new — on company time and with the company’s money? Unless you are
self-employed, that is!
Yes, your company should invest some funds to train you on new
technology and new capabilities, especially if it is asking you to do new
things. And since technology changes so fast, most everyone has to learn
something new at some point every year. But the entire burden of learning
should not be placed on your company.
Budget some of your own money to invest in your career. After
all, you probably won’t be working for the same company your entire career. Why
should your company be forced to bankroll your entire ongoing education? Now, I
know, a lot depends on your particular circumstances. Sometimes we accept a
lower salary than we think we are worth because of the “perks” that are
offered. And one of those perks can be training. But perks have a way of disappearing once you are "on the job."
Some folks simply abhor spending any
of their hard-earned dollars to help advance their careers. This is not a reasonable approach to your career! Shelling out a
couple of bucks to buy some new books, subscribe to a publication, or join a
professional organization should not be out of the reach of most DBAs.
A willingness to spend some money to stay abreast of technology
is a trait that DBAs need to embrace.
Most DBAs are insatiably curious, and many are willing to invest some of their money to learn something new. Maybe they bought that book on NoSQL before anyone at their company started using it. Perhaps it is just that enviable bookshelf full of useful database books in their cubicle. Or maybe they paid that nominal fee to subscribe to the members-only content of that SQL Server portal. They could even have forked over the $25 fee to attend the local user group.
Don’t get me wrong. I’m not saying that companies should not
reimburse for such expenses. They should, because it provides for
better-rounded, more educated, and more useful employees. But if your employer
won’t pay for something that you think will help your career, why not just buy
it yourself?
Sunday, January 12, 2014
DBA Rules of Thumb - Part 10 (Keep Up-to-Date)
If you wish to be a successful DBA for a long period of
time, you will have to keep up-to-date on all kinds of technology — both
database-related and other.
Of course, as a DBA, your first course of action should
be to be aware of all of the features and functions available in the DBMSs in
use at your site — at least at a high level, but preferably in depth. Read the
vendor literature on future releases as it becomes available to prepare for new
functionality before you install and migrate to new DBMS releases. The sooner
you know about new bells and whistles, the better equipped you will be to
prepare new procedures and adopt new policies to support the new features.
Keep up-to-date on technology in general, too. For example, DBAs
should understand new data-related technologies such as NoSQL, Hadoop, and
predictive analytics, but also other newer technologies that interact with
database systems. Don’t ignore industry and technology trends simply because
you cannot immediately think of a database-related impact. Many
non-database-related “things” (for example, XML) eventually find their way into
DBMS software and database applications.
Keep up-to-date on industry standards — particularly those that
impact database technology such as the SQL standard. Understanding these
standards before the new features they engender have been incorporated into
your DBMS will give you an edge in their management. DBMS vendors try to
support industry standards, and many features find their way into the DBMS
because of their adoption of an industry standard.
As we've already discussed in this series, one way of keeping
up-to-date is by attending local and national user groups. The presentations
delivered at these forums provide useful education. Even more important,
though, is the chance to network with other DBAs to share experiences and learn
from each other’s projects.
Through judicious use of the Internet and the Web, it is easier
than ever before for DBAs to keep up-to-date. Dozens of useful and informative
Web sites provide discussion forums, script libraries, articles, manuals, and
how-to documents. Consult my web site at http://www.craigsmullins.com/rellinks.html
for a regularly-updated list of DBMS,
data, and database-related Web resources.
Remember, though, this is just a starting point. There are
countless ways that you can keep-up-to-date on technology. Use every avenue at
your disposal to do so, or risk becoming obsolete.
Sunday, January 05, 2014
DBA Rules of Thumb - Part 9 (Call on Others for Help When Needed)
Use All of the Resources at Your Disposal
Remember that you do not have to do everything yourself.
Use the resources at your disposal. We have talked about some of those
resources, such as articles and books, Web sites and scripts, user groups and
conferences. But there are others.
Do not continue to struggle with problems when you are completely
stumped. Some DBAs harbor the notion that they have to resolve every issue
themselves in order to be successful. Sometimes you just need to know where to
go to get help to solve the problem. Use the DBMS vendor’s technical support,
as well as the technical support line of your DBA tool vendors. Consult
internal resources for areas where you have limited experience, such as network
specialists for network and connectivity problems, system administrators for
operating system and system software problems, and security administrators for
authorization and protection issues.
As a DBA you are sometimes thought of as "knowing everything" (or worse, a know-it-all), but it is far more important to know where to go to get help to solve problems than it is to try to know everything there is to know. Let's face it... it is just not possible to know everything about database systems and making them work with all types of applications and users these days.
When you go to user groups, build a network of DBA colleagues
whom you can contact for assistance. Many times others have already encountered
and solved the problem that vexes you. A network of DBAs to call on can be an
invaluable resource (and no one at your company even needs to know that you called for outside help).
Finally, be sure to
understand the resources available from your DBMS vendors. DBMS vendors offer
their customers access to a tremendous amount of useful information. All of the
DBMS vendors offer software support on their Web sites. Many of them provide a
database that users can search to find answers to database problems. IBM
customers can use IBMLink, and both Oracle and Microsoft offer a searchable
database in the support section of their Web sites. Some DBAs claim to be able
to solve 95 percent or more of their problems by researching online databases.
These resources can shrink the amount of time required to fix problems, especially
if your DBMS vendor has a reputation of “taking forever” to respond to issues.
Of course, every DBA should also be equipped with the DBMS vendor’s
technical support phone number for those tough-to-solve problems. Some support
is offered on a pay-per-call basis, whereas other times there is a prepaid
support contract. Be sure you know how your company pays for support before
calling the DBMS vendor. Failure to know this can result in your incurring
significant support charges.
Thursday, January 02, 2014
DBA Rules of Thumb - Part 8 (Being Business Savvy)
Understand the Business, Not Just the Technology
Remember that being technologically adept is just a part
of being a good DBA. Although technology is important, understanding your
business needs is more important. If you do not understand the impact on the
business of the databases you manage, you will simply be throwing technology
around with no clear purpose.
Business needs must dictate what technology is applied to what
database—and to which applications. Using the latest and greatest (and most
expensive) technology and software might be fun and technologically
challenging, but it most likely will not be required for every database you
implement. The DBA’s tools and utilities need to be tied to business strategies
and initiatives. In this way, the DBA’s work becomes integrated with the goals
and operations of the organization.
The first step in achieving this needed synergy is the
integration of DBA services with the other core components of the IT
infrastructure. Of course, DBAs should be able to monitor and control the
databases under their purview, but they should also be able to monitor them
within the context of the broader spectrum of the IT infrastructure—including
systems, applications, storage, and networks. Only then can companies begin to
tie service-level agreements to business needs, rather than technology metrics.
DBAs should be able to gain insight into the natural cycles of
the business just by performing their job. Developers and administrators of
other parts of the IT infrastructure will not have the vision into the busiest
times of the day, week, quarter, or year because they are not privy to the
actual flow of data from corporate business functions. But the DBA has access
to that information as a component of performing the job. It is empowering to
be able to understand business cycle information and apply it on the job.
DBAs need to expand further to take advantage of their special
position in the infrastructure. Talk to the end users — not just the application
developers. Get a sound understanding of how the databases will be used before
implementing any database design. Gain an understanding of the database’s
impact on the company’s bottom line, so that when the inevitable problems occur
in production you will remember the actual business impact of not having that
data available. This also allows you to create procedures that minimize the
potential for such problems.
To fulfill the promise of business/IT integration, it will be
necessary to link business services to the underlying technology. For example,
a technician should be able to immediately comprehend that a service outage to
transaction X7R2 in the PRD2 environment means that regional demand deposit
customers cannot access their accounts. See the difference?
Focusing on transactions, TP monitors, and databases is the core
of the DBA’s job. But servicing customers is the reason the DBA builds those
databases and manages those transactions. Technicians with an understanding of
the business impact of technology decisions will do a better job of servicing
the business strategy. This is doubly true for the DBA’s manager. Technology
managers who speak in business terms are more valuable to their company.
Of course, the devil is in the details. A key component of
realizing effective business/IT integration for DBAs is the ability to link
specific pieces of technology to specific business services. This requires a
service impact management capability—that is, analyzing the technology required
to power each critical business service and documenting the link. Technologies
exist to automate some of this through event automation and service modeling.
Such capabilities help to transform availability and performance data into
detailed knowledge about the status of business services and service-level
agreements.
Today’s modern corporations need technicians who are cognizant of
the business impact of their management decisions. As such, DBAs need to get
busy transforming themselves to become more business savvy — that is, to keep
an eye on the business impact of the technology under their span of control.
Friday, December 20, 2013
DBA Rules of Thumb - Part 7 (Don't Become a Hermit!)
Part 7 of our ongoing series on DBA Rules of Thumb is a short one on being accessible and approachable... in other words, Don't Be a Hermit!
Sometimes DBAs are viewed as the "curmudgeon in the corner" -- you know the type, don't bother "Neil," he'll just yell at you and call you stupid. Don't be like Neil!
Instead, develop a good working relationship with the application developers. Don’t isolate
yourself in your own little DBA corner of the world. The more you learn about
what the applications do and the application requirements, the better you can
adjust and tune the databases to support those applications.
A DBA should be accessible. Don’t be one of those DBAs whom
everyone is afraid to approach. The more you are valued for your expertise and
availability, the more valuable you are to your company.
Sunday, December 15, 2013
DBA Rules of Thumb - Part 6 (Preparation)
Measure Twice, Cut Once
Being prepared means analyzing, documenting, and testing
your DBA policies and procedures. Creating procedures in a vacuum without
testing will do little to help you run an efficient database environment. Moreover,
it will not prepare you to react rapidly and effectively to problem situations.
The old maxim applies: Measure twice, cut once. In the case of
DBA procedures, this means analyze, test, and then apply. Analyze your
environment and the business needs of the databases to create procedures and
policies that match those needs. Test those procedures. Finally, apply them to
the production databases.
DBAs must be calm amid stress, and must prepare for every situation that can reasonably be expected to occur. And when the unthinkable occurs, the DBA remains logical and thorough: collecting details, ferreting out the root cause of the problem, and taking only the actions necessary to remediate it. This Rule of Thumb ties in nicely with the last one (Don't Panic!). Every action you take should be planned and implemented with a calm disposition. Analysis and preparation are the friends of the DBA. The last thing you want to do is rage into a problem scenario making changes like a gunslinger who acts first and worries about the consequences later.
Monday, December 09, 2013
DBA Rules of Thumb - Part 5 (Don’t Panic!)
Way back in the early 1990s when I was
working as a DBA I had a button pinned up in my cubicle that read in large
letters “DON’T PANIC!” If I recall correctly, I got it for free inside a game
from back in those days based on “The Hitchhiker’s Guide to the Galaxy.” When I
left my job as a DBA to go to work for a software company I bequeathed that
button to a friend of mine (Hello, Chris!) who was taking over my duties… for
all I know, he still has that button pinned up in his office.
But the ability to forgo panicking is a very
important quality in a DBA.
A calm disposition and the ability to remain
cool under strenuous conditions are essential to the makeup of a good DBA.
Problems will occur—nothing you can do can eliminate every possible problem or
error. Part of your job as a DBA is to be able to react to problems with a calm
demeanor and analytical disposition.
When a database is down and applications are
unavailable, your environment will become hectic and frazzled. The best things
you can do when problems occur are to remain calm and draw on your extensive
knowledge and training. As the DBA, you will be the focus of the company (or at
least the business units affected) until the database and applications are
brought back online. It can be a harrowing experience to recover a database
with your boss and your users hovering behind your computer terminal and
looking over your shoulder. Be prepared for such events, because eventually
they will happen. Panic can cause manual errors—the last thing you want to
happen when you are trying to recover from an error.
The more comprehensive your planning and the
better your procedures, the faster you will be able to resolve problems.
Furthermore, if you are sure of your procedures, you will remain much calmer.
So Don’t Panic!
Monday, December 02, 2013
DBA Rules of Thumb - Part 4 (Analyze, Simplify, and Focus)
The job of a DBA
is complex and spans many diverse technological and functional areas. It is
easy for a DBA to get overwhelmed with certain tasks—especially those that are
not performed regularly. In a complex, heterogeneous, distributed world it can
be hard to keep your eye on the right ball, at the right time. The best advice
I can give you is to remain focused and keep a clear head.
Understand the purpose for each task and focus on performing the
steps that will help you to achieve that end. Do not be persuaded to broaden
the scope of work for individual tasks unless it cannot be avoided. In other
words, don’t try to boil the ocean. If unrelated goals get grouped together
into a single task, it becomes easy to work long hours with no clear end in sight.
I am not saying that a DBA should (necessarily) specialize in one
particular area (e.g., performance). What I am suggesting is that each task
should be given the appropriate level of focus and attention to detail. Of
course, I am not suggesting that you should not multitask either. The
successful DBA will be able to multitask while giving full attention to each
task as it is being worked on.
What is the enemy of focus? There are many: distraction, lack of
knowledge, “management,” and always worrying about the next thing to try or do.
Such distractions can wreak havoc on tasks that require forethought and
attention to detail.
Analyze, simplify, and focus. Only then will tasks become
measurable and easier to achieve.
Monday, November 18, 2013
DBA Rules of Thumb - Part 2 (Automate)
Why should you do it by hand if you can automate DBA processes? Anything you
can do probably can be done better by the computer – if it is programmed to do
it properly. And once it is automated you save yourself valuable time. And that
time can be spent tackling other problems, learning about new features and
functionality, or training others.
Furthermore, don’t reinvent the wheel. Someone, somewhere, at some time may have already solved the problem you currently are attempting to solve. Look to the web for sites that allow you to download and share scripts. Or, if you have the budget, look to purchase DBA tools from ISVs. There are a lot of good tools out there, available from multiple vendors, that can greatly simplify the task of database administration. Automating performance management, change management, backup and recovery, and other tasks can help to reduce the amount of time, effort, and human error involved in managing database systems.
Of course, you can take the automation idea too far. There has been a lot of talk and vendor hype lately about self-managing database systems. For years now, pundits and polls have been asking when automation will make the DBA job obsolete. The correct answer is "never" - or, at least, not any time soon.
There are many reasons why DBAs are not on the fast path to extinction. Self-managing database systems are indeed a laudable goal, but we are very far away from a “lights-out” DBMS environment. Yes, little by little and step by step, database maintenance and performance management are being improved, simplified, and automated. But you know what? DBAs will not be automated out of existence in my lifetime – and probably not in your children’s lifetime either.
Many of the self-managing features require using the built-in tools from the DBMS vendor, but many organizations prefer to use heterogeneous solutions that can administer databases from multiple vendors (Oracle, DB2, SQL Server, MySQL, etc.) all from a single console. Most of these tools have had self-managing features for years and yet they did not make the DBA obsolete.
And let’s face it, a lot of the DBMS vendors’ claims are more hyperbole than fact. Some self-managing features are announced years before they become generally available in the DBMS. All vendor claims to the contrary, no database today is truly 100 percent self-managing. Every database needs some amount of DBA management – even when today’s best self-management features are being used.
What about the future? Well, things will get better – and probably more costly. You don’t think the DBMS vendors are building this self-management technology for free, do you? But let’s remove cost from the equation for a moment. What can a self-managing database actually manage?
Most performance management solutions allow you to set performance thresholds. A threshold allows you to set up a trigger that says something like “When x% of a table’s pages contain chained rows or fragmentation, schedule a reorganization.” But these thresholds are only as good as the variables you set and the actions you define to be taken upon tripping the threshold.
Some software is bordering on intelligent; that is, it “knows” what to do and when to do it. Furthermore, it may be able to learn from past actions and results. The more intelligence that can be built into a self-managing system, the better the results typically will be. But who among us currently trusts software to work like a grizzled veteran DBA?
The management software should be configurable such that it alerts the DBA as to what action it wants to take. The DBA can review the action and give a “thumbs up” or “thumbs down” before the corrective measure is applied. In this way, the software can earn the DBA’s respect and trust. When the DBA trusts the software, he can turn it on so that it self-manages “on the fly” without DBA intervention. But today, in most cases, a DBA is required to set up the thresholds, as well as to ensure their ongoing viability.
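A threshold of this sort often boils down to a simple query against the DBMS's statistics. Here is a hedged sketch using DB2's real-time statistics tables (the 10 percent figure is an arbitrary example, and the column names should be verified against your own DB2 version):

```sql
-- Hypothetical REORG threshold check: flag table spaces where rows
-- inserted since the last REORG exceed 10 percent of total rows.
-- Column names are from DB2 real-time statistics; verify before use.
SELECT DBNAME, NAME, TOTALROWS, REORGINSERTS
  FROM SYSIBM.SYSTABLESPACESTATS
 WHERE TOTALROWS > 0
   AND (REORGINSERTS * 100.0) / TOTALROWS > 10
 ORDER BY DBNAME, NAME;
```

A scheduler (or the automation tool itself) can run such a query periodically and either alert the DBA or queue a REORG for each row returned.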
Of course, not all DBA duties can be self-managed by software. Most self-management claims are made for performance management, but what about change management? The DBMS cannot somehow read the mind of its user and add a new column or index, or change a data type or length. This non-trivial activity requires a skilled DBA to analyze the database structures, develop the modifications, and deploy the proper scripts or tools to implement the change. Of course, software can help simplify the process, but software cannot replace the DBA.
Furthermore, database backup and recovery will need to be guided by the trained eye of a DBA. Perhaps the DBMS can become savvy enough to schedule a backup when a system process occurs that requires it. Maybe the DBMS of the future will automatically schedule a backup when enough data changes. But sometimes backups are made for other reasons: to propagate changes from one system to another, to build test beds, as part of program testing, and so on. A skilled professional is needed to build the proper backup scripts, run them appropriately, and test the backup files for accuracy. And what about recovery? How can a damaged database know it needs to be recovered? Because the database is damaged any self-managed recovery it might attempt is automatically rendered suspect. Here again, we need the wisdom and knowledge of the DBA.
And there are many other DBA duties that cannot be completely automated. Because each company is different, the DBMS must be customized using configuration parameters. Of course, you can opt to use the DBMS “as is,” right out-of-the-box. But a knowledgeable DBA can configure the DBMS so that it runs appropriately for their organization. Problem diagnosis is another tricky subject. Not every problem is readily solvable by developers using just the Messages and Codes manual and a help desk. What happens with particularly thorny problems if the DBA is not around to help?
Of course, the pure, heads-down systems DBA may (no, let's say should) become a thing of the past. Instead, the modern DBA will need to understand multiple DBMS products, not just one. DBAs furthermore must have knowledge of the business impact of each database under their care (more details here). And DBAs will need better knowledge of logical database design and data modeling – because it will advance their understanding of the meaning of the data in their databases.
Finally, keep in mind that we didn't automate people out of existence when we automated HR or finance. Finance and HR professionals are doing their jobs more efficiently and effectively, and they have the ability to deliver a better product in their field. That's the goal of automation. So, as we automate portions of the DBA’s job, we'll have more efficient data professionals managing data more proficiently.
This blog entry started out as a call to automate, but I guess it kinda veered off into an extended dialogue on what can, and cannot, be accomplished with automation. I guess the bottom line is this... Automation is key to successful, smooth-running databases and applications... but don't get too carried away by the concept.
I hope you found the ideas here to be useful... and feel free to add your own thoughts and comments below!
Wednesday, November 13, 2013
DBA Rules of Thumb - Part 1
Over the years I have gathered, written, and assimilated multiple collections of general rules of the road that apply to the management discipline of Database Administration (DBA). With that in mind, I thought it would be a good idea to share some of these Rules of Thumb (or ROTs) with you in a series of entries to my blog.
Now even though this is a DB2-focused blog, the ROTs that I will be covering here are generally applicable to all DBMSs and database professionals.
The theme for this series of posts is that database administration is a very technical discipline. There is a lot to know and a lot to learn. But just as important as technical acumen is the ability to carry oneself properly and to embrace the job appropriately. DBAs are, or at least should be, very visible politically within the organization. As such, DBAs should be armed with the proper attitude and knowledge before attempting to practice the discipline of database administration.
Today's blog entry offers up an introduction, to let you know what is coming. But I also will share with you the first Rule of Thumb... which is
#1 -- Write Down Everything
During the course of performing your job as a DBA, you are
likely to encounter many challenging tasks and time consuming problems. Be sure
to document the processes you use to resolve problems and overcome challenges.
Such documentation can be very valuable should you encounter the same, or a
similar, problem in the future. It is better to read your notes than to try to
recreate a scenario from memory.
Think about it like this... aren't we always encouraging developers to document their code? Well, you should be documenting your DBA procedures and practices, too!
And in Future Posts...
In subsequent posts over the course of the next few weeks I will post some basic guidelines to help you become a well-rounded, respected, and professional DBA.
I encourage your feedback along the way. Feel free to share your thoughts and Rules of Thumb -- and to agree or disagree with those I share.
Friday, October 25, 2013
Say "Hello" to DB2 11 for z/OS
DB2 11 for z/OS
Generally Available Today, October 25, 2013
As was announced earlier this month (see press release) Version 11 of DB2 for z/OS is officially available as of today. Even if your
company won’t be migrating right away, the sooner you start learning about DB2
11, the better equipped you will be to embrace it when you inevitably must use
and support it at your company.
So let’s take a quick look at some of the highlights of this
latest and greatest version of our favorite DBMS. As usual, a new version of
DB2 delivers a large number of new features, functions, and enhancements, so of
course, not every new DB2 11 “thing” will be addressed in today’s blog entry.
Performance Claims
Similar to most recent DB2 versions, IBM boasts of
performance improvements that can be achieved by migrating to DB2 11. The
claims for DB2 11 from IBM are out-of-the-box savings ranging from 10 percent
to 40 percent for different types of workloads: up to 10 percent for complex
OLTP and update-intensive batch, and up to 40 percent for queries.
As usual, your actual mileage may vary. It all depends upon things
like the query itself, the number of columns requested, the number of partitions that
must be accessed, indexing, and on and on. So even though it looks like
performance gets better in DB2 11, take these estimates with a grain of salt.
The standard operating procedure of rebinding to achieve the
best results still applies. And, of course, if you use the new features of DB2
11 IBM claims that you can achieve additional performance improvements.
DB2 11 also offers improved synergy with the latest
mainframe hardware, the zEC12. For example, FLASH Express and pageable 1MB
frames are used for buffer pool control blocks and DB2 executable code. So keep
in mind that getting to the latest hardware can help out your DB2 performance
and operation!
Programmer Features
Let’s move along and take a look at some of the great new
features for building applications offered up by DB2 11. There are a slew of
new SQL and analytical capabilities in the new release, including:
- Global variables – which can be used to pass data from program to program without the need to put data into a DB2 table
- Improved SQLPL functionality, including an array data type which makes SQLPL more computationally complete and simplifies coding SQL stored procedures.
- Alias support for sequence objects.
- Improvements to Declared Global Temporary Tables (DGTTs), including the ability to create NOT LOGGED DGTTs and the ability to use RELEASE DEALLOCATE for SQL statements written against DGTTs.
- SQL Compatibility feature, which can be used to minimize the impact of new version changes on existing applications.
- Support for views on temporal data.
- SQL GROUPING SETS, including ROLLUP and CUBE.
- XML enhancements including XQuery support, XMLMODIFY for improved updating of XML nodes, and improved validation of XML documents.
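To make a couple of these concrete, here is a hedged sketch of a global variable and a NOT LOGGED declared temporary table (the object names are invented for illustration; check the DB2 11 SQL Reference for the full syntax):

```sql
-- A global variable: any SQL in the same thread can read or set it,
-- so programs can share a value without a DB2 table (TAX_RATE is invented)
CREATE VARIABLE TAX_RATE DECIMAL(5,2) DEFAULT 0.00;
SET TAX_RATE = 8.25;

-- A declared temporary table that is not logged,
-- avoiding logging overhead for scratch data
DECLARE GLOBAL TEMPORARY TABLE SESSION.WORK_ORDERS
  (ORDER_ID  INTEGER,
   ORDER_AMT DECIMAL(11,2))
  ON COMMIT PRESERVE ROWS
  NOT LOGGED;
```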
Another key new capability for managing applications is the
addition of the APREUSE(WARN) BIND parameter. Before we learn about the new
feature, let’s backtrack for a moment to talk about the current (DB2 10)
capabilities of the APREUSE parameter. There are currently two options:
- APREUSE(NONE): DB2 will not try to reuse previous access paths for statements in the package. (default value)
- APREUSE(ERROR): DB2 tries to reuse previous access paths for SQL statements in the package. If the access paths cannot be reused, the operation fails and no new package is created.
So you can
either not try to reuse or try to reuse, and if you can’t reuse when you try
to, you fail. Obviously, a third, more palatable choice was needed. And DB2 11
adds this third option.
- APREUSE(WARN): DB2 tries to reuse previous access paths for SQL statements in the package, but the bind or rebind is not prevented when they cannot be reused. Instead, DB2 generates a new access path for that SQL statement.
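On a REBIND command this might look like the following sketch (the collection and package names are invented; consult the DB2 Command Reference for the full syntax):

```
REBIND PACKAGE(COLL1.MYPKG) APREUSE(WARN) EXPLAIN(YES)
```

With WARN in effect, statements whose access paths could not be reused are flagged with warning messages rather than failing the rebind, and the EXPLAIN output can be used to review the newly generated paths.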
DBA and Other
Technical Features
There are also a slew of new in-depth technical and DBA-related
features in DB2 11. Probably the most important, and one that impacts
developers too, is transparent archiving using DB2’s temporal capabilities first
introduced in DB2 10.
Basically, if you know how to set up SYSTEM time temporal
tables, setting up transparent archiving will be a breeze. You create both the table
and the archive table and then associate the two. This is done by means of the ENABLE
ARCHIVE USE clause. DB2 is aware of the connection between the operational
table and the archive table, so any data that is deleted will be moved to the
archive table.
Unlike SYSTEM time temporal tables, only deleted data is moved to the
archive table. There is also a new system-defined global variable,
MOVE_TO_ARCHIVE, to control the ability to DELETE data without
archiving it, should you need to do so.
Of course, there are more details to learn about this
capability, but remember, we are just touching on the highlights today!
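As a hedged sketch (table names invented; consult the DB2 11 SQL Reference for the exact syntax and for the global variable's allowable values), setting up transparent archiving looks roughly like this:

```sql
-- Operational table and an identically structured archive table
CREATE TABLE POLICY
  (POLICY_NO INTEGER NOT NULL,
   STATUS    CHAR(1));
CREATE TABLE POLICY_ARCH LIKE POLICY;

-- Associate the two: DB2 now knows that deleted POLICY rows
-- should be moved to POLICY_ARCH
ALTER TABLE POLICY ENABLE ARCHIVE USE POLICY_ARCH;

-- The built-in global variable governs whether deletes are archived;
-- changing its value lets you delete without archiving when needed
SET SYSIBMADM.MOVE_TO_ARCHIVE = 'Y';
```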
Another notable feature that will interest many DBAs is the
ability to use SQL to query more DB2 Directory tables. The list of DB2
Directory tables which now can be accessed via SQL includes:
- SYSIBM.DBDR
- SYSIBM.SCTR
- SYSIBM.SPTR
- SYSIBM.SYSLGRNX
- SYSIBM.SYSUTIL
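So, for example, a quick check on utility activity can now be done with plain SQL instead of a utility or special program (a minimal sketch; verify the available columns in your own subsystem before relying on it):

```sql
-- SYSUTIL records active and stopped utilities;
-- prior to DB2 11 it could not be queried with SQL
SELECT COUNT(*) AS UTIL_ENTRIES
  FROM SYSIBM.SYSUTIL;
```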
Another regular area of improvement in new DB2 versions is the
IBM DB2 utilities, and DB2 11 is no exception to the rule. DB2 11
brings the following improvements:
- REORG – automated mapping tables (where DB2 takes care of the allocation and removal of the mapping table during a SHRLEVEL CHANGE reorganization), online support for REORG REBALANCE, automatic cleanup of empty partitions for PBG table spaces, LISTPARTS for controlling parallelism, and improved switch phase processing.
- RUNSTATS – additional zIIP processing, RESET ACCESSPATH capability to reset existing statistics, and improved inline statistics gathering in other utilities.
- LOAD – additional zIIP processing, the ability to load multiple partitions in parallel using a single SYSREC, and support for the extended RBA/LRSN.
- REPAIR – new REPAIR CATALOG capability to find and correct for discrepancies between the DB2 Catalog and database objects.
- DSNACCOX – performance improvements
DB2 11 also delivers a bevy of new security-related enhancements,
including:
- Better coordination between DB2 and RACF, including new installation parameters (AUTHEXIT_CHECK and AUTHEXIT_CACHEREFRESH) and the ability for DB2 to capture event notifications from RACF
- New PROGAUTH bind plan option to ensure the program is authorized to use the plan.
- The ability to create MASKs and PERMISSIONs on archive tables and archive-enabled tables
- Column masking restrictions are removed for GROUP BY and DISTINCT processing
An additional online schema change capability in DB2 11 is
support for online altering of limit keys, which enables DBAs to change the
limit keys for a partitioned table space without impacting data availability.
Finally, in terms of online schema change, we have an
improvement to operational administration for deferred schema changes. DB2 11
provides improved recovery for deferred schema changes. With DB2 10, once a REORG
begins to materialize pending changes, it is no longer possible to perform a recovery
to a prior point in time. DB2 11 removes this restriction, allowing recovery to
any valid prior point.
In terms of Buffer Pool enhancements, DB2 11 offers up the new
2GB frame size for very large BP requirements.
In terms of Data Sharing enhancements, DB2 11 offers faster
CASTOUT, improved RESTART LIGHT capability, and automatic recovery of all pages
in LPL during a DB2 restart.
Analytics and Big
Data Features
There are also a lot of features added to DB2 11 to support
Big Data and analytical processing. Probably the biggest is the ability to
support Hadoop access. If you don’t know what Hadoop is, this is not the place
to learn about that. Instead, check out this link.
Anyway, DB2 11 can be used to enable applications to easily
and efficiently access Hadoop data sources. This is done via the generic table
UDF capability in DB2 11. Using this feature, you can create a UDF whose
output table has a variable shape.
This capability allows access to BigInsights, which is IBM’s
Hadoop-based platform for Big Data. As such, you can use JSON to access Hadoop
data via DB2 using the UDF supplied by IBM BigInsights.
DB2 11 also adds new SQL analytical extensions, including:
- GROUPING SETS can be used for GROUP BY operations to enable multiple grouping clauses to be specified in a single statement.
- ROLLUP can be used to aggregate values along a dimension hierarchy. In addition to aggregation along the dimensions a grand total is produced. Multiple ROLLUPs can be coded in a single query to produce multidimensional hierarchies in a result set.
- CUBE can be used to aggregate data based on columns from multiple dimensions. You can think of it like a cross tabulation.
DB2 11 also delivers improvements for use with the IBM DB2 Analytics Accelerator (IDAA), including:
- The ability to store 1.3 PB of data
- Change Data Capture support to capture changes to DB2 data and propagate them to IDAA as they happen
- Additional SQL function support for IDAA queries (including SUBSTRING, among others, and additional OLAP functions).
- Work Load Manager integration
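The new grouping extensions described above can be sketched against a simple (hypothetical) SALES table:

```sql
-- GROUPING SETS: subtotals by (region, product), by region,
-- and a grand total, all in a single pass
SELECT REGION, PRODUCT, SUM(AMOUNT) AS TOTAL_AMT
  FROM SALES
 GROUP BY GROUPING SETS ((REGION, PRODUCT), (REGION), ());

-- ROLLUP produces the same hierarchy more compactly;
-- CUBE (REGION, PRODUCT) would add the by-product subtotal as well
SELECT REGION, PRODUCT, SUM(AMOUNT) AS TOTAL_AMT
  FROM SALES
 GROUP BY ROLLUP (REGION, PRODUCT);
```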
Of course, there are additional features and functionality
being introduced with DB2 11 for z/OS. A blog entry of this nature on the day
of GA cannot exhaustively cover everything. That being said, two additional
areas are worth noting.
- Extended log record addressing – increases the size of the RBA and LRSN from 6 bytes to 10 bytes. This avoids the outage that is required if the amount of log records accumulated exhausts the capability of DB2 to create new RBAs or LRSNs. To move to the new extended log record addressing requires converting your BSDSs.
- DRDA enhancements – including improved client info properties, new FORCE option to cancel distributed threads, and multiple performance related improvements.