
Monday, March 11, 2024

Mixing Db2 Database Administration with DevOps - Part 3: Automating DevOps Toolchain

Understanding what is meant by DevOps and the many requisite pieces of the DevOps toolchain is just the first step in adopting DevOps.

The key to success with automating DevOps is a well-constructed toolchain that automates all of the aforementioned processes in an integrated and cooperative manner. If you have been an IT professional for any length of time, most of the categories we just discussed will not be new to you. What is likely new, however, is the integration of these tools so that they work in concert for the purpose of delivering application software quickly and accurately.

Additionally, the list of activities we just reviewed is not comprehensive. A glaring omission is the lack of integration with your database management systems. The orchestration of application changes in the DevOps toolchain is fairly well established, but this is not so much the case for database changes. And this has especially been the case for the mainframe world in terms of incorporating Db2 for z/OS into the DevOps toolchain.

As any good DBA or performance analyst knows, there are multiple management and operational activities required to assure functionality and optimal performance when applications rely on Db2 databases for persistent storage. Perhaps the single most difficult task is managing database schema changes. 

Thursday, January 07, 2021

BMC AMI Ops: The Next Generation of Mainframe Systems Management

Assuring the performance of your mainframe systems and applications is an imposing task that keeps getting more complex all the time. It makes sense to arm your IT performance analysts, DBAs, and systems programmers with modern tools so you can optimize performance and thereby deliver superior service to your customers.

Of course, BMC MainView has helped IT professionals manage the performance of their mainframe systems and applications for years. But there are new challenges facing modern organizations that require adaptation and transformation.

Organizations are transforming to become autonomous digital enterprises (ADE). This means that things are getting more complex because availability requirements are expanding (many times requiring 24/7 availability), but IT pros are expected to resolve problems rapidly even as workloads become more unpredictable and IT staff has less experience. These challenges are real and require attention.

And that is why BMC is transforming its MainView product line into BMC AMI Ops!

With BMC AMI Ops you can experience next-level mainframe operational resiliency, AI-powered observability, an intuitive user interface with embedded expertise, actionable insights, and enterprise platform interoperability.

How is BMC AMI Ops engineered to help? Well, it is built for digital business with the understanding that being reactive is not sufficient these days. BMC AMI Ops provides a complete, modular solution with central administration and management.

Artificial intelligence and machine learning techniques are being embraced by an increasing number of organizations for improving their business, so it only stands to reason that your IT operations and support functions should be looking to improve their capabilities using AI and ML, too. And BMC AMI Ops helps you to do that because it is infused with AI/ML-powered analytics to find and fix problems before business services are impacted. With BMC AMI Ops you can improve performance and availability by taking advantage of its built-in intelligent automation and remediation features.

And the user interface is brand new, engineered to support ease of use, to deliver information instead of raw data, and to guide the user experience. BMC AMI Ops delivers a custom dashboard approach where you can group widgets together for related logical systems or business areas. And you get “out of the box” health indicators for each of the widgets you deploy, meaning it takes less time to become productive. Furthermore, a guided path is provided so the user can drill down into additional details as needed. If you are interested in seeing more details on the new user experience for BMC AMI Ops, check out this blog post from Shay Alsberg (BMC AMI Ops: Evolving the MainView User Experience).

And never fear: for those experienced mainframe pros who not only know how to drive ISPF panels but prefer them, BMC AMI Ops can still be accessed using character-based panels.

The bottom line is that BMC AMI Ops is designed for modern businesses and IT organizations as they embrace digital transformation to become autonomous digital enterprises, delivering a simplified yet customizable systems management experience for optimizing system and application performance. That’s BMC AMI Ops in a nutshell… and it is worth looking into how it can help you improve the performance of your systems and applications.

Wednesday, October 21, 2020

Automation and the Future of Modern Db2 Data Management

Recently I was invited by BMC Software to participate in their AMI Z Talk podcast series to talk about modern data management for Db2... and I was happy to accept.

Anne Hoelscher, Director of R&D for BMC's Db2 solutions, and I spent about 30 minutes discussing modern data management, the need for intelligent automation, DevOps, the cloud, and how organizations can achieve greater availability, resiliency, and agility managing their mainframe Db2 environment.

Here's a link to the podcast that you can play right here in the blog!


Modern data management, to me, means flexibility, adaptability, and working in an integrated way with a team. Today’s data professionals have to move faster and more nimbly than ever before. This has given rise to agile development and DevOps - and, as such, modern DBAs participate in development teams. And DBA tasks and procedures are integrated into the DevOps pipeline. 

And as all of this DevOps adoption is happening, the amount of data we store, and have to manage, continues to grow faster than ever before.

These are just some of the challenges that Anne and I discuss in this podcast... and at the end, Anne even asks me to predict the future... 

I hope you'll take the time to listen to our discussion and share your thoughts and issues regarding the resiliency and agility required to succeed with modern data management and Db2 for z/OS.

----------

I’d also like to extend an offer to all the listeners of this BMC podcast (and readers of this blog post) to get a discount on my latest book, A Guide to Db2 Performance for Application Developers. The link is https://tinyurl.com/craigdb2

There’s also a link to the book publisher on the home page of my website. Once you are there, click on the link/banner for the book, and when you order from the publisher you can use the discount code 10percent to get 10% off your order of the print or ebook.

 


Wednesday, May 20, 2020

IBM Think 2020: Virtual, On Demand, Hybrid Cloud and Z

This year’s IBM Think event was quite different than in past years. Usually, Think is an in-person event and attracts a lot of people, typically more than ten thousand IT executives and practitioners. But as we all know, this year with the global COVID-19 pandemic an in-person event was not practical, so IBM held it on-line. And I have to say, they did a fantastic job of managing multiple threads of content without experiencing bandwidth or access issues – at least none that I encountered.
The theme and focus of the content for the event was different, too. Instead of the usual conference focus on products, announcements, and customer stories, this year’s event was more philanthropic. Oh, sure, you could still hear about IBM’s products and customer successes, but the keynote and featured sessions were at a higher level this year.
In the kickoff session, new IBM CEO Arvind Krishna spoke about the driving forces in IT as being hybrid cloud and AI. And he spoke about these things in the context of moving IBM forward, but also how they can be used to help healthcare workers combat pandemics like we are currently experiencing.
In another keynote, IBM Executive Chairman Ginni Rometty spoke with Will.i.am (of the Black-Eyed Peas) about making the digital era inclusive through education, skills development, and the digital workforce.


And then there was Mayim Bialik’s session on women and STEM, which was sincere, heartfelt, and entertaining. 

For those who don’t know who she is, she is the actress who played Blossom (on Blossom) and Amy Farrah Fowler (on The Big Bang Theory)… but she is also a scientist with a doctorate in neuroscience. Bialik’s session focused on putting a positive female face on STEM, something that is definitely needed!

So, what about the technology side of things? Well, you can take a clue from Krishna’s assertion that IBM as a company has to have a “maniacal” focus on hybrid cloud and AI in order to compete. But the company has a rich and deep heritage across the computing spectrum that gives it a key advantage even as it adjusts to embracing hybrid cloud and AI.
The first thing to remember is that IBM uses the term “hybrid multicloud” very specifically and deliberately. Everything is not going to be in the cloud. Large enterprises continue to rely on the infrastructure and applications they have built over many years, many of them on z Systems mainframes. The key to the future is both on-premises and cloud, and IBM understands this with its hybrid cloud approach… as they clearly demonstrated at Think 2020.
My specific area of focus and expertise is the mainframe and Db2 for z/OS, so I sought out some sessions at Think in those areas. Let me tell you a bit about two of them.

First let’s take a quick look at how IBM Cloud Pak for Data can work with data on the Z platform. This information was drawn from IBM Distinguished Engineer Gary Crupi’s session, titled "Drive Actionable, Real-Time Insight from Your High-Value IBM Z Data Using IBM Cloud Pak for Data."

What is Cloud Pak for Data? Well, it is an IBM platform for unifying and simplifying the collection, organization, and analysis of data. Heretofore, it was mostly focused on non-mainframe platforms, but the latest release, version 3.0, is a major upgrade with an enhanced unified experience, expanded ecosystem, and optimized Red Hat integration. And it enables several ways for you to turn your enterprise data on IBM Z into actionable, real-time insight through the integrated cloud-native architecture of IBM Cloud Pak for Data.



Crupi’s session started out with the now familiar (at least to IBM customers and Think attendees) Ladder to AI and how Cloud Pak for Data helps enable customers’ journeys up the ladder. Data is the foundation for smart business decisions, and AI can unlock the value of this data.

He went on to discuss the continuing importance of the mainframe, citing facts including:
  •  70% of Fortune 500 companies use mainframe for their most critical business functions
  •  72% of customer-facing applications are completely or very dependent on mainframe processing
  •  The mainframe handles 1.1 million transactions per second (compared to Google’s roughly 60,000 searches per second)
  •  95% of transactions in the banking, insurance, airline and retail industries run on the mainframe

These are all good points, and things that mainframe users like to hear. It is good to see IBM promoting the ubiquity and capabilities of the mainframe.



Now, what about IBM Cloud Pak for Data better-exploiting mainframe data? Crupi goes back to the AI Ladder to talk about z/OS capabilities for analyzing and collecting data for AI.


Solutions such as Watson Machine Learning for z/OS, Db2 AI for z/OS, and QMF can be used for analyzing data, while Db2 for z/OS and its tools, IDAA, and Data Virtualization Manager can be used for data collection. These things already exist, but using them effectively with distributed platform capabilities will be crucial to climbing the ladder to AI.

IBM Cloud Pak for Data will leverage IBM Z technology to bring valuable IBM Z data into a modern analytics/AI platform. It can now exploit IBM Z data and resources where appropriate enabling you to further benefit from IBM Z technology and data.

A key new component of making the data on IBM Z accessible is IBM Db2 for z/OS Data Gate, a new product announced during Think 2020. Db2 Data Gate can help you reduce the cost and complexity of your data delivery with a simple, easy-to-deploy mechanism that delivers read-only access to Db2 for z/OS data. Instead of building and maintaining costly custom code, Db2 Data Gate does the work. Data can be synchronized between Db2 for z/OS data sources and target databases on IBM Cloud Pak for Data.


Instead of accessing data in the IBM Z data source directly, an application accesses a synchronized copy of the Db2 for z/OS data, hosted by a separate system. This target system can be established anywhere Cloud Pak for Data is supported, thus enabling a wide range of target platforms that include public cloud, on-premises, and private cloud deployments.


So IBM is helping you to expand the accessibility of your Z data.

And that brings me to the second session I’d like to briefly mention, Automate Your Mainframe z/OS Processes with Ansible [Session 6760]. 

Although Ansible is not a replacement for your operational mainframe automation tools, it can be used to communicate with and automate z/OS using out-of-the-box SSH into z/OS UNIX System Services to execute commands and scripts, submit JCL, and copy data. And Ansible has existing modules that can be used to make calls to the RESTful/SOAP APIs that are available in many z/OS products.
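An Ansible playbook would normally drive this, but to make the underlying mechanism concrete, here is a minimal Python sketch (my own illustration, not from IBM or the session) that does the same sort of thing by hand: it connects over SSH to z/OS UNIX System Services, runs a shell command, and submits a JCL job. The host name, user ID, key file, and JCL path are hypothetical placeholders, and it assumes the standard z/OS UNIX 'submit' command is available on the target system.

import os
import paramiko  # third-party SSH library

ZOS_HOST = "zos.example.com"   # hypothetical LPAR host name
ZOS_USER = "ibmuser"           # hypothetical user ID

def run_uss_command(client, command):
    """Run a shell command in z/OS UNIX System Services and return its stdout."""
    _stdin, stdout, stderr = client.exec_command(command)
    output = stdout.read().decode("utf-8", errors="replace")
    errors = stderr.read().decode("utf-8", errors="replace")
    if errors:
        print("stderr from '%s':\n%s" % (command, errors))
    return output

def main():
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(ZOS_HOST, username=ZOS_USER,
                   key_filename=os.path.expanduser("~/.ssh/id_rsa"))
    try:
        # Run a simple command in USS...
        print(run_uss_command(client, "uname -a"))
        # ...and submit a batch job from a JCL file in the USS file system
        # (path is a placeholder; assumes the z/OS UNIX 'submit' command).
        print(run_uss_command(client, "submit /u/ibmuser/jcl/myjob.jcl"))
    finally:
        client.close()

if __name__ == "__main__":
    main()

In practice, Ansible modules and playbooks wrap this kind of plumbing so that it is declarative, repeatable, and auditable across many systems.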


Ansible can be beneficial for orchestrating across platforms, including Z systems, and for simplifying configuration and deployment management. But keep in mind that Ansible is a proactive framework for automation; it is not intended to replace automation solutions that monitor and react.

Here is a nice, but by no means exhaustive, list of examples showing how Ansible can be used to interact with popular z/OS products.


The Bottom Line

The IBM Think 2020 conference was a great success considering how rapidly IBM had to move to convert it from an in-person event, to an online, virtual one. And the content was informative, entertaining, and had something for everybody. I hope you enjoyed my take on the event… feel free to share your comments below on anything I’ve written here, or on your experiences at the event.


Thursday, August 15, 2019

BMC AMI for DevOps Intelligently Integrates Db2 for z/OS Schema Changes

Organizations of all types and sizes have adopted a DevOps approach to building applications because it effectively implements small and frequent code changes using agile development techniques. This approach can significantly improve the time to value for application development. The DevOps approach is quite mature on distributed platforms, but it is also gaining traction on the mainframe.

As mainframe development teams begin to rely on DevOps practices more extensively, the need arises to incorporate Db2 for z/OS database changes. This capability has been lacking until recently, requiring manual intervention by the DBA team to analyze and approve schema changes. This, of course, slows things down, the exact opposite of the desired impact of DevOps. But now BMC has introduced a new solution that brings automated Db2 schema changes to DevOps, namely BMC AMI for DevOps.

BMC AMI for DevOps is designed to integrate into the DevOps tooling that your developers are already using. It integrates with the Jenkins Pipeline tool suite to provide an automated method of receiving, analyzing, and implementing Db2 schema changes as part of an application update.

By integrating with your application orchestration tools, BMC AMI for DevOps can capture the database changes required to move from test to production. But it does not just apply these changes; it enforces and ensures best practices using built-in intelligence and automated communication between development and database administration.

The ability to enforce best practices is driven by BMC’s Automated Mainframe Intelligence (AMI), which is policy driven. The AMI capability builds much of the DBA oversight for schema changes into the DevOps pipeline, enforcing database design best practices as you go instead of requiring in-depth manual DBA oversight.

Incorporating a database design advisory capability into the process offloads manual, error-prone tasks to the computer. This integrated automation enables automatic evaluation of Db2 database schema change requests to streamline the DBA approval process and remove the manual processes that inhibit continuous delivery of application functionality.
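BMC has not published the internals of its policy engine, so to make the idea of a policy-driven schema check concrete, here is a purely illustrative Python sketch of the kind of rule evaluation a pipeline stage could run against a proposed DDL change before deciding whether it can be auto-approved or must be routed to a DBA. The rules and names are hypothetical assumptions of mine, not BMC's.

import re

# Illustrative only: example policies a DBA team might codify in a pipeline.
DESTRUCTIVE_STATEMENTS = ("DROP TABLE", "DROP TABLESPACE")
INDEX_NAME_RULE = re.compile(r"CREATE\s+(?:UNIQUE\s+)?INDEX\s+\w+\.(X\w+)", re.IGNORECASE)

def check_ddl(ddl, target_env="PROD"):
    """Return a list of policy violations for a proposed DDL statement."""
    violations = []
    upper = ddl.upper()

    # Policy 1: destructive changes are never auto-approved in production.
    if target_env == "PROD" and any(stmt in upper for stmt in DESTRUCTIVE_STATEMENTS):
        violations.append("Destructive change requires explicit DBA review.")

    # Policy 2: new indexes must follow the (hypothetical) 'X...' naming standard.
    if upper.strip().startswith("CREATE") and "INDEX" in upper:
        if not INDEX_NAME_RULE.search(ddl):
            violations.append("Index name does not follow the site naming standard.")

    # Policy 3: added LOB columns get flagged for DBA review.
    if re.search(r"ADD\s+COLUMN\s+\w+\s+(?:BLOB|CLOB)", upper):
        violations.append("LOB column added; confirm storage and logging impact with a DBA.")

    return violations

if __name__ == "__main__":
    proposed = "CREATE INDEX DBA1.CUSTIX1 ON DBA1.CUSTOMER (CUST_ID ASC)"
    problems = check_ddl(proposed)
    if problems:
        print("Route to DBA for review:")
        for p in problems:
            print(" -", p)
    else:
        print("Auto-approved by policy checks.")

In a real toolchain the result would feed the pipeline and the DBA's work queue rather than being printed, but the principle is the same: codified standards catch the routine cases automatically and reserve human review for the exceptions.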

Furthermore, consider that intelligent database administration functionality can be used to help alleviate the loss of expertise resulting from an aging, retiring workforce. This is a significant challenge for many organizations in the mainframe world.

But let’s not forget the developers. The goal of adopting a DevOps approach on the mainframe is to speed up application development, but at the same time it is important that we do not forgo the safeguards built into mainframe development and operations. So you need a streamlined DevOps process—powered by intelligent automation—in which application developers do not have to wait around for DBA reviews and responses. A self-service model with built-in communication and intelligence such as provided by AMI for DevOps delivers this capability.

The Bottom Line

BMC AMI for DevOps helps you bring DevOps to the mainframe by integrating Db2 for z/OS schema changes into established and existing DevOps orchestration processes. This means you can use BMC AMI for DevOps to deliver the speed of development required by agile techniques for modern application delivery without abandoning the safeguards DBAs require to assure the accuracy of database changes and the availability and reliability of the production system. And developers gain more self-service capability for Db2 schema changes using a well-defined pipeline process.

Friday, August 01, 2014

DB2 Health Checks - Part Two

In the first part of this series on DB2 health checks, DB2 Health Checks - Part One, I discussed the general concept of a health check and its basic importance in terms of maintaining a smooth-running DB2 environment.

Today, I want to briefly look at how DB2 health checks are usually done... if they are done at all.

The Scope of a DB2 Health Check

Some people mistakenly view a DB2 Health Check as being performance-focused only. Yes, performance is an important aspect of a health check -- and I admit that performance is generally the area that causes an organization to undergo the health check process. But the overall health of the DB2 environment needs to be addressed by the health check. In addition to performance-related issues (system, database and application), this can include:


  • availability
  • fault tolerance
  • recoverability
  • use of automation
  • process review
  • documentation
  • people skills (DBA, sysprog, development, etc.)
Considerations Before Undergoing a DB2 Health Check

DB2 health checks are important and crucial to the on-going stability of your systems, but there are issues:
  • Health checks can be costly (consulting engagements)
  • When a consulting company conducts a health check the analysis usually is done off-site, so your DBAs do not learn the techniques used by the consultants as they massage and analyze the data
  • Health checks generally are valid for a specific point-in-time and can become obsolete quickly

Conducting DB2 Health Checks

DB2 health checks typically are conducted by IBM personnel, a DB2 consultant, or a larger services firm. The engagement begins with experts/consultants interviewing the DBAs, submitting questionnaires as needed and collecting data from DB2. After collecting the data the consulting team goes off site and analyzes the reams of collected data. There may be intermittent communication between the consulting team and the on-site DBAs to clear up any lingering questions or to clarify things during the analysis phase. After some time (usually a week or more), a report on the health of your DB2 environment, perhaps with some recommendations to implement, is delivered.

What happens next is all up to you. After reading the report you can ignore it, implement some or all of the recommendations, conduct further in-house investigation for the feasibility of implementing the recommendations, or send it along to management for their perusal. But there is a deadline involved. After all, your systems are not static. So the health check report is only as good as the point-in-time for which it was delivered. Time, as it always does, will creep up on you. If you wait too long, the recommendations become stale and you might not be doing the proper thing for your environment by implementing changes based on old information.

Of course, when too much time has gone by after the health check, you could always engage with the services company and consultants again, requiring additional spending.

Is there another way?

Stay tuned, as we'll look at some other options in upcoming installments of this blog series on DB2 health checking...

Friday, March 21, 2014

DB2 Tool Requirements

The last blog post here at the DB2 Portal offered up a brief overview of the types of tools that you might want to consider to help you use, manage, and administer your DB2 applications and databases. But it did not really look into the capabilities and requirements for modern DB2 tools and solutions.
Today’s DB2 management and administration tools should provide intelligent automation to reduce the problems inherent in the tedious day-to-day tasks of database administration. Simple automation is no longer sufficient. Modern data management software must be able to intelligently monitor, analyze, and optimize applications using past, present, and future analysis of collected data. Simply stated, the software should work the way a consultant works, fulfilling the role of a trusted advisor and enabling your precious human resources to spend time on research, strategy, planning, and implementing new and advanced features and technologies instead of rote day-to-day tasks.
Furthermore, modern database tools should provide cross-platform, heterogeneous management. For most medium-to-large IT organizations it is not enough to manage just DB2 for z/OS systems, for example. The ability to offer administrative and development assistance across multiple DBMS platforms (for example, DB2 for LUW, Oracle, SQL Server, MySQL, and so on) is essential. Most companies have multiple DBMSs that need to be managed -- not just one... and DBAs and developers are precious resources that increasingly are being asked to work on more than just a single DBMS. When the tools can manage across platforms, the learning curve is reduced and productivity can be enhanced.
And while it is true that today’s DBMS products are becoming more self-managing, they do not yet provide out-of-the-box, lights-out operation, nor do they offer all of the speed, usability, and ease of use features of ISV admin, management, and development tools. An organization looking to provide 24/7 data availability coupled with efficient performance will have to augment the capabilities of their DBMS software with data management and DBA tools to get the job done.
As data management tasks get more complex and DBAs become harder to find and retain, more and more database maintenance duties should be automated using intelligent management software. Using intelligent, automated DB2 tools will help to reduce the amount of time, effort, and human error associated with implementing and managing efficient database applications.

Monday, November 18, 2013

DBA Rules of Thumb - Part 2 (Automate)

Why should you do it by hand if you can automate DBA processes? Anything you can do probably can be done better by the computer – if it is programmed to do it properly. And once it is automated you save yourself valuable time. And that time can be spent tackling other problems, learning about new features and functionality, or training others.


Furthermore, don’t reinvent the wheel. Someone, somewhere, at some time may have already solved the problem you currently are attempting to solve. Look to the web for sites that allow you to download and share scripts. Or, if you have the budget, look to purchase DBA tools from ISVs. There are a lot of good tools out there, available from multiple vendors, that can greatly simplify the task of database administration. Automating performance management, change management, backup and recovery, and other tasks can help to reduce the amount of time, effort, and human error involved in managing database systems.

Of course, you can take the automation idea too far. There has been a lot of talk and vendor hype lately about self-managing database systems. For years now, pundits and polls have been asking when automation will make the DBA job obsolete. The correct answer is "never" - or, at least, not any time soon.

There are many reasons why DBAs are not on the fast path to extinction. Self-managing database systems are indeed a laudable goal, but we are very far away from a “lights-out” DBMS environment. Yes, little-by-little and step-by-step, database maintenance and performance management are being improved, simplified, and automated. But you know what? DBAs will not be automated out of existence in my lifetime – and probably not in your children’s lifetime either.
Many of the self-managing features require using the built-in tools from the DBMS vendor, but many organizations prefer to use heterogeneous solutions that can administer databases from multiple vendors (Oracle, DB2, SQL Server, MySQL, etc.) all from a single console. Most of these tools have had self-managing features for years and yet they did not make the DBA obsolete.

And let’s face it, a lot of the DBMS vendors’ claims are more hyperbole than fact. Some self-managing features are announced years before they become generally available in the DBMS. All vendor claims to the contrary, no database today is truly 100% self-managing. Every database needs some amount of DBA management – even when today’s best self-management features are being used.

What about the future? Well, things will get better – and probably more costly. You don’t think the DBMS vendors are building this self-management technology for free, do you? But let’s remove cost from the equation for a moment. What can a self-managing database actually manage?

Most performance management solutions allow you to set performance thresholds. A threshold allows you to set up a trigger that says something like “When x% of a table’s pages contain chained rows or fragmentation, schedule a reorganization.” But these thresholds are only as good as the variables you set and the actions you define to be taken upon tripping the threshold. Some software is bordering on intelligent; that is, it “knows” what to do and when to do it. Furthermore, it may be able to learn from past actions and results. The more intelligence that can be built into a self-managing system, the better the results typically will be. But who among us currently trusts software to work like a grizzled veteran DBA? The management software should be configurable such that it alerts the DBA as to what action it wants to take. The DBA can review the action and give a “thumbs up” or “thumbs down” before the corrective measure is applied. In this way, the software can earn the DBA’s respect and trust. When the DBA trusts the software, he can turn it on so that it self-manages “on the fly” without DBA intervention. But today, in most cases, a DBA is required to set up the thresholds, as well as to ensure their on-going viability.
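To make the threshold idea concrete, here is a minimal Python sketch, under my own assumptions, of the check-and-approve loop just described. The statistics are hard-coded stand-ins for what the catalog or a monitoring tool would supply, the 10% threshold is arbitrary, and the interactive prompt stands in for whatever approval mechanism (ticket, console, UI) a real product would use.

# Threshold-driven maintenance with a DBA approval step (illustrative values).
REORG_THRESHOLD_PCT = 10.0   # e.g., recommend a REORG above 10% relocated rows

tablespace_stats = [
    {"name": "DBPROD.TSCUST",  "relocated_row_pct": 14.2},
    {"name": "DBPROD.TSORDER", "relocated_row_pct": 3.1},
]

def recommend_reorgs(stats, threshold):
    """Return the objects whose fragmentation exceeds the threshold."""
    return [ts["name"] for ts in stats if ts["relocated_row_pct"] > threshold]

def dba_approves(action):
    """Stand-in approval step: in practice this might be a ticket or a console."""
    answer = input("Approve '%s'? [y/N] " % action).strip().lower()
    return answer == "y"

if __name__ == "__main__":
    for ts_name in recommend_reorgs(tablespace_stats, REORG_THRESHOLD_PCT):
        action = "REORG TABLESPACE " + ts_name
        if dba_approves(action):
            print("Scheduling:", action)   # a real tool would submit the utility job
        else:
            print("Skipped:", action)

Only once the software has earned that kind of trust does it make sense to remove the prompt and let it act on its own.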

Of course, not all DBA duties can be self-managed by software. Most self-management claims are made for performance management, but what about change management? The DBMS cannot somehow read the mind of its user and add a new column or index, or change a data type or length. This non-trivial activity requires a skilled DBA to analyze the database structures, develop the modifications, and deploy the proper scripts or tools to implement the change. Of course, software can help simplify the process, but software cannot replace the DBA.

Furthermore, database backup and recovery will need to be guided by the trained eye of a DBA. Perhaps the DBMS can become savvy enough to schedule a backup when a system process occurs that requires it. Maybe the DBMS of the future will automatically schedule a backup when enough data changes. But sometimes backups are made for other reasons: to propagate changes from one system to another, to build test beds, as part of program testing, and so on. A skilled professional is needed to build the proper backup scripts, run them appropriately, and test the backup files for accuracy. And what about recovery? How can a damaged database know it needs to be recovered? Because the database is damaged any self-managed recovery it might attempt is automatically rendered suspect. Here again, we need the wisdom and knowledge of the DBA.

And there are many other DBA duties that cannot be completely automated. Because each company is different, the DBMS must be customized using configuration parameters. Of course, you can opt to use the DBMS “as is,” right out-of-the-box. But a knowledgeable DBA can configure the DBMS so that it runs appropriately for their organization. Problem diagnosis is another tricky subject. Not every problem is readily solvable by developers using just the Messages and Codes manual and a help desk. What happens with particularly thorny problems if the DBA is not around to help?

Of course, the pure, heads-down systems DBA may (no, let's say should) become a thing of the past. Instead, the modern DBA will need to understand multiple DBMS products, not just one. DBAs furthermore must have knowledge of the business impact of each database under their care (more details here). And DBAs will need better knowledge of logical database design and data modeling – because it will advance their understanding of the meaning of the data in their databases.

Finally, keep in mind that we didn't automate people out of existence when we automated HR or finance. Finance and HR professionals are doing their jobs more efficiently and effectively, and they have the ability to deliver a better product in their field. That's the goal of automation. So, as we automate portions of the DBA’s job, we'll have more efficient data professionals managing data more proficiently.


This blog entry started out as a call to automate, but I guess it kinda veered off into an extended dialogue on what can, and cannot, be accomplished with automation. I guess the bottom line is this... Automation is key to successful, smooth-running databases and applications... but don't get too carried away by the concept.


I hope you found the ideas here to be useful... and feel free to add your own thoughts and comments below! 



Tuesday, September 30, 2008

A Perfect Storm?

There is something of a perfect storm brewing in the world of data today. The world is becoming more automated, more connected, more wireless, and more complex. The next wave of database administration is intelligent automation. I refer to this as implementing software scrubbing bubbles that “work hard, so you don’t have to.” (Remember that commercial!)

As more of the tasks required of DBAs become more automated, the DBA will be freed to expand into other areas. So one front on this storm is the autonomic computing initiatives that automate DBA tasks. At the same time, IT professionals are being asked to know more about the business instead of just knowing the technology. So DBAs need to understand the business purpose and definition of the data they manage, as well as the technological underpinnings of the DBMS. The driving force here is predominantly regulatory compliance. This second front of the perfect storm will cause DBAs to work more closely with metadata to drive database archiving, data auditing, and security to ensure their organization complies with regulations like Sarbanes-Oxley, HIPAA, and others.

Regarding the wireless aspect of things, pervasive devices (PDA, handhelds, cell phones, etc.) will increasingly interact with database systems. DBAs will need to get involved there to ensure successful data synchronization. And database systems will work with disconnected data seamlessly, such as data generated by RFID tags.

Yet another big database trend is technology "suck." By that I mean the way the DBMS sucks up technologies and functions that previously required you to purchase separate software. Remember when the DBMS had no ETL or OLAP functionality? Those days are gone. This will continue as the DBMS adds capabilities to tackle more and more IT tasks.

Another trend impacting DBAs will be a change in some of their roles as more and more of the recent DBMS features actually start being used in more production systems.

The net result of this perfect storm of changes is that data professionals are absolutely being required to do more... sometimes with less (less time, less money, less staff, etc.)

If you know the technology but are then required to know the business, this is doing more – much more. But the technology, in many cases, is also expanding. For example, DB2 9 incorporates native XML. Most DBAs are not XML savvy, but increasingly they will have to learn more about XML as the DBMS technology expands. And this is just one example.

Additionally, data is growing at an ever-increasing rate. Every year the amount of data under management increases (some analysts peg the compound annual rate of data growth at 125%) and in many cases the number of DBAs to manage that growing data is not increasing, and indeed, could be decreasing.

And, budgetary limitations can cause DBAs to have to do more work, to more data, with fewer resources. When a company reduces budget but demands more work, automation is an absolute necessity. Turning work over to the computer can help (although it is unlikely to solve every administrative issue). Sometimes IT professionals fight against the very thing they excel in – that is, automating work. If you think about it, every computer program is written to automate someone’s work – the writer (word processing), the accountant (financials, payroll, spreadsheets), and so on. This automation did not put the professionals whose work was automated out of a job; instead it made them more efficient. Yet, for some reason, there is a notion in the IT industry that automating IT tasks will eliminate jobs. You cannot automate a DBA out of existence – but you can make that DBA’s job more effective and efficient with DBA tools and autonomic computing.

And the sad truth of the matter is that there is still a LOT more that can, and should, be done in most companies. We can start with better automation of DBA tasks, but we shouldn't stop there!

Corporate governance is hot – that is, technologies to help companies comply with governmental regulations. Software to enable archiving for long-term data retention, auditing to determine who did what to which piece of data, and security to better protect data are all hot data technologies right now. But database security needs to improve, and technologies for securing and auditing data need to be implemented more pervasively.

Metadata is increasing in importance. As data professionals really begin to meld together technology and business, they find that metadata is imperative. But most organizations do not have a fully populated, up-to-date metadata repository that acts as a lexicon for business data.

And finally, something that isn’t nearly hot enough is data quality and integrity. Tools, processes, and database options that can be used to make data more accurate and reliable are not implemented appropriately with any regularity. So the data stored in our corporate databases is suspect. According to Thomas C. Redman, data quality guru, poor data quality costs the typical company at least ten percent (10%) of revenue. That is a significant cost! Data quality is generally bad in most organizations – and more needs to be done to address that problem.

With all of these thoughts in mind, are you prepared to weather this perfect storm?