Thursday, August 15, 2019

BMC AMI for DevOps Intelligently Integrates Db2 for z/OS Schema Changes

Organizations of all types and sizes have adopted a DevOps approach to building applications because it effectively implements small and frequent code changes using agile development techniques. This approach can significantly improve the time to value for application development. The DevOps approach is quite mature on distributed platforms, but it is also gaining traction on the mainframe.

As mainframe development teams begin to rely on DevOps practices more extensively, the need arises to incorporate Db2 for z/OS database changes. This capability has been lacking until recently, requiring manual intervention by the DBA team to analyze and approve schema changes. That, of course, slows things down, the exact opposite of the desired impact of DevOps. But now BMC has introduced a new solution that brings automated Db2 schema changes to DevOps: BMC AMI for DevOps.

BMC AMI for DevOps is designed to integrate into the DevOps tooling that your developers are already using. It integrates with the Jenkins Pipeline tool suite to provide an automated method of receiving, analyzing, and implementing Db2 schema changes as part of an application update.

By integrating with your application orchestration tools, BMC AMI for DevOps can capture the database changes required to move from test to production. But it does not just apply these changes; it enforces best practices using built-in intelligence and automated communication between development and database administration.

The ability to enforce best practices is driven by BMC’s Automated Mainframe Intelligence (AMI), which is policy driven. The AMI capability builds much of the DBA oversight for schema changes into the DevOps pipeline, enforcing database design best practices as you go instead of requiring in-depth manual DBA oversight.

Incorporating a database design advisory capability into the process offloads manual, error-prone tasks to the computer. This integrated automation enables automatic evaluation of Db2 database schema change requests to streamline the DBA approval process and remove the manual processes that inhibit continuous delivery of application functionality.

Furthermore, consider that intelligent database administration functionality can be used to help alleviate the loss of expertise resulting from an aging, retiring workforce. This is a significant challenge for many organizations in the mainframe world.

But let’s not forget the developers. The goal of adopting a DevOps approach on the mainframe is to speed up application development, but at the same time it is important that we do not forgo the safeguards built into mainframe development and operations. So you need a streamlined DevOps process—powered by intelligent automation—in which application developers do not have to wait around for DBA reviews and responses. A self-service model with built-in communication and intelligence such as provided by AMI for DevOps delivers this capability.

The Bottom Line

BMC AMI for DevOps helps you bring DevOps to the mainframe by integrating Db2 for z/OS schema changes into your existing DevOps orchestration processes. This means you can use BMC AMI for DevOps to deliver the speed of development that agile techniques demand for modern application delivery, without abandoning the safeguards DBAs require to ensure the accuracy of database changes and the availability and reliability of the production system. And developers gain more self-service capability for Db2 schema changes using a well-defined pipeline process.

Thursday, August 01, 2019

DevOps is Coming to Db2 for z/OS


Mainframe development teams are relying on DevOps practices more extensively, bringing the need to incorporate Db2 for z/OS database changes into the toolset that is supporting their software development lifecycle (SDLC).

But most mainframe professionals have only heard a little about DevOps and are not really savvy as to what it entails. DevOps is an amalgamation of Development and Operations. The goal of DevOps is to increase collaboration between developers and operational support and management professionals, with the desired outcome of faster, more accurate software delivery.

DevOps typically relies on agile development, coupled with a collaborative approach between development and operations personnel during all stages of the application development lifecycle. The DevOps approach results in small and frequent code changes and it can significantly reduce the lead time for changes, lower the rate of failure, and reduce the mean time to recovery when errors are encountered. These are all desirable qualities, especially as organizations are embracing digital transformation driven by the 24/7 expectations of users and customers to access data and apps at any time from any device.

The need to be able to survive and thrive in the new digital economy has caused organizations to adopt new and faster methods of developing, testing and delivering application software. Moving from a waterfall software development methodology to an agile methodology is one way that organizations are speeding the time-to-delivery of their software development. Incorporating a DevOps approach is another.

Instead of long software development projects that may not deliver value for months, or perhaps even years (common with the waterfall development methodology), an agile DevOps approach delivers value quickly, and then incrementally over time. DevOps enables the continuous delivery of new functionality demanded by customers in the digital economy.

Succeeding with DevOps, however, requires a cultural shift in which all groups within IT work in collaboration with one another, and where management endorses and cultivates this cultural change. Because DevOps relies upon incremental development and rapid software delivery, your IT department can only thrive if there is a culture of accountability, collaboration, and team responsibility for desired business outcomes. Furthermore, it requires solid, integrated automated tooling to facilitate the SDLC from development, through testing, to delivery. Creating such an environment and culture can be challenging.

With DevOps the result will be a constantly repeating cycle of continuous development, continuous integration and continuous deployment. This is typically depicted graphically as the infinity symbol such as in Figure 1 (below).

Figure 1 - continuous development, integration and deployment


Note, however, that this particular iteration of the DevOps infinity graphic calls out the participation of both the application and the database. This is an important, though often lacking, detail that should be stressed when adopting DevOps practices.

The Mainframe and DevOps

The adoption of DevOps has, until now, been much slower within mainframe development teams than for distributed and cloud application development. The staid nature of mainframe development and support, coupled with a glass house mentality and a rigid production turnover process, has contributed to the delayed adoption of DevOps on the mainframe. This is not surprising, as mainframes are mostly used by large, risk-averse organizations running mission-critical workloads with an aversion to any kind of change.

Additionally, the traditional waterfall development methodology has been used by most mainframe software developers for multiple decades, whereas DevOps is closely aligned with an agile approach, which differs significantly from waterfall.

Notwithstanding all of these barriers to acceptance of DevOps on the mainframe, mainframe developers can, and in some cases already do, successfully utilize a DevOps approach. Technically speaking, the mainframe is just another platform and there is nothing inherent in its design or usage that obviates the ability to participate in a DevOps approach to application development and delivery.

What about Db2 for z/OS?

Integrating database change into the application delivery lifecycle can be a stumbling block on the road to DevOps success. Development teams focus on application code, as they should, and typically view database structure changes as ancillary to their coding efforts. In most application development projects, it is not the programmer’s responsibility to administer the database and modify database structures. But applications rely on the database being designed, implemented, and changed in accordance with the needs of the business and the code.

Many development projects have automated their SDLC tool chain to speed up the delivery of applications. This is the “Dev” portion of DevOps. But the requisite automation and tooling has not been implemented as pervasively to speed up the delivery of database changes. This is the “Ops” portion of DevOps. And this is changing.

A big consideration is that the manner in which change is applied to applications differs from how database changes are applied. That means each must be managed using different techniques and probably different tools. When an application program changes, the code is compiled, and the load module is migrated from test to production. The old load module is saved for posterity in case the change needs to be backed out, but the change is a wholesale replacement of the executable code.

Database changes are different. The database is an entire configuration in each environment and changes get migrated. There is no wholesale replacement of the database structures. DDL commands are issued to ALTER, DROP, and CREATE the changes to the database structures as needed.
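
For example, an additive change can be migrated in place. Here is a minimal DDL sketch, assuming a hypothetical CUSTOMER table:

    -- Additive change, applied in place in each environment
    -- rather than replacing the structure wholesale:
    ALTER TABLE PROD.CUSTOMER
      ADD COLUMN LOYALTY_TIER CHAR(1) NOT NULL DEFAULT 'B';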

From the perspective of database changes on Db2 for z/OS, DBAs need the ability to modify all the database objects supported by Db2 for z/OS. Supporting Db2 for z/OS using DevOps requires tooling that understands both Db2 for z/OS and the DevOps methodology and toolchain. And the tooling must understand how changes are made, as well as any underlying changes that may be required to effectively implement the database change. Some types of database changes are intrusive, requiring a complicated series of unloads, metadata captures, drops, creates, loads, and additional steps to implement. The tooling must be capable of making any of these changes in an automated way that the DBA trusts.
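
By contrast, here is a simplified outline of an intrusive change, such as shrinking or retyping a column. The object names are hypothetical, and the unload/load steps appear only as comments because the exact utility syntax varies by shop:

    -- 1. UNLOAD the data and capture the current DDL, along with
    --    dependent views, indexes, and authorizations.
    -- 2. Drop and re-create the table with the new definition:
    DROP TABLE PROD.CUSTOMER;
    CREATE TABLE PROD.CUSTOMER
      (CUST_ID   INTEGER      NOT NULL,
       CUST_NAME VARCHAR(100) NOT NULL)   -- was CHAR(40)
      IN MYDB.CUSTTS;
    -- 3. LOAD the unloaded data back in, re-create the dependent
    --    objects, rebind invalidated packages, and run RUNSTATS.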

Fortunately, for organizations adopting DevOps on the mainframe with Db2, there is a solution for integrating Db2 database change into the DevOps toolchain: BMC AMI DevOps for Db2. BMC AMI DevOps for Db2 integrates with Jenkins, an application development orchestration tool, to automatically research and determine database schema change requirements, to streamline the review and approval process, and to safely implement the database schema changes, making development and operations teams more efficient and agile.

Monday, July 29, 2019

Webinar: DevOps and Database Change Management for Db2 for z/OS - August 13, 2019

DevOps practices are gaining popularity on all development platforms and the mainframe is no exception. DevOps relies heavily on agile development and automated software delivery. However, the ability to integrate and orchestrate database changes has lagged. To learn more about DevOps, change management, and Db2 for z/OS, I am delivering a webinar on this topic along with John Barry of BMC. We will discuss issues including an overview of DevOps, the requirements for database change management, and an introduction to BMC’s new AMI DevOps for Db2 that solves the change management dilemma for Db2 for z/OS development. You can register today to attend the webinar on August 13, 2019 (Noon Central) at https://event.webcasts.com/starthere.jsp?ei=1251892&tp_key=3ff9b7af72.

Tuesday, July 16, 2019

Proud to be an IBM Champion

Just a quick post today about the IBM Champions program, which, if you haven't heard of it, is a special program run by IBM to recognize and reward non-IBM thought leaders for their work associated with IBM products and communities.

IBM publishes the list of IBM Champions annually and the title is valid for one year. So, champions must be nominated each year to maintain their status.

I want to thank IBM for running such a wonderful program and for all they have done to help recognize those of us in the trenches using IBM's technology. I have been named an IBM Champion for Data and Analytics again this year... for the 10th time. So IBM bestowed upon me this Acclaim badge:


As an IBM Champion I have had the opportunity to interact with IBM folks and with other IBM Champions at events, webinars, and in person, and it has definitely helped to enrich my professional life.

Although the majority of IBM Champions focus on data and analytics, the program is not just for data people! IBM names champions in each of the following nine categories: 
  • Data & Analytics
  • Cloud 
  • Collaboration Solutions 
  • Power Systems 
  • Storage 
  • IBM Z 
  • Watson IoT 
  • Blockchain 
  • Security 
If you are, or know of, somebody who should be an IBM Champion, you can nominate them here: https://developer.ibm.com/champions/.

Thanks again, IBM... and congratulations to all of this year's IBM Champions.

Wednesday, July 10, 2019

There’s a New Db2 12 for z/OS Function Level (505)


In late June 2019, IBM delivered more great new capabilities in the latest function level for Db2 12 for z/OS: Function Level 505 (or FL505).

If you do not know what a function level is, then you probably aren't yet on Version 12, because function levels are how new capabilities are rolled out for Db2 12 and beyond; they are how IBM has enabled a continuous delivery model for Db2 functionality. You can learn more about function levels here.
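
To make that concrete, activating a function level and opting applications in are each a single statement. This is a sketch; the levels you can activate depend on your catalog and code level:

    -- Db2 command to activate the new function level (all members
    -- must already be at the required code level):
    -ACTIVATE FUNCTION LEVEL (V12R1M505)

    -- Applications then opt in to the new SQL capabilities via the
    -- APPLCOMPAT bind option or the special register:
    SET CURRENT APPLICATION COMPATIBILITY = 'V12R1M505';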

Although the first link above goes into all of the gory details of the new functionality, I will take a bit of time to summarize the highlights of this new function level.

The first thing that will appeal to most Db2 users is improved performance. And FL505 delivers improved performance in two areas: HTAP and RUNSTATS.
  • For HTAP, FL505 reduces the latency between Db2 and the IBM Db2 Analytics Accelerator (sometimes called IDAA), so transactional and analytical applications can work with the same, more current data.
  • For RUNSTATS, FL505 changes the default RUNSTATS behavior to use page sampling for universal table spaces (unless the RUNSTATS specification explicitly states TABLESAMPLE SYSTEM with a value other than AUTO). This will boost RUNSTATS performance; see the sketch after this list. (A nice description of this is provided by Peter Hartmann here.)

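Here is roughly what the new default amounts to, with hypothetical database and table space names:

    -- Under FL505 this samples pages by default, as if
    -- TABLESAMPLE SYSTEM AUTO had been specified:
    RUNSTATS TABLESPACE MYDB.MYTS TABLE(ALL)

    -- To opt out of sampling, state it explicitly:
    RUNSTATS TABLESPACE MYDB.MYTS TABLE(ALL) TABLESAMPLE SYSTEM NONE
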
FL505 also delivers REBIND phase-in for executing packages. Waiting for a package to be idle (not running) has long been a deterrent to rebinding. Now, you can REBIND a package while it is running. Db2 makes this happen by creating a new copy of the package. When the REBIND completes, new executions of the package use the newly rebound copy, while threads already running with the old copy continue to do so successfully until completion.
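
The command itself is unchanged; under FL505 the phase-in happens automatically whenever the package is in use. A sketch, with a hypothetical collection and package:

    -- Rebind even while the package is executing; in-flight threads
    -- finish on the old copy, new threads get the new one:
    REBIND PACKAGE(COLL1.MYPKG) APPLCOMPAT(V12R1M505)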

We also get some new built-in functions (BIFs) in FL505 for encrypting and decrypting data using key labels. You may be aware that Db2 already had functions for encryption and decryption, but those functions, introduced back in V9, were not very capable because they required you to provide and manage a password to decrypt the data. The new functions work with key labels: ENCRYPT_DATAKEY encrypts plain text into a block of encrypted text using a specified algorithm and key label, and DECRYPT_DATAKEY returns the block of data decrypted to the specified data type.
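
Here is a rough sketch of usage. The table, key label, and algorithm constant are assumptions made for illustration; check the Db2 SQL Reference for the exact signatures and the data-type-specific DECRYPT_DATAKEY variants:

    -- Encrypt a value under a key label ('AES256R' is an assumed
    -- algorithm constant):
    UPDATE PROD.EMPLOYEE
       SET SSN_ENC = ENCRYPT_DATAKEY(SSN, 'MY.KEY.LABEL', 'AES256R')
     WHERE EMP_ID = 12345;

    -- Decrypt back to a given data type with the matching variant;
    -- the key label travels with the encrypted block:
    SELECT DECRYPT_DATAKEY_VARCHAR(SSN_ENC)
      FROM PROD.EMPLOYEE
     WHERE EMP_ID = 12345;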

And with FL505 we finally get additional functionality for the DECFLOAT data type. DECFLOAT was introduced in DB2 9 for z/OS, but it is not widely used because of some shortcomings. But first, what is DECFLOAT? Well, DECFLOAT is basically a combination of the DECIMAL and floating-point data types: a decimal floating-point data type. It is specified as DECFLOAT(n), where the value of n can be either 16 or 34, representing the number of significant digits that can be stored. A decimal floating-point value is an IEEE 754r number with a decimal point, and it can be useful for storing and managing very large numbers.

So what is the improvement? Quite simply, it is now possible to specify columns defined as DECFLOAT in an index and as a key in a primary key or a unique key. Unfortunately, there is still no support for DECFLOAT usage in COBOL programs, which will likely continue to hinder its uptake in many shops.
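
In practical terms, DDL along these lines is now possible; the table and index are hypothetical:

    CREATE TABLE PROD.MEASUREMENTS
      (SENSOR_ID INTEGER      NOT NULL,
       READING   DECFLOAT(34) NOT NULL);

    -- Indexing a DECFLOAT column was not permitted before FL505:
    CREATE INDEX PROD.MEAS_IX
      ON PROD.MEASUREMENTS (READING);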

And finally, FL505 improves temporal support for triggers. It delivers the capability to reference system-period temporal tables and archive-enabled tables in the WHEN clause of your triggers.
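
A sketch of what that permits follows; the tables and business rule are invented for illustration, with POLICY assumed to be a system-period temporal table:

    CREATE TRIGGER CLAIM_CHK
      AFTER UPDATE ON CLAIMS
      REFERENCING NEW AS N
      FOR EACH ROW MODE DB2SQL
      -- The WHEN clause may now reference a system-period temporal
      -- table such as POLICY:
      WHEN (NOT EXISTS (SELECT 1
                          FROM POLICY
                         WHERE POLICY_ID = N.POLICY_ID))
      BEGIN ATOMIC
        SIGNAL SQLSTATE '75001' ('No matching policy for updated claim');
      END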

Summary

IBM is using function levels to deliver significant new capabilities for Db2 12 for z/OS. It is important for you and your organization to keep up-to-date on this new functionality and to determine where and when it makes sense to introduce it into your Db2 databases and applications.

Also, be aware that if you are not currently running at FL504, moving to FL505 activates all earlier function levels. You can find a list of all the current function levels here.