Wednesday, February 22, 2023

AI and ChatGPT Tell You Why You Should Use a Mainframe

ChatGPT has been getting a lot of attention for its ability to formulate reasonable responses to questions and inquiries. There have even been examples of ChatGPT being able to pass college exams and write up readable articles.

ChatGPT is a chatbot developed by OpenAI. It was launched in November of 2022. It is built on top of OpenAI's GPT-3 family of large language models and has been fine-tuned using both supervised and reinforcement learning techniques.

So it is a somewhat practical example of AI being applied to a common problem... communication.

Being the mainframe zealot that I am, I decided to ask ChatGPT why I should use a mainframe. I wasn't sure what to expect, but I was delighted with the response that I got.

Me: Why should I still use a mainframe?

ChatGPT: There are several reasons why you might still want to use a mainframe for certain applications:

  1. High Reliability: Mainframes are designed to handle massive workloads and to provide high levels of reliability and availability. They are built with redundant components and features that allow them to recover quickly from failures, making them ideal for mission-critical applications.
  2. Security: Mainframes are known for their robust security features. They have been used in industries such as finance, healthcare, and government for decades because they offer features such as hardware encryption, advanced access controls, and secure connectivity options.
  3. Scalability: Mainframes can scale vertically to handle large amounts of data and processing power. They can also scale horizontally by adding more processors or servers to a mainframe cluster, making them a good fit for organizations with rapidly growing or fluctuating workloads.
  4. Cost-Effective: While mainframes are expensive to purchase, they can provide a cost-effective solution for certain applications. Mainframes are designed to handle large workloads with high efficiency, and they can consolidate multiple applications onto a single platform, reducing the need for additional hardware and software.
  5. Legacy Systems: Many organizations have critical legacy applications that were designed to run on mainframes. Replacing these systems can be expensive and time-consuming, and in some cases, it may not be feasible to migrate these applications to modern platforms.
  6. Integration with Modern Technologies: Many mainframe systems now offer modern connectivity options, such as APIs and web services, which make it easier to integrate them with newer technologies and applications.

In summary, while mainframes are no longer the primary platform for many organizations, they still offer unique benefits and advantages for certain applications. Mainframes provide high levels of reliability, security, scalability, and cost-effectiveness, making them an attractive option for organizations with specific needs and requirements.

This seems like a reasonable response to me. What do you think? With technology this good, I can only imagine how much better it will get as the models are honed over time.


Thursday, June 02, 2022

Db2 13 for z/OS is Here!

Here we are, in June 2022, about 5 years or so since Db2 12 for z/OS was released. And lo and behold, IBM has given us a new version of Db2 for z/OS to learn and adopt: Db2 13 for z/OS!

The new version is generally available (as of May 31, 2022). If you were not paying close attention, though, you may have missed it. Db2 13 was announced at the same time as the new mainframe (the IBM z16), so it didn't get quite the same level of attention. But those of us who use Db2 for z/OS day in and day out will find a lot of great new stuff in this latest and greatest version of Db2.

I'm not going to go into great detail about the new features and functionality of Db2 13 for z/OS today, but I will offer a high-level overview. Look for future blog posts to dig into more of the nitty gritty tech details and capabilities.

The first thing to mention is that you will need to activate the last Db2 12 function level 510 (FL510) before you can migrate to Db2 13. As many organizations are lagging behind in terms of function level activation, it will be interesting to see how this requirement impacts migration to Db2 13.
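If you are not sure where you stand, the -DISPLAY GROUP DETAIL command reports the currently activated function level, and activation itself is a single command. A minimal sketch:

    -DISPLAY GROUP DETAIL
    -ACTIVATE FUNCTION LEVEL (V12R1M510)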

AI

So what can users expect from this new version? Well, it seems that the most talked-about features are related to adopting AI. Functions that build AI capabilities into Db2 will make it easier for organizations on their AI journey to integrate Db2 into their processes.

Perhaps the most significant AI addition to Db2 13 is the SQL Data Insights feature. Provided as an extension to Db2, SQL Data Insights uses built-in functions to deliver AI capabilities, such as uncovering previously unknown relationships in your data. And because it is implemented as built-in functions, you can use it anywhere you can use SQL!
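To give a flavor of what this looks like, here is a minimal sketch using the AI_SIMILARITY built-in function; the CUSTOMER table and its columns are hypothetical, and the table must first be AI-enabled (that is, SQL Data Insights must have trained a model on it) before the function can be used:

    -- Find the 10 customers "most similar" to customer '1234567',
    -- according to the model trained on the CUSTOMER table.
    SELECT C.CUSTOMER_ID,
           AI_SIMILARITY(C.CUSTOMER_ID, '1234567') AS SIMILARITY_SCORE
    FROM   CUSTOMER C
    WHERE  C.CUSTOMER_ID <> '1234567'
    ORDER BY SIMILARITY_SCORE DESC
    FETCH FIRST 10 ROWS ONLY;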

Db2 13 offers additional AI assistance, delivering the ability to simplify model building, support Natural Language Processing (NLP), and exploit the speed of the IBM z16 for training models and querying data.

IBM z16 Synergy

The next thing that has been highly touted is that Db2 13 takes advantage of new capabilities delivered in the IBM z16 hardware. The new Telum chip used by the z16 mainframe provides on-chip AI acceleration that Db2 uses to bolster its AI features (such as SQL Data Insights). And we have already touched on that in terms of speeding up training and querying data for AI.

Db2 for z/OS is unique in that it is the only major DBMS that is designed specifically for a single operating system (z/OS) and hardware platform (IBM Z). This enables IBM (the provider of the DBMS, O/S, and hardware) to take advantage of capabilities unique to the platform, because there is no worry about supporting other platforms.

One example of this unique synergy is the ability to improve sort performance using the SORTL instruction of the IBM z15 and z16. Additionally, the IBM z16 System Recovery Boost can minimize downtime by speeding up Db2 for z/OS restart.

But What About BAU?

OK, so there is new AI stuff and great synergy with IBM Z, but what about the features and functionality that make it easier to keep up with Business As Usual (BAU)? You know, things like easier administration, better performance, and so on?

Good news! There are a plethora of great new capabilities and improvements in Db2 13 for z/OS. While I cannot adequately cover them in detail today, some examples include:

  • The ability to convert back and forth between partition-by-growth and partition-by-range Db2 table spaces (see the DDL sketch after this list). 
  • Support for more concurrent threads and open data sets, as well as improved storage conditions. 
  • DDF storage relief.
  • Real Time Statistics (RTS) improvements.
  • Many improvements to IBM Db2 utility functionality.
  • Security and compliance improvements, including integration with the IBM Z Security and Compliance Center.
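
As a taste of the first bullet, here is a minimal sketch of converting a partition-by-growth table space to partition-by-range using the ALTER PARTITIONING syntax; the ORDERS table and its column are hypothetical, and the alteration is a pending change that is materialized by a subsequent online REORG:

    -- Convert a PBG table space to PBR (pending change; REORG to materialize).
    ALTER TABLE ORDERS
      ALTER PARTITIONING TO PARTITION BY RANGE (ORDER_DATE)
       (PARTITION 1 ENDING AT ('2021-12-31'),
        PARTITION 2 ENDING AT ('2022-12-31'),
        PARTITION 3 ENDING AT (MAXVALUE));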

Summary

The bottom line is that there is a new version of Db2 for z/OS that mainframe shops will need to learn and prepare for. As with any new Db2 version, it will be exciting to dig in and discover all of the new stuff that can help us do our jobs better... and improve our organization's efforts to use its data to improve business.

Keep checking back here as I will blog in more detail about the new capabilities of Db2 13 for z/OS over time... 

Thursday, July 29, 2021

New IBM Storage Systems Boost Ability to Gain Value from Your Mainframe Data

Gain more value from your mainframe data with IBM Storage

Every year the amount of data that is created continues to expand. Analysts at IDC estimate that data will grow at a compound annual growth rate of 23 percent through the year 2025. Furthermore, efficient access to critical business data can mean the difference between success and failure, yet we sometimes forget about the crucial role that storage systems play in our everyday business transactions.

While storage systems have gotten more intelligent and fault-tolerant over the years, there’s always room for advances that can deliver an improved user experience. This can be seen in IBM’s latest storage announcement, which highlights new and improved storage capabilities including cloud-like consumption models, data resiliency, and mainframe storage. This post will focus on the mainframe aspects of the announcement.

Why mainframe? Well, the platform continues to prosper and grow. According to the latest BMC Mainframe Survey, 90 percent of the IT leaders surveyed see the mainframe as a long-term platform for growth. The world’s largest organizations rely on the mainframe to deliver superior performance, reliability, and security. Mainframes are being used not just for traditional transaction processing and batch workloads, but also for new workloads running business analytics and AI applications on structured data. Not to mention that these large shops store most of their data on the mainframe!

What’s New

On July 20, 2021, IBM announced the next generation of its storage for the IBM Z, the IBM DS8980F analytics-class storage system, engineered to excel at modern workloads that span transaction processing, analytical processing, and AI, for both cloud-native and on-premises computing. The DS8980F offers high speed and high availability in a single all-flash storage solution.

As part of this announcement, IBM is introducing improvements in Safeguarded Copy to the entire family of IBM DS8900F systems – including DS8910F, DS8950F and the new DS8980F – to greatly reduce the recovery time from a remote location to the production environment. Additionally, IBM is bringing the Safeguarded Copy function in IBM Spectrum Virtualize software to the IBM FlashSystem family and IBM SAN Volume Controller. 

Bringing the focus back to the mainframe: the new IBM DS8980F storage system has been developed by IBM with its z15 mainframe hardware in mind, meaning it is optimized for mainframe-class workloads. Organizations are continually looking for ways to improve the performance of their mainframe applications, and a key method of improving performance can be to upgrade your storage system. Indeed, the new IBM DS8980F delivers the fastest mainframe application response times: compared to the last generation of IBM storage systems (the DS8888F series), it can improve response time by up to 25 percent. 

Minimizing downtime is another critical requirement of modern business applications, especially for those that run on mainframes. The new IBM DS8980F delivers 7 nines of availability (99.99999 percent), an improvement of 10x over the previous generation.

Additional improvements include more than twice the amount of system cache and greater bandwidth capacity, all while consuming less energy in a lighter-weight box.

At the same time, IBM also announced a new tape library system, the IBM TS7770, with all-flash cache. The most significant new feature of the TS7770 is that it delivers better performance with a single flash drawer than the previous generation did with ten drawers of SAS HDDs, providing faster data protection with less infrastructure.

Finally, it is possible to combine the TS7770 tape library and the IBM DS8910F (the entry version within the DS8900F family) into a single 19-inch industry-standard rack. This enables small and medium-sized organizations to deploy an end-to-end storage solution for mainframe environments in a smaller amount of floor space, with significant savings in operating costs.

Summary

Data growth continues unabated, and organizations continue to use mainframes expecting them to deliver unparalleled performance and availability for their mission-critical workloads of all types.

To achieve this level of performance and availability, while managing data growth, organizations need the latest and greatest storage technology. And IBM’s latest DS8980F and TS7770 will help organizations achieve the performance and availability they require for all their application workloads.

If you’d like to learn more about the latest from IBM storage, you can read the full details in the IBM announcement.


Sunday, May 09, 2021

Thinking About the Mainframe, the Cloud, and IBM Think 2021

A Bit about Think

I am looking forward to attending the IBM Think 2021 conference, IBM's annual flagship technology event. I have attended several in-person Think events, as well as last year’s virtual conference, and I always come away with new knowledge and additional insight into technology and IBM’s vast portfolio of hardware, software, and solutions. The Think conference is always one of the tech highlights of the year for me!

This year’s event, IBM Think 2021, is again being held as a virtual conference, May 11 and 12, 2021. And it is free of charge, which means that you can experience all the great education, presentations, and networking opportunities without having to leave your desk.

My favorite aspect of the Think conference is the breadth and scope of pertinent technical content that it covers. Whether you are a developer, a DBA, a data scientist, a manager, an executive, or any flavor of IT or business specialist, there will be a wealth of useful information presented to educate you and make you “think.”  Be sure to register here.

My Think 2021 Agenda

There are multiple sessions to be delivered at this year’s IBM Think conference that intrigue me because they focus on areas where I specialize. For example, Dr. Dario Gil, SVP and Director of IBM Research, will be delivering a keynote session on IT infrastructure which is sure to be educational. This session, 2081, offers a deep dive into the IBM innovations powering the next generation of hardware, including IBM Z.

Another session I am looking forward to is session 2303 focusing on security “everywhere.” It features IBM luminaries like Tom Rosamilia, Senior Vice President, IBM Systems, and Mary O’Brien, General Manager IBM Security. And Forrester Research Director, Lauren Nelson, will also be lending her industry expertise to the session.

But I think the Think 2021 session I am most looking forward to is The IBM Z roadmap for hybrid cloud and AI (session 1605), featuring Ross Mauri, General Manager for IBM Z. Mauri promises to offer a timely discussion on the business value of integrating the IBM Z platform as a full participant in your hybrid cloud. And he’ll speak with Russell Plew, Technology Senior Manager at M&T Bank, who will discuss their real-life experiences in doing so!

Why is this session so interesting to me? Well, I’ve worked with the mainframe my entire career, and as anybody who works on the mainframe knows, the IBM Z platform is used to drive mission-critical workloads across all major industry sectors, worldwide. If your organization needs to perform large-scale transaction processing (thousands of transactions per second), support thousands of users and programs concurrently, manage terabytes of information, and handle large-bandwidth communication, chances are you rely on the mainframe to do that because the platform excels at all of those things.

If you’ve ever deposited a check into your bank account, booked a flight on an airline, or used a credit card to purchase something, it is probable that a mainframe was involved in completing that activity!

Ever since Stewart Alsop of InfoWorld predicted that the last mainframe would be unplugged on March 15, 1996, there has been a lingering perception that the mainframe would go away at some point. But here we are, 25 years later, and the mainframe is still going strong! At last year’s IBM Think conference IBM presented the following statistics on the mainframe’s ubiquity and power:

  • 70% of the Fortune 500 use mainframes, and 72% of customer-facing applications are dependent on the mainframe for some or all data processing.

  • Mainframes are designed to be able to process a trillion web transactions a day, with the capability to process 1.1 million transactions per second.

  • 95% of transactions in the banking, insurance, airline, and retail industries are handled by mainframes.


Indeed, the mainframe continues to offer a strong, unparalleled platform for performance, security, and reliability. Of course, the mainframe has changed and grown over its 50+ year lifespan. Today’s IBM z15 is light-years beyond the original IBM System/360 introduced in 1964. Some of the great newer capabilities of the IBM Z include encryption everywhere with pervasive encryption and Data Privacy Passports, rack-mountable mainframes, Instant Recovery, and cloud-native development. I’m looking forward to hearing how IBM’s customers have taken advantage of these, and other capabilities, to integrate the IBM Z into their hybrid cloud architectures.

It only makes sense that businesses relying on the mainframe will continue to do so, even as they embrace cloud computing. This is what the “hybrid” in the term hybrid cloud implies: an IT infrastructure that uses a mix of on-premises and private/public cloud from multiple providers. And this approach makes the most sense because everything can’t shift to the cloud immediately (perhaps ever). Most existing applications were not built with an understanding of the public cloud, and it would take a lot of investment to re-engineer them to properly take advantage of a public cloud architecture. And even if you wanted to move everything, cloud service providers (CSPs) can’t build out their infrastructure fast enough to immediately support all the existing data center capacity “out there.”

So, it will be exciting to watch IBM continue to innovate on the IBM Z platform as enterprise customers work to integrate Z as a vital component of their hybrid cloud infrastructure. With the large investment enterprises have in their working mainframe applications, the large data sets and databases containing crucial data, and their high-volume processing requirements, they will continue to rely on the mainframe well into the future… and that makes it important to understand how IBM is enabling the IBM Z to participate in your hybrid cloud architecture.

So, join me at Think 2021 for session 1605 to learn how to use your investments in IBM Z and build and modernize applications into container-based workloads using a common DevOps experience. And stick around for other sessions to gain insights on harnessing the full value of IBM hardware, software and services in your organization as you continue to support, manage, and transform traditional business and IT operations.


Wednesday, April 07, 2021

Happy Birthday to the IBM Mainframe

I am older than the mainframe... I turned 58 on April 3rd, and the IBM mainframe officially celebrates its 57th birthday today, April 7th.

The IBM System/360 was launched on April 7, 1964, and the world of enterprise computing has never been the same.

Here are a few links and articles to check out as we celebrate the ongoing vitality of mainframe computing:

So, all of you mainframe users out there, today is indeed a day to celebrate... another year has gone by, and mainframes are still here... running the world!

Thursday, January 07, 2021

BMC AMI Ops: The Next Generation of Mainframe Systems Management

Assuring the performance of your mainframe systems and applications is an imposing task that keeps getting more complex all the time. It makes sense to arm your IT performance analysts, DBAs, and systems programmers with modern tools so you can optimize performance and thereby deliver superior service to your customers.

Of course, BMC MainView has helped IT professionals manage the performance of their mainframe systems and applications for years. But there are new challenges facing modern organizations that require adaptation and transformation.

Organizations are transforming to become autonomous digital enterprises (ADE). This means that things are getting more complex because availability requirements are expanding (many times requiring 24/7 availability), but IT pros are expected to resolve problems rapidly even as workloads become more unpredictable and IT staff has less experience. These challenges are real and require attention.

And that is why BMC is transforming its MainView product line into BMC AMI Ops!

With BMC AMI Ops you can experience next-level mainframe operational resiliency, AI-powered observability, an intuitive user interface with embedded expertise, actionable insights, and enterprise platform interoperability.

How is BMC AMI Ops engineered to help? Well, it is built for digital business with the understanding that being reactive is not sufficient these days. BMC AMI Ops provides a complete, modular solution with central administration and management.

Artificial intelligence and machine learning techniques are being embraced by an increasing number of organizations for improving their business, so it only stands to reason that your IT operations and support functions should be looking to improve their capabilities using AI and ML, too. And BMC AMI Ops helps you to do that because it is infused with AI/ML-powered analytics to find and fix problems before business services are impacted. With BMC AMI Ops you can improve performance and availability by taking advantage of its built-in intelligent automation and remediation features.

And the user interface is brand new, engineered to support ease of use, to present information instead of raw data, and to guide the user experience. BMC AMI Ops delivers a custom dashboard approach where you can group widgets together for related logical systems or business areas. And you get “out of the box” health indicators for each of the widgets you deploy, meaning it takes less time to be productive right away. Furthermore, a guided path is provided so the user can drill down into additional details as needed. If you are interested in seeing more details on the new user experience for BMC AMI Ops, check out this blog post from Shay Alsberg (BMC AMI Ops: Evolving the MainView User Experience).

And not to fear: for those of you experienced mainframe pros who not only know how to drive ISPF panels but prefer them, BMC AMI Ops can still be accessed using character-based panels.

The bottom line is that BMC AMI Ops is designed for modern businesses and IT organizations as they embrace digital transformation to become autonomous digital enterprises, delivering a simplified yet customizable systems management experience for optimizing your system and application performance. That’s BMC AMI Ops in a nutshell… and it is worth looking into how BMC AMI Ops can help you to improve the performance of your systems and applications.

Wednesday, May 20, 2020

IBM Think 2020: Virtual, On Demand, Hybrid Cloud and Z

This year’s IBM Think event was quite different than in past years. Usually, Think is an in-person event and attracts a lot of people, typically more than ten thousand IT executives and practitioners. But as we all know, this year with the global COVID-19 pandemic an in-person event was not practical, so IBM held it on-line. And I have to say, they did a fantastic job of managing multiple threads of content without experiencing bandwidth or access issues – at least none that I encountered.

The theme and focus of the content for the event was different, too. Instead of the usual conference focus on products, announcements, and customer stories, this year’s event was more philanthropic. Oh, sure, you could still hear about IBM’s products and customer successes, but the keynote and featured sessions were at a higher level this year.

In the kickoff session, new IBM CEO Arvind Krishna spoke about the driving forces in IT as being hybrid cloud and AI. And he spoke about these things in the context of moving IBM forward, but also how they can be used to help healthcare workers combat pandemics like we are currently experiencing.

In another keynote, IBM Executive Chairman Ginni Rometty spoke with Will.i.am (of the Black-Eyed Peas) about making the digital era inclusive through education, skills development, and the digital workforce. 


And then there was Mayim Bialik’s session on women and STEM, which was sincere, heartfelt, and entertaining. 

For those who don’t know who she is, she is the actress who played Blossom (on Blossom) and Amy Farrah Fowler (on The Big Bang Theory)… but she is also a scientist with a doctorate in neuroscience. Bialik’s session focused on putting a positive female face on STEM, something that is definitely needed!

So, what about the technology side of things? Well, you can take a clue from Krishna’s assertion that IBM as a company has to have a “maniacal” focus on hybrid cloud and AI in order to compete. But the company has a rich and deep heritage across the computing spectrum that gives it a key advantage even as it adjusts to embracing hybrid cloud and AI.
The first thing to remember is that IBM uses the term “hybrid multicloud” very specifically and deliberately. Everything is not going to be in the cloud. Large enterprises continue to rely on the infrastructure and applications they have built over many years, many of them on z Systems mainframes. The key to the future is both on-premises and cloud, and IBM understands this with its hybrid cloud approach… as they clearly demonstrated at Think 2020.

My specific area of focus and expertise is the mainframe and Db2 for z/OS, so I sought out some sessions at Think in those areas. Let me tell you a bit about two of them.

First let’s take a quick look at how IBM Cloud Pak for Data can work with data on the Z platform. This information was drawn from IBM Distinguished Engineer Gary Crupi’s session, titled "Drive Actionable, Real-Time Insight from Your High-Value IBM Z Data Using IBM Cloud Pak for Data."

What is Cloud Pak for Data? Well, it is an IBM platform for unifying and simplifying the collection, organization, and analysis of data. Heretofore, it was mostly focused on non-mainframe platforms, but the latest release, version 3.0, is a major upgrade with an enhanced unified experience, expanded ecosystem, and optimized Red Hat integration. And it enables several ways for you to turn your enterprise data on IBM Z into actionable, real-time insight through the integrated cloud-native architecture of IBM Cloud Pak for Data.



Crupi’s session started out with the now familiar (at least to IBM customers and Think attendees) Ladder to AI, and how Cloud Pak for Data helps to enable the customer’s journey up the ladder. Data is the foundation for smart business decisions, and AI can unlock the value of this data.

He went on to discuss the continuing importance of the mainframe, providing facts including:
  •  70% of Fortune 500 companies use mainframe for their most critical business functions
  •  72% of customer-facing applications are completely or very dependent on mainframe processing
  •  The mainframe handles 1.1 million transactions per second (compared to Google, which experiences about 60,000 searches per second)
  •  95% of transactions in the banking, insurance, airline and retail industries run on the mainframe

These are all good points, and things that mainframe users like to hear. It is good to see IBM promoting the ubiquity and capabilities of the mainframe.



Now, what about IBM Cloud Pak for Data better exploiting mainframe data? Crupi went back to the AI Ladder to talk about z/OS capabilities for analyzing and collecting data for AI.


Solutions such as Watson Machine Learning for z/OS, Db2 AI for z/OS, and QMF can be used for analyzing data, while Db2 for z/OS and Tools, IDAA, and Data Virtualization Manager can be used for data collection. These things already exist, but using them effectively with distributed platform capabilities will be crucial to be able to climb the ladder to AI.

IBM Cloud Pak for Data will leverage IBM Z technology to bring valuable IBM Z data into a modern analytics/AI platform. It can now exploit IBM Z data and resources where appropriate, enabling you to further benefit from IBM Z technology and data.

A key new component of making the data on IBM Z accessible is IBM Db2 for z/OS Data Gate, a new product announced during Think 2020. Db2 Data Gate can help you reduce the cost and complexity of your data delivery with a simple, easy-to-deploy mechanism that delivers read-only access to Db2 for z/OS data. Instead of your building and maintaining costly custom code, Db2 Data Gate does the work. Data can be synchronized between Db2 for z/OS data sources and target databases on IBM Cloud Pak for Data.


Instead of accessing data in the IBM Z data source directly, an application accesses a synchronized copy of the Db2 for z/OS data, hosted by a separate system. This target system can be established anywhere Cloud Pak for Data is supported, thus enabling a wide range of target platforms that include public cloud, on-premises, and private cloud deployments.


So IBM is helping you to expand the accessibility of your Z data.

And that brings me to the second session I’d like to briefly mention, Automate Your Mainframe z/OS Processes with Ansible [Session 6760]. 

Although Ansible is not a replacement for your operational mainframe automation tools, it can be used to communicate with and automate z/OS using out-of-the-box SSH access to z/OS UNIX System Services to execute commands and scripts, submit JCL, and copy data. And Ansible has existing modules that can be used to make calls to the RESTful/SOAP APIs that are available in many z/OS products.


Ansible can be beneficial for orchestrating across platforms, including Z systems, and for simplifying configuration and deployment management. But keep in mind that Ansible is a proactive framework for automation; it is not intended to replace automation solutions that monitor and react.

Here is a nice, but by no means exhaustive, list of examples showing how Ansible can be used to interact with popular z/OS products.


The Bottom Line

The IBM Think 2020 conference was a great success considering how rapidly IBM had to move to convert it from an in-person event, to an online, virtual one. And the content was informative, entertaining, and had something for everybody. I hope you enjoyed my take on the event… feel free to share your comments below on anything I’ve written here, or on your experiences at the event.


Tuesday, September 17, 2019

IBM Unleashes the z15 Mainframe



In New York City, on September 12, 2019, IBM announced the latest and greatest iteration of its Z systems mainframe computing platform, the IBM z15. And I was lucky enough to be there for the unveiling.

The official IBM announcement letter can be found here if you want to dive into the details. But before you go there, consider first reading what I have to say about it below.

Before going any further, here I am with the new z15 in New York… don’t we make a handsome couple? 



The event was held at 3 World Trade Center in lower Manhattan. Ross Mauri, General Manager of IBM Z, kicked off the event extolling the unprecedented security delivered by the z15, with encryption everywhere and Data Privacy Passports. He claims that the IBM z15 is the most secure platform you can get, and the new capabilities back that up. Mauri also acknowledged that "there's always the next big thing in technology" but stated that "IBM is innovating and leading by anticipating customer needs to ensure the on-going relevance of the mainframe."

And there is a lot to like about the new IBM z15 platform, both for long-time users and those embracing the platform for new development. IBM is embracing the multicloud approach and reminding everybody that the mainframe is a vital component of multicloud for many organizations.

But adopting modern infrastructure with the latest application development techniques is not as simple as throwing out the old and bringing in the new. I mean, let’s face it, if you have a mainframe with possibly hundreds or thousands of man-years of work invested in it, are you really going to take the time to re-code all of that mission-critical work just to have it on a “new” platform? Rewriting applications that work today cannot be the priority for serious businesses! Especially when the modern mainframe is as new as it gets: it runs all of the legacy code that runs your business, and it supports new cloud apps and development, too.

The IBM Z works perfectly as a part of your multicloud development strategy. The cloud promises an open, flexible world. But your most critical workloads also need to run securely and without interruption. To accomplish both objectives you must support cloud with an underlying IT infrastructure. And for Fortune 500 companies and other large organizations, the multicloud includes the mainframe as part of the enabling infrastructure.

What’s New

The new IBM z15 is housed in a convenient 19-inch frame, which means it can be integrated into a standard data center rack. So you get all the benefits and strengths of the mainframe while fitting into the footprint expected by a standard data center.

Did you know that there are more transistors in the new IBM z15 chip than there are people in the world? Inside the IBM z15 processor chip there are 15.6 miles of wires, 9.2 billion transistors, and 26.2 billion wiring connections, all of which allow a single z15 server to process 1 trillion web transactions per day.

The mainframe is the ideal platform for many organizations. It provides the resiliency, security, and agility needed to power, secure, and integrate your hybrid cloud. And it capably, securely, and efficiently runs your transactions and the batch workload required to keep your business humming. IBM used to talk about five 9s of availability (that is 99.999%) but with the new IBM z15, IBM can deliver seven 9s (that is 99.99999%)! That is 3.16 seconds of downtime per year, or only 60.48 milliseconds of downtime per week. Now that is impressive!
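In case you want to check the math behind those downtime numbers, a quick back-of-the-envelope calculation:

    (1 - 0.9999999) × 365.25 × 24 × 3600 s ≈ 3.16 s of downtime per year
    (1 - 0.9999999) × 7 × 24 × 3600 s ≈ 0.06048 s = 60.48 ms of downtime per week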

The primary new features that are worth your time to investigate further, and that were highlighted by IBM at the kickoff event, are:
  • Encryption everywhere, which protects your data anywhere, even after it leaves your system, with new IBM Data Privacy Passports, which delivers privacy by policy.
  • Cloud native development that simplifies life for developers as they build and modernize applications using standard tools, including new support for Red Hat OpenShift. This enables you to both modernize the apps you have and to deploy new ones using the tools of your choice.
  • IBM Z Instant Recovery can reduce the impact of planned and unplanned downtime. Instant Recovery can speed the return to your pre-shutdown SLAs by up to 2x.

The flexibility of the z15 is noteworthy, too. The new IBM z15 provides the flexibility to implement one frame, or up to four frames, as your capacity needs dictate.

And did you know it can run multiple operating systems, not just z/OS? The IBM Z platform can run z/OS, Linux on Z, z/VM, z/VSE, and z/TPF. This enables organizations to run legacy applications and modern, specialist ones using the operating system of their choice. Indeed, convenience and flexibility are hallmarks of the IBM Z platform.

The IBM z15 is a modern platform for all of your processing needs. And that is backed up not just by IBM, but also by a brand new survey from BMC Software: its 14th annual mainframe survey, for 2019. The survey shows that 93% are confident in the combined long-term and new workload strength of the IBM Z platform, the strongest showing since 2013! Other highlights include a majority thinking that mainframe growth will continue, along with increasing MIPS/MSU consumption... not to mention that the mainframe is handling increases in data volume, number of databases, and transaction volume. If you are working with mainframes in any way, be sure to check out the new BMC Mainframe Survey.


Indeed, with the new IBM z15 things are looking great for the mainframe and those that rely upon it to power their digital business.

Thursday, August 01, 2019

DevOps is Coming to Db2 for z/OS


Mainframe development teams are relying on DevOps practices more extensively, bringing the need to incorporate Db2 for z/OS database changes into the toolset that is supporting their software development lifecycle (SDLC).

But most mainframe professionals have only heard a little about DevOps and are not really savvy as to what it entails. DevOps is an amalgamation of Development and Operations. The goal of DevOps is to increase collaboration between developers and operational support and management professionals, with the desired outcome of faster, more accurate software delivery.

DevOps typically relies on agile development, coupled with a collaborative approach between development and operations personnel during all stages of the application development lifecycle. The DevOps approach results in small and frequent code changes and it can significantly reduce the lead time for changes, lower the rate of failure, and reduce the mean time to recovery when errors are encountered. These are all desirable qualities, especially as organizations are embracing digital transformation driven by the 24/7 expectations of users and customers to access data and apps at any time from any device.

The need to be able to survive and thrive in the new digital economy has caused organizations to adopt new and faster methods of developing, testing and delivering application software. Moving from a waterfall software development methodology to an agile methodology is one way that organizations are speeding the time-to-delivery of their software development. Incorporating a DevOps approach is another.

Instead of long software development projects that may not deliver value for months, or perhaps even years (common using the Waterfall development methodology) an agile DevOps approach delivers value quickly, and then incrementally over time. DevOps enables the continuous delivery of new functionality demanded by customers in the digital economy.

Succeeding with DevOps, however, requires a cultural shift in which all groups within IT work in collaboration with one another, and where management endorses and cultivates this cultural change. Because DevOps relies upon incremental development and rapid software delivery, your IT department can only thrive if there is a culture of accountability, collaboration, and team responsibility for desired business outcomes. Furthermore, it requires solid, integrated automated tooling to facilitate the SDLC from development, through testing, to delivery. Creating such an environment and culture can be challenging.

With DevOps the result will be a constantly repeating cycle of continuous development, continuous integration and continuous deployment. This is typically depicted graphically as the infinity symbol such as in Figure 1 (below).

Figure 1 - continuous development, integration and deployment


Note, however, that this particular iteration of the DevOps infinity graphic calls out the participation of both the application and the database. This is an important, though often lacking, detail that should be stressed when adopting DevOps practices.

The Mainframe and DevOps

The adoption of DevOps has, until now, been much slower within mainframe development teams than for distributed and cloud application development. The staid nature of mainframe development and support, coupled with a glass-house mentality and a rigid production turnover process, contributes to the delayed adoption of DevOps on the mainframe. This is not surprising, as mainframes mostly are used by large, risk-averse organizations running mission-critical workloads with an aversion to any kind of change.

Additionally, the traditional waterfall development methodology has been used by most mainframe software developers for multiple decades, whereas DevOps is closely aligned with an agile approach, which differs significantly from waterfall.

Notwithstanding all of these barriers to acceptance of DevOps on the mainframe, mainframe developers can, and in some cases already do, successfully utilize a DevOps approach. Technically speaking, the mainframe is just another platform, and there is nothing inherent in its design or usage that obviates the ability to participate in a DevOps approach to application development and delivery.

What about Db2 for z/OS?

Integrating database change into the application delivery lifecycle can be a stumbling block on the road to DevOps success. Development teams focus on application code, as they should, and typically view database structure changes as ancillary to their coding efforts. In most application development projects, it is not the programmer’s responsibility to administer the database and modify database structures. But applications rely on the database being designed, implemented, and changed in accordance with the needs of the business and the code.

The result is that many development projects have automated their SDLC toolchain to speed up the delivery of applications. This is the “Dev” portion of DevOps. But the requisite automation and tooling has not been as pervasively implemented to speed up the delivery of database changes. This is the “Ops” portion of DevOps. And this is changing.

A big consideration is that the manner in which change is applied to applications differs from how database changes are applied. That means each must be managed using different techniques and probably different tools. When an application program changes, the code is compiled, and the load module is migrated from test to production. The old load module is saved for posterity in case the change needs to be backed out, but the change is a wholesale replacement of the executable code.

Database changes are different. The database is an entire configuration in each environment, and it is the changes themselves that get migrated; there is no wholesale replacement of the database structures. DDL statements are issued to ALTER, DROP, and CREATE the database structures as needed.
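For example, adding a column to an existing table is applied in place with DDL rather than by replacing the object; a minimal sketch (the EMP table and BONUS_PCT column are hypothetical):

    -- The table and its data stay in place; only the definition changes.
    ALTER TABLE EMP
      ADD COLUMN BONUS_PCT DECIMAL(5,2);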

From the perspective of database changes on Db2 for z/OS, DBAs need the ability to modify all the database objects supported by Db2 for z/OS. Supporting Db2 for z/OS using DevOps requires tooling that understands both Db2 for z/OS and the DevOps methodology and toolchain. And the tooling must understand how changes are made, as well as any underlying changes that may be required to effectively implement the database change. Some types of database changes are intrusive, requiring a complicated series of unloads, metadata captures, drops, creates, loads, and additional steps to implement. The tooling must be capable of making any of these changes in an automated way that the DBA trusts.

Fortunately, for organizations adopting DevOps on the mainframe with Db2, there is a solution for integrating Db2 database change into the DevOps toolchain: BMC AMI DevOps for Db2. BMC AMI DevOps for Db2 integrates with Jenkins, an application development orchestration tool, to automatically research and determine database schema change requirements, to streamline the review and approval process, and to safely implement the database schema changes, making development and operations teams more efficient and agile.