Wednesday, February 12, 2020
Will I See You at SHARE in Fort Worth 2020?
Thursday, February 06, 2020
IBM Gold Consultant for Data and AI :: 2020
I am proud to announce that I will be continuing as an IBM Gold Consultant for Data and AI in 2020.
For those of you who do not know what an IBM Gold Consultant is... the IBM Gold Consultant program is an elite group of independent consultants with vast experience in IBM data repositories, unified governance, artificial intelligence (AI) and machine learning.
IBM Gold Consultants bring extensive industry experience and technical expertise to help IBM clients define and implement strong strategies for their data and analytics initiatives using IBM Db2 on all platforms, IBM Informix, IBM InfoSphere, IBM CICS, and related technologies and tools. The group is recognized by its peers, and IBM, as some of the world’s most experienced independent consultants for these products.
Thank you, IBM, for creating such great data management tools and solutions; I have been able to build a career spanning more than three decades using them.
Mullins Consulting, Inc.
Friday, January 03, 2020
Db2 11 for z/OS End of Support Coming This Year (2020)
Version 11 of our favorite DBMS was made generally available way back on October 25, 2013, and IBM stopped marketing and selling this version in July 2018. But if you are still using Db2 11, IBM has continued to provide support... and will continue through the first three quarters of 2020. But after that, support ends.
In other words, the end of support date for Db2 11 for z/OS is September 30, 2020. And that date appears to be a firm one... don't bet on IBM extending it.
What does that mean for you if you are still using Version 11? It should mean that you will be spending the first three quarters of 2020 planning for, and migrating to, Db2 12 for z/OS.
There are a lot of great resources that IBM provides to help you migrate smoothly. Here are a few of them for your reference:
Db2 12 Installation and Migration Guide
Db2 12 for z/OS Product Documentation
Webcast: Db2 12 for z/OS Migration Planning and Customer Experiences with John Campbell
Db2 12 for z/OS Migration Considerations (Mark Rader)
So if you are still running Db2 11 and you haven't started planning to upgrade, now is the time to start planning... and if you have started planning, that is great, because 2020 is the time to get your shop migrated to Db2 12!
Friday, December 27, 2019
Planning Your Db2 Performance Monitoring Strategy
- Batch reports run against Db2 trace records. While Db2 is running, you can activate traces that accumulate information, which can be used to monitor both the performance of the Db2 subsystem and the applications being run. For more details on Db2 traces see my earlier 2-part blog post (part 1, part 2).
- Online access to Db2 trace information and Db2 control blocks. This type of monitoring also can provide information on Db2 and its subordinate applications.
- Sampling Db2 application programs as they run and analyzing which portions of the code use the most resources.
- Do not overdo monitoring and tracing. Db2 performance monitoring can consume a tremendous amount of resources. Sometimes the associated overhead is worthwhile because the monitoring (problem determination or exception notification) can help alleviate or avoid a problem. However, absorbing a large CPU overhead to monitor a Db2 subsystem that is already performing within the desired scope of acceptance might not be worthwhile.
- Plan and implement two types of monitoring strategies at your shop:
- ongoing performance monitoring to ferret out exceptions; and
- procedures for monitoring exceptions after they have been observed.
- Do not try to drive a nail with a bulldozer. Use the correct tool for the job, based on the type of problem you’re monitoring. You would be unwise to turn on a trace that causes 200% CPU overhead to solve a production problem that could be solved just as easily by other types of monitoring (e.g. using EXPLAIN or Db2 Catalog reports).
- Tuning should not consume your every waking moment. Establish your Db2 performance tuning goals in advance, and stop when they have been achieved. Too often, tuning goes beyond the point at which reasonable gains can be realized for the amount of effort exerted. (For example, if your goal is to achieve a five-second response time for a TSO application, stop when you have achieved that goal instead of tuning it further even if you can.)
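The sampling approach mentioned above, observing application programs as they run to find which portions of the code use the most resources, is the same basic idea as application-level profiling. As a minimal, hypothetical illustration of the concept (Db2 monitors use traces and control blocks, not Python tooling), here is how a profiler surfaces the "hot spots" in a workload:

```python
# Illustration of sampling-style analysis: run a workload under a
# profiler and report which functions consume the most time.
import cProfile
import io
import pstats

def expensive(n):
    # Deliberately costly function to stand in for a "hot spot"
    return sum(i * i for i in range(n))

def cheap():
    return 42

def workload():
    for _ in range(50):
        expensive(10_000)
        cheap()

profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

# Sort by cumulative time so the dominant code paths float to the top
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(5)  # show only the top 5 entries
print(stream.getvalue())
```

The report makes clear that `expensive` dominates the runtime, which is exactly the kind of insight resource sampling gives you, without having to trace every event the program generates.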
Wednesday, December 18, 2019
High Level Db2 Indexing Advice for Large and Small Tables
Tuesday, December 03, 2019
A Guide to Db2 Application Performance for Developers: A Holiday Discount!
You see, in my current role as an independent consultant focused on data management issues, much of which involves Db2, I get to visit a lot of different organizations... and I get to see a lot of poorly performing programs and applications. So I thought: "Wouldn't it be great if there was a book I could recommend that would advise coders on how to ensure optimal performance in their code as they write their Db2 programs?" Well, now there is... A Guide to Db2 Application Performance for Developers.
This book is written for all Db2 professionals, covering both Db2 for LUW and Db2 for z/OS. When there are pertinent differences between the two, they are pointed out in the text. The book’s focus is on developing applications, not database and system administration, so it doesn’t cover the things you don’t do on a daily basis as an application coder. Instead, the book offers guidance on application development procedures, techniques, and philosophies for producing optimal code. The goal is to educate developers on how to write good application code that lends itself to optimal performance.
By following the principles in this book you should be able to write code that does not require significant remedial, after-the-fact performance tuning.
The book does not rehash material that is freely available in Db2 manuals that can be downloaded or read online. It is assumed that the reader has access to the Db2 manuals for their environment (Linux, Unix, Windows, z/OS).
The book is not a tutorial on SQL; it assumes that you have knowledge of how to code SQL statements and embed them in your applications. Instead, it offers advice on how to code your programs and SQL statements for performance.
What you will get from reading this book is a well-rounded understanding of how to write Db2 application code with performance in mind.
OK, you may be saying, but what about that "Holiday Discount" you mention in the title? Well, I am offering a discount for anyone who buys the book before the end of the year (2019). There are different discounts and codes for the print and ebook versions of the book:
- To receive a 5% discount on the print version of the book, use code 5poff when you order at this link.
- To receive $5.00 off on the ebook version of the book, use code 5off when you order at this link.
Wednesday, November 27, 2019
Happy Thanksgiving 2019
Thanksgiving is a day we celebrate in the USA by spending time with family, eating well (traditionally turkey), and giving thanks for all that we have and hold dear.
Oh... and also for watching football!
May all of you reading this have a warm and happy Thanksgiving holiday surrounded by your family and loved ones.
Happy Thanksgiving!
Thursday, November 07, 2019
Db2 12 for z/OS Function Level 506
- Alternative function names support
- Support for implicitly dropping explicitly created table spaces
| Existing Function Name | New Alternative Syntax Name |
| --- | --- |
| CHARACTER_LENGTH | CHAR_LENGTH |
| COVARIANCE or COVAR | COVAR_POP |
| HASH_MD5 or HASH_SHA1 or HASH_SHA256 | HASH |
| POWER | POW |
| RAND | RANDOM |
| LEFT | STRLEFT |
| POSSTR | STRPOS |
| RIGHT | STRRIGHT |
| CLOB | TO_CLOB |
| TIMESTAMP_FORMAT | TO_TIMESTAMP |
Support for these alternative spellings of built-in function names should make it easier to support applications across multiple members of the Db2 family where support already exists for these spellings. Of course, you may run into issues if you used any of the new spellings in your existing applications, for example as variable names.
The other significant feature of FL506 is support for implicitly dropping explicitly created universal table spaces when a DROP TABLE statement is executed. Prior to FL506, dropping a table that resides in an explicitly created table space did not drop the table space.
Thursday, October 31, 2019
Have You Considered Speaking at the IDUG Db2 Technical Conference? You should!
Speaking at a user group is a good way to expand your contacts and develop additional personal interaction skills. And I have also found it to be a good way to increase my technical knowledge and skills. Sure, as the presenter you are sharing your knowledge with the audience, but it always seems like I expand my knowledge and way of thinking about things when I deliver a presentation. Either because of questions I receive, or because putting the presentation together made me stop and think about things in different ways.
And if you are accepted to speak your attendance at the conference is complimentary!
Putting together an abstract is not that difficult at all. You just need to complete a bit of biographical information about yourself, select a category for your presentation, provide an overview of your topic, and offer up a bulleted list of 5 objectives. The site guides you through submitting all of these things at this link.
Speaking at a conference can be a very rewarding experience... and once you start doing it, you'll want to do it again and again. So go ahead. Click here and submit your abstract and I hope I'll see you in Dallas in June 2020!
Thursday, October 17, 2019
See You in Rotterdam... at the IDUG Db2 Tech Conference
A Guide to Db2 Performance for Application Developers
Monday, October 14, 2019
Mainframe Modernization: The Why and How
This webinar will discuss the rich heritage of the mainframe and the value of the applications and systems that have been written over many decades. Organizations rely on these legacy systems, and the business knowledge built into these applications drives their businesses.
But an application created 20 or more years ago will not be as accessible to modern users as it should be. Digital transformation that enables users to access applications and data quickly is the norm, but this requires modernizing access to the rich data and processes on the mainframe.
This presentation will examine the value proposition of the mainframe and look at the trends driving its usage and capabilities. I will look at the IT infrastructure challenges, including changing technology, cloud adoption, legacy applications, and development trends, and at tactics to achieve mainframe modernization amid complexity and change.
So if mainframes are your thing, or you just want to learn more about the state of the modern mainframe, be sure to sign up and attend!
Tuesday, September 17, 2019
IBM Unleashes the z15 Mainframe
- Encryption everywhere, which protects your data anywhere, even after it leaves your system, with new IBM Data Privacy Passports, which delivers privacy by policy.
- Cloud native development that simplifies life for developers as they build and modernize applications using standard tools, including new support for Red Hat OpenShift. This enables you to both modernize the apps you have and to deploy new ones using the tools of your choice.
- IBM Z Instant Recovery can reduce the impact of planned and unplanned downtime. Instant Recovery can speed the return to your pre-shutdown SLAs by up to 2x.
The IBM z15 is a modern platform for all of your processing needs. And that is backed up not just by IBM, but also by a brand new survey from BMC Software: their 14th annual mainframe survey for 2019. The survey shows that 93% of respondents are confident in the combined long-term and new workload strength of the IBM Z platform, the strongest showing since 2013! Other highlights include a majority thinking that mainframe growth will continue, along with increasing MIPS/MSU consumption... not to mention that the mainframe is handling increases in data volume, number of databases, and transaction volume. If you are working with mainframes in any way, be sure to check out the new BMC Mainframe Survey.
Indeed, with the new IBM z15 things are looking great for the mainframe and those that rely upon it to power their digital business.
Wednesday, September 04, 2019
The Power of Data Masking for Data Protection
Click to watch the video
Data masking is not a simple task, and as the video helps to explain, there is much to consider. To effectively mask your data requires a well-thought-out process and method for implementation to achieve success. As such, a tool like BCV5 Masking Tool can simplify how you address your Db2 data protection requirements. It provides dozens of easy-to-use masking algorithms implemented using Db2 user-defined functions. It ensures that the same actual value is translated to the same masked value every time. And the value will be a plausible value that works the same as the data it is masking. The tool understands things like referential integrity, unique constraints, related data, and so on.
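That deterministic property, the same actual value always translating to the same masked value, is what keeps masked data usable for joins and for testing referential integrity. As a hedged sketch of the general technique only (BCV5's actual algorithms are Db2 user-defined functions and are not shown here), a keyed hash can derive a repeatable, plausible masked value:

```python
# Sketch of deterministic, format-preserving data masking.
# The same input always produces the same masked output, so a foreign
# key masked in a child table still matches its masked parent key.
# (Illustrative only; SECRET_KEY and mask_ssn are hypothetical names.)
import hashlib
import hmac

SECRET_KEY = b"masking-key"  # hypothetical key; manage securely in practice

def mask_ssn(ssn: str) -> str:
    """Mask a US SSN-style value into a plausible nnn-nn-nnnn value."""
    # A keyed hash (HMAC) so the mapping cannot be reversed without the key
    digest = hmac.new(SECRET_KEY, ssn.encode(), hashlib.sha256).hexdigest()
    # Take nine decimal digits from the hash to preserve the SSN format
    digits = str(int(digest, 16))[:9].zfill(9)
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

# Deterministic: masking the same value twice yields the same result...
assert mask_ssn("123-45-6789") == mask_ssn("123-45-6789")
# ...but the masked value is not the real value.
assert mask_ssn("123-45-6789") != "123-45-6789"
print(mask_ssn("123-45-6789"))
```

The key design choice is determinism with a secret key: related rows across tables continue to join correctly on the masked values, while the real values cannot be recovered without the key.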
Thursday, August 15, 2019
BMC AMI for DevOps Intelligently Integrates Db2 for z/OS Schema Changes
As mainframe development teams begin to rely on DevOps practices more extensively, the need arises to incorporate Db2 for z/OS database changes. This capability has been lacking until recently, requiring manual intervention by the DBA team to analyze and approve schema changes. This, of course, slows things down, the exact opposite of the desired impact of DevOps. But now BMC has introduced a new solution that brings automated Db2 schema changes to DevOps, namely BMC AMI for DevOps.
BMC AMI for DevOps is designed to integrate into the DevOps tooling that your developers are already using. It integrates with the Jenkins Pipeline tool suite to provide an automated method of receiving, analyzing, and implementing Db2 schema changes as part of an application update.
By integrating with your application orchestration tools AMI for DevOps can capture the necessary database changes required to move from test to production. But it does not just apply these changes; it enforces and ensures best practices using built-in intelligence and automated communication between development and database administration.
The ability to enforce best practices is driven by BMC’s Automated Mainframe Intelligence (AMI), which is policy driven. The AMI capability builds much of the DBA oversight for schema changes into the DevOps pipeline, enforcing database design best practices as you go instead of requiring in-depth manual DBA oversight.
Incorporating a database design advisory capability into the process offloads manual, error-prone tasks to the computer. This integrated automation enables automatic evaluation of Db2 database schema change requests to streamline the DBA approval process and remove the manual processes that inhibit continuous delivery of application functionality.
Furthermore, consider that intelligent database administration functionality can be used to help alleviate the loss of expertise resulting from an aging, retiring workforce. This is a significant challenge for many organizations in the mainframe world.
But let’s not forget the developers. The goal of adopting a DevOps approach on the mainframe is to speed up application development, but at the same time it is important that we do not forgo the safeguards built into mainframe development and operations. So you need a streamlined DevOps process—powered by intelligent automation—in which application developers do not have to wait around for DBA reviews and responses. A self-service model with built-in communication and intelligence such as provided by AMI for DevOps delivers this capability.
The Bottom Line
BMC AMI for DevOps helps you to bring DevOps to the mainframe by integrating Db2 for z/OS schema changes into established and existing DevOps orchestration processes. This means you can use BMC AMI for DevOps to deliver the speed of development required by agile techniques used for modern application delivery, without abandoning the safeguards required by DBAs to assure the accuracy of database changes and the availability and reliability of the production system. And developers gain more self-service capability for Db2 schema changes using a well-defined pipeline process.