Tuesday, June 30, 2020

Consider Cross-Compiling COBOL to Java to Reduce Costs


Most organizations that rely on the mainframe for their mission-critical workload have a considerable number of COBOL programs. COBOL was one of the first business-oriented programming languages, having been introduced in 1959. Designed for business and readily available when the IBM System/360 became popular, COBOL is ubiquitous in most mainframe shops.

Organizations that rely on COBOL need to make sure they continue to support and manage these applications, or they risk interruptions to their business, such as those experienced by the COBOL applications that run state unemployment systems when the COVID-19 pandemic caused a spike in unemployment claims.

Although COBOL continues to work -- and work well -- for many application needs, there are ongoing challenges that will arise for organizations using COBOL. One issue is the lack of skilled COBOL programmers. The average age of a COBOL programmer is in the mid-50s, which means many are close to retirement. What happens when all of these veteran programmers retire?

Another issue is cost containment. As business improves and workloads increase, your monthly mainframe software bill is likely increasing, too. IBM continues to release new pricing models that can help, such as Tailored Fit Pricing, but it is not easy to understand all of the different pricing models, nor is it quick or simple to switch, at least not if you want to understand what you are switching to.

And you can’t really consider reducing cost without also maintaining your existing performance requirements. Sure, we all want to pay less, but we still need to meet our existing service level agreements and our daily batch window deadlines.

Cross-Compiling COBOL to Java

Which brings me to the main point of today’s blog post: have you considered cross-compiling your COBOL applications to Java? Doing so can help to address some of the issues we just discussed, as well as serve as a starting point for your application modernization efforts.


What do I mean by cross-compiling COBOL to Java? Well, the general idea is to refactor the COBOL into high-quality Java using CloudFrame™. CloudFrame is both the company and the product, which is used to migrate business logic written in COBOL into modular Java. This refactoring changes the program structure from COBOL to object-oriented Java without changing its external behavior.
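To give a feel for what that means, here is a purely hypothetical sketch of my own (not CloudFrame's actual output) of the kind of Java that refactored COBOL business logic might become. Imagine a COBOL paragraph that loops over invoice line amounts and accumulates a total; an object-oriented equivalent that preserves the same external behavior could look something like this:

// Hypothetical illustration only -- not CloudFrame's generated code.
// A COBOL paragraph that accumulates an invoice total from line amounts
// might be refactored into a small class that produces the same result.
import java.math.BigDecimal;
import java.util.List;

public class InvoiceTotaler {

    // COBOL packed-decimal amounts map naturally to BigDecimal in Java,
    // which avoids floating-point rounding surprises.
    public BigDecimal totalOf(List<BigDecimal> lineAmounts) {
        BigDecimal total = BigDecimal.ZERO;
        for (BigDecimal amount : lineAmounts) {
            total = total.add(amount);
        }
        return total;
    }

    public static void main(String[] args) {
        InvoiceTotaler totaler = new InvoiceTotaler();
        System.out.println(totaler.totalOf(
                List.of(new BigDecimal("19.99"), new BigDecimal("5.01"))));  // prints 25.00
    }
}

The point is not the specific class design, but that the inputs and outputs -- and therefore the behavior the rest of the system sees -- stay the same.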

After refactoring, there are no platform dependencies, which allows the converted Java workloads to run on any platform without requiring changes to legacy data, batch schedulers, CICS triggers, or Db2 stored procedures.

I can already hear some of you out there saying, “Wait a minute… do you really want me to convert all of my COBOL to Java?” You can, but I’m not really suggesting that you convert it all and leave COBOL behind… at least not immediately.

But first, let’s think about the benefits you can get when you refactor your COBOL into Java. Code that runs on a Java Virtual Machine (JVM) can run on zIIP processors. When programs run on the zIIP, the workload is not charged against the rolling four-hour average or the monthly capacity for your mainframe software bill. So, refactoring some of your COBOL to Java can help to lower your software bill.

Additionally, moving workload to zIIPs frees up your general-purpose processors to accommodate additional capacity. Many mainframe organizations are growing their workloads year after year, requiring them to upgrade their capacity. But if you can offload some of that work to the zIIP, not only can you use the general-purpose capacity that is freed up, but if you need to expand capacity you may be able to do it on zIIPs, which are less expensive to acquire than general-purpose processors.

It's like CloudFrame is bringing cloud economics to the mainframe.

COBOL and Java

CloudFrame refactors batch COBOL workloads to Java without changing data, schedulers, or other infrastructure (e.g., MQ). CloudFrame is fully automated and integrates seamlessly with the change management systems you use on the mainframe. This means that your existing COBOL programmers can maintain the programs in COBOL while the actual workloads run in Java.

Yes, it is possible to use CloudFrame to refactor the COBOL to Java and then maintain and run only the Java. But it is also possible to continue using your existing programmers to maintain the code in COBOL, and then use CloudFrame to refactor it to Java and run the Java. This enables you to keep your existing developers while you embrace modernization in a manageable, progressive way that increases the frequency of tangible business deliverables at a lower risk.

An important consideration for such an approach is the backward compatibility that you can maintain. CloudFrame provides binary-compatible integration with your existing data sources (QSAM, flat files, VSAM, Db2), subsystems, and job schedulers. By maintaining the COBOL and cross-compiling it to Java, you keep your COBOL until you are ready to shift to Java. At any time, you can quickly fall back to your COBOL load module with no data changes. The Java data is identical to the COBOL data except for date and timestamp fields.
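To give a sense of what binary compatibility implies on the Java side, here is a rough, hypothetical sketch of my own (not how CloudFrame actually implements it): a fixed-length COBOL record is typically EBCDIC-encoded and space-padded, so Java code that produces the same bytes must encode text fields with an EBCDIC code page rather than the platform default. The sketch assumes the JDK's extended charsets include IBM1047, which is normally the case.

// Hypothetical sketch: writing a fixed-length, EBCDIC-encoded record from Java
// so the bytes match what a COBOL program writing the same record would produce.
import java.nio.charset.Charset;
import java.util.Arrays;

public class FixedLengthRecordWriter {

    // Assumption: the IBM1047 EBCDIC charset is available in this JDK.
    private static final Charset EBCDIC = Charset.forName("IBM1047");

    // Builds an 80-byte record: a 10-byte customer id followed by a 70-byte
    // name, both space-padded, the way COBOL PIC X fields would be.
    public static byte[] buildRecord(String customerId, String name) {
        byte[] record = new byte[80];
        Arrays.fill(record, " ".getBytes(EBCDIC)[0]);   // EBCDIC space (0x40)
        copyField(customerId, record, 0, 10);
        copyField(name, record, 10, 70);
        return record;
    }

    private static void copyField(String value, byte[] record, int offset, int length) {
        byte[] encoded = value.getBytes(EBCDIC);
        System.arraycopy(encoded, 0, record, offset, Math.min(encoded.length, length));
    }
}

Details like code pages and packed-decimal fields are exactly the kind of thing a conversion tool has to get right for the fallback-to-COBOL option to be safe.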

With this progressive transformation approach, your migration team is in complete control of the granularity and velocity of the migration. It reduces the business risk of an all-or-nothing, lift-and-shift approach because you convert at your own pace without completely eliminating the COBOL code.

Performance is always a consideration with conversions like this, but you can achieve similar performance -- and sometimes even better performance -- as long as you understand your code and refactor wisely. Of course, you are not going to convert all of your COBOL code to Java, only those applications where it makes sense. Considering the cost savings that can be achieved and the types of programs involved, cross-compiling to Java using CloudFrame can be an effective, reasonable, and cost-saving approach to application modernization.

Check out their website at www.cloudframe.com or request more information.

3 comments:

Thangaraj said...

Great explanation and very interesting. I have been working on a large-scale COBOL to Java project. I would like to know how COBOL pointers are handled in the converted Java, and how the crazy redefinitions that are possible in COBOL are handled in Java?

Anonymous said...

It’s 2020; the phones in our pockets have more memory than was available to most mainframe applications in the ’70s, ’80s, and ’90s. Wisely, the inventors of Java recognized that the constraints that created the need for COBOL redefines, or C unions, were no longer constraints, and omitted any similar language feature from Java. Procedure pointers, however, are a long-standing construct in most other languages, and one that COBOL was late to adopt. When rewriting legacy COBOL or C applications in Java, these are examples of the interesting problems to solve by crafting a solution that is intuitive and natural in the Java language, lest the solution carry forward design patterns that are unnatural, unnecessary, confusing, and difficult to maintain in Java.


To address COBOL redefines or C unions, one approach might be to create a unique class for each of the COBOL or C data structures, then encapsulate the replication of state changes among the classes with getter and setter functions. While in principle this is simple, in practice it is quite complex due to the extent to which COBOL redefines are used. The result can also be brittle, unmaintainable code, and one must question what was gained by abandoning COBOL. This is an example of where CloudFrame adds considerable value: by encapsulating the complexity, eliminating dead code and data structures, and generating highly maintainable and performant code. With CloudFrame, Java developers are not required to understand 60 years of COBOL history to maintain a brand-new Java system.

Background: the COBOL REDEFINES clause was critically important 30 to 60 years ago due to physical memory constraints. The C language has a similar construct, known as a union. Both use the same memory locations for different data structures, contextually based upon data values and application logic, though only one data structure is valid at any one instant. For example, a 1K input record may be a header, transaction, or footer record, uniquely identified by byte 10 of a data structure common to all records. Using REDEFINES enables the program to need only 1K of memory to process any of those records.
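For illustration only, a rough, hypothetical sketch of the class-per-structure idea described above (not CloudFrame's generated code) might look like this in Java, with the record-type indicator in byte 10 deciding which class applies:

// Hypothetical sketch of replacing a COBOL REDEFINES record with distinct
// Java classes: the record-type byte selects which view of the raw bytes is
// valid, so each former redefinition becomes its own type.
import java.nio.charset.Charset;

public interface InputRecord {

    static InputRecord parse(byte[] raw, Charset recordCharset) {
        // Byte 10 (zero-based index 9) identifies the record type; decode it
        // with the record's code page (e.g. EBCDIC) before comparing.
        char type = new String(raw, 9, 1, recordCharset).charAt(0);
        switch (type) {
            case 'H': return new HeaderRecord(raw);
            case 'T': return new TransactionRecord(raw);
            case 'F': return new FooterRecord(raw);
            default:  throw new IllegalArgumentException("Unknown record type: " + type);
        }
    }
}

class HeaderRecord implements InputRecord {
    HeaderRecord(byte[] raw) { /* map header fields from the raw bytes */ }
}

class TransactionRecord implements InputRecord {
    TransactionRecord(byte[] raw) { /* map transaction fields from the raw bytes */ }
}

class FooterRecord implements InputRecord {
    FooterRecord(byte[] raw) { /* map footer fields from the raw bytes */ }
}

The record-type values ('H', 'T', 'F') are made up for the example; the point is that each redefinition becomes an explicit, separately maintainable type instead of an overlay of the same memory.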

Craig S. Mullins said...

Thangaraj... the above reply was from the vendor, CloudFrame.
