
Somewhere beneath the gleaming surface of modern fintech apps and cloud-native microservices, a COBOL program written in 1987 is processing your paycheck. Another one, roughly the same vintage, is calculating your insurance premium. A third is settling interbank transactions worth billions of dollars daily. These systems work — which is precisely why replacing them has proven nearly impossible.

Fujitsu’s launch of Application Transform, an AI-powered legacy modernization platform that reads old code and generates design documents (by Fujitsu’s own figures, 97 percent faster, with 60 percent higher quality and 95 percent more thorough coverage than manual analysis), represents the most credible attempt yet to solve a problem that has defeated the software industry for decades. But to understand why this matters, you need to appreciate the sheer scale of the legacy code crisis.

The Trillion-Line Problem

Conservative estimates place the volume of actively running COBOL code at over 800 billion lines worldwide. Add Fortran, PL/I, RPG, and various assembly languages, and the total legacy codebase powering critical global infrastructure likely exceeds two trillion lines. These are not museum pieces. They are load-bearing walls in the architecture of the global economy.

The United States Social Security Administration runs on approximately 60 million lines of COBOL. Major banks process over 95 percent of ATM transactions through COBOL backends. Airlines, hospitals, government agencies, and utility companies all depend on systems whose original developers have largely retired or died. The average age of a COBOL programmer in 2026 is north of 55, and university computer science programs stopped teaching the language decades ago.

This is not a technology problem in the traditional sense. The old systems function correctly. The problem is maintainability: when something needs to change — a new regulatory requirement, a business logic modification, an integration with a modern API — the pool of people who can safely make that change shrinks every year.

Why Traditional Migration Fails

The graveyard of failed legacy modernization projects is vast and expensive. Queensland Health’s payroll replacement in Australia ultimately cost around A$1.25 billion to remediate after a disastrous 2010 go-live. The U.S. Air Force’s Expeditionary Combat Support System consumed $1 billion before its cancellation in 2012. The fundamental pattern repeats: organizations underestimate the complexity embedded in legacy code, the migration takes longer than expected, costs balloon, and the project is either abandoned or delivers a system that is functionally inferior to the one it replaced.

The core difficulty is that legacy code encodes decades of business logic, edge cases, regulatory workarounds, and institutional knowledge that is nowhere documented. The code itself is the documentation. When human developers attempt to rewrite these systems in modern languages, they inevitably miss edge cases that the original system handles correctly — edge cases that only manifest under specific conditions that may occur once a quarter or once a year. The result is a modern system that works perfectly in testing and fails catastrophically in production.

Manual code analysis — the prerequisite to any migration — is where projects stall. A senior developer might analyze 200 lines of complex COBOL per day, understanding the business logic, data flows, and dependencies. At that rate, analyzing a 10-million-line system takes roughly 250 developer-years. The economics simply do not work.
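The back-of-envelope math behind that 250-developer-year figure is worth making explicit. This sketch uses the article’s illustrative rates (200 lines per day, roughly 200 working days per year), not measured data:

```python
# Back-of-envelope estimate of manual analysis effort for a legacy system.
# The rates are the article's illustrative figures, not measurements.
LINES_OF_CODE = 10_000_000       # size of the legacy system
LINES_PER_DAY = 200              # complex COBOL a senior developer can analyze daily
WORKING_DAYS_PER_YEAR = 200      # rough figure after weekends, leave, and overhead

developer_days = LINES_OF_CODE / LINES_PER_DAY            # 50,000 developer-days
developer_years = developer_days / WORKING_DAYS_PER_YEAR  # 250 developer-years

print(f"{developer_days:,.0f} developer-days = {developer_years:.0f} developer-years")
```

Even halving the system size or doubling the analysis rate leaves the effort in the tens of developer-years, which is why the bottleneck is economic rather than technical.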

How AI Changes the Economics

Fujitsu’s Application Transform attacks exactly this bottleneck. By using AI to read legacy code and generate comprehensive design documents — data flow diagrams, business rule extractions, dependency maps, and interface specifications — the platform compresses months of analysis into days. The 97 percent speed improvement is not hyperbole if you compare AI analysis against manual code reading; it reflects the fundamental difference between a system that can process thousands of lines per second and a human who processes hundreds per day.

The 60 percent quality improvement and 95 percent thoroughness gains are arguably more important. Human analysts, constrained by time and cognitive load, inevitably make judgment calls about what to analyze deeply and what to skim. AI analysis does not skim. It traces every code path, maps every variable dependency, and identifies every conditional branch — including the obscure ones that encode those critical edge cases.
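To make the idea of exhaustive dependency mapping concrete, here is a minimal sketch that extracts PERFORM call edges from a fragment of COBOL to build a paragraph-level dependency map. The COBOL fragment and paragraph names are invented; production tools like Application Transform use full parsers, and a regex pass like this is only a first approximation:

```python
import re
from collections import defaultdict

# Invented COBOL fragment for illustration.
COBOL = """
MAIN-LOGIC.
    PERFORM READ-RECORD.
    PERFORM VALIDATE-RECORD.
READ-RECORD.
    PERFORM CHECK-EOF.
VALIDATE-RECORD.
CHECK-EOF.
"""

PARA = re.compile(r"^([\w-]+)\.\s*$")    # unindented paragraph labels
CALL = re.compile(r"PERFORM\s+([\w-]+)")  # PERFORM targets

def dependency_map(source: str) -> dict:
    """Map each paragraph to the paragraphs it PERFORMs, in order."""
    deps = defaultdict(list)
    current = None
    for line in source.splitlines():
        label = PARA.match(line)
        if label:
            current = label.group(1)
            deps.setdefault(current, [])   # record paragraphs with no calls too
        elif current:
            deps[current] += CALL.findall(line)
    return dict(deps)

print(dependency_map(COBOL))
```

A map like this is the raw material for the data flow diagrams and dependency graphs the platform generates, and the point of AI-scale analysis is that it builds the equivalent for every paragraph in a ten-million-line system, not just a sampled subset.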

Fujitsu is not alone in this space. IBM’s Watsonx Code Assistant for Z targets COBOL-to-Java transformation specifically for mainframe environments. Modern Systems offers automated COBOL migration with a focus on preserving business logic fidelity. Micro Focus provides tools for wrapping legacy code in modern APIs without full rewriting — a pragmatic approach that extends the life of existing systems while enabling modern integration.

The Risks of AI-Driven Modernization

Enthusiasm for AI-powered migration must be tempered by honest assessment of the risks. AI models can misinterpret ambiguous code constructs, particularly in legacy languages with non-obvious semantics. COBOL’s PERFORM THRU statement, PL/I’s ON conditions, and various assembly language idioms can confuse even sophisticated language models. The generated design documents are only as reliable as the model’s understanding of the source language.
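The PERFORM THRU hazard is easier to see with a toy model. In COBOL, PERFORM A THRU B executes every paragraph that physically appears between A and B in source order, so inserting a paragraph between them silently changes behavior. The paragraph names and handlers below are invented for illustration:

```python
# Toy model of COBOL's PERFORM A THRU B: execute every paragraph that
# physically sits between the start and end labels, in source order.
def perform_thru(paragraphs, start, end):
    names = [name for name, _ in paragraphs]
    for _, body in paragraphs[names.index(start): names.index(end) + 1]:
        body()

log = []
program = [
    ("INIT",      lambda: log.append("INIT")),
    ("AUDIT-FIX", lambda: log.append("AUDIT-FIX")),  # inserted years later
    ("CLEANUP",   lambda: log.append("CLEANUP")),
]

perform_thru(program, "INIT", "CLEANUP")
print(log)  # AUDIT-FIX runs too, purely because of where it sits in the file
```

A model translating PERFORM INIT THRU CLEANUP into a modern call sequence has to know the physical layout of the source file, not just the call graph; that is exactly the kind of context a naive translation drops.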

There is also the validation problem. If the original code is the only authoritative specification of business behavior, how do you verify that the AI’s interpretation is correct? You need domain experts who understand the business logic — often the same scarce resources you were trying to work around. AI does not eliminate the need for human expertise; it changes where that expertise is applied, shifting it from code reading to output validation.

Testing AI-migrated systems demands extraordinary rigor. Parallel running — operating the old and new systems simultaneously and comparing outputs — remains the gold standard, but it is expensive and time-consuming. Differential testing at scale, using production-like workloads to compare system behavior, offers a more practical path but requires careful test data management.
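The core of differential testing is simple: replay identical inputs through both implementations and flag any divergence. This sketch uses invented premium-calculation functions as stand-ins for the real system interfaces; the off-by-one in the "modern" version mimics the kind of edge case a rewrite introduces:

```python
# Minimal differential-testing harness comparing a legacy implementation
# against its rewrite. Both premium functions are invented stand-ins.
def legacy_premium(age: int) -> float:
    rate = 1.50 if age >= 65 else 1.00   # boundary handled at exactly 65
    return round(100 * rate, 2)

def modern_premium(age: int) -> float:
    rate = 1.50 if age > 65 else 1.00    # subtle off-by-one in the rewrite
    return round(100 * rate, 2)

def diff_test(inputs):
    """Return (input, legacy_output, modern_output) for every divergence."""
    return [(x, legacy_premium(x), modern_premium(x))
            for x in inputs
            if legacy_premium(x) != modern_premium(x)]

mismatches = diff_test(range(18, 100))
print(mismatches)  # only age 65 diverges
```

Note that a random sample of test ages could easily miss 65 entirely; exhaustive or boundary-driven input generation is what makes differential testing effective, and production-like test data is what makes it representative.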

A Practical Path Forward

The most successful legacy modernization strategies in 2026 combine AI analysis with incremental migration. Rather than attempting a big-bang replacement, organizations use AI to map their legacy landscape, identify the highest-risk and highest-value components, and migrate them iteratively. The strangler fig pattern — gradually replacing legacy functionality with modern services while keeping the old system running — becomes dramatically more feasible when AI can accurately map the boundaries between components.
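The strangler fig pattern reduces, at its core, to a routing facade that sends each request to the new service once its capability has been migrated and falls through to the legacy system otherwise. The capability names and handlers here are illustrative:

```python
# Sketch of a strangler-fig routing facade. As components are carved out of
# the legacy system, their operation names move into MIGRATED and traffic
# shifts without the callers changing.
MIGRATED = {"balance-inquiry"}   # grows over the life of the migration

def legacy_handler(op: str, payload: dict) -> str:
    return f"legacy:{op}"        # stand-in for a call into the old system

def modern_handler(op: str, payload: dict) -> str:
    return f"modern:{op}"        # stand-in for the replacement service

def route(op: str, payload: dict) -> str:
    handler = modern_handler if op in MIGRATED else legacy_handler
    return handler(op, payload)

print(route("balance-inquiry", {}))  # served by the new service
print(route("wire-transfer", {}))    # still served by the legacy system
```

The hard part is not the facade but knowing where the seams are, which is precisely what AI-generated dependency maps make cheaper to discover.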

Fujitsu’s Application Transform and its competitors do not solve the legacy code crisis. They make it tractable. For the first time, the economics of understanding and modernizing billion-line legacy systems approach viability. That alone makes this one of the most consequential applications of AI in enterprise technology — not glamorous, not headline-grabbing, but profoundly important to the invisible infrastructure that keeps the world running.

By Michael Sun

Founder and Editor-in-Chief of NovVista. Software engineer with hands-on experience in cloud infrastructure, full-stack development, and DevOps. Writes about AI tools, developer workflows, server architecture, and the practical side of technology. Based in China.
