
Space-Age Dinosaurs


February 1, 2007, by David Gambrill



The drum beat sounding the clarion call for faster, more efficient, increasingly agile technology can be heard all around us. And perhaps no one hears it more acutely than the Canadian insurance industry, where managing data is a critical part of the business. Insurers rely on computer systems to input, store and change all kinds of records during the normal course of everyday business. For example, data might relate to policies and policyholders – including data as simple as the names, addresses or telephone numbers of the insured; the model, year and colour of the insured’s automobile; or the age of the home or the location of the business to be insured. Data also relates to underwriting rules and terms, rating criteria, pricing and conditions. Managing claims and claim-related information requires data. And of course, records must be kept for routine administrative matters such as payroll, billing, invoicing and human resources.

Now that we live in the wired world of the Information Age, demands are omnipresent for insurance companies to prepare and deliver the data they store quickly, efficiently and in a user-friendly way. Brokers, for example, want quick and easy Web access to an insurer’s pricing information so they can produce quotes for insureds in a timely manner. Insurance consumers are demanding the online convenience of getting information on whatever they want, whenever they want. To a degree, the demands of the buying public are driving rapid technological innovation; in turn, this is putting pressure on insurance companies to keep up with the rapid pace of technology. IT systems and components built for insurers just three years ago are now considered ‘dinosaurs’ in a hot IT industry. Insurance company chief information officers are seeing so-called ‘new solutions’ go the way of the Commodore 64 in as little as six months to two years. Enter so-called ‘legacy’ systems, which is how the insurance industry refers to its aging computer systems.

There is some debate within the industry about how old a system has to be before it is considered a ‘legacy’ system. When some insurance IT experts hear the term, they conjure up the image of a decades-old mainframe or mini-mainframe computer. Often these models are described as ‘antiques’ running on rule-based architecture, using code written in programming languages such as RPG and COBOL. Such systems were built to house all available information in one spot. They have been de-bugged a thousand times over, and their architecture is sound, secure and built to last.

Wayne Beck of CGI cautions against defining a ‘legacy system’ strictly in terms of a certain programming language like COBOL. “I don’t think language is the issue,” he says. “It’s basic architecture and design.” Microsoft supports a version of COBOL for .NET, he adds, linking the old language to a relatively new Microsoft product.

A more contemporary description of a ‘legacy’ system is whatever technology an insurance company last used, or a system currently in production. The implication here is that the ‘legacy system’ simply no longer supports the business function for which it was built. One could easily buy a new module for a component-based system – a module that manages data related to just one aspect of the insurance business (claims data, for example, or underwriting) – and have it become an antique in two years’ time. Under this definition, a ‘legacy system’ could be only six months old, assuming it can no longer support its primary business function.

The common element of each of the above definitions is the age of the system’s architecture. This brings up a dilemma for a company’s IT officer in a world of hyper-fast technological evolution: how, and how often, should an insurer change its IT solutions to keep up with the rapid pace of it all?

WHY KEEP LEGACY SYSTEMS?

The fact that technology constantly changes is unquestioned. Nowadays, for example, one hears a lot about Service-Oriented Architecture (SOA) as the latest form of system architecture. And work is progressing on Web-based portals that will allow insurance brokers online access to insurers’ information for real-time pricing and quotes. It is taken for granted that real-time access to information is key to any insurance business; this stands to reason, since the speed at which a company receives a quote on a large piece of commercial business, for example, could make or break a commercial broker’s or insurer’s bottom line. Given the need for insurers to manage and deliver data quickly, do today’s legacy systems – with their limited data fields and occasionally cumbersome means of delivering information to third parties – keep up with clients’ demands for better, faster and more accessible data?
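To make the idea concrete: an SOA-style quote service is, at bottom, a well-defined request/response contract that a broker portal can call in real time. The following Python sketch is purely illustrative – the rating factors and dollar figures are invented, not any insurer’s actual rules – but it shows the shape of such a service.

```python
from dataclasses import dataclass

@dataclass
class QuoteRequest:
    """What a broker portal submits for a real-time auto quote."""
    vehicle_model: str
    vehicle_year: int
    driver_age: int

@dataclass
class QuoteResponse:
    premium: float
    currency: str = "CAD"

BASE_PREMIUM = 800.0  # invented figure, for illustration only

def quote_service(req: QuoteRequest) -> QuoteResponse:
    """A service endpoint: structured request in, priced quote out."""
    premium = BASE_PREMIUM
    if req.vehicle_year < 2000:   # hypothetical older-vehicle factor
        premium *= 1.15
    if req.driver_age < 25:       # hypothetical young-driver surcharge
        premium *= 1.40
    return QuoteResponse(premium=round(premium, 2))

if __name__ == "__main__":
    # In production this would sit behind an HTTP endpoint;
    # here we simply call it directly.
    print(quote_service(QuoteRequest("Civic", 1998, 23)))
```

The point of the exercise is the contract, not the arithmetic: once the request and response are fixed, the broker portal never needs to know what system produces the number.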

More than a few research surveys of U.S. insurers and other financial service representatives say ‘no.’ In June 2006, for example, BearingPoint, a global management and technology consulting firm, surveyed technology executives at global insurance companies. According to the survey results, “most insurance executives were concerned that their companies would be severely constrained by the limitations of inflexible, complex legacy systems and the inability to obtain useful data.” To this, Paul McDonnell, senior vice president at BearingPoint, added: “It’s increasingly clear that the world’s leading property and casualty firms will need to replace their core policy administration infrastructure in order to address the demands of the 21st century.”

BearingPoint is not the only U.S. tech consulting firm to express this opinion. In November 2006, the research company Celent conducted an IT study of global financial service companies (including insurers). “Frequently, financial institutions are running systems that are too obsolete, too slow, and inflexible,” Celent concluded. “Systems like these are impediments to achieving optimum operational efficiency, as well as new product deployment. Slowly, but surely, many financial services firms that often rely on technologies that are nearly 30 years old are realizing the competitive advantage of modernizing their antiquated core systems.”

So how is it, then, that these IT dinosaurs called ‘legacy systems’ still roam the world of Web-based, real-time data transfer?

One simple reason is: they still do the job.

John Krpan, vice president and chief operating officer of RIS-The Applications Support and Maintenance Company, says the older systems of 10 or more years ago were very intricate and didn’t look pretty, but in many cases they still support an insurer’s modern business needs. “We [in the office] were talking … about them being used like little monsters chugging away in the background, with these tentacles that reached out to all parts of the organization,” Krpan says. “Back then, you didn’t buy packages, you just built it kind of like a Frankenstein, but it worked. Wires were hanging out of it and sparks were flying and so on. The purpose they served is they actually get the job done.”

Krpan notes these early systems were built to do everything, including claims adjudication, underwriting rules, payroll and client management. Because these IT “behemoths” formed the backbone of a company’s data management, companies had invested too heavily in them to consider replacement. The prevailing attitude then – and now, to a certain degree – was: “if it ain’t broke, don’t fix it.”

But technology has developed quickly. Soon many insurers faced a scary choice: keep up with technological change, which meant retiring Old Faithful, or lose business because Old Faithful, while fine for internal use, wasn’t keeping up with the pace of innovation around it. Web-based technologies are typically cited as one of the many ways new demands are straining the capabilities of legacy systems.

“If you look at the path that most [insurers] have taken, where they are giving more Web access to brokers, the next logical progression is they’re going to start looking at giving some access to insureds or third parties,” says George Semeczko, the chief technology officer of Royal & SunAlliance. That might mean giving insureds some limited ability to see or amend things on their policy, or to make a claim online. Or it might mean providing data to third parties such as body shops or lawyers. “Those sorts of extensions of your environment through the Web are probably the direction that most people are taking,” Semeczko observes. The issue then becomes: how easily can the Web technology be interfaced with the back-end office processing systems – traditionally the ‘legacy systems’ – and can this be done efficiently?
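The interfacing problem Semeczko describes is typically solved with some form of adapter: a thin translation layer between the Web tier and the legacy back end. A rough Python sketch follows, assuming a hypothetical fixed-width record layout of the sort COBOL programs commonly read; the field positions and the stubbed mainframe call are invented for illustration.

```python
def to_legacy_record(policy_no: str, txn_code: str) -> str:
    """Pack a web request into a fixed-width record (hypothetical layout:
    10-char policy number + 4-char transaction code, space-padded)."""
    return f"{policy_no:<10}{txn_code:<4}"

def from_legacy_record(record: str) -> dict:
    """Unpack the legacy reply into a structure the web tier can use."""
    return {
        "policy_no": record[0:10].strip(),
        "status":    record[10:12].strip(),
        "premium":   int(record[12:21]) / 100,  # amounts often stored in cents
    }

def legacy_backend(record: str) -> str:
    """Stand-in for the real mainframe call (e.g. over a message queue)."""
    policy_no = record[0:10]
    return f"{policy_no}OK000123456"  # canned reply for the sketch

# The web tier never sees fixed-width records -- only the adapter does.
reply = from_legacy_record(legacy_backend(to_legacy_record("AUTO-00042", "INQ")))
print(reply)  # {'policy_no': 'AUTO-00042', 'status': 'OK', 'premium': 1234.56}
```

The efficiency question Semeczko raises lives entirely inside that adapter: every Web transaction pays the cost of the translation plus the round trip to the back end.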

A legacy system, says John Czerwinski, the vice president of business development at EGI Financial Holdings Inc., “is typically a system that is inflexible in terms of effectively taking data from a broker or any other party and allowing communication between a broker system and a company system. The challenge for most insurance companies is that they had this big behemoth that is currently the backbone of their system, but it has very little flexibility in terms of being able to transact business efficiently and effectively with their distribution network.”

Many insurers now stand at a “crossroads,” where they have to make a decision, Czerwinski says: “Do I invest in technology that has the ability to bridge to my current legacy system? Or do I get rid of the current system and go for the ultimate system that will enable me to transact business with my brokers in such a fashion that it will be completely fluid between the two? Quite often, it comes down to capital. How much is it going to cost?”

For some, the cost of a complete “systems renewal” is prohibitive. One Canadian insurance IT officer cited a recently published report in which WestJet invested Cdn$38 million in a wholesale change of its systems, only to pull the plug on the project. Certainly, the fear of losing money is real. But the risks of unsuccessfully performing open-heart surgery on a dying legacy system weigh just as heavily on insurance IT officers.

Czerwinski says one risk is whether the new system, once it’s up and running, will actually do what insurers need it to do. “There’s a major risk when you’re conducting anything in terms of technology,” he says. “Say you go to Future Shop and you buy a computer. The chances are, it’s not going to work as soon as you plug it in when you take it home.” For this reason, Czerwinski says, any large-scale tech project has to be exhaustively planned.

There is also the risk that moving off the ‘legacy system’ will create far more trouble than the change to new data models is worth. “The risk is, we have all of this data in very tried and tested and true data architectures, and we all have these systems that have been de-bugged to the nth degree, so they run and they work all of the time and for the most part the legacy systems are quite cost-efficient to run,” notes David Osmars of AXA Canada Inc.-Group of Companies. “Invariably when you go through a session of systems renewal, you need to significantly update data models.”

Osmars says the reality is that once you give users an increased ability to segment data more specifically, or to manage the data according to more business categories – accessing data on the colour of a car, for example, instead of just the model, or introducing a new underwriting rule – then the users’ expectations will increase accordingly. New data models must therefore be created to meet the users’ elevated expectations. “There’s a lot of risk in doing the conversion from a lot of this old stuff to the new system and the new data models,” Osmars notes. “It’s very time-consuming, it’s very distracting. It requires a terrific amount of testing – both within the tech community and on the user side – to make sure the new algorithms and the new applications run properly, and the new algorithms are calling against the right data elements to work properly….There’s a lot of time, testing, energy, work, de-bugging, all of that stuff needs to be done. The risk is that if you don’t have a very high quality of conversion written, then you have a terrific amount of re-work and fixing when you go into the conversion. To replace a single application, you’re probably looking for most companies an absolute minimum of 18 months – more likely going on three years.”
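Stripped to its essentials, the conversion Osmars describes is a mapping exercise: every old record has to land correctly in the new data model, including fields the old model never captured. A toy Python sketch (all field names invented) hints at why the testing burden is so heavy – every default and every remapping is a decision that must be verified:

```python
# Old model tracked only the vehicle model; the new model also wants colour.
old_records = [
    {"policy": "A-1", "veh_model": "CIVIC"},
    {"policy": "A-2", "veh_model": "F150"},
]

def convert(old: dict) -> dict:
    """Map an old record to the new data model. Fields the old system never
    captured must be defaulted -- and every default is a decision to test."""
    return {
        "policy_no":      old["policy"],
        "vehicle_model":  old["veh_model"].title(),
        "vehicle_colour": "UNKNOWN",  # not in the source data; flagged below
    }

def validate(new: dict) -> list:
    """A conversion run needs reconciliation checks, not just the mapping."""
    problems = []
    if not new["policy_no"]:
        problems.append("missing policy number")
    if new["vehicle_colour"] == "UNKNOWN":
        problems.append(f"{new['policy_no']}: colour needs manual capture")
    return problems

for rec in old_records:
    for issue in validate(convert(rec)):
        print("WARN:", issue)
```

Multiply this by hundreds of fields and millions of policies and Osmars’ 18-month-to-three-year estimate starts to look conservative rather than cautious.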

To minimize these risks, Canada’s insurance tech industry has adopted a methodical, component-based approach to change. In this school of thought, systems are updated incrementally, one business component at a time, as needed. If the company’s policy data system needs an upgrade, for example, then that single system gets replaced, not the company’s entire legacy system. A company might individually buy a new billing system, a new reinsurance system, a claims data system, a broker system, a payroll system and so on. All of these can be bought as individual components and coupled to a main system. The point is, component-based architecture doesn’t necessitate a revamp of the entire system.
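What makes this possible is that the rest of the organization talks to each component through a stable interface, so a replacement can be dropped in behind it without disturbing the callers. A minimal Python sketch (the interface and class names are illustrative, not any vendor’s actual API):

```python
from abc import ABC, abstractmethod

class BillingSystem(ABC):
    """The contract the rest of the company codes against. As long as a
    replacement honours it, nothing upstream has to change."""
    @abstractmethod
    def issue_invoice(self, policy_no: str, amount: float) -> str: ...

class LegacyBilling(BillingSystem):
    def issue_invoice(self, policy_no: str, amount: float) -> str:
        return f"BATCH-{policy_no}"  # e.g. queued for a nightly mainframe run

class NewBilling(BillingSystem):
    def issue_invoice(self, policy_no: str, amount: float) -> str:
        return f"RT-{policy_no}"     # e.g. issued in real time

def month_end_run(billing: BillingSystem) -> None:
    # Calling code depends only on the interface, never the implementation.
    print(billing.issue_invoice("AUTO-00042", 1234.56))

month_end_run(LegacyBilling())  # today
month_end_run(NewBilling())     # after the swap -- no caller changes
```

The design choice is the whole strategy in miniature: the interface outlives any single component, which is what lets the replacement schedule follow business priorities rather than technical ones.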

COMPONENT-BASED CHANGE

Canadian insurance CIOs say the component-based strategy is born partly of necessity rather than design. They note Canada has a smaller tech market than the U.S., and so some technology solutions are simply not available to Canadian insurers. Also, a full-scale systems renewal using imported U.S. technology would not be in the financial interests of most Canadian companies. U.S. tech solutions, for example, are often designed for very large insurance companies – Allstate, say, or Progressive. The largest Canadian company, in comparison, would be equivalent only to a mid-sized U.S. firm. It would be too expensive, therefore, for a Canadian insurer to buy a tech solution priced for a large-scale U.S. firm.

Another issue is that American tech solutions need to be “Canadianized.” This means programming the U.S. solution to reflect Canadian regulatory realities – and Canada’s bilingual English-French reality. As one Canadian insurance CIO noted, American programming is more likely to support bilingual Spanish-English than English-French.
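On the bilingual point, “Canadianizing” a package is partly a localization exercise: user-facing text has to be externalized so English and French versions can be swapped in. A toy Python sketch of the idea (real systems would use proper resource bundles, and the messages here are invented):

```python
# Toy message catalogue -- real systems would use resource bundles or gettext.
MESSAGES = {
    "en": {"renewal_notice": "Your policy is up for renewal."},
    "fr": {"renewal_notice": "Votre police arrive à échéance."},
}

def render(key: str, lang: str) -> str:
    """Look up a message in the requested language, falling back to English."""
    catalogue = MESSAGES.get(lang, MESSAGES["en"])
    return catalogue.get(key, MESSAGES["en"][key])

print(render("renewal_notice", "fr"))  # Votre police arrive à échéance.
print(render("renewal_notice", "es"))  # no Spanish catalogue: falls back to English
```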

Perhaps the single biggest advantage of the component-replacement approach is that it allows Canadian insurers to become more flexible or “agile” when it comes to much-needed updates. “IT needs to be much more agile in the future to respond to those business needs,” says Nicole Brouillard, vice president of IT at Chubb Canada. “For Chubb Canada, especially in terms of legacy systems or the future, agility [means] taking an aggressive or opportunistic approach to replacing those legacy systems – really trying to identify modules or areas of the legacy systems and taking a business opportunity to deliver business value by replacing that module.” She gives the example that if a company “has identified the business need to present on [its] portal an electronic copy of the policy, for example, using a component-based approach, it could carve out a module from its legacy system and take the opportunity to deliver business value while refreshing that technology.” Brouillard goes on to say that business demands, whether they include offering services online, or accelerating service delivery to customers, “are all putting pressure on legacy systems today.”

One effect of taking a component-based approach to systems renewal is that business strategies become central: business strategy will effectively dictate what components will be replaced, and when. Richard Maertens of Gore Mutual Insurance Company believes it is important for a company’s business strategy to drive technological change, rather than the other way around. “I think the key driver is: what business objectives are you trying to accomplish?” says Maertens. “Can your legacy systems accomplish that or not? That’s a key driver behind all of it. Every company has a strategy, and does the system support that? That’s the bottom line.”

Maertens cautions that in the world of technology it is easy to change for change’s sake. He is wary of a company getting too caught up in keeping pace with rapid change and straying too far from overall corporate objectives. “One of the things about a legacy system is that the new system of today is the legacy system of tomorrow,” he cautions. “Technology goes so fast, every six months to two years things are turning over, [and so] you might be behind the eight ball within two years… If you want to take advantage of the latest technologies, your current systems will become a legacy system quite quickly. I don’t think ‘legacy’ is a bad word. Companies have been using them for many years. They do the job.”

KEEPING PACE WITH CHANGE

But assuming insurers opt to maintain their current legacy systems, who, in terms of tech support, will be doing the job of feeding the monsters? According to Osmars, Canadian insurers may be facing a skill set dilemma over the course of the next decade.

“One of the problems with the legacy systems is that the legacy systems are typically written in COBOL – or, if the company is running mini-mainframes like the old AS/400s or iSeries… it’s written in RPG,” Osmars observes. “But if you take those two languages, particularly the COBOL, nobody is trained in university on writing COBOL in Canada. I’m guessing [that] they stopped graduating COBOL programmers 10 or 15 years ago… The thing is, in the marketplace, all your programmers for COBOL are 40-45 or older. We see a problem down the road – and it’s not that far down the road – where we will have a significant problem in hiring the skill sets to replace our retirees. And these are the people that maintain those legacy applications.” Elsewhere, Krpan refers to Osmars’ dilemma as the ‘hostage’ scenario. “These guys are silver-haired now, and in some cases they are holding the company hostage because of the fact that they’re the only ones who know this thing,” Krpan adds.

The potential lack of COBOL experts suggests two possible scenarios awaiting insurers. In the first, a Canadian insurer could see its tech support costs skyrocket: with fewer people available to service older legacy systems, the few remaining COBOL programmers will command a higher price for the required systems maintenance.

The second scenario follows from the first. If tech support expenses for COBOL-trained programmers get to be too high, one option would be to outsource the programming to a country such as India, where COBOL training and use is much more prevalent. The dilemma here is that exporting a Canadian insurer’s data outside the country would, of course, raise all kinds of red flags for Canadian insurance regulators.

Technology vendors might dispute the claim of a small Canadian tech market. “Actually, I would suggest that there are probably more vendors in the Canadian space now than there has been for awhile,” says Beck. He attributes this in part to tech companies following the influx of capital into the Canadian insurance market. As for the skill set dilemma, it may be somewhat overstated, one vendor suggests, noting IBM has set up training courses in COBOL throughout the world.

Aside from debating the theoretical disappearance of COBOL programmers, what else might the future hold for Canadian insurance technology solutions? Some insurance companies are contemplating, if not advocating, a move to a paperless environment. “We are an industry that is inundated with paper and frankly it’s very inefficient,” says Czerwinski. “There are many advantages to being paperless, such as easy access to information. There’s offsite storage that would be eliminated, improved workflow, control, so you can keep an eye on things. And there’s also disaster recovery. A paperless environment is one that would be effective from our standpoint.”

Also look for future improvements in office automation – everything from desktop faxing to network access for field personnel, so they don’t always have to come back into the office to log into the mainframe computer. “Anything that creates efficiencies for the underwriter or claims people sitting at their desks,” Czerwinski says, when asked what the future holds for tech advances in the next five years. “They’re not getting up from their desk to do anything at all, quite frankly.”

And so, as technology continues to race ahead of us all – insurers included – one question will remain for Canadian insurers using legacy systems: what’s the most effective way to build a space-age dinosaur? Referring to Canadian insurers’ component-based approach to change, Osmars predicts the extinction of legacy systems will be gradual, not “cataclysmic.” “I don’t think it’s going to be that a light switch goes off,” he says. “This is winning the war in The Game of 1,000 Slashes.”


