System z in today’s Banking world
By Ken Muckenhaupt, Executive IT Specialist and Financial Services Sector Chief Technology Officer, IBM Systems and Technology Group
Back in 1972, I got my first job as a summer intern in the auditing department at a local bank. The bank’s “data center” consisted of an IBM 1401 Computer, a 1402 Card Reader/Punch, and a 1403 Printer. No disk drives; no tape drives. Programs were executed simply by loading a deck of cards containing the application’s object code, followed by a card deck containing the data. Usually, that data was the previous day’s account reconciliation information, transferred from the paper tape generated by each teller terminal. Bank checks were processed by sorting and loading returned checks (which were punch cards) into the 1402 Card Reader. Then, an auditor manually reconciled a printed list of checks against account holder withdrawal reports.
By the time my third summer as an intern came around, the bank had renovated its data center into a brand-new, glass-enclosed computer room and upgraded its data processing infrastructure to an IBM System/360. The punch cards had been phased out, and paper checks were now issued by the bank. End-of-day general ledger reconciliation and other core banking operations were more automated, and the IT and accounting tasks required less manual intervention.
Still, banking IT was exclusively a back office operation. There were no ATMs, internet banking was still 30 years away, and mobile banking was not even a vision yet. Banks large and small did not face the pressures and risks that modern financial institutions grapple with today. The bank’s front office operations and customer interactions were conducted face to face, and paper transactions were the norm. The people who staffed the IT departments of most banks did not have to worry about the risks of internet fraud, fines for regulatory non-compliance, demands on IT operations availability, or competition from other banks. As long as the account data was accurate, the checks reconciled, and the books balanced at the conclusion of each day, the IT department was doing its job.
Over the years, the data processing industry advanced with the IBM mainframe leading the way. The financial services industry followed IBM’s lead, embracing the mainframe to satisfy its rapidly changing technology requirements. By the late 1990s, with the advent of internet banking, a bank’s IT operations were no longer relegated to the back office. Data processing was now a 24x7x365 global operation. Central banks were connected to their member banks by leased lines as large sums of money were transferred between the mainframes of major financial institutions. The CIO emerged as one of the most critical managers in the executive hierarchy of many banks. Computers had to be reliable, constantly available, and dynamically serviceable if a bank was to survive competitively. The IBM mainframe met these business requirements with first-class hardware and middleware. Banks began to adopt IT strategies to accommodate their growth and expansion. Changes in jurisdictional banking laws permitted banks to cross state lines, and smaller banks became regional banks. Their IT requirements grew as well.
Now, let’s fast-forward to 2014. Modern banking is quite different from when I started my first summer job in 1972. Global banks are still recovering from the financial crisis of 2008, and risk is on the mind of every C-level banking executive. Banking customers have unprecedented access to financial services through mobile devices, and banks face unforeseen competition not only from each other but from non-traditional sources such as payment providers and retailers. CIOs constantly juggle the demands imposed by four categories of risk: operational, criminal, regulatory compliance, and competitive. Each of these categories carries with it the potential of significant revenue loss. The foundation amidst all this turmoil is the mainframe. The world’s major financial institutions still rely on the IBM mainframe to manage critical financial transactions with confidence. In the past 50 years, mainframe operations at financial institutions have grown from simple back office account reconciliations and end-of-day processing to around-the-clock, sub-second, real-time transactions between the bank and a myriad of access devices. No other platform can sustain the transactional demand imposed by internet and mobile banking operations while simultaneously supporting the scalability requirements driven by the influx of new account holders from around the globe.
So, what is the future of mainframe computing for banking? The financial services industry is already embracing the four pillars of IT transformation: Cloud, Analytics, Mobile, and Social Media (CAMS), and the mainframe is ready to respond to these challenges. One thing is certain: each pillar requires banks to utilize their mainframes in ways never before imagined. While mobile banking and social media have opened the doors of the glass house data centers by enabling new paths of access to financial data, they have in turn intensified the mandate for stricter security practices and capabilities. Likewise, banks must incorporate sophisticated analytics into their IT operations to comply with a growing set of regulations, to mitigate organized fraud and money laundering threats, and to develop competitive knowledge of each client. Only the mainframe, with security technologies such as SAF and RACF coupled with advanced analytic technologies such as the IBM DB2 Analytics Accelerator (IDAA), can provide modern banks with the confidence to remain solvent for the next 50 years.
Kenneth is an Executive IT Specialist and Financial Services Sector CTO. He has 35 years’ experience in IBM hardware and software development. His past assignments include microcode development for IBM’s mainframe processors, management of a mainframe microcode development department, a technology leadership assignment for the introduction of object-oriented technology on OS/390, a 3-year assignment in the development and test of IBM’s Component Broker on OS/390, and an 8-year assignment as an e-business on demand consultant at the IBM Design Center and Center for Solution Integration. As FSS CTO, Ken is the thought leader for the integration of IBM hardware and middleware technologies in the development of complex, multi-platform banking and financial market solutions. Currently, he directs a worldwide team of systems architects and IT specialists across IBM’s Systems, Software, and Services organizations, and provides FSS industry consultation to IBM marketing and sales teams.