OpRisk Europe 2011, now in its 13th year, commenced today at the historic Waldorf Hotel in the West End of London. It is somewhat ironic that a risk management conference is taking place in the stylish hotel whose interior is said to have inspired the designers of the “unsinkable” Titanic – a classic case study in risk management.
In one breakout session, Andrew Sheen of the FSA’s risk frameworks and governance team discussed recent developments from the BIS and their impact on operational risk. Citing the Basel Committee’s recently updated paper “Sound Practices for the Management of Operational Risk,” Sheen highlighted several key considerations for boards of directors and senior management teams. In particular, he stressed the need for the board to set the tone at the top in order to promote a strong risk management culture, and for banks to “develop, implement and maintain an operational risk management framework that is fully integrated into the bank’s overall risk management processes.” He also provided guidance for senior management, noting that senior management should:
“Develop for approval by the board a clear, effective and robust governance structure
Be held responsible for implementing policies, processes and systems for managing operational risk that are consistent with risk appetite and tolerance
Implement an approval process for all new products, activities, processes and systems that fully assesses operational risk; and
Regularly monitor operational risk profiles and material exposures to losses”
There’s been a lot of great content on Day One of OpRisk Europe, and we’re looking forward to tomorrow’s panel discussion on “The Impact of New Regulation on Operational Risk Management.”
With over $400b in assets under management and 57,000 employees in 38 countries, Old Mutual is a Fortune Global 500 company (#225) with an operational footprint that spans all seven continents. Now based in London and listed on the FTSE 100, Old Mutual was founded in South Africa in 1845 as the 166-member Mutual Life Association of the Cape of Good Hope.
While steeped in history and tradition, Old Mutual has a progressive approach to risk management which includes a ‘risk governance framework’ based on a ‘three lines of defense’ model:
functions owning and managing risk
functions overseeing the management of risk; and
functions providing independent assurance.
Old Mutual recently adopted OpenPages Operational Risk Management (ORM) to improve its enterprise-wide risk management efforts. OpenPages ORM is being used by numerous global organizations like Old Mutual to manage risk through self-assessments, end-user surveys, automated workflow and executive dashboards that provide management with the visibility, control and decision support required to understand and manage risks throughout the organization.
The name is Kweku Adoboli, and you’ll be hearing a lot more about him. He works – or rather, worked – at UBS, Switzerland’s largest bank. He graduated from the well-regarded University of Nottingham, and moved up at UBS. What did he do? Well, UBS executives say he engaged in unauthorized investment trades, and cost the bank $2 billion! We’re no longer talking in terms of millions, or even hundreds of millions – but billions of dollars – enough to wipe out the bank’s profit for the entire quarter and send its stock price tumbling.
So, presuming the charges prove true, we can add Adoboli’s name to the likes of such infamous “rogue traders” as Jerome Kerviel, who cost Societe Generale about $7 billion; Nicholas Leeson, who lost more than $1 billion, enough to bring down Barings Bank; and Joseph Jett, who reportedly cost Kidder, Peabody $350 million – a substantial sum back in 1994. They may be among the most well known, but there were many others: John Rusnak reportedly cost Allfirst Financial $700 million, Yasuo Hamanaka $2.6 billion at Sumitomo, and Toshihide Iguchi over $1 billion at Daiwa.
Adoboli, a director in exchange-traded funds working on the bank’s Delta One desk, was arrested in London, where he worked. We can expect more information to be forthcoming as to exactly what was done, and how. For now, though, it’s worth noting that, not surprisingly, Adoboli is called a “rogue trader,” indicating that he did this by going outside established protocols – with the unspoken implication that the actions of such a rogue were unavoidable. Well, even without knowing exactly what went wrong, we can surmise that there was something terribly wrong with UBS’s risk management and internal control practices. Certainly the risk of unauthorized trading is well known in the banking industry, and with the amounts of money at stake, one would think sufficient controls would be firmly in place to prevent unauthorized trading, or detect it in a timely manner – certainly well before a loss reached anything in the neighborhood of $2 billion. Isn’t it cost-effective to spend a relatively few dollars to avoid losses in the millions or billions? How difficult is it to ensure the right people, processes and technology are in place?
We can only wonder why adequate controls weren’t there – and when and where the next “rogue trader” will surface, hopefully before serious damage has been done.
Prior to the onset of the Basel II Accord and its resulting loss event category structure, there were no existing or suggested standards for how financial institutions should classify loss events and risks. The reality was that there was no need for a standard, as companies were not particularly focused on tracking loss events and identifying operational risks within a formal structure. As banks were nudged along the operational and enterprise risk management path by the regulators and Basel II, a need for guidance became evident, and the Basel II loss event category structure emerged to meet it. Many financial institutions embraced the new standard and began to implement their programs around it. Although the Basel II category structure was designed largely for the classification of loss events, many institutions have been leveraging the taxonomy as a risk classification structure as well.
As financial institutions gained experience in operational risk management and the implementation of such risk programs within their organizations, they began to question the business alignment and validity of the Basel II loss event category model. Various consortiums, industry associations, consultants, academic researchers, and analysts began to study the structure and poke holes in its loss-type basis, and alternate classification models began to emerge. The RMA joined with RiskBusiness to coordinate an effort with banks to establish standards for Key Risk Indicators, which resulted in a risk classification structure that is gaining popularity. The Operational Riskdata eXchange Association (ORX) formed as a consortium to provide a platform for the exchange of operational loss data, and in due course developed standards and a classification structure for its member financial institutions. We also see the BITS organization looking at loss and risk classification structures, and many articles have been written on the topic.
The article speaks of the importance of organizing data in a sound and clear-cut manner, and concludes that the Basel II loss event category structure falls short, with too much allowance for inconsistency. Dr. Alvarez proposes a classification schema based on causes, as opposed to types of loss events, which leads to a more structured and consistent classification of loss events. I encourage you to read this article, as it represents the current thinking in the industry: the causes of an event are important to identify and understand, and when an organization captures its loss data and views risks from the causal point of view, it is better able to analyze the data and more effectively manage and mitigate risks, thereby lowering operational losses and increasing operating efficiencies.
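To make the event-type versus cause distinction concrete, here is a minimal sketch in Python. The Basel II Level 1 event types are from the Accord, and the four causal categories follow Basel’s definition of operational risk as arising from inadequate or failed processes, people and systems, or external events; the sample events and amounts are invented for illustration.

```python
from dataclasses import dataclass
from enum import Enum

# Basel II Level 1 loss event types (as defined in the Accord).
class EventType(Enum):
    INTERNAL_FRAUD = "Internal Fraud"
    EXTERNAL_FRAUD = "External Fraud"
    EMPLOYMENT_PRACTICES = "Employment Practices and Workplace Safety"
    CLIENTS_PRODUCTS = "Clients, Products and Business Practices"
    PHYSICAL_ASSETS = "Damage to Physical Assets"
    BUSINESS_DISRUPTION = "Business Disruption and System Failures"
    EXECUTION_DELIVERY = "Execution, Delivery and Process Management"

# Causal categories, following Basel's definition of operational risk as
# arising from inadequate or failed processes, people and systems, or
# from external events.
class Cause(Enum):
    PEOPLE = "People"
    PROCESS = "Process"
    SYSTEMS = "Systems"
    EXTERNAL = "External"

@dataclass
class LossEvent:
    description: str
    amount: float          # loss amount in dollars (illustrative)
    event_type: EventType  # what happened (Basel II event-type taxonomy)
    cause: Cause           # why it happened (causal taxonomy)

# Invented sample events: the same data can be rolled up either way.
events = [
    LossEvent("Unauthorized trading loss", 2.0e9,
              EventType.INTERNAL_FRAUD, Cause.PEOPLE),
    LossEvent("Settlement failure after a system outage", 1.5e6,
              EventType.EXECUTION_DELIVERY, Cause.SYSTEMS),
]

# Aggregating by cause rather than by event type supports the kind of
# causal analysis the article argues for.
by_cause: dict = {}
for e in events:
    by_cause[e.cause] = by_cause.get(e.cause, 0.0) + e.amount
print(by_cause)
```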
There will likely be more debate and thought put into loss event and risk taxonomies over the next few years, and the industry’s need for an effective and consistent standard that could enable benchmarking of operational risk will help drive convergence to a widely accepted loss event and risk data classification schema.
Some months ago I came across an article co-authored by a colleague of mine on enterprise risk management. It’s aimed at boards of directors, providing needed insight into difficulties companies have experienced implementing ERM, and puts forth principles for its effective use.
What I found particularly interesting is reference to principles outlined by “legendary management thinker” Peter F. Drucker, and the authors’ description of how those principles can be applied to ERM:
There’s no indication in the article that Peter Drucker ever spoke to ERM specifically, and to my knowledge he never did (if any readers know otherwise, please let us know). I had the great pleasure of knowing and spending some quality time with Mr. Drucker when, after my stint at the Wharton School, I was doing graduate work at the NYU Graduate School of Business, where I was fortunate to have him as a professor in an advanced management seminar. It was evident to me even then, as a still wet-behind-the-ears student, that Peter Drucker was someone extraordinarily special. He had an amazing ability to identify and articulate valuable truths about business which, while obvious after he spoke them, were previously hidden from everyone else’s view.
With that said, I’d like to take the liberty of guessing what Peter Drucker, if he were still with us, might put forth as simple truths about enterprise risk management:
Forget “risk assessments” – they have little to do with ERM
ERM must be embedded throughout the entirety of an enterprise
ERM isn’t done by a staff function – it must be incorporated into the soul of every manager in the company
It must encompass clear responsibilities and accountability, with open and rapid communication up and down the organization
And it needs to become an integral part of daily business, enhancing judgments and decision-making at every level – it’s not an add-on, but rather how business is conducted throughout the organization
Mr. Drucker, if somehow you’re listening, I hope you’re smiling at what you hear.
Financial services firms, pharmaceutical companies and other heavily regulated organizations have long devoted significant resources to a compliance office, typically with a chief compliance officer and strong support staff. Multinationals have embedded part of the compliance function locally, typically with reporting to both the central compliance office and local management. But companies not facing heavy regulation, even large ones, have struggled to decide whether a full-time compliance office is needed.
Well, now there are clear indications that a full-time role is becoming more common. Compliance Week recently reported on two studies saying just that. One is from the Open Compliance and Ethics Group (OCEG), whose survey shows 75% of the 365 respondents have a chief ethics and compliance officer or similar title with “top-level oversight of compliance.” And 40% said the compliance chief has no other role in their company; for companies with over $1 billion in revenue, the number is 55%. Where the title is shared, it’s with the company’s legal department 23% of the time. The other survey was conducted by the Society of Corporate Compliance & Ethics, showing that of 560 respondents, 97% have a designated compliance or ethics officer, with 36% having no other title. Of those with another role in the company, 20% share responsibilities with the legal department. As with the OCEG study, other shared roles range from the chief audit executive to the CFO and head of human resources, among others.
Also telling about the relative importance of the compliance officer role are the reporting relationships. The SCCE study, for instance, shows the chief compliance officer reporting directly to the CEO in 55% of the organizations. And the compliance officer reports to the board of directors or a board committee, both in writing and face-to-face, in 80% of the companies. With a more senior role comes higher pay. The OCEG study shows the most common level of compensation (36%) is between $150,000 and $250,000, with 20% reporting pay at $350,000 and above, not counting bonuses, stock options or other forms of pay. As we might expect, pay in larger companies is at the higher end: among companies with more than $1 billion in revenue, 23% report total compensation of $450,000 or higher.
Certainly, if you’re directly or tangentially involved with compliance, these numbers probably aren’t surprising. With the regulatory spotlight shining brightly, and with pressure both to keep costs from soaring out of control and to enhance compliance program effectiveness, companies are looking to strengthen the role of their chief compliance officer.
We know the banks and related mortgage service organizations have been under fire for their role in the financial system’s near meltdown and the ensuing foreclosure fiasco. JPMorgan Chase’s CEO Jamie Dimon reportedly owned up to taking some responsibility, saying “Some of the mistakes were egregious, and they’re embarrassing . . . but we made a mistake, and we’re going to pay for that mistake.” The 50 state attorneys general and the SEC, among others, are pushing for changes in how the banks and servicers operate, and there’s little doubt changes are coming.
In the interim, a report emanating from investigations by the Office of Comptroller of the Currency, Federal Reserve Board, Office of Thrift Supervision, and Federal Deposit Insurance Corporation, is expected to form a basis for a settlement where the financial institutions would make fundamental changes in operations and controls. The banks and other servicers would, for instance, have to:
Set up a single contact point within the organization, enabling homeowners to avoid what’s often a maze of different departments
Take steps to ensure there will be no action to foreclose while borrowers are pursuing loan modifications
Improve training of staff handling foreclosures
Establish more layers of management oversight over the process
Engage an independent consultant to review foreclosures over the past two years, and compensate homeowners who were treated improperly.
One wonders why adequate business process design and the basics of internal control weren’t in place long ago, even if the volume of foreclosures wasn’t anticipated. The sloppiness has caused tremendous problems for the banks and servicers on the one hand and their customers on the other – and executives should know by now that if a large swath of consumers is damaged, laws and regulations will surely follow.
This of course is not the end for the banks and servicers – not by a long shot. They still need to deal with the state attorneys general and other regulators, and we can expect more required changes to be forthcoming, along with large financial payments for past misdeeds. Oh, if only the risks had been identified earlier and better managed, with appropriately designed business processes, and basic and supervisory controls and compliance in place.
Two recent events involving hurricanes provide insight into what risk management is about. Many of us who live on the East Coast of the U.S. know all too well the damage wrought by Irene. And many in Florida are dealing with damage to the University of Miami “Hurricanes” football team.
Let’s begin with Miami, where student athletes are said to have taken gifts from a fan – against NCAA rules. The University has already suspended a number of players. But what could be coming is worse, when the NCAA completes its investigation and decides on such sanctions as loss of scholarships, eligibility for bowl games, and the like. The impact on the football team, and indeed the University, is seen by some as potentially devastating. Miami’s President seems to be taking an appropriate course in saying the University will take action to be sure this kind of thing doesn’t happen again. Kind of sounds like what many senior business executives say when they suffer a major mistake. But wait a minute – haven’t many, many other university football programs suffered the same kind of misconduct and paid a very high price? Since the answer is a resounding “yes,” why wouldn’t a university like Miami, which treasures its football program, have long ago recognized the risks and taken action to prevent, or detect early on, any such misconduct?
As for Hurricane Irene, let’s take a look at the plight of homeowners. Certainly those residing in the Carolinas know well the paths of past hurricanes. And while the Northeast has fewer, it is by no means unfamiliar with hurricanes, nor’easters, and the like. Whether or not they’re in some level of denial, people residing in flood zones aren’t ignorant of the risks, and others are aware of the possibility of wind damage, loss of power and the like. Certainly storms can’t be prevented, but their impact can be mitigated through storm shutters or plywood boards, generators, and insurance coverage, among other actions. Yes, there’s a cost-benefit relationship, but the other side is the cost of being emotionally and financially devastated. As we see the news coverage our hearts go out to those who have suffered, and we recognize that some simply can’t afford even basic protections. But we can wonder whether sufficient advance thought was given to managing the risks.
A key learning point from this is that risk management can be viewed as having several “tiers”: identifying what has not yet occurred but could occur, seeing what has happened to others, and knowing what harm has already hit home. The last two tiers are by far the easiest to recognize and analyze in terms of potential impact, while the first takes more thought and analysis though still cannot be ignored. In the cases of Irene and Miami, these events clearly have occurred previously, and the inherent risks were well known and needed to be managed. The same holds true for businesses looking to survive and prosper in a dangerous economic and competitive environment. It’s well known that supply chains can be interrupted, product quality compromised, IT systems hacked, and company personnel can do bad things. In all likelihood, risks have materialized in one’s own company or at a competitor, and are well known and can be managed cost-effectively. It takes identification and analysis, along with the right tools and technology to ensure appropriate attention, accountability and communication – all critical to making better business decisions.
My sense is that as a reader of this blog, you already have a good handle on what’s involved here. But hopefully it will prove useful if you’re striving to influence and convince others in your organizations of what risk management is about, and why it needs to be taken seriously.
My last posting spoke to one of COSO’s two recently issued guidance reports on enterprise risk management. The first provides approaches for getting started on an ERM initiative, and while it’s based on good intentions and provides useful information, especially to smaller companies, in Olympic games terms with only two entrants, that report gets the silver. The second report, Developing Key Risk Indicators to Strengthen Enterprise Risk Management – How Key Risk Indicators Can Sharpen Focus on Emerging Risk wins the gold – by a good margin.
The Application Techniques volume of COSO’s ERM report touches on the topic of key risk indicators, use of which was not commonplace at the time. Since then, alongside key performance indicators, which focus primarily on past performance, more organizations have incorporated forward-looking key risk indicators into their ERM processes, further enhancing risk management effectiveness. This new report does a good job of explaining KRIs and how they can be of benefit. A couple of simple examples:
For customer credit, where a common KPI includes data about customer delinquencies and write-offs, KRIs can be developed to help anticipate future collection issues. For example, analyzing the reported financial results of a company’s 25 largest customers, or general collection challenges throughout the industry, can reveal emerging trends among customers that could signal future collection challenges.
Management of a chain of family-style restaurants sought to avoid a negative earnings event that could arise from unexpected market conditions. Recognizing that restaurant traffic is directly affected by customers’ discretionary income – as discretionary income levels fall off, customers are less likely to dine out – management establishes average gasoline prices paid at the pump as a KRI. This is based on the premise that when gasoline prices rise, discretionary income for the individuals and families representing the core customer base decreases, and customer traffic begins to drop.
As such, KRIs enable management to take quicker action in dealing with risks. In the latter example, management is positioned to adjust marketing and promotion events to reduce the impact of the risk.
The report explains how KRIs are most effective when closest to the ultimate root cause of the risk event, providing more time for management to act proactively. And multiple KRIs can provide still more relevant information, keeping in mind that a close relationship between the KRI and the risk, and the accuracy of the information used, are both critical. Another benefit is the ability to readily track trend lines with dashboards or exception reports, quickly and easily communicating where action may be needed.
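As a rough illustration, here is a minimal sketch of a threshold-based KRI monitor built on the gasoline-price example above. The threshold values and readings are invented; they are not drawn from the COSO report.

```python
# A threshold-based KRI monitor for the gasoline-price indicator above.
# Threshold values and readings are invented for illustration.
GREEN_MAX = 3.00  # average price per gallon: no action needed below this
AMBER_MAX = 3.75  # escalation threshold: "red" above this

def kri_status(avg_pump_price: float) -> str:
    if avg_pump_price <= GREEN_MAX:
        return "green"  # within tolerance
    if avg_pump_price <= AMBER_MAX:
        return "amber"  # emerging risk: adjust marketing and promotions
    return "red"        # escalate to management

# Track the trend over recent periods and flag a sustained deterioration.
readings = [2.85, 3.10, 3.40, 3.60]
statuses = [kri_status(p) for p in readings]
if "red" in statuses or statuses[-2:] == ["amber", "amber"]:
    print("KRI exception: gasoline prices trending against the core "
          "customer base:", statuses)
```

The same pattern generalizes to any leading indicator: the closer the metric sits to the root cause, the earlier the status flips and the more time management has to respond.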
With KRIs continuing to gain recognition as important elements of enterprise risk management, this COSO report provides readily usable information and is definitely worth the read.
As you may know, the Dodd-Frank Act gave institutional investors and shareholder activists perhaps the item highest on their wish list – ready access to the proxy statement, with the ability to name their own director nominees. And the SEC developed enabling rules to make it happen. Well, the U.S. Court of Appeals for the D.C. Circuit just pulled the rule out from under shareholders. If you’re a shareholder activist, you’re probably outraged, but if you’re a board member or member of the senior management team, you’re likely breathing a sigh of relief!
The suit was brought by the Business Roundtable and the U.S. Chamber of Commerce, and many thought it didn’t have much chance of succeeding. But succeed it did. The court ruled the SEC “acted arbitrarily and capriciously” in failing to adequately consider the rule’s effect on “efficiency, competition and capital formation.” In its unanimous decision, the court added that the SEC “inconsistently and opportunistically framed the costs and benefits of the rule; failed adequately to quantify the certain costs or to explain why those costs could not be quantified; neglected to support its predictive judgments; contradicted itself; and failed to respond to substantial problems raised by commenters.”
And this isn’t the first time the Court has shot down an SEC rule – it’s happened several times in the last few years, also on the basis that the SEC didn’t properly assess the economic effects. So, where does the Commission go from here? Since this decision was issued by a panel of the Court, the SEC could ask the entire Court to review the case, or appeal to the U.S. Supreme Court. Or, it might want to conduct a more in-depth economic assessment of the rule to satisfy the Court, or come up with another rule. As the U.S. Chamber calls its victory “a big win for America’s job creators and investors,” the SEC is “reviewing the decision and considering our options.”
For what it’s worth, my view is that direct shareholder nominating of directors can be counterproductive. While seemingly supported by the concept of a democratic process, putting dissident or one-issue directors on the board, which might have occurred, would normally not serve a board, the company or its shareholders well. While the SEC’s rule seemed reasonable in terms of effecting the law’s mandate, perhaps the SEC can come up with something better.
Chief audit executives do a lot of things really well, adding value to the companies they serve. What is especially interesting is how well many, especially CAEs of larger companies, gain information and insight through networking. Many are involved with their peers in industry or geographically based discussion groups, sharing through blogs, conferences, and internet-based information exchanges. And of course there’s still the opportunity to communicate via email or text or pick up the phone to talk with a valued colleague.
I’m a member of one internet-based group – though I tend to read rather than write – and am struck by several themes that are the subject of intense discussion and debate. Among them is the extent to which internal audit can and should become more actively involved in their company’s “governance” activities, however the term is defined. There’s an emerging consensus that yes, they should, and that with their insights and skill sets they can add significant value, with an eye toward moving up the organization scale from process to senior management’s and the board’s activities. Another topic is the transition from providing risk assurance to performing more consultative services. The debate is heated, recognizing that IIA Standards speak to and enable both, with strong views expressed regarding the opportunities to add value while keeping in mind the need to maintain independence and objectivity. A related subject under discussion involves opportunities for internal audit personnel to move within their companies to other staff or operating units, into any number of management positions. There’s recognition of the benefits to the internal audit function’s recruiting, development and ability to add value, though caveats are expressed and concerns exist regarding retaining objectivity.
Relevant here is the IIA Research Foundation’s 2010 Common Body of Knowledge Global Internal Audit Survey, called the “most comprehensive global study conducted on the practice of internal auditing.” Of particular interest is where practitioners focus attention now versus where they see internal audit five years from now. The study shows that while current attention is centered on operational and compliance audits, auditing financial risks, fraud investigations and internal control evaluations, the focus will shift. Going forward, internal audit is expected to look more closely at corporate governance, enterprise risk management, linkage of strategy and corporate performance, ethics, migration to IFRS, social and sustainability issues, and disaster recovery testing and support. Other topics are mentioned, so readers might want to take a look at the report.
I marvel at the internal auditor networks, where practitioners are benefiting from the exchange of information and thought. If you’re not already involved in one, you might consider looking into how you can do so.
The sea of blue suits at the OpRisk North America conference being held in New York City this week provides a stark contrast to the cold rain falling in Times Square. The conference kicked off with a keynote address from Mitsutoshi Adachi, director and deputy division chief at the Bank of Japan. Mr. Adachi, who also serves as chair of the SIG Operational Risk Subgroup for the Basel Committee, noted that his travel plans had to be moved up a few days in order to account for the continued travel delays out of Japan.
His keynote highlighted a recent report published by the Basel Committee on Banking Supervision titled “Operational Risk – Supervisory Guidelines for the Advanced Measurement Approaches,” which found that “Operational risk capital for non-AMA banks is higher than for AMA banks, regardless of the exposure indicator used for scaling.” Mr. Adachi also noted in his address that AMA firms showed “only modest increases in losses during the financial crisis period.” Certainly not an unexpected result, but what was telling was his finding that for the period of 2008 to 2009 (during the financial crisis), operational risk losses for all banks were two to three times those of the previous Basel Committee internal loss data collection period of 2005–2007. Mr. Adachi declined to field a question on which business lines contributed the most to the losses, per his obligation to keep such information confidential.
He concluded by saying that “the Basel Committee finds it even more important to engage with the industry” moving forward. Looking forward to the Plenary address “Reforming U.S. financial markets: reflections before and beyond Dodd-Frank.”
It’s well known that a company’s tone at the top is critically important in determining its culture, including whether or not it will act with integrity and ethical values – fundamental elements of effective internal control and risk management. And we know it’s not only the words spoken at the top, but also the CEO’s actions, that drive culture. What brings this to mind is the recent conviction of the CEO of fraud detection firm Fraud Discovery Institute. A conviction of the head of this type of firm might appear unusual but not otherwise noteworthy; what’s truly compelling about this news is that the CEO is none other than Barry Minkow.
If you were following internal control, risk management and fraud back in the late 1980s, you’ll likely remember the well-publicized fraud carried out by Minkow when he led ZZZZ Best Co. Reportedly he started the business at age 16, and took it public with a value exceeding $200 million. But it turns out he cooked the books and falsified documents to support the fraudulent financial statements. Having been found out, he was convicted and sentenced to a 25-year prison term, ultimately serving a bit more than seven years. After leaving prison, he started the Fraud Discovery Institute in San Diego to uncover corporate fraud for clients, and took on a role as pastor of a community church. Why would anyone hire his newly formed firm? Well, certainly Minkow could be termed an expert in how to commit fraud, and thus how to prevent it, and having paid his dues to society it’s understandable that he was given the benefit of the doubt in seeking redemption and starting a new and productive life.
It would be nice if this story had a happy ending, but it turns out that in his new firm Minkow reverted to his old ways. Prosecutors claimed that Minkow “made false and misleading statements about Miami homebuilder Lennar Corp.’s financial condition to drive down the company’s share price [and] abused his relationship with federal law enforcement agents to get non-public information about Lennar and traded on that information.” And the 45-year-old Minkow was sentenced in federal court to a five-year prison term.
One could say “once a crook, always a crook,” but that would be unfair. People do bad things and then turn to the straight and narrow, and go on to do good deeds in their lives. Nonetheless, when it comes to leading a business, it’s not three strikes and you’re out, but two – or more likely one. The tone at the top and the actions of a CEO are too important to entrust to anyone without a track record not only of skill and performance, but also of acting with integrity and ethical values.
Unless you’ve escaped to a remote island with no communication capability, you know about the serious issues facing banks and mortgage generators and service companies surrounding the foreclosure fiasco. For background, you might want to refer back to my October 15 blog which outlines some of the problems stemming from shortcomings in risk management and related internal control.
Well, the lawsuits have begun, with tens of billions of dollars at stake. State courts have already issued rulings: the Supreme Judicial Court of Massachusetts, the state’s highest court, decided that two major banks didn’t have the appropriate documentation when they foreclosed, and returned the properties to the borrowers. New York State’s chief judge, noting “it’s such an uneven playing field [where] banks wind up with the property and the homeowner winds up over the cliff [not serving] anyone’s interest, including the banks,” set forth procedures to ensure all homeowners facing foreclosure have legal representation. The impact in human terms is illustrated by recent reports of how two large banks overcharged 4,000 active-duty service personnel, reportedly failing to follow the Servicemembers Civil Relief Act, which allows mortgage rate reductions and outlaws foreclosures. More lawsuits are on the way, led by a former prosecutor driving a class action.
Not only might other states become more proactive, but no fewer than three federal government agencies have begun investigations – the Department of Justice’s Executive Office for U.S. Trustees, the Federal Housing Administration, and the Federal Reserve. And none of this has been lost on a coalition of all 50 state attorneys general, which recently presented the five largest banks with a set of game-changing demands. Reports say these include:
Prohibiting the start of foreclosure proceedings while a borrower is actively seeking a loan modification
Requiring that a borrower who makes three payments under a temporary loan modification agreement be granted a permanent modification
Subjecting modification turn-downs to automatic review by an ombudsman or independent review panel
Establishing compensation programs that reward employees for pursuing loan modifications rather than foreclosures
Curtailing late fees
Compensating borrowers from a pre-established fund, and reducing mortgage balances, where banks engage in misconduct.
While some analysts say these changes would drag out the foreclosure process and delay stabilization of the housing market, the attorneys general plan is reportedly supported by the newly formed Consumer Financial Protection Bureau, along with the Departments of Treasury, Justice, and Housing and Urban Development, and the Federal Trade Commission.
We continue to wonder how major banks dealt with the basics of risk identification and analysis – the risk that reliable documents would be needed in the foreclosure process – and establishing control activities to ensure document processing was accurate and complete, with files intact and readily accessible when needed, and accountability in carrying out control procedures. And we can wonder about due diligence in selecting and using outsourcing firms.
Do risk management and related internal control matter? Unfortunately, learning the answer too late may cost financial institutions billions of dollars.
Several months ago I had the pleasure of presenting with Richard Brilliant, Carnival’s vice president and chief audit executive of Audit Services, in a Compliance Week webinar titled “Leveraging the Power of Integrated Risk Management”. Richard began his presentation by asking a very telling question: “Who specifically is best suited to manage risk in your organization?” The answer of course was “Everyone”. After all, enterprise risk management is about managing risks across multiple risk and compliance disciplines as well as across multiple business units. In other words, ERM requires everyone’s participation to be truly effective, and risk awareness and expertise must be instilled at all levels of the organization.
Coming in at #4 on the 2010 GRC Wish List, Risk Expertise is something that needs to start at the top. Risk expertise is a skill set that boards are looking for in their executive teams, and it could potentially find its way into regulatory reform this year.
Sponsored by the UK government and published this past fall, the Walker Review recommends overhauling the boards of banks and other big financial institutions by requiring the Chief Risk Officer to have a reporting line to the risk committee, in addition to strengthening the role of non-executives and giving them new responsibilities to monitor risk and remuneration.
Some of the specific recommendations in the Walker Review include:
Banks should have board-level risk committees chaired by non-executives
Risk committees to scrutinise and if necessary block big transactions
Chief Risk Officer to have reporting line to risk committee
Chief Risk Officer can only be sacked with agreement of board
It is clear that risk management will be under increasing scrutiny in the UK (and across the globe), and that risk expertise will be increasingly important in 2010.
It’s become clear that a risk-aware corporate culture is of critical importance to an organization. In the past year alone, we’ve seen plenty of examples in the news where the lack of a risk-aware corporate culture has hurt companies, some beyond repair. Coming in at #3 on the 2010 GRC Wish List is a “Robust Organizational Risk Culture”.
While it is critical to be thoughtful, disciplined, and strategic in your approach, it’s also important to understand how technology can promote a risk-aware culture and become a tool to embed effective integrated compliance and risk management practices within an organization. It can act as a training and awareness tool, a marketing tool, and can help build accountability and push policies and processes into daily activities.
Does your organizational culture reinforce your strategy and risk appetite, or undermine it? PricewaterhouseCoopers has developed a “Risk Culture Self Assessment” that will help you understand where your organization stands in terms of how it manages risk. They also published a five-step guide titled “Building a risk-aware culture for success.”
We recently had an interesting discussion on what GRC professionals are hoping to achieve in 2010. We had so much fun we decided to publish a 2010 wish list for risk and compliance managers. The list is based on conversations we had with our customers, prospects and industry experts over the past several months.
Why are there 10? Well, as George Carlin mused in his skit about Moses and The Ten Commandments, “because 10 sounds official. Ten sounds important! Ten is the basis for the decimal system, it’s a decade, it’s a psychologically satisfying number (the top ten, the ten most wanted, the ten best dressed). So having ten commandments was really a marketing decision!”
All kidding aside, we’d love to get your reaction to our list and see if we left anything out. We’ll drill down into more detail for each one over the next ten days! Here’s the list:
It should be news to no one that global companies today are struggling with an increasing regulatory onslaught. And as we’ve seen with Dodd-Frank, it’s clear that we can expect continued landmark legislation globally to address the risk management failures of the financial crisis. Chris McClean of Forrester Research recently commented that there are nearly 200 regulatory changes still on the US federal agenda across finance, healthcare and consumer protection. Beyond congressional action, we’ve also seen current regulators cracking down under their existing mandates. The question that many OpenPages customers are addressing today is: how can organizations prioritize and cope with such a large number of regulatory changes, and how can they prepare for upcoming rulemaking? Many companies are turning to policy management software to establish regulatory change management, regulator interaction management and policy lifecycle management.
Policies establish the culture, values, ethics, and duties of the corporation. Organizations that take an ad hoc approach to managing and communicating policies face significant risk to their business. The key to effective compliance and policy management is having a formalized, efficient mechanism for communicating changes to regulations and managing the internal regulatory change process so the business can react quickly – particularly in times like these, when the regulatory environment is complex and changing frequently. It is also important to manage the interactions, communication and internal work associated with external regulators, such as inquiries, submissions, filings, exams and audits. Today, this tends to be a very time-consuming, manual process for most companies.
To learn more about implementing an effective compliance and policy lifecycle management program, check out a recent webinar we conducted with Michael Rasmussen, president of Corporate Integrity LLC.
Leading research and analysis provider Chartis Research recently released the 2009 RiskTech100™ report – a comprehensive study of the top technology firms active in the risk management market.
Based on assessment criteria including functionality, core technology, organizational strength, customer satisfaction, market presence and innovation, Chartis named OpenPages the Category Winner in Operational Risk and GRC solutions. Given that Chartis surveyed hundreds of operational risk vendors, this is a real testament to OpenPages’ commitment and success in delivering integrated risk management solutions.
The study included a survey which found that “66% of respondents expect to increase their risk technology expenditure by 10% or more in 2010” and that users are moving from a siloed approach toward an integrated risk management approach.
We had the opportunity to host a panel on operational risk at GARP this week in New York. The panel, “Using Operational Risk Management to Gain Competitive Edge,” included moderator Christopher Donohue, Managing Director, Research and Educational Programs (GARP), and panelists Marcelo Cruz, Global Head of Operational Risk Management and Metrics, Morgan Stanley; Patrick McDermott, Senior Director, Enterprise Operational Risk, Freddie Mac; and Mairtin Brady, Head of Operational Risk Management, TIAA-CREF; as well as me, Gordon Burnes.
At the beginning of the panel, McDermott outlined the basic set of questions that operational risk managers have to answer:
- What can go wrong?
- How bad can it get?
- How likely is it to happen?
- What are we going to do about it?
This is a great way to frame the essence of an operational risk manager’s job, and those new to the discipline will do well to make sure their programs cover these fundamental questions.
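One way to see how those four questions map onto day-to-day practice is a simple risk register. The sketch below is a hypothetical illustration with invented entries and 1–5 scales; it is not from the panel.

```python
# A hypothetical risk register mapping the four questions to fields:
# what can go wrong (what), how bad (impact), how likely (likelihood),
# and what we are going to do about it (response). Scales are 1-5.
risks = [
    {"what": "Unauthorized trading", "impact": 5, "likelihood": 2,
     "response": "Independent trade confirmation"},
    {"what": "Data center outage", "impact": 4, "likelihood": 3,
     "response": "Failover site and disaster recovery testing"},
    {"what": "Key vendor failure", "impact": 3, "likelihood": 3,
     "response": "Dual sourcing"},
]

# Rank by a simple likelihood x impact score to focus mitigation effort.
for r in sorted(risks, key=lambda r: r["impact"] * r["likelihood"],
                reverse=True):
    score = r["impact"] * r["likelihood"]
    print(f'{score:>2}  {r["what"]}: {r["response"]}')
```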
This was an interesting panel in that each panelist represented a different perspective on managing operational risk programs. The starkest contrasts were between Cruz, representing the quants, and McDermott, representing the value and importance of qualitative information. Cruz took particular issue with scenario analysis but did acknowledge the limitations of models as expressed in confidence levels. It’s clear that there’s a wide range of practice in the industry on this topic, with some banks relying heavily on scenarios to model their capital, others relying more on internal data.
All panelists agreed that the operational risk function is in the ascendancy and is increasingly being brought to the table to weigh in on strategic matters, such as acquisitions or new product launches. One of the key takeaways was that operational risk information can help businesses better define their risk profile, allowing business managers to make better decisions about where to invest and where to focus mitigation efforts.
The subprime mortgage crisis has sparked a lot of discussion about risk management and, specifically, whether banks that suffered huge losses did so as a result of failures in the risk management function or in business management in general. The general business management failures occurred in situations where the risk management function identified unacceptable risks but the business managers in charge of risk mitigation opted not to mitigate them.
This failure to exercise good business judgement in spite of warnings from the risk management function is exactly what the CEO of Freddie Mac, Richard F. Syron, is being criticized for in an article in today’s New York Times. Reporters Charles Duhigg and Eric Dash interviewed former executives and others associated with Freddie Mac, and their article paints a picture of an executive team, led by Syron, taking unacceptable risks despite warnings from the Chief Risk Officer and others.
If senior management, in conjunction with the board, cannot be trusted to make the correct decisions about risk management, then there needs to be better transparency about the risks being assumed by the company, and shareholders can make their own decisions about whether to hold the stock or not. In this case, according to the article, “shoddier” underwriting standards exposed the company to too much risk, and Syron was warned of this situation. But did shareholders have a view into these changing underwriting standards?
Whether or not Freddie Mac could have avoided its recent meltdown, given its market share and the decline of the housing market, is an open question. What is clear is that the risk/reward tradeoff was not managed well, and that while shareholders had full visibility into the company’s earnings (the reward side of the equation), there is little doubt the company did not provide similar transparency into the risk side of the equation. My guess is that increased regulation or shareholder demands will start to encourage better reporting of risks in the business – and not the kind of reporting you currently find in most 10-Ks.
Today we announced that Julian Parkin, Group Privacy Programme Director at Barclays will deliver the day two keynote address at OPUS 2010. In his address titled, “Supporting Risk Management Initiatives Across the Enterprise with OpenPages,” Julian will discuss how Barclays has leveraged OpenPages for its risk and compliance management initiatives across the globe including data privacy, operational risk and financial controls management.
“As a global financial services organization, Barclays has wide ranging requirements for managing risk and compliance activities across the enterprise and across the globe,” said Julian. “The OpenPages platform provides the integration layer for enterprise risk management, assessment, monitoring and reporting which delivers risk intelligence to business end-users and management. I look forward to discussing successful risk management approaches and how the OpenPages Platform can be leveraged to drive sustainable improvements.”
If you’re an OpenPages customer and would like to learn more from Julian and the extensive cast of industry experts and practitioners at OPUS 2010, register now by clicking here.
Many of our customers are in the process of rethinking their risk management programs. A key element of any program is the risk control self assessment (RCSA), which in many cases provides the foundation for the overall program. The RCSA provides a baseline for risk exposure that drives further activity in key areas of risk for the business. Of course, as human judgement is involved, no company would rely solely on this single process for its exposure metrics. Many back-test the RCSA process with actual loss events and validate management’s self-assessment of risk through an internal audit function.
The recent edition of Operational Risk and Regulation highlights the importance of the RCSA process at a large Japanese financial services company, Mizuho Financial Group, one of only two AMA-approved banks in Japan. The article notes that Mizuho Financial Group’s AMA model is largely driven by over 660 different scenarios, which, in turn, are based on the risk control self-assessment. One of Mizuho Financial Group’s subsidiaries, Mizuho Securities, is an OpenPages customer.
660 scenarios represent a lot of data to keep track of in spreadsheets, especially if you’re tying the scenarios to the RCSA process and ultimately want to back-test the results against actual loss data. Only an integrated, automated approach makes sense, and we’re seeing more financial services institutions abandon their first-generation operational risk systems (and Excel!) as regulatory oversight heats up.
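For a sense of what back-testing self-assessments against loss data involves, here is a minimal sketch; the ratings, counts and threshold are invented for illustration and are not drawn from Mizuho’s model.

```python
# Hypothetical back-test of RCSA ratings against realized losses: flag
# risks that management self-assessed as low but that generated material
# loss activity. Ratings, counts and threshold are invented.
rcsa = {  # risk -> self-assessed residual rating (1 = low, 5 = high)
    "Trade settlement errors": 2,
    "Unauthorized system access": 4,
    "Vendor data breach": 1,
}
actual_loss_events = {  # risk -> loss events observed in the window
    "Trade settlement errors": 9,
    "Unauthorized system access": 1,
    "Vendor data breach": 6,
}

MATERIAL_EVENT_COUNT = 5  # illustrative cut-off for challenging a rating
for risk, rating in rcsa.items():
    observed = actual_loss_events.get(risk, 0)
    if rating <= 2 and observed >= MATERIAL_EVENT_COUNT:
        print(f"RCSA challenge: '{risk}' rated {rating} but had "
              f"{observed} loss events; reassess")
```

Keeping this comparison automated, rather than buried in spreadsheets, is precisely what makes the integrated approach workable at the scale of hundreds of scenarios.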
Recently I’ve been communicating with a former COSO board member about a couple of terms in COSO ERM – specifically about “risk appetite” versus “risk tolerance.”
It’s interesting, as this board member was intimately involved in reviewing drafts of the ERM report as it was being developed, signed off on the final version, and continues to be actively involved in discussions on the subject of risk management.
It has become clear to me that anyone can easily fall into a trap, as follows. When a report, article, or other written document arrives in our hardcopy or electronic inbox, we take care in reading it, digesting it, and making sure we understand it. But over time, as we use the underlying terms and concepts, we begin to factor in our own thinking and judgments, and unintentionally modify their use.
In the case at hand, confusion arose about use of the term “risk appetite,” where it was being used at a lower level than appropriate – a level reserved for “risk tolerance.” To refresh memories, COSO ERM says “Risk appetite is the amount of risk, on a broad level, an entity is willing to accept in pursuit of value. It reflects the entity’s risk management philosophy, and in turn influences the entity’s culture and operating style.” On the other hand, “Risk tolerances relate to the entity’s objectives. Risk tolerance is the acceptable level of variation relative to achievement of a specific objective, and often is best measured in the same units as those used to measure the related objective.” It goes on to say “Management considers interrelated risks from an entity-level portfolio perspective. Risks for individual units of the entity may be within the units’ risk tolerances, but taken together may exceed the risk appetite of the entity as a whole.”
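A small worked example makes the portfolio point concrete. The numbers below are invented, but they show how every unit can sit within its own risk tolerance while the aggregate exceeds the entity’s risk appetite:

```python
# Illustrative numbers only: each business unit's exposure sits within its
# own risk tolerance, yet the aggregate breaches the entity-level risk
# appetite, which is the portfolio effect COSO ERM describes.
unit_exposure = {"retail": 40, "commercial": 55, "treasury": 30}
unit_tolerance = {"retail": 50, "commercial": 60, "treasury": 40}
entity_appetite = 100

within_tolerance = all(unit_exposure[u] <= unit_tolerance[u]
                       for u in unit_exposure)
total_exposure = sum(unit_exposure.values())

print(within_tolerance)                  # True: every unit is inside its tolerance
print(total_exposure > entity_appetite)  # True: 125 exceeds the appetite of 100
```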
There’s more in the report making clear what each term means, but I don’t want to bore you. And the point here isn’t about these specific terms, but rather our being able to communicate effectively with business colleagues and partners. Okay, maybe I am a stickler for words, though I like to think there’s good reason we all should do our best to use terms precisely.
I work in the computer software business and experienced firsthand the dot-com bust of 2000. As VP of Corporate Strategy for a public software company, I was involved in M&A activities, strategic partnerships and large OEM deals with dot-com companies. I rode the wave of going from $15/share to $95 and back down to $5. I understand the difference between client/server, n-tier, and cloud computing, and the subtleties between ISV, OEM and VAR relationships (in this context VAR means “value added reseller” not “value at risk”). I know why the dot-com era was a façade and why the bubble eventually had to burst.
As I read accounts of what was happening during the subprime crisis, I struggled to understand key concepts such as CDS (credit default swap), CDO (collateralized debt obligation) and SPV (Special Purpose Vehicle). I blamed my inability to grasp what was really happening on my lack of experience with complex financial products: I wasn’t “in the business.”
After reading Tett’s book, I now realize that I wasn’t the only one who couldn’t figure out what was going on. “As the pace of innovations heated up,” Tett writes, “credit products were spinning off into a cyber-world that eventually even the financiers struggled to understand. The link between the final product and its underlying assets was becoming so complex that it appeared increasingly tenuous. . . . Most financiers lacked the cognitive skills to truly understand the connections in this new world.” Oh yes, and “even regulators seemed only vaguely aware of what the banks were really doing.”
I highly recommend reading Tett’s book. She is able to decipher Wall Street mumbo-jumbo in terms that a lay reader, or at least a determined lay reader, can understand. Tett provides a rich cast of characters and a storytelling device that helps make this book compelling fun to read. More importantly for risk managers, however, you will also gain a new appreciation for the significance of sound risk management for your organizations. There are lots of reasons why the crisis developed, for example greed, carelessness, and deceptive practices. But across the financial services industry, systemic weaknesses in risk management culture, discipline, and implementation of best practices added fuel to the flame.
In a subsequent blog I will summarize some of the key risk management lessons that Fool’s Gold uncovers.
If you’re in the financial services sector, any GRC manager’s wish list includes regulatory clarity for 2010. In the depths of the financial crisis, the Obama administration promised financial services regulatory reform. President Obama himself remarked during his inaugural address: “But this crisis has reminded us that without a watchful eye, the market can spin out of control.” But what has happened since then?
A credit card bill was passed, but meaningful overhaul is still buried in the legislative process, and there are still major differences between the House and Senate versions of the critical elements of reg reform, including the systemic risk regulator, consumer protection and mortgage reform. Last week, Senator Dodd, who chairs the powerful Senate Committee on Banking, Housing and Urban Affairs, announced that he won’t be seeking reelection. Given the narrow margin in the Senate and his likely desire to get something done before he retires, we’re likely to see more compromise before anything gets passed.
Further, the political climate in Washington has shifted over the last year, and financial services reg reform is not the top priority for the administration–health care is (and now terrorism). In the end, as the political momentum behind reg reform fragments into competing alternatives, GRC managers are going to have to accept this uncertainty and the current regulatory structure, which may endure longer than expected. Of course, this in and of itself offers some clarity, which explains why we’re continuing to see strong growth in the GRC platform market, as companies move forward with their plans for integrated risk management, despite the uncertainty.
Fueled by a global audience that is desperately looking for disclosure in the wake of the economic crisis and mature digital computing technologies that make it more and more difficult to contain sensitive information, WikiLeaks has emerged as a viable new threat to data security.
Until now, the United States government has been the central target of WikiLeaks attacks. However, with WikiLeaks founder Julian Assange’s recent claim to be ready to release corporate secrets in early 2011, organizations everywhere are faced with a looming risk management challenge that is not likely to dissipate anytime soon.
Experts agree, and Assange himself has suggested, that the information that will be leaked is more likely to consist of internal communications between executives and other employees rather than the personal data protected by privacy compliance laws. However, the threat of any kind of exposure means that corporations need to tighten data security and evaluate areas of potential vulnerability.
Unfortunately, WikiLeaks has highlighted a liability that persists across all corporations and government agencies, and that technology and compliance measures alone simply cannot contain: the human factor. The increasing number of compliance and regulatory mandates put in place in recent years has not proven enough to combat the risk posed by employees leaking sensitive information.
A recent poll by Harris Interactive reports that only 9% of companies have adequate crisis protocols in place to protect themselves from a potential onslaught. In this period of uncertainty, with virtually all large enterprises on the WikiLeaks radar, it is vital that organizations devise an adaptable enterprise risk management strategy to identify and manage areas of weakness without sacrificing business performance.
Just as a sharp increase in regulatory compliance mandates has created a necessary shift in industry risk management tactics, so has WikiLeaks spawned the recognition of new vulnerabilities that face companies in the modern digital age. The organizations that are well prepared to assess and mitigate against untested threats, like the one posed by WikiLeaks, are those that combine deep domain expertise with powerful and flexible tools to analyze and weigh the probability and cost associated with any given challenge.
Against the backdrop of Copley Square, Boston on St. Patty’s Day, Yousef Valine, Executive Vice President at First Horizon, described the need to focus on non-financial risk, particularly operational and business risk. GCOR (Global Conference on Operational Risk) 2010 is the fourth annual event hosted by the RMA (Risk Management Association). In his keynote address, Mr. Valine stated that while most believe earnings volatility is a factor of financial risk, earnings volatility can be attributed to non-financial risk 30% of the time – operational risk (12%) and business risk (18%) – versus financial risk 70% of the time. The key message: business managers need to be operational risk managers at heart and need to foster and facilitate a strong risk-aware culture.
Mr. Valine also outlined how during 2002-2008, losses realized from the following events totaled $42b!
Enron, WorldCom, Adelphia scandals
Late mutual fund trading
Excessive overdraft and credit card fees
Auction rate securities
Of course this makes the Madoff scandal at $65b even more troubling (note: Harry Markopolos will provide an in-depth review of the factors that enabled Madoff and how to prevent similar fraud in the future in his Keynote Address at OPUS 2010). Yousef emphasized that 45% of the loss amount ($19b) was the result of loss events in “Client Products and Business Practices,” yet the number of events in that category (frequency) represented only 11% of the total. Conversely, “Execution, Delivery and Process Management” represented 35% of frequency but only a fraction of the dollars lost. Ultimately, organizations need to consider severity versus frequency when reviewing loss events and mitigation practices.
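To make the severity-versus-frequency distinction concrete, here is a minimal Python sketch that tallies a handful of loss events both ways. The categories follow the Basel-style taxonomy cited above, but the events and amounts are invented for illustration and are not the GCOR figures.

```python
from collections import defaultdict

# Hypothetical loss events: (event category, loss amount in $M).
events = [
    ("Clients, Products & Business Practices", 1200.0),
    ("Clients, Products & Business Practices", 450.0),
    ("Execution, Delivery & Process Management", 0.8),
    ("Execution, Delivery & Process Management", 2.5),
    ("Execution, Delivery & Process Management", 1.1),
    ("Internal Fraud", 300.0),
]

counts, losses = defaultdict(int), defaultdict(float)
for category, amount in events:
    counts[category] += 1
    losses[category] += amount

total_events, total_loss = len(events), sum(losses.values())
for category in counts:
    print(f"{category}: {counts[category] / total_events:.0%} of events, "
          f"{losses[category] / total_loss:.0%} of dollars")
```

A category that dominates the event count can still be immaterial in dollar terms, and vice versa, which is why mitigation effort should follow severity as well as frequency.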
Just attended a great session presented by Matthew Neels, Chief Compliance and Risk Officer at Capital One. Mr. Neels focused on building board interaction and driving board attention to the right areas of risk through an integrated risk management framework. He began with an interesting question, “Should you be using an implicit or explicit framework and how is your board making a decision on that framework?” The correct answer, of course, is that both are required to effectively manage risk.
He explained how explicit frameworks enable structured board discussions through a consistent and common approach, whereas implicit frameworks rely on “corporate culture and deep experience.”
In his session, Mr. Neels also detailed how multiple stakeholders use frameworks for ‘decision making, reporting and escalation’ and in particular, how the Board uses frameworks to:
Provide an objective yardstick or measure
Create a basis for understanding
Identify situations and areas that need attention
Highlight areas doing well
Help differentiate between expected and unexpected
The discussion then moved to how “driving board attention to the right areas can be difficult” as board reporting is often a “laundry list of potential risks, current issues and decision requests.” He stated, “Without a framework you have everything coming in at once without context.” He then offered several suggestions for preventing information overload:
Specific and quantifiable tolerance measurement is critical to driving board attention to the right areas
Set your risk appetite
Create a risk framework
Determine standard metrics and KRIs
Establish risk tolerances
Establish risk limits
The goal, according to Matthew, is to establish a “common scale that enables cross-category comparisons and risk aggregation.”
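As a rough illustration of what such a common scale might look like, the Python sketch below expresses each metric as a fraction of its tolerance, so that otherwise incomparable KRIs can be ranked and aggregated. The metric names, values and tolerances are hypothetical, not Capital One’s actual framework.

```python
kris = {
    # metric: (current value, board-approved risk tolerance)
    "customer complaints per 10k accounts": (14, 20),
    "unplanned system outage hours per month": (9, 6),
    "open high-severity audit issues": (3, 5),
}

for name, (value, tolerance) in kris.items():
    utilization = value / tolerance  # > 100% means the tolerance is breached
    status = "BREACH" if utilization > 1.0 else "within appetite"
    print(f"{name}: {utilization:.0%} of tolerance ({status})")

# Once everything is on one scale, cross-category comparison is trivial:
worst = max(kris, key=lambda k: kris[k][0] / kris[k][1])
print("Flag for board attention:", worst)
```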
When several prominent industry analysts (Gartner, Chartis and Celent) recently published research on operational risk management (ORM), a common theme emerged – ORM is a critical and growing discipline; and OpenPages is a leading software provider in this market.
OpenPages was cited as a leading provider of operational risk management software in the Chartis Operational Risk Management Systems 2009 market analysis report. The report states that, “Successful vendors need to be able to assist in the implementation, training and methodological aspects of ORM,” and identifies OpenPages as a company with particularly strong efforts in this area.
Chartis forecasts the worldwide ORM market will grow at 6.9% to $1.68 billion by 2013. They expect this growth to be fueled by, among other things:
An increased focus on the benefits of compliance
The convergence of oprisk, ERM and GRC; and
Ongoing demand from emerging markets of Asia, Africa and Latin America.
This month, OpenPages was also recognized as a leading software company in the Enterprise Operational Risk Management, Compliance, and Governance Solutions report by the independent analyst group Celent. The report notes that OpenPages is one company that is “leading the field in terms of depth of functional capabilities.” The report continues that, “OpenPages is particularly strong in its multidomain governance, risk, and compliance management approach.”
The Globe published an interesting article today about a Harvard Business School professor who resigned just before the scandal at Satyam broke. This was no ordinary professor. Krishna Palepu is an expert in corporate governance, control and accounting, and corporate management in emerging markets. In short, the perfect resume for a Satyam board member. So what went wrong?
This is not an isolated incident. In this financial crisis, many good people on boards of struggling companies have been surprised. And we’ll likely see more of that in the months to come. I think it’s overly simplistic to blame the board, certainly in this case, in which Palepu is so obviously qualified. What we see frequently is that internal control systems and risk assessment processes are not mature enough to catch wrongdoing or, and this may be more important, change behavior. Companies that are growing quickly, like Satyam, have the most difficulty putting in place the risk management processes to catch the kind of fraud perpetrated at the company. My guess is that in the future, business processes will be designed from the bottom up with risk management in mind. As we’re learning, it’s too hard to do it after the fact, especially for the complicated businesses we’re trying to govern today.
Last week we announced the availability of OpenPages version 6.0, which marks a major milestone in the evolution of the GRC market: from convergence to insight. It also represents the completion of the first phase of our technical integration with IBM. And the new release will help prepare our customers for managing through regulatory change in the post-Dodd-Frank environment.
Several industry experts have had positive things to say about the news:
“But there is a significant gap between collecting data and actually making it usable. The release of version 6.0 of the OpenPages GRC platform, which IBM acquired last year, is a significant step forward in terms of closing that gap by tightening the integration between OpenPages and the business intelligence (BI) software from Cognos that IBM also acquired back in 2007.”
Industry Analyst Guillermo Kopp wrote a report on 6.0, which details the key benefits and opportunities for the combined solutions of OpenPages and IBM. With regard to integrated risk management, he says:
“A centralized governance, risk, and compliance (GRC) platform will help large companies manage various risks across client, location, product, and service domains. For financial firms, integrating financial risk dimensions (e.g., credit, market) will augment the challenge substantially.”
6.0 was also featured as the top story in CMS Wire’s GRC Roll-up.
We’re pleased to announce that OpenPages and Network Frontiers have partnered to deliver the Unified Compliance Framework (UCF) to the OpenPages customer base. The addition of the UCF content into the OpenPages IT governance solution, OpenPages ITG, supports OpenPages’ goal of providing its customers with a holistic approach to managing IT risk and compliance.
The partnership provides strong synergies for our customer base of enterprise GRC professionals, many of whom are looking to OpenPages for IT risk and compliance management. Previewed at OPEN 2009 – the OpenPages European Network Summit recently held in London – the UCF data gives OpenPages customers access to the most comprehensive set of IT policies and controls that cross multiple regulations, thus reducing the time commitment and costs associated with complying with the slew of IT risk and compliance mandates nearly all companies are faced with today. In a survey conducted at OPEN 2009, 93% of organizations stated that within 2-3 years they are likely to converge or coordinate IT risk and compliance with GRC management.
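To see why cross-mapped content saves effort, consider a minimal Python sketch of the “test once, satisfy many” idea behind a unified framework: one control is linked to citations in several mandates, so a single test result rolls up to all of them. The control names and citations below are hypothetical placeholders, not actual UCF content.

```python
# Each control maps to the regulatory citations it satisfies (hypothetical).
control_to_mandates = {
    "Enforce unique user IDs": ["SOX 404", "PCI DSS 8.1", "HIPAA 164.312"],
    "Review firewall rules quarterly": ["PCI DSS 1.1.7", "FFIEC guidance"],
}

test_results = {"Enforce unique user IDs": "pass"}  # one test executed

for control, mandates in control_to_mandates.items():
    result = test_results.get(control, "untested")
    for mandate in mandates:
        print(f"{mandate}: '{control}' -> {result}")
```

One passed test satisfies three separate citations; without the mapping, each mandate would typically be assessed on its own.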
The announcement was well received by industry experts including Michael Rasmussen, President of Corporate Integrity, a GRC strategy advisory firm:
“In today’s economy, wasting valuable resources on costly and time-consuming processes associated with compliance and risk management can be damaging to IT GRC programs. With the UCF enhancements to the OpenPages Platform offering, customers are given the tools to more quickly and effectively comply with a multitude of regulations and from there, can focus more attention on ensuring that their IT GRC programs are sustainable, repeatable and increase transparency across the enterprise.”
You’re a CEO, senior manager, or board member watching your once-great company brought to its knees. You imagine yourself on the deck of the Titanic, your world coming to an end—your once confident self embarrassed in front of colleagues, competitors, friends, family, and the larger communities in which you once thrived and were held in such high esteem.
This is the first sentence of a just-released book published by John Wiley & Sons. I got my hands on an advance copy, and it is compelling reading. It analyzes how – while facing different circumstances in different industries – common themes underlie why once-great companies have seen their fortunes sink, while others withstand economic turbulence and hazards to continue to grow and reap the rewards of success. But the book is not solely about how to avoid disaster. It highlights how having the right infrastructure enables an organization’s positive qualities to lead to success. This includes what’s needed to avoid the kinds of disasters that can befall any organization, but it is also essential to identifying opportunities and being positioned to seize them for competitive advantage.
I don’t often recommend books to others, but this one is exceptional. It has a long title: Governance, Risk Management and Compliance – It Can’t Happen to Us: Avoiding Corporate Disaster While Driving Success. I believe the substance stands up to its claim that “unlike other books, this one is not aimed solely at senior managers or solely at members of boards of directors. It’s directed to both, with an added objective of providing insight into the interface between the two.”
You might be asking why Steinberg is spending so much space here touting this book – is it because the book is really that valuable, or does he have some ulterior motive? Well, okay, I’ll fess up – the answer is “both.” Yes, as you may have guessed, I wrote the book. And I apologize for withholding that important fact until now! But I do believe virtually any reader of this blog will greatly benefit from reading the book. And I’m pleased that I’m not the only one who thinks so. Here’s what some others, whose names you might recognize, are saying:
Rick Steinberg is a time-tested expert in this ever more essential field. His refreshing candor in assessing recent shortfalls makes this book a must-read for corporate leaders — Mark R. Fetting, Chairman and CEO, Legg Mason, Inc.
This outstanding book provides a critically important perspective on how risk management can only be truly achieved by aligning culture, strategy, compliance programs, and compensation. It should be must reading for any board member concerned with improving the management of risk — Jay Lorsch, Louis E. Kirstein Professor of Human Relations, Harvard Business School
A comprehensive and insightful examination of corporate governance. A must-read for those of us who are CEOs and serve on public boards — Randall L. Clark, Chairman and CEO, Dunn Tire LLC; former Chairman and CEO, Dunlop Tire North America
Attention directors and officers: Ignore this book at your own peril. Richard Steinberg has crafted a careful, thoughtful approach to managing risks, and it should be required reading for Corporate America — Scott S. Cohen, founder and former Editor and Publisher, Compliance Week
Richard Steinberg’s comprehensive and clearly written work will substantially benefit both new and experienced directors. It will help corporate boards recognize the challenging forces businesses face, as well as the techniques and standards available to intelligently monitor and supervise firms and their senior management. An easy and engaging read, this book should be on the bookshelf of every corporate director — William T. Allen, Director, NYU Pollack Center of Law & Business; former Chancellor, Court of Chancery of the State of Delaware
Richard Steinberg, a respected and time-proven governance hand, has written a most enjoyable and thought-provoking work—an excellent addition to anyone’s governance shelf! — Charles Elson, Edgar S. Woolard, Jr., Chair in Corporate Governance and Director of the Weinberg Center for Corporate Governance, University of Delaware
By the way, the IBM OpenPages people were kind enough to allow me to use a paper I wrote for them as the basis of one of the chapters. I hope you will consider reading the book, and I trust you will not be disappointed!
Brandishing new authority thanks to the Dodd-Frank Act, the SEC was quick to act on an agenda item that had been on the table for 30 years. Yesterday, the SEC approved a ‘Proxy Access’ rule that allows shareholders to place nominations for board member seats on the annual proxy ballot of public companies. The rule applies to shareholder groups who have owned greater than 3% of a public company’s stock for at least 3 years.
SEC Chairman Mary Schapiro succeeded where her two predecessors had failed, gathering a 3-2 vote in favor of the rule that was divided along party lines, with both Republican members objecting. While this is a win for investor groups, who now have increased influence over board make-up, there are no provisions in the rule for smaller, individual investors who own less than 3% of the stock or have held it for less than 3 years.
One thing is certain: the new rule reflects the anger and backlash of shareholders who feel that boards of directors were not acting in the shareholders’ best interest when taking highly leveraged and risky positions that led to the 2008 financial meltdown. As Rick Steinberg pointed out in his recent blog, this indicates a clear trend toward increasing shareholder power and of companies and their boards ‘opening channels of communication with shareholders.’ As these channels are opened, an information architecture that provides full transparency into risk exposure and enables information sharing will help to fill the communication gap between the Board and shareholders.
COSO recently released reports providing guidance in two areas related to risk management. One is Embracing Enterprise Risk Management – Practical Approaches for Getting Started, which suggests ways in which companies, especially smaller ones, can begin a risk management initiative with the objective of ultimately moving to an ERM process. It puts forth “keys to success” in terms of a number of “themes,” beginning with being sure to have support from the top. Theme 2 is building on incremental steps, which includes implementing key practices to gain immediate and tangible results. Theme 3 continues with focusing first on a small number of “top” risks, and theme 4 is leveraging existing resources by utilizing the capabilities of the chief audit executive, chief financial officer or other executive as a catalyst to begin the initiative.
The guidance continues with theme 5, building on existing risk management activities already being performed, for example, by internal audit, insurance or compliance functions, fraud protection/detection measures, or credit or treasury functions. Theme 6 involves embedding risk management into the fabric of the business, and concludes with theme 7’s continuing to update and educate senior management and the board on evolving ERM practices.
The guidance also provides seven “action steps” to support development of an ERM initiative: Seeking board and top management leadership, involvement and oversight; selecting a strong leader for the ERM initiative; establishing a risk committee or working group; conducting an enterprise-wide risk assessment and developing a related action plan; inventorying existing risk management practices; developing a communication and reporting process; and developing the next phase of action plans and communication.
As stated in the report, the suggested incremental step-by-step approach may be particularly useful to smaller companies, and importantly, it is only a starting point for moving to an enterprise risk management process. I believe the report is well meaning, looking to break down barriers and resistance to embarking on building an ERM process, and as such may be useful to companies considering taking a first step. But that’s all it is. It doesn’t provide guidance on how to design an ERM process, or how it can be effectively implemented throughout an organization. Yes, some of the “steps” are a start, but my concern is that, despite the warnings, companies going down this path will somehow believe they will have installed ERM in their organizations.
In Olympic games terms, with only two entrants, this report gets the silver. The second report on key risk indicators wins the gold – by a good margin. I’ll speak to that report in my next blog posting.
Lesson 3: You cannot afford to overlook or underestimate the correlation of risks.
There were two innovations that fueled the growth in the subprime mortgage market. The first was credit derivatives: in its simplest form, a credit derivative is a contract between two parties in which the seller agrees to compensate the buyer if a loan goes into default. The second innovation involved a process called securitization, which traditionally involved lenders selling their loans to an investment bank. The investment bank “bundled” the loans together and sold pieces of the bundle to pension funds and other investors. The original lenders, having offloaded their loans, could make new ones. The investors acquired a slice of the loan bundle and its interest income without having to go to the trouble of meeting and assessing the borrowers.
The innovation was securitizing not just loans but credit derivatives. It was first applied to corporate loans, which tend to have very little correlation (correlation is the degree to which the defaults in any given basket of loans might be interconnected). But then it was carried over to mortgages and, more importantly, subprime mortgages. The financial services sector industrialized the procedure, and began selling securitized debt and derivatives on an extraordinary scale. The fatal mistake was not realizing that subprime mortgages were highly correlated, especially in an economy where interest rates were rising and housing prices were falling nationwide. Moreover, subprime mortgages had intrinsic flaws (such as issuing loans with escalating interest rates to homebuyers with dubious credit ratings) that inevitably resulted in extremely high default rates.
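To illustrate how correlation drives tail risk, here is a minimal Monte Carlo sketch using a one-factor Gaussian copula, a simplified stand-in for the models the banks actually used. Every parameter here (default probability, pool size, correlation values) is invented for illustration.

```python
import numpy as np
from statistics import NormalDist

def worst_case_defaults(rho, p_default=0.05, n_loans=500, n_sims=10000, q=0.99):
    """99th-percentile default count for a loan pool with correlation rho."""
    threshold = NormalDist().inv_cdf(p_default)    # latent default cutoff
    rng = np.random.default_rng(0)
    common = rng.standard_normal((n_sims, 1))      # shared factor, e.g. housing
    idio = rng.standard_normal((n_sims, n_loans))  # borrower-specific factors
    latent = np.sqrt(rho) * common + np.sqrt(1 - rho) * idio
    return int(np.quantile((latent < threshold).sum(axis=1), q))

# The expected number of defaults is 25 (5% of 500) in every case;
# correlation barely moves the average but dramatically fattens the tail.
for rho in (0.0, 0.15, 0.45):
    print(f"rho={rho:.2f}: 99th-percentile defaults = {worst_case_defaults(rho)}")
```

With zero correlation the worst cases stay close to the mean; at higher correlations the same average default rate produces catastrophic scenarios. That is precisely the property the subprime models missed.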
J.P. Morgan opted not to get into this market, a very smart expression of a cautious corporate risk culture that ultimately saved the company from the disasters others suffered. Fool’s Gold gives a great account of how Morgan risk managers struggled to understand how other banks could be making so much money and covering their risks at the same time. To their credit, they did not enter the market because they understood the risk and did not have a way to mitigate it.
Lesson 4: Do not think that models are anything more than a guide or a compass.
Models are useful but they have limits. They are essential for navigating the world of modern finance, but they are not infallible, no matter how well crafted they are. Models are only as good as the data fed into them and the assumptions that underpin their mathematics. The key simplifying assumption on which the credit derivative models rested was that the future was likely to look like the recent past. New financial innovations cannot be tested for their risk level except by means of computer simulations that use historical data. But there are no statistics that truly represent the environment surrounding a new instrument and, as a consequence, no one fully knows what risks are associated with it. This is especially true of risks connected with the “correlation” factor. Hence, innovations can always have “surprises” connected with their usage. Remember that models are only tools and should not be used without human intelligence.
Lesson 5: Regulation is not a panacea.
As the crisis unfolded, there was a lot of blame placed on regulators and regulation. Although the Federal Reserve had the legal authority, it did not have the inclination to regulate the behavior of banks that led to the disaster. Alan Greenspan, head of the Fed, admitted that he had made a ‘mistake’ in believing that banks would do what was necessary to protect their shareholders and institutions. This “absence” of oversight by the bank regulators has resulted in lots of discussion around new regulations, new regulatory agencies and so on. Tett’s book does an especially nice job of explaining how banks worked to get around capital requirements using the new tools and instruments. Part of the problem with the absence of the regulators during this period was that the banks worked very hard to expand their use of leverage in ways the policy makers could not see. Of course, this came back to haunt them when the collapse occurred. Financial institutions will always attempt to get around regulations in one way or another because it is profitable to do so. In addition, regulators are always behind what is going on in the industry. This is just the nature of the relationship.
Even in the wake of sweeping deregulation of the energy industry, few companies face as much government oversight as utilities. Power generation and distribution companies are subject to a maze of regulatory oversight from state agencies and federal bodies including the Federal Energy Regulatory Commission (FERC), the North American Electric Reliability Corporation (NERC), the Nuclear Regulatory Commission (NRC), the Environmental Protection Agency (EPA) and the Occupational Safety and Health Administration (OSHA).
As Managing Director of Corporate Compliance at Duke Energy, Tom Wiles knows firsthand the challenges of operating a business in a regulated industry. Duke Energy – a Fortune 500 company traded on the New York Stock Exchange – is one of the largest electric power companies in the United States, delivering energy to approximately 4 million U.S. customers.
In a Compliance Week Webinar titled “Proactive Ethics and Compliance Programs in a Regulated World”, Tom Wiles discusses how a “proactive partnering” and “risk-focused coverage” approach has delivered positive results for Duke. He states that in order to create an effective and efficient enterprise-wide ethics & compliance infrastructure, the Ethics and Compliance Manager needs to establish expectations, communicate expectations, monitor behavior, report results and provide continuous improvement.
If you’d like to learn the key steps your organization can follow to integrate disciplined ethics and compliance management into your business and hear about the value organizations are receiving from effective programs, check out this Webinar.
Julian Parkin, Group Privacy Programme Director at Barclays, recently delivered the day two keynote address at OPUS 2010 – the OpenPages User Symposium. In his keynote address, Parkin discussed how Barclays has leveraged OpenPages for its risk management initiatives and how the flexibility of OpenPages’ technology has been harnessed to drive sustainable improvements across evolving risk types.
After his keynote, I had the opportunity to interview Julian and discuss his experience at OPUS 2010 and as a member of the OpenPages user community.
With the passing of the Dodd-Frank Wall Street Reform and Consumer Protection Act, many companies are bracing for the regulatory onslaught. The problem is that few of the provisions in the legislation take effect immediately, and what we’re really facing is much rulemaking from new (e.g. the Consumer Financial Protection Bureau) and existing regulatory bodies. This rulemaking will take place over the next five years, with the bulk of the activity in the next two. So how should financial services companies position themselves?
It is clear that a major theme of the legislation is greater transparency into risk exposure across the financial system. Basel II can be faulted for taking an institutional approach to risk management, and the financial crisis of 2008 clearly revealed gaps in the way regulators assessed and managed risk across institutions. This wave of regulatory rulemaking will try to address those gaps, and, in fact, Treasury Assistant Secretary Michael Barr in a recent speech at the Chicago Club made several references to Basel III, an indication that regulators worldwide will be coordinating on liquidity and capital standards to manage systemic risk.
Regardless, regulators worldwide will still be collecting risk exposure data from institutions. As a first step, institutions can put in place an information architecture that can quickly and accurately serve up risk exposure information, and all financial services institutions need to work on this. The Dodd-Frank law, for instance, creates a Financial Stability Oversight Council that will have the authority to instruct the Federal Reserve and other agencies to collect all sorts of risk exposure data. Most companies know where their current gaps are; these need to be addressed immediately.
The scope of the rulemaking also suggests that we’re going to be in a very dynamic regulatory environment for a long time. As such, covered companies would do well to make sure this information architecture can adapt to change over time. Implementations of static frameworks for regulatory compliance could be obsolete before the project is finished! Any solution must be able to adapt and extend over time.
Finally, as companies put in place this information architecture to surface enterprise risk exposure, thinking about interdependencies will be critical to reduce cost. Inevitably, there will be much overlap between the information requests from different regulatory agencies. Your ability to handle these requests, as well as those from the business, with a minimal set of reports will save you time and resources. An integrated risk and compliance framework can reduce the disparate databases and reporting structures. Of course, you may not be able to consolidate everything onto a single, integrated system, but thinking about pairwise combinations is a good start.
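One way to think about those pairwise combinations is as a coverage problem: choose the smallest set of reports whose combined data elements satisfy every regulator’s request. The Python sketch below applies a greedy heuristic; the agencies, reports and data elements are hypothetical placeholders.

```python
requests = {
    "Fed": {"credit exposure", "liquidity", "counterparty limits"},
    "FSOC": {"credit exposure", "leverage"},
    "SEC": {"governance", "credit exposure"},
}
reports = {
    "Integrated Risk Exposure Report": {"credit exposure", "liquidity", "leverage"},
    "Governance Dashboard": {"governance", "counterparty limits"},
    "Liquidity Snapshot": {"liquidity"},
}

needed = set().union(*requests.values())
chosen = []
while needed:
    # Pick the report that covers the most still-unmet data elements.
    best = max(reports, key=lambda r: len(reports[r] & needed))
    if not reports[best] & needed:
        break  # remaining requests cannot be met by any existing report
    chosen.append(best)
    needed -= reports[best]

print(chosen)  # two reports cover all three agencies' requests
```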
Linda Tucci, Senior News Writer at SearchCompliance.com, recently wrote “the rap against governance, risk and compliance (GRC) software is that the solutions either fall short of effectively managing the complexity of an enterprise’s compliance programs, or are so complex that enterprises never realize the software’s full capability. For compliance officers who are patient enough to start small while thinking big, however, the right GRC software can help put large, complex organizations on the path to Sarbanes-Oxley Act (SOX) compliance nirvana: a risk-based program optimized by automation.”
In the article, Ms. Tucci refers to Tommy Thompson, IT security compliance coordinator at The Williams Cos. Inc. and a long-time OpenPages customer. Mr. Thompson has been using OpenPages ITG for nearly four years, taking a top-down, risk-based approach to managing Williams’ financial controls, and is now looking to expand to other areas of compliance.
A key challenge in many IT organizations is being able to allocate resources to the right set of problems. Many times, IT managers will take a bottom-up approach to IT risk and security and implement controls and procedures without any clear link to the key risks in the business. Williams is a great example of why you should consider a top-down approach to risk management that supports overall corporate objectives and operates within the corporate ERM framework.
To achieve this, your IT risk and compliance solution should provide a way for your IT organization to link IT risk management activities to the key risks in the business to ensure better business performance against corporate objectives and a more efficient use of resources (capital and operating budget).
A recent client discussion reminds me of an article I came across a few years back with important implications for dealing with risk – or rather a risk that materializes into a major problem. The article, “What Organizations Don’t Want To Know Can Hurt,” focuses on events surrounding the College Board when it learned of extensive errors scoring its SAT tests, and provides a good example of what not to do.
The company’s president reportedly said that finding the specific cause of the failure “did not really matter,” but rather what’s important is to ensure that improved controls catch future problems. His position was supported by the engagement leader of a consulting firm hired by the company, saying that dissecting past problems is not necessary either to ensure that the scoring system works better in the future or that there is a good safety net to catch errors. He goes on, “You can do both without knowing whether it was rain that made the papers wet, or whether someone spilled a cup of coffee…[and] if we tried to brainstorm everything that could go wrong, we’d be here for years – for a lifetime. But if controls are in place to identify problems, and rescore tests that were misscored, that’s what you’re really looking for.”
These statements are fascinating – that there’s no need either to look back at why something went wrong because it’s unnecessary, or to dig deeply into what could go wrong because it would take too long. It suggests that problems in test scoring – which would certainly seem to be central to the company’s credibility and indeed its sustainability – are okay as long as they ultimately are found and test results rescored. Simply “catching future problems” by “rescoring tests” means that the company is satisfied detecting major problems with scoring after they occur, rather than taking steps to prevent such problems in the first place. I wonder what users of SAT scores think about that!
If you’re smiling at this you’ve got company. Clearly, looking neither backward nor forward is not a viable option. And doing one or the other is not the answer either. Rather, it’s necessary to do both. Only by getting behind what went so wrong can management feel comfortable it understands what risks continue to exist, and only then is it positioned to look at what additional risks need to be the focus of its attention going forward.
It doesn’t take a genius to know that when a problem rears its ugly head it is essential to find out why. The article talks about fields like aviation and medicine that conduct investigations to find out exactly what went wrong, to learn from often deadly mistakes and to improve processes and protocols. The National Transportation Safety Board does so focusing primarily not on casting blame but on making things better. Similarly, many hospitals hold mortality and morbidity conferences to analyze and learn from mistakes. Many businesses do that as well, learning from what went wrong. They don’t choose between learning from the past and working to make things better. They do both, with one supporting the other. And no, it doesn’t take “a lifetime” to find out what caused a major problem or to identify the source of the next potential disaster.
This week OpenPages is sponsoring the RMA Operational Risk Management Discussion Group being held at The Federal Reserve Bank of Philadelphia. The two-day forum was kicked off by Victoria Garrity, Senior Quantitative Analyst from the Boston Federal Reserve. Victoria’s session titled “Regulatory Perspective on Scenarios: Challenges and Issues”, was well attended and sparked a number of conversations on potential forthcoming regulations. Other interesting sessions included a discussion moderated by Michael Fenn of DTCC and Patrick McDermott of Freddie Mac on the evolution of ORM assessments, and a roundtable facilitated by Kathy Miller of KeyCorp on “Recent Experiences with Regulators” in which the discussion focused on operational risk examinations and emerging guidance from the regulatory environment.
Overall a very timely and thought-provoking forum attended by some of the leading operational risk practitioners.
If nothing else, the financial crisis of 2008 has driven home the need to improve reporting to the organization regarding risk posture and exposure. As we look to 2010 and beyond, risk and compliance processes will no doubt evolve to meet changing business and regulatory requirements. Coming in at #8 on the 2010 GRC Wish List is “Strong Reporting with Easy-to-Use Formatting.” While the value of strong reporting is clear, a few challenges remain:
Cross-domain Reporting – With the large number of risk and compliance initiatives underway at organizations today, users are struggling to deliver comprehensive enterprise risk management. Users need a way to understand and manage their risk exposure across the numerous risk and compliance domains through enterprise risk assessments and integrated reporting. GRC solutions that are developed independently in silos produce application-specific reports that reference only data local to that application, providing an incomplete picture of enterprise risk exposure.
Multiple Reporting Regimes – Companies are struggling to meet the needs of an increasing number of reporting regimes. For instance, a financial services company may have adopted the CoBIT framework for IT management, adhere to FFIEC best practices guidelines and may be looking to establish an Anti-Money Laundering (AML) program. The key challenge facing these organizations is in establishing a risk framework that integrates multiple reporting regimes and provides visibility into the state of key risks across the enterprise.
Linking Oversight with Operating Environment – Effective “governance” implies effective oversight and reporting. To deliver effective oversight, GRC professionals need to be able to link their oversight and reporting to their operating environment by drilling-down to view control status at the asset level.
Profile-based Reporting – Risk management professionals, compliance professionals and auditors frequently have access to highly confidential and sensitive information. Oftentimes, that information needs to be segmented from other stakeholders in different roles, entities, geographies or functional risk areas. GRC solutions need to provide a highly configurable, flexible and secure access control and security model to ensure that risk data is seen only by the right people, in the right context, at the right time; a minimal sketch of this kind of filtering follows the list.
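For illustration, here is a minimal Python sketch of profile-based filtering, where each profile scopes which entities, geographies and risk domains a user’s reports may draw from. The schema and records are hypothetical, not the OpenPages security model itself.

```python
from dataclasses import dataclass

@dataclass
class RiskRecord:
    entity: str
    geography: str
    domain: str        # e.g. "IT", "Compliance", "Credit"
    description: str

@dataclass
class Profile:
    entities: set
    geographies: set
    domains: set

def visible_records(records, profile):
    """Return only the records a given profile is allowed to see."""
    return [
        r for r in records
        if r.entity in profile.entities
        and r.geography in profile.geographies
        and r.domain in profile.domains
    ]

records = [
    RiskRecord("Retail Bank", "EMEA", "Compliance", "KYC gaps in onboarding"),
    RiskRecord("Retail Bank", "EMEA", "IT", "Unpatched payment servers"),
    RiskRecord("Capital Markets", "APAC", "Credit", "Counterparty limit breach"),
]

emea_compliance = Profile({"Retail Bank"}, {"EMEA"}, {"Compliance"})
for r in visible_records(records, emea_compliance):
    print(r.description)  # only the KYC finding is shown
```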
What reporting challenges does your organization face?
GRC is touching just about everyone these days. A lot has been written about the CFO, CRO, CCO and CIO and their roles in deploying GRC technologies. Mike Rothman at the Daily Incite writes here about the CISO’s role in deploying GRC solutions and makes the point that CISOs should be focused not on implementing specific controls but on the program (my emphasis added). We could not agree more. A security program identifies the key areas of focus and prioritizes activities accordingly. A bottom-up approach doesn’t necessarily allocate resources to the high risk areas, and, given that most companies are operating with increasingly scarce financial resources, a risk-based approach is the best way to allocate them.
When organizations choose to shift their corporate mission and redefine organizational goals, it is vital that they carefully evaluate the potential risks and fallout from redefined core value propositions and tactics. A case in point is Toyota – a company that built its reputation on the quality of its product but in recent years set its sights on profits.
With the introduction of the Prius to the U.S. market in 2000, it appeared that a strategic risk had paid off: Toyota had created a hybrid engine for the mass market that was a clear success, one even celebrated in the press by a drove of Hollywood celebrity drivers including Leonardo DiCaprio, Cameron Diaz, Larry David, Billy Joel, David Duchovny, and more.
However, in recent years Toyota has been plagued by a series of escalating vehicle malfunctions. While the full scope of the financial loss is currently unclear, since 2009 the company has initiated recalls of over 14 million vehicles worldwide and incurred more than $48.8 million in fines in the U.S. alone. The world’s number one automaker has also temporarily suspended U.S. sales of eight of its top models and halted production in five U.S. plants, an unprecedented step that clearly demonstrates the effort being made to maintain Toyota’s once solid reputation for customer satisfaction.
Overwhelming growth and the pressure to match increasing demand with production have stifled Toyota’s promise of reliability. It is as yet unclear what effect these recalls will have on Toyota’s global standing in years to come, but potential customers will certainly approach the automaker’s brand more tentatively than in decades past.
The lesson here is that all corporations must be prepared to mitigate risk, especially when taking such a precarious step as redefining their core vision and business strategy. Toyota now faces the huge challenge of recreating its customer brand loyalty while at the same time maintaining the momentum that their swollen infrastructure investments require.
In a recent research brief published by Forrester Research, analyst Chris McClean listed his predictions for GRC in 2011 and beyond. #3 on his list is: “New and changing regulations will hinder GRC maturity in the short term.”
We believe that new and changing regulations will segment the GRC market between those vendors that manage regulatory change and those that do not. As we’ve seen with Dodd-Frank and the countless new and upcoming regulations across finance, healthcare and consumer protection, risk and compliance managers are struggling with an unprecedented onslaught of regulation that, as Chris states, will pile “countless control and reporting requirements onto already complex and taxed compliance departments.”
If you’re considering a GRC solution to assist with this dynamic environment of regulatory change, you would do well to require one that can help you put in place a programmatic framework for communicating changes to regulations and managing the internal regulatory change process so your business can react quickly. You will also want to consider a solution that can help you manage the interactions, communication and internal work associated with external regulators, such as inquiries, submissions, filings, exams and audits.
According to an IBM study of over 1,200 CFOs and senior finance executives, 62 percent of enterprises with over $5 billion in revenue encountered a major risk event in the previous three years, and when a major risk event did occur, 42 percent were not well prepared. Unlike Sarbanes-Oxley and other structured, clearly defined compliance initiatives, building an effective operational risk control environment and culture requires proactive identification and frequent review of potentially harmful events.
GRC industry expert and Corporate Integrity president Michael Rasmussen’s favorite operational risk case study is the Titanic, in which, as he states, “There are a variety of risks the Titanic faced – overconfidence, poorly manufactured rivets, focus on speed while ignoring the external risk environment, inadequate design, and lack of someone diligently watching for icebergs.” While the Titanic was heralded for its superior safety in engineering design, not all risks were considered holistically. In many organizations today, operational risk continues to be managed in silos, where distributed business units and processes maintain their own data, spreadsheets, analytics, modeling, frameworks, and assumptions.
To learn more, check out the “Ultimate ORM Platform” webinar in which Michael Rasmussen and OpenPages director of product management Patrick O’Brien describe the need for a common, enterprise-wide view of risk and what to look for in an “Ultimate ORM Platform”.
Today we announced the availability of OpenPages 6.0. This release represents a significant new phase in the evolution of GRC and provides organizations with the insight needed to drive business outcomes as well as the ability to manage effectively through the changing regulatory environment. We’re also excited to have completed the first phase of technical integration with IBM with the release of AIX support.
The GRC market developed out of the tactical, departmental deployment of SOX and other compliance and risk management solutions. Companies realized that they could leverage their control testing and risk assessment activities across multiple oversight functions by consolidating their risk and compliance efforts on a common technology platform. Indeed, we’ve seen very strong ROIs for Enterprise GRC platforms, driven by this efficiency. The next phase in the evolution of GRC is about insight: using the GRC data to help drive business outcomes.
Here’s an example of how GRC data can be used to drive business outcomes. Imagine a multinational bank that has a subsidiary in France. The compliance team has identified some procedure violations with regard to the handling of customer account data. The audit team has found some major control weaknesses surrounding customer account data, and the operational risk team has observed some KRIs above threshold. Any one of those functions may not escalate their particular findings, but, taken as a whole, the GM in France would be able to see that the business is at great risk of a significant loss. This is the kind of insight that can help drive business performance, in this case avoiding a fine and loss of brand stature.
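As a rough sketch of how that kind of aggregation might work, the Python below combines severity signals from compliance, audit and operational risk per entity and escalates when the combined picture, rather than any single finding, crosses a threshold. The entities, severity scale and thresholds are all hypothetical.

```python
signals = {
    # (entity, domain): severity on a 0-3 scale reported by each function
    ("France subsidiary", "compliance"): 2,  # procedure violations
    ("France subsidiary", "audit"): 2,       # control weaknesses
    ("France subsidiary", "op_risk"): 2,     # KRIs above threshold
    ("Germany subsidiary", "audit"): 2,      # moderate and isolated
}

ESCALATE_SINGLE = 3    # any one domain at maximum severity
ESCALATE_COMBINED = 5  # or several moderate findings that add up

by_entity = {}
for (entity, domain), severity in signals.items():
    by_entity.setdefault(entity, []).append(severity)

for entity, severities in by_entity.items():
    if max(severities) >= ESCALATE_SINGLE or sum(severities) >= ESCALATE_COMBINED:
        print(f"Escalate {entity}: combined severity {sum(severities)}")

# Only the France subsidiary is escalated: none of its three findings would
# have crossed the bar individually, but together they do.
```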
OpenPages 6.0 will provide better insight through enhanced business intelligence. The power user will benefit from easier report building and in-context data presentation through Cognos mash-up services. The business user will benefit from interactive dashboards, and the executive from data syndication through Office and mobile devices. We’ll discuss some of the other new capabilities in 6.0 in subsequent blogs.