The Globe published an interesting article today about a Harvard Business School professor who resigned just before the scandal at Satyam broke. This was no ordinary professor. Krishna Palepu is an expert in corporate governance, control and accounting, and corporate management in emerging markets. In short, the perfect resume for a Satyam board member. So what went wrong?
This is not an isolated incident. In this financial crisis, many good people on the boards of struggling companies have been surprised, and we’ll likely see more of that in the months to come. I think it’s overly simplistic to blame the board, certainly in a case like this one, in which Palepu is so obviously qualified. What we see frequently is that internal control systems and risk assessment processes are not mature enough to catch wrongdoing or, and this may be more important, change behavior. Companies that are growing quickly, like Satyam, have the most difficulty putting in place the risk management processes to catch the kind of fraud perpetrated at the company. My guess is that in the future business processes will be designed from the bottom up with risk management in mind. As we’re learning, it’s too hard to do it after the fact, especially for the complicated businesses we’re trying to govern today.
Just attended a great session presented by Matthew Neels, Chief Compliance and Risk Officer at Capital One. Mr. Neels focused on building board interaction and driving board attention to the right areas of risk through an integrated risk management framework. He began with an interesting question, “Should you be using an implicit or explicit framework and how is your board making a decision on that framework?” The correct answer, of course, is that both are required to effectively manage risk.
He explained how explicit frameworks enable structured board discussions through a consistent and common approach, whereas implicit frameworks rely on “corporate culture and deep experience.”
In his session, Mr. Neels also detailed how multiple stakeholders use frameworks for ‘decision making, reporting and escalation’ and in particular, how the Board uses frameworks to:
Provide an objective yardstick or measure
Create a basis for understanding
Identify situations and areas that need attention
Highlight areas doing well
Help differentiate between expected and unexpected
The discussion then moved to how “driving board attention to the right areas can be difficult” as board reporting is often a “laundry list of potential risks, current issues and decision requests.” He stated, “Without a framework you have everything coming in at once without context.” He then offered several suggestions for preventing information overload:
Specific and quantifiable tolerance measurement is critical to driving board attention to the right areas
Set your risk appetite
Create a risk framework
Determine standard metrics and KRIs
Establish risk tolerances
Establish risk limits
The goal according to Matthew is to establish a “common scale that enables cross-category comparisons and risk aggregation.”
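The steps above can be sketched in a few lines of code. This is purely a hypothetical illustration of the idea, not anything Mr. Neels presented: every metric name, value, tolerance and limit below is made up. The point is the “common scale” — each KRI is normalized so that risks from different categories can be compared side by side, and anything past its tolerance is flagged for board attention.

```python
# Hypothetical sketch of a risk-tolerance framework: each KRI reading is
# mapped onto a common 0-100 scale so risks from different categories can
# be compared and aggregated. All names and numbers are illustrative.

def rate(value, tolerance, limit):
    """Map a KRI reading onto a common 0-100 scale.

    Readings inside tolerance score 0-50; readings between the
    tolerance and the hard risk limit score 50-100; at or past
    the limit the score is capped at 100.
    """
    if value <= tolerance:
        return 50 * value / tolerance
    if value >= limit:
        return 100.0
    return 50 + 50 * (value - tolerance) / (limit - tolerance)

# KRI -> (current value, tolerance, limit); purely illustrative numbers
kris = {
    "credit: 30-day delinquency %":  (2.1, 3.0, 5.0),
    "ops: failed trades per 10k":    (14.0, 10.0, 25.0),
    "liquidity: coverage ratio gap": (0.02, 0.05, 0.10),
}

scores = {name: rate(*v) for name, v in kris.items()}
escalate = [name for name, s in scores.items() if s >= 50]

print(sorted(scores, key=scores.get, reverse=True))  # board report order
print(escalate)                                      # past tolerance
```

On the common scale, the ops metric (past its tolerance but under its limit) sorts to the top of the board report even though, in raw units, it is incomparable with a delinquency percentage or a liquidity ratio.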
Lesson 3: You cannot afford to overlook or underestimate the correlation of risks.
There were two innovations that fueled the growth in the subprime mortgage market. The first was credit derivatives: in its simplest form, a credit derivative is a contract between two parties in which the seller agrees to compensate the buyer if a loan goes into default. The second innovation involved a process called securitization, which traditionally involved lenders selling their loans to an investment bank. The investment bank “bundled” the loans together and sold pieces of the bundle to pension funds and other investors. The original lenders, having offloaded their loans, could make new ones. The investors acquired a slice of the loan bundle and its interest income without having to go to the trouble of meeting and assessing the borrowers.
The next innovation was securitizing not just loans but credit derivatives. It was first applied to corporate loans, which tend to have very little correlation (correlation being the degree to which the defaults in any given basket of loans might be interconnected). But then it was carried over to mortgages and, more importantly, subprime mortgages. The financial services sector industrialized the procedure and began selling securitized debt and derivatives on an extraordinary scale. The fatal mistake was not realizing that subprime mortgages were highly correlated, especially in an economy where interest rates were rising and housing prices were falling nationwide. Moreover, subprime mortgages had intrinsic flaws (such as issuing loans with escalating interest rates to homebuyers with dubious credit ratings) that inevitably resulted in extremely high default rates.
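To see why correlation was so fatal, here is a minimal Monte Carlo sketch — illustrative numbers only, not any bank’s actual model. Both portfolios below have the same average default rate, but in the correlated case a single common factor (think of a nationwide housing downturn) drives many loans to default together, and the worst-case losses are several times larger.

```python
# A toy simulation of correlated vs. independent defaults. Both
# portfolios average a 5.5% default rate; only the correlation differs.
import random

random.seed(1)
N_LOANS, TRIALS = 1000, 1000

def simulate(correlated):
    results = []
    for _ in range(TRIALS):
        if correlated:
            # 10% of the time a common shock lifts every loan's default
            # probability to 37%; otherwise it is 2%. Average: 5.5%.
            p = 0.37 if random.random() < 0.10 else 0.02
        else:
            p = 0.055  # independent defaults at the same average rate
        results.append(sum(random.random() < p for _ in range(N_LOANS)))
    return sorted(results)

ind, cor = simulate(False), simulate(True)

# 99th-percentile loss - what a "safe" senior tranche is exposed to
print(ind[int(0.99 * TRIALS)])  # stays close to the 5.5% mean
print(cor[int(0.99 * TRIALS)])  # many times larger in the shock state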
J.P. Morgan opted not to get into this market, a very smart expression of a cautious corporate risk culture that ultimately saved the company from the disasters others suffered. Fool’s Gold gives a great account of how Morgan risk managers struggled to understand how other banks could be making so much money and covering their risks at the same time. To their credit, they did not enter the market because they understood the risk and did not have a way to mitigate it.
Lesson 4: Do not think that models are anything more than a guide or a compass.
Models are useful but they have limits. They are essential for navigating in the world of modern finance, but they are not infallible, no matter how well crafted they are. Models are only as good as the data that is fed into them and the assumptions that underpin their mathematics. The key simplifying assumption on which the credit derivative models rested was that the future was likely to look like the recent past. New financial innovations have no way to be tested relative to their risk level except by means of computer simulations that use historical data. But there are no statistics that truly represent the environment surrounding the new instrument and, as a consequence, no one really fully knows what are the risks associated with the instrument. This is especially true of risks connected with the “correlation” factor. Hence, innovations can always have “surprises” connected with their usage. Remember that models are only tools and should not be used without human intelligence.
Lesson 5: Regulation is not a panacea.
As the crisis unfolded, there was a lot of blame placed on regulators and regulation. Although the Federal Reserve had the legal authority, it did not have the inclination to regulate the behavior of banks that led to the disaster. Alan Greenspan, head of the Fed, admitted that he had made a ‘mistake’ in believing that banks would do what was necessary to protect their shareholders and institutions. This “absence” of oversight by the bank regulators has resulted in lots of discussion around new regulations, new regulatory agencies and so on. Tett’s book does an especially nice job in explaining how banks worked to get around capital requirements using the new tools and instruments. Part of the problem connected with the absence of the regulators during this period of time was that the banks worked very hard to expand their use of leverage in ways the policy makers could not see. Of course, this came back to haunt them when the collapse occurred. Financial institutions will always attempt to get around regulations in one way or another because it is profitable to do so. In addition, regulators are always behind what is going on in the industry. This is just the nature of the relationship.
Against the backdrop of Copley Square, Boston on St. Patty’s Day, Yousef Valine, Executive Vice President at First Horizon described the need to focus on non-financial risk and particularly, operational and business risk. GCOR (Global Conference on Operational Risk) 2010 is the fourth annual event hosted by the RMA (Risk Management Association). In his keynote address, Mr. Valine stated that while most believe earnings volatility is a factor of financial risk, earnings volatility can be attributed to non-financial risk 30% of the time – operational risk (12%) and business risk (18%) – versus financial risk 70% of the time. The key message being that business managers need to be operational risk managers at heart and need to foster and facilitate a strong risk-aware culture.
Mr. Valine also outlined how during 2002-2008, losses realized from the following events totaled $42b!
Enron, WorldCom, Adelphia scandals
Late mutual fund trading
Overdraft and credit card excessive fees
Auction rate securities
Of course this makes the Madoff scandal at $65b even more troubling (note: Harry Markopolos will provide an in-depth review of the factors that enabled Madoff and how to prevent similar fraud in the future in his Keynote Address at OPUS 2010). Yousef emphasized that 45% of the loss amount ($19b) was the result of loss events in “Client Products and Business Practices” and that while it represented 45% of losses, the number of events (frequency) only represented 11% of the total. Conversely, “Execution, Delivery and Process Management” represented 35% of frequency but only a fraction of the dollars lost. Ultimately, organizations need to consider severity versus frequency when reviewing loss events and mitigation practices.
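The severity-versus-frequency point can be made concrete with a toy calculation. The events and dollar amounts below are entirely made up, not Mr. Valine’s data; they simply show how a category’s share of total dollars lost and its share of the event count can diverge sharply, which is why mitigation priorities set by event counts alone can be badly wrong.

```python
# Made-up loss events: (category, loss in $m). Illustrative only.
from collections import defaultdict

events = [
    ("client products & business practices", 900),
    ("client products & business practices", 600),
    ("execution, delivery & process mgmt", 5),
    ("execution, delivery & process mgmt", 10),
    ("execution, delivery & process mgmt", 5),
    ("execution, delivery & process mgmt", 20),
]

dollars, count = defaultdict(float), defaultdict(int)
for cat, loss in events:
    dollars[cat] += loss
    count[cat] += 1

total_d, total_n = sum(dollars.values()), len(events)
for cat in dollars:
    # share of dollars lost vs. share of event count
    print(cat, round(dollars[cat] / total_d, 2), round(count[cat] / total_n, 2))
```

Here the first category produces the overwhelming majority of the dollars lost from only a third of the events — the severity/frequency inversion described above.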
You’re a CEO, senior manager, or board member watching your once-great company brought to its knees. You imagine yourself on the deck of the Titanic, your world coming to an end—your once confident self embarrassed in front of colleagues, competitors, friends, family, and the larger communities in which you once thrived and were held in such high esteem.
This is the first sentence of a just-released book published by John Wiley & Sons. I got my hands on an advance copy, and it is compelling reading. It analyzes how – while facing different circumstances in different industries – common themes underlie why once-great companies have seen their fortunes sink, while others withstand economic turbulence and hazards to continue to grow and reap the rewards of success. But the book is not solely about how to avoid disaster. It highlights how having the right infrastructure enables an organization’s positive qualities to lead to success. This includes what’s needed to avoid the kinds of disasters that can befall any organization, but also what’s essential to identifying opportunities and being positioned to seize them for competitive advantage.
I don’t often recommend books to others, but this one is exceptional. It has a long title: Governance, Risk Management and Compliance – It Can’t Happen to Us: Avoiding Corporate Disaster While Driving Success. I believe the substance stands up to its claim that “unlike other books, this one is not aimed solely at senior managers or solely at members of boards of directors. It’s directed to both, with an added objective of providing insight into the interface between the two.”
You might be asking why Steinberg is spending so much space here touting this book – is it because the book is really that valuable, or does he have some ulterior motive? Well, okay, I’ll fess up – the answer is “both.” Yes, as you may have guessed, I wrote the book. And I apologize for withholding that important fact until now! But I do believe virtually any reader of this blog will greatly benefit from reading the book. And I’m pleased that I’m not the only one who thinks so. Here’s what some others, whose names you might recognize, are saying:
Rick Steinberg is a time-tested expert in this ever more essential field. His refreshing candor in assessing recent shortfalls makes this book a must-read for corporate leaders — Mark R. Fetting, Chairman and CEO, Legg Mason, Inc.
This outstanding book provides a critically important perspective on how risk management can only be truly achieved by aligning culture, strategy, compliance programs, and compensation. It should be must reading for any board member concerned with improving the management of risk — Jay Lorsch, Louis E. Kirstein Professor of Human Relations, Harvard Business School
A comprehensive and insightful examination of corporate governance. A must-read for those of us who are CEOs and serve on public boards — Randall L. Clark, Chairman and CEO, Dunn Tire LLC; former Chairman and CEO, Dunlop Tire North America
Attention directors and officers: Ignore this book at your own peril. Richard Steinberg has crafted a careful, thoughtful approach to managing risks, and it should be required reading for Corporate America — Scott S. Cohen, founder and former Editor and Publisher, Compliance Week
Richard Steinberg’s comprehensive and clearly written work will substantially benefit both new and experienced directors. It will help corporate boards recognize the challenging forces businesses face, as well as the techniques and standards available to intelligently monitor and supervise firms and their senior management. An easy and engaging read, this book should be on the bookshelf of every corporate director — William T. Allen, Director, NYU Pollack Center of Law & Business; former Chancellor, Court of Chancery of the State of Delaware
Richard Steinberg, a respected and time-proven governance hand, has written a most enjoyable and thought-provoking work—an excellent addition to anyone’s governance shelf! — Charles Elson, Edgar S. Woolard, Jr., Chair in Corporate Governance and Director of the Weinberg Center for Corporate Governance, University of Delaware
By the way, the IBM OpenPages people were kind to allow me to use a paper I wrote for them as the basis of one of the chapters. I hope you will consider reading the book, and I trust you will not be disappointed!
We’re pleased to announce that OpenPages and Network Frontiers have partnered to deliver the Unified Compliance Framework (UCF) to the OpenPages customer base. The addition of the UCF content into the OpenPages IT governance solution – OpenPages ITG – supports OpenPages’ goal of providing its customers with a holistic approach to managing IT risk and compliance.
The partnership provides strong synergies for our customer base of enterprise GRC professionals, many of whom are looking to OpenPages for IT risk and compliance management. Previewed at OPEN 2009 – the OpenPages European Network Summit recently held in London – the UCF data gives OpenPages customers access to the most comprehensive set of IT policies and controls that cross multiple regulations, thus reducing the time commitment and costs associated with complying with the slew of IT risk and compliance mandates nearly all companies are faced with today. In a survey conducted at OPEN 2009, 93% of organizations stated that within 2-3 years they are likely to converge or coordinate IT risk and compliance with GRC management.
The announcement was well received by industry experts including Michael Rasmussen, President of Corporate Integrity, a GRC strategy advisory firm:
“In today’s economy, wasting valuable resources on costly and time-consuming processes associated with compliance and risk management can be damaging to IT GRC programs. With the UCF enhancements to the OpenPages Platform offering, customers are given the tools to more quickly and effectively comply with a multitude of regulations and from there, can focus more attention on ensuring that their IT GRC programs are sustainable, repeatable and increase transparency across the enterprise.”
Linda Tucci, Senior News Writer at SearchCompliance.com, recently wrote “the rap against governance, risk and compliance (GRC) software is that the solutions either fall short of effectively managing the complexity of an enterprise’s compliance programs, or are so complex that enterprises never realize the software’s full capability. For compliance officers who are patient enough to start small while thinking big, however, the right GRC software can help put large, complex organizations on the path to Sarbanes-Oxley Act (SOX) compliance nirvana: a risk-based program optimized by automation.”
In the article, Ms. Tucci refers to Tommy Thompson, IT security compliance coordinator at The Williams Cos. Inc. and long-time OpenPages customer. Mr. Thompson has been using OpenPages ITG for nearly four years, taking a top-down, risk-based approach to managing Williams’ financial controls, and is now looking to expand to other areas of compliance.
A key challenge in many IT organizations is being able to allocate resources to the right set of problems. Many times, IT managers will take a bottom-up approach to IT risk and security and implement controls and procedures without any clear link to the key risks in the business. Williams is a great example of why you should consider a top-down approach to risk management that supports overall corporate objectives and operates within the corporate ERM framework.
To achieve this, your IT risk and compliance solution should provide a way for your IT organization to link IT risk management activities to the key risks in the business to ensure better business performance against corporate objectives and a more efficient use of resources (capital and operating budget).
This morning’s featured panel discussion at GARP includes several CROs and senior risk practitioners from Morgan Stanley, The Vanguard Group, Credit Suisse and Western Asset Management.
The first topic was VAR. VAR works in “normal markets.” There is a question of what is the appropriate time window. One panelist remarked that it would be good to have better regulatory consistency on this issue: should companies be focused on a 1-year or 4-year timeframe, for instance?
VAR tends to distract you from the tails, and one panelist remarked that “you really need to stay focused on the tails,” e.g. gap risk, liquidity risk, etc. The panelist continued to say that he’s really focused on the deep downside risk: how much money could the position/desk possibly lose. You have to be very dynamic in thinking about where you can be hit next.
The third panelist asserted, “I think VAR is worthless and pernicious and should be banned,” noting that it’s not a coherent risk measure (99.9% VAR doesn’t handle a 1 in 200 year event). Also, the panelist pointed out that it doesn’t encourage diversification. He focused on scenario analysis but said that there is no easy answer.
Another panelist defended VAR as a tool that has its pluses and minuses.
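For readers outside the debate, here is a small illustration — using fabricated P&L data, not anything from the panel — of why VAR “distracts you from the tails”: a historical-simulation 99% VAR reports the loss threshold for the worst 1% of days, but says nothing about how bad the losses beyond that threshold get. A complementary measure such as expected shortfall (the average loss in that tail) at least looks past it.

```python
# Historical-simulation 99% VAR vs. expected shortfall on fake daily
# P&L: mostly small moves plus a few rare, large "gap" losses.
import random

random.seed(7)
pnl = [random.gauss(0, 1.0) for _ in range(990)] + \
      [random.gauss(-8, 2.0) for _ in range(10)]   # rare gap days

losses = sorted(-x for x in pnl)           # losses as positive numbers
cutoff = int(0.99 * len(losses))
var99 = losses[cutoff]                     # 99% VAR threshold
tail = losses[cutoff:]                     # the worst 1% of days
es99 = sum(tail) / len(tail)               # expected shortfall

print(round(var99, 2))
print(round(es99, 2))   # materially worse than the VAR number
```

The VAR figure here is dominated by the boundary of the tail, while the average loss inside the tail — driven by the gap days — is substantially larger, which is exactly the gap risk and liquidity risk the panelist was pointing at.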
The panelists then turned to the role of risk managers, and their role in predicting the future (in the context of the financial crisis). If risk is lack of information about the future, many companies failed to hedge when there was a very cloudy future (lack of information). One panelist noted that in many cases the risk management failure was more than just the technical capability of knowing what to do; it was actually a failure to be able to drive action.
The question of the changing regulatory landscape came up, with one panelist joking that CRO stands for Chief Regulatory Officer now. Another joked that he’s trying to stay away from the regulatory topic because they don’t know whether what they do will be “legal or illegal” under reg reform.
There was agreement that the FDIC has been very successful in carrying out its mission. But one panelist said that in the near term we don’t seem to be on a path towards getting an effective systemic risk regulator. Another said that we’re creating systemic risk through regulatory uncertainty.
COSO recently released reports providing guidance in two areas related to risk management. One is Embracing Enterprise Risk Management – Practical Approaches for Getting Started, which suggests ways in which companies, especially smaller ones, can begin a risk management initiative with the objective of ultimately moving to an ERM process. It puts forth “keys to success” in terms of a number of “themes,” beginning with being sure to have support from the top. Theme 2 is building on incremental steps, which includes implementing key practices to gain immediate and tangible results. Theme 3 continues with focusing first on a small number of “top” risks, and theme 4 is leveraging existing resources by utilizing the capabilities of the chief audit executive, chief financial officer or other executive as a catalyst to begin the initiative.
The guidance continues with theme 5, building on existing risk management activities already being performed, for example, by internal audit, insurance or compliance functions, fraud protection/detection measures, or credit or treasury functions. Theme 6 involves embedding risk management into the fabric of the business, and concludes with theme 7’s continuing to update and educate senior management and the board on evolving ERM practices.
The guidance also provides seven “action steps” to support development of an ERM initiative: Seeking board and top management leadership, involvement and oversight; selecting a strong leader for the ERM initiative; establishing a risk committee or working group; conducting an enterprise wide risk assessment and developing a related action plan; inventorying existing risk management practices; developing a communication and reporting process; and developing the next phase of action plans and communication.
As stated in the report, the suggested incremental step-by-step approach may be particularly useful to smaller companies, and, importantly, the suggested approach is only a starting point for moving to an enterprise risk management process. I believe the report is well meaning, looking to break down barriers and resistance to embarking on building an ERM process, and as such may be useful to companies considering taking a first step. But that’s all it is. It doesn’t provide guidance on how to design an ERM process, and how it can be effectively implemented throughout an organization. Yes, some of the “steps” are a start, but my concern is that, despite the warnings, companies going down this path will somehow believe they will have installed ERM in their organizations.
In Olympic games terms, with only two entrants, this report gets the silver. The second report on key risk indicators wins the gold – by a good margin. I’ll speak to that report in my next blog posting.
Last week we announced the availability of OpenPages version 6.0, which marks a major milestone in the evolution of the GRC market – from convergence to insight. It also represents the completion of the first phase of our technical integration with IBM. And, the new release will help prepare our customers for managing through regulatory change in the post-Dodd-Frank environment.
Several industry experts have had positive things to say about the news:
“But there is a significant gap between collecting data and actually making it usable. The release of version 6.0 of the OpenPages GRC platform, which IBM acquired last year, is a significant step forward in terms of closing that gap by tightening the integration between OpenPages and the business intelligence (BI) software from Cognos that IBM also acquired back in 2007.”
Industry Analyst Guillermo Kopp wrote a report on 6.0, which details the key benefits and opportunities for the combined solutions of OpenPages and IBM. Regarding integrated risk management, he says:
“A centralized governance, risk, and compliance (GRC) platform will help large companies manage various risks across client, location, product, and service domains. For financial firms, integrating financial risk dimensions (e.g., credit, market) will augment the challenge substantially.”
6.0 was also featured as the top story in CMS Wire’s GRC Roll-up.
Businesses have always been engaged in managing risk, but it has taken an unprecedented wave of regulatory oversight to convince many organizations how inadequate their risk management policies and procedures really are.
Firms should have completed or be in the process of completing a detailed gap analysis to identify any shortfalls in expected compliance with the emerging Solvency II requirements, as they bear on their operations.
A gap analysis should evaluate the current state of an insurer’s risk management system against current risk standards and the desired state. The organization then must develop a roadmap on how to achieve that desired state. Organizations need to evaluate their entire risk management system and how all of its risk areas are being managed.
Given that executive management is charged with ownership of operational risk management and the need to embed it within the organization, many companies are turning to integrated risk management solutions to better understand and proactively manage the risks that can impact the business.
For more information on Solvency II and meeting the Solvency II operational risk challenge, check out this white paper.
Brandishing new authority thanks to the Dodd-Frank Act, the SEC was quick to act on an agenda item that had been on the table for 30 years. Yesterday, the SEC approved a ‘Proxy Access’ rule that allows shareholders to place nominations for board member seats on the annual proxy ballot of public companies. The rule applies to shareholder groups who have owned greater than 3% of a public company’s stock for at least 3 years.
SEC Chairman Mary Schapiro succeeded where her two predecessors had failed in gathering a 3-2 vote in favor of the rule, which was divided along party lines as both Republican members objected. While this is a win for investor groups who now have increased influence over board make-up, there are no provisions in the rule for smaller, individual investors who own less than 3% of the stock or have held the stock for less than 3 years.
One thing is certain: the new rule reflects the anger and backlash of shareholders who feel that boards of directors were not acting in the shareholders’ best interest when taking highly leveraged and risky positions that led to the 2008 financial meltdown. As Rick Steinberg pointed out in his recent blog, this indicates a clear trend toward increasing shareholder power and of companies and their boards ‘opening channels of communication with shareholders.’ As these channels are opened, an information architecture that provides full transparency into risk exposure and enables information sharing will help to fill the communication gap between the Board and shareholders.
With the passing of the Dodd-Frank Wall Street Reform and Consumer Protection Act, many companies are bracing for the regulatory onslaught. The problem is that few of the provisions in the legislation take effect immediately, and what we’re really facing is much rulemaking from new (e.g. the Consumer Financial Protection Bureau) and existing regulatory bodies. This rulemaking will take place over the next five years, with the bulk of the activity in the next two. So how should financial services companies position themselves?
It is clear that a major theme of the legislation is greater transparency into risk exposure across the financial system. Basel II can be faulted for taking an institutional approach to risk management, and the financial crisis of 2008 clearly revealed gaps in the way regulators assessed and managed risk across institutions. This wave of regulatory rulemaking will try to address those gaps, and, in fact, Treasury Assistant Secretary Michael Barr in a recent speech at the Chicago Club made several references to Basel III, an indication that regulators worldwide will be coordinating on liquidity and capital standards to manage systemic risk.
Regardless, regulators worldwide will still be collecting risk exposure data from institutions. As a first step, institutions can put in place an information architecture that can quickly and accurately serve up risk exposure information, and all financial services institutions need to work on this. The Dodd-Frank law, for instance, creates a Financial Stability Oversight Council that will have the authority to instruct the Federal Reserve and other agencies to collect all sorts of risk exposure data. Most companies know where their current gaps are; these need to be addressed immediately.
The scope of the rulemaking also suggests that we’re going to be in a very dynamic regulatory environment for a long time. As such, covered companies would do well to make sure this information architecture can adapt to change over time. Implementations of static frameworks for regulatory compliance could be obsolete before the project is finished! Any solution must be able to adapt and extend over time.
Finally, as companies put in place this information architecture to surface enterprise risk exposure, thinking about interdependencies will be critical to reduce cost. Inevitably, there will be much overlap between the information requests from different regulatory agencies. Your ability to handle these requests, as well as those from the business, with a minimal set of reports will save you time and resources. An integrated risk and compliance framework can reduce the disparate databases and reporting structures. Of course, you may not be able to consolidate everything onto a single, integrated system, but thinking about pairwise combinations is a good start.
Julian Parkin, Group Privacy Programme Director at Barclays, recently delivered the day two keynote address at OPUS 2010 – the OpenPages User Symposium. In his keynote address, Parkin discussed how Barclays has leveraged OpenPages for its risk management initiatives and how the flexibility of OpenPages’ technology has been harnessed to drive sustainable improvements across evolving risk types.
After his keynote, I had the opportunity to interview Julian and discuss his experience at OPUS 2010 and as a member of the OpenPages user community.
Even in the wake of sweeping deregulation of the energy industry, few companies face as much government oversight as utilities. Power generation and distribution companies are subject to a maze of regulatory oversight, including state agencies and federal agencies such as the Federal Energy Regulatory Commission (FERC), the North American Electric Reliability Corporation (NERC), the Nuclear Regulatory Commission (NRC), the Environmental Protection Agency (EPA) and the Occupational Safety and Health Administration (OSHA).
As Managing Director of Corporate Compliance at Duke Energy, Tom Wiles knows firsthand the challenges of operating a business in a regulated industry. Duke Energy – a Fortune 500 company traded on the New York Stock Exchange – is one of the largest electric power companies in the United States, delivering energy to approximately 4 million U.S. customers.
In a Compliance Week Webinar titled “Proactive Ethics and Compliance Programs in a Regulated World”, Tom Wiles discusses how a “proactive partnering” and “risk-focused coverage” approach has delivered positive results for Duke. He states that in order to create an effective and efficient enterprise-wide ethics & compliance infrastructure, the Ethics and Compliance Manager needs to establish expectations, communicate expectations, monitor behavior, report results and provide continuous improvement.
If you’d like to learn the key steps your organization can follow to integrate disciplined ethics and compliance management into your business and hear about the value organizations are receiving from effective programs, check out this Webinar.
On a daily basis, we’re out speaking with prospects, customers, analysts, press, and thought leaders in the GRC/ERM space. Over the course of the last year, we’ve heard many myths about risk management, and, over the course of the next couple weeks, we’ll address these myths. But we thought that we would give you a taste of what’s to come, so here is a list of the top 10 myths in risk management. Please feel free to add your own in the comments section. This list is certainly not exhaustive!
1. IT Risk Management = Information Security
2. CIOs Have Embraced GRC
3. A Rigid, Standardized Approach Is Best
4. You Can Only Manage Risk from the Center
5. You Can Manage Risk and Compliance in Spreadsheets
Today we announced the availability of OpenPages 6.0. This release represents a significant new phase in the evolution of GRC and provides organizations with the insight needed to drive business outcomes as well as the ability to manage effectively through the changing regulatory environment. We’re also excited to have completed the first phase of technical integration with IBM with the release of AIX support.
The GRC market developed out of the tactical, departmental deployment of SOX and other compliance and risk management solutions. Companies realized that they could leverage their control testing and risk assessment activities across multiple different oversight functions by consolidating their risk and compliance efforts on a common technology platform. Indeed, we’ve seen very strong ROIs for Enterprise GRC platforms, ROIs driven by this efficiency. The next phase in the evolution of GRC is about insight, using the GRC data to help drive business outcomes.
Here’s an example of how GRC data can be used to drive business outcomes. Imagine a multinational bank that has a subsidiary in France. The compliance team has identified some procedure violations with regard to the handling of customer account data. The audit team has found some major control weaknesses surrounding customer account data, and the operational risk team has observed some KRIs above threshold. Any one of those functions may not escalate their particular findings, but, taken as a whole, the GM in France would be able to see that the business is at great risk of a significant loss. This is the kind of insight that can help drive business performance, in this case avoiding a fine and loss of brand stature.
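The correlation described above can be sketched in a few lines. This is a hypothetical illustration (the entity names, severity scale, and thresholds are all assumptions, not part of any OpenPages API): each oversight function reports findings tagged with a business entity, and an entity is escalated only when multiple functions flag it and the combined severity crosses a threshold.

```python
from collections import defaultdict

# Hypothetical findings from separate oversight functions; in a real GRC
# platform these would come from the compliance, audit, and op-risk modules.
findings = [
    {"entity": "France", "function": "compliance", "severity": 2},
    {"entity": "France", "function": "audit", "severity": 3},
    {"entity": "France", "function": "op_risk", "severity": 2},
    {"entity": "Germany", "function": "audit", "severity": 1},
]

def entities_to_escalate(findings, min_functions=2, min_total_severity=5):
    """Flag entities where several functions report issues whose
    combined severity crosses a threshold."""
    by_entity = defaultdict(list)
    for f in findings:
        by_entity[f["entity"]].append(f)
    flagged = []
    for entity, items in by_entity.items():
        functions = {f["function"] for f in items}
        total = sum(f["severity"] for f in items)
        if len(functions) >= min_functions and total >= min_total_severity:
            flagged.append(entity)
    return flagged

print(entities_to_escalate(findings))  # France crosses both thresholds
```

No single finding above is alarming on its own, which is exactly the point of the bank example: the signal only appears once the data from all three functions sits on a common platform.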
OpenPages 6.0 will provide better insight through enhanced business intelligence. The power user will benefit from easier report building and in-context data presentation through Cognos mash-up services. The business user will benefit from interactive dashboards, and the executive from data syndication through Office and mobile devices. We’ll discuss some of the other new capabilities in 6.0 in subsequent blogs.
A recent client discussion reminds me of an article I came across a few years back with important implications for dealing with risk – or rather a risk that materializes into a major problem. The article, “What Organizations Don’t Want To Know Can Hurt,” focuses on events surrounding the College Board when it learned of extensive errors scoring its SAT tests, and provides a good example of what not to do.
The company’s president reportedly said that finding the specific cause of the failure “did not really matter,” but rather what’s important is to ensure that improved controls catch future problems. His position was supported by the engagement leader of a consulting firm hired by the company, saying that dissecting past problems is not necessary either to ensure that the scoring system works better in the future or there is a good safety net to catch errors. He goes on, “You can do both without knowing whether it was rain that made the papers wet, or whether someone spilled a cup of coffee…[and] if we tried to brainstorm everything that could go wrong, we’d be here for years – for a lifetime. But if controls are in place to identify problems, and rescore tests that were misscored, that’s what you’re really looking for.”
These statements are fascinating – that there’s no need either to look back at why something went wrong because it’s unnecessary, or to dig deeply into what could go wrong because it would take too long. It suggests that problems in test scoring – which would certainly seem to be central to the company’s credibility and indeed its sustainability – are okay as long as they ultimately are found and test results rescored. Simply “catching future problems” by “rescoring tests” means that the company is satisfied detecting major problems with scoring after they occur, rather than taking steps to prevent such problems in the first place. I wonder what users of SAT scores think about that!
If you’re smiling at this you’ve got company. Clearly, looking neither backward nor forward is not a viable option. And doing only one or the other is not the answer either. Rather, it’s necessary to do both. Only by getting behind what went so wrong can management feel comfortable it understands what risks continue to exist, and only then is it positioned to look at what additional risks need to be the focus of its attention going forward.
It doesn’t take a genius to know that when a problem rears its ugly head it is essential to find out why. The article talks about fields like aviation and medicine that conduct investigations to find out exactly what went wrong, to learn from often deadly mistakes and to improve processes and protocols. The National Transportation Safety Board does so focusing primarily not on casting blame but on making things better. Similarly, many hospitals hold mortality and morbidity conferences to analyze and learn from mistakes. Many businesses do that as well, learning from what went wrong. They don’t choose between learning from the past and working to make things better. They do both, with one supporting the other. And no, it doesn’t take “a lifetime” to find out what caused a major problem or to identify the source of the next potential disaster.
According to an IBM study of over 1,200 CFOs and senior finance executives, 62 percent of enterprises with over $5 billion in revenue encountered a major risk event in the previous three years, and when a major risk event did occur, 42 percent were not well prepared. Unlike Sarbanes-Oxley and other structured, clearly defined compliance initiatives, building an effective operational risk control environment and culture requires proactive identification and frequent review of potentially harmful events.
GRC industry expert and Corporate Integrity president Michael Rasmussen’s favorite operational risk case study is the Titanic, in which, as he states, “There are a variety of risks the Titanic faced – overconfidence, poorly manufactured rivets, focus on speed while ignoring the external risk environment, inadequate design, and lack of someone diligently watching for icebergs”. While the Titanic was heralded for its superior safety in engineering design, not all risks were considered holistically. In many organizations today, operational risk continues to be managed in silos, where distributed business units and processes maintain their own data, spreadsheets, analytics, modeling, frameworks, and assumptions.
To learn more, check out the “Ultimate ORM Platform” webinar in which Michael Rasmussen and OpenPages director of product management Patrick O’Brien describe the need for a common, enterprise-wide view of risk and what to look for in an “Ultimate ORM Platform”.
If nothing else, the financial crisis of 2008 has driven home the need to improve reporting to the organization regarding risk posture and exposure. As we look to 2010 and beyond, risk and compliance processes will no doubt evolve to meet changing business and regulatory requirements. Coming in at #8 on the 2010 GRC Wish List is “Strong Reporting with Easy-to-Use Formatting.” While the value of strong reporting is clear, a few challenges remain:
Cross-domain Reporting – With the large number of risk and compliance initiatives underway at organizations today, users are struggling to deliver comprehensive enterprise risk management. Users need a way to understand and manage their risk exposure across the numerous risk and compliance domains through enterprise risk assessments and integrated reporting. GRC solutions that are developed independently in silos produce application-specific reports that reference only data local to that application, providing an incomplete picture of enterprise risk exposure.
Multiple Reporting Regimes – Companies are struggling to meet the needs of an increasing number of reporting regimes. For instance, a financial services company may have adopted the COBIT framework for IT management, adhere to FFIEC best practices guidelines and may be looking to establish an Anti-Money Laundering (AML) program. The key challenge facing these organizations is in establishing a risk framework that integrates multiple reporting regimes and provides visibility into the state of key risks across the enterprise.
Linking Oversight with Operating Environment – Effective “governance” implies effective oversight and reporting. To deliver effective oversight, GRC professionals need to be able to link their oversight and reporting to their operating environment by drilling-down to view control status at the asset level.
Profile-based Reporting – Risk management professionals, compliance professionals and auditors frequently have access to highly confidential and sensitive information. Oftentimes, that information needs to be segmented from other stakeholders in different roles, entities, geographies or functional risk areas. GRC solutions need to provide a highly configurable, flexible and secure access control and security model to ensure that risk data is seen only by the right people, in the right context, at the right time.
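The profile-based segmentation described in the last item can be illustrated with a minimal sketch. This is purely hypothetical (the `Profile` class, entity and domain names are illustrative assumptions, not any product's security model): each user profile is scoped to a set of entities and risk domains, and a record is visible only when both scopes match.

```python
from dataclasses import dataclass

# Hypothetical access model: a profile is scoped to business entities and
# risk domains; a record is visible only when both scopes match.
@dataclass(frozen=True)
class Profile:
    entities: frozenset
    domains: frozenset

records = [
    {"entity": "EMEA", "domain": "AML", "detail": "threshold breach"},
    {"entity": "EMEA", "domain": "IT", "detail": "patch overdue"},
    {"entity": "APAC", "domain": "AML", "detail": "KYC gap"},
]

def visible_records(profile, records):
    """Return only the records the profile's scope permits."""
    return [r for r in records
            if r["entity"] in profile.entities and r["domain"] in profile.domains]

# An AML analyst scoped to EMEA sees only the EMEA/AML record.
emea_aml_analyst = Profile(entities=frozenset({"EMEA"}),
                           domains=frozenset({"AML"}))
print(visible_records(emea_aml_analyst, records))
```

In practice the filtering would be enforced in the reporting layer itself, so that the same dashboard renders a different, correctly segmented view for each role.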
What reporting challenges does your organization face?