On a daily basis, we’re out speaking with prospects, customers, analysts, press, and thought leaders in the GRC/ERM space. Over the last year we’ve heard many myths about risk management, and over the next couple of weeks we’ll address them here. To give you a taste of what’s to come, here is a list of the top 10 myths in risk management. Please feel free to add your own in the comments section; this list is certainly not exhaustive!
1. IT Risk Management = Information Security
2. CIOs Have Embraced GRC
3. A Rigid, Standardized Approach Is Best
4. You Can Only Manage Risk from the Center
5. You Can Manage Risk and Compliance in Spreadsheets
One wonders what the heck was going on at Daimler, maker of the high-quality, classy Mercedes-Benz automobile. In case you missed it, media reports say Daimler admitted to engaging in a massive and pervasive bribery scheme and agreed to pay $185 million to settle the charges. And this wasn’t information the company volunteered; rather, it was the result of a lengthy government investigation.
And it wasn’t just a one-time event – not by a long shot. Rather, hundreds of bribes totaling tens of millions of dollars were paid in no fewer than 22 countries over a ten-year period. In a number of instances, so-called “cash desks” were used to pay currency directly to government officials. In other cases, the company used the foreign bank accounts of shell companies to hide payments. Daimler also reportedly inflated invoices for cars to generate still other payments.
What’s perhaps most disturbing is that the reports say this wasn’t a lower- and middle-management activity, but involved “important executives,” including heads of overseas sales divisions and, more unsettling, even the company’s internal audit office. The Department of Justice complaint speaks to Daimler’s “longstanding violations” of bribery rules and a “corporate culture that tolerated and/or encouraged bribery.” The reports also say the complaint points to “a lack of central oversight over foreign operations.”
It’s well known that the Justice Department in the U.S. is pushing hard on possible Foreign Corrupt Practices Act violations, and European regulators are increasing rulemaking and enforcement as well. The internal controls that help deal with the risk of improper payments are also well known. Of course, if senior managers are turning a blind eye, or worse yet encouraging such payments, then all bets are off. For readers with responsibility for dealing with these kinds of issues, a company’s corporate culture, including the tone at the top of the organization, is the first place to focus attention. Then look at the risk management and compliance processes in place, and how they’re working, to gain comfort that anti-bribery in your organization is indeed under control.
If nothing else, the financial crisis of 2008 has driven home the need to improve reporting to the organization regarding risk posture and exposure. As we look to 2010 and beyond, risk and compliance processes will no doubt evolve to meet changing business and regulatory requirements. Coming in at #8 on the 2010 GRC Wish List is “Strong Reporting with Easy-to-Use Formatting.” While the value of strong reporting is clear, a few challenges remain:
Cross-domain Reporting – With the large number of risk and compliance initiatives underway at organizations today, users are struggling to deliver comprehensive enterprise risk management. Users need a way to understand and manage their risk exposure across the numerous risk and compliance domains through enterprise risk assessments and integrated reporting. GRC solutions that are developed independently in silos produce application-specific reports that reference only data local to that application and provide an incomplete picture of enterprise risk exposure.
Multiple Reporting Regimes – Companies are struggling to meet the needs of an increasing number of reporting regimes. For instance, a financial services company may have adopted the CoBIT framework for IT management, adhere to FFIEC best practices guidelines and may be looking to establish an Anti-Money Laundering (AML) program. The key challenge facing these organizations is in establishing a risk framework that integrates multiple reporting regimes and provides visibility into the state of key risks across the enterprise.
Linking Oversight with Operating Environment – Effective “governance” implies effective oversight and reporting. To deliver effective oversight, GRC professionals need to be able to link their oversight and reporting to their operating environment by drilling down to view control status at the asset level.
Profile-based Reporting – Risk management professionals, compliance professionals and auditors frequently have access to highly confidential and sensitive information. Oftentimes, that information needs to be segmented from other stakeholders in different roles, entities, geographies or functional risk areas. GRC solutions need to provide a highly configurable, flexible and secure access control and security model to ensure that risk data is seen only by the right people, in the right context, at the right time.
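To make the profile-based idea concrete, here is a minimal sketch of entitlement-based filtering of risk records by entity and risk domain. All the names here (the record fields, the `Profile` type, the sample entities) are hypothetical illustrations, not taken from any particular GRC product:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RiskRecord:
    title: str
    entity: str   # business entity or geography the record belongs to
    domain: str   # functional risk area, e.g. "IT" or "AML"

@dataclass(frozen=True)
class Profile:
    entities: frozenset  # entities this user is entitled to see
    domains: frozenset   # risk domains this user is entitled to see

def visible_records(records, profile):
    """Return only the risk records this profile is entitled to see."""
    return [r for r in records
            if r.entity in profile.entities and r.domain in profile.domains]

records = [
    RiskRecord("Wire-transfer fraud exposure", "EMEA", "AML"),
    RiskRecord("Server patching gap", "EMEA", "IT"),
    RiskRecord("Vendor concentration risk", "APAC", "IT"),
]

# An IT auditor scoped to EMEA sees only the one record in scope.
emea_it_auditor = Profile(frozenset({"EMEA"}), frozenset({"IT"}))
print([r.title for r in visible_records(records, emea_it_auditor)])
# prints ['Server patching gap']
```

A real platform would layer this kind of scoping into every report and dashboard query, so that the same report renders differently for each role, entity and geography.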
What reporting challenges does your organization face?
In a recent research brief published by Forrester Research, analyst Chris McClean listed his predictions for GRC in 2011 and beyond. #3 on his list is: “New and changing regulations will hinder GRC maturity in the short term.”
We believe that new and changing regulations will segment the GRC market between those vendors that manage regulatory change, and those that do not. As we’ve seen with Dodd-Frank and the countless new and upcoming regulations across finance, healthcare and consumer protection, risk and compliance managers are struggling with an unprecedented onslaught of regulation that as Chris states, will pile on “countless control and reporting requirements onto already complex and taxed compliance departments.”
If you’re considering a GRC solution to assist with this dynamic environment of regulatory change, you would do well to require one that can help you put in place a programmatic framework for communicating changes to regulations and managing the internal regulatory change process so your business can react quickly. You will also want to consider a solution that can help you manage the interactions, communication and internal work associated with external regulators, such as inquiries, submissions, filings, exams and audits.
The noon panel at GARP discussed risk and performance management, with a diverse set of participants, including representation from Hess, Swiss Re, and Vanguard.
Kanwardeep Ahluwalia from Swiss Re noted that many companies are going through a derisking process right now. However, Ahluwalia cautioned that companies need to be cognizant of how much they are paying to reduce their risk. In many cases, especially now, it may make more sense to manage the risk internally to maximize performance.
What is the role of risk management in the budget process? Panelists suggested that during the budgetary process risk management should step up and call out inconsistencies between risk and performance goals. The moderator, Kevin Buehler from McKinsey, noted that he has often found that companies in trouble have misaligned expectations between risk and reward. For instance, a company may have aggressive revenue goals to take share in a particular (emerging) market, but those goals may be in conflict with a risk-adjusted return on capital. He added that risk management does not typically win out in a conflict in which the CEO is on the other side, but you have to force the dialog.
Jonathan Stein from Hess argued that risk management needs to move beyond the “be careful” mantra and into recommendations for risk mitigation. He talked about the importance of developing scenarios that help define triggers for risk mitigation actions.
In general, the message from the panelists was that deeper interaction with the business allows risk managers to be more effective. This includes everything from designing risk management processes around the way the business makes money to prompting a dialog at the executive level when risk and performance expectations are not aligned.
The Institute of Internal Auditors 2009 General Audit Management Conference is coming up and should be quite timely given the evolving role that Audit is playing in providing an independent assessment of enterprise risk and governance. The conference has some intriguing sessions, including:
As you can see, internal audit has evolved from its traditional role of record examination and identification of policy violations to a more modern, consultative approach aimed at risk mitigation. As part of this evolutionary process, internal auditors have also focused more of their efforts on the risk assessment process and a top-down approach to audit scoping.
One of the key roadblocks to an integrated approach was the sheer complexity of data gathering and management. In the past, it represented a tremendous amount of effort for internal audit to collect relevant information and to govern access to that information securely. A centralized technology platform for identifying, assessing and monitoring risk and controls presents a unique and unprecedented opportunity to help the business focus on making risk decisions based on management’s risk appetite and tolerances. This common framework and process can make the business more predictable in meeting financial and management objectives and can help managers anticipate major risk and control problems of the future.
As a partner with the business in managing risk, internal audit should be a driving factor in evaluating technological and process-based changes and evolving the organization’s risk management practices.
If you’re planning on attending IIA GAM March 16-18 in Washington, DC please visit the OpenPages booth. And don’t forget to enter the raffle for a Flip handheld video recorder. Or, to learn more download our informative white paper, Internal Audit and its Evolving Role in ERM.
Yesterday, we announced a joint business relationship with PwC. This is the result of our closer alignment in the market for GRC solutions. We’re proud to be associated with such a great firm: with over $26 billion in revenue and 163,000 people in 151 countries, PwC has a strong global presence. We’ve found that PwC also has a strong presence at our financial services customers, and, given the challenges facing that industry, we think there’s a great opportunity to deliver joint solutions to our common customers.
OpenPages’ solutions inherently deliver a risk-based approach to GRC. This approach aligns perfectly with PwC’s top-down approach to GRC. They’re always asking the question, “What are your business objectives and what are you doing to achieve them?” We find that many service providers in the GRC business tend to take a bottom-up approach, implementing a comprehensive controls infrastructure, for instance, without making sure that the right controls are being implemented or that the right business objectives are being met. Given the financial constraints facing many customers, allocating resources effectively is a critical success factor for GRC programs, and we look forward to working with PwC to help our joint customers operationalize those programs for better business outcomes.
When I took my first class on financial engineering as a naïve applied mathematics undergrad, we started with portfolio selection and the capital asset pricing model. In my typically confident (some might say arrogant) fashion, I decided I knew more than the professors, and that we should be focused on maximizing returns rather than giving the almost religious deference we did to the notion of risk. A few case studies on LTCM (and modern hedge funds) brought the importance of risk into sharp relief. And yet, years later, I did it again. A few years ago, I claimed to be an expert on risk. In actuality, I was an expert on security who knew very little about risk. In fact, I knew so little about risk, I had no idea how little I knew about it.
I come from the information security space. I spent a number of years there, and throughout my tenure, I continually abused the word “risk.” Oh, I had no idea I was doing it. In fact, 99% of my colleagues in security were doing the same thing. The fact of the matter is, the cloak and dagger security types, self-professed “security experts,” continue to misuse the word. It wasn’t until I really tried to peel back the onion and build a product that managed risk for the security space that I realized that what often passes for risk management in IT is actually control management and compliance. True risk management deals with uncertainty around unexpected losses – looking at consequences in business terms and weighing those against potential reward. Information security management, as currently practiced, is in most regards a necessary, but not sufficient, component of information risk management.
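To illustrate the distinction, a back-of-the-envelope expected-loss calculation weighs consequence and likelihood against the cost of mitigation, something a pure controls checklist never does. The dollar figures and rates below are invented purely for illustration:

```python
def annualized_loss_expectancy(single_loss_expectancy, annual_rate_of_occurrence):
    """Classic ALE: expected loss per incident times expected incidents per year."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical breach scenario: $500k per incident, 30% chance per year.
ale_before = annualized_loss_expectancy(500_000, 0.30)  # 150,000
# A proposed control is believed to cut the occurrence rate to 5%.
ale_after = annualized_loss_expectancy(500_000, 0.05)   # 25,000
control_cost = 60_000

# Risk management asks: does the risk reduction justify the spend?
net_benefit = (ale_before - ale_after) - control_cost   # 65,000
```

Crude as it is, this is a risk decision (uncertainty, consequence, reward), not a compliance decision (control present or absent).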
A little experience in different disciplines and verticals can make all the difference in the world. Financial services is arguably the most sophisticated industry when it comes to managing risk. From a credit and market risk perspective, the average investment bank or hedge fund has teams of Ivy League PhDs running thousands of financial models 24×7, with a virtually unlimited budget, on server farms with more firepower than NASA. From an operational risk perspective (much more analogous to information security), these same banks have taken the lessons they’ve learned in years of managing credit and market risk and applied them to the more esoteric. Where they lack the hard, quantitative data of their peers, they’ve adapted clever ways of working around it.
Information security practitioners, on the other hand, are great at managing compliance by checklist. We have impressive standards, frameworks and regulations like ISO 17799, PCI, BITS, CobiT and a whole slew of others that are pretty good at spelling out a series of “thou shalt have’s.” NIST 800-30 even gives a set of guidelines for doing risk management for IT systems. So what’s missing?
Information Security standards and guidelines are a good thing, but they can be very easily misused and abused. They encourage cookie cutter thinking, and miss the bigger point – no two industries are the same. No two companies within an industry are the same. No two geographies within a company are the same. No two data centers within a company geography are the same. No two services run on hardware in the same data center are the same. No two business processes serviced by the same service are the same. And guess what? Depending on the time of the year, the needs of your customers and other factors, the same business process may have different needs on different days!
OK, clearly mapping all of those dependencies is hard. So most organizations give a data sensitivity rating to their information assets. Maybe they get cute and provide a “platinum, gold, silver, bronze” type scheme. Maybe they even set some arbitrary control thresholds based on this classification. So why do we have multiple large-company executives going on record claiming that PCI compliance is too hard? Two things here – first, PCI is an ISO 17799 derivative. Second, with sensitive customer data sitting on these information assets, shouldn’t they have already warranted a platinum rating? Logically, it should follow that in any 17799 shop (and there are many), information assets should already be close to PCI compliant.
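As a sketch of what a classification-driven baseline looks like in practice, the snippet below maps each sensitivity rating to a minimum control set and reports the gaps for a given asset. The four-tier scheme and the control names are hypothetical, not drawn from any real standard:

```python
# Minimum control baselines per sensitivity rating -- illustrative names only.
BASELINES = {
    "platinum": {"encryption_at_rest", "mfa", "quarterly_pentest", "log_monitoring"},
    "gold":     {"encryption_at_rest", "mfa", "log_monitoring"},
    "silver":   {"mfa", "log_monitoring"},
    "bronze":   {"log_monitoring"},
}

def control_gaps(rating, implemented_controls):
    """List the controls the rating demands that the asset does not yet have."""
    return sorted(BASELINES[rating] - set(implemented_controls))

# An asset holding cardholder data should warrant the top tier; here it
# falls short of its own baseline, before PCI ever enters the picture.
gaps = control_gaps("platinum", {"mfa", "log_monitoring"})
# gaps == ['encryption_at_rest', 'quarterly_pentest']
```

The point of the sketch is the post’s own argument: if an asset truly earned its top-tier rating, closing the gap to an external standard like PCI should be incremental, not a crisis.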
In reality, however, we all know that InfoSec groups are asked to do way too much with increasingly smaller budgets. It’s difficult to get management to buy into the need for information security, which exacerbates the problem. As such, it’s critical that we work smarter, not harder. If only there were a tool that let us do that…
Enter risk management. Throwing a set of checklist controls at our enterprise architecture is not risk management. Theoretically, it should result in some risk reduction, granted, but that’s not an optimal return on investment. Imagine running a hedge fund without a complex risk model for every conceivable position – running countless layers deep. You’d be insolvent within a month.
So what are the roadblocks to risk management in information security? The biggest is a lack of business context. For years, IT has talked about aligning to the needs of the business. It’s still a challenge. The fact of the matter is, it’s tough getting C-level executives to prioritize business objectives and processes amongst themselves (think politics, agendas, silos), much less as a deliverable for IT (which they see as less and less of a strategic asset). And even if they could agree on a real priority for those corporate objectives, navigating the rat’s nest of dependencies down from the objective to the asset level would prove difficult for most organizations. As a result, it’s impossible to prioritize the consequence of an attack on a specific tangible thing.
That starts to cover the consequence side of things. How about likelihood? Actuaries have tables for flood rates; financial engineers have volatility metrics for options calculations. Unfortunately, it’s very difficult to compile reliable loss data on the IT side of the house. Difficult, but not impossible. We safeguard that information as if it were customer data. But if you look at our peers managing operational risk, there are several initiatives around sharing anonymous loss data. Banks collaborate on internal loss metrics to quantify the costs and probability of fraud, malfeasance, etc. Back to security: TJX set aside a penny a share to cover their data breach, and current press estimates range from $12 – $25 million. (Many experts think these estimates are overwhelmingly optimistic, by the way.) Are the metrics we have available perfect? Not even close. But qualitative factors are a good stopgap to supplement the limited quant data we have.
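Where even rough frequency and severity estimates exist, a toy frequency/severity simulation in the style of operational risk modeling can turn them into an annual loss figure. This is a sketch under invented assumptions (Poisson event counts, uniform severities, made-up inputs), not a production loss model:

```python
import math
import random

def poisson(rng, lam):
    """Sample a Poisson-distributed event count (Knuth's method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1

def mean_annual_loss(events_per_year, severity_low, severity_high,
                     years=10_000, seed=7):
    """Simulate many years: Poisson incident counts, uniform incident costs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(years):
        n_incidents = poisson(rng, events_per_year)
        total += sum(rng.uniform(severity_low, severity_high)
                     for _ in range(n_incidents))
    return total / years

# Hypothetical inputs: ~2 incidents per year, each costing $10k-$50k.
estimate = mean_annual_loss(2.0, 10_000, 50_000)
```

With these inputs the simulated mean lands near the analytic expectation of $60k per year (2 incidents × $30k average), which is exactly the kind of sanity check that makes even crude loss data useful.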
Don’t get me wrong – we have some brilliant people working in information security. Brilliant people doing amazing things with limited budgets in a game with stakes that would make a high roller at the Bellagio head for the nickel slots. What we need is to buy them some leverage. Risk management helps information security professionals make better decisions faster, helping practitioners do more with less. Risk management is a great tool to help information security practitioners work more efficiently – just don’t confuse the two disciplines.
Recently purchased by The Bank of Tokyo Mitsubishi (the 2nd largest banking group in the world), Union Bank, N.A. out of San Francisco has been asked to lead the way for the entire organization with respect to adopting Basel II and the advanced measurement approach for operational risk measurement.
Marty Blaauw, Senior Vice President of Operational Risk at Union Bank stated, “At Union Bank, we are striving to use the advanced measurement approach for operational risk measurement and OpenPages provides an integrated operational risk management framework that will assist us in this goal. We are confident that OpenPages’ solution will allow us to streamline our operational risk management and measurement process and provide the integrated risk reporting and dashboards being requested at the executive level.”
With $86 billion in assets under management and 340 banking offices in California, Oregon, Washington and Texas as well as two international offices, this is a strategic initiative with enterprise-wide implications. Union Bank purchased licenses for the entire OpenPages Platform and selected OpenPages ORM as the operational risk system of record for managing risk assessments, key risk indicators (KRIs), issue management and scenario analysis, as well as integrated risk reporting.
Managing compliance obligations within a large organization often involves the coordination of many compliance professionals from the CCO to Chief Ethics Officer, CIO/CTO, Chief Privacy Officer, Director of Corporate Compliance, IT Director, Legal counsel, HR Director and Supply Chain Manager. There are also many challenges in managing fragmented and disparate compliance obligations from financial controls to IT compliance to environmental health and safety.
Consequently, many of these initiatives are being managed in silos, which limits a compliance officer’s ability to effectively train, communicate, monitor and ultimately measure the status of compliance obligations across the enterprise with any level of confidence. Effective compliance programs take a more practical, cross-regulatory approach to managing compliance that can alleviate increasing costs and complexity. This more strategic approach reduces overall costs as you leverage common assessment processes and technology infrastructure.
To learn how you can break down the compliance silos and implement a cost-effective approach to managing compliance, check out a recent Compliance Week Webinar presented by Michael Rasmussen, President of Corporate Integrity, and Julian Parkin, Group Privacy Programme Director at Barclays.
We’re nearing the second anniversary of SAP’s purchase of Virsa and their serious entry into the GRC space. Last week, they made a series of announcements about their GRC products, which now extend beyond industry apps and the SOD/access control arena to other areas of GRC. Business Finance has a new GRC blog and covered SAP’s announcements. John Cummings notes that "the sheer scope of GRC offerings from SAP and other enterprise software providers is impressive, and point-solution vendors will need all of their agility to respond."
Certainly, we wouldn’t argue with that statement, but we would say that one of the most important parts of a GRC solution is how it fits into the rest of the system. While SAP (and maybe Oracle) might be able to make the argument that you should be single threaded on SAP, the rest of us cannot make that argument, so we have to play nice in the sandbox and 1) fit into the existing (heterogeneous) environment and 2) work across silos. This latter point is critical because what the enterprise GRC platform vendors are delivering is a way to see risk across the organization. When SAP demonstrates their risk management application, they focus on controls associated with a sales process; that’s a very different solution, a tightly integrated top-to-bottom solution, but not very good at crossing silos. And, as I blogged earlier in the week, the real value in risk management comes from relating risk together at the top of the business. Of course, we’re not an ERP vendor, but you have to wonder if you want the fox guarding the hen house.
Businesses have always been engaged in managing risk, but it has taken an unprecedented wave of regulatory oversight to convince many organizations how inadequate their risk management policies and procedures really are.
Firms should have completed or be in the process of completing a detailed gap analysis to identify any shortfalls in expected compliance with the emerging Solvency II requirements, as they bear on their operations.
A gap analysis should evaluate the current state of an insurer’s risk management system against current risk standards and the desired state. The organization then must develop a roadmap on how to achieve that desired state. Organizations need to evaluate their entire risk management system and how all of its risk areas are being managed.
Given that executive management is charged with ownership of operational risk management and the need to embed it within the organization, many companies are turning to integrated risk management solutions to better understand and proactively manage the risks that can impact the business.
For more information on Solvency II and meeting the Solvency II operational risk challenge, check out this white paper.
Now that healthcare reform has passed, the Obama administration has turned its focus to financial services regulatory reform. Today, Obama gave a speech on the administration’s position and priorities. The House has already passed a bill, and the Senate may take up its own this week, largely authored by Senator Dodd. A major sticking point has been the fund to facilitate an orderly liquidation (labeled a “bailout” fund by some critics) and the way to handle derivatives, but Senator Grassley’s vote yesterday to approve a Senate committee’s plan for derivatives trading gave new momentum to a bipartisan effort on regulatory reform, and it looks increasingly likely that in the coming months (if not weeks) we’ll see a major overhaul of the regulations that govern Wall Street.
Further, the SEC demonstrated late last week that it is one government agency that is going to take its oversight responsibilities seriously. Its civil suit against Wall Street giant Goldman Sachs sent shock waves through the financial services sector. It’s clear that there’s a major shift underway in the way regulators are regulating. Whether or not you agree with the merits of the suit, SEC Chair Schapiro is sending a message to the industry that the agency is going to be watching closely.
A common theme here is transparency: the SEC argues that Goldman didn’t provide adequate disclosure about the nature of the Abacus investment opportunity; Obama argues today that “reform would bring new transparency to many financial markets.” We also see this as a common theme with our customers–they are looking for greater transparency into the risks in their business. We see this push for regulatory reform and increased oversight as driving the demand for a new information architecture that provides this transparency to managers, executives, board members and regulators. Of course, many companies are finding that it can help you run your business better, too.
In 1958, IBM researcher Hans Peter Luhn first introduced the term business intelligence (BI) in an article he contributed to the IBM Journal. He described business intelligence as "the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal." Clearly, business intelligence plays a key role in risk management, providing executive-level decision-makers the ability to look across all categories of risk (in different business units, categories, geographies, etc.) and providing a global view into business performance and risk exposure.
Business Intelligence and risk management are linked on two levels. First, when used in conjunction they provide executive level transparency into risks within the organization, and secondly, they provide product planners and corporate strategists a risk-adjusted performance view.
If you’re considering a risk management solution, you might want to listen to a recent IT-Finance Connection podcast on the role of business intelligence in risk management. Additionally, here are some tips on leveraging risk management practices to provide stronger and more introspective BI analysis:
Identify and eliminate risk factors and exposure points within the organization to create a strong foundation/base.
Examine opportunities related to taking strategic risks within the business (new products, launches into new geographies/industries, M&A, etc.).
Assess the potential risk exposures tied to moving forward with strategic company direction and initiatives.
Apply this risk management analysis to your overall business intelligence framework to give executive management and the management board a clear view not just of the company’s risk exposure (and where risks have been eliminated altogether) but of where there is an opportunity to take strategic risks, with the added layer of business intelligence needed to make smarter business decisions.
We’ve blogged frequently on the topic of IT risk management, most recently here. With recent events highlighting the need for better risk management, now, more than ever, people are thinking about how to improve their processes and technology for supporting their risk management programs. Ben Worthen over at the WSJ BizTech blog has written recently that tech departments shortchange risk management. We couldn’t agree more.
The basic problem, as Symantec’s Samir Kapuria notes via Ben’s post, is that IT tends to think of risk management as a project vs. a continuous process. This may be the result of the fact that most IT infrastructure vendors sell risk management for project delivery but don’t really have solutions that allow IT to take a risk-based approach to all their activities. It may also be the result of IT having to keep everything running, all the time. Regardless, unless you start with a top-down approach using a risk assessment process, identifying which vulnerabilities match to the most significant potential business impacts, you will never be able to allocate IT resources appropriately. Once you understand how realized risks will impact the business, you can take a truly risk-based approach to IT management. Obviously, we have a horse in this race, but any effort to tackle the IT risk management challenge must involve the business and identify the key risks therein.
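The top-down approach described above can be sketched as a simple scoring exercise: rate each finding’s likelihood and business impact, then let the product of the two drive priority, so the scariest-sounding vulnerability doesn’t automatically win. The findings and the 1–5 ordinal scales below are hypothetical:

```python
def risk_score(likelihood, business_impact):
    """Simple multiplicative risk score; both inputs on a 1-5 ordinal scale."""
    return likelihood * business_impact

# (finding, likelihood, business impact) -- invented examples.
findings = [
    ("Unpatched web server (e-commerce)", 4, 5),   # score 20
    ("Weak password policy (test lab)",   5, 1),   # score 5
    ("Missing backups (payroll system)",  2, 4),   # score 8
]

# Sort by risk to the business, not by how easy the flaw is to exploit.
prioritized = sorted(findings,
                     key=lambda f: risk_score(f[1], f[2]),
                     reverse=True)
# prioritized[0] is the e-commerce server: highly likely AND business-critical.
```

The test-lab finding has the highest likelihood of all three, yet ranks last once business impact is in the equation, which is the whole argument for starting from the business rather than the vulnerability scanner.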
You’ve surely heard about Goldman Sachs’ settlement with the SEC on fraud charges related to the firm’s disclosure, or lack thereof, of a collateralized debt obligation that purportedly was designed to fail. The $550 million to be paid may seem like a lot, and indeed is said to be the largest SEC fine against a Wall Street bank, but many observers maintain that the firm got off easy, especially when the amount is viewed in light of Goldman’s revenue and profits.
But there’s another way in which Goldman seems to have dodged a bullet. While other companies have had to accept a government-appointed monitor working inside the organization, Goldman won’t be subject to such meddling. In my mind, avoiding this kind of intrusive interloping is just as big a win as the manageable size of the fine, if not bigger – especially for a firm as sophisticated as Goldman Sachs.
There is, however, an annual requirement for filing a certificate, for three years, that Goldman is in compliance with the terms of the settlement. Of considerable interest is that the certificate is to be signed by the firm’s general counsel or global head of compliance. Some pundits are saying this makes eminent sense, while others take the position that it should be the CEO or board, who are ultimately responsible for ensuring compliance, to be putting their signature on the dotted line. In any event, all this puts more of a spotlight on chief compliance officers and compliance programs. One former chief compliance officer reportedly said the SEC “seems to be attempting to elevate importance of the chief compliance officer role,” while an active compliance chief says the settlement shows that compliance officers “are becoming true C-suite level executives.”
There’s a lot going on here, and we can expect to see the focus on compliance officers ratcheting up further going forward.
As companies prepare for Solvency II, many are struggling with the right approach to address key aspects of the directive. Due to come into effect in 2012, Solvency II promises a more sophisticated ‘risk-based’ form of supervision that will require many insurers to augment their risk management framework – particularly with regard to operational risk.
In a Webinar hosted by OpRisk & Compliance Magazine, Stuart Robinson, senior vice president of group risk at Germany’s largest insurer, Allianz, discusses how Solvency II operational risk requirements are well aligned with Basel II requirements and other regulatory and industry standards. In particular, he points to how insurers need to:
Identify operational losses and capture data on them
Understand the key risks and key controls in business processes and review control effectiveness on a regular basis
Use scenario analysis to assess the impact of potential operational risks
Quantify operational risk capital requirements
Demonstrate that operational risks are managed through reporting, KRIs and action plans
To understand the operational risk requirements and challenges facing insurance companies and how Solvency II impacts operational risk management, check out this informative Webinar.
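To make the quantification step concrete, here is a minimal Monte Carlo sketch in Python. It uses a standard frequency/severity approach (Poisson loss counts, lognormal severities) and takes capital as the 99.5% quantile of the one-year aggregate loss, which matches the Solvency II calibration. The frequency and severity parameters below are purely hypothetical, and real capital models are far richer than this.

```python
import numpy as np

def op_risk_capital(freq_lambda, sev_mu, sev_sigma,
                    n_sims=100_000, quantile=0.995, seed=0):
    """Estimate an operational risk capital figure via a simple
    frequency/severity Monte Carlo: Poisson loss counts per year,
    lognormal loss severities, capital taken as a high quantile
    of the simulated annual aggregate loss."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(freq_lambda, n_sims)  # losses in each simulated year
    totals = np.array([rng.lognormal(sev_mu, sev_sigma, n).sum()
                       for n in counts])       # aggregate loss per year
    return np.quantile(totals, quantile)

# Hypothetical book: ~12 losses/year, median single loss of e^10 (~$22k)
capital = op_risk_capital(freq_lambda=12, sev_mu=10, sev_sigma=1.5)
print(f"99.5% annual aggregate-loss quantile: ${capital:,.0f}")
```

Scenario analysis feeds a model like this by supplying the frequency and severity assumptions for risks with little or no internal loss history.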
I’ve been having conversations with customers and prospects about the value of an integrated risk management platform. (You can substitute ‘GRC’ for ‘integrated risk management’ if you’ve been reading any analyst covering the space recently.) There are lots of value drivers, but to date most CIOs haven’t embraced the logic yet and are opting instead to buy for very specific solution areas. There are some exceptions, and on Friday I had a conversation with one of those exceptions, and he made a compelling case for why an IT organization should work with the business on an integrated control environment.
The specific case this customer made was around the need to manage the General Computing Controls associated with Sarbanes-Oxley. The finance side of the company had been the buyer of their SOX solution, and they, of course, look at the world through accounts and processes. Their SOX solution was configured accordingly, and all of their controls roll up to processes associated with accounts. Unfortunately, the IT organization doesn’t look at the world that way, and, according to this customer, “There’s nowhere in this model to stick the IT controls in a rational way.” The IT organization would much rather organize the GCCs by ISO 17799 or some other framework and associate each control to the appropriate risk in the finance model. In this way, the IT organization can leverage a control management structure already in place, without duplicating any effort.
This is the most basic value proposition for an integrated risk management platform. And many companies are seeing big savings as the number of regulations they are trying to manage increases. Sure, you can probably manage SOX in a bunch of spreadsheets, but try adding a couple more regs and reporting and policy management, and you’re very quickly into the realm of a purpose-built solution. The interesting problem is that the cost of siloed solutions doesn’t fall fully in the office of the CIO. If it did, we would have many more CIO converts.
Fed Chairman Ben Bernanke went on the offensive yesterday at the annual meeting of the American Economic Association, arguing that lax regulatory oversight, not loose monetary policy, led to the housing bubble and subsequent financial crisis. You can read his remarks here.
After working behind the scenes for most of the fall, lobbying legislators one-on-one, Bernanke took a very public position yesterday, blaming the rise in housing prices on alternative types of variable-rate mortgages, which priced in more demand than prevailing interest rates alone could explain.
Bernanke argued that “stronger regulation and supervision aimed at problems with underwriting practices and lenders’ risk management would have been a more effective and surgical approach to constraining the housing bubble than a general increase in interest rates.”
Further, he said that “the lesson I take from this experience is not that financial regulation and supervision are ineffective for controlling emerging risks, but that their execution must be better and smarter.”
To some extent he’s trying to deflect the spotlight onto other regulatory agencies chartered with overseeing the market for different kinds of mortgages. But Bernanke can’t have it both ways. He’s argued in the past that the Fed has a role in consumer financial protection and has lobbied against the CFPA, so, if the Fed’s mandate extends to the financial consumer, why did he let these mortgages with low monthly payments proliferate? While he was convincing that there were factors beyond monetary policy that led to the housing bubble, he was less clear on what kind of regulatory structure would have prevented the bubble and how we should move forward on consumer financial protection. At this point, my bet is that the CFPA has enough momentum to pass with financial reg reform.
A recent industry survey by PTC shows that the highest cost of product compliance failures is not always fines and legal fees, but delayed time to market and product shipments. This is particularly true in manufacturing, where restricted-substance product recalls have cost manufacturers and consumer product companies millions in lost revenue due to compliance failures or supply chain disruptions.
Of course implementing a compliance program has its costs as well. As our recent white paper “The High Cost of Non-Compliance” authored by Rick Steinberg points out, an OCEG Benchmarking Study shows the cost of Sarbanes-Oxley compliance alone averaging:
$4 million for companies with $5 billion revenue
$10 million for companies with $10 billion and more in revenue; and
for companies with more than $1 billion in revenue, compliance costs equaled 190 full-time equivalent employees.
So, while the cost of implementing a compliance program may seem high, it’s clear that not putting an effective compliance program in place can be significantly more expensive.
The white paper points out several key ways companies have succeeded not only in reducing compliance costs, but also enhancing efficiency and gaining real business benefits:
Built into Business Processes
A Program Founded on Ethics and Integrity
A Risk-Based Approach and Clarity Around Responsibilities
Compliance expert Eric Krell from DRS Technologies speaks to Business Finance editor in chief Jack Sweeney about how the tactical precision with which key risk and compliance decisions were made allowed internal audit to blossom. DRS Technologies currently uses OpenPages to manage its SOX compliance requirements and takes advantage of the technology’s workflow automation capability to supplement the 302 certification process.
We’ve blogged frequently on the topic of IT risk management, most recently here. With recent events highlighting the need for better risk management, now, more than ever, people are thinking about how to improve their processes and technology for supporting their risk management programs. Ben Worthen over at the WSJ BizTech blog has written recently that tech departments shortchange risk management. We couldn’t agree more.
The basic problem, as Symantec’s Samir Kapuria notes via Ben’s post, is that IT tends to think of risk management as a project vs. a continuous process. This may be the result of the fact that most IT infrastructure vendors sell risk management for project delivery but don’t really have solutions that allow IT to take a risk-based approach to all their activities. It may also be the result of IT having to keep everything running, all the time. Regardless, unless you start with a top-down approach using a risk assessment process, identifying which vulnerabilities match to the most significant potential business impacts, you will never be able to allocate IT resources appropriately. Once you understand how realized risks will impact the business, you can take a truly risk-based approach to IT management. Obviously, we have a horse in this race, but any effort to tackle the IT risk management challenge must involve the business and identify the key risks therein.
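A minimal sketch of that top-down matching of vulnerabilities to business impacts: rank findings by expected business loss (likelihood times impact) rather than by raw technical severity. All of the findings and figures below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    name: str
    likelihood: float       # estimated probability of exploitation, 0-1
    business_impact: float  # estimated business loss if realized, in dollars

def prioritize(vulns):
    """Order vulnerabilities by expected business loss
    (likelihood x impact) -- a risk-based allocation of
    IT resources rather than a flat technical checklist."""
    return sorted(vulns,
                  key=lambda v: v.likelihood * v.business_impact,
                  reverse=True)

# Hypothetical assessment findings
findings = [
    Vulnerability("unpatched web server", 0.30, 2_000_000),
    Vulnerability("weak internal password policy", 0.60, 150_000),
    Vulnerability("unencrypted customer database", 0.05, 25_000_000),
]
for v in prioritize(findings):
    print(f"{v.name}: expected loss ${v.likelihood * v.business_impact:,.0f}")
```

Note how the low-likelihood, high-impact database exposure rises to the top: that ordering only emerges when the business supplies the impact estimates, which is exactly why IT can’t do this alone.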
Or more to the point, was he thinking at all? We’re talking about Rajat Gupta, operating at the highest echelons of multinational business, who finds himself charged by the Securities and Exchange Commission with illegally passing inside information to Raj Rajaratnam, the Galleon Group founder about to go on trial on charges of insider trading. Mr. Gupta, a Harvard Business School graduate and former head of McKinsey & Co., has been a board member of the likes of Goldman Sachs, Procter & Gamble, and American Airlines.
What did he do? Well, he of course is innocent until proven guilty, and according to media reports, his lawyer says he has done nothing wrong. But the SEC says otherwise. It alleges Gupta gave Rajaratnam advance information about earnings at both Goldman and P&G. On top of that, the SEC maintains that Gupta called the Galleon head with the inside scoop on the Goldman board’s approval of Warren Buffett’s $5 billion investment in the firm. The allegations speak to multiple phone calls between the two men, enabling Galleon to reap millions in profits. What must be particularly troubling for both is that the SEC says it has recordings of numerous telephone conversations.
Let’s presume for a moment that the allegations are factual. A relevant question is: is this a black eye for the companies on whose boards Gupta sat? (By the way, the reports say he resigned months ago from the Goldman board, and recently from P&G.) My answer, based on the information available, is “no.” Certainly, if the allegations are true, a statement by the SEC Director of Enforcement is on point: “Mr. Gupta was honored with the highest trust of leading public companies, and he betrayed that trust by disclosing their most sensitive and valuable secrets.” But what could or should have been done to prevent wrongdoing at the board level?
We know well the importance of a company’s board of directors in keeping a close eye on what the CEO and senior management team do, and on the company’s system of internal control. We recognize the importance of compliance officers, risk officers and internal audit functions. But who keeps an eye on the board, especially when their actions are outside the inner workings of the company itself? We can look to what happened years ago at HP, when a board member leaked information to the media, which resulted in the pretexting fiasco.
There are no immediate answers, other than to continue to ensure full vetting of director candidates and to maintain effective board and internal audit processes to best identify and manage potential misbehavior. With the thousands of directors of major companies acting with extraordinary integrity and ethics and in the best interests of their companies and shareholders, I believe we don’t have much to worry about. But it is worth more thought going forward.
With the increased scrutiny given to risk management in the aftermath of the near meltdown of the financial system, COSO has released a new thought paper to support companies’ efforts to enhance their risk management processes. Titled Strengthening Enterprise Risk Management for Strategic Advantage, the paper is geared to senior executives and boards of directors, highlighting key elements of ERM. This paper is a follow-up to COSO’s Effective Enterprise Risk Oversight: The Role of the Board of Directors, reported by John Kelly in his September 1 blog entry.
This newest paper is intended to provide a “basis for introspection about current approaches to risk management and be a catalyst for management to strengthen risk management for the purpose of enhancing the board’s risk oversight capabilities and the organization’s strategic value.” As such, COSO encourages boards and management to turn to COSO’s Enterprise Risk Management— Integrated Framework for in-depth discussion of core components of enterprise risk management.
The paper sets the stage by focusing on how the financial crisis, business complexity, advances in technology, globalization, the speed of product cycles, and the overall increased pace of change increase the risks facing organizations. It points to a perception that senior executives and boards “could be more aware of the risks they are taking” and do more to prepare for their downside. It also points to legislative and regulatory initiatives providing further impetus for focusing on risk management.
The paper centers on four areas where senior management can work with its board to enhance the board’s risk oversight capabilities:
Discuss risk management philosophy and risk appetite
Understand risk management practices
Review portfolio risks in relation to risk appetite
Be apprised of the most significant risks and related responses
The paper does a good job highlighting how these activities can be effectively operationalized, and contains points of focus particularly for directors. It’s especially useful for senior executives and board members struggling to cope with their management and oversight responsibilities. It may even be worth the read for professionals with some knowledge of the COSO ERM Framework, to refresh memories and sharpen a focus on what ERM is all about. Probably most of all, the paper should provide useful support to those who are working to make the case for ERM in their organizations.
Strengthening Enterprise Risk Management for Strategic Advantage is available at www.coso.org.
We did an interesting survey at OPUS a couple weeks ago. We’ll be publishing the results here next week, but one of the GRC topics that people have been talking about is whether GRC spending will decrease like most of the rest of the tech sector, or increase based on the very obvious need for better risk management in corporate America. Whether or not GRC spending increases next year will depend, of course, on the state of the economy, and a host of other issues that Brian Sommer discusses in a blog post this week at ZDNet.
Brian and I discussed a variety of topics on the value of GRC deployments and in particular on the importance of risk management. While technology alone would not have prevented the current crisis, it can be an enabler for change, and many firms at OPUS indicated that using a GRC management system can enforce policy and help catalyze behavioral change around risk management. The beauty of such a system is that you can very quickly find out who’s following the rules and who’s not. That might have been helpful for some of the financial services institutions trying to deal with risk exposure they never knew they had.
If you’re attending OpRisk USA in New York City March 24-26, don’t miss Scott Green’s discussion on reinventing risk processes. A frequent speaker and well-versed GRC industry expert, Scott is the managing vice president of operational risk management at Capital One and also serves as vice president of the OpenForum User Group. At OpRisk USA, Scott will make a case for risk process integration and the success factors in driving change to create a more effective and efficient ORM program.
Scott also recently joined Compliance Week for a web cast conversation on “Risk Management Strategies at Capital One” where he discussed how Capital One is reinventing risk management processes to improve efficiency and effectiveness in managing risk requirements and their associated controls. The archived web cast is available on-demand.
Our friends at OpRisk & Compliance magazine recently kicked off the third annual operational risk software ranking survey. The survey consists of five categories, within which you are asked to rank the top five firms (OpenPages!), or to enter a firm of your own choosing. The categories are:
Scenario analysis functionality
OpRisk loss data collection
Key risk indicators
Risk control and self assessments (RCSAs)
Regulatory and economic capital modeling
This is a great way to show support for your favorite operational risk software vendor (OpenPages) while also being entered into a prize draw to win a copy of "The Basel Handbook", second edition. All responses remain anonymous and will not be attributed to individuals. OpRisk & Compliance will publish the results from this survey in the May edition of OpRisk & Compliance magazine.
So if you have a moment (1-2 minutes), please take part in this fun and informative survey.
One of the most commonly requested agenda items from past OPUS attendees has been for more Hands-on Workshop sessions.
We’re happy to announce for OPUS 2008, we’ve added an entire track dedicated to these valuable workshops. Each 2-hour session will provide you with a discussion on best practices, a technical demonstration, and a chance to work within a sample OpenPages 5.5 environment to implement the techniques that you have learned.
These sessions will provide a unique training opportunity that you won’t want to miss!
OpenPages 5.5 Hands-On Workshop: Workflow - Review workflow concepts and terminology, then create and test your own workflow containing two tasks.
OpenPages 5.5 Hands-On Workshop: Creating Profiles and Home Page Configuration – Discuss best practices methodology to meet business requirements, see a technical demonstration on configuring an Object Profile, then create your own profile, associate users to the profile, configure the home page and restrict user access to fields.
OpenPages 5.5 Hands-On Workshop: OpenPages CommandCenter Reporting – Review a business use case and discuss best practices methodology, then create, troubleshoot, and run a report on your own sample OpenPages 5.5 environment.
To learn more about the OPUS 2008 agenda, which includes the industry’s best line-up of GRC thought leaders and customer case studies, download the detailed agenda and register on-line.
Keep in mind that the exclusive OPUS room rate will expire on September 28, 2008, so please book your room today. To make your hotel reservations call the hotel directly at 1-800-HOTELS-1 (1-800-468-3571) and ask for the OPUS room rate, or visit the Renaissance Boston Waterfront Hotel website.
Okay, this is difficult for me – to think that I might have actually made a mistake! Those of you who know me well realize that I seldom if ever make such a statement. For example, there have been instances when speaking at conferences a participant suggests that a statement I’ve made might not have been entirely accurate, and my response is not “oops, I made a mistake,” but rather “I might have misspoken”!
Relevant to these self-observations is that years ago in advising clients and otherwise communicating with the business community, I believed that while it was challenging to implement enterprise risk management effectively, it could be done without use of advanced technology. Well, my thinking has evolved as to the need for an effective software solution. It’s true that in mid-size companies (I generally don’t work with smaller organizations) that were centralized with few levels of management I saw opportunities for enterprise risk management to work successfully with protocols that didn’t necessarily require use of specialized software. Effectively addressing risk factors in each operating and staff unit at every management level, with highly effective information sharing and communication, made ERM workable.
But over time I’ve come to recognize that the above scenario is extraordinarily rare or non-existent in companies of any decent size. With increasingly challenging economic, regulatory and competitive environments, fewer personnel stretched thin, and channels and markets rapidly changing, the need for effective software becomes essential. Otherwise, the ability to capture all significant risks related to business objectives and the related mitigating actions and control activities becomes difficult if not impossible. And coupled with the need to track assignment of responsibility to specific personnel and manage accountability – along with effective communication across organization layers and business units – specialized technology becomes that much more important. And when we superimpose a need for senior managers to readily obtain relevant risk-related analyses with dashboards with drill-down capability, then it’s a no-brainer that the right software solution is essential.
Well, maybe things were simpler in the “old days,” and it’s only time and circumstances that have changed, rather than my sense of what’s required for effective ERM implementation. I hope you’ll let me leave it at that!
This year’s OPUS is shaping up to be the best yet! In addition to leading GRC executives presenting case studies and lessons learned, OPUS 2008 includes the who’s who of GRC thought leaders:
French Caldwell, Research VP at Gartner, Inc. will discuss the latest GRC Technology Trends.
Chris McClean, Analyst at Forrester Research, Inc. will provide a perspective on Corporate Social Responsibility and the growing influence of GRC.
John Haggerty, Vice President & Research Fellow at AMR Research will discuss his view on the future of GRC.
David Holcombe, Director of Risk Management for International Speedway Corporation and NASCAR, Inc will provide a history and evolution of safety in motor sports from the NASCAR and motor sport facility perspective.
Mark Beasley, Director, ERM Initiative and Deloitte Professor of ERM at NC State will discuss how many organizations are responding to external pressure by leveraging traditional risk management processes into an enterprise risk management (ERM) view.
Richard Steinberg, Founder and Principal, Steinberg Governance Advisors, will provide his insight on risk convergence.
You can get a preview of Richard Steinberg’s perspective on enterprise-wide risk management by checking out our recent blog entry. In this video, Richard is interviewed by Gordon Burnes at the recent Executive ERM Forum.
Check back soon for more detail on our extensive line-up of customer case studies being presented at OPUS and our extended Hands-On Workshops.
In November, OpenPages and Compliance Week hosted a roundtable with 17 senior executives where the discussion covered the kinds of metrics to use when discussing and reporting on risk and compliance initiatives. The companies represented a variety of industries and spanned the risk and compliance functions. Matt Kelly, editor in chief at Compliance Week, moderated the session and blogged about it. To review the article written about the roundtable, “Shop Talk: Metrics for Risk, Compliance,” visit the Compliance Week website.
One of the interesting topics that surfaced during the discussion was how we, as risk and compliance professionals, get closer to the business to influence the decisions that are being made in the context of an operational process. Everyone agreed that it’s relatively easy to track external metrics (calls to the whistleblower hotline, for instance), and it’s also straightforward to put training procedures in place. However, the challenge is getting the “in-process” metrics to influence decision-making.
This question of how to get closer to the business was also a theme at RiskMinds in Geneva last week. Many operational risk executives said that the next frontier is more effectively influencing the “in-process” decisions to reduce operational risk.
For us, the takeaway is that reporting and KRIs have to get more granular and more timely for both the business and the group risk and compliance functions.
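At its simplest, more granular and timely KRI reporting means classifying each indicator against escalating thresholds on every reading, rather than in a quarterly roll-up. The sketch below illustrates the idea; the KRIs and threshold values are hypothetical.

```python
def kri_status(value, warning, breach):
    """Classify a key risk indicator reading against two
    escalating thresholds: green (in tolerance), amber
    (warning level reached), red (breach level reached)."""
    if value >= breach:
        return "red"
    if value >= warning:
        return "amber"
    return "green"

# Hypothetical KRIs: (name, current reading, warning threshold, breach threshold)
kris = [
    ("failed trades per day", 14, 10, 25),
    ("open audit findings > 90 days", 3, 5, 10),
    ("system downtime minutes/month", 95, 60, 90),
]
for name, value, warn, breach in kris:
    print(f"{name}: {kri_status(value, warn, breach)}")
```

Feeding readings like these from operational systems daily, instead of compiling them by hand each quarter, is what makes the metrics timely enough to influence “in-process” decisions.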
Over the years, there have been many studies about CEO compensation and risk taking, with the data on outcomes derived from data available from public companies. The latest salvo comes from two professors, one from BYU and the other from Penn State, who have published a new paper in the current issue of The Academy of Management Journal. They studied 950 companies from 1994 through 2000 and found that CEOs who received more than half their compensation from stock options were more likely to undertake risky investments to deliver extreme company performance. The problem is that these investments were more likely to end up in big losses than big wins.
Floyd Norris in the New York Times describes the conclusion that the study’s authors come to: namely, that CEO pay should include deep-in-the-money options and longer holding periods so that the CEO acts more like a shareholder.
This may or may not be a good idea. What’s clear is that in the studied companies risk taking was at the discretion of the CEO. Companies apparently could not distinguish between good and bad risks to take, and the decision about whether to take them rested on the shoulders of the CEO. But this need not be the case. Better transparency into the state of risk in the business would have shed more sunlight on these so-called risky decisions. In the current climate of risk management focus, boards complain that they don’t have good visibility into the state of risk in the enterprise, and judging from this study, this lack of visibility is causing poor performance, with companies investing in areas with sub-par returns for the chance of a big win. This bodes well for the risk management business.
Rising from the banks of the Potomac in National Harbor, Maryland, the Gaylord National is an engineering marvel which provides a scenic venue for the 2010 Gartner Security and Risk Management Summit. I attended an intriguing session by Richard Hunter, Gartner vice president and distinguished analyst in which he described the value of IT risk management.
Hunter recently published a book titled “The Real Business of IT: How CIOs Create and Communicate Value,” co-authored with George Westerman of MIT. As part of the research for the book, Hunter conducted a survey of CIOs from 2006 to 2009 on IT risk management. One of the takeaways from his research is that the business context for the value of IT can be summed up as:
Run the business
Grow the business
Transform the business
In terms of running the business, Hunter put it in the context of “at the best possible balance between price and performance” (i.e., the cost of doing business). The key point Hunter stressed was that the measure of value should not be based on return on investment (ROI); rather, it should be based on price and performance. As an example, Hunter asked, “Would you ask for an ROI on a firewall, or an audit?” The point being, there is no measurable return on these investments; they are a cost of running the business, and the alternative is much costlier.
IT grows the business, continued Hunter, by ensuring “capacity and capability and providing the ability to conduct business in a certain way.” In other words, he explained, it supports someone else’s profit and loss. The third value (transforming the business) is about “enabling new value propositions for new customer segments.”
He recommended IT organizations take the following steps to show value:
Change the way you think. Frame every comment in terms of business outcomes and business performance. Adopt the language of business in every discussion of risk (i.e., the point of BCM is not to recover the server farm; it is to recover customer service and accounts receivable).
Show value for money, meaning the right services at the right level of quality at the right time. Never discuss cost apart from quality of service.
Position IT (and IT risk management) as a component of investment in near and long-term business performance.
This supports a theme heard throughout the Summit: “performance should be defined in terms of business outcomes and performance, not IT performance.”
For readers interfacing with your companies’ audit committees, a just-released survey from Directorship Boardroom Intelligence highlights what’s in the forefront of committee members’ minds today. The results are reported in a top-ten list (unlike the Letterman top-ten lists, this one appears to begin with the most significant):
Uncertainties of economic/legislative environments
A group of investors issued a report yesterday that criticized the Obama administration’s plan to have the Federal Reserve operate as the systemic risk regulator. The Investor’s Working Group consists of a high-octane set of investors, including CalPERS, Capital Group and GMO, as well as academics and journalists, and it is chaired by former SEC chairs William Donaldson and Arthur Levitt.
There are lots of interesting ideas in the report, but, most significantly, the report argues that the Federal Reserve as systemic risk regulator has “serious drawbacks.” It cites the Fed’s “potentially competing responsibilities,” from monetary policy to managing the payments system. More importantly, the report cites the recent regulatory failures to police mortgage underwriting and to impose suitability standards on mortgage lenders.
Whether it’s the Fed or some other entity that ends up being responsible for systemic risk regulation, a new information architecture will be required to surface the right information to the systemic risk regulator. The Investor’s Working Group has suggested that the regulator “should have the authority to gather all information it deem relevant to systemic risk.” Such an information-gathering exercise will not be a trivial effort. The ORX has some experience with that, and the systemic risk regulator would presumably be looking for timely information about positions, counterparties and activity. The more the information requests align with current operational reporting at banks and other financial services institutions, the easier it will be to implement this new reporting requirement. This alignment is something ORX has learned is important. The reporting requirement alone is an argument for an omnibus approach to financial services regulatory reform, because surely there will be other reporting requirements coming out of the regulatory reform process. Let’s hope there’s a coordinated approach here; otherwise, potential reforms could be very expensive to implement and fail to deliver as promised.
Attrition.org maintains a list of public, high profile data breaches. The list is staggeringly long, and goes back to the year 2000. TJX, while a high profile data breach and perhaps one of the biggest stories of 2007, is only one of the many that were publicly reported. And, companies have a vested interest in not making these events public. Add to that the breaches that happen every day that go undiscovered and it becomes clear that this staggeringly long list is just the tip of the iceberg.
But why is this list growing? Preventative technology and knowledge gets better and better every day. Shouldn’t we be getting safer? Information risk management is sometimes a thankless job. As an old mentor of mine used to say, a good day is a day where nothing happens. The villains get better and better every day, however, and the gap remains. Your organization is susceptible, and it’s critical you do everything you can to keep the gap as narrow as possible.
The court did take issue with the way PCAOB members could be removed, and ruled that board members could be removed “at will” by the commissioners of the Securities and Exchange Commission. In the majority opinion, Chief Justice Roberts wrote that, despite the unconstitutional tenure provisions, the Act remains “fully operative as a law.”
So what does this mean? Congress clearly tried to insulate the PCAOB from the political whims of the executive office, passing the Act, as it did, during an administration skeptical of regulation. The Roberts court handed advocates of executive power a victory by ruling that a dual for-cause limitation on the removal of officers is not constitutional and that the president must have a direct line to remove officers of the government, which the Board members were determined to be.
However, given the current administration’s concern about corporate accountability and the integrity of financial risk reporting in general, it would be very surprising if SEC Chair Mary Schapiro were to exercise her newfound power and replace Board members with appointees more lenient on the accounting firms. And, AS5 really took the heat off of corporate America vis-à-vis their auditors, anyway; the SEC’s the one that carries the big stick with regard to the integrity of financial controls.
Further, more and more SOX efforts are being rolled into a comprehensive program of managing risks enterprise-wide. Companies are more interested in broadening the application of the approaches, tools and techniques for testing financial controls to their broader control environment. The net here is that the Supreme Court’s ruling will probably have little to no effect on how companies actually manage their risk with respect to financial reporting.
While many companies have basic elements of a compliance program in place, such as a code of conduct and whistleblower programs, simply having these elements is no substitute for a comprehensive program. In reality, many companies have implemented a “one-off” approach in which procedures often become fragmented, duplicative and outdated over time. For these organizations, the cost of non-compliance can be extraordinarily high, whereas a well-designed, comprehensive compliance program provides numerous efficiencies and can serve as a solid foundation for effective Enterprise Risk Management.
Don’t miss Rick Steinberg, founder and CEO of Steinberg Governance Advisors and Compliance Week columnist, as he outlines steps that companies can take toward achieving a well-designed, comprehensive compliance program. In this informative Webinar, Rick describes a strategic, risk-based approach that supports business objectives and provides an enterprise view of compliance.
With individual countries required to implement Solvency II by October 2012, insurance companies face relatively tight deadlines to comply with a more sophisticated risk-based approach to supervision throughout the EU. One of the largest changes for all firms covered by Solvency II is the ORSA requirement. “The ORSA has a two-fold nature,” according to EC documents. “It is an internal assessment process within the undertaking and is as such embedded in the strategic decisions of the undertaking. It is also a supervisory tool for the regulatory authorities, which must be informed about the results of the undertaking’s ORSA.”
ORM software can provide crucial risk self-assessment capabilities that enable organizations to document and evaluate their risk frameworks, including processes, risks, events, key risk indicators (KRIs) and controls. Executives can stay on top of organizational risk activities through dashboards and reports that highlight key risk metrics and policy compliance.
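To make the self-assessment idea concrete, here is a minimal sketch of how KRIs with thresholds might roll up into an executive dashboard view. The indicator names, owners and thresholds are invented for illustration; this is not OpenPages’ data model.

```python
from dataclasses import dataclass

@dataclass
class KRI:
    """A key risk indicator with a tolerance threshold (illustrative names)."""
    name: str
    owner: str
    threshold: float
    current_value: float

    def breached(self) -> bool:
        return self.current_value > self.threshold

def dashboard(kris):
    """Summarize KRI status for an executive view, breached indicators first."""
    ordered = sorted(kris, key=lambda k: not k.breached())
    return [(k.name, k.owner, "BREACH" if k.breached() else "OK") for k in ordered]

kris = [
    KRI("Failed trade settlements / day", "Operations", threshold=25, current_value=12),
    KRI("Open audit findings > 90 days", "Internal Audit", threshold=5, current_value=9),
]
for name, owner, status in dashboard(kris):
    print(f"{status:6} {name} ({owner})")
```

The point of the sketch is the surfacing logic: a breached indicator floats to the top of the report so executives see policy exceptions before the metrics that are within tolerance.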
Munich-based Allianz spent much of 2008 and 2009 focused on infrastructure and Pillar I of Solvency II. The company selected OpenPages ORM (Operational Risk Management) for loss data capture, risk self-assessment and quantitative scenario analysis. The operational risk framework involves the introduction of an updated methodology, improved business processes and new IT support systems. The goal is to integrate pragmatic operational risk management techniques in core business operations and decision-making processes.
Allianz hopes that their efforts for Solvency II will form the basis of a deeper change in terms of building a risk management culture and the ability to generate good business from a risk and return perspective.
Canada’s oldest and fourth largest bank has proven a successful risk management framework can reduce risk exposure – even in the midst of a global economic meltdown. Hamish Lock, head of operational risk at Bank of Montreal and a valued OpenPages customer, is featured on the cover of the November 2009 OpRisk & Compliance Magazine. In a very candid interview, Hamish describes how he took on his new role as head of operational risk just over a year ago in the midst of the financial crisis and has been able to steer his bank through the crisis relatively unscathed.
“The key goal I’ve been trying to achieve,” said Hamish in the interview, “is to increase the transparency and awareness of where operational risk lies within the bank; ensure ownership is clear and to identify it; and to talk about it in a specific and informed fashion.”
He attributes his success to the value the bank places on effective operational risk management. “It is about maintaining and enhancing the overall risk management capability at the firm. We would be doing a lot of these things even if they weren’t requirements. We believe there is value in trying to continue to evolve the way we manage operational risk as better tools are developed and the discipline matures. I’ve been involved directly in operational risk for almost eight years and there has been a huge amount of maturity in that time, in terms of the way we identify, and particularly the ways we assess and measure, the risks. We would want to continue to do that regardless.”
To view the interview in its entirety, click here.
The Shareholder Bill of Rights Act of 2009 submitted by Senators Schumer and Cantwell addresses one of the key issues in the current financial crisis, that of corporate governance. While the NYSE has a rule that the board must articulate its enterprise risk management strategy, such a prescription has yet to be enshrined in law. The Schumer Bill addresses that:
“(A) IN GENERAL.—Each issuer shall…establish a risk committee, comprised entirely of independent directors, which shall be responsible for the establishment and evaluation of the risk management practices of the issuer.”
It’s unlikely that this particular bill will pass as written, but the notion that companies will have to formally name a risk committee will certainly shine a light on how companies identify and evaluate risk in their business.
Interestingly, in the UK, the Financial Reporting Council just finished their review of the corporate governance code. There’s an interesting article in Management Today here:
I disagree with the conclusion, however. The ‘comply or explain’ approach will never work. We just learned that lesson from the former investment banks that were supposed to self-regulate in the US. My view is that you can fashion regulation that’s not “over-reaching” (some would say Sarbanes-Oxley falls into this category) yet provides sufficient guidance on operating requirements to actually mitigate real risks.
Although there were differing opinions about the main causes of the current financial crisis, most speakers at RiskMinds in Geneva were unanimous in their belief that the worst is still to come in what many were referring to as the “Great Recession.”
Robert Shiller of Yale University drew many parallels between the Great Depression and today’s crisis. For example, we have lost 60% of the stock market value since the 2000 high, while during the Great Depression there was an 80% drop. But Shiller refuted many of the commonly believed causes of the current crisis such as weak underwriting standards, unsound risk management practices, increasingly opaque financial products, and aggressive leverage. He maintains that the speculative bubbles in both the real estate and stock markets were largely to blame for the worst financial crisis since 1929.
Maureen Miskovic, CRO at State Street, opened her presentation with a quote from Dickens’ A Tale of Two Cities: “It was the best of times, it was the worst of times, …” and went on to claim that we are in the midst of a financial revolution. Miskovic predicts that we will see unemployment levels of close to 10% in the U.S. next year, which will in turn cause problems in the prime mortgage market. She also predicted that the current political climate will result in punitive regulation which will transform the large U.S. banks into institutions that are very similar to public utilities (increased disclosure, more transparency, and intrusive examination).
Zanny Minton Beddoes, Global Economics Editor at The Economist, predicts that shrinking personal wealth will greatly affect demand and eventually push the world into depression-era economics. She stated that the current situation is unlike other post-war recessions due to the asset bubble burst, so we are in for a deep, long recession. She also fears an anti-market backlash, which could result in subsidy wars and protectionist policies.
While the speakers painted a picture of doom and gloom, they were clear about the increasing role that risk managers need to play in helping financial institutions restore confidence and trust, as well as create a sense of opportunity in the financial markets.
I’ll summarize some of their recommendations in tomorrow’s blog.
I’ll be the last one to tell you that a strong central risk management function is a bad thing. Unfortunately, many organizations make the mistake of investing only in a centralized function because it’s too difficult to federate risk management and push it to lower levels of responsibility in the organization. It’s a classic consistency vs. quality of information problem.
Accurate information lies at the business line level – a manufacturing company’s CRO may not know that the company is throwing away millions of dollars a year due to a lack of quality suppliers, but the supplier quality manager certainly does. The challenge is that it’s traditionally very expensive to consolidate this local, lower-level information. Organizations attempt to survey and assess process owners, but the information comes back in various formats, of various levels of quality, and it leads to information silos – it’s impossible to get an apples-to-apples comparison. Out of frustration, many of these efforts fail, leading to a strong centralized risk function.
Organizations must augment their centralized risk management efforts with localized, distributed data, and the only way to reliably do that is to invest in automated technology solutions.
I just returned from the Gartner Security and Risk Summit where IT risk and compliance was a featured topic. In a recent blog post, I mentioned that Gartner Research VP French Caldwell presented a session titled “Selecting and Applying GRC Frameworks and Standards,” in which he polled the audience on “which areas are you most likely to apply standards?” Not surprisingly, IT risk and IT security ranked highest followed by regulatory compliance and enterprise risk. We hear every day how companies are grappling with compliance requirements of hundreds of regulations, standards and guidelines that include thousands of overlapping controls and which make the task of managing IT compliance an increasingly daunting one.
With this challenge in mind, the folks at Network Frontiers developed the Unified Compliance Framework (UCF) – the first and largest independent initiative to map IT controls across international regulations, standards, guidelines and best practices. The UCF indexes over 400 laws, regulations, standards and guidelines into a set of integrated controls, reducing over 20,000 citations to fewer than 2,700 harmonized activities.
OpenPages partnered with Network Frontiers to integrate the UCF with the OpenPages Platform, thus allowing IT risk and compliance directors to identify where the greatest risk of non-compliance exists from both a business and IT perspective and prioritize resources accordingly. Pairing this approach with a harmonized requirements and control framework, companies are able to reduce redundancy and duplication of effort and achieve an effective and efficient testing and monitoring program.
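The core mechanic behind this kind of harmonization is a many-to-one mapping: multiple citations from different authorities collapse onto a single control activity, so one test produces evidence for every citation it satisfies. A minimal sketch, with invented citation and control names (not the actual UCF mapping):

```python
# Map overlapping citations from multiple authorities onto one harmonized
# control activity, so each activity is tested once and evidence is reused.
# All citation references and control IDs below are illustrative only.
citation_to_control = {
    ("SOX", "404(a)"): "ACT-001: Review access to financial systems quarterly",
    ("PCI-DSS", "7.1"): "ACT-001: Review access to financial systems quarterly",
    ("ISO-27001", "A.9.2"): "ACT-001: Review access to financial systems quarterly",
    ("HIPAA", "164.312(a)"): "ACT-002: Encrypt data at rest",
}

def harmonize(mapping):
    """Invert the citation map: one control -> all the citations it satisfies."""
    controls = {}
    for citation, control in mapping.items():
        controls.setdefault(control, []).append(citation)
    return controls

controls = harmonize(citation_to_control)
# Four citations collapse to two harmonized activities; testing ACT-001 once
# produces evidence for three separate regulatory requirements.
```

Scaled up from 4 citations to 20,000, this inversion is exactly why a harmonized framework reduces redundant testing: the test count tracks the number of distinct activities, not the number of citations.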
RIMS 2010 kicked off in Boston this week with no signs of an economic slowdown. The Risk & Insurance Management Society Inc. (RIMS) is celebrating its 60th anniversary on the historic Boston waterfront with its annual conference being held at the Boston Convention and Exhibition Center. RIMS includes more than 10,000 risk managers from over 3,500 organizations ranging from Fortune 500 enterprises to government, nonprofit and service organizations.
The conference, now in its 19th year, includes sessions on Enterprise Risk Management, Loss Control, Finance, Risk Management and Insurance, among others. RIMS president Terry Fleming described the past 10 years as the most important in RIMS’ development: “risk management as a discipline has been thrust into center stage in the wake of catastrophe after catastrophe, including the global financial meltdown. The need for risk management has been highlighted more than ever before and RIMS has stepped up to the proverbial plate by testifying before Congress, identifying new areas of interest in the discipline, creating inroads abroad and crafting the very definition of enterprise risk management.”
Widely reported today was Bank of America’s replacement of its Chief Risk Officer, Amy Woods Brinkley, apparently at the behest of the government, which is eager to improve the risk management capability at the bank. Also, perhaps coincidentally (or maybe not), the Wall Street Journal broke a story here about an internal struggle between the FDIC on one side and the OCC and Fed on the other. Apparently, the FDIC under Sheila Bair is pressing the government to change the internal rating it uses to gauge the health of financial institutions. Such a rating change would allow the government to apply more pressure on an institution to change key managers in the business. Maybe Brinkley’s replacement is a result of this dynamic. What’s clear is that the risk management function will be in the cross-hairs of government regulators going forward.
As we mentioned last week, during the heyday of buying for Sarbanes-Oxley (SOX) compliance solutions, many companies put in place technology platforms that now support a variety of risk and compliance initiatives. SOX solutions were generally purchased with the tacit approval of IT, but, given the range of solutions currently in deployment (spreadsheets, custom applications using Microsoft Access as a platform, and COTS SOX solutions), it is clear that IT never standardized on a strategy for managing risk and compliance data. The result is that today CIOs have an opportunity to either leverage their existing technology or put in place a standard platform to support risk and compliance data and processes.
The reality is that many CIOs continue to allow the business to buy disparate platforms for different GRC solutions. In numerous buying decisions, IT is at the table to support solution implementation rather than thinking about the long term strategic benefits of a common GRC platform. Just as disparate customer data marts drove down customer satisfaction rates and hampered sales efforts, leading to the rise of the CRM market, so too will scattered risk and compliance data marts cause an immense amount of pain for risk managers trying to get a clear picture of risk throughout the business.
The New York Times is reporting a story today about banks slowing their foreclosure processes after cutting corners to speed their way through the legal process. It turns out that the foreclosure process requires lots of signatures by the foreclosing entity, and some banks deployed robo-signers – people who signed up to 10,000 documents a month. The problem is that part of what they sign says that they personally had reviewed all the documentation, which, of course, is not possible.
The result of this realized operational risk is that foreclosures and subsequent sales have slowed, tying up banks’ capital for longer. We have long argued that operational risk is the linchpin in risk management. A linchpin keeps the wheel from sliding off the axle it rides on. The analytic approach used to value credit portfolios is a critical component of the overall risk management process, but the operating risks are no less important, as we read today.
The PCAOB’s Auditing Standard 5 (AS5) is structured around a top-down approach to identifying the most important controls to test during your Sarbanes-Oxley (SOX) effort – those that address the assessed risk of misstatement for each relevant financial assertion.
At OPUS 2010, Jo Morton, Business Analyst, Internal Audit at Williams Companies, Inc., and Lawrence Joiner, Manager of Internal Audit Operations at Williams, presented an informative session titled “An OpenPages Approach to Auditing Standard 5 Compliance.” In their session, Jo and Lawrence outlined how Williams has been able to move beyond a “process by process” review and up to an Account Level review that truly is an AS5 “Top-down Approach.” In the following conversation, Jo Morton describes her session and her overall OPUS 2010 experience.
“Better Collaboration with the Business” was in the #2 spot on our 2010 GRC Wish List and it talked about the need to embed risk management within the business by incorporating risk management practices into everyday business processes. Business line managers should be making risk-based decisions. But this requires them to be able to use internal sources of risk data from across the enterprise and, when available, external risk data.
Another major area of concern is how the constantly increasing and changing array of rules, regulations and industry standards is affecting existing processes and systems. In many cases, the technology solutions that support these processes are under extreme pressure and cannot adapt to satisfy the business needs. Meeting these regulations and standards requires gathering and storing risk data over a significant time frame. It also requires integrated risk reporting of the data for easy consumption by internal and external constituencies such as senior management and regulators.
Our #3 item, “Robust Organizational Risk Culture,” talked about how technology can play a role in helping to create a robust risk culture. But it is clear that technology is an enabler and not a complete solution. Businesses must evolve their risk management methodologies to meet these changing requirements. The goal is to establish an effective enterprise-wide risk management program that is flexible enough to respond to change and tailored to an organization’s corporate strategies, business activities and external environment.
Many organizations that I work with are examining their risk management practices and are expecting to make significant changes in 2010. Investment in risk management systems, processes and technologies will be an essential step for many organizations. What is your organization doing to improve the effectiveness of its risk management processes and systems this coming year?
We’ve learned that operational risks played a big part in the losses associated with the current crisis. Now that companies are rethinking their risk management strategies moving forward, many are hoping to leverage operational risk management to improve performance in other risk management disciplines.
In a recent webinar, John Wheeler of Wheelhouse Advisors (formerly Senior Vice President and Senior Risk Officer within the Corporate Risk Management division at SunTrust Banks – see case study) spoke to some of the root causes of the crisis associated with operational risk management, and how moving forward operational risk management can be leveraged for strategic advantage.
What are some of the key linkages between operational risk and other risk disciplines at your company?
I recently had the pleasure of presenting with Richard Brilliant, Carnival’s vice president and chief audit executive of Audit Services, in a Compliance Week webinar titled “Leveraging the Power of Integrated Risk Management.” Richard began his presentation by asking a very telling question: “Who specifically is best suited to manage risk in your organization?” The answer, of course, was “Everyone.” After all, enterprise risk management is about managing risks across multiple risk and compliance disciplines as well as across multiple business units. In other words, ERM requires everyone’s participation to be truly effective, and risk awareness must be instilled at all levels of the organization. In the case of Carnival, this meant integrating risk, compliance and audit activities across business lines to support ERM and audit views of risk that were different but synergistic.
More and more, organizations like Carnival are taking a risk-based approach to governance and utilizing audit to provide an independent assessment of risk exposure as well as better transparency and accountability. As I pointed out in a recent blog post, technology plays a critical role in an organization’s ability to implement an effective enterprise risk management framework that provides transparency and drives accountability. For Carnival, an integrated risk management program enabled the audit team to provide “independent, objective assurance and consulting” and deliver “a systematic, disciplined approach to evaluate and improve the effectiveness of risk management, control, and governance processes”.
The other (less talked about) benefit of an integrated risk management program is reduced cost. With improved efficiencies for controls, control tests and auditing along with reduced internal overhead and maintenance for reporting products and support systems, ERM solutions deliver rapid returns on investment.
To learn more about the power of an integrated risk management program, check out the archived Webinar:
The RIMS conference hits full stride today with luncheon keynote speaker Nassim Nicholas Taleb – author of ‘The Black Swan: The Impact of the Highly Improbable.’ Taleb’s book, the #1 highest-selling nonfiction book published in 2007 on Amazon, is based on the notion that low-frequency (rare) events such as a ‘black swan’ are unknowable or highly improbable, yet often have the highest impact. With a second edition due out in May, Taleb brings a unique perspective to risk management and life in general. He recently Tweeted: “Social media are antisocial, health foods are empirically unhealthy, knowledge workers are ignorant, & social sciences aren’t scientific.” On his home page, he describes his philosophy as:
“I am interested in a systematic program of how to live in a world we don’t understand very well –in other words, while most human thought (particularly since the enlightenment) has focused us on how to turn knowledge into decisions, I focus on how to turn lack of information, lack of understanding, and lack of “knowledge” into decisions –how not to be a “turkey”. My last book The Black Swan (and the 4th Quadrant papers) drew a map of what we don’t understand (the ONLY attempt in the history of thought to set a clear and systematic limit to what we don’t know); my current work focuses on how to domesticate the unknown “what to do in a world we don’t understand.”
His keynote is a can’t miss event for risk managers (or anyone in need of some soul searching!).
It’s become clear that a risk-aware corporate culture is of critical importance to an organization. In the past year alone, there have been various examples in the news where a lack of risk-aware corporate culture has hurt companies, some beyond repair. Rick Steinberg, Founder and CEO of Steinberg Governance Advisors, Inc., recently joined OpenPages’ Gordon Burnes for a webinar discussion on how to develop a risk-aware culture, and the role technology can play in transforming an organization’s approach to enterprise risk management.
How does your organization promote a risk-aware culture?
In Observations on Risk Management Practices during the Recent Market Turbulence, the Senior Supervisors Group, which consists of US, UK, Swiss, French and German regulators, took a look at a number of global financial services institutions during the period of recent market turmoil. These institutions included the largest financial services firms in the world. The regulators zeroed in on exposure to the securitization of US subprime mortgage-related credit.
According to the report introduction penned by William Rutledge, Chairman of the NY Fed, "firms that avoided such problems [losses associated with such exposure] demonstrated a comprehensive approach to viewing firm-wide exposures and risk, sharing quantitative and qualitative information more effectively across the firm and engaging in more effective dialog across the management team."
What’s interesting here is that the regulators called out the ability of senior management to share risk information across silos, to discuss how exposures and risks all came together at the top of the business. This is certainly about risk culture, but it’s also about having access to that information so that it can be shared in the first place, which is really a systems problem. Regardless, it’s pretty clear that the days of siloed risk management are going to come to an end. Senior management must look at risk across the business in a more holistic way. It would be overly simplistic to say that Bear Stearns collapsed because of siloed risk management, but for anyone who’s ever read Memos From the Chairman, it’s hard to imagine this happening to a firm once run by Ace Greenberg, who championed a culture that had little tolerance for festering problems.
In November, I blogged about the difference between IT Risk Management and Information Security. For the full post, read here.
There’s a big difference between tactical execution and strategic oversight. Therein lies the challenge with most information security programs; they place far too much emphasis on the how and what, and far too little on the why. Information risk management, on the other hand, is necessary to prioritize efforts, and concerns itself with the why.
The problem (and it’s a good problem to have) is that we’ve got a lot of great information available to us regarding how and what. There are libraries of control checklists from numerous standards organizations that provide great common-practice guidance around how to secure information assets. As new vulnerabilities are discovered, new patches and workarounds are circulated and proactively communicated through a huge number of alerting services. Modern information security practices are mostly controls-based – i.e., they focus on the what. They largely ignore the why – the element of business risk – because it’s too difficult to understand.
Where this approach falls down is that there will always be far too much to do. There are too many vulnerabilities to remediate and too many controls to implement across the typical enterprise. As a result, critical deficiencies will go unmanaged. True risk management requires a business perspective on these deficiencies. Only with that business risk perspective is it possible to focus on doing the right things first. That’s lacking in the vast majority of modern businesses, and as a result, time is wasted and risk posture suffers.
The announcement of IBM’s intention to acquire OpenPages generated volumes of editorial response and news coverage in today’s world of instant publishing. The news, which provoked a very positive response across the board from OpenPages customers, prospects, media and analysts, has generated over 1,400 ‘tweets’, numerous news stories and some thought-provoking analysis from industry analysts.
In particular, Chris McClean of Forrester raised an interesting point in his blog coverage noting that acquisitions in the GRC market over the past two years have resulted in not only vendor consolidation, but also market fragmentation. He points out that the Thomson Reuters acquisition of Paisley was meant to ‘strengthen its tax and accounting business’, while EMC acquired Archer ‘as a dashboard (at least initially) to pull together IT risk data and processes,’ whereas the IBM acquisition of OpenPages ‘will likely turn the company more toward higher-level corporate performance and enterprise risk management.’ I think Chris is as usual on target, yet would respectfully add that integration with the control infrastructure allows OpenPages to instrument the risk assessment and control testing process, thereby delivering the only comprehensive solution on the market.
Also published recently is Gartner’s ‘First-Take’ on the acquisition in which analysts French Caldwell and John Hagerty report that they are expecting a ‘Market Split’ whereby the vendor landscape will be divided between those that have coupled qualitative risk assessments with quantitative risk analytics, and those that provide just qualitative risk assessments: ‘Vendors that have a risk intelligence strategy would compete for large accounts with combined risk analytics and traditional governance, risk management and compliance (GRC) management functionality, while those without risk analytics capabilities would address less-quantitative risk assessments, compliance and audit management.’
If you’re a risk manager or a business manager, the integration of risk analytics with GRC management will provide your business with more timely and more accurate information to understand the risk exposure to the business and help you make better decisions.
OpenPages will be hosting a webcast with Compliance Week titled “The Future of Compliance,” in which featured speaker Chris McClean of Forrester Research, Inc. will discuss how, as regulatory pressures continue to mount from new regulations such as the Dodd-Frank Act, businesses need to adopt a comprehensive, risk-based view of the organization’s regulatory responsibilities and gain early exposure to potential compliance gaps.
This morning’s featured panel discussion at GARP includes several CROs and senior risk practitioners from Morgan Stanley, The Vanguard Group, Credit Suisse and Western Asset Management.
The first topic was VAR. VAR works in “normal markets.” There is a question of what the appropriate time window is, and one panelist remarked that it would be good to have better regulatory consistency on this issue: should companies be focused on a 1-year or a 4-year timeframe, for instance?
VAR tends to distract you from the tails, and one panelist remarked that “you really need to stay focused on the tails” – e.g. gap risk, liquidity risk, etc. The panelist went on to say that he’s really focused on the deep downside risk: how much money could the position/desk possibly lose? You have to be very dynamic in thinking about where you can be hit next.
The third panelist asserted, “I think VAR is worthless and pernicious and should be banned,” noting that it’s not a coherent risk measure (99.9% VAR doesn’t handle a 1 in 200 year event). He also pointed out that it doesn’t encourage diversification. He focused on scenario analysis but said that there is no easy answer.
Another panelist defended VAR as a tool that has its pluses and minuses.
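The panelists’ complaint that VAR distracts from the tails can be made concrete with a toy historical simulation. The sketch below uses made-up daily P&L, not any firm’s methodology: VAR reports only the loss at a quantile, while expected shortfall averages the losses beyond it, so a single catastrophic day barely moves the 99% VAR but dominates the shortfall.

```python
import random

def var_and_es(pnl, confidence=0.99):
    """Historical-simulation VAR and expected shortfall from a P&L sample."""
    losses = sorted(-p for p in pnl)     # losses as positive numbers, worst last
    cutoff = int(len(losses) * confidence)
    var = losses[cutoff]                 # loss at the confidence quantile
    tail = losses[cutoff:]               # the losses VAR says nothing about
    es = sum(tail) / len(tail)           # average severity beyond the quantile
    return var, es

random.seed(0)
pnl = [random.gauss(0, 1) for _ in range(1000)]  # 1,000 ordinary trading days
pnl[0] = -50.0   # one catastrophic day: barely moves 99% VAR, dominates ES
var, es = var_and_es(pnl, 0.99)
print(f"99% VAR = {var:.2f}, expected shortfall = {es:.2f}")
```

Both portfolios with and without the catastrophic day report nearly the same VAR, which is exactly the “distraction from the tails” the panel was debating; expected shortfall, by contrast, moves sharply when the tail does.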
The panelists then turned to the role of risk managers in predicting the future (in the context of the financial crisis). If risk is a lack of information about the future, many companies failed to hedge when the future was very cloudy (a lack of information). One panelist noted that in many cases the risk management failure was more than just a matter of the technical capability of knowing what to do; it was actually a failure to be able to drive action.
The question of the changing regulatory landscape came up, with one panelist joking that CRO stands for Chief Regulatory Officer now. Another joked that he’s trying to stay away from the regulatory topic because they don’t know whether what they do will be “legal or illegal” under reg reform.
There was agreement that the FDIC has been very successful in carrying out its mission. But one panelist said that in the near term we don’t seem to be on a path toward getting an effective systemic risk regulator. Another said that we’re creating systemic risk through regulatory uncertainty.
Risk management should be viewed as a competency that is embedded in the organization. Coming in at #2 on the 2010 GRC Wish List, however, “Better Collaboration with the Business” reflects the lack of understanding and poor communication that exists today between the risk function and business managers.
Surveys have shown that only 40 percent of respondents find the importance of risk management to be widely understood throughout the company, suggesting that more needs to be done to embed risk culture and risk thinking more deeply in the institution.
Incorporating risk management into everyday business processes will enable executives to focus on those elements of their risk activity that have the greatest positive impact on the organization.
Business managers can spend less time on assessments and more time on proactively managing risk and processes to meet company objectives.
Providing enhanced visibility into the risk landscape, integrated risk management empowers business managers to make smarter decisions that maximize value, reduce costs and balance risk with returns. When embedded into everyday processes at all levels of the organization, risk management will drive business performance.
This must be the season for surveys on risk management. E&Y recently published their perspective on risk management. They surveyed over 500 companies around the globe, almost 80% of which have over $1 billion in revenue, and the sample represents a variety of industries. You can find the report here.
Only 1% of those surveyed said they would be spending less on risk management in the next 12 to 24 months, which clearly makes sense given recent events; however, putting this in context of overall corporate revenue declines suggests that risk management will emerge from the recent downturn with a large share of the overall spending pie.
E&Y zeroed in on the complexity of risk management at most organizations:
“Over the past few decades, the number of risk management functions has grown to the point where most large companies have seven or more separate risk functions — not counting their independent financial auditor. This has created inefficiencies and resulted in a degree of fatigue on the business.
As the number of risk functions increases, coordination becomes more difficult and often results in coverage gaps and overlapping responsibilities. The demands and various reporting requirements placed on the business by these risk functions can become significant and burdensome. The number of risk functions and the various communications from these functions can be a challenge for executives and the board of directors to manage and understand.”
In fact, over 90% of those surveyed indicated that there is overlapping coverage in two or more risk functions. Now, redundancy isn’t always a bad idea, but my guess is that a substantial portion of the overlapping coverage is the result of inefficient processes and technology infrastructure. The argument for efficiency addresses not just the business fatigue that multiple risk management functions create but also the huge infrastructure cost of supporting multiple platforms and processes. Interestingly, 61% said that they planned to commit no additional resources to augment their capabilities, which means that incremental spending will have to come from savings, which by our measure can amount to millions of dollars per year. Please contact us if you’re interested in exploring those savings at your organization.
President Obama announced his long-awaited proposals for the overhaul of the regulatory system for the financial services sector. There were no real surprises, as the administration has been seeking input from many different parties for months and surrogates have been floating specific proposals over the last couple weeks. The so-called White Paper (www.financialstability.org) that outlines the administration’s thinking is pretty stark in places: “it is clear now that the government could have done more to prevent many of these problems from growing out of control and threatening the stability of our financial system.” (Page 2).
It’s clear that the main thrust of the regulatory reform is around systemic risk and consumer protection. Addressing the systemic risk issue, the Administration is proposing a Financial Services Oversight Council. What’s interesting here is that they scrapped the alternative of consolidating such power in a single agency, e.g. the Federal Reserve. Other proposals to address systemic risk include the creation of a category called Tier 1 FHC which the Fed will oversee, a revision of capital requirements for Tier 1 FHCs, a new Federal Bank Supervisor to conduct prudential supervision of other banks, the regulation of investment banks by the Fed, and other proposals. On consumer protection, the Administration is proposing a new Consumer Financial Protection Agency.
What is utterly lacking in these proposals is any sort of programmatic approach to intra-company risk management. There’s a brief mention of this issue:
“Prudential standards for Tier 1 FHCs—including capital, liquidity, and risk management standards—should be stricter and more conservative than those applicable to other financial firms to account for the greater risks that their potential failure would impose on the financial systems.”
How companies will react to these “stricter standards” remains to be seen. But as taxpayers holding a large stake in several of these Tier 1 FHCs, we should all want greater transparency into the control environment, decision-making process around risk management issues, and the organizational structure around risk management. These proposals do not address those issues.
CapGemini hosted a conversation on enterprise risk management this morning at GARP. Panelists touched on a number of issues that need to be tackled for successful enterprise risk management:
Helga Houston from Phoenix Global Advisors pointed out that many banking institutions grew very rapidly over the last 10 years and for the most part the risk management infrastructure didn’t keep pace.
Bradley Farris of BB&T agreed with Houston and added that the “demands on the data side are incredible.”
Houston touched on another key point: risk information surfaced to the business needs to drive dialog with the business. Everyone agreed that risk management needs to engage with the business, to reinvent language so that risk managers can have fruitful conversations with the business. Her point was that without buy-in from the business it’s very hard to change processes to mitigate risks.
Panelists also focused on the importance of governance processes and infrastructure to support the dialog with the business. All agreed that the market and credit risk processes are typically well-supported and that there’s a lot of opportunity for improvement in the operational risk domain.
It’s clear that one of the themes of the conference is that risk managers have to engage the business with information and dialog that’s useful to enhancing the performance of the business vs. satisfying risk management needs alone.
ERM, similar to most business processes, is not a “one-size-fits-all” solution. It has to be customized and tailored for each firm. As Mark Olson of the Federal Reserve notes, “An effective enterprise-wide compliance-risk management program is flexible to respond to change and it is tailored to an organization’s corporate strategies, business activities and external environment.” (April 10, 2006)
Companies that try to implement an out-of-the-box methodology will likely fail. ERM methodologies and taxonomies must be adapted to a company’s legal, regulatory, economic and competitive environment, all of which can vary dramatically by industry, and must, of course, be tailored to the company’s internal processes and culture. Further, the risk framework must be able to adapt to change over time to avoid losing competitive advantage.
In my last blog post, I mentioned that the new Financial Stability Oversight Council created under Dodd-Frank will collect risk data from various sources including Federal and State financial regulatory agencies and the newly created Office of Financial Research (OFR). The OFR in turn is responsible for collecting risk data from financial services institutions at the behest of the Council. These additional, external information and reporting requests will not only compound the extensive reporting responsibilities of risk committees and risk managers, but will also likely overlap with internal reporting requirements from Boards and executives.
As the Dodd-Frank rulemaking proceeds in the coming years, reacting to each new rule and regulatory requirement with siloed technology and resource investments will clearly not be effective. The financial crisis of 2008 highlighted the interdependency of risks across an enterprise (credit, market, operational) which need to be managed holistically rather than in traditional silos. A siloed approach limits an organization’s ability to streamline risk and compliance processes and reduce costs. It also obscures the opportunity to integrate risk and compliance to gain a comprehensive view of the firm’s risk exposure.
Gordon Burnes commented in a recent blog post that “as companies put in place this information architecture to surface enterprise risk exposure, thinking about interdependencies will be critical to reduce cost.” I’ve worked with numerous OpenPages customers who are actively managing multiple risk and compliance programs on a single framework. The impetus behind these initiatives varies from the need to review enterprise risk and control performance at executive and Board-level meetings, to Federal regulator demands, to the need to simplify and rationalize risk and control assessments. A large, OpenPages financial services customer recently completed the convergence of risk assessments across all risk and compliance programs with the explicit intention of monitoring risk exposure across their business.
Moving forward, as new Dodd-Frank requirements emerge, financial services institutions will require a converged information architecture that supports multiple risk and compliance initiatives on a single framework. An integrated risk and compliance framework can reduce the disparate databases and reporting structures, while at the same time meeting internal and external reporting requirements more efficiently. Whatever risk disciplines are significant within your firm, the goal should be to integrate them within a single framework that produces a holistic view of your risk landscape, while meeting the needs of regulatory agencies.
The Senate voted on May 20 to close debate on a far-reaching financial regulatory bill, which clears the path for Congress to approve a broad expansion of government oversight of the increasingly complex financial markets. The goal is to prevent a repeat of the 2008 economic crisis and, according to President Obama, the new financial regulations will “protect consumers, protect our economy, and hold Wall Street accountable.”
I have read several books on the financial crisis over the last couple of months, including:
The common theme through all of these books is that there is plenty of blame to spread around when looking for the root causes of the financial crisis. Mistakes made by regulators, legislators, and ratings agencies had as much to do with the crisis as the greedy and heedless Wall Street firms that were turning subprime mortgages into exotic, toxic financial products that they made a fortune laundering and reselling.
The danger posed by this deranged edifice built on the unstable foundation of subprime mortgages, and the insanity of the growing and highly leveraged trade in mortgage derivatives was not foreseen by government regulators, Treasury officials or the Fed.
Over the last decade, Washington legislators were busy deregulating the Financial Services industry (e.g. the passing of the Gramm-Leach-Bliley Act in 1999 that repealed much of the Glass-Steagall Act) and pressuring financial services companies to provide mortgage loans to low-income groups (Fannie Mae and Freddie Mac were strong-armed into buying subprime loans).
The rating agencies that were supposed to police these securities were completely duped by the financial services companies. They were handing out Triple-A ratings to CDOs composed of adjustable-rate, no-doc, subprime loans. Not to mention the conflict of interest inherent in the fact that the rating agencies are paid by the very firms whose bonds they are asked to judge.
Maybe there is hope on the horizon, since the Senate recently approved a provision that will thrust the government into the middle of the process of determining who rates complex bond deals. Under the new provision, the SEC will establish and oversee a credit-rating board that will act as a middleman between issuers seeking ratings and the rating agencies.
Can we count on the SEC to make an improvement? Harry Markopolos’s book (“No One Would Listen”) about the Bernie Madoff scandal paints a grim picture of the SEC and the incompetence of the people he interacted with when trying to alert them to the multi-billion dollar Ponzi scheme Bernie was running. I recently had lunch with Harry at OPUS (OpenPages User Symposium) and he was very skeptical about all of the new regulations that appear to be headed our way. His view is that the government is focused on the battle we just fought and that the regulators do a poor job at enforcing the regulations in the first place.
Can we really expect that President Obama’s optimism will become reality? Don’t hold your breath.
A traditional model for planning the audit process typically examines 10-20 risk factors for each element of the audit universe and buckets each auditable entity into a risk categorization that drives the frequency with which it is audited. While this approach may have worked well in the past, modern audit departments are being asked to do more with less. The known risk universe gets bigger by the day, and investing in a massive risk evaluation for each entity may not be the best use of resources. Is it worth tying up valuable stakeholders in management and on the audit committee to assess the risk inherent in the coffee procurement process for a remote sales office?
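A minimal sketch of this traditional scoring-and-bucketing model might look like the following. The factor names, rating scale, thresholds, and audit cycles here are purely illustrative assumptions, not drawn from any particular methodology:

```python
# Illustrative sketch of a traditional risk-based audit scheduling model.
# Factor names, the 1-5 rating scale, bucket thresholds, and cycle lengths
# are all hypothetical assumptions for demonstration.

def risk_score(factor_ratings):
    """Composite score: average of per-factor ratings (each rated 1-5)."""
    return sum(factor_ratings.values()) / len(factor_ratings)

def bucket(score):
    """Map a composite score to a risk categorization."""
    if score >= 4.0:
        return "high"
    if score >= 2.5:
        return "medium"
    return "low"

# The categorization drives audit frequency: e.g., high-risk entities are
# audited every year, medium every two years, low every three.
AUDIT_CYCLE_YEARS = {"high": 1, "medium": 2, "low": 3}

entity = {
    "regulatory exposure": 5,
    "transaction volume": 4,
    "prior audit findings": 3,
    "management turnover": 4,
}
score = risk_score(entity)            # (5 + 4 + 3 + 4) / 4 = 4.0
print(bucket(score), AUDIT_CYCLE_YEARS[bucket(score)])  # high 1
```

The sketch makes the cost visible: every entity in the audit universe, however trivial, needs a full set of factor ratings before it can even be scheduled, which is exactly the overhead the top-down approach described below avoids.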
Progressive organizations are turning toward a more agile, top-down approach to risk assessment to drive audit scheduling. This leads to more efficient resource allocation, ensuring auditors are focused on the truly risky areas.
Deloitte ERM professor and OPUS 2008 keynote speaker Mark Beasley just released an update to the NCSU-led ‘Report on the Current State of Enterprise Risk Oversight.’ Written in conjunction with the American Institute of Certified Public Accountants (AICPA), the research focused on how boards and senior management teams are responding to the challenges and increased emphasis on board oversight of risk management processes – particularly in light of the new SEC proxy disclosure rules. The study produced some interesting findings:
Over 63% of respondents believe that the volume and complexity of risks have changed “Extensively” or “A Great Deal” in the last five years
39% of respondents admit they were caught off guard by an operational surprise “Extensively” or “A Great Deal” in the last five years
When boards of directors delegate risk oversight to a board level committee, most (65%) are assigning that task to the audit committee
64% of those audit committees are focusing on financial, operational, or compliance related risks
Only 36% indicate that they also track strategic and/or emerging risks
These findings should be concerning if your organization is looking to meet the requirements of the new SEC disclosure rule, which requires, among other things, that boards describe their risk oversight process. Here are some thoughts for your team to consider as you prepare:
How does your team create and foster the appropriate risk culture?
Have you established a risk management framework for identifying, measuring, monitoring, managing and communicating risks across all functions?
Do you have plans to enhance your approach to risk management by linking strategy, operational execution and critical risks?
We recently hosted a webinar titled ‘Risk Oversight and the New SEC Rule’ which describes the tools, reporting and resources that you’ll need to provide to the board of directors as they look to meet the new SEC ruling. Check it out here.
Accelerated filers of course have long been subject to SOX 404(a), requiring management reporting on the effectiveness of internal control over financial reporting, as well as section (b), which requires auditor attestation. While all have incurred tremendous costs, and some have seen little commensurate benefit, others have seen improvements in business process effectiveness, in internal control beyond financial reporting, and in compliance more broadly. Non-accelerated filers, already subject to management reporting, have gained another reprieve from the auditor attestation requirements of section (b). Great news, many are saying. They hail the opportunity to avoid incurring additional costs and taking focus away from running and growing their businesses.
Recently I came across an article in Directors & Boards by a former colleague of mine that offers a different perspective, which in my view is worth considering. His view is that, in addition to the SEC losing credibility by agreeing to another deferral after making clear and definitive statements that no more would be forthcoming, requiring and adhering to section (b) offers benefits beyond the costs, for a number of reasons: (1) smaller companies traditionally have less sophisticated systems and less experienced individuals in management positions, with statistics showing greater incidences of fraud and restatement of financial results; (2) 404(b) compliance costs have come down with the advent of AS 5 and COSO’s guidance for smaller businesses; (3) studies indicate that companies that are not SOX compliant or have material weaknesses in their internal controls receive a lower valuation, whereas those that are compliant receive higher multiples when sold; (4) these companies are less likely to take advantage of IT solutions that provide enhanced efficiency and management capabilities well beyond better-controlled financial reporting; and (5) CEOs and CFOs who already must certify to the effectiveness of financial reporting controls are on the hook by themselves, without the comfort provided by auditor attestation.
Certainly, these arguments are worth considering by senior management and boards of companies still waiting to see whether and when the 404(b) requirement ultimately will become effective.
OPUS 2008 – our 5th annual OpenPages User Symposium recently concluded with elevated excitement about improving business performance with OpenPages solutions as well as a renewed emphasis on managing risk given the current market turmoil and downturn.
Hopefully you attended and enjoyed the content-rich agenda, with leading keynotes from industry experts as well as a large number of case studies from global organizations deploying enterprise GRC solutions based on OpenPages. In the product direction keynote led by OpenPages’ Gordon Burnes, Pat O’Brien and John Lundgren, a number of thought-provoking and forward-looking questions were asked using an OPUS-first innovative electronic name badge by nTag, which allowed live voting and results tallying.
We polled the 150+ strategic risk, governance and finance professionals from Fortune 1000 companies in the audience and found some very interesting and telling trends. For instance, while industry experts predict that overall IT spending may be flat or down next year, over 90% reported that investments in GRC technology will increase or at least remain the same in 2009, and 90% of those polled expect new laws and regulations to be introduced next year in an effort “to improve corporate risk management oversight.”
Clearly, the results of this year’s survey highlight the impact of the current financial crisis on enterprise risk management efforts and the role GRC management solutions will play in helping customers mitigate risk while integrating and managing all of their risk management practices.
Today we unveiled ten best practices to help companies prepare for a new era of risk and regulatory oversight. In the New Era of Risk Management, companies will seek to integrate risk management silos from across the business and squeeze out additional efficiencies. This integration will reduce costs through a consolidated technology infrastructure and shared processes and will provide better transparency into the interdependencies of risks in the business.
OpenPages is working closely with its customers as they make the transition to a risk-based approach to managing their business. The ten best practices represent immediate actions OpenPages customers are taking and serve as guidance for others to ensure that their organizations are prepared to face new risk and regulatory demands. Check them out here. We also recently published a paper on the New Era of Risk Management.
Let us know how your company is preparing for the new era.
This week we announced another strong quarter including significant growth in software license revenue and continued profitability. Growth was driven by new customer wins across the globe and across industries including North America (Baker Hughes Selects OpenPages for Audit and Financial Controls Management and Union Bank Selects OpenPages Operational Risk Management), Japan (Mizuho Securities Selects OpenPages to Enhance their Operational Risk Management) and South Africa (Old Mutual Selects OpenPages Operational Risk Management Solution).
While the geographies and industries may differ, it is clear that across the globe, companies have a common objective to improve business performance through reduced risk exposure and better allocation of resources.
There are few things more devastating to a chief executive or board of directors than seeing their company’s name splashed across media headlines with allegations of having broken the law. After the initial question of how it could possibly happen here, the focus quickly goes to how best to effect damage control, accompanied by thoughts of billions of dollars in fines, penalties, judgments and lost business, as well as personal exposure, and the knowledge that great amounts of time and energy will be directed to dealing with regulators, lawyers, and investigators instead of growing the business.
It’s fascinating to see that, despite reading of such happenings at other companies, somehow many top managements can’t imagine it happening to them. Hence, too often companies put in place a code of conduct and ancillary policies, a whistleblower channel, and perhaps even a compliance officer – all useful elements – but which fall far short of an effective compliance program. And with each new law or regulation, a new policy and related procedures are installed, frequently duplicating existing procedures but still falling terribly short of an effective program. So we see fragmented and duplicative procedures that are administratively burdensome and often outdated, while the significant risks of non-compliance continue to grow.
In contrast, leading companies are proactively dealing with the associated risks. They take a holistic approach, first recognizing that laws and regulations were set forth in the first place as a reaction to damage to someone – customers, employees, investors or communities. And they recognize that companies satisfying related marketplace expectations – with “green” food products, better child safety products, better automobile gas mileage, or more desirable workplace environment – are rewarded with better workers, greater market share, and enhanced profits. With this recognition, they design a compliance program not only to ensure minimum compliance, but to seize related business opportunities geared to the underlying marketplace drivers. The compliance program is built into strategic objectives, and is risk-based and streamlined, with clarity around responsibilities and accountability, and supported by technology with meaningful communication and reporting.
Yes, there is an initial cost to doing this right, and a chief executive will expect to see a rational business case made for establishing such a program. But the benefits are real, and the CEO and board members will sleep better at night knowing an effective compliance program is in place in their company.
The Treasury is expected to announce this afternoon the long-awaited results of the so-called banking stress tests. They’ve done a good job leaking the key bits of information this week so the market has had time to adjust. Actually, they’ve done a good job attenuating the whole banking system assessment process, which has allowed time for investor sentiment to improve with the recent glimmers of hope for the economy that some prognosticators are seeing. What will be interesting is to see the parameters of the tests. I’ve read that the tests assume a worst-case scenario of job loss that we’re very likely to exceed, but I’ll wait to see the fine print.
I was very interested in Warren Buffett’s comment over the weekend. Widely reported and noted in the Boston Globe here, Buffett said that the stress tests largely ignored the strength of the bank’s business models. This question of the viability of the business model going forward is an interesting one, and an important aspect of any ongoing business model would be how the risk management procedures will change to avoid similar problems in the future. Raising capital or creating a bad bank or any of the other strategies to deal with toxic assets don’t address the fundamental risk management weakness that got us into this mess in the first place.
I would be interested in seeing an assessment of the banks’ ability to identify and manage risk moving forward. And a key dimension of that capability is their stance on operational risk. Marc Leipoldt in the recent issue of OpRisk and Compliance (requires login) argues that now is the time for operational risk to assert itself across the business and become "central to the bank’s risk management". Clearly, many of the issues facing banks today were the result of realized operational risks that may have slipped through the cracks between the market and credit risk functions. For instance, where would we be if the incidence of mortgage fraud were dramatically lower? According to some banking insiders, at certain players this accounts for a good portion of the bad debt they’re having to account for.
The stress tests will certainly be a good snapshot of where we are, but what about our ability to manage risk going forward?
In the wake of Dodd-Frank’s passage, Chris McClean of Forrester Research commented that there are nearly 200 regulatory changes still on the U.S. federal agenda that span industry verticals such as finance, healthcare, and consumer protection.
As regulatory pressures continue to mount, organizations that adopt a more practical regulatory management approach across the enterprise will be able to react more quickly to regulatory change and decrease costs and complexity, while gaining valuable insight into the risks that could affect corporate performance in the form of legal action, fines and penalties, or a decline in company/brand loyalty.
The recently announced OpenPages 6.0 includes significant enhancements to the Policy and Compliance Management (PCM) module that allow organizations to react quickly to changes in regulatory mandates and to manage regulator interactions effectively:
Regulatory Change Management — lets users easily communicate, track, and manage regulatory change and enables quicker reactions.
Regulator Interaction Management — provides workflow enablement to help users prepare for and manage complex regulator interactions.
Policy Lifecycle Management — offers a new user-friendly view to consolidate policy details with configurable field/template definitions.
Policy and compliance management software is playing an increasingly important role in the business, allowing companies to easily communicate changes in laws and regulations and enabling the business to react more quickly.
Compliance Week’s second annual eConference is just around the corner and kicking off the conference will be Rick Steinberg, founder and CEO of Steinberg Governance Advisors. Rick has a wealth of experience in corporate governance and in particular, the board-management interface as he advises boards of directors – and their governance, audit and other committees – of Fortune 100 companies, mid-size corporations, major institutional investors and leading universities, as well as federal governmental bodies.
In the first session of the event titled, “Aligning Risk Reporting with Risk Oversight,” Rick will outline how most boards believe that the CRO is solely responsible for all things risk-related, and that the CCO is solely responsible for all things compliance-related – which in reality, is virtually impossible. He’ll explain that the CRO and CCO are responsible for ensuring that there is an effective risk and compliance process in place to reduce exposure and litigation and that the CRO and CCO need to be sure they are giving the board the appropriate level of information needed to govern. In his presentation, Rick will describe how companies need a programmatic way to report on risk, controls, issues, and other risk and compliance related information to support the senior executives and board.
The G-20 met last weekend to discuss the state of the world’s economy and coordination of economic policy. Perhaps the most significant output of the meeting was the set of group photos, which clearly illustrated that the major players in the world’s financial system have expanded beyond the tiny group of seven countries that made up the initial G7. These emerging economies, like Brazil and China, made it very clear that the world’s financial institutions — like the World Bank and IMF — must be governed more democratically.
While there was little concrete output, there was a set of actions to be taken to strengthen the world’s economy that highlighted the need for risk management and internal controls. My favorite bit was the following:
“Regulators should develop enhanced guidance to strengthen banks’ risk management practices, in line with international best practices, and should encourage financial firms to reexamine their internal controls and implement strengthened policies for sound risk management.”
OpenPages is doing this today with some of the world’s largest banks, one of which we will be announcing in the coming weeks. Stay tuned.
Spreadsheet gurus have carved out a significant role in managing financial and operational data in many companies. The problem with this approach is that it’s a) manually intensive and b) highly reliant on the individuals who manage and operate these spreadsheets. Further, the processes for linking, updating and archiving data in spreadsheets are mostly ad hoc, leading to significant risks associated with this data.
Freddie Mac, for instance, in their 2005 annual report noted that their reliance on “end user computing systems” (read: Excel) posed a significant risk to their ability to report accurately on their financial data. More recently, other financial institutions have noted that the Fed and OCC are shining a light on this undocumented spreadsheet problem, looking for more transparency to the data in spreadsheets and file shares.
The reality is that using spreadsheets and file shares for risk and compliance data is a dead end. While companies may be able to get through one cycle of review with internal auditors, a regulator and/or rating agency, the long term implications of adopting a spreadsheet-based architecture for risk and compliance data are extremely problematic. Not only will risk managers have trouble getting visibility into the data because of poor reporting capabilities, but they will also rightly question the accuracy of the data itself. This skepticism is precisely why so many companies are moving off spreadsheets to a more programmatic approach to managing risk and compliance initiatives.
Today we hosted the inaugural OpenPages African Network (OPAN) Summit in Johannesburg, and we’re pleased to report that the level of excitement and the number of risk management executives in attendance surpassed all expectations. Joining the more than 40 attendees were executives from OpenPages partner IQ Business Group and key South African customers including FirstRand Banking and The Absa Group Limited, two of South Africa’s largest financial institutions.
The success of this event highlights the global reach of OpenPages and the common need for effective enterprise risk management practices across all regions. The discussions focused on best practices for implementing operational risk management frameworks and the wave of regulatory change that is expected to sweep the globe. As we saw with this week’s release of the Walker Review, we can expect extensive reforms and regulation of corporate governance in the banking sector in 2010.
Despite the difference in time zones, the financial services industry in South Africa is facing the same issues as in Europe, Asia and North America. While South Africa’s financial markets did not suffer a meltdown to the same degree as other regions, the country’s leaders and South African Reserve Bank (SARB) are considering similar measures as the North American and European governments to ensure that banks are managing risk proactively. OpenPages is grateful to FirstRand for hosting the OPAN Summit and providing a forum for global risk leaders to discuss such important and timely topics.
At last week’s OPEN (the OpenPages European Network), we conducted a survey of attendees to get a better sense of what they thought about the impact of the financial crisis on the regulatory environment and their own approach to risk management. There were some interesting results, especially when compared with those from OPUS, held 11 months prior in October of 2008.
The first question asked whether we’ll see new laws and regulations over corporate risk management oversight within the next year. Just over 80% said they believed we would. What’s interesting is that almost the same percentage said the same thing almost one year ago. The difference is that we’ve seen no new laws or regulations in the past year. In other words, the expectation of regulatory reform is clearly stronger than the reality. Obama’s focus on healthcare, the EU’s debate over various reg reform proposals, and the general resistance to change are all contributing to a lengthening of the reg reform process.
Our second question asked whether the financial and credit crisis has influenced your company’s thinking and approach to risk management. 62% said yes. Eleven months ago only 46% said yes. The difference speaks to what companies have found over the last year that suggests they need to revamp their approach to risk management. Frankly, I am surprised that the number is not higher. Clearly, we all learned that very smart people can make bad decisions; isn’t that something that companies should want to control for?
It seems we can’t pick up a newspaper today without seeing another story on top management compensation, and its role in the near financial system meltdown. As Congress and the Administration wrestle with regulatory reform, fingers continue to point at CEOs and other senior executives who reaped huge rewards for taking what are deemed to be outsized risks – risks that brought some of their companies, and indeed the financial system, to the brink of disaster. The SEC’s new disclosure rules will shed more of a spotlight on executive pay and how companies and boards deal with corporate risk, and anger over “outsized” pay is boiling over in the form of regulatory reform and additional proposed taxes on financial services industry participants.
Certainly executive compensation should recognize the degree of risk inherent in performance. No one wants to see a CEO “bet the ranch” in a “heads the CEO wins, and tails shareholders and the taxpayers lose” scenario. So, yes, getting risk-reward back in balance at the top management level makes eminent sense, and already is under way.
With that said, however, we shouldn’t fall into a trap of thinking that dealing with the compensation issues can by itself address corporate risk. Those of you with leadership roles in risk management, compliance, auditing, and related areas in your organizations know full well that dealing with risk at the CEO level will not by itself transform how risk is managed throughout the organization. One can argue that CEO compensation has played only a limited role in causing financial institutions to take on such massive risks in the first place. Chief executives already have solid motivation to ensure the companies they lead achieve long-term success, and simply keeping their prestigious and lucrative jobs and reputations intact is a strong motivator. CEOs I’ve dealt with put the success of the company at the same if not a higher level than acquiring more personal riches. Make no mistake, many do want to enhance their wealth, and some continue to keep score with peers, but putting their own personal objectives ahead of the company’s and its shareholders’ is not typical.
So, I hope and trust that neither the powers inside the Beltway nor corporate leaders and boards will think risk management is primarily about managing CEOs’ motivations. The focus needs to be on risk management processes throughout the organization, linking risks with corporate objectives and initiatives, and managing risk to best achieve corporate goals.
In addition to the discussions summarized in previous posts, the participants at OpenPages Executive ERM Forum discussed risk quantification for operational risk and compliance. Some interesting ideas surfaced during the discussion.
First, participants agreed that the objective of quantification is for relative sizing and prioritization. In other words, you need to be able to relate the relative severity of a compliance risk in one division to an operational risk in another. This helps allocate resources to the right risks in the business.
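As a rough illustration of this kind of relative sizing, here is a sketch using a hypothetical likelihood-times-impact score on a shared 1-to-5 scale. The divisions, risk names, and scoring method are illustrative, not any participant’s actual methodology:

```python
# Hypothetical relative-sizing sketch: score risks from different
# disciplines on one shared likelihood x impact scale so that a
# compliance risk in one division can be ranked against an
# operational risk in another for resource allocation.

def score(risk):
    # Likelihood and impact are both rated 1-5 on a common scale.
    return risk["likelihood"] * risk["impact"]

risks = [
    {"name": "Compliance gap, Division A", "likelihood": 2, "impact": 5},
    {"name": "Operational risk, Division B", "likelihood": 4, "impact": 3},
    {"name": "Vendor risk, Division C", "likelihood": 3, "impact": 2},
]

# Highest combined score first: a prioritized allocation list.
for r in sorted(risks, key=score, reverse=True):
    print(f"{score(r):2d}  {r['name']}")
```

The point is not the arithmetic but the common scale: once every discipline scores on the same axis, the ranking across divisions falls out directly.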
It’s Easy Being Green
A key challenge that participants discussed was the surfacing of bad news. One participant described their company’s risk rating methodology, in which risks are presented in management reports as red/yellow/green. One quarter, a risk report went up through a chain of approval, with the risks “getting greener” at each level of approval, because of a reluctance to surface bad news.
One way to address this problem is to have a scale that relates to an external benchmark. One participant discussed ranking capability in relation to either other business units or competitors. You can use objective evidence to back up, or challenge, the rankings in this scale.
Of course, all measurement has to be done in relation to tolerance, which for many companies is difficult to quantify. In many cases, boards don’t have an inherent sense of risk tolerance, so management has to give the board specific examples that help frame the discussion around tolerance. The issue of tolerance when discussed in relation to compliance is a difficult one, and when the general counsel’s office is involved, the conversation is typically short, as there’s no real tolerance for non-compliance from a legal perspective. Further, some organizations are even concerned about having the discussion in the first place. One executive noted the differences between US and UK law on this topic, where boards in the UK can refer to discussions about risks as evidence of their discharging their governance responsibilities, whereas in the US boards don’t want to be liable for having discussed, but not fully mitigated, a realized risk.
The airline industry was referenced as one in which zero tolerance has to be the goal, as it’s clearly not acceptable to manage against, say, 1 crash per year or even every 10 years. The question becomes how much you spend to mitigate the risk of ever having a crash. This led to a discussion of catastrophic events and the notional amount at risk for being in business.
One participant had an interesting perspective on how boards can think about tolerance. Investors build out their portfolios to reflect their risk/reward profile. They invest in particular companies because of the risk/reward characteristics of that particular company. Boards should always ask, “Are we taking the kinds of risks that are priced in by our investors?” In other words, are we following our stated strategy? Framing the discussion in this light puts a different perspective on risk exposure.
Expected to be released at RIMS 2010 this week in Boston is a new study on enterprise risk management. Sponsored by Marsh Inc. and the Risk & Insurance Management Society Inc., the study, titled “Excellence in Risk Management VII: Elevating the Practice of Strategic Risk Management,” includes a survey of 418 participants. When asked, “What barriers are in place that may prevent your senior management and board of directors from fully understanding the risk landscape of your organization?,” 40% of the respondents cited “siloed approaches to risk management.” Another 36% cited lack of awareness of concepts such as enterprise or strategic risk management, and 34% cited inadequate representation of the risk management function at the board and executive level.
For the 40% of the survey participants representing public companies, the recent SEC disclosure rule should be reason for concern. SEC rule 33-9089, which became effective February 28, 2010, requires that boards describe their risk oversight process. The new disclosure rules relate to, among other things, the relationship of a company’s compensation policies and practices to risk management, the background and qualifications of directors and director-nominees, board leadership structure, and the board’s role in risk oversight. The discussion of board-level oversight is a common theme at RIMS 2010 and promises to remain timely as the SEC continues to emphasize accountability moving forward.
Understand the entity’s risk philosophy and concur with the entity’s risk appetite
Know the extent to which management has established effective enterprise risk management of the organization
Review the entity’s portfolio of risk and consider it against the entity’s risk appetite
Be apprised of the most significant risks and whether management is responding appropriately
The last area is one that cruise line leader Carnival Corporation has taken to heart. In a recent interview with Eric Krell from Business Finance, Carnival’s vice president and chief audit executive Richard Brilliant explained how his team “has done a phenomenal job in developing a framework that enables us to provide risk reporting to the board that they never had before. The reporting not only allows directors to understand how risks are mitigated, but also provides ongoing risk monitoring as well as tracking of action plans for improvements.”
Brilliant says that presenting new, precise information to the board about the company’s overall ability to manage governance, risk, and compliance issues has really improved the dialogue about how the company could better respond to risk in the business. Further, Brilliant notes, “the board can also more clearly see over time how things have improved.”
The media recently has been rife with articles and commentary on what went wrong with BP’s horrific oil spill, Toyota’s multiple automobile recalls and stonewalling regulators, J&J’s recalls of multiple products along with allegations of withholding information from regulators, and Goldman Sachs’ disclosure settlement along with reports of its taking an adversarial posture with regulators and the public. Fingers also are pointed at basketball star LeBron James for his unseemly national news conference announcing his move to South Beach, and actor Mel Gibson’s words with his former girlfriend billing him as “The Worst Guy Ever.”
Moving back to the business context, the common underlying theme is bad actions by a company, or at least allegations of such, and attempts at crisis management by withholding information from regulators and the public. And quickly following is a public relations effort to cast the company and its executives in the most favorable light.
I’m not a PR expert by any means, but some who are seem to make a great deal of sense. One says “the companies that typically handle crises well, you never hear about them…. There’s not a lot of news when the company takes responsibility and moves on.” Certainly each situation is different and there’s a lot more to effectively dealing with a crisis than this, but it gets to the core of what I’ve seen to be a successful approach, which centers on taking responsibility rather than somehow trying to “spin” the facts. Some years ago my partners and I were called in to help a large financial services company deal with a crisis, where it was accused of a range of malpractices by regulators, with the scandal making headlines. Interestingly, the first reaction of senior management was to call in the company’s PR firm and let them handle it. Fortunately, after discussion, a different and ultimately much more effective decision was reached – to acknowledge the problem and roll up shirt sleeves to fix the underpinnings that caused it. That is, to focus on the company’s internal control system from top to bottom throughout the enterprise, to provide the infrastructure, procedures and protocols, along with cultural enhancements, to put the company on the right track. Indeed, with a great deal of effort, that’s what happened. The ultimate cost was very manageable and, importantly, the company retained its customers, regained its fine reputation, and continues to be a leader in the industry, viewed in the most favorable light. It wasn’t PR that did it, but rather getting to what went wrong and fixing it for the long term.
Okay, PR is important and crisis management a critical tool when done right. Actually, advance preparation for handling a crisis is a key factor as well, but that’s another subject altogether. Here the point is that getting infrastructure right, with risk management and internal controls in place with accountability, communication, and related aspects, is what really counts in dealing with a crisis, or better yet, avoiding one in the first place.
In a recent blog post, OpenPages’ Gordon Burnes pointed out that a major theme of the Dodd Frank legislation is “greater transparency into risk exposure across the financial system.” In fact, there are several major components of the law that will require financial services institutions to collect and report on risk exposure in their business.
The Financial Stability Oversight Council is a new regulatory body created by the law that is tasked with monitoring and regulating companies that are deemed by the Council to be “systemically important.” The Council has the authority to instruct the Federal Reserve to impose new requirements on systemically important companies such as increased capital and liquidity levels as well as disclosing risk practices, regulatory gaps and resolution plans or “living wills.” In its role as systemic risk monitor, the Council will collect risk data from various sources including Federal and State financial regulatory agencies and the newly created Office of Financial Research (OFR) – which will among other things be responsible for collecting data from financial services companies.
The Dodd-Frank law also calls for a Risk Committee to be established by all public non-bank financial companies, as well as all public bank holding companies with over $10B in assets. Supervised by the Board of Governors of the Federal Reserve, the Risk Committee will be held responsible for enterprise-wide risk management oversight and practices, and be required to include “at least 1 risk management expert having experience in identifying, assessing, and managing risk exposures of large, complex firms.”
To meet these requirements for risk exposure data, financial services institutions need an information architecture that provides full transparency and reporting for the Board, Risk Committee and potentially the OFR. If you’re looking to develop an information architecture that will meet the requirements of Dodd-Frank and new regulations to come, here are a few things to consider:
1. Create a central platform to pull all of the different data elements together and maintain the relationships between elements (RCSA, Loss Events, KRIs, Issue Management, Policy Management, etc.)
2. Establish a common taxonomy and library for policies, processes, risks, controls, regulatory requirements and other key data elements
3. Integrate multiple areas of risk (operational, compliance, strategic, etc.) to provide aggregated analysis and full reporting of all risks across the enterprise
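The first two steps above can be sketched as a toy data model: a central registry that holds every element type in one taxonomy and maintains the relationships between them. This is purely illustrative; the class, element types, and names are hypothetical, not the OpenPages schema:

```python
# Illustrative sketch of a central risk/compliance registry (made-up
# data model, not any vendor's): one store for risks, controls, and
# policies, plus the links between them.
from collections import defaultdict

class RiskRegistry:
    def __init__(self):
        self.elements = {}             # id -> {"type": ..., "name": ...}
        self.links = defaultdict(set)  # id -> set of related element ids

    def add(self, elem_id, elem_type, name):
        self.elements[elem_id] = {"type": elem_type, "name": name}

    def link(self, a, b):
        # Relationships are bidirectional: a control mitigates a risk,
        # and the risk is mitigated by that control.
        self.links[a].add(b)
        self.links[b].add(a)

    def related(self, elem_id, elem_type):
        # All related elements of a given type, e.g. every control
        # linked to a risk - the basis for aggregated reporting.
        return [self.elements[i]["name"]
                for i in self.links[elem_id]
                if self.elements[i]["type"] == elem_type]

reg = RiskRegistry()
reg.add("R1", "risk", "Unauthorized payments")
reg.add("C1", "control", "Dual approval of invoices")
reg.add("P1", "policy", "Anti-bribery policy")
reg.link("R1", "C1")
reg.link("R1", "P1")
print(reg.related("R1", "control"))  # -> ['Dual approval of invoices']
```

Because every element lives in one store with explicit links, a report can traverse from a regulatory requirement to the policies, risks, and controls it touches, which is exactly what siloed point solutions make difficult.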
It’s Day Three of RiskMinds in Geneva. “Risk Modeling, Measurement and Management in the New World Order” is the topic of the day. We’re in a session on operational risk, “Operational Risk Management Going Forward”, hosted by Joachim Pfeiffer, Commerzbank, and John Whittaker, Barclays.
This is an interactive session, with questions coming from the attendees. Whittaker kicks off the session by noting that many in the conference have said that operational risk is off the agenda as the roots of the crisis lay elsewhere (credit, in particular). Whittaker disagrees, pointing out that impairment has gone up at most banks, and if you look at the root causes of the impairment, certainly credit decisions played a significant part, but so did poor process (collateral not in place) or process not followed.
Whittaker also discussed his concern about regulatory change over the next year. In the coming year, for instance, the “living wills” discussion is of concern, with more hand-offs and hand-ins. He’s also concerned about whether we will have enough people to deal with the extra process.
The rate of change is also a concern. Whittaker notes that organizations will have to be able to quickly modify processes to keep up with regulatory change. The question is whether organizations can keep up.
New product introduction: participants described a process whereby support functions have the ability to veto a new product that can’t easily be supported.
People issues: How do we deal with risk culture? For instance, should we have stage gates for bonus holdback? Tone at the top is really important. People’s bonuses should be calculated based on a risk-adjusted P&L. Companies are also beginning to think about earnings quality. Regulators are focusing on a use test: how are companies using risk information?
Operational risk capital number: How does this influence the function? The operational risk component of economic capital is fairly small, so it’s difficult to influence policy through the capital calculation discussion. Operational risk could be a much larger component of regulatory capital, however, and operational risk could have more to say in that discussion. Scenarios drive a discussion with the business, potentially enhancing the profitability of the line.
Benefits of AMA: Capital allocation of 12.7 for a TSA bank versus 10.8 for an AMA bank. Huge benefit to putting that capital to work.
BIS range of practices in scenarios: US regulators are not accepting scenarios. Is this a trend? Whittaker thinks no, and that US regulators over time will gravitate towards the European point of view. Pfeiffer points out that scenarios are hard to include in a capital model when you need 99.9% confidence.
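A toy sketch helps show why the 99.9% confidence level is so demanding. In a loss distribution approach, annual losses are simulated as a Poisson event count times lognormal severities, and capital is read off deep in the tail, where only a handful of simulated years carry the estimate. Every parameter below is made up for illustration, not any bank’s calibration:

```python
# Hedged loss distribution approach (LDA) sketch with illustrative
# parameters: Poisson frequency, lognormal severity, capital taken
# as the 99.9th percentile of simulated annual loss.
import math
import random

random.seed(0)

def poisson(lam):
    # Knuth's algorithm for a Poisson draw (stdlib random has none).
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

YEARS = 20_000
annual = []
for _ in range(YEARS):
    n = poisson(10)  # number of loss events this year
    # Lognormal severities give the fat right tail typical of op risk.
    annual.append(sum(random.lognormvariate(10.0, 2.0) for _ in range(n)))

annual.sort()
mean = sum(annual) / YEARS
p999 = annual[int(0.999 * YEARS)]  # 99.9th percentile as capital proxy
print(f"mean annual loss:   {mean:,.0f}")
print(f"99.9th pct capital: {p999:,.0f}")
```

The tail estimate here rests on roughly the worst 20 of 20,000 simulated years, which is exactly why scenario inputs at this confidence level are so hard to validate: small changes in tail assumptions swing the capital number dramatically.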
Loss-based models: Reduce loss without reducing risk. Do we need to move to a causal model? Whittaker believes that everyone will have to have a hybrid model, including both loss data and scenarios.
Internal control systems: Operational risk can make sure that there’s a single control process throughout the organization so that the person operating the control can spend more time on the control and less time answering questions about whether the control is effective. Also, there should be some consideration given to the way different controls are tested.
The Senate today voted 60-38 to end debate on the Financial Regulation Reform Bill and move to final passage later today before it heads to President Obama’s desk. In addition to increased power to monitor systemic risk in banks, the Bill gives regulators the ability to step in and break up or seize the assets of financial institutions deemed to be at risk of failing and posing a threat to the financial system. It also promises to create a new federal agency called the Consumer Financial Protection Bureau (CFPB), which will police loans and financial services products that banks and others sell to consumers. This morning’s vote was primarily Democratic (all but one Democrat supported the bill), and Republicans for the most part are claiming that it overextends the power of the government, which, they argue, will in the long run cost banks a significant amount of money in meeting the new regulations and reporting requirements.
The Huffington Post is reporting that, “a team of Goldman Sachs analysts predicted in a Tuesday research note that the legislation will annually cost Bank of America about $4.4 billion, Citi about $3.7 billion, JPMorgan about $5.3 billion, Morgan Stanley about $900 million, and Wells Fargo about $2.2 billion.”
The bill seems certain to pass the final Senate vote later today and Obama is ready to sign it. The ultimate impact on the risk and compliance management market is yet to be determined, but one thing is for certain, the era of deregulation is officially over.
There’s been a good deal of discussion recently about organizational location and reporting lines for a company’s compliance function. Some are stand alone, though many are embedded within the legal department, with concern of legal privilege among the considerations. Some report to the CEO, though for many others the reporting line is to another senior executive. And to further complicate matters, some compliance functions also have responsibility for ethics, with some being asked to take on even greater responsibility.
Certainly there are pros and cons to each organizational structure. What I’d like to focus on here is the critical relevance of a few key factors. One is to be sure a chief compliance officer, wherever he or she appears on the company organization chart, has the ability to bring relevant information directly to the chief executive and where necessary the board of directors. Depending on the nature of identified non-compliance events or associated risks, such access is essential. Also relevant are the recent amendments to the U.S. Sentencing Guidelines, which call for the compliance officer to report regularly to upper management and the board of directors or audit committee.
Another key factor is clarity around the compliance office’s scope of responsibility. Is it responsible for establishing a process for effecting compliance with all relevant laws and regulations to which the company is subject? That’s a good start. Does the scope include compliance with internal policies? That’s typically the case as well, and makes sense. But do the CEO and board think the compliance office can possibly ensure compliance? You and I know it can’t – the compliance function needs to focus on process and protocols, with direct responsibility for effecting compliance resting with line and staff unit leadership. Clarity around responsibility is essential. Amazingly, some company boards are looking to the compliance function to also take on responsibility for enterprise risk management! Fortunately chief compliance officers have fought the attempt, for good reason.
And another factor is the compliance function’s relationships with the legal and ethics functions, if separate. Certainly compliance processes must adequately reflect the legal and regulatory realities, and we know there often is a fine line between – and sometimes a forerunner or impetus for – unethical behavior crossing over to illegality. So clearly there must be close coordination to ensure information flows, policies, procedures and reporting mechanisms are in sync.
Of course each company needs to determine organization, reporting and responsibility for compliance to fit its own culture, management style and personnel. Getting this right will serve your organization well.
PwC surveyed the chief audit executives (CAEs) of Fortune 250 companies about trends likely to affect internal auditors over the next five years and what they expect internal audit to look like in 2012. Titled “Internal Audit 2012”, the study lists “ten imperatives” that provide the foundation for a high performance internal audit function in the years ahead including:
“Take an integrated approach to IT audit, one designed to strengthen IT capabilities. IT audit strategies need to lay the groundwork for integrating IT audit expertise within audit teams. An IT audit plan should center on an annual IT risk assessment, reflecting a clear linkage between IT risk assessments and IT audit planning. In addition, it should address risks within individual business processes and provide for continuous enhancement of IT audit capabilities. It’s also important for the plan to be clearly articulated, formally documented, and well aligned with organizational IT strategies and objectives.”
One of the key roadblocks to an integrated approach to IT audit is the sheer complexity of data gathering and management. In the past, it represented a tremendous amount of effort for internal audit to collect relevant information and to govern access to that information securely. A centralized technology platform for identifying, assessing and monitoring risk and controls presents a unique and unprecedented opportunity to help the business focus on making risk decisions based on management’s risk appetite and tolerances.
This common framework and process can make the business more predictable in meeting IT, financial and management objectives and can help managers anticipate major risk and control problems of the future. As a partner with IT and the business in managing risk, internal audit should be a driving factor in evaluating technological and process-based changes and evolving the organization’s risk management practices.
Washington DC played host to the 2010 Gartner Security and Risk Management Summit this week. At the event, Gartner Research Vice President French Caldwell provided a new twist on audience interaction with live polling via cell phone texting. In his session titled “Selecting and Applying GRC Frameworks and Standards,” French polled the audience on “which areas are you most likely to apply standards?” Not surprisingly, IT risk and IT security ranked highest, followed by regulatory compliance and enterprise risk. With respect to ERM, French then asked, “which ERM standard is most commonly used in your company?” The largest response was “none!” Fortunately, this was closely followed by COSO ERM, custom or self-defined frameworks, and ISO 31000.
In a separate, lively and entertaining session titled “Research Factory,” French moderated a panel of Gartner analysts in a close-up look at how Gartner analysts propose and debate the merits of a new research topic. French again polled the audience on which proposed research topic was most relevant and had the best chance of being fulfilled. Each analyst had four minutes to propose their topic and defend it against the debunkers on the panel. When all topics were complete, the audience voted on who presented and defended their topic the best. The winner was Research VP Jay Heiser, who in his proposal contended that there is a strong likelihood of a failure/data loss from a SaaS product or Cloud Service in the next few years having a major business impact on its subscribers.
Regardless of whether Jay’s prediction comes to fruition, clearly a strong case can be made for a detailed risk assessment of your SaaS and Cloud Services data protection processes.
Is risk management a strategic differentiator? When Toyota shifted the culture to one that valued and rewarded volume production, did it lose sight of quality as a strategic differentiator? Is Kermit the Frog a risk manager?
In the first installment of a multi-part Risk Chat with Eric Krell of the Big Fat Finance Blog, we touched on several such pressing topics. Check out Part One.
Over the last few weeks, several studies have emerged that indicate a growing demand for risk management. In one of the better ones, in late July, Accenture published their study on Risk Management. Entitled Managing Risk for High Performance in Extraordinary Times, Accenture surfaces several major findings, a few summarized here:
1. Risk management capabilities are not up to today’s challenges.
2. Risk management is too separate from the business and not integrated with day to day operations.
3. Companies are expecting to invest more in their risk management capabilities, despite the fact that many budgets are decreasing.
The study’s participants included 250 of the world’s largest companies; Accenture appears to have spoken with a representative sample of global businesses. Many of the study’s participants were focused on trying to align risk management with strategic objectives. Over 90% of respondents said that one of risk management’s primary challenges in the next two years is being aligned with overall business strategy. In other words, how can risk management help the business take on more risk?
This makes sense. As companies look to exploit competitors’ weaknesses during this downturn, they’re likely to want to move into new markets, offer new products, develop new channels. But these kinds of strategic initiatives will be coupled with additional risks that must be managed. We have always talked about the importance of integrating risk management into day to day operations and have developed software to enable business users to incorporate risk management into their daily activities. As companies take on more risk to capitalize on opportunities afforded by the current downturn, they will be looking for programmatic ways to have business managers identify and manage risks associated with these new opportunities. From the current financial crisis, we now have many good examples of what happens when companies discount the risk management process, which is one reason risk management, both in terms of personnel and technology, is in greater demand today than a year ago.
No doubt you know that the Dodd-Frank Wall Street Reform and Consumer Protection Act has been signed into law, with at least some ramifications for every public company. Space here doesn’t permit an overview, and in any event you’ve probably already received highlights of the new law from one or more advisory firms. Among the more interesting aspects of new requirements is how the authority of corporate shareholders has risen, in a number of significant ways:
Say on pay: Shareholders now will get to vote on whether they’re satisfied with executive compensation. And the same holds for so-called “golden parachutes” related to such transactions as sales or mergers of the company. While these are only non-binding advisory votes, compensation committees and full boards will certainly think twice before continuing with compensation voted down by the company’s owners, who also vote on whether sitting directors should be re-elected. As such, we can expect to see boards more receptive to views of shareholders, especially major ones, on executive compensation programs.
Additional executive compensation disclosures: Public companies also will need to provide more detail about how executive pay relates to the company’s financial performance. Additionally, disclosure will be required of the ratio of the CEO’s total compensation to the median total compensation of all other employees. There’s little doubt that shareholders will be focusing closely on this information and reacting to it in the voting process.
Elimination of broker discretionary voting: Stock exchange rules will now extend beyond the current NYSE rules to prohibit discretionary broker voting in board elections as well as on executive compensation and other significant matters. Because brokers typically voted in favor of company initiatives, shareholders will have more say in what transpires.
Proxy access: Perhaps most significant, the SEC is authorized to allow shareholders to use proxy materials to nominate their own directors. While we don’t know exactly what the SEC will do in this regard, we can expect that shareholders will have a greater say in who sits in the boardroom.
These of course are just some of the elements of the new law, whose impact ultimately will be determined by numerous studies to be undertaken and regulations to be issued. One thing, however, is clear: shareholder authority continues to grow, and companies and their boards will continue the trend of opening channels of communication with shareholders.
Managing risk and compliance in silos is both cumbersome and costly. Implementing a new technology point-solution for each new regulation or risk discipline limits an organization’s ability to streamline risk and compliance processes and reduce costs. It also obscures the opportunity to integrate risk and compliance to gain a holistic view of the firm’s risk landscape.
At OPUS 2010, Chris Haines, Vice President, Operational Risk Management Group at American Express, discussed how the company has effectively leveraged OpenPages technology in its efforts to converge risk management disciplines and best practices across the enterprise. American Express uses OpenPages to create an integrated, converged risk and compliance management program that streamlines and improves its risk management processes. The Operational Risk Model employed by American Express gives management greater visibility into risk and empowers it to make strategic business decisions based on a broader understanding of the firm’s risk profile.
Whatever risk disciplines are significant within your firm, the goal is to integrate them within a single framework that produces a holistic view of your risk landscape. While most leading companies have tailored their risk methodologies to match their business operations, it is imperative to select a technology solution that can easily adapt to your firm’s unique risk and compliance methodology and evolve gracefully over time.
The ability to adapt the technology solution to your company’s specific risk management methodology and framework, without having to write custom code, is critical. The key business benefits of flexible configuration include:
Lower costs: Custom code is more expensive to develop for initial implementation and much more expensive to maintain and extend over time.
Time to deployment: Configuration can support rapid implementation at a fraction of the time compared with writing custom code.
Future proofing: Configuration will allow you to quickly adapt your risk framework to meet changing requirements while minimizing the impact on your business operations.
The extent to which your technology platform is configurable is arguably the most important decision criterion for selecting a solution.
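To make the configuration-over-custom-code argument concrete, here is a minimal, hypothetical sketch (not any vendor’s actual API, and all names are illustrative) of what a configuration-driven risk rating might look like: the scoring scales and rating bands live in data, so changing the methodology means editing the configuration rather than rewriting and redeploying code.

```python
# Hypothetical configuration a risk team might maintain. Adding a new
# rating band, scale value, or discipline is a config change, not a
# code change - this is the "flexible configuration" benefit in practice.
RISK_CONFIG = {
    "likelihood_scale": {"rare": 1, "possible": 2, "likely": 3},
    "impact_scale": {"minor": 1, "moderate": 2, "severe": 3},
    # Bands are (maximum score, label), checked in ascending order.
    "rating_bands": [
        (2, "Low"),
        (5, "Medium"),
        (9, "High"),
    ],
}

def rate_risk(likelihood: str, impact: str, config: dict = RISK_CONFIG) -> str:
    """Score a risk as likelihood x impact, then map the score to a
    rating band defined entirely in the configuration."""
    score = (config["likelihood_scale"][likelihood]
             * config["impact_scale"][impact])
    for max_score, label in config["rating_bands"]:
        if score <= max_score:
            return label
    return config["rating_bands"][-1][1]

print(rate_risk("likely", "severe"))   # 3 * 3 = 9 -> "High"
print(rate_risk("rare", "moderate"))   # 1 * 2 = 2 -> "Low"
```

The point of the sketch is the division of labor: the function never changes, while the methodology it implements is entirely data-driven, which is what makes rapid deployment and future-proofing possible.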
Recently, much has been written about the fate of financial services technology spending given the recent financial crisis. The Wall Street Journal’s Business Technology blog, for instance, points out here that Lehman spent $309 million on technology and communications in the quarter ending August 31. It’s hard to know exactly how much of that spending would be cut under a dramatically reduced operation under Barclays, but clearly, at Lehman and elsewhere tech spending’s going to take a hit in the financial services sector.
However, there is one technology area that will certainly get increased attention and that is in risk management. It’s very likely that 2009 regulation will include greater checks on leverage and an expansion of banking-like regulation to other businesses with banking-like activities. And regulators are already focused on improving the risk management functions of financial services institutions. For instance, WaMu announced on Sept 8th that they had signed an MOU with the Office of Thrift Supervision concerning different areas of the business, including the risk and compliance functions.
Risk management technology, the systems that provide visibility into the state of risk in the business, is a critical component, or early warning system, for risk managers trying to run the business. Of course, knowing about the risks is not always sufficient. Just ask David Andrukonis of Freddie Mac, whose CEO apparently ignored the early warning signs of excess risk exposure, according to the New York Times. Nevertheless, having the risk management infrastructure in place at least allows management to make informed decisions about which risks to take or not.
And there’s another driver here for risk management technology. Over time, shareholders, not just regulators, will want to have better visibility into the risk exposures in a company. The Fed demonstrated that they are willing to let large entities fail (well, sort of), and as such it will be up to the market to assess risk in the business. Management will be encouraged to provide transparency as to the state of risk in the business through a lower cost of capital, a benefit that would dwarf the cost of any risk management technology. That is why I think spending on risk management technology will not drop as much as the overall market for financial services IT spending.
We’ve discussed in this blog the role of IT in GRC, mostly in terms of how IT manages risk inherent in delivering IT services. But there’s another risk that IT should be addressing, and that is the risk of disparate risk data marts scattered across the enterprise. I’ve written about it here.