Today we announced the availability of OpenPages 6.0. This release represents a significant new phase in the evolution of GRC and provides organizations with the insight needed to drive business outcomes as well as the ability to manage effectively through the changing regulatory environment. We’re also excited to have completed the first phase of technical integration with IBM with the release of AIX support.
The GRC market developed out of the tactical, departmental deployment of SOX and other compliance and risk management solutions. Companies realized that they could leverage their control testing and risk assessment activities across multiple different oversight functions by consolidating their risk and compliance efforts on a common technology platform. Indeed, we’ve seen very strong ROIs for Enterprise GRC platforms, ROIs driven by this efficiency. The next phase in the evolution of GRC is about insight, using the GRC data to help drive business outcomes.
Here’s an example of how GRC data can be used to drive business outcomes. Imagine a multinational bank that has a subsidiary in France. The compliance team has identified some procedure violations with regard to the handling of customer account data. The audit team has found some major control weaknesses surrounding customer account data, and the operational risk team has observed some KRIs above threshold. Any one of those functions might not escalate its particular findings, but, taken together, they would let the GM in France see that the business is at serious risk of a significant loss. This is the kind of insight that can help drive business performance, in this case avoiding a fine and a loss of brand stature.
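To make this concrete, here is a minimal sketch (in Python, with entirely hypothetical signal names, scales and thresholds, not an OpenPages API) of how findings that sit below each function's individual escalation bar can still cross an aggregate threshold when viewed together:

```python
# Illustrative sketch only: hypothetical signal names and thresholds.
# Shows how findings siloed in three oversight functions can combine
# into an escalation that no single function would raise on its own.

# Severity of open findings per function, on a shared 0-5 scale (assumed).
signals = {
    "compliance": 3,   # procedure violations around customer account data
    "audit": 4,        # major control weaknesses, same data
    "op_risk": 3,      # KRIs above threshold
}

ESCALATE_SINGLE = 5    # no single function escalates below this severity
ESCALATE_COMBINED = 9  # but the aggregate view crosses this line

def should_escalate(signals):
    """Escalate if any one signal is severe, or the combined picture is."""
    if max(signals.values()) >= ESCALATE_SINGLE:
        return True
    return sum(signals.values()) >= ESCALATE_COMBINED

print(should_escalate(signals))  # True: 3 + 4 + 3 = 10 >= 9
```

The point is not the arithmetic but the architecture: the aggregate check can only run at all if the three functions feed a common platform.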
OpenPages 6.0 will provide better insight through enhanced business intelligence. The power user will benefit from easier report building and in-context data presentation through Cognos mash-up services. The business user will benefit from interactive dashboards, and the executive from data syndication through Office and mobile devices. We’ll discuss some of the other new capabilities in 6.0 in subsequent blogs.
Yesterday, we announced a joint business relationship with PwC. This is the result of our closer alignment in the market for GRC solutions. We’re proud to be associated with such a great firm: with over $26 billion in revenue and 163,000 people in 151 countries, PwC has a strong global presence. We’ve found that PwC also has a strong presence at our financial services customers, and, given the challenges facing that industry, we think there’s a great opportunity to deliver joint solutions to our common customers.
OpenPages’ solutions inherently deliver a risk-based approach to GRC. This approach aligns perfectly with PwC’s top-down approach to GRC. They’re always asking the question, “What are your business objectives and what are you doing to achieve them?” We find that many service providers in the GRC business tend to take a bottom-up approach, implementing a comprehensive controls infrastructure, for instance, without making sure that the right controls are being implemented or that the right business objectives are being met. Given the financial constraints facing many customers, allocating resources effectively is a critical success factor for GRC programs, and we look forward to working with PwC to help our joint customers operationalize those programs for better business outcomes.
Now that healthcare reform has passed, the Obama administration has turned its focus to financial services regulatory reform. Today, President Obama gave a speech on the administration’s position and priorities. The House has already passed a bill, and the Senate may take up its own, largely authored by Senator Dodd, this week. A major sticking point has been the fund to facilitate an orderly liquidation (labeled a “bailout” fund by some critics) and the way to handle derivatives, but Senator Grassley’s vote yesterday to approve a Senate committee’s plan for derivatives trading gave new momentum to a bipartisan effort on regulatory reform, and it looks increasingly likely that in the coming months (if not weeks) we’ll see a major overhaul of the regulations that govern Wall Street.
Further, the SEC demonstrated late last week that it is one government agency that is going to take its oversight responsibilities seriously. Its civil suit against Wall Street giant Goldman Sachs sent shock waves through the financial services sector. It’s clear that there’s a major shift underway in the way regulators are regulating. Whether or not you agree with the merits of the suit, SEC Chair Schapiro is sending a message to the industry that the agency is going to be watching closely.
A common theme here is transparency: the SEC argues that Goldman didn’t provide adequate disclosure about the nature of the Abacus investment opportunity; Obama argued today that “reform would bring new transparency to many financial markets.” We also see this as a common theme with our customers: they are looking for greater transparency into the risks in their business. We see this push for regulatory reform and increased oversight as driving the demand for a new information architecture, one that provides this transparency to managers, executives, board members and regulators. Of course, many companies are finding that it can help them run their business better, too.
One of the key themes that developed during 2009 was that risk management is more crucial than ever to organizations, and failing to deal with it is not an option. Companies are seeking ways to gain a more complete picture of risk, assess exposures across business lines and aggregate these into a firm-wide view. Collaboration with and support from the business lines is critical to achieve these goals as we discussed in #2 on our list: “Better Collaboration with the Business.” But if you are looking for better collaboration and you’re investing in risk management systems (#6), you probably can also relate to #9 in the 2010 GRC Wish List: “Risk Applications that are Easily Adopted by the Business.”
How do you support adoption of your risk management application by the business? Here are a couple of things you might want to consider:
Involve the business in the application selection and implementation process. Participation by the business is a great way to build commitment and you will usually find that they have some great ideas too.
Select a solution that can easily adapt to your methodology. GRC solutions should be enablers that support your risk management practices. Technology should not force your users to change the way they do business.
Deploy a solution that is intuitive and easy to use. Most business professionals are technically competent but they are not “power users.” Make sure that your risk management solution is easy to learn and use. In addition, most business people will be infrequent users, so pay particular attention to how quickly and easily users can accomplish their specific tasks.
Focus on Usability first, User Experience second. Usability focuses on the factors that affect the user’s ability to understand and do things in the application. User experience focuses on providing an engaging, fun, pleasant, empowering and inspired experience. Usability is critically important for your business users and will greatly determine the extent to which they adopt your risk management solution. User Experience is nice, but save it for your company’s web site.
Providing a risk management solution that is easily adopted by your business users will be a key enabler for achieving actionable risk management: where risk and compliance activities are an integral part of everyday business operations.
Is your current risk application enhancing your risk management practices or getting in the way? Let us know about your experiences with deploying risk management applications and what has helped or hindered their adoption.
Fed Chairman Ben Bernanke went on the offensive yesterday at the annual meeting of the American Economic Association, arguing that lax regulatory oversight, not loose monetary policy, led to the housing bubble and subsequent financial crisis. You can read his remarks here.
After working behind the scenes for most of the fall, lobbying legislators one-on-one, Bernanke took a very public position yesterday, blaming the rise in housing prices on alternative types of variable-rate mortgages, whose low initial payments generated more demand than prevailing interest rates alone would have supported.
Bernanke argued that “stronger regulation and supervision aimed at problems with underwriting practices and lenders’ risk management would have been a more effective and surgical approach to constraining the housing bubble than a general increase in interest rates.”
Further, he said that “the lesson I take from this experience is not that financial regulation and supervision are ineffective for controlling emerging risks, but that their execution must be better and smarter.”
To some extent he’s trying to deflect the spotlight onto the other regulatory agencies charged with overseeing the origination of different kinds of mortgages. But Bernanke can’t have it both ways. He’s argued in the past that the Fed has a role in consumer financial protection and has lobbied against the CFPA, so, if the Fed’s mandate really does extend to the financial consumer, why did he let these low-monthly-payment mortgages proliferate? While he was convincing that factors beyond monetary policy led to the housing bubble, he was less clear on what kind of regulatory structure would have prevented the bubble and how we should move forward on consumer financial protection. At this point, my bet is that the CFPA has enough momentum to pass with financial reg reform.
In 1958, IBM researcher Hans Peter Luhn first introduced the term business intelligence (BI) in an article he contributed to the IBM Journal. He described business intelligence as "the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal." Clearly, business intelligence plays a key role in risk management, giving executive-level decision-makers the ability to look across all categories of risk (in different business units, categories, geographies, etc.) and providing a global view into business performance and risk exposure.
Business intelligence and risk management are linked on two levels. First, used in conjunction they provide executive-level transparency into risks within the organization; second, they give product planners and corporate strategists a risk-adjusted view of performance.
If you’re considering a risk management solution, you might want to listen to a recent IT-Finance Connection podcast on the role of business intelligence in risk management. Additionally, here are some tips on leveraging risk management practices to provide stronger, more introspective BI analysis:
Identify and eliminate risk factors and exposure points within the organization to create a strong foundation/base.
Examine opportunities related to taking strategic risks within the business (new products, launches into new geographies/industries, M&A, etc.).
Assess the potential risk exposures tied to moving forward with strategic company direction and initiatives.
Apply this risk management analysis to your overall business intelligence framework to give executive management and the management board a clear view not just of the company’s risk exposure (and where risks have been eliminated altogether), but of where there is an opportunity to take strategic risks, with the added layer of business intelligence needed to make smarter business decisions.
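As an illustration of that last step, here is a small sketch (all figures invented) of a simplified RAROC-style calculation, one common way to fold risk exposure into a performance view so two initiatives can be compared on a risk-adjusted basis:

```python
# Hedged illustration: a simplified RAROC-style calculation. Every number
# below is hypothetical; real economic-capital models are far richer.

def raroc(revenue, costs, expected_loss, economic_capital):
    """Risk-adjusted return on capital (simplified)."""
    return (revenue - costs - expected_loss) / economic_capital

# Initiative A: higher revenue, but far riskier (bigger expected loss, more capital).
a = raroc(revenue=10.0, costs=4.0, expected_loss=3.0, economic_capital=20.0)
# Initiative B: lower revenue, much lower risk.
b = raroc(revenue=6.0, costs=3.0, expected_loss=0.5, economic_capital=10.0)

print(f"A: {a:.2%}, B: {b:.2%}")  # A: 15.00%, B: 25.00%
```

On raw revenue, initiative A looks better; risk-adjusted, B does. That reversal is exactly the added layer of insight the tips above argue BI should surface for executives.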
The Institute of Internal Auditors 2009 General Audit Management Conference is coming up and should be quite timely given the evolving role that Audit is playing in providing an independent assessment of enterprise risk and governance. The conference has some intriguing sessions, including:
As you can see, internal audit has evolved from its traditional role of record examination and identification of policy violations to a more modern, consultative approach aimed at risk mitigation. As part of this evolutionary process, internal auditors have also focused more of their efforts on the risk assessment process and a top-down approach to audit scoping.
One of the key roadblocks to an integrated approach was the sheer complexity of data gathering and management. In the past, it represented a tremendous amount of effort for internal audit to collect relevant information and to govern access to that information securely. A centralized technology platform for identifying, assessing and monitoring risk and controls presents a unique and unprecedented opportunity to help the business focus on making risk decisions based on management’s risk appetite and tolerances. This common framework and process can make the business more predictable in meeting financial and management objectives and can help managers anticipate major risk and control problems of the future.
As a partner with the business in managing risk, internal audit should be a driving factor in evaluating technological and process-based changes and evolving the organization’s risk management practices.
If you’re planning on attending IIA GAM March 16-18 in Washington, DC please visit the OpenPages booth. And don’t forget to enter the raffle for a Flip handheld video recorder. Or, to learn more download our informative white paper, Internal Audit and its Evolving Role in ERM.
Or more to the point, was he thinking at all? We’re talking about Rajat Gupta, operating at the highest echelons of multinational business, who finds himself charged by the Securities and Exchange Commission with illegally passing inside information to Raj Rajaratnam, the Galleon Group founder about to go on trial on charges of insider trading. Mr. Gupta, a Harvard Business School graduate and former head of McKinsey & Co., has been a board member of the likes of Goldman Sachs, Procter & Gamble, and American Airlines.
What did he do? Well, he of course is innocent until proven guilty, and according to media reports, his lawyer says he has done nothing wrong. But the SEC says otherwise. It alleges Gupta gave Rajaratnam advance information about earnings at both Goldman and P&G. On top of that, the SEC maintains that Gupta called the Galleon head with the inside scoop on the Goldman board’s approval of Warren Buffett’s $5 billion investment in the firm. The allegations speak to multiple phone calls between the two men, enabling Galleon to reap millions in profits. What must be particularly troubling for both is that the SEC says it has recordings of numerous telephone conversations.
Let’s presume for a moment that the allegations are factual. A relevant question: is this a black eye on the companies on whose boards Gupta sat? (By the way, the reports say he resigned from the Goldman board months ago, and recently from P&G’s.) My answer, based on the information available, is “no.” Certainly, if the allegations are true, a statement by the SEC’s Director of Enforcement is on point: “Mr. Gupta was honored with the highest trust of leading public companies, and he betrayed that trust by disclosing their most sensitive and valuable secrets.” But what could or should have been done to prevent wrongdoing at the board level?
We know well the importance of a company’s board of directors in keeping a close eye on what the CEO and senior management team do, and on the company’s system of internal control. We recognize the importance of compliance officers, risk officers and internal audit functions. But who keeps an eye on the board, especially when their actions are outside the inner workings of the company itself? We can look to what happened years ago at HP, when a board member leaked information to the media, which resulted in the pretexting fiasco.
There are no immediate answers, other than to continue to ensure full vetting of director candidates and to maintain effective board and internal audit processes that can best identify and manage potential misbehavior. With the thousands of directors of major companies acting with extraordinary integrity and ethics and in the best interests of their companies and shareholders, I believe we don’t have much to worry about. But it is worth more thought going forward.
You’ve surely heard about Goldman Sachs’ settlement with the SEC on fraud charges related to the firm’s disclosure, or lack thereof, of a collateralized debt obligation that purportedly was designed to fail. The $550 million to be paid may seem like a lot, and indeed is said to be the largest SEC fine against a Wall Street bank, but many observers maintain that the firm got off easy, especially when the amount is viewed in light of Goldman’s revenue and profits.
But there’s another way in which Goldman seems to have dodged a bullet. While other companies have had to accept a government-appointed monitor working inside the organization, Goldman won’t be subject to such meddling. In my mind, avoiding this kind of intrusive interloping is at least as significant as the manageable size of the fine, especially for a firm as sophisticated as Goldman Sachs.
There is, however, a requirement that for three years Goldman file an annual certificate attesting that the firm is in compliance with the terms of the settlement. Of considerable interest is that the certificate is to be signed by the firm’s general counsel or global head of compliance. Some pundits are saying this makes eminent sense, while others take the position that the CEO or the board, who are ultimately responsible for ensuring compliance, should be the ones putting their signatures on the dotted line. In any event, all this puts more of a spotlight on chief compliance officers and compliance programs. One former chief compliance officer reportedly said the SEC “seems to be attempting to elevate importance of the chief compliance officer role,” while an active compliance chief says the settlement shows that compliance officers “are becoming true C-suite level executives.”
There’s a lot going on here, and we can expect to see the focus on compliance officers ratcheting up further going forward.
We’re nearing the second anniversary of SAP’s purchase of Virsa and its serious entry into the GRC space. Last week, SAP made a series of announcements about its GRC products, which now extend beyond industry apps and the SOD/access-control arena to other areas of GRC. Business Finance has a new GRC blog and covered SAP’s announcements. John Cummings notes that "the sheer scope of GRC offerings from SAP and other enterprise software providers is impressive, and point-solution vendors will need all of their agility to respond."
Certainly, we wouldn’t argue with that statement, but we would say that one of the most important parts of a GRC solution is how it fits into the rest of the system. While SAP (and maybe Oracle) might be able to argue that you should be single-threaded on SAP, the rest of us cannot make that argument, so we have to play nice in the sandbox and 1) fit into the existing (heterogeneous) environment and 2) work across silos. This latter point is critical because what the enterprise GRC platform vendors are delivering is a way to see risk across the organization. When SAP demonstrates its risk management application, it focuses on controls associated with a sales process; that’s a very different solution, a tightly integrated top-to-bottom solution, but not very good at crossing silos. And, as I blogged earlier in the week, the real value in risk management comes from relating risks together at the top of the business. Of course, we’re not an ERP vendor, but you have to wonder if you want the fox guarding the hen house.
The noon panel at GARP discussed risk and performance management, with a diverse set of participants, including representation from Hess, Swiss Re, and Vanguard.
Kanwardeep Ahluwalia from Swiss Re noted that many companies are going through a derisking process right now. However, Ahluwalia cautioned that companies need to be cognizant of how much they are paying to reduce their risk. In many cases, especially now, it may make more sense to manage the risk internally to maximize performance.
What is the role of risk management in the budget process? Panelists suggested that during the budgetary process risk management should step up and call out inconsistencies between risk and performance goals. The moderator, Kevin Buehler from McKinsey, noted that he has often found that companies in trouble have misaligned expectations between risk and reward. For instance, a company may have aggressive revenue goals to take share in a particular (emerging) market, but those goals may be in conflict with a risk-adjusted return on capital. He acknowledged that risk management does not typically win out in a conflict in which the CEO is on the other side, but you have to force the dialog.
Jonathan Stein from Hess argued that risk management needs to move beyond the “be careful” mantra and into recommendations for risk mitigation. He talked about the importance of developing scenarios that help define triggers for risk mitigation actions.
In general, the message from the panelists was that deeper interaction with the business allows risk managers to be more effective. This includes everything from designing risk management processes around the way the business makes money to prompting a dialog at the executive level when risk and performance expectations are not aligned.
Risk management is a hot topic at Davos this year. Over on the Forbes blog, Paul Maidment notes that companies are thinking about how to improve their risk management approach, prompted in part by the new SEC proxy disclosure rules, though many are opting not to have a so-called risk committee. Maidment notes that management is responsible for educating the board as to the state of risk exposure in the company. We would argue that there’s a step that has to happen first: companies have to put in place an information architecture that can provide transparency into that exposure in the first place. A rat’s nest of Excel spreadsheets won’t do the job.
Coincident with Davos, PwC released its 13th annual global CEO survey, which found an uptick in CEO sentiment worldwide. The survey also found that over 83% of companies are planning ‘a major change’ to their risk management approach. This is higher than for any other aspect of their strategy, organization or operating model. Clearly, we’ve reached the tipping point on risk management. Companies that don’t address this critical area of their business risk being left behind.
On a daily basis, we’re out speaking with prospects, customers, analysts, press, and thought leaders in the GRC/ERM space. Over the course of the last year, we’ve heard many myths about risk management, and, over the course of the next couple weeks, we’ll address these myths. But we thought that we would give you a taste of what’s to come, so here is a list of the top 10 myths in risk management. Please feel free to add your own in the comments section. This list is certainly not exhaustive!
1. IT Risk Management = Information Security
2. CIOs Have Embraced GRC
3. A Rigid, Standardized Approach Is Best
4. You Can Only Manage Risk from the Center
5. You Can Manage Risk and Compliance in Spreadsheets
Managing compliance obligations within a large organization often involves the coordination of many compliance professionals from the CCO to Chief Ethics Officer, CIO/CTO, Chief Privacy Officer, Director of Corporate Compliance, IT Director, Legal counsel, HR Director and Supply Chain Manager. There are also many challenges in managing fragmented and disparate compliance obligations from financial controls to IT compliance to environmental health and safety.
Consequently, many of these initiatives are being managed in silos, which limits a compliance officer’s ability to effectively train, communicate, monitor and ultimately measure the status of compliance obligations across the enterprise with any level of confidence. Effective compliance programs take a more practical, cross-regulatory approach to managing compliance that can alleviate increasing costs and complexity. This more strategic approach reduces overall costs by leveraging common assessment processes and a common technology infrastructure.
To learn how you can break down the compliance silos and implement a cost-effective approach to managing compliance, check out a recent Compliance Week webinar presented by Michael Rasmussen, President of Corporate Integrity, and Julian Parkin, Group Privacy Programme Director at Barclays.
When I took my first class on financial engineering as a naïve applied mathematics undergrad, we started with portfolio selection and the capital asset pricing model. In my typically confident (some might say arrogant) fashion, I decided I knew more than the professors, and that we should be focused on maximizing returns rather than treating risk with the almost religious deference the curriculum gave it. A few case studies on LTCM (and modern hedge funds) bring the importance of risk into sharp relief. And yet, years later, I did it again. A few years ago, I claimed to be an expert on risk. In actuality, I was an expert on security who knew very little about risk. In fact, I knew so little about risk, I had no idea how little I knew about it.
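The classroom point I missed is easy to state in code. This toy sketch (made-up figures, risk-free rate assumed at 3%) compares two strategies by Sharpe ratio rather than raw return:

```python
# Toy example: the higher-return strategy is not automatically better
# once risk is priced in. All figures are invented for illustration.

def sharpe(expected_return, volatility, risk_free=0.03):
    """Excess return per unit of risk (a simple Sharpe ratio)."""
    return (expected_return - risk_free) / volatility

aggressive = sharpe(0.12, 0.30)  # chase returns, high volatility
balanced = sharpe(0.07, 0.08)    # modest returns, low volatility

print(balanced > aggressive)  # True: the "boring" portfolio wins per unit of risk
```

Maximizing the numerator alone, as my undergraduate self wanted to, ignores the denominator that sank LTCM.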
I come from the information security space. I spent a number of years there, and throughout my tenure, I continually abused the word “risk.” Oh, I had no idea I was doing it. In fact, 99% of my colleagues in security were doing the same thing. The fact of the matter is, the cloak and dagger security types, self-professed “security experts,” continue to misuse the word. It wasn’t until I really tried to peel back the onion and build a product that managed risk for the security space that I realized that what often passes for risk management in IT is actually control management and compliance. True risk management deals with uncertainty around unexpected losses – looking at consequences in business terms and weighing those against potential reward. Information security management, as currently practiced, is in most regards a necessary, but not sufficient, component of information risk management.
A little experience in different disciplines and verticals can make all the difference in the world. Financial services is arguably the most sophisticated industry when it comes to managing risk. From a credit and market risk perspective, the average investment bank or hedge fund has teams of Ivy League PhDs running thousands of financial models 24×7, with a virtually unlimited budget, on server farms with more firepower than NASA. From an operational risk perspective (much more analogous to information security), these same banks have taken the lessons they’ve learned in years of managing credit and market risk and applied them to these more esoteric exposures. Where they lack the hard, quantitative data of their peers, they’ve adapted clever ways of working around it.
Information security practitioners, on the other hand, are great at managing compliance by checklist. We have impressive standards, frameworks and regulations like ISO 17799, PCI, BITS, CobiT and a whole slew of others that are pretty good at spelling out a series of “thou shalt have’s.” NIST 800-30 even gives a set of guidelines for doing risk management for IT systems. So what’s missing?
Information Security standards and guidelines are a good thing, but they can be very easily misused and abused. They encourage cookie cutter thinking, and miss the bigger point – no two industries are the same. No two companies within an industry are the same. No two geographies within a company are the same. No two data centers within a company geography are the same. No two services run on hardware in the same data center are the same. No two business processes serviced by the same service are the same. And guess what? Depending on the time of the year, the needs of your customers and other factors, the same business process may have different needs on different days!
OK, clearly mapping all of those dependencies is hard. So most organizations give a data sensitivity rating to their information assets. Maybe they get cute and provide a “platinum, gold, silver, bronze” type scheme. Maybe they even set some arbitrary control thresholds based on this classification. So why do we have multiple large-company executives going on record claiming that PCI compliance is too hard? Two things here – first, PCI is an ISO 17799 derivative. Second, with sensitive customer data sitting on these information assets, shouldn’t they have already warranted a platinum rating? Logically, it should follow that in any 17799 shop (and there are many), information assets should already be close to PCI-compliant.
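The logic implied above can be sketched in a few lines. Everything here is hypothetical: the tier names, the control identifiers and the PCI-like control set are toy stand-ins for a real catalog. The gap check is the point: if an asset already rates platinum, a PCI-style baseline should demand little extra.

```python
# Hypothetical control baselines per data-sensitivity tier.
BASELINES = {
    "platinum": {"encryption_at_rest", "mfa", "quarterly_scan", "log_review"},
    "gold":     {"encryption_at_rest", "mfa"},
    "silver":   {"mfa"},
    "bronze":   set(),
}

# Toy stand-in for a PCI-style required control set.
PCI_LIKE_REQUIRED = {"encryption_at_rest", "mfa", "quarterly_scan", "log_review"}

def control_gap(tier):
    """Controls the PCI-like standard requires beyond the tier's baseline."""
    return PCI_LIKE_REQUIRED - BASELINES[tier]

print(control_gap("platinum"))  # set(): a true platinum asset is already there
print(control_gap("silver"))    # the gap behind every "PCI is too hard" complaint
```

Run against a real control catalog, the same set difference makes visible exactly where the complaint comes from: assets classified below the tier their data actually warrants.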
In reality, however, we all know that InfoSec groups are asked to do way too much with increasingly smaller budgets. It’s difficult to get management to buy into the need for information security, which exacerbates the problem. As such, it’s critical that we work smarter, not harder. If only there was a tool that let us do that…
Enter risk management. Throwing a set of checklist controls at our enterprise architecture is not risk management. Theoretically, it should result in some risk reduction, granted, but that’s not an optimal return on investment. Imagine running a hedge fund without a complex risk model for every conceivable position – running countless layers deep. You’d be insolvent within a month.
So what are the roadblocks to risk management in information security? The biggest is a lack of business context. For years, IT has talked about aligning to the needs of the business. It’s still a challenge. The fact of the matter is, it’s tough getting C-level executives to prioritize business objectives and processes amongst themselves (think politics, agendas, silos), much less as a deliverable for IT (which they see as less and less of a strategic asset). And even if they could agree on a real priority for those corporate objectives, navigating the rat’s nest of dependencies down from the objective level to the asset level would prove difficult for most organizations. As a result, it’s impossible to prioritize the consequence of an attack on a specific tangible thing.
That starts to cover the consequence side of things. How about likelihood? Actuaries have tables for flood rates; financial engineers have volatility metrics for options calculations. Unfortunately, it’s very difficult to compile reliable loss data on the IT side of the house. Difficult, but not impossible. We safeguard that information as if it were customer data. But if you look at our peers managing operational risk, there are several initiatives around sharing anonymized loss data. Banks collaborate on internal loss metrics to quantify the costs and probability of fraud, malfeasance, etc. Back to security: TJX set aside a penny a share to cover its data breach, and current press estimates range from $12 to $25 million. (Many experts think these estimates are wildly optimistic, by the way.) Are the metrics we have available perfect? Not even close. But qualitative factors are a good stopgap to supplement the limited quant data we have.
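One concrete form that stopgap can take is an annualized-loss-expectancy (ALE) style estimate with a judgment-based adjustment factor. The sketch below is illustrative only; every number in it is invented.

```python
# Illustrative ALE-style estimate, supplemented by a qualitative factor
# when hard loss data is thin. All figures are invented.

def annualized_loss(frequency_per_year, loss_per_event, qualitative_factor=1.0):
    """ALE = frequency x magnitude, scaled by a judgment-based factor
    (>1 when controls are known-weak, <1 when compensating controls exist)."""
    return frequency_per_year * loss_per_event * qualitative_factor

# Sparse industry data suggests breaches of this class cost ~$2M and
# occur ~0.1 times a year; an assessor's judgment that monitoring is
# weak bumps the estimate by 50%.
estimate = annualized_loss(0.1, 2_000_000, qualitative_factor=1.5)
print(f"${estimate:,.0f}")  # $300,000
```

Crude, certainly, but a number like this can be ranked against other exposures and against the cost of a control, which a bare checklist finding never can.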
Don’t get me wrong – we have some brilliant people working in information security. Brilliant people doing amazing things with limited budgets in a game with stakes that would make a high roller at the Bellagio head for the nickel slots. What we need is to buy them some leverage. Risk management helps information security professionals make better decisions faster, letting practitioners do more with less. Risk management is a great tool to help information security practitioners work more efficiently – just don’t confuse the two.
This year’s OPUS is shaping up to be the best yet! In addition to leading GRC executives presenting case studies and lessons learned, OPUS 2008 includes the who’s who of GRC thought leaders:
French Caldwell, Research VP at Gartner, Inc. will discuss the latest GRC Technology Trends.
Chris McClean, Analyst at Forrester Research, Inc. will provide a perspective on Corporate Social Responsibility and the growing influence of GRC.
John Haggerty, Vice President & Research Fellow at AMR Research, will discuss his view on the future of GRC.
David Holcombe, Director of Risk Management for International Speedway Corporation and NASCAR, Inc., will provide a history and evolution of safety in motor sports from the NASCAR and motor sport facility perspective.
Mark Beasley, Director, ERM Initiative and Deloitte Professor of ERM at NC State will discuss how many organizations are responding to external pressure by leveraging traditional risk management processes into an enterprise risk management (ERM) view.
Richard Steinberg, Founder and Principal, Steinberg Governance Advisors, will provide his insight on risk convergence.
You can get a preview of Richard Steinberg’s perspective on enterprise-wide risk management by checking out our recent blog entry. In this video, Richard is interviewed by Gordon Burnes at the recent Executive ERM Forum.
Check back soon for more detail on our extensive line-up of customer case studies being presented at OPUS and our extended Hands-On Workshops.
If you’re attending OpRisk USA in New York City March 24-26, don’t miss Scott Green’s discussion on reinventing risk processes. A frequent speaker and well-versed GRC industry expert, Scott is the managing vice president of operational risk management at Capital One and also serves as vice president of the OpenForum User Group. At OpRisk USA, Scott will make a case for risk process integration and the success factors in driving change to create a more effective and efficient ORM function.
Scott also recently joined Compliance Week for a web cast conversation on “Risk Management Strategies at Capital One” where he discussed how Capital One is reinventing risk management processes to improve efficiency and effectiveness in managing risk requirements and their associated controls. The archived web cast is available on-demand.
Recently acquired by The Bank of Tokyo-Mitsubishi (the second-largest banking group in the world), Union Bank, N.A. out of San Francisco has been asked to lead the way for the entire organization in adopting Basel II and the advanced measurement approach for operational risk measurement.
Marty Blaauw, Senior Vice President of Operational Risk at Union Bank stated, “At Union Bank, we are striving to use the advanced measurement approach for operational risk measurement and OpenPages provides an integrated operational risk management framework that will assist us in this goal. We are confident that OpenPages’ solution will allow us to streamline our operational risk management and measurement process and provide the integrated risk reporting and dashboards being requested at the executive level.”
With $86 billion in assets under management and 340 banking offices in California, Oregon, Washington and Texas as well as two international offices, this is a strategic initiative with enterprise-wide implications. Union Bank purchased licenses for the entire OpenPages Platform and selected OpenPages ORM as the operational risk system of record for managing risk assessments, key risk indicators (KRIs), issue management and scenario analysis, as well as integrated risk reporting.
One of the most commonly requested agenda items from past OPUS attendees has been for more Hands-on Workshop sessions.
We’re happy to announce that for OPUS 2008, we’ve added an entire track dedicated to these valuable workshops. Each 2-hour session will provide you with a discussion on best practices, a technical demonstration, and a chance to work within a sample OpenPages 5.5 environment to implement the techniques that you have learned.
These sessions will provide a unique training opportunity that you won’t want to miss!
OpenPages 5.5 Hands-On Workshop: Workflow - Review workflow concepts and terminology, then create and test your own workflow containing two tasks.
OpenPages 5.5 Hands-On Workshop: Creating Profiles and Home Page Configuration – Discuss best practices methodology to meet business requirements, see a technical demonstration on configuring an Object Profile, then create your own profile, associate users to the profile, configure the home page and restrict user access to fields.
OpenPages 5.5 Hands-On Workshop: OpenPages CommandCenter Reporting – Review a business use case and discuss best practices methodology, then create, troubleshoot, and run a report on your own sample OpenPages 5.5 environment.
To learn more about the OPUS 2008 agenda, which includes the industry’s best line-up of GRC thought leaders and customer case studies, download the detailed agenda and register on-line.
Keep in mind that the exclusive OPUS room rate will expire on September 28, 2008, so please book your room today. To make your hotel reservations call the hotel directly at 1-800-HOTELS-1 (1-800-468-3571) and ask for the OPUS room rate, or visit the Renaissance Boston Waterfront Hotel website.
Our friends at OpRisk & Compliance magazine recently kicked off the third annual operational risk software ranking survey. The survey consists of 5 categories, within which you are asked to rank the top five firms (OpenPages!), or to enter a firm of your own choosing. The categories are:
Scenario analysis functionality
OpRisk loss data collection
Key risk indicators
Risk control and self assessments (RCSAs)
Regulatory and economic capital modeling
This is a great way to show support for your favorite operational risk software vendor (OpenPages) while also being entered into a prize draw to win a copy of "The Basel Handbook", second edition. All responses remain anonymous and will not be attributed to individuals. OpRisk & Compliance will publish the results from this survey in the May edition of OpRisk & Compliance magazine.
So if you have a moment (1-2 minutes), please take part in this fun and informative survey.
We did an interesting survey at OPUS a couple weeks ago. We’ll be publishing the results here next week, but one of the GRC topics that people have been talking about is whether GRC spending will decrease like most of the rest of the tech sector, or increase based on the very obvious need for better risk management in corporate America. Whether or not GRC spending increases next year will depend, of course, on the state of the economy, and a host of other issues that Brian Sommer discusses in a blog post this week at ZDNet.
Brian and I discussed a variety of topics on the value of GRC deployments and in particular on the importance of risk management. While technology alone would not have prevented the current crisis, it can be an enabler for change, and many firms at OPUS indicated that using a GRC management system can enforce policy and help catalyze behavioral change around risk management. The beauty of such a system is that you can very quickly find out who’s following the rules and who’s not. That might have been helpful for some of the financial services institutions trying to deal with risk exposure they never knew they had.
Compliance expert Eric Krell from DRS Technologies speaks to Business Finance editor in chief Jack Sweeney about how the tactical precision with which key risk and compliance decisions were made allowed internal audit to blossom. DRS Technologies currently utilizes OpenPages to manage their SOX compliance requirements and takes advantage of the technology’s workflow automation capability to supplement the 302 certification process.
We’ve blogged frequently on the topic of IT risk management, most recently here. With recent events highlighting the need for better risk management, now, more than ever, people are thinking about how to improve their processes and technology for supporting their risk management programs. Ben Worthen over at the WSJ BizTech blog has written recently that tech departments shortchange risk management. We couldn’t agree more.
The basic problem, as Symantec’s Samir Kapuria notes via Ben’s post, is that IT tends to think of risk management as a project vs. a continuous process. This may be the result of the fact that most IT infrastructure vendors sell risk management for project delivery but don’t really have solutions that allow IT to take a risk-based approach to all their activities. It may also be the result of IT having to keep everything running, all the time. Regardless, unless you start with a top-down approach using a risk assessment process, identifying which vulnerabilities match to the most significant potential business impacts, you will never be able to allocate IT resources appropriately. Once you understand how realized risks will impact the business, you can take a truly risk-based approach to IT management. Obviously, we have a horse in this race, but any effort to tackle the IT risk management challenge must involve the business and identify the key risks therein.
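To make that concrete, here is a minimal sketch of what a top-down, risk-based prioritization might look like: each vulnerability is tied to the business process it threatens, and remediation order is driven by likelihood times business impact rather than by technical severity alone. The vulnerability identifiers, processes and dollar figures below are purely illustrative assumptions, not real data.

```python
# Hypothetical top-down prioritization: rank vulnerabilities by expected
# business loss (annual likelihood x business impact), not by severity alone.
vulnerabilities = [
    # (id, affected business process, annual likelihood, business impact in $)
    ("CVE-A", "online payments",   0.30, 5_000_000),
    ("CVE-B", "internal wiki",     0.80,    20_000),
    ("CVE-C", "customer database", 0.10, 8_000_000),
]

def expected_loss(vuln):
    """Expected annual loss for one vulnerability."""
    _, _, likelihood, impact = vuln
    return likelihood * impact

# Allocate remediation resources to the largest expected losses first.
for vuln_id, process, likelihood, impact in sorted(
        vulnerabilities, key=expected_loss, reverse=True):
    print(f"{vuln_id}: {process}, expected annual loss ${likelihood * impact:,.0f}")
```

Note how the highly likely but low-impact wiki vulnerability falls to the bottom of the queue once the business context is attached – exactly the kind of re-ordering a bottom-up, severity-only view misses.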
With scrutiny of risk management intensifying in the aftermath of the near meltdown of the financial system, COSO has released a new thought paper to support companies’ efforts to enhance their risk management processes. Titled Strengthening Enterprise Risk Management for Strategic Advantage, the paper is geared to senior executives and boards of directors, highlighting key elements of ERM. It is a follow-up to COSO’s Effective Enterprise Risk Oversight: The Role of the Board of Directors, reported by John Kelly in his September 1 blog entry.
This newest paper is intended to provide a “basis for introspection about current approaches to risk management and be a catalyst for management to strengthen risk management for the purpose of enhancing the board’s risk oversight capabilities and the organization’s strategic value.” As such, COSO encourages boards and management to turn to COSO’s Enterprise Risk Management— Integrated Framework for in-depth discussion of core components of enterprise risk management.
The paper sets the stage by focusing on how the financial crisis, business complexity, advances in technology, globalization, the speed of product cycles, and the overall increased pace of change all increase the risks facing organizations. It points to a perception that senior executives and boards “could be more aware of the risks they are taking” and could do more to prepare for their downside. It also points to legislative and regulatory initiatives providing further impetus for focusing on risk management.
The paper centers on four areas where senior management can work with its board to enhance the board’s risk oversight capabilities:
Discuss risk management philosophy and risk appetite
Understand risk management practices
Review portfolio risks in relation to risk appetite
Be apprised of the most significant risks and related responses
The paper does a good job highlighting how these activities can be effectively operationalized, and contains points of focus particularly for directors. It’s especially useful for senior executives and board members struggling to cope with their management and oversight responsibilities. It may even be worth the read for professionals with some knowledge of the COSO ERM Framework, to refresh memories and sharpen a focus on what ERM is all about. Probably most of all, the paper should provide useful support to those who are working to make the case for ERM in their organizations.
Strengthening Enterprise Risk Management for Strategic Advantage is available at www.coso.org.
The announcement of IBM’s intention to acquire OpenPages generated volumes of editorial response and news coverage in today’s world of instant publishing. The news, which provoked a very positive response across the board from OpenPages customers, prospects, media and analysts, has generated over 1,400 ‘tweets’, numerous news stories and some thought-provoking analysis from industry analysts.
In particular, Chris McClean of Forrester raised an interesting point in his blog coverage noting that acquisitions in the GRC market over the past two years have resulted in not only vendor consolidation, but also market fragmentation. He points out that the Thomson Reuters acquisition of Paisley was meant to ‘strengthen its tax and accounting business’, while EMC acquired Archer ‘as a dashboard (at least initially) to pull together IT risk data and processes,’ whereas the IBM acquisition of OpenPages ‘will likely turn the company more toward higher-level corporate performance and enterprise risk management.’ I think Chris is as usual on target, yet would respectfully add that integration with the control infrastructure allows OpenPages to instrument the risk assessment and control testing process, thereby delivering the only comprehensive solution on the market.
Also published recently is Gartner’s ‘First-Take’ on the acquisition in which analysts French Caldwell and John Hagerty report that they are expecting a ‘Market Split’ whereby the vendor landscape will be divided between those that have coupled qualitative risk assessments with quantitative risk analytics, and those that provide just qualitative risk assessments: ‘Vendors that have a risk intelligence strategy would compete for large accounts with combined risk analytics and traditional governance, risk management and compliance (GRC) management functionality, while those without risk analytics capabilities would address less-quantitative risk assessments, compliance and audit management.’
If you’re a risk manager or a business manager, the integration of risk analytics with GRC management will provide your business with more timely and more accurate information to understand the risk exposure to the business and help you make better decisions.
Attrition.org maintains a list of public, high-profile data breaches. The list is staggeringly long and goes back to the year 2000. TJX, while a high-profile data breach and perhaps one of the biggest stories of 2007, is only one of the many that were publicly reported. And companies have a vested interest in not making these events public. Add to that the breaches that happen every day and go undiscovered, and it becomes clear that this staggeringly long list is just the tip of the iceberg.
But why is this list growing? Preventative technology and knowledge gets better and better every day. Shouldn’t we be getting safer? Information risk management is sometimes a thankless job. As an old mentor of mine used to say, a good day is a day where nothing happens. The villains get better and better every day, however, and the gap remains. Your organization is susceptible, and it’s critical you do everything you can to keep the gap as narrow as possible.
I’ll be the last one to tell you that a strong central risk management function is a bad thing. Unfortunately, many organizations make the mistake of investing only in a centralized function because it’s too difficult to federate risk management and push it to lower levels of responsibility in the organization. It’s a classic consistency vs. quality-of-information problem.
Accurate information lives at the business-line level – a manufacturing company’s CRO may not know that the company is throwing away millions of dollars a year due to a lack of quality suppliers, but the supplier quality manager certainly does. The challenge is that it’s traditionally very expensive to consolidate this local, lower-level information. Organizations attempt to survey and assess process owners, but the information comes back in various formats and at various levels of quality, and it leads to information silos – it’s impossible to get an apples-to-apples comparison. Out of frustration, many of these efforts fail, leading back to a strong centralized risk function.
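To illustrate the apples-to-apples problem: imagine three business units that each report risk on a different scale. A naive roll-up of their raw scores is meaningless until everything is mapped onto a common scale. The scales, risk names and scores below are illustrative assumptions only.

```python
# Hypothetical normalization of unit-local risk scores onto a common 0-1 scale.
SCALES = {
    "1-5":  lambda s: (s - 1) / 4,
    "1-10": lambda s: (s - 1) / 9,
    "hml":  lambda s: {"low": 0.0, "medium": 0.5, "high": 1.0}[s],
}

assessments = [
    ("supplier quality", "1-5",  4),       # unit A rates on 1-5
    ("data privacy",     "1-10", 9),       # unit B rates on 1-10
    ("fraud",            "hml",  "high"),  # unit C rates high/medium/low
]

def normalize(scale, score):
    """Map a unit-local score onto a common 0-1 scale for comparison."""
    return SCALES[scale](score)

for risk, scale, score in assessments:
    print(f"{risk}: {normalize(scale, score):.2f}")
```

This is the kind of translation that automated platforms do implicitly by enforcing a common taxonomy and scale up front, rather than reconciling spreadsheets after the fact.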
Organizations must augment their centralized risk management efforts with localized, distributed data, and the only way to do that reliably is to invest in automated technology solutions.
As we mentioned last week, during the heyday of buying for Sarbanes-Oxley (SOX) compliance solutions, many companies put in place technology platforms that now support a variety of risk and compliance initiatives. SOX solutions were generally purchased with the tacit approval of IT, but, given the range of solutions currently in deployment (spreadsheets, custom applications using Microsoft Access as a platform, and COTS SOX solutions), it is clear that IT never standardized on a strategy for managing risk and compliance data. The result is that today CIOs have an opportunity to either leverage their existing technology or put in place a standard platform to support risk and compliance data and processes.
The reality is that many CIOs continue to allow the business to buy disparate platforms for different GRC solutions. In numerous buying decisions, IT is at the table to support solution implementation rather than thinking about the long term strategic benefits of a common GRC platform. Just as disparate customer data marts drove down customer satisfaction rates and hampered sales efforts, leading to the rise of the CRM market, so too will scattered risk and compliance data marts cause an immense amount of pain for risk managers trying to get a clear picture of risk throughout the business.
As companies prepare for Solvency II, many are struggling with the right approach to address key aspects of the directive. Due to come into effect in 2012, Solvency II promises a more sophisticated ‘risk-based’ form of supervision that will require many insurers to augment their risk management framework – particularly with regard to operational risk.
In a Webinar hosted by OpRisk & Compliance Magazine, Stuart Robinson, senior vice president of group risk at Germany’s largest insurer, Allianz, discusses how Solvency II operational risk requirements are well aligned with Basel II requirements and other regulatory and industry standards. In particular, he points to how insurers need to:
Identify operational losses and capture data on them
Understand the key risks and key controls in business processes and review control effectiveness on a regular basis
Use scenario analysis to assess the impact of potential operational risks
Quantify operational risk capital requirements
Demonstrate that operational risks are managed through reporting, KRIs and action plans
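The quantification step in that list is often implemented with a loss distribution approach: simulate annual event counts, draw a severity for each event, and take a high quantile of the simulated aggregate annual loss as the capital figure. The sketch below uses a Poisson frequency and lognormal severities with purely illustrative parameters – nothing here is a calibrated figure, and real models layer in scenario data and insurance effects.

```python
# Illustrative loss distribution approach to operational risk capital:
# Poisson annual frequency, lognormal per-event severity, capital taken as
# a high quantile (e.g. 99.5%) of the simulated aggregate annual loss.
import math
import random

def poisson(lam, rng):
    """Knuth's method for sampling a Poisson event count (fine for small lambda)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def op_risk_capital(years=20_000, freq_mean=3.0, sev_mu=10.0, sev_sigma=1.5,
                    quantile=0.995, seed=42):
    """Simulate aggregate annual losses and return the chosen quantile."""
    rng = random.Random(seed)
    annual_losses = []
    for _ in range(years):
        n_events = poisson(freq_mean, rng)
        annual_losses.append(
            sum(rng.lognormvariate(sev_mu, sev_sigma) for _ in range(n_events)))
    annual_losses.sort()
    return annual_losses[int(quantile * (years - 1))]

capital = op_risk_capital()
print(f"Illustrative 99.5% operational risk capital: {capital:,.0f}")
```

Because the severity distribution is heavy-tailed, the capital figure sits far above the expected annual loss – which is precisely why regulators ask for a tail quantile rather than an average.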
To understand the operational risk requirements and challenges facing insurance companies and how Solvency II impacts operational risk management, check out this informative Webinar.
The PCAOB’s Auditing Standard 5 (AS5) is structured around a top-down approach that identifies the most important controls to test during your Sarbanes-Oxley (SOX) effort: those that address the assessed risk of misstatement for each relevant financial assertion.
At OPUS 2010, Jo Morton, Business Analyst, Internal Audit at Williams Companies, Inc. and Lawrence Joiner, Manager of Internal Audit Operations at Williams presented an informative session titled, “An OpenPages Approach to Auditing Standard 5 Compliance.” In their session, Jo and Lawrence outlined how Williams has been able to move beyond a “process by process” review and up to an Account Level review that truly is an AS5 “Top-down Approach.” In the following conversation, Jo Morton describes her session and her overall OPUS 2010 experience.
This week OpenPages is sponsoring the RMA Operational Risk Management Discussion Group being held at The Federal Reserve Bank of Philadelphia. The two-day forum was kicked off by Victoria Garrity, Senior Quantitative Analyst from the Boston Federal Reserve. Victoria’s session, titled “Regulatory Perspective on Scenarios: Challenges and Issues”, was well attended and sparked a number of conversations on potential forthcoming regulations. Other interesting sessions included a discussion moderated by Michael Fenn of DTCC and Patrick McDermott of Freddie Mac on the evolution of ORM assessments, and a roundtable facilitated by Kathy Miller of KeyCorp on “Recent Experiences with Regulators” in which the discussion focused on operational risk examinations and emerging guidance from the regulatory environment.
Overall a very timely and thought-provoking forum attended by some of the leading operational risk practitioners.
In November, I blogged about the difference between IT Risk Management and Information Security. For the full post, read here.
There’s a big difference between tactical execution and strategic oversight. Therein lies the challenge with most information security programs: they place far too much emphasis on the how and what, and far too little on the why. Information risk management, on the other hand, concerns itself with the why and is necessary to prioritize efforts.
The problem (and it’s a good problem to have) is that we’ve got a lot of great information available to us regarding how and what. There are libraries of control checklists from numerous standards organizations that provide great common-practice guidance on how to secure information assets. As new vulnerabilities are discovered, new patches and workarounds are circulated and proactively communicated through a huge number of alerting services. Modern information security practices are mostly controls-based – that is, they focus on the what. They largely ignore the why, the element of business risk, because it’s too difficult to understand.
Where this approach falls down is that there will always be far too much to do. There are too many vulnerabilities to remediate and too many controls to implement across the typical enterprise. As a result, critical deficiencies will go unmanaged. True risk management requires a business perspective on these deficiencies. Only with that business risk perspective is it possible to focus on doing the right things first. That’s lacking in the vast majority of modern businesses, and as a result, time is wasted and risk posture suffers.
In November, OpenPages and Compliance Week hosted a roundtable with 17 senior executives where the discussion covered the kinds of metrics to use when discussing and reporting on risk and compliance initiatives. Attendees came from a variety of industries and represented a range of risk and compliance functions. Matt Kelly, editor in chief at Compliance Week, moderated the session and blogged about it. To review the article written about the roundtable, “Shop Talk: Metrics for Risk, Compliance,” visit the Compliance Week website.
One of the interesting topics that surfaced during the discussion was how we, as risk and compliance professionals, can get closer to the business to influence the decisions being made in the context of an operational process. Everyone agreed that it’s relatively easy to track external metrics (calls to the whistleblower hotline, for instance), and it’s also straightforward to put training procedures in place. The challenge, however, is getting the “in-process” metrics to influence decision-making.
This question of how to get closer to the business was also a theme at RiskMinds in Geneva last week. Many operational risk executives said that the next frontier is more effectively influencing the “in-process” decisions to reduce operational risk.
For us, the takeaway is that reporting and KRIs have to get more granular and more timely for both the business and the group risk and compliance functions.
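As a toy illustration of what "more granular and more timely" can mean in practice, the sketch below evaluates each KRI observation against its threshold as it arrives, rather than waiting for a quarterly roll-up. The indicator names and thresholds are hypothetical assumptions, not anyone’s actual risk appetite.

```python
# Hypothetical in-process KRI monitoring: flag threshold breaches per
# observation instead of aggregating them into a periodic report.
KRI_THRESHOLDS = {
    "failed_trades_per_day": 25,
    "unreconciled_accounts": 10,
    "hotline_calls_per_week": 5,
}

def breaches(observations):
    """Return the KRIs whose latest observation exceeds its threshold."""
    return {name: value
            for name, value in observations.items()
            if value > KRI_THRESHOLDS.get(name, float("inf"))}

today = {"failed_trades_per_day": 31, "unreconciled_accounts": 4}
print(breaches(today))  # only the failed-trades KRI is above threshold
```

The point is less the code than the cadence: a breach surfaces the day it happens, when it can still influence the in-process decision.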
A group of investors issued a report yesterday that criticized the Obama administration’s plan to have the Federal Reserve operate as the systemic risk regulator. The Investor’s Working Group consists of a high-octane set of investors, including CalPERS, Capital Group and GMO, as well as academics and journalists, and it is chaired by former SEC chairs William Donaldson and Arthur Levitt.
There are lots of interesting ideas in the report, but, most significantly, it argues that the Federal Reserve as systemic risk regulator has “serious drawbacks”. It cites the “potentially competing responsibilities”, from monetary policy to managing the payments system. More importantly, the report cites the Fed’s recent regulatory failures: failing to police mortgage underwriting and failing to impose suitability standards on mortgage lenders.
Whether it’s the Fed or some other entity that ends up being responsible for systemic risk regulation, a new information architecture will be required to surface the right information to the systemic risk regulator. The Investor’s Working Group has suggested that the regulator “should have the authority to gather all information it deems relevant to systemic risk”. Such an information-gathering exercise will not be a trivial effort. The systemic risk regulator would presumably be looking for timely information about positions, counterparties and activity, and ORX, which has experience with this kind of data collection, has learned that the more the information requests align with current operational reporting at banks and other financial services institutions, the easier they are to implement. The reporting requirement alone is an argument for an omnibus approach to financial services regulatory reform, because surely there will be other reporting requirements coming out of the regulatory reform process. Let’s hope there’s a coordinated approach here; otherwise, potential reforms could be very expensive to implement and fail to deliver as promised.
The Shareholder Bill of Rights Act of 2009, submitted by Senators Schumer and Cantwell, addresses one of the key issues in the current financial crisis: corporate governance. While the NYSE has a rule that the board must articulate its enterprise risk management strategy, such a prescription has yet to be enshrined in law. The Schumer bill addresses that:
“(A) IN GENERAL.—Each issuer shall…establish a risk committee, comprised entirely of independent directors, which shall be responsible for the establishment and evaluation of the risk management practices of the issuer.”
It’s unlikely that this particular bill passes as written, but the notion that companies will have to formally name a risk committee will certainly shine the light on how companies identify and evaluate risk in their business.
Interestingly, in the UK, the Financial Reporting Council just finished their review of the corporate governance code. There’s an interesting article in Management Today here:
I disagree with the conclusion, however. The ‘comply or explain’ approach will never work. We just learned that lesson from the former investment banks that were supposed to self-regulate in the US. My view is that you can fashion regulation that’s not “over-reaching” (some would say Sarbanes-Oxley falls into this category) yet provides sufficient guidance on operating requirements to actually mitigate real risks.
ERM, like most business processes, is not a “one-size-fits-all” solution. It has to be customized and tailored for each firm. As Mark Olson of the Federal Reserve notes, “An effective enterprise-wide compliance-risk management program is flexible to respond to change and it is tailored to an organization’s corporate strategies, business activities and external environment.” (April 10, 2006)
Companies that try to implement an out of the box methodology will likely fail. ERM methodologies and taxonomies must be adapted to a company’s legal, regulatory, economic and competitive environment, all of which can vary dramatically by industry and must, of course, be tailored to the company’s internal processes and culture. Further, the risk framework must be able to adapt to change over time to avoid losing competitive advantage.
President Obama announced his long-awaited proposals for the overhaul of the regulatory system for the financial services sector. There were no real surprises, as the administration has been seeking input from many different parties for months and surrogates have been floating specific proposals over the last couple of weeks. The so-called White Paper (www.financialstability.org) that outlines the administration’s thinking is pretty stark in places: “it is clear now that the government could have done more to prevent many of these problems from growing out of control and threatening the stability of our financial system.” (Page 2).
It’s clear that the main thrust of the regulatory reform is around systemic risk and consumer protection. Addressing the systemic risk issue, the Administration is proposing a Financial Services Oversight Council. What’s interesting here is that they scrapped the alternative of consolidating such power in a single agency, e.g. the Federal Reserve. Other proposals to address systemic risk include the creation of a category called Tier 1 FHC which the Fed will oversee, a revision of capital requirements for Tier 1 FHCs, a new Federal Bank Supervisor to conduct prudential supervision of other banks, the regulation of investment banks by the Fed, and other proposals. On consumer protection, the Administration is proposing a new Consumer Financial Protection Agency.
What is utterly lacking in these proposals is any sort of programmatic approach to intra-company risk management. There’s a brief mention of this issue:
“Prudential standards for Tier 1 FHCs—including capital, liquidity, and risk management standards—should be stricter and more conservative than those applicable to other financial firms to account for the greater risks that their potential failure would impose on the financial systems.”
How companies will react to these “stricter standards” remains to be seen. But as taxpayers holding a large stake in several of these Tier 1 FHCs, we should all want greater transparency into the control environment, decision-making process around risk management issues, and the organizational structure around risk management. These proposals do not address those issues.
Although there were differing opinions about the main causes of the current financial crisis, nearly all speakers at RiskMinds in Geneva shared the belief that the worst is still to come in what many were referring to as the “Great Recession.”
Robert Shiller of Yale University drew many parallels between the Great Depression and today’s crisis. For example, we have lost 60% of stock market value since the 2000 high, while during the Great Depression there was an 80% drop. But Shiller refuted many of the commonly believed causes of the current crisis, such as weak underwriting standards, unsound risk management practices, increasingly opaque financial products, and aggressive leverage. He maintains that the speculative bubbles in both the real estate and stock markets were largely to blame for the worst financial crisis since 1929.
Maureen Miskovic, CRO at State Street, opened her presentation with a quote from Dickens’ A Tale of Two Cities: “It was the best of times, it was the worst of times…” and went on to claim that we are in the midst of a financial revolution. Miskovic predicts that we will see unemployment levels of close to 10% in the U.S. next year, which will in turn cause problems in the prime mortgage market. She also predicted that the current political climate will result in punitive regulation that will transform the large U.S. banks into institutions very similar to public utilities (increased disclosure, more transparency, and intrusive examination).
Zannie Beddoes, Global Economics Editor at The Economist, predicts that shrinking personal wealth will greatly affect demand and eventually push the world into depression-era economics. She stated that the current situation is unlike other postwar recessions due to the bursting of the asset bubble, so we are in for a deep, long recession. She also fears an anti-market backlash, which could result in subsidy wars and protectionist policies.
While the speakers painted a picture of doom and gloom, they were clear about the increasing role that risk managers need to play in helping financial institutions restore confidence and trust, as well as create a sense of opportunity in the financial markets.
I’ll summarize some of their recommendations in tomorrow’s blog.
A traditional approach to planning the audit process typically examines 10-20 risk factors for each element of the audit universe and buckets each auditable entity into a risk category that drives the frequency with which it is audited. While this approach may have worked well in the past, modern audit departments are being asked to do more with less. The known risk universe gets bigger by the day, and investing in a massive risk evaluation for each entity may not be the best use of resources. Is it worth tying up valuable stakeholders in management and on the audit committee to assess the risk inherent in the coffee procurement process for a remote sales office?
Progressive organizations are turning toward a more agile, top-down approach to risk assessment to drive audit scheduling. This leads to more efficient resource allocation, ensuring auditors are focused on the truly risky areas.
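To make the traditional model concrete, here is a minimal sketch of the factor-scoring-and-bucketing approach described above. All names, weights, and thresholds are illustrative assumptions, not part of any real audit methodology; in practice an entity would be scored on 10-20 factors rather than the four shown here.

```python
# Hypothetical sketch: score an auditable entity on several risk factors,
# map the weighted-average score to a risk tier, and let the tier drive
# how often the entity is audited. All values are illustrative.

def risk_tier(factor_scores, weights=None):
    """Return a risk tier ("high"/"medium"/"low") from 1-5 factor scores."""
    if weights is None:
        weights = {factor: 1.0 for factor in factor_scores}
    total = sum(score * weights[factor] for factor, score in factor_scores.items())
    avg = total / sum(weights[factor] for factor in factor_scores)
    if avg >= 4.0:
        return "high"
    if avg >= 2.5:
        return "medium"
    return "low"

# The tier determines audit frequency (years between audits).
AUDIT_FREQUENCY_YEARS = {"high": 1, "medium": 2, "low": 3}

# Example entity scored on a handful of factors (1 = low risk, 5 = high).
entity = {
    "financial_impact": 4,
    "regulatory_exposure": 5,
    "control_maturity": 2,
    "change_since_last_audit": 3,
}

tier = risk_tier(entity)
print(tier, AUDIT_FREQUENCY_YEARS[tier])  # average 3.5 -> medium, every 2 years
```

The criticism in the paragraph above is that this per-entity scoring is expensive to populate: each of those factor scores typically comes from stakeholder interviews or assessments, which is where a top-down approach saves effort.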
The New York Times is reporting a story today about banks slowing their foreclosure process as a result of cutting corners to speed their way through the legal process. It turns out that the foreclosure process requires lots of signatures by the foreclosing entity, and some banks deployed robo-signers: people who signed up to 10,000 documents a month. The problem is that part of what they signed attested that they had personally reviewed all the documentation, which, of course, is not possible.
The result of this realized operational risk is that foreclosures and subsequent sales have slowed, tying up banks’ capital for longer. We have long argued that operational risk is the linchpin of risk management. A linchpin keeps the wheel from sliding off the axle it rides on. The analytic approach used to value credit portfolios is a critical component of the overall risk management process, but the operating risks are no less important, as we read today.
Canada’s oldest and fourth-largest bank has proven that a successful risk management framework can reduce risk exposure, even in the midst of a global economic meltdown. Hamish Lock, head of operational risk at Bank of Montreal and a valued OpenPages customer, is featured on the cover of the November 2009 OpRisk & Compliance Magazine. In a very candid interview, Hamish describes how he took on his new role as head of operational risk just over a year ago in the midst of the financial crisis and has been able to steer his bank through the crisis relatively unscathed.
“The key goal I’ve been trying to achieve,” said Hamish in the interview, “is to increase the transparency and awareness of where operational risk lies within the bank; ensure ownership is clear and to identify it; and to talk about it in a specific and informed fashion.”
He attributes his success to the value the bank places on effective operational risk management. “It is about maintaining and enhancing the overall risk management capability at the firm. We would be doing a lot of these things even if they weren’t requirements. We believe there is value in trying to continue to evolve the way we manage operational risk as better tools are developed and the discipline matures. I’ve been involved directly in operational risk for almost eight years and there has been a huge amount of maturity in that time, in terms of the way we identify, and particularly the ways we assess and measure, the risks. We would want to continue to do that regardless.”
To view the interview in its entirety, click here.
In Observations on Risk Management Practices during the Recent Market Turbulence, the Senior Supervisors Group, which consists of US, UK, Swiss, French and German regulators, took a look at a number of global financial services institutions during the period of recent market turmoil. These institutions included the largest financial services firms in the world. The regulators zeroed in on exposure to the securitization of US subprime mortgage-related credit.
According to the report introduction penned by William Rutledge, Chairman of the NY Fed, "firms that avoided such problems [losses associated with such exposure] demonstrated a comprehensive approach to viewing firm-wide exposures and risk, sharing quantitative and qualitative information more effectively across the firm and engaging in more effective dialog across the management team."
What’s interesting here is that the regulators called out the ability of senior management to share risk information across silos, to discuss how exposures and risks all came together at the top of the business. This is certainly about risk culture, but it’s also about having access to that information so that it can be shared in the first place, which is really a systems problem. Regardless, it’s pretty clear that the days of siloed risk management are going to come to an end. Senior management must look at risk across the business in a more holistic way. It would be overly simplistic to say that Bear Stearns collapsed because of siloed risk management, but for anyone who’s ever read Memos From the Chairman, it’s hard to imagine this happening to a firm once run by Ace Greenberg, who championed a culture that had little tolerance for festering problems.
Over the years, there have been many studies of CEO compensation and risk taking, with the data on outcomes derived from public companies. The latest salvo comes from two professors, one from BYU and the other from Penn State, who have published a new paper in the current issue of The Academy of Management Journal. They studied 950 companies from 1994 through 2000 and found that CEOs who received more than half their compensation from stock options were more likely to undertake risky investments to deliver extreme company performance. The problem is that these investments were more likely to end up in big losses than big wins.
Floyd Norris in the New York Times describes the conclusion that the study’s authors come to: namely, that CEO pay should include deeply in-the-money options and longer holding periods, so that the CEO acts more like a shareholder.
This may or may not be a good idea. What’s clear is that in the studied companies, risk taking was at the discretion of the CEO. Companies apparently could not distinguish between good and bad risks to take, and the decision about whether to take them rested on the shoulders of the CEO. But this need not be the case. Better transparency into the state of risk in the business would have shed more light on these so-called risky decisions. In the current climate of risk management focus, boards complain that they don’t have good visibility into the state of risk in the enterprise, and judging from this study, this lack of visibility is causing poor performance, with companies investing in areas with subpar returns for the chance of a big win. This bodes well for the risk management business.
While many companies have basic elements of a compliance program in place, such as a code of conduct and whistleblower programs, simply having these elements is no substitute for a comprehensive program. In reality, many companies have implemented a “one-off” approach in which procedures often become fragmented, duplicative and outdated over time. For these organizations, the cost of non-compliance can be extraordinarily high, whereas a well-designed, comprehensive compliance program provides numerous efficiencies and can serve as a solid foundation for effective Enterprise Risk Management.
Don’t miss Rick Steinberg, founder and CEO of Steinberg Governance Advisors and Compliance Week columnist, as he outlines steps that companies can take toward achieving a well-designed, comprehensive compliance program. In this informative Webinar, Rick describes a strategic, risk-based approach that supports business objectives and provides an enterprise view of compliance.
The court did take issue with the way PCAOB members could be removed, and ruled that board members could be removed “at will” by the commissioners of the Securities and Exchange Commission. In the majority opinion, Chief Justice Roberts wrote that, despite the unconstitutional tenure provisions, the Act remains “fully operative as a law.”
So what does this mean? Congress clearly tried to insulate the PCAOB from the political whims of the executive office, passing the Act, as it did, during an administration skeptical of regulation. Roberts’ court handed advocates of executive power a victory by ruling that the dual for-cause limitation on the removal of officers is unconstitutional and that the president must have a direct line to remove officers of the government, which the Board members were determined to be.
However, given the current administration’s concern about corporate accountability and the integrity of financial risk reporting in general, it would be very surprising if SEC Chair Mary Schapiro were to exercise her newfound power and replace Board members with appointees more lenient on the accounting firms. And AS5 really took the heat off of corporate America vis-a-vis their auditors anyway; the SEC’s the one that carries the big stick with regard to the integrity of financial controls.
Further, more and more SOX efforts are being rolled into a comprehensive program of managing risks enterprise-wide. Companies are more interested in broadening the application of the approaches, tools and techniques for testing financial controls to their broader control environment. The net here is that the Supreme Court’s ruling will probably have little to no effect on how companies actually manage their risk with respect to financial reporting.
RIMS 2010 kicked off in Boston this week with no signs of an economic slowdown. The Risk & Insurance Management Society Inc. (RIMS) is celebrating its 60th anniversary on the historic Boston waterfront with its annual conference being held at the Boston Convention and Exhibition Center. RIMS includes more than 10,000 risk managers from over 3,500 organizations, ranging from Fortune 500 enterprises to government, nonprofit and service organizations.
The conference, now in its 19th year, includes sessions on Enterprise Risk Management, Loss Control, Finance, Risk Management and Insurance, among others. RIMS president Terry Fleming described the past 10 years as the most important in RIMS’ development: “Risk management as a discipline has been thrust into center stage in the wake of catastrophe after catastrophe, including the global financial meltdown. The need for risk management has been highlighted more than ever before and RIMS has stepped up to the proverbial plate by testifying before congress, identifying new areas of interest in the discipline, creating inroads abroad and crafting the very definition of enterprise risk management.”
Accelerated filers, of course, have long been subject to SOX 404(a), requiring management reporting on the effectiveness of internal control over financial reporting, as well as section (b), where auditor attestation is required. While all have incurred tremendous costs, and some companies have seen little commensurate benefit, others have seen improvements in business process effectiveness, in internal control beyond financial reporting, and in compliance more broadly. Non-accelerated filers, already subject to management reporting, have gained another reprieve from the auditor attestation requirements of section (b). Great news, many are saying. They hail the opportunity to avoid incurring additional costs and taking focus away from running and growing their businesses.
Recently I came across an article in Directors & Boards by a former colleague of mine that offers a different perspective, which in my view is worth considering. His view is that, in addition to the SEC losing credibility (agreeing to another deferral after making clear and definitive statements that no more would be forthcoming), requiring and adhering to section (b) offers benefits beyond the costs, for a number of reasons: (1) smaller companies traditionally have less sophisticated systems and less experienced individuals in management positions, with statistics showing greater incidences of fraud and restatement of financial results; (2) the 404(b) compliance costs have come down with the advent of AS 5 and COSO’s guidance for smaller businesses; (3) studies indicate that companies that are not SOX compliant or have material weaknesses in their internal controls receive a lower valuation, whereas those that are compliant receive higher multiples when sold; (4) these companies are less likely to take advantage of IT solutions that provide enhanced efficiency and management capabilities well beyond better-controlled financial reporting; and (5) CEOs and CFOs who already must certify to the effectiveness of financial reporting controls are on the hook by themselves, without the comfort provided by auditor attestation.
Certainly, these arguments are worth considering by senior management and boards of companies still waiting to see whether and when the 404(b) requirement ultimately will become effective.
A recent industry survey by PTC shows that the highest cost of product compliance failures is not always fines and legal fees, but delayed time to market and product shipments. This is particularly true in manufacturing where restricted substance-based product recalls have cost manufacturers and consumer product companies millions in lost revenue due to compliance failures or supply chain disruptions.
Of course implementing a compliance program has its costs as well. As our recent white paper “The High Cost of Non-Compliance” authored by Rick Steinberg points out, an OCEG Benchmarking Study shows the cost of Sarbanes-Oxley compliance alone averaging:
$4 million for companies with $5 billion revenue
$10 million for companies with $10 billion or more in revenue; and
for companies with more than $1 billion in revenue, compliance costs equaled 190 full-time-equivalent employees.
So, while the cost of implementing a compliance program may seem high, it’s clear that not putting an effective compliance program in place can be significantly more expensive.
The white paper points out several key ways companies have succeeded not only in reducing compliance costs, but also enhancing efficiency and gaining real business benefits:
Built into Business Processes
A Program Founded on Ethics and Integrity
A Risk-Based Approach and Clarity Around Responsibilities
The Treasury is expected to announce this afternoon the long-awaited results of the so-called banking stress tests. They’ve done a good job leaking the key bits of information this week, so the market has had time to adjust. Actually, they’ve done a good job attenuating the whole banking system assessment process, which has allowed time for investor sentiment to improve with the recent glimmers of hope for the economy that some prognosticators are seeing. What will be interesting is to see the parameters of the tests. I’ve read that the tests assume a worst-case scenario of job loss that we’re very likely to exceed, but I’ll wait to see the fine print.
I was very interested in Warren Buffett’s comment over the weekend. Widely reported and noted in the Boston Globe here, Buffett said that the stress tests largely ignored the strength of the bank’s business models. This question of the viability of the business model going forward is an interesting one, and an important aspect of any ongoing business model would be how the risk management procedures will change to avoid similar problems in the future. Raising capital or creating a bad bank or any of the other strategies to deal with toxic assets don’t address the fundamental risk management weakness that got us into this mess in the first place.
I would be interested in seeing an assessment of the banks’ ability to identify and manage risk moving forward. And a key dimension of that capability is their stance on operational risk. Marc Leipoldt, in the recent issue of OpRisk and Compliance (requires login), argues that now is the time for operational risk to assert itself across the business and become "central to the bank’s risk management". Clearly, many of the issues facing banks today were the result of realized operational risks that may have slipped through the cracks in the market or credit risk function. For instance, where would we be if the incidence of mortgage fraud were dramatically lower? According to some banking insiders, at certain players this accounts for a good portion of the bad debt they’re having to account for.
The stress tests will certainly be a good snapshot of where we are, but what about our ability to manage risk going forward?