No doubt you know that the Dodd-Frank Wall Street Reform and Consumer Protection Act has been signed into law, with at least some ramifications for every public company. Space here doesn’t permit an overview, and in any event you’ve probably already received highlights of the new law from one or more advisory firms. Among the more interesting aspects of the new requirements is how the authority of corporate shareholders has risen, in a number of significant ways:
Say on pay: Shareholders now will get to vote on whether they’re satisfied with executive compensation. The same holds for so-called “golden parachutes” related to such transactions as sales or mergers of the company. While these are only non-binding advisory votes, compensation committees and full boards will certainly think twice before continuing with compensation voted down by the company’s owners – owners who also vote on whether sitting directors should be re-elected. As such, we can expect boards to be more receptive to the views of shareholders, especially major ones, on executive compensation programs.
Additional executive compensation disclosures: Public companies also will need to provide more detail about how executive pay relates to the company’s financial performance. Additionally, disclosure will be required of the ratio of the CEO’s total compensation to the median total compensation of all other employees. There’s little doubt that shareholders will be focusing closely on this information and reacting to it in the voting process.
Elimination of broker discretionary voting: Stock exchange rules will now extend beyond the current NYSE rules to prohibit discretionary broker voting in board elections, as well as on executive compensation and other significant matters. Because brokers typically voted in favor of company initiatives, shareholders will have more say in what transpires.
Proxy access: Perhaps most significant, the SEC is authorized to allow shareholders to use proxy materials to nominate their own directors. While we don’t know exactly what the SEC will do in this regard, we can expect that shareholders will have a greater say in who sits in the boardroom.
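The pay-ratio disclosure described above is simple arithmetic: CEO total compensation divided by the median total compensation of all other employees. A minimal sketch, using entirely hypothetical figures for illustration:

```python
from statistics import median

def ceo_pay_ratio(ceo_total_comp, employee_total_comps):
    """Ratio of CEO total compensation to the median total
    compensation of all other employees (Dodd-Frank style)."""
    med = median(employee_total_comps)
    return ceo_total_comp / med

# Hypothetical compensation figures, for illustration only.
ratio = ceo_pay_ratio(12_000_000, [48_000, 52_000, 61_000, 75_000, 90_000])
print(f"CEO pay ratio: {ratio:.0f}:1")  # CEO pay ratio: 197:1
```

The disclosed figure is a single number, but as the sketch suggests, computing the median requires gathering total compensation for the entire workforce, which is where the real compliance burden lies.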
These of course are just some of the elements of the new law, whose impact ultimately will be determined by the numerous studies to be undertaken and regulations to be issued. One thing, however, is clear. Shareholder authority continues to grow, and companies and their boards will continue the trend of opening channels of communication with shareholders.
Last week, the head of the New York Fed, William Dudley, echoed the chorus that is getting louder and louder: our regulatory system needs a complete overhaul to deal with the systemic risk posed by our largest financial services institutions. According to Dudley,
We need a more effective regulatory system. We need a systemic risk authority that has both the responsibility and the powers to look across the entire financial system—both depository institutions and the capital markets.
One issue with the current regulatory system is that it was not designed to support the sharing of information across institutions. It was designed to regulate individual institutions, and as the Fed and other regulatory agencies start to think about how to compare information across institutions, we can expect to see them demanding a greater degree of standardization in terms of how data about risk in the business is collected, managed and shared.
Recently, much has been written about the fate of financial services technology spending given the recent financial crisis. The Wall Street Journal’s Business Technology blog, for instance, points out here that Lehman spent $309 million on technology and communications in the quarter ending August 31. It’s hard to know exactly how much of that spending would be cut under a dramatically reduced operation under Barclays, but clearly, at Lehman and elsewhere tech spending’s going to take a hit in the financial services sector.
However, there is one technology area that will certainly get increased attention, and that is risk management. It’s very likely that 2009 regulation will include greater checks on leverage and an expansion of banking-like regulation to other businesses with banking-like activities. And regulators are already focused on improving the risk management functions of financial services institutions. For instance, WaMu announced on Sept 8th that it had signed an MOU with the Office of Thrift Supervision concerning different areas of the business, including the risk and compliance functions.
Risk management technology, the systems that provide visibility into the state of risk in the business, is a critical component of an early warning system for risk managers trying to run the business. Of course, knowing about the risks is not always sufficient. Just ask David Andrukonis of Freddie Mac, whose CEO apparently ignored the early warning signs of excess risk exposure, according to the New York Times. Nevertheless, having the risk management infrastructure in place at least allows management to make informed decisions about which risks to take or not.
And there’s another driver here for risk management technology. Over time, shareholders, not just regulators, will want better visibility into the risk exposures in a company. The Fed demonstrated that it is willing to let large entities fail (well, sort of), and as such it will be up to the market to assess risk in the business. Management will be encouraged to provide transparency as to the state of risk in the business through a lower cost of capital, a benefit that would dwarf the cost of any risk management technology. Which is why I think spending on risk management technology will not drop as much as the overall market for financial services IT spending.
Is risk management a strategic differentiator? When Toyota shifted the culture to one that valued and rewarded volume production, did it lose sight of quality as a strategic differentiator? Is Kermit the Frog a risk manager?
In the first installment of a multi-part Risk Chat with Eric Krell of the Big Fat Finance Blog, we touched on several such pressing topics. Check out Part One.
Our recent survey on IT risk management published some interesting findings on risk management in the IT function. One of the surprising findings was how many different titles can be responsible for IT risk management. Solvency II and several pieces of draft legislation in the US require that the CRO be responsible for overall risk management. So, there’s clearly a regulatory trend towards consolidating the responsibilities of risk management in the hands of fewer people — the “one throat to choke.” While there may be one throat to choke in the IT organization, finding that throat may be difficult.
Only 40% of organizations said that the CIO was responsible for IT risk management. Others responsible for risk management included the CISO, CRO, CFO, and Head of Enterprise Risk. The surprising finding was that over 25% said “other.” See results here. You can infer that the other category consists of people at the manager/director level, which would mean that 25% of organizations haven’t elevated IT risk management to an executive function.
One of the key issues that IT risk managers face is making sure they are addressing the key risks in the business. We’ve had many conversations with IT risk managers who have a control-oriented view of the world — they see their job as making sure all the controls they manage are effective. The question is whether you have the right controls in place, and you can only answer that question with an understanding of what the business objectives are. Building this bridge between business objectives and controls infrastructure is difficult, and frequently the two are totally disconnected. Given the survey findings on who is responsible for IT risk management, there’s no clear organizational model for how to make this connection.
We’ve discussed in this blog the role of IT in GRC, mostly in terms of how IT manages risk inherent in delivering IT services. But there’s another risk that IT should be addressing, and that is the risk of disparate risk data marts scattered across the enterprise. I’ve written about it here.
An interesting dynamic has emerged around financial services reg reform. Senator Dodd’s proposal includes creating a separate agency for bank oversight, stripping the Fed of that aspect of its current responsibilities. The Fed is attempting to defend its turf, pointing out that it’s very hard to execute well on a monetary policy mandate without the kind of data that bank regulation gives them (see crisis management, Bank of England). Fed Chair Bernanke has been lobbying his case behind the scenes, speaking directly with members of Congress in one-on-one conversations.
The banks, for their part, are apparently lobbying for the status quo, in essence supporting the Fed’s position as they do not want to have to support dealing with additional regulatory oversight from the new agencies that the Dodd plan calls for.
The Obama administration has been moving fast on a lot of fronts, but certainly one of the longstanding legacies will be regulatory reform in the financial services sector. We’ve blogged here before about Hal Scott and the Committee on Capital Markets Regulation (CCMR), and some of those ideas have made their way into the proposals currently being floated by the Obama administration. The CCMR borrowed from the FSA’s model, and while current proposals don’t go that far (and Barney Frank won’t let them), there will likely be some regulatory consolidation in future legislation.
At Compliance Week this morning, SEC Commissioner Luis Aguilar spoke about several different proposals being floated by the Obama Administration. In his keynote, he commented on the systemic risk regulator, the financial product consumer protection agency and the single, consolidated financial services industry regulator.
On the systemic risk regulator, Aguilar pointed out that the system needs to be protected from a failure in function, not (necessarily) from a failure of institutions. Currently, there are two models being discussed, a monolithic regulator (the Fed) and a more diverse council of regulators. Aguilar argues for the latter as better able to identify and mitigate risk through the variety of perspectives from a council-type format. Further, he argued that there could be inherent conflicts between the systemic risk regulator charter and the stewardship of monetary policy.
Elizabeth Warren has argued that you can’t buy a toaster that has a 1 in 5 chance of bursting into flames and burning down your house. Aguilar supported Warren’s notion that consumers of financial products must be protected but believes that there should not be a single regulatory authority over, say, credit cards and mutual funds. The purposes, pricing, and value chains for the products are totally different. He further noted that there’s already a viable entity for investor protection, although in the Q&A he pointed out that the SEC is not self-funded and regulates 35,000 entities with a staff of 3,600 (vs. the FDIC, for instance, which regulates 5,100 banks with a staff of 5,000).
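The staffing figures Aguilar cited make his point starker when reduced to a per-staffer ratio; a quick back-of-the-envelope calculation:

```python
# Regulated entities per staff member, from the figures Aguilar cited.
sec_ratio = 35_000 / 3_600    # SEC: 35,000 entities, 3,600 staff
fdic_ratio = 5_100 / 5_000    # FDIC: 5,100 banks, 5,000 staff

print(f"SEC:  ~{sec_ratio:.1f} entities per staffer")   # ~9.7
print(f"FDIC: ~{fdic_ratio:.1f} entities per staffer")  # ~1.0
```

Roughly ten regulated entities per SEC staffer versus one bank per FDIC staffer, which underscores the funding point Aguilar raised in the Q&A.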
Finally, regarding the proposal for a single, consolidated financial services industry regulator, Aguilar said that he “would be very concerned that a single regulator could result in a loss of investor protection.” For instance, he saw a conflict in full-disclosure if the same entity were charged with regulating institutions and financial products for investment purposes. He was supportive of an integrated capital markets regulator that might be a combination of the SEC, CFTC and parts of the Department of Labor (Employee Benefits Security Administration).
What was most remarkable about Aguilar’s comments was not the proposals themselves as much as the lack of clarity around how this will play out. Stay tuned.
The ERM Initiative at North Carolina State University was commissioned separately by the American Institute of CPAs (AICPA) and the Chartered Institute of Management Accountants (CIMA) to conduct surveys of their respective members on the state of enterprise risk oversight. While the AICPA survey focused on US companies and the CIMA survey on global companies, not surprisingly, respondents in all regions agreed in a new study titled ‘Enterprise Risk Oversight, A Global Analysis,’ that the volume and complexity of risks are increasing and that the need for increased risk oversight is being driven by senior executives and board members. Of greater concern, however, is the number of respondents who feel that their risk oversight processes are immature. In the US, 84% of respondents rated their risk oversight processes as either ‘very immature’ or ‘only moderately mature.’ The study found that ‘46% of global respondents describe their risk oversight process as systematic, robust, and repeatable in contrast to 11% of U.S. respondents who believe they have a complete enterprise-wide risk management process in place.’
With recent disclosure rulings from the SEC including the board’s role in risk oversight and Dodd-Frank rulemaking on its way in which ‘risk committees’ will be required, companies rating their risk oversight processes as immature should begin preparations now. If you’re considering where to start, begin with the design goal of delivering an integrated and automated risk and compliance framework. A siloed approach limits an organization’s ability to streamline risk and compliance processes and reduce costs. It also limits your ability to gain a comprehensive view of the firm’s risk exposure.
You may be as amazed as I in continuing to encounter intelligent, accomplished business people who still don’t understand what Sarbanes-Oxley’s internal control requirements are about. Let me share a recent experience.
I’ve been working with a large multi-national company’s board of directors to identify shortcomings in corporate governance and enhance practices and performance. This has involved spending some time with each of the directors individually to get to know how they approach their board roles and are carrying out their responsibilities. Of particular interest is a highly educated, nationally known and well-respected business advisor, with whom I got into a discussion involving the board’s role in overseeing the company’s risk management.
His message was that since the company already complies with SOX 404, including auditor attestation, risk management is well addressed in the organization. There’s no need, he said, for the board to do much more in that area. Working hard to contain my disbelief, I asked whether he had considered that the SOX 404 rule focuses only on internal control over financial reporting, and while there is a risk identification/analysis element therein, it does not expand beyond financial reporting. After he reiterated his position, I explained, as tactfully as possible, that the company’s and auditor’s compliance with 404 provides little if any comfort regarding strategic, operational, or other business objectives and their related risks.
Interestingly, we’ve also seen numerous instances where CEOs truly believe their companies already have enterprise risk management processes in place when reality is that they have elements of risk assessment performed ad hoc in pockets within their organizations.
For anyone looking to encourage their company’s boards or senior managements to consider establishing a disciplined and effective risk management process, it’s important to be sure there is no misconception about what is – or is not – already in place. Too often misconceptions exist, and they must be dealt with in order to move forward with a constructive development plan.
There’s been a good deal of discussion recently about organizational location and reporting lines for a company’s compliance function. Some are stand alone, though many are embedded within the legal department, with concern of legal privilege among the considerations. Some report to the CEO, though for many others the reporting line is to another senior executive. And to further complicate matters, some compliance functions also have responsibility for ethics, with some being asked to take on even greater responsibility.
Certainly there are pros and cons to each organizational structure. What I’d like to focus on here is the critical relevance of a few key factors. One is to be sure a chief compliance officer, wherever he or she appears on the company organization chart, has the ability to bring relevant information directly to the chief executive and where necessary the board of directors. Depending on the nature of identified non-compliance events or associated risks, such access is essential. Also relevant are the recent amendments to the U.S. Sentencing Guidelines, which call for the compliance officer to report regularly to upper management and the board of directors or audit committee.
Another key factor is clarity around the compliance office’s scope of responsibility. Is it responsible for establishing a process for effecting compliance with all relevant laws and regulations to which the company is subject? That’s a good start. Does the scope include compliance with internal policies? That’s typically the case as well, and makes sense. But do the CEO and board think the compliance office can possibly ensure compliance? You and I know it can’t – the compliance function needs to focus on processes and protocols, with direct responsibility for effecting compliance resting with line and staff unit leadership. Clarity around responsibility is essential. Amazingly, some company boards are looking to the compliance function to also take on responsibility for enterprise risk management! Fortunately, chief compliance officers have fought the attempt, for good reason.
And another factor is the compliance function’s relationships with the legal and ethics functions, if separate. Certainly compliance processes must adequately reflect the legal and regulatory realities, and we know there often is a fine line between – and sometimes a forerunner or impetus for – unethical behavior crossing over to illegality. So clearly there must be close coordination to ensure information flows, policies, procedures and reporting mechanisms are in sync.
Of course each company needs to determine organization, reporting and responsibility for compliance to fit its own culture, management style and personnel. Getting this right will serve your organization well.
OPUS 2010 wraps up today. We closed the conference with lively roundtable discussions that covered risk management, risk assessment, risk tolerance and more. Great knowledge sharing going on!
An “Ask the engineers” discussion gave our customers a chance to talk to our product visionaries and get some additional tips, as well as special insights into our product vision for the next year.
We thank all our customers and partners for participating this year. You helped make it great!
Revised reporting of stock and option awards to company executives and directors in the Summary Compensation Table
Potential conflicts of interests of compensation consultants
What might not be entirely self-evident is when they take effect. Help is provided by PricewaterhouseCoopers, which issued an advisory highlighting the timing for these new disclosure requirements, as follows:
The effective date of the new rules was February 28, 2010. Accordingly, the Form 10-K and proxy statement of a calendar year company must be in compliance with the new disclosure requirements if filed on or after February 28, 2010. If a calendar year-end company files its proxy statement on or after February 28, 2010, the proxy statement must comply with the new disclosure requirements. This is true even if the 2009 Form 10-K was filed before February 28, 2010.
An existing SEC registrant with a 2009 fiscal year that ended before December 20, 2009 is not required to comply with the Regulation S-K amendments until it files its Form 10-K for fiscal year 2010. As a result, any registration statements filed before its 2010 Form 10-K is required to be filed would not be subject to the new Regulation S-K amendments. A company may early adopt the new disclosure provisions; however, if the company elects to voluntarily comply with the disclosure changes regarding stock and option awards, it must also comply with all the other applicable Regulation S-K amendments.
If a new registrant (e.g., a company completing an IPO or a registration statement on Form 10) first files its registration statement on or after December 20, 2009, compliance with the Regulation S-K amendments would be required for such registration statement to be declared effective on or after February 28, 2010.
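The PwC timing guidance above reduces to a few date comparisons. A rough decision sketch, simplifying somewhat (it keys new-registrant compliance off the filing/effectiveness date rather than modeling the registration process in full):

```python
from datetime import date

EFFECTIVE = date(2010, 2, 28)    # effective date of the new rules
FY_CUTOFF = date(2009, 12, 20)   # fiscal-year-end cutoff for existing registrants

def must_comply(filing_date, fiscal_year_end=None, new_registrant=False):
    """Approximate the timing rules described in the PwC advisory.

    filing_date     -- date the Form 10-K / proxy / registration statement is filed
    fiscal_year_end -- for existing registrants, end of the fiscal year covered
    new_registrant  -- True for an IPO or Form 10 registration statement
    """
    if filing_date < EFFECTIVE:
        return False   # filed before the rules took effect
    if new_registrant:
        return True    # effective on or after February 28, 2010
    if fiscal_year_end is not None and fiscal_year_end < FY_CUTOFF:
        return False   # not required until the fiscal-2010 Form 10-K
    return True

# A calendar-year company filing its proxy on March 15, 2010 must comply.
print(must_comply(date(2010, 3, 15), fiscal_year_end=date(2009, 12, 31)))  # True
```

This is only an illustration of the logic, not legal guidance; the early-adoption option (voluntary compliance triggering all other applicable amendments) is deliberately left out of the sketch.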
How effective is your organization at identifying and managing IT risks? Does your organization think of IT risk only in terms of avoidance or compliance, or does it use risk management to improve the effectiveness and value of IT?
If you’ll complete this short, 5-minute survey on IT risk management, we’ll send you a complimentary copy of the final report so you can compare your organization’s IT risk maturity to your peers.
In the Compliance Week 2010 panel Honest Experience with GRC Tools, Joann Sochor, VP Corporate Compliance at the Bank of Montreal Financial Group, spoke about their experience with OpenPages. See http://bit.ly/bMjFbl for slides that describe the scope of the implementation: 40 different data marts and over 5,000 controls consolidated onto a single technology platform.
Mary Tuuk, EVP and CRO of Fifth Third Bank, spoke at Friday morning’s general session. US Banker Magazine has named her as one of the top 25 women to watch in banking, and she gave an interesting talk on the CRO perspective, “Leveraging Risk Management for Strategic Advantage.”
After offering a historical perspective on the recent financial crisis, she discussed some of the lessons learned. First, risk management was too siloed from the rest of the institution. In many cases, she said, risk management ran stress tests and the like, but tended to be isolated from business decisions. Also, institutions failed to recognize the correlation of risks across domains: credit risk turned to liquidity risk, which turned to reputational risk, which exacerbated the liquidity risk and led to operational risk as the bank had to make new kinds of decisions. She also commented that the issue of risk culture is important: how are decisions really being made?
She showed an interesting graphic of how risk management can be a strategic advantage. She defines Economic Net Income as accounting net income – expected loss – cost of capital. In this way, risk management can help risk-adjust earnings for the expected loss of, say, a particular client relationship. Her graphic showed how companies that don’t do this kind of evaluation will be stuck working with the customers that those who do have turned down.
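Tuuk’s Economic Net Income definition is a straightforward subtraction; a minimal sketch with hypothetical numbers shows why a relationship can look profitable on an accounting basis but destroy value once risk is charged:

```python
def economic_net_income(accounting_net_income, expected_loss, cost_of_capital):
    """Risk-adjusted earnings as Tuuk defines them:
    accounting net income minus expected loss minus cost of capital."""
    return accounting_net_income - expected_loss - cost_of_capital

# Hypothetical client relationship: positive accounting income,
# negative economic income once expected loss and capital costs are charged.
eni = economic_net_income(accounting_net_income=1_000_000,
                          expected_loss=600_000,
                          cost_of_capital=500_000)
print(eni)  # -100000
```

A bank running this calculation would decline the relationship; a bank that doesn’t would happily book the $1M of accounting income, which is exactly the adverse-selection dynamic her graphic illustrated.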
She ended her discussion on how convergence drives advantage. She talked about five areas of convergence: integrated governance (transparent decision-making); risk identification, aggregated measurement and monitoring; defined appetites; stress testing; and, risk culture. She gave some examples of how to assess risk culture:
How is risk management viewed? As the police or business partners?
How is bad news treated? Are people willing to share bad news?
Do back office personnel and others feel empowered to raise issues?
ISACA announced the release of Risk IT today – an IT-related risk framework based on COBIT that provides a comprehensive view of the business risks associated with IT initiatives. In development for some time, it is the combined effort of a five-country task force: thousands of hours of work from a team of IT and business experts, 60 expert reviewers, and over 1,600 comments from around the world and across industries. Based on a set of guiding principles for effective management of IT risk, Risk IT provides a framework for enterprises to identify, govern and manage IT risk. Whereas COBIT provides the set of controls to mitigate risk, Risk IT enables an enterprise to understand and manage all significant IT risk types and allows the enterprise to make appropriate risk-aware decisions.
To learn more or to download the framework, click here.
As a follow-on to my previous post about the survey conducted at OPEN, we also learned something about companies’ GRC efforts.
Almost 90% said that their GRC spending would either increase or stay the same over the next year. During a time when IT spending overall is dropping, it’s important to note that spending in the risk management sector is holding up. We’ve blogged about this before, but we keep getting additional data that all point to the same conclusion: companies are not cutting back on risk management spending.
The answer to the next question may provide some insight as to why. We asked how companies would characterize the current state of their GRC management efforts: siloed, converged or coordinated. 73% said siloed, 27% coordinated. This mirrors almost exactly the responses from October 2008, which suggests that the road to convergence is not a short one.
If you’re in Hong Kong this week for the 36th annual Sibos, don’t miss OpenPages – a featured business partner in the IBM booth.
Facilitated and organized by OpenPages customer SWIFT, for the SWIFT community, Sibos brings together the financial industry in a unique forum to meet, discuss, learn and keep in touch with what is going on in the industry.
Recently, there have been a few interesting articles about how other countries approach risk management. Last week, Dow Jones reported that Brazilian banks will be required to establish and maintain internal risk management structures, assuring regulators that there is some standardization of the risk management function across financial services institutions. Earlier this week, the Financial Times reported on the risk management governance practices in Spain. In Spanish banks, there’s a practice of non-executive directors sitting on a risk committee that advises management on their risk management practices. These are no paper tigers: apparently, the committee at BBVA met 102 times last year.
Regardless of whether or not these particular ideas make sense for other countries, they definitely provide some concrete examples of steps other countries could take. And they illustrate a real focus on the problem of risk management. That might be a good idea for the US. The long-awaited stress test results came out this afternoon in the US. A quick search of the document produced exactly zero hits on the term "risk management".
SearchSecurity has coverage from RSA about a new version of the PCI Data Security Standard, due out sometime in Q3 of this year. It appears they’re taking a pragmatic approach, and indications are that it will be an evolution based on user feedback rather than a drastic, revolutionary change. PCI has been a sensitive topic, and the general consensus from practitioners is that it doesn’t really help prevent data breaches in and of itself. What it DOES do, however, is provide a stick to use to get your organization to fund information security and IT risk management gaps.
If we learned anything from SOX, it’s that managing any non-trivial set of risks and controls in spreadsheets, Word documents, word of mouth and prayer is a recipe for failure. PCI, in any incarnation, is no different.
US Rep and House Financial Services Committee Chair Barney Frank gave the opening keynote at Compliance Week 2010, day 2. As usual, he was witty and insightful. His remarks covered the conceptual underpinnings of financial services regulatory reform. He then took questions from the group.
He started out by saying that we needed to move quickly to provide stability to the financial system. Healthcare created a delay, but they are now on track.
To those who are cynical about government and think that “big money” runs politics, he said that the bills are the “defining counter example” of a bill that passed despite big money lobbying.
He noted that once the House passed their bill there was an assumption that the Senate would pass a watered-down version, but the opposite happened – because the public was paying attention, it forced the Senate to pass a strong bill, the implication being that we should all be more vigilant about the process on Capitol Hill.
The bill should be passed before July 4, which is important for stability.
The outlines of the bill were described by Paulson in March of 2008, when he called for a way to dissolve non-bank financial institutions. As Frank put it, Palin’s “Death Panels” were discussed in the context of the wrong bill!
This bill will require that all financial services institutions report their financial transactions to some regulator. If an entity becomes problematic, then the regulator can take action. The regulators will also have a mechanism to require enough capital for these entities to stay solvent. As has been commented on the Baseline Scenario at http://bit.ly/bDddch, though, the amount of capital that would be required has not been defined, potentially to allow for alignment with rules in other countries.
Frank said that the real “problem was non-regulation”, pointing out that we did not have rules for credit default swaps, for instance. During the Q&A period, he used derivatives as another example of non-regulation. He said that under the bills, derivative transactions will have to be reported.
Matt Kelly, Compliance Week Editor, asked a question about international coordination. Frank pointed out that “nothing in the world is more mobile than capital” and that we should not legislate unilaterally without coordination with other countries.
Companies with market caps less than $70 million will likely be exempted from 404.
Addressing the concern of “unintended consequences,” Frank said that it was not an unintended consequence that companies may not be able to make as much money trading derivatives, as his vision for the financial services sector is that it exists to enable investment activity to grow the economy.
When asked why the regulatory reform bill is so broad, he pointed out that many of these issues are interrelated, concluding that “the ankle bone is [ultimately] connected to the shoulder bone.”
If you’re involved in developing, enhancing or monitoring your company’s risk management activities, you probably know that “risk” and associated terms are used very differently by different people. This too often is the case throughout an organization, right up to the board level. Indeed, experience shows that senior managements and boards think they’re talking the same language, when they are not.
How often have you heard the terms “risk assessment,” “risk management,” and “enterprise risk management” used almost interchangeably? If your experience is anything like mine, it happens all the time. My sense is that busy executives and directors understand the basic concept of risk and don’t take the time to get into what are perceived to be details in terminology. The resulting problem, however, is that we talk at cross purposes and misunderstandings abound. Risk related professionals know well that a risk assessment is a point-in-time snapshot of risks in an organization, risk management includes a number of activities in identifying, analyzing and managing risk, and enterprise risk management raises the bar to a still higher level.
A fundamental issue is that too often top managements and boards believe their organizations have in place effective enterprise risk management processes when in fact they don’t. They know the words, and truly believe they deal with risk as well as any organization. They believe their senior management team focuses on risk and drives risk management throughout the organization. And what we’ve often found is that they are wrong.
It is not a simple task to change the minds of high powered CEOs and directors. And one wonders whether it’s worth one’s political capital to push this issue. But this is so important a matter that to know there’s misunderstanding and allow it to continue is dangerous – for top management, the board, the company, and all of its people.
OPUS 2010 keynote speaker, independent financial fraud investigator and Madoff whistleblower Harry Markopolos will release his exclusive story “No One Would Listen: A True Financial Thriller” on March 2.
The book, which will be made available to all OPUS 2010 attendees, describes how he and his team, “The Fox Hounds,” investigated Madoff and presented their case to the SEC on numerous occasions, years before Madoff turned himself in on December 11, 2008 (approximately $65 billion later).
From May 2000 to December 2008, Markopolos and his team submitted five separate and detailed warnings to the Securities and Exchange Commission (SEC) about Madoff’s operations in an effort to launch an investigation on the validity of his practices.
During the OPUS keynote address, Markopolos will detail how his four-person investigative team tracked Madoff and the Madoff Feeder Funds throughout Europe and North America and repeatedly submitted detailed reports to the SEC.
If you’re an OpenPages customer and would like to hear Mr. Markopolos discuss the red flags, warning signs and the critical audit steps that companies need to be aware of to prevent similar events from occurring in the future, register for OPUS 2010 and receive a complimentary copy of his new book. It’s promising to be a “Thriller”!
Have you been involved in identifying “best practices” in a particular industry, sector, process or function, to shape and enhance your company’s governance, risk management, compliance or other activities? I know I have, as have many of my clients. It’s tough to argue with what are deemed best practices, since by definition “best” must be the ultimate. And the same goes for such terms as “leading” or “leading edge” practices. I wrote about this recently in a compliance journal, and would like to share some of my observations here.
As you may have already learned, experience shows that so called best practices often are no more than common practices, developed from surveys or information about peers, which might or might not be truly effective in the companies using them or elsewhere. And blindly following what others claim to be successful can lead to trouble.
Consider CEO compensation, for example. Many board compensation committees have followed what was called a “best” practice but in reality was merely a common way of using comparative peer data in determining CEO compensation. Among the results was the Lake Wobegon effect, where every CEO had to be above average. The thought was that if a board was doing its job in selecting and retaining the company’s CEO, then the CEO must by definition be “above average.” Among the unintended consequences in financial institutions and some other organizations was extreme risk taking to drive short-term reported performance, which in turn drove CEO compensation.
Well, while peer comparison done with the right peer group can be a useful tool, truly effective boards directly link compensation to the company’s strategic plan. Relevant performance metrics motivate not short-term revenue but long-term return and shareholder value. While reflecting marketplace realities, compensation is geared to achievement of specified performance measures, aligned with board-approved risk appetites. And change-of-control and other severance arrangements are well thought out and tested in advance to avoid the kinds of outlandish payments we’ve seen all too often.
Indiscriminately following the herd in designing business processes and risk management and compliance systems can result in similarly unfortunate outcomes. It’s important to keep in mind that carefully and thoughtfully structuring processes in the context of your company’s objectives, culture, strategy, and risks will pay long-term dividends. Yes, it takes more effort, but nothing less will suffice, especially in this economic and competitive environment.
In February, British banker and former chairman of Morgan Stanley International Sir David Walker was appointed to lead a government inquiry into corporate governance in the banking sector. This week, he published the Walker Review, which recommends overhauling the boards of banks and other big financial institutions by strengthening the role of non-executives and giving them new responsibilities to monitor risk and remuneration.
“We need to get governance back to centre stage,” said Walker in a statement regarding the report. “The fundamental change needed is to make the boardroom a more challenging environment than it has often been in the past. This requires non-executives able to devote sufficient time to the role in order to assess risk and ask tough questions about strategy.”
Some of the specific recommendations in the Walker Review include:
Banks should have board-level risk committees chaired by a non-executive director
Risk committees to scrutinise and if necessary block big transactions
Chief Risk Officer to have reporting line to risk committee
Chief Risk Officer can only be sacked with agreement of board
The Walker Review proposes that most of the recommendations are enforced through inclusion in the Combined Code on Corporate Governance or a separate Stewardship Code for institutional investors, both operating on a ‘comply or explain’ basis.
It is clear that risk management will be under increasing scrutiny in the UK (and across the globe), and that the risk function will be increasingly important. To keep up with new regulation, companies will have to invest in systems to support the risk information sharing that such changes imply.
A recent article by Steven Minsky of ebiz highlights a very relevant point that many GRC industry participants struggle with: “What are the differences between ERM and GRC systems?” I strongly agree with Steven’s assertions that true Enterprise Risk Management is needed to “reach front line management and monitor risk management effectiveness,” yet at OpenPages we view ERM as a solution that facilitates effective governance, risk and compliance management. Most companies are looking to evolve and establish a governance, risk and compliance strategy, and they are using ERM solutions to deliver an enterprise-wide view of risk exposure, controls and performance. Therefore, we view ERM as a facilitator for your GRC initiative, not as a separate solution.
OpenPages has received a bronze award in the online video category as part of the 30th annual Telly Awards, a program that honors local, regional, and cable TV commercials and programs, video and film productions, and online film and video. There were 13,000 entries, so it was quite an honor to be recognized. Many thanks go to JCSI and SmartMarket Media for their creativity! To learn more about the award, take a closer look at our press release.
Take a look at our award-winning video. And if you’re smart, check out our careers page too!
Some of you will remember the keynote by Professor Hal Scott from last year’s OPUS. Professor Scott is a professor at the Harvard Law School and is also the Director of the Committee on Capital Markets Regulation, an entity supported by Treasury Secretary Paulson. In 2006, the Committee issued an interim report on the competitiveness of the US equity markets. One of the interesting recommendations, which Professor Scott discussed during his OPUS keynote, is a consolidation of the US financial regulatory structure. In general, their point is that the regulatory structure is too cumbersome and burdensome to be effective. For instance, it might make sense to combine the SEC and the CFTC into a single entity as they’re both market regulators. You can read the report here.
Earlier this year, Secretary Paulson proposed a blueprint that would modernize our financial regulations. For example, the Federal Reserve would be authorized to take a closer look at the operations of companies across the financial spectrum and ensure that their practices do not threaten overall financial stability. There are other good ideas, and members of Congress should consider them. As they do, they must ensure that efforts to regulate Wall Street do not end up hampering our economy’s ability to grow.
Clearly, this is an effort to shape the upcoming debate around how to make sure we don’t repeat the same mistakes we’ve made to get us in the current financial crisis. Hopefully, the debate will last longer than a week.
OPUS 2010 is approaching fast and we’ve got a great line-up on tap beginning with keynote speaker Harry Markopolos – the lead investigator that helped uncover the infamous $65 billion Bernie Madoff Ponzi scheme.
Profiled on 60 Minutes and in The Wall Street Journal, Markopolos, a Chartered Financial Analyst (CFA) and Certified Fraud Examiner, diligently pursued the truth in the numbers behind Bernie Madoff’s unbelievably huge profits for over ten years. Figuring out the Madoff fraud before anyone else, Markopolos waved red flags and delivered detailed documentation to the Securities and Exchange Commission (SEC) in 2000, 2001, 2005, 2007 and again in 2008.
“It took me five minutes to know that it was a fraud. It took me another almost four hours of mathematical modeling to prove that it was a fraud.” Repeatedly ignored by the SEC, and relying on his own dogged determination and a small, tightly knit team of trusted allies, he finally overcame the agency’s indifference and broke the scandal to the public.
During the OPUS keynote address, Markopolos will detail how his four-person investigative team tracked Madoff and the Madoff Feeder Funds throughout Europe and North America and repeatedly submitted detailed reports to the SEC. He will discuss the red flags, warning signs and the critical audit steps that companies need to be aware of to prevent similar events from occurring in the future.
Operational risk is a critical component of an enterprise risk management strategy. For European insurers, Solvency II will require a more holistic approach to managing and reporting on risk within the business, as noted here on the OpRisk and Compliance blog.
We’re hosting the Executive ERM Forum at PwC’s NYC office today. Twenty enterprise risk executives are gathered to discuss current topics in enterprise risk management such as regulatory reform, emerging risks, risk reporting and quantification, convergence, and GRC implementation.
Executives gathered represent a variety of industries, including banking, insurance, consumer products, travel and entertainment, and telecommunications.
At the outset, executives discussed the key issues around risk management in their business today:
Companies are very good at operational reporting, but not as good at risk reporting.
How do we improve the quality and consistency of risk reporting?
What is the right amount of information to provide the board?
How do you quantify your risk reporting?
What are the organizational, cultural and process issues associated with implementing a GRC solution?
What role should IT play in defining, architecting and managing the risk management function?
How do we as risk managers enable our business managers to make better risk decisions?
How should the risk management and audit functions collaborate?
The discussion started off on risk identification. The moderator articulated the problem as board members saying that they’re not seeing the right risks while management’s struggling to present succinctly a huge amount of information related to risk. One participant pointed out that it takes a long time to say something short.
Here are several of the key takeaways so far:
Developing the Initial Set of Risks
A couple companies talked about building up a set of risks accretively over the years, adding and deleting risks from the prior year based on the current year’s risk environment. Initially, the risks can be identified through brainstorming and/or process owners and the risks they manage.
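A register maintained this way amounts to a year-over-year diff: start from the prior year's set, add what this year's brainstorming surfaces, and retire what is no longer material. A minimal sketch of that mechanic (the risk names, years, and `roll_forward` helper are hypothetical examples, not drawn from any participant's program):

```python
# Minimal sketch of an accretive risk register: each year's set is built
# from the prior year's, plus newly identified risks, minus retired ones.
# All risk names below are invented for illustration.

def roll_forward(prior_year, added, retired):
    """Build the current year's risk set from last year's set."""
    return (prior_year | added) - retired

risks_2009 = {"fx exposure", "vendor concentration", "data privacy"}

risks_2010 = roll_forward(
    risks_2009,
    added={"sovereign debt", "cloud outsourcing"},  # from this year's brainstorming
    retired={"vendor concentration"},               # no longer material this year
)

print(sorted(risks_2010))
```

In practice the interesting work is in the two input sets, which come from brainstorming sessions and from process owners reviewing the risks they manage; the roll-forward itself is trivial.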
Express Risks in Terms the Board can Understand
One insurance executive brought up the point that when reporting to the board, identified risks have to be expressed in terms that the board can understand relative to their notions of risk tolerance, e.g. impact on earnings per share. The board owns risk, but risk managers have to help board members understand the risk in the business. So what information does the board need? Managers need to report within the context of tolerance, something as simple as red, yellow, green. Companies need to be careful that reported risks don’t get “greener” as they move up the reporting chain to the board.
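One way to make tolerance-based reporting concrete is to score each quantified risk against a board-approved tolerance and report the resulting color alongside the underlying number. A hedged sketch, assuming impact is expressed as a percentage of EPS at risk and that the thresholds are illustrative rather than prescribed:

```python
# Hypothetical sketch: translate a quantified risk impact (here, percent
# of EPS at risk) into the red/yellow/green language the board uses.
# The 80% warning band and the example figures are assumptions.

def rag_status(impact_pct_eps, tolerance_pct_eps):
    """Color a risk relative to a board-approved tolerance."""
    if impact_pct_eps > tolerance_pct_eps:
        return "red"     # outside tolerance: board attention required
    if impact_pct_eps > 0.8 * tolerance_pct_eps:
        return "yellow"  # approaching tolerance
    return "green"       # comfortably within tolerance

# Reporting the number alongside the color guards against risks quietly
# getting "greener" as they move up the reporting chain.
for name, impact in [("credit concentration", 6.0), ("IT outage", 4.5), ("FX", 1.0)]:
    print(name, impact, rag_status(impact, tolerance_pct_eps=5.0))
```

The design point is that the color is derived, not asserted: as long as each layer of reporting carries the impact figure forward, no intermediate manager can soften a red to a yellow without the discrepancy being visible.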
Pictures and Problems
One financial services executive described his company’s risk reporting as “Pictures and Problems”: what is the picture of the overall risk profile, and where are the problems (expressed in terms of risk tolerance)? This gives the board both qualitative and quantitative ways to think about risk exposure.
What Does the Board Actually Believe?
The discussion turned to what the board actually believes. A couple executives noted that board members are very skeptical of the traditional bottom-up roll-up of risk, and participants agreed that this process results in a high degree of inaccuracy. One executive described a process of assessing risk at a mid-level; risk is then quantified only around those areas of concern, so quantification happens at the risk level, not at any aggregate level. An insurance executive pointed out that companies are beginning to think about “notional exposures” – the absolute value of the worst thing happening.
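As a rough illustration of the "notional exposure" idea (all figures and risk areas are invented for the example): instead of rolling estimates up into one number, take the absolute value of the worst plausible loss in each flagged area and look at those figures side by side.

```python
# Hypothetical sketch of "notional exposures": the absolute value of the
# worst-case loss per area of concern, assessed only where a mid-level
# review flagged a problem. Figures in $ millions, invented for illustration.

worst_case_loss = {
    "securitized mortgages": -850,
    "counterparty default": -320,
    "rogue trading": -150,
}

# Notional exposure = |worst case| for each area, kept at the risk level.
notional = {area: abs(loss) for area, loss in worst_case_loss.items()}

# Per the discussion, no grand total: summing these would imply exactly
# the kind of aggregate roll-up the participants distrusted.
for area, exposure in sorted(notional.items(), key=lambda kv: -kv[1]):
    print(f"{area}: ${exposure}M notional exposure")
```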
Over the last couple weeks, OpenPages has participated in three different conferences on risk and compliance. We sponsored the Global Conference on Operational Risk put on by the RMA and ORX, Gartner’s Risk Management and Compliance event in Chicago, and the RIMS show in Orlando. At each event, many people were asking about the role of risk management moving forward. In particular, as companies adjust to a new reality of risk management oversight, particularly in the financial services arena, many are rethinking how the different risk disciplines relate to one another.
Rene Stulz, in “Six Ways Companies Mismanage Risk,” published in the March issue of the Harvard Business Review, notes that risk managers often distinguish between market, credit and operational risks, which they measure separately and in isolation. However, Stulz points out that "when you put risks into a box, you’ve ignored the fact that business units strongly identified with a particular class of risk may be exposed to risks of other types that are associated with other units." Stulz goes on to point out that the collapse of the securitized mortgage market led not just to realized market risk for banks but also to a very serious business risk associated with the drop in revenue from lost fee income.
As more and more examples emerge of how risks cut across organizational silos, companies will more seriously consider a holistic approach to risk management. Part of the inertia has historical and cultural underpinnings, but to emerge from the current crisis with a revamped risk management program – one that would help avoid another financial crisis of this type – companies will need to get the different risk management silos to work more closely together. More on that later.
Reported everywhere but summarized on the White House blog, Obama laid out seven core principles for the new financial services regulation he and Congress will be pushing over the next months. Of particular interest to risk managers are his principles on openness and transparency, as well as his call for the new regulatory system to be comprehensive and free of gaps.
We’ve been advocating for (and provide software for!) greater transparency for risk in the business. What’s important to note here is that to do so in any realistic, pragmatic way will require industry standards for risk reporting. Interestingly, that is the topic of today’s IBM Data Governance Council meeting on risk profile reporting. Industry experts on XBRL and risk reporting will be gathering in New York to discuss how industry can leverage existing models for risk information sharing (e.g. ORX) and the rising XBRL standard. Clearly, the government is going to provide a regulatory incentive for better risk reporting; it will be up to us to shape how that works in the real world.
Also, with regard to the regulatory framework principle, we are starting to see that regulatory agency consolidation may be an option after all. With the SEC weakened and derivative oversight totally lacking, it’s not impossible to imagine the CFTC and its skills in derivative oversight becoming a much larger part of the solution to the regulatory problem. There’s also the interesting point that Obama is from Chicago, where the CFTC has a strong presence and great sway over business and regulatory thinking. Having lived in Chicago for four years working for a CBOT-registered firm, I would not underestimate the influence of this perspective, especially given the SEC’s recent failures.
Of further note, we can imagine that an optional federal charter for insurance companies is dead on arrival. Clearly, Obama was saying no to regulatory arbitrage, something insurance commissioners throughout the US are concerned about.
A recent report from Forrester Research found that 46% of GRC inquiries were aimed at “understanding how to improve their compliance program.” Many asked if compliance should be part of a strategic plan or just a tactical component. If you tuned into the recent OpenPages Webinar with Aviva/Norwich Union’s David Fisher you heard how risk management practices at Aviva are being applied to financial controls to accomplish compliance goals while also improving operational control environments.
If you’d like to learn more about how to manage requirements of specific regulations or standards while understanding broader, strategic compliance programs that are responsible for all regulatory areas, be sure to check out the OpenPages Webinar with Chris McClean, Forrester Analyst, and Julian Parkin, Group Privacy Programme Director at Barclays. Chris and Julian provide a unique perspective on how companies can leverage common assessment processes and technology infrastructure to lower overall costs and take a strategic approach to managing compliance. To register for this Webinar, please visit Webinar Registration.
Also stay tuned for additional upcoming OpenPages Webinars, including:
Carnival Corporation Case Study Webinar: Leveraging the Power of Integrated Risk Management
Presented by Richard Brilliant – Vice President and Chief Audit Executive of Audit Services, Carnival Corporation & plc
How can risk management help restore confidence and trust in financial institutions and the stock market in particular?
Robert Shiller of Yale University believes that we need a new information infrastructure that provides comprehensive financial advice for everyone. He compared receiving professional financial advice to how most people have access to professional medical advice today. Imagine, for example, that if you got sick you had to go to a major drug company and ask them what to do and their advice would always be centered around their products, even if they knew a competitor had a drug that would be just right for you. For the majority of people, this is the situation we are in today with respect to financial advice and Shiller believes this needs to change even if it requires subsidizing financial professionals. Shiller also discussed ways to help improve the housing crisis where more than 12 million homes are now under water (mortgage-wise). He suggested that we need improved retail products such as home equity insurance and continuous workout mortgages that would adjust mortgage balances as housing prices decline.
Zanny Minton Beddoes, Global Economics Editor at The Economist, gave her opinion on how we get rid of the inevitable headaches we are experiencing after moving from bubble to hangover, where assets went bust, greed changed to fear, and thrift is foremost in everyone’s mind. Beddoes believes that letting Lehman fail was a major blunder and that instead of an orderly wind-down we were thrown into a major financial crisis. Her global to-do list for a recovery includes strengthening banks, lowering interest rates, and injecting money to provide credit liquidity.
A prominent theme from most speakers was the need to bring fairness to the restitution process. Shiller cited the example of how Germany was treated after World War I as the wrong approach. But public sentiment is definitely against the privatization of profits and the socialization of losses that seems to be happening within the financial services industry. And there is no question that providing NINJA loans (no income, no job, no assets) was a colossal mistake, but how should individual borrowers be treated in the aftermath? Should the general public be subsidizing borrowers who in many cases should not have purchased a home in the first place?
Nick Mongue, from the Macquarie Group, said that the good news is that very few banks have lost more than their capital models suggested. The bad news is that they lost it all in one year, and that most of the losses came from the “good” assets where hardly any risk was allocated. He suggests that the current period will be rich in lessons, but that risk professionals should learn from other banks rather than their own.
OPUS has been a tremendous success so far. We’ve heard from thought leaders and customers, had great dialog within sessions, and more great keynote presentations. Here’s a video with more highlights from Day Two of OPUS 2010.
OPUS is a unique gathering where OpenPages customers come to share experiences and learn. In its 6th year, OPUS is steeped in tradition and one of the foremost traditions is pairing local culture and flavor with social networking. OPUS 2010 continues that tradition with the announcement of the OPUS 2010 Gala to be held at the historic Boston Public Library, providing a true Boston experience with local culture, fare and history.
Home to over 1.2 million rare books and manuscripts, including one of the rare Gutenberg Bibles and several first-edition folios by William Shakespeare, the Boston Public Library is also known for its rich architectural history. Founded in 1848 by an act of the Great and General Court of Massachusetts, the Boston Public Library (BPL) was the first large free municipal library in the United States. The present location at Copley Square in Boston is across the street from OPUS 2010 and has been home to the Library since 1895, when architect Charles Follen McKim completed his “palace for the people.”
The OPUS 2010 Gala will begin with a cocktail reception overlooking the majestic outdoor Courtyard whose arcaded promenade is a replica of the Cancelleria Palace in Rome. Dinner will follow in the Popular Reading Room which looks out onto Copley Square and the Old South Church. The room features an ornate architectural vaulted ceiling with interlocking Guastavino terra cotta tiles and a distinct bookcase-lined mezzanine on two sides.
Desserts, music and fun will round out the evening in the beautiful Abbey Room, where the famous “Quest of the Holy Grail” murals by American artist Edwin Austin Abbey have graced the walls since 1895. In true OPUS tradition, the Abbey Room will host the OPUS casino where you can try your hand at blackjack, roulette, and craps – not sure this is what Mr. Abbey had in mind! If you’re an OpenPages customer, we hope you will join us; it promises to be a fun evening with a little bit of culture on the side!
In today’s environment, an organization’s Board of Directors assumes a greater degree of accountability and understands the importance of instilling a risk-aware culture to gain better visibility into corporate risk. With limited resources, it is more critical than ever that GRC managers focus on the key areas of risk in the business, whether in Compliance, SOX, IT or Audit.
To achieve these goals, organizations need to foster a risk-based approach to managing GRC initiatives where GRC managers focus and measure risk against the core aspects of their business. To be effective, a risk-based approach requires collaboration and coordination to create a common language for risk and synchronize the activities of the different functions. To learn how the right framework helps facilitate a risk-based approach and achieve the ten principles outlined by the Basel Committee, check out our latest white paper, "Sound Practices for the Management and Supervision of Operational Risk."
Tommy Thompson, IT Security and Compliance Coordinator at Williams Company recently presented at OPUS 2010 on reducing the complexity of IT risk and compliance and how Williams was able to significantly reduce costs while at the same time increase the effectiveness of their IT compliance programs. In the following video, I had the chance to speak with Tommy after his presentation.
Risk management best practices, strategic planning, networking and high energy were in abundance at OPUS 2010 – the sixth annual OpenPages User Symposium, which saw continued growth in attendance. Featured topics at OPUS 2010 – where over 150 risk management professionals recently gathered from North America, Europe, South Africa and Asia – centered on evolving risk management strategies, risk convergence and implementing proactive compliance programs.
OpenPages President and CEO Michael Duffy kicked off Day One of the three-day user forum with the opening keynote address titled, ‘From Risk to Performance’ where he highlighted the evolution of risk management over the last decade and shared with attendees his vision for how risk management must adapt to the economic, regulatory and political pressures facing all companies today.
This was a common theme throughout OPUS 2010 as leading risk practitioners discussed the changes seen in the market over the past few years and how OpenPages customers are now in a unique position to provide valuable risk intelligence that will drive improved performance for their companies.
Following Michael Duffy’s opening keynote address, Madoff whistleblower Harry Markopolos outlined the red flags, warning signs and critical audit steps that companies need to be aware of to prevent similar events from occurring in the future. Following his keynote, Harry spent the day speaking to attendees, signing copies of his new book ‘No One Would Listen’ and sharing his thoughts on upcoming financial regulation (check out Pat O’Brien’s blog for more detail).
Julian Parkin, Group Privacy Programme Director at Barclays kicked-off Day Two with a fascinating case study on how Barclays has leveraged OpenPages for its risk management initiatives across the enterprise and across evolving risk types. Parkin described his target state as “a single view of risks, controls and governance across the organization.”
Throughout the three days, sessions were led by risk managers from a variety of customers and partners – American Express, Barclays, Carnival Corporation plc, Duke Energy, IBM, PwC and Williams Companies. Stay tuned for more details on these sessions in upcoming blog posts.
Thank you to all who attended, we look forward to seeing you at OPUS 2011!
The first keynote was delivered by Eric Rosengren, President and CEO of the Boston Fed. Rosengren opened by showing an interesting chart of the LIBOR-to-Overnight-Swap spread, which jumped last summer and has been very volatile ever since – evidence of banks’ reluctance to lend to each other.
Rosengren covered the role of liquidity in risk modeling, which he noted was largely underestimated in many models over the last year. He also noted that other fundamental assumptions were wrong, like the one that housing prices across the US are not correlated (he showed a chart of regional housing data over the last five years that looked highly correlated.)
Rosengren also spoke about the impact of rogue trading and legal settlements. Many institutions treat these losses as 1-in-1,000-year events, but as we get more data, it’s emerging that these events are much more common than previously thought.
Regarding scenario analysis and stress testing, Rosengren asked how much confidence we should put in them. In many cases, the stress tests did not accurately take into account the risks. He noted that the effect of falling housing prices was not accurately assessed, and that the impact of mortgage defaults on liquidity was universally missed.
In the Q&A period, he went on to say that we need to be more humble about the effect of some of these unexpected events and that we need to broaden our thinking about what could possibly happen.
A key theme of Rosengren’s talk is that organizations are too willing to ignore what they consider 1-in-1,000-year events, when in fact these events are turning out to be quite frequent. For instance, last year there were 14 losses of over $1 billion reported. He reinforced in the Q&A session the notion that extreme losses have occurred much more frequently than we would have assumed a couple years ago.
Rosengren was followed by Randall Kroszner, Member of the Board of Governors, Federal Reserve. Kroszner took a broader perspective on Basel II, and the enhancements the framework committee is considering. He noted that banks pursuing AMA qualification need strong senior management and board oversight. He also noted that senior management can create an AMA that’s reflective of organizational realities.
Kroszner noted that Basel II has been the official regulation for just one month, but the implementation will take some time. Implementation must be taken “thoughtfully and deliberately” by individual banks which should first start with a sober and frank appraisal of their current state.
The core banks will have to have a plan in place for AMA qualification by Oct 1, and Kroszner noted that this will require buy-in and resource commitment from the top.
Kroszner also noted that their hope is to provide more information over the next couple months but provided some initial thoughts on what the plan will have to cover:
Gaps between existing practice and AMA
Objective and measurable milestones
Planning and governance process for meeting qualification requirements fully
He noted that the final rule allows 36 months before exiting the parallel run phase.
After some discussion of upcoming improvements to the Basel II framework, Kroszner addressed the standardized approach for non-core banks. He stated that the Fed expects that Basel II (referring to both the AMA and standardized approaches) will make the US banking system more resilient.
A key theme that emerged from Kroszner’s talk and the subsequent Q&A period was that a one-size-fits-all approach is probably not best for the range of institutions we have in the US. Rosengren noted in the Q&A period that the final rule is more of a principles-based than a rules-based document and repeated that “it’s not clear that one size fits all.” He also noted that there’s already a wide range of practices in play right now.
When someone asked whether Basel II makes us more vulnerable to systemic risk because of model convergence, Kroszner responded that the flexibility of the final rule and the judgment afforded by the ICAAP process should mitigate systemic risk. Rosengren said that operational risk has enough variety in the modeling, but that credit risk calculations over the last year may have been too reliant on the same historical data.
I had the privilege of first speaking and later serving on a panel at the Institute of Internal Auditors International Conference earlier this month, held this year right here in the U.S., in Atlanta. The panel moderator asked what I thought was a particularly interesting question – “GRC is an acronym used by many but with many different meanings; what does GRC mean to each of you?” I’d like to share my response, which went something like this.
Thinking back some years, it seems the term GRC, standing for governance, risk and compliance, came about from the management consulting world, with technology firms and others quickly picking it up. The term has served a purpose in communicating available services and software solutions. At the same time, there wasn’t anything called a “GRC” unit in businesses then, and there still isn’t today. And while the term sometimes is used by compliance officers, risk officers or internal audit personnel, it’s seldom used or readily understood by line executives or board members.
As for what GRC means, to me it’s a combination of related though somewhat disparate concepts. The term “governance” traditionally has been used in context of a company’s board of directors. A definition I particularly like is “the allocation of power between the board, management and shareholders.” But of course the term now is used by many professionals to encompass what senior management does to run a company, and indeed even referring to activities downstream in the management ranks. The “R” is for “risk management,” and that term is used in many different ways, from a simple risk assessment to a full-blown enterprise risk management process. And “compliance” initially was applied to adherence to applicable laws and regulations, though many users now also include adherence to internal company policies as well.
I mentioned “disparate” because GRC isn’t really one end-to-end process that companies employ. And while the elements of GRC can be related to a company’s strategic and other business objectives, they in fact relate to activities and processes at different levels of an organization. Indeed, from a technical perspective we can say that there’s overlap, in that risk management can and should be designed to address compliance as well as other categories of objectives.
What’s important in my mind is not necessarily to try to put the genie back in the bottle by getting everyone to use these terms in the same way, because that’s just not going to happen. Rather, we need to be sure when we use the terms in our organizations that we’re very clear as to exactly what we mean.
Rick Steinberg provides a lucid review of the financial crisis and the role that financial regulators and overseers played in his recent webinar titled “The Great Financial System Meltdown”. If you were fortunate enough to attend, you heard Rick describe how we landed in this difficult financial crisis and what he expects in terms of regulations and outcomes for 2009 and beyond. One point that I found very interesting was that we need to recognize that the “100 year flood” happens every 20-30 years. He pointed to the S&L fiasco, the junk bond debacle, the dot-com bubble, and today’s financial system meltdown with its liquidity and credit market seizures – all of which have happened since Bill Buckner couldn’t field a ground ball in the ’86 World Series.
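There is simple arithmetic behind the “100 year flood every 20-30 years” point: when several loosely related markets each carry their own 1-in-100 annual tail risk, a rare event somewhere arrives far more often than the label suggests. A minimal sketch, using illustrative numbers of our own rather than anything from the webinar:

```python
# Probability of seeing at least one "1-in-100-year" event over a horizon.
# The event probability, horizon, and market count are all hypothetical.
def prob_at_least_one(annual_prob: float, years: int, markets: int = 1) -> float:
    """P(at least one event) = 1 - P(no event) = 1 - (1 - p)^(years * markets)."""
    return 1 - (1 - annual_prob) ** (years * markets)

# One market watched for 25 years:
single = prob_at_least_one(0.01, 25)              # roughly 0.22
# Five independent markets (think S&L, junk bonds, dot-com, housing, credit):
several = prob_at_least_one(0.01, 25, markets=5)  # roughly 0.72
```

Even a single market has better than a one-in-five chance of a “century” event in 25 years; across five independent markets the odds approach three in four.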
The need for better transparency at the board level and a top-down, risk-aware culture has never been more apparent. Fortunately, as Gordon Burnes points out in a recent blog entry, the Obama administration is proposing financial services regulation which includes “principles on openness and transparency”. Chief risk officers are now more than ever getting a seat at the board table, and executives are demanding visibility into risk exposure and its potential impact on operating performance.
Of course technology plays a critical role in an organization’s ability to implement an effective enterprise management framework that provides transparency and drives accountability. With enterprise risk management dashboards providing decision support at all levels within the organization, risk professionals and executives gain visibility into how their business is operating and a decision support system that can be used to improve operational performance and execution.
Technology can also drive culture. Too often in 2008 we heard of organizations that were made aware of risky portfolios and exposure, but did nothing to heed the warnings. It all begins with senior management, but technology can help promote a risk-aware culture through integrated training and certifications that build awareness, create accountability and push policies and processes into daily activities.
One can’t help but wonder what would have been the result had financial institutions involved in the sub-prime crisis been practicing strong risk management and fostering a risk aware corporate culture.
Shelley Parratt of the SEC’s Corporation Finance Division gave the afternoon keynote on Day 2 of Compliance Week 2010. She spoke about the Commission’s program of enhanced disclosure.
With roughly 10,000 companies filing and SOX requiring the Commission to review every company’s filings at least once every three years, she said that the SEC has to use its resources appropriately, and the filter it uses is how the information will be used by investors.
On executive compensation, she acknowledged that this is a very emotional topic. The SEC is trying to provide a clearer and more complete picture of what executives get paid. First, companies must provide a framework for how they make compensation decisions, but the SEC is interested in how the framework is used in real decisions. Also, the SEC is focusing on performance targets, how those targets change, and whether those targets are disclosed. “A company must engage in a thoughtful discussion about its disclosure decisions.” It is not sufficient, for instance, to simply say that a target is “challenging”; it should be put in the context of historical performance.
On disclosure about the board and company leadership, Parratt was very clear that Chairman Schapiro is interested in increased disclosure on leadership choices and risk oversight. She said that there is no requirement for a risk committee. Different companies may choose different approaches to discharge their responsibility for risk oversight.
Regarding non-GAAP financial measures, Parratt said that disclosures should be consistent across filings and other communications. In other words, if a company uses non-GAAP financial measures in its earnings call, they should also use those measures in their filings. In no circumstances, however, should those measures be misleading, whether they are in a filing or not.
Regarding climate change, Parratt was careful to state that the Commission was not taking a position on the potential effects of climate change.
During the Q&A session, Editor-in-Chief Matt Kelly asked about the current quality of the enhanced disclosure filings. Parratt acknowledged that “what we see in the first year of disclosure is often vastly different than what we will see in the second,” but noted that the first year’s disclosures aren’t necessarily out of compliance, inadequate, or poor – implying, of course, that this year’s proxy filings are all of the above!
The Stress Tests for the US Bank Holding Companies (BHCs) have been released by the Fed. As had been leaked, the industry must raise $74.6 billion. The biggest number is for Bank of America, which must raise $33.9 billion, as they are unlikely to convert the preferred shares owned by the Treasury. The New York Times is reporting that the US Government will end up owning 36 percent of Citi after they convert their rescue funds into common stock. They will still have to raise $5.5 billion. Other interesting details:
Residential and consumer loans account for 70% of the losses projected under the adverse scenario, which would amount to $599.2 billion. The adverse scenario has unemployment at 8.9% in 2009 and topping out at 10.3% in 2010. Assuming that residential and consumer loan losses are a function of the unemployment rate, a lot is riding on what some economists think is an optimistic number. According to the Bureau of Labor Statistics, we’re already at 8.5% as of March (April’s numbers are being released tomorrow at 8:30 am). These results also suggest that commercial lending comprises a much smaller portion of the overall losses and won’t be the "next shoe to drop" for the economy as many people have suggested.
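A back-of-the-envelope split of the quoted figures makes the commercial-lending point concrete; the two-bucket breakdown below is our own simplification, not the Fed’s detailed table:

```python
# Split of projected SCAP adverse-scenario losses using the figures
# quoted above (70% share, $599.2bn total); everything outside the
# residential/consumer bucket is lumped together here for illustration.
total_losses_bn = 599.2        # projected losses, $ billions
resi_consumer_share = 0.70     # residential and consumer loans

resi_consumer_bn = total_losses_bn * resi_consumer_share
commercial_and_other_bn = total_losses_bn - resi_consumer_bn

print(round(resi_consumer_bn, 1))          # 419.4
print(round(commercial_and_other_bn, 1))   # 179.8
```

Roughly $419 billion of the projected losses sits in residential and consumer lending, leaving about $180 billion for commercial lending and everything else combined.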
In the adverse scenario, each BHC was given a range of loss percentages for the various categories. Each BHC could use firm-specific data to come up with its own assessment of the loss rate. Interestingly, for first lien mortgages Bank of America came up with 6.8% while JP Morgan Chase used 10.2% – a differential that seems quite high. Of course, JPMC bought WaMu, which had a large market share on the west coast. Another west coast bank, Wells, used 11.8% as their loss rate.
The Fed refers to the SCAP buffer – the capital needed to be raised under the Supervisory Capital Assessment Program – as a way for market participants, as well as the firms themselves, to have "confidence in the capacity of the major BHCs to perform their vital role in lending even if the economy proves weaker than expected." The press surrounding this announcement suggests that certainly the former will benefit from these results. What’s less clear is whether the banks themselves will magically start lending again. And, as discussed here, in this dynamic market, how will business models evolve to account for emerging opportunities and risks?
John Whittaker’s session on operational risk and aligning with the business covered some interesting approaches:
Barclays defines 13 principal risks that the business owns. The oprisk function can provide guidance on the control framework to mitigate each risk, but the oprisk function does not control the risk. The real process of operational risk does not sit in the corporate function.
Operational risk should be involved in discussions of strategy: it helps think through how the business can maintain their performance objectives during a 1 in 7 or 1 in 20 downturn; participates in new product approval; reviews the impact of large events. Whittaker also noted that oprisk should be involved in the stress testing process.
Operational risk managers need to understand the business intimately. This allows the function to influence decision-making effectively.
With regard to reporting, try taking away a report to see how much value it actually has. There’s some reporting that isn’t delivering the value that the reporters think. Also, trend analysis and comparison is important, not just absolute numbers. The main point is to create a discussion, which brings operational risk into the business.
What better place to hold an industry event on St. Patty’s Day than Boston? Writing today from IDC’s annual analyst event, there is plenty of green in IDC’s IT spending forecasts. IDC Directions 2009 includes over 40 sessions from IDC analysts, and the overall theme is: 2009 will be bad, but IT spending will begin to recover in late 2009 and should return to 2008 levels in 2010.
John Gantz, chief research officer and senior vice president, began the day on a positive note, discussing how by the end of the decade nearly half the planet will be using mobile devices; more than 25% will have access to the Internet, most with broadband connections and more outside the developed world than in – 7.5 billion devices in all. The converged entity will be a $3 trillion market and encompass a kaleidoscope of computer, communications, content, and services vendors.
Mr. Gantz placed “Compliance” on his “Markets to Watch” list for potential for strong growth. In terms of Governance, Compliance and Sustainability, Kathleen Wilhide, Research Director, Compliance and Business Performance Management Solutions, discussed how Fortune 500 companies are barraged with dozens of major standards of corporate accountability, responsibility, and sustainability. The rapid increase in the amount of communications focusing on sustainability topics such as carbon footprint and going "green" signals a shift occurring today: sustainability is now a critical part of many companies’ corporate strategies. While this session focused on sustainability, it became clear that the overarching business challenge of managing risk and compliance silos is common across all governance, compliance and sustainability initiatives (OpenPages is hosting a Webinar on this topic on April 9, in conjunction with Compliance Week).
Ms. Wilhide described how software driven processes are emerging to support an organization’s sustainability strategy, calibrate progress, and quickly detect gaps in governance and sustainability practices. She also spoke to how software will play a role in managing sustainability efforts, and will provide a deeper dive into how energy companies approach one key area of sustainability – energy and the environment.
Patrick de Fontnouvelle of the Federal Reserve Bank of Boston presented an interesting session at GCOR 2010 titled, “The Role of Operational Risk in the Recent Financial Crisis.” His basic premise was that the financial crisis of 2008 could have been avoided had financial institutions implemented and followed basic operational risk management best practices. More importantly, there is a history of those best practices being violated repeatedly, with predictable consequences. He recommended three steps to moving forward and preventing similar crises in the future:
We must work to develop and normalize operational risk management and measurement
Outreach is critical: there is a lack of understanding or a misunderstanding regarding the nature and impact of operational risk
Governance: the risk function must have sufficient stature and authority to take action against questionable practices (in other words they must have a seat at the table)
OpenPages finished another strong quarter this week. Big wins in the US, UK and South Africa led to another profitable quarter, with both revenue and bookings up significantly in Q3 over Q2. Other highlights from the quarter included being named as a leader in both the Forrester EGRC Wave and the Gartner MQ as well as in reports by European analyst firms Chartis and Celent. If there were a Sprint Cup for risk management software, OpenPages would be way out in the lead! We’re seeing more and more evidence of what we surveyed at OPEN, namely, that risk management spending is trending up this year, and we’re also starting to see companies prepare for 2010. We’re involved in several opportunities that already have approved budgets for January.
Companies today are being forced to comply with an extensive set of regulations. One thing that you can count on in the fallout of the financial meltdown, is that regulatory pressures will continue to mount. And for large, multi-national organizations in heavily regulated verticals, the problem is further compounded. Businesses need to take a practical, cross-regulatory approach to managing compliance in order to alleviate the increasing burden while gaining insight into risks to key business processes that could affect overall corporate performance.
In a recent webinar, in which I had the privilege of co-presenting with Michael Rasmussen, president of Corporate Integrity and GRC advisor, Michael detailed several strategies that successful companies take to build an effective compliance program. Of particular note, he stated “A reactive and siloed approach to compliance is a recipe for disaster and leads to lack of visibility, wasted and/or inefficient use of resources, unnecessary complexity, lack of flexibility and vulnerability and exposure.”
While compliance requires adherence to policies and a top-down driven culture, technology can play a critical role in effective compliance management through an integrated risk and compliance framework that enables business owners to document, assess, measure and test once; and then satisfy many stakeholders. This model leads to two main benefits:
1. Reduced cost and improved efficiency
2. Improved effectiveness, in terms of a better overall view of risk and compliance and the dependencies between them.
To find out how a Fortune 500 utility company leveraged technology to manage a massive compliance monitoring effort spanning multiple business units and areas of responsibility, check out the archived webinar or download the case study.
Former Federal Reserve Board Governor and PCAOB Chairman Mark Olson spoke during the general session this morning about the proposed legislation for financial services regulatory reform, the main point of which is to ensure systemic stability for the financial system. He made an interesting point, saying that in the US “we have a limited tolerance for financial volatility” and that regulatory reform aims to dampen that volatility.
Regarding “too big to fail,” Olson said that he agreed that we should focus legislation to manage this risk to taxpayers but that this “is a very complex task” that shouldn’t be underestimated. He acknowledged that regulators and institutions agree that the soundness of the financial system requires better understanding the systemic risk posed by individual institutions, but the question is the best way to address this problem. He did note that the Dodd bill attempts to clarify the Fed’s role in “unusual and exigent circumstances” under Section 13(3), which should provide more clarity as to what sort of consent is required for special action by the Fed, but, in the end, he said that the bill doesn’t address “too big to fail.”
He also said that the “tone and approach” of different regulatory agencies varies and that the bill will attempt to clarify responsibilities, although there are still certain areas of the bill which would lead to an overlap in responsibilities.
He noted that the Dodd bill will mandate risk committees that will require “timely and comprehensive information”, and he perceptively commented that the effectiveness of these committees will depend upon the quality of this information.
During the Q&A period, one member of the session asked about the so-called “shadow banking system” or financial services outside the regulatory scheme. Olson said that the consumer protection agency is trying to address this, and noted that the FTC had not been as aggressive as it should have been.
Overall, while Olson said that we would most likely get a bill passed this year, his comments did not make it clear that we would be getting the right one, or that it would truly address the complexities of managing risk in our financial system.
This weekend President-elect Barack Obama was interviewed by Tom Brokaw on Meet the Press. The interview covered a wide variety of topics, but one caught my eye as it impacts the risk management business moving forward.
On the subject of regulation in the financial services industry, Obama was very clear:
“And so, as part of our economic recovery package, what you will see coming out of my administration right at the center is a strong set of new financial regulations in which banks, ratings agencies, mortgage brokers, a whole bunch of folks start having to be much more accountable and behave much more responsibly because we can’t put ourselves–we, we can’t create the kind of systemic risks that we’re creating right now, particularly because everything is so interdependent. We’ve got to have transparency, openness, fair dealing in our financial markets. And that’s an area where I think, over the last eight years, we’ve fallen short.”
So, what does this mean for the risk management business? Well, there are two key points about what Obama said. First, he mentions accountability. The question is: accountable for what? My guess is that the accountability he’s talking about is that, for instance, rating agencies have to be accountable for the ratings they issue, banks will have to be accountable for describing accurately, and completely, the securities they are selling, and so on. Second, he mentions transparency and openness. Clearly, banks are going to have to provide more transparency around reporting on risk in their business. And with more stringent reporting requirements will come greater emphasis on internal reporting on internal controls and risk exposure. Steve Adler of IBM blogged about this 10 months ago. It won’t take another 10 months for stricter regulation to materialize; the question is how will the industry respond?
Hosted by Barclays, this year’s OPEN (OpenPages European Network) Summit promises to be the best yet with a jam-packed agenda including real-world case studies from OpenPages customer executives at Allianz, Barclays, Lloyds, ORX and Swiss Re. Joining them will be executives and product experts from OpenPages who will share the latest OpenPages product developments and review OpenPages investments and rapid customer adoption in EMEA.
If you’re unable to make it, check back for a recap of the event in the following week. Otherwise, we look forward to seeing you at Canary Wharf in London!
We’re now in “Moving Operational Risk Forward” or “Getting Value from ORX Data and Tying Operational Risk into Each Business Unit” with Joe Sabatini, JP Morgan, and Simon Wills, ORX. The introduction is being given by David Millar, PRMIA, who opened the session with a statement on the fire evacuation procedures. Some will remember that a fire alarm during an operational risk conference is not unheard of.
Sabatini started out by echoing comments from a previous session: namely, that the increased regulatory pressure will increase the challenges of managing operational risk at regulated entities.
Loss data, according to Sabatini, has been one of the main drivers for change within the operational risk field. Before loss data was collected, no one really knew how much money was being lost on operational risk. With the collection of loss data, business lines understood how critical operational risk was.
With regard to capital calculation, the Enron/WorldCom data points included in the traditional LDA approach would suggest that JP Morgan needs some $50 billion in capital, driven in part by investment banking underwriting risk. Sabatini discussed an approach similar to that in the credit world, where you calculate the probability of default, the loss, and the probability of investors winning a suit. This approach produces a more realistic capital number.
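The credit-style decomposition Sabatini described can be sketched as a simple expected-loss product. Every input below is a hypothetical placeholder of ours, not a JP Morgan figure:

```python
# Sketch of a credit-style expected loss for underwriting-related legal
# exposure, as contrasted with fitting an LDA to external data points
# such as Enron/WorldCom. All parameters are illustrative assumptions.
def expected_legal_loss(p_issuer_default: float,
                        p_suit_succeeds: float,
                        exposure: float,
                        loss_severity: float) -> float:
    """Expected loss = P(issuer default) * P(investors win suit)
    * underwriting exposure * loss severity on an adverse judgment."""
    return p_issuer_default * p_suit_succeeds * exposure * loss_severity

# Hypothetical book: $10bn underwritten, 2% issuer default probability,
# 30% chance a suit succeeds, 50% severity if it does.
loss = expected_legal_loss(0.02, 0.30, 10e9, 0.50)
print(f"${loss:,.0f}")  # $30,000,000
```

Conditioning the loss on the chain of events that must actually occur, rather than on the raw severity of a few historical outliers, is what pulls the implied capital number down to something more realistic.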
Sabatini also discussed some of the challenges and opportunities with regard to risk management, including business unit benchmarking, trend analysis, correlation with business metrics, and dynamic reporting. He also suggested that a significant advance would be to have a real-time dashboard that would allow what-if analysis discussion between the market, credit and operational risk functions.
Simon Wills then gave an overview of ORX, our customer and partner. He said that they will be up to 54 member institutions when they announce their newest member tomorrow. Wills noted that ORX follows the Basel II categorization, with an additional category for corporate losses (ransom paid for a kidnapping of the chairman, for instance).
ORX also collects data on the product (e.g. equity derivative) and process (sales and marketing) associated with the losses, which provides a greater degree of granularity to the loss. ORX also collects additional information on large losses (over €10 million).
Wills shared some recent data on operational risk losses, and noted that sales and trading has been the driver of the large number of losses in 2008, whose aggregate severity rivals that of the Enron/WorldCom losses of 2002.
ORX is interested in a better visualization of the data to improve the communication and engagement of operational risk with the business. Corporate finance, for instance, tends to have low frequency and high severity losses, the opposite of losses in the retail business. Wills showed a 3D graph of the two different loss data sets, with dramatic spikes in the corporate finance business.
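The two profiles Wills contrasted can be illustrated with a minimal frequency/severity simulation. All parameters below are illustrative assumptions of ours; ORX member data is confidential and not reproduced here:

```python
# Compound-loss sketch: Poisson event counts, lognormal severities.
# Retail banking vs. corporate finance parameters are hypothetical.
import math
import random

def draw_poisson(lam: float, rng: random.Random) -> int:
    # Knuth's multiplication method; adequate for the means used here.
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

def simulate_year(freq_mean: float, sev_median: float, sev_sigma: float,
                  rng: random.Random) -> float:
    """One simulated year: Poisson number of events, lognormal severities."""
    n_events = draw_poisson(freq_mean, rng)
    return sum(rng.lognormvariate(math.log(sev_median), sev_sigma)
               for _ in range(n_events))

rng = random.Random(7)
# Retail banking: many small losses every year.
retail = [simulate_year(200, 10_000, 1.0, rng) for _ in range(1_000)]
# Corporate finance: most years quiet, occasional severe spikes.
corpfin = [simulate_year(0.5, 5_000_000, 2.0, rng) for _ in range(1_000)]

quiet_years = sum(1 for x in corpfin if x == 0)  # a majority of years
```

Plotted, the retail series is a tight band of nonzero annual totals, while the corporate finance series is flat at zero most years with dramatic spikes – the same shape as the 3D graph Wills showed.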
Wills talked about ORX sector services that will provide insight for different business units to benchmark against their peers, and, in this way, provide real business performance value to operational risk managers and their business line colleagues.