Shelley Parratt of the SEC’s Division of Corporation Finance gave the afternoon keynote on Day 2 of Compliance Week 2010. She spoke about the Commission’s program of enhanced disclosure.
With some 10,000 companies filing and SOX requiring the Commission to review every company’s filings at least once every three years, she said that the SEC has to allocate its resources carefully, and the filter it uses is how investors will use the information.
On executive compensation, she acknowledged that this is a very emotional topic. The SEC is trying to provide a clearer and more complete picture of what executives get paid. First, companies must provide a framework for how they make compensation decisions, but the SEC is interested in how the framework is used in real decisions. The SEC is also focusing on performance targets, how those targets change, and whether those targets are disclosed. “A company must engage in a thoughtful discussion about its disclosure decisions.” It is not sufficient, for instance, simply to call a target “challenging”; the target should be put in the context of historical performance.
On disclosure about the board and company leadership, Parratt was very clear that Chairman Schapiro is interested in increased disclosure on leadership choices and risk oversight. She said that there is no requirement for a risk committee; different companies may choose different approaches to discharge their responsibility for risk oversight.
Regarding non-GAAP financial measures, Parratt said that disclosures should be consistent across filings and other communications. In other words, if a company uses non-GAAP financial measures in its earnings call, it should also use those measures in its filings. Under no circumstances, however, should those measures be misleading, whether they appear in a filing or not.
Regarding climate change, Parratt was careful to state that the Commission was not taking a position on the potential effects of climate change itself; its recent interpretive guidance simply clarifies how existing disclosure requirements apply to material climate-related risks.
During the Q&A session, Editor-in-Chief Matt Kelly asked about the current quality of the enhanced disclosure filings. Parratt acknowledged that “what we see in the first year of disclosure is often vastly different than what we will see in the second,” but noted that the first year’s disclosures aren’t necessarily out of compliance, inadequate, or poor. The implication, of course, is that this year’s proxy filings are all of the above!
OPUS 2010 is approaching fast and we’ve got a great line-up on tap, beginning with keynote speaker Harry Markopolos, the lead investigator who helped uncover the infamous $65 billion Bernie Madoff Ponzi scheme.
Profiled on 60 Minutes and in The Wall Street Journal, Markopolos, a Chartered Financial Analyst (CFA) and Certified Fraud Examiner, spent over ten years diligently pursuing the truth behind Bernie Madoff’s numbers and his unbelievable profits. Having figured out the Madoff fraud before anyone else, Markopolos waved red flags and delivered detailed documentation to the Securities and Exchange Commission (SEC) in 2000, 2001, 2005, 2007 and again in 2008.
“It took me five minutes to know that it was a fraud. It took me another almost four hours of mathematical modeling to prove that it was a fraud.” He was repeatedly ignored by the SEC, but relying on his own dogged determination and a small, tightly knit team of trusted allies, he finally overcame the agency’s indifference and broke the scandal to the public.
During the OPUS keynote address, Markopolos will detail how his four-person investigative team tracked Madoff and the Madoff feeder funds throughout Europe and North America and repeatedly submitted detailed reports to the SEC. He will discuss the red flags, warning signs and critical audit steps that companies need to be aware of to prevent similar events from occurring in the future.
OpenPages finished another strong quarter this week. Big wins in the US, UK and South Africa led to another profitable quarter, with both revenue and bookings up significantly in Q3 over Q2. Other highlights from the quarter included being named as a leader in both the Forrester EGRC Wave and the Gartner MQ as well as in reports by European analyst firms Chartis and Celent. If there were a Sprint Cup for risk management software, OpenPages would be way out in the lead! We’re seeing more and more evidence of what we surveyed at OPEN, namely, that risk management spending is trending up this year, and we’re also starting to see companies prepare for 2010. We’re involved in several opportunities that already have approved budgets for January.
I had the privilege of first speaking and later serving on a panel at the Institute of Internal Auditors International Conference earlier this month, held this year right here in the U.S., in Atlanta. The panel moderator asked what I thought was a particularly interesting question – “GRC is an acronym used by many but with many different meanings; what does GRC mean to each of you?” I’d like to share my response, which went something like this.
Thinking back some years, it seems the term GRC, standing for governance, risk and compliance, came about from the management consulting world, with technology firms and others quickly picking it up. The term has served a purpose in communicating available services and software solutions. At the same time, there wasn’t anything called a “GRC” unit in businesses then, and there still isn’t today. And while the term sometimes is used by compliance officers, risk officers or internal audit personnel, it’s seldom used or readily understood by line executives or board members.
As for what GRC means, to me it’s a combination of related though somewhat disparate concepts. The term “governance” traditionally has been used in the context of a company’s board of directors. A definition I particularly like is “the allocation of power between the board, management and shareholders.” But of course the term now is used by many professionals to encompass what senior management does to run a company, and indeed even to refer to activities downstream in the management ranks. The “R” is for “risk management,” and that term is used in many different ways, from a simple risk assessment to a full-blown enterprise risk management process. And “compliance” initially was applied to adherence to applicable laws and regulations, though many users now include adherence to internal company policies as well.
I mentioned “disparate” because GRC isn’t really one end-to-end process that companies employ. And while the elements of GRC can be related to a company’s strategic and other business objectives, they in fact relate to activities and processes at different levels of an organization. Indeed, from a technical perspective we can say that there’s overlap, in that risk management can and should be designed to address compliance as well as other categories of objectives.
What’s important in my mind is not necessarily to try to put the genie back in the bottle by getting everyone to use these terms in the same way, because that’s just not going to happen. Rather, we need to be sure when we use the terms in our organizations that we’re very clear as to exactly what we mean.
Risk management best practices, strategic planning, networking and high energy were in abundance at OPUS 2010, the sixth annual OpenPages User Symposium, which saw continued growth in attendance. Featured topics at OPUS 2010, where over 150 risk management professionals recently gathered from North America, Europe, South Africa and Asia, centered on evolving risk management strategies, risk convergence and implementing proactive compliance programs.
OpenPages President and CEO Michael Duffy kicked off Day One of the three-day user forum with the opening keynote address, titled ‘From Risk to Performance,’ in which he highlighted the evolution of risk management over the last decade and shared with attendees his vision for how risk management must adapt to the economic, regulatory and political pressures facing all companies today.
This was a common theme throughout OPUS 2010 as leading risk practitioners discussed the changes seen in the market over the past few years and how OpenPages customers are now in a unique position to provide valuable risk intelligence that will drive improved performance for their companies.
Following Michael Duffy’s opening keynote address, Madoff whistleblower Harry Markopolos outlined the red flags, warning signs and critical audit steps that companies need to be aware of to prevent similar events from occurring in the future. Following his keynote, Harry spent the day speaking to attendees, signing copies of his new book ‘No One Would Listen’ and sharing his thoughts on upcoming financial regulation (check out Pat O’Brien’s blog for more detail).
Julian Parkin, Group Privacy Programme Director at Barclays, kicked off Day Two with a fascinating case study on how Barclays has leveraged OpenPages for its risk management initiatives across the enterprise and across evolving risk types. Parkin described his target state as “a single view of risks, controls and governance across the organization.”
Throughout the three days, sessions were led by risk managers from a variety of customers and partners – American Express, Barclays, Carnival Corporation plc, Duke Energy, IBM, PwC and Williams Companies. Stay tuned for more details on these sessions in upcoming blog posts.
Thank you to all who attended, we look forward to seeing you at OPUS 2011!
Companies today are being forced to comply with an extensive set of regulations. One thing you can count on in the fallout of the financial meltdown is that regulatory pressures will continue to mount. And for large, multi-national organizations in heavily regulated verticals, the problem is further compounded. Businesses need to take a practical, cross-regulatory approach to managing compliance in order to alleviate the increasing burden while gaining insight into risks to key business processes that could affect overall corporate performance.
In a recent webinar that I had the privilege of co-presenting with Michael Rasmussen, president of Corporate Integrity and a GRC advisor, Michael detailed several strategies that successful companies use to build an effective compliance program. Of particular note, he stated: “A reactive and siloed approach to compliance is a recipe for disaster and leads to lack of visibility, wasted and/or inefficient use of resources, unnecessary complexity, lack of flexibility, and vulnerability and exposure.”
While compliance requires adherence to policies and a top-down driven culture, technology can play a critical role in effective compliance management through an integrated risk and compliance framework that enables business owners to document, assess, measure and test once; and then satisfy many stakeholders. This model leads to two main benefits:
1. Reduced cost and improved efficiency
2. Improved effectiveness, in terms of a better overall view of risk and compliance and the dependencies between them
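The “document, assess, measure and test once; satisfy many” model above can be sketched as a simple mapping from controls to the regulations they cover. The control and regulation names below are hypothetical, chosen purely for illustration:

```python
# Toy illustration of "test once, satisfy many": one passing control test
# is reused to cover every regulatory requirement mapped to that control.
# Control names and regulation labels here are made up for illustration.
CONTROL_MAP = {
    "access-review-quarterly": ["SOX 404", "PCI DSS 7", "GLBA Safeguards"],
    "change-management-approval": ["SOX 404", "FFIEC"],
}

def satisfied_requirements(test_results):
    """Given {control: passed?}, return which regulations each passing test covers."""
    covered = {}
    for control, passed in test_results.items():
        if passed:
            for reg in CONTROL_MAP.get(control, []):
                covered.setdefault(reg, []).append(control)
    return covered

results = satisfied_requirements({
    "access-review-quarterly": True,
    "change-management-approval": True,
})
# A single access-review test satisfies three separate regulatory requirements,
# instead of being re-performed once per regulation in separate silos.
```

The design point is the shape of the mapping: many-to-many between controls and requirements, so each assessment is performed once and reported to many stakeholders.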
To find out how a Fortune 500 utility company leveraged technology to manage a massive compliance monitoring effort spanning multiple business units and areas of responsibility, check out the archived webinar or download the case study.
What better place to hold an industry event on St. Patty’s Day than Boston? Writing today from IDC’s annual analyst event, there is plenty of green in IDC’s IT spending forecasts. IDC Directions 2009 includes over 40 sessions from IDC analysts, and the overall theme is that 2009 will be a rough year, but IT spending will begin to recover in late 2009 and should return to 2008 levels in 2010.
John Gantz, chief research officer and senior vice president, began the day on a positive note, discussing how by the end of the decade nearly half the planet will be using mobile devices; more than 25% will have access to the Internet, most with broadband connections and more of them outside the developed world than in it; and there will be 7.5 billion connected devices. The converged entity will be a $3 trillion market and encompass a kaleidoscope of computer, communications, content, and services vendors.
Mr. Gantz placed “Compliance” on his “Markets to Watch” list for its potential for strong growth. In terms of Governance, Compliance and Sustainability, Kathleen Wilhide, Research Director, Compliance and Business Performance Management Solutions, discussed how Fortune 500 companies are barraged with dozens of major standards of corporate accountability, responsibility, and sustainability. The rapid increase in communications focusing on sustainability topics such as carbon footprint and going "green" signals a shift occurring today: sustainability is now a critical part of many companies’ corporate strategies. While this session focused on sustainability, it became clear that the business challenge of managing risk and compliance silos is common across all governance, compliance and sustainability initiatives (OpenPages is hosting a Webinar on this topic on April 9, in conjunction with Compliance Week).
Ms. Wilhide described how software-driven processes are emerging to support an organization’s sustainability strategy, calibrate progress, and quickly detect gaps in governance and sustainability practices. She also spoke to how software will play a role in managing sustainability efforts, and previewed a deeper dive into how energy companies approach one key area of sustainability – energy and the environment.
Tommy Thompson, IT Security and Compliance Coordinator at Williams Companies, recently presented at OPUS 2010 on reducing the complexity of IT risk and compliance, and on how Williams was able to significantly reduce costs while increasing the effectiveness of its IT compliance programs. In the following video, I had the chance to speak with Tommy after his presentation.
The stress test results for the US bank holding companies (BHCs) have been released by the Fed. As had been leaked, the industry must raise $74.6 billion. The biggest number is for Bank of America, which must raise $33.9 billion, as it is unlikely to convert the preferred shares owned by the Treasury. The New York Times is reporting that the US Government will end up owning 36 percent of Citi after it converts its rescue funds into common stock; Citi will still have to raise $5.5 billion. Other interesting details:
Residential and consumer loans account for 70% of the losses projected under the adverse scenario, which total $599.2 billion. The adverse scenario has unemployment at 8.9% in 2009, topping out at 10.3% in 2010. Assuming that residential and consumer loan losses are a function of the unemployment rate, a lot is riding on what some economists think is an optimistic number. According to the Bureau of Labor Statistics, we’re already at 8.5% as of March (April’s numbers are being released tomorrow at 8:30 am). These results also suggest that commercial lending comprises a much smaller portion of the overall losses and won’t be the "next shoe to drop" for the economy, as many people have suggested.
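As a quick back-of-envelope check, the dollar split implied by the figures above (a 70% residential/consumer share of $599.2 billion in projected adverse-scenario losses) works out as follows:

```python
# Back-of-envelope check of the SCAP figures quoted above.
total_adverse_losses = 599.2      # $bn, total projected adverse-scenario losses
resi_consumer_share = 0.70        # residential + consumer share of those losses

resi_consumer_losses = total_adverse_losses * resi_consumer_share
other_losses = total_adverse_losses - resi_consumer_losses

print(f"Residential/consumer: ${resi_consumer_losses:.1f}bn")
print(f"All other (incl. commercial): ${other_losses:.1f}bn")
```

That leaves roughly $180 billion of projected losses across everything else, which is why commercial lending looks like a smaller share of the total.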
In the adverse scenario, each BHC was given a range of loss percentages for the various categories. Each BHC could use firm-specific data to come up with its own assessment of the loss rate. Interestingly, for first lien mortgages Bank of America came up with 6.8% while JP Morgan Chase used 10.2%, a differential that seems quite high. Of course, JPMC bought WaMu, which had a large market share on the west coast. Another west coast bank, Wells Fargo, used 11.8% as its loss rate.
The Fed refers to the SCAP buffer, the capital needed to be raised under the Supervisory Capital Assessment Program, as a way for market participants, as well as the firms themselves, to have "confidence in the capacity of the major BHCs to perform their vital role in lending even if the economy proves weaker than expected." The press surrounding this announcement suggests that certainly the former will benefit from these results. What’s less clear is whether the banks themselves will magically start lending again. And, as discussed here, in this dynamic market, how will business models evolve to account for emerging opportunities and risks?
Former Federal Reserve Board Governor and PCAOB Chairman Mark Olson spoke during the general session this morning about the proposed legislation for financial services regulatory reform, the main point of which is to ensure systemic stability for the financial system. He made an interesting point, saying that in the US “we have a limited tolerance for financial volatility” and that regulatory reform aims to dampen that volatility.
Regarding “too big to fail,” Olson said that he agreed that we should focus legislation to manage this risk to taxpayers, but that this “is a very complex task” that shouldn’t be underestimated. He acknowledged that regulators and institutions agree that the soundness of the financial system requires better understanding the systemic risk posed by individual institutions, but the question is the best way to address this problem. He did note that the Dodd bill attempts to clarify the Fed’s role in “unusual and exigent circumstances” under Section 13(3), which should provide more clarity as to what sort of consent is required for special action by the Fed, but, in the end, he said that the bill doesn’t address “too big to fail.”
He also said that the “tone and approach” of different regulatory agencies varies and that the bill will attempt to clarify responsibilities, although there are still certain areas of the bill which would lead to an overlap in responsibilities.
He noted that the Dodd bill will require risk committees that must receive “timely and comprehensive information,” and he perceptively commented that the effectiveness of these committees will depend on the quality of that information.
During the Q&A period, one attendee asked about the so-called “shadow banking system,” that is, financial services outside the regulatory scheme. Olson said that the consumer protection agency is trying to address this, and noted that the FTC had not been as aggressive as it should have been.
Overall, while Olson said that we would most likely get a bill passed this year, his comments did not make it clear that we would be getting the right one, or that it would truly address the complexities of managing risk in our financial system.
The first keynote was delivered by Eric Rosengren, President and CEO of the Boston Fed. Rosengren opened by showing an interesting chart of the LIBOR to overnight indexed swap (OIS) spread, which jumped last summer and has been very volatile ever since, evidence of banks’ reluctance to lend to each other.
Rosengren covered the role of liquidity in risk modeling, which he noted was largely underestimated in many models over the last year. He also noted that other fundamental assumptions were wrong, like the assumption that housing prices across the US are not correlated (he showed a chart of regional housing data over the last five years that looked highly correlated).
Rosengren also spoke about the impact of rogue trading and legal settlements. Many institutions think these losses are 1-in-1,000-year events, but as we get more data, it’s emerging that these events are much more common than previously thought.
Regarding scenario analysis and stress testing, Rosengren asked how much confidence we should put in them. In many cases, the stress tests did not accurately take the risks into account. He noted that the effect of falling housing prices was not accurately assessed, and that the impact of mortgage defaults on liquidity was universally missed.
In the Q&A period, he went on to say that we need to be more humble about the effect of some of these unexpected events and that we need to broaden our thinking about what could possibly happen.
A key theme of Rosengren’s talk is that organizations are too willing to ignore what they consider 1-in-1,000-year events, when in fact these events are turning out to be quite frequent. For instance, last year there were 14 losses over $1 billion reported. He reinforced this notion in the Q&A session, noting that extreme losses have occurred much more frequently than we would have assumed a couple of years ago.
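To see why 14 billion-dollar losses in one year undermines the "1-in-1,000-year" label, here is a rough Poisson calculation. The population of 1,000 institutions is an assumed number for illustration, not a figure from the talk:

```python
import math

# If billion-dollar losses were truly 1-in-1,000-year events, how surprising
# would 14 of them in a single year be? Assume (hypothetically) a population
# of 1,000 institutions, each with a 1/1,000 annual chance of such a loss,
# so about one loss would be expected per year across the whole population.
lam = 1000 * (1 / 1000)   # expected billion-dollar losses per year
observed = 14

# Tail probability P(X >= 14) for X ~ Poisson(lam)
p_tail = 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                   for k in range(observed))
print(f"P(>= {observed} such losses in a year) = {p_tail:.1e}")
```

The tail probability is vanishingly small, which suggests the true frequency of these "extreme" losses is orders of magnitude higher than the labels institutions attach to them.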
Rosengren was followed by Randall Kroszner, Member of the Board of Governors, Federal Reserve. Kroszner took a broader perspective on Basel II, and the enhancements the framework committee is considering. He noted that banks pursuing AMA qualification need strong senior management and board oversight. He also noted that senior management can create an AMA that’s reflective of organizational realities.
Kroszner noted that Basel II has been the official regulation for just one month, but implementation will take some time. Implementation must be approached “thoughtfully and deliberately” by individual banks, which should start with a sober and frank appraisal of their current state.
The core banks will have to have a plan in place for AMA qualification by Oct 1, and Kroszner noted that this will require buy-in and resource commitment from the top.
Kroszner also noted that the Fed’s hope is to provide more information over the next couple of months, but he provided some initial thoughts on what the plan will have to cover:
Gaps between existing practice and AMA
Objective and measurable milestones
Planning and governance process for meeting qualification requirements fully
He noted that the final rule allows 36 months before exiting the parallel run phase.
After some discussion of upcoming improvements to the Basel II framework, Kroszner addressed the standardized approach for non-core banks. He stated that the Fed expects that Basel II (referring to both the AMA and standardized approaches) will make the US banking system more resilient.
A key theme that emerged from Kroszner’s talk and the subsequent Q&A period was that a one-size-fits-all approach is probably not best for the range of institutions we have in the US. Rosengren noted in the Q&A period that the final rule is more of a principles-based than a rules-based document and repeated that “it’s not clear that one size fits all.” He also noted that there’s already a wide range of practices in play right now.
Someone asked whether Basel II makes us more vulnerable to systemic risk because of model convergence. Kroszner responded that the flexibility of the final rule and the judgment afforded by the ICAAP process should mitigate systemic risk. Rosengren said that oprisk has enough variety in the modeling, but that credit risk calculations over the last year may have been too reliant on the same historical data.
John Whittaker’s session on operational risk and aligning with the business covered some interesting approaches:
Barclays defines 13 principal risks that the business owns. The oprisk function can provide guidance on the control framework to mitigate each risk, but the oprisk function does not control the risk. The real process of operational risk does not sit in the corporate function.
Operational risk should be involved in discussions of strategy: it helps think through how the business can maintain its performance objectives during a 1-in-7 or 1-in-20 downturn; it participates in new product approval; and it reviews the impact of large events. Whittaker also noted that oprisk should be involved in the stress testing process.
Operational risk managers need to understand the business intimately. This allows the function to influence decision-making effectively.
With regard to reporting, try taking away a report to see how much value it actually has; some reporting isn’t delivering the value its producers think it is. Also, trend analysis and comparison are important, not just absolute numbers. The main point is to create a discussion, which brings operational risk into the business.
This weekend President-elect Barack Obama was interviewed by Tom Brokaw on Meet the Press. The interview covered a wide variety of topics, but one caught my eye as it impacts the risk management business moving forward.
On the subject of regulation in the financial services industry, Obama was very clear:
“And so, as part of our economic recovery package, what you will see coming out of my administration right at the center is a strong set of new financial regulations in which banks, ratings agencies, mortgage brokers, a whole bunch of folks start having to be much more accountable and behave much more responsibly because we can’t put ourselves–we, we can’t create the kind of systemic risks that we’re creating right now, particularly because everything is so interdependent. We’ve got to have transparency, openness, fair dealing in our financial markets. And that’s an area where I think, over the last eight years, we’ve fallen short.”
So, what does this mean for the risk management business? Well, there are two key points about what Obama said. First, he mentions accountability. The question is: accountable for what? My guess is that the accountability he’s talking about means, for instance, that rating agencies will have to be accountable for the ratings they issue, that banks will have to be accountable for describing accurately, and completely, the securities they are selling, and so on. Second, he mentions transparency and openness. Clearly, banks are going to have to provide more transparency around reporting on risk in their business. And with more stringent reporting requirements will come greater emphasis on internal reporting on internal controls and risk exposure. Steve Adler of IBM blogged about this 10 months ago. It won’t take another 10 months for stricter regulation to materialize; the question is how the industry will respond.
Hosted by Barclays, this year’s OPEN (OpenPages European Network) Summit promises to be the best yet, with a jam-packed agenda including real-world case studies from OpenPages customer executives at Allianz, Barclays, Lloyds, ORX and Swiss Re. Joining them will be executives and product experts from OpenPages who will share the latest OpenPages product developments and review OpenPages investments and rapid customer adoption in EMEA.
If you’re unable to make it, check back for a recap of the event in the following week. Otherwise, we look forward to seeing you at Canary Wharf in London!
We’re now in “Moving Operational Risk Forward” or “Getting Value from ORX Data and Tying Operational Risk into Each Business Unit” with Joe Sabatini, JP Morgan, and Simon Wills, ORX. The introduction is being given by David Millar, PRMIA, who opened the session with a statement on the fire evacuation procedures. Some will remember that a fire alarm during an operational risk conference is not unheard of.
Sabatini started out by echoing comments from a previous session: namely, that the increased regulatory pressure will increase the challenges of managing operational risk at regulated entities.
Loss data, according to Sabatini, has been one of the main drivers for change within the operational risk field. Before loss data was collected, no one really knew how much money was being lost to operational risk. With the collection of loss data, business lines understood how critical operational risk was.
With regard to capital calculation, the Enron/WorldCom data points included in a traditional LDA approach would suggest that JP Morgan needs $50 billion in capital, driven largely by investment banking underwriting risk. Sabatini discussed an approach similar to that used in the credit world, where you estimate the probability of an issuer default, the probability of investors bringing and winning a suit, and the resulting loss. This approach produces a more realistic capital number.
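A minimal sketch of that credit-style decomposition might look like the following; every figure below is hypothetical and chosen only to show the mechanics, not anything from Sabatini's talk:

```python
# Hypothetical sketch of the credit-style decomposition described above:
# instead of treating a mega-loss as a single LDA data point, estimate the
# chain of probabilities that produces it. All figures here are made up.
underwriting_exposure = 10_000.0   # $mm notional underwritten for an issuer
p_issuer_default     = 0.02        # probability the issuer defaults
p_suit_given_default = 0.30        # probability investors then sue the underwriter
p_plaintiff_wins     = 0.25        # probability the suit succeeds
loss_given_judgment  = 0.40        # fraction of exposure lost if it does

expected_loss = (underwriting_exposure * p_issuer_default *
                 p_suit_given_default * p_plaintiff_wins * loss_given_judgment)
print(f"Expected underwriting litigation loss: ${expected_loss:.2f}mm")

# Compare with naively charging capital against the full exposure on default:
naive_loss = underwriting_exposure * p_issuer_default
print(f"Naive default-only figure: ${naive_loss:.2f}mm")
```

Conditioning the loss on the full chain of events, default, suit, adverse judgment, is what pulls the capital number down from the implausible figure a raw loss-data approach can produce.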
Sabatini also discussed some of the challenges and opportunities with regard to risk management, including business unit benchmarking, trend analysis, correlation with business metrics, and dynamic reporting. He also suggested that a significant advance would be a real-time dashboard that would allow what-if analysis discussions between the market, credit and operational risk functions.
Simon Wills then gave an overview of ORX, our customer and partner. He said that they will be up to 54 member institutions when they announce their newest member tomorrow. Wills noted that ORX follows the Basel II categorization, with an additional category for corporate losses (ransom paid for a kidnapping of the chairman, for instance).
ORX also collects data on the product (e.g., equity derivatives) and process (e.g., sales and marketing) associated with the losses, which provides a greater degree of granularity to the loss. ORX also collects additional information on large losses (over €10 million).
Wills shared some recent data on operational risk losses, and noted that sales and trading have been the drivers of the large number of losses in 2008, whose aggregate severity rivals that of the Enron/WorldCom losses of 2002.
ORX is interested in a better visualization of the data to improve the communication and engagement of operational risk with the business. Corporate finance, for instance, tends to have low frequency and high severity losses, the opposite of losses in the retail business. Wills showed a 3D graph of the two different loss data sets, with dramatic spikes in the corporate finance business.
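The contrast Wills described, high-frequency/low-severity retail losses versus low-frequency/high-severity corporate finance losses, can be illustrated with a small frequency/severity Monte Carlo. The Poisson rates and lognormal parameters below are invented for illustration, not ORX data:

```python
import math
import random

# Monte Carlo sketch of the two loss profiles described above.
def sample_poisson(rng, lam):
    """Knuth's multiplication method; fine for the modest rates used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_annual_losses(rng, freq_lambda, sev_mu, sev_sigma, years=2000):
    """Annual aggregate losses under a simple frequency/severity model."""
    totals = []
    for _ in range(years):
        n = sample_poisson(rng, freq_lambda)
        totals.append(sum(rng.lognormvariate(sev_mu, sev_sigma) for _ in range(n)))
    return totals

rng = random.Random(42)
# Retail: many small losses per year; corporate finance: rare but severe.
retail = simulate_annual_losses(rng, freq_lambda=200, sev_mu=0.0, sev_sigma=1.0)
corpfin = simulate_annual_losses(rng, freq_lambda=0.5, sev_mu=6.0, sev_sigma=2.0)

# The corporate finance profile is far spikier relative to its average year,
# matching the dramatic spikes in the 3D graph Wills showed.
print("retail  max/mean:", max(retail) / (sum(retail) / len(retail)))
print("corpfin max/mean:", max(corpfin) / (sum(corpfin) / len(corpfin)))
```

Note that in this sketch most corporate finance years have zero losses at all, while the worst year dwarfs the average, which is exactly why the two businesses need very different visualizations of the same loss data.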
Wills talked about ORX sector services that will provide insight for different business units to benchmark against their peers, and, in this way, provide real business performance value to operational risk managers and their business line colleagues.
Patrick de Fontnouvelle of the Federal Reserve Bank of Boston presented an interesting session at GCOR 2010 titled, “The Role of Operational Risk in the Recent Financial Crisis.” His basic premise was that the financial crisis of 2008 could have been avoided had financial institutions implemented and followed basic operational risk management best practices. More important, he argued that such best practices have been violated repeatedly throughout history, with predictable consequences. He recommended three steps to moving forward and preventing similar crises in the future:
We must work to develop and normalize operational risk management and measurement
Outreach is critical: there is a lack of understanding or a misunderstanding regarding the nature and impact of operational risk
Governance: the risk function must have sufficient stature and authority to take action against questionable practices (in other words they must have a seat at the table)