It’s well known that a company’s tone at the top is critically important in determining its culture, including whether or not it will act with integrity and ethical values – fundamental elements of effective internal control and risk management. And we know it’s not only the words spoken at the top, but also the CEO’s actions that drive culture. What brings this to mind is the recent conviction of the CEO of fraud detection firm Fraud Discovery Institute. While a conviction of the head of this type of firm might appear unusual though not particularly noteworthy, what’s truly compelling about this news is that the CEO is none other than Barry Minkow.
If you were following internal control, risk management and fraud back in the late 1980s, you’ll likely remember the well-publicized fraud carried out by Minkow when he led ZZZZ Best Co. Reportedly he started the business at age 16, and took it public with the value exceeding $200 million. But it turns out he cooked the books and falsified documents to support the fraudulent financial statements. Having been found out, he was convicted and sentenced to a 25-year prison term, ultimately serving a bit more than seven years. After leaving prison, he started Fraud Discovery Institute in San Diego to uncover corporate fraud for clients, and took on a role as pastor of a community church. Why would anyone hire his newly formed firm? Well, certainly Minkow could be termed an expert in how to commit fraud, and thus how to prevent it, and having paid his debt to society it’s understandable that he was given the benefit of the doubt in redemption and starting a new and productive life.
It would be nice if this story had a happy ending, but it turns out that in his new firm Minkow reverted to his old ways. Prosecutors claimed that Minkow “made false and misleading statements about Miami homebuilder Lennar Corp.’s financial condition to drive down the company’s share price [and] abused his relationship with federal law enforcement agents to get non-public information about Lennar and traded on that information.” And the 45-year-old Minkow was sentenced in federal court to a five-year prison term.
One could say that “once a crook, always a crook,” but that would be unfair. People do bad things, then turn to the straight and narrow and do good deeds with their lives. Nonetheless, when it comes to leading a business, it’s not three strikes and you’re out, but two – or more likely one. The tone at the top and actions of a CEO are too important to entrust to anyone without a background not only of skill and performance, but also of acting with integrity and ethical values.
As you may know, the Dodd-Frank Act gave institutional investors and shareholder activists perhaps the item highest on their wish list – ready access to the proxy statement, with the ability to name their own director nominees. And the SEC developed enabling rules to make it happen. Well, the U.S. Court of Appeals for the D.C. Circuit just pulled the rule out from under shareholders. If you’re a shareholder activist, you’re probably outraged, but if you’re a board member or member of the senior management team, you’re likely breathing a sigh of relief!
The suit was brought by the Business Roundtable and U.S. Chamber of Commerce, and many thought it didn’t have much chance of succeeding. But succeed it did. The court ruled the SEC “acted arbitrarily and capriciously” in failing to adequately consider the rule’s effect on “efficiency, competition and capital formation.” In its unanimous decision, the court added that the SEC “inconsistently and opportunistically framed the costs and benefits of the rule; failed adequately to quantify the certain costs or to explain why those costs could not be quantified; neglected to support its predictive judgments; contradicted itself; and failed to respond to substantial problems raised by commenters.”
And this isn’t the first time the Court has shot down SEC rules – it’s happened several times in the last few years, also on the basis that the SEC didn’t properly assess the economic effects. So, where does the Commission go from here? Since this decision was issued by a panel of the Court, the SEC could ask the entire Court to review the case, or appeal to the U.S. Supreme Court. Or it might conduct a more in-depth economic assessment of the rule to satisfy the Court, or come up with another rule. While the U.S. Chamber calls its victory “a big win for America’s job creators and investors,” the SEC says it is “reviewing the decision and considering our options.”
For what it’s worth, my view is that direct shareholder nomination of directors can be counterproductive. While seemingly supported by the concept of a democratic process, putting dissident or single-issue directors on the board – which might well have occurred – would normally not serve the board, the company or its shareholders well. While the SEC’s rule seemed reasonable in terms of effecting the law’s mandate, perhaps the SEC can come up with something better.
You’re a CEO, senior manager, or board member watching your once-great company brought to its knees. You imagine yourself on the deck of the Titanic, your world coming to an end—your once confident self embarrassed in front of colleagues, competitors, friends, family, and the larger communities in which you once thrived and were held in such high esteem.
This is the first sentence of a just-released book published by John Wiley & Sons. I got my hands on an advance copy, and it is compelling reading. It analyzes how – while facing different circumstances in different industries – common themes underlie why once-great companies have seen their fortunes sink, while others withstand economic turbulence and hazards to continue to grow and reap the rewards of success. But the book is not solely about how to avoid disaster. It highlights how having the right infrastructure enables an organization’s positive qualities to lead to success. This includes what’s needed to avoid the kinds of disasters that can befall any organization, but it is also essential to identifying opportunities and being positioned to seize them for competitive advantage.
I don’t often recommend books to others, but this one is exceptional. It has a long title: Governance, Risk Management and Compliance – It Can’t Happen to Us: Avoiding Corporate Disaster While Driving Success. I believe the substance stands up to its claim that “unlike other books, this one is not aimed solely at senior managers or solely at members of boards of directors. It’s directed to both, with an added objective of providing insight into the interface between the two.”
You might be asking why Steinberg is spending so much space here touting this book – is it because the book is really that valuable, or does he have some ulterior motive? Well, okay, I’ll fess up – the answer is “both.” Yes, as you may have guessed, I wrote the book. And I apologize for withholding that important fact until now! But I do believe virtually any reader of this blog will greatly benefit from reading the book. And I’m pleased that I’m not the only one who thinks so. Here’s what some others, whose names you might recognize, are saying:
- Rick Steinberg is a time-tested expert in this ever more essential field. His refreshing candor in assessing recent shortfalls makes this book a must-read for corporate leaders — Mark R. Fetting, Chairman and CEO, Legg Mason, Inc.
- This outstanding book provides a critically important perspective on how risk management can only be truly achieved by aligning culture, strategy, compliance programs, and compensation. It should be must reading for any board member concerned with improving the management of risk — Jay Lorsch, Louis E. Kirstein Professor of Human Relations, Harvard Business School
- A comprehensive and insightful examination of corporate governance. A must-read for those of us who are CEOs and serve on public boards — Randall L. Clark, Chairman and CEO, Dunn Tire LLC; former Chairman and CEO, Dunlop Tire North America
- Attention directors and officers: Ignore this book at your own peril. Richard Steinberg has crafted a careful, thoughtful approach to managing risks, and it should be required reading for Corporate America — Scott S. Cohen, founder and former Editor and Publisher, Compliance Week
- Richard Steinberg’s comprehensive and clearly written work will substantially benefit both new and experienced directors. It will help corporate boards recognize the challenging forces businesses face, as well as the techniques and standards available to intelligently monitor and supervise firms and their senior management. An easy and engaging read, this book should be on the bookshelf of every corporate director — William T. Allen, Director, NYU Pollack Center of Law & Business; former Chancellor, Court of Chancery of the State of Delaware
- Richard Steinberg, a respected and time-proven governance hand, has written a most enjoyable and thought-provoking work—an excellent addition to anyone’s governance shelf! — Charles Elson, Edgar S. Woolard, Jr., Chair in Corporate Governance and Director of the Weinberg Center for Corporate Governance, University of Delaware
By the way, the IBM OpenPages people were kind to allow me to use a paper I wrote for them as the basis of one of the chapters. I hope you will consider reading the book, and I trust you will not be disappointed!
If you’re involved with compliance, you must know that the SEC issued its final rules on whistleblowing. The original proposal was hugely contentious, with serious concern that employees would bypass companies’ internal reporting channels established as part of comprehensive compliance programs instituted and enhanced over recent years, and instead run directly to the SEC for a lottery-size payday.
The SEC’s director of enforcement initially had said that the agency would be “mindful of competing interests” as it shaped regulations around the new law. Well, there are changes from the proposed rules to the final, but compliance officers and their companies are disappointed, understandably so. Unfortunately, as with the proposed rules, reporting first internally is not required. Among the changes is a provision for employees to report internally and then, within 120 days (rather than the proposed 90 days), go to the SEC and still maintain a “place in line” for a major payday from the regulator. Also, certain specified personnel are excluded from being paid by the government – generally lawyers, auditors, compliance personnel, and those themselves involved in the misconduct – although there are exceptions to the exclusions. And interestingly, when a whistleblower reports to the SEC, related information subsequently provided by the company to the SEC is attributed to the whistleblower. Officials from the Association of Corporate Counsel have said the rule will result in “gutting” compliance systems, and the U.S. Chamber of Commerce continues to be up in arms. The final rule passed by a 3-2 vote, with two of the five commissioners voting against it. In a survey of directors, 67% said this is the most detrimental part of Dodd-Frank.
Suffice it to say here that the modifications from the proposed rules to the final are such that compliance and other corporate officers continue to believe their past efforts in establishing internal whistleblower protocols are being undermined, and they will need to work hard and be creative in encouraging employees to work within internal reporting systems. One law firm says the best line of defense is to have robust internal compliance and audit procedures designed to proactively uncover potential wrongdoing and, where misconduct is found, to promptly address and remediate it aggressively before a whistleblower surfaces. Easy to say, challenging to do. Clearly, there’s a lot of work ahead for compliance officers, general counsels and their colleagues.
OpRisk Europe 2011 – now in its 13th year – commenced today at the historic Waldorf Hotel in the West End of London. It is somewhat ironic that the risk management conference is taking place in the stylish hotel whose interior is said to have inspired the designers of the “unsinkable” Titanic – itself a classic case study in risk management.
In one breakout session, Andrew Sheen of the FSA’s risk frameworks and governance team discussed recent developments from the BIS and their impact on operational risk. Citing the Basel Committee’s recently updated paper “Sound Practices for the Management of Operational Risk,” Sheen emphasized several key considerations for the board of directors and senior management team. In particular, he emphasized the need for the board to set the tone at the top in order to promote a strong risk management culture, and that banks should “develop, implement and maintain an operational risk management framework that is fully integrated into the bank’s overall risk management processes.” He also provided guidance for senior management. In particular, he noted that senior management should:
- “Develop for approval by the board a clear, effective and robust governance structure
- Be held responsible for implementing policies, processes and systems for managing operational risk and ones that are consistent with risk appetite and tolerance
- Implement an approval process for all new products, activities, processes and systems that fully assesses operational risk; and
- Regularly monitor operational risk profiles and material exposures to losses”
There’s been a lot of great content on Day One of OpRisk Europe, and I’m looking forward to tomorrow’s panel discussion on “The Impact of New Regulation on Operational Risk Management.”
The following excerpts are taken from “Compliance, complexity and the need for XBRL: An interview with former SEC Chairman Christopher Cox.”

What are the key drivers of regulatory reform? Will Dodd-Frank really reduce systemic risk? Can better compliance processes drive better financial results?
In the weeks running up to the Vision 2011 and OPUS 2011 conferences, experts within IBM Business Analytics Financial Performance and Strategy Management posed these and other questions to Christopher Cox, a former SEC Chairman and keynote speaker at both events. Below is a transcript of that interview.
Looking forward into the next three years, what are some of the key drivers in the US that will be shaping regulatory and compliance reform? How are those different from the past?
The most significant characteristic of the time we are living in right now is the remarkable pace of change, both in legislation and in regulations governing corporate America, in particular the financial sector.
Of course, the Dodd-Frank 2,300-page behemoth is well-known already to senior finance executives. But what is unknowable are the hundreds of rules that will be forthcoming under that legislation. The schedule called for in the statute has the bulk of the final rule makings scheduled for completion in the third quarter of 2011. It is very clear across the regulatory agencies that these deadlines are going to be missed.
As a result, not only will there be regulatory uncertainty on a continuing basis this year, but also for several years into the future. There are over 100 rule makings that have no statutory deadline at all. I think a significant share of even those that were expected to be completed earlier will also be rolled into the future. So during all of this time, senior Finance executives are going to have to be reading the tea leaves – not to mention the statute itself – to determine how to comply. And it isn’t just Dodd-Frank, of course, where we have all this legislative and regulatory ferment. The unprecedented rapid pace of change in law and regulation and the continued uncertainty about what the government will do next pertains to the tax area as well. During the last year alone, Congress enacted no fewer than six major pieces of tax legislation – including the two “Obamacare” bills, the HIRE Act, the Education Jobs Act, the Small Business Jobs Act and, of course, the year-end Tax Relief Act that temporarily extended the current tax rates. That last piece of legislation bought us at least two years of tax certainty, but when it comes to long-term capital gains or any of the other rules governing the taxation of investment, two years are scarcely enough to permit long-term planning, and so the uncertainty continues.
That uncertainty about where financial, tax and regulatory policy are headed in turn creates a challenging environment within companies and within firms when it comes to shaping their response to regulatory and compliance changes. That’s the environment in which we find ourselves. Given the extent of this change and the predictable uncertainty that will continue for several years, it is very important that companies respond to this in ways that are exceptionally flexible.
How should Finance organizations prepare for this future regulatory environment in spite of uncertainties, particularly global companies that do business in multiple jurisdictions? What sustainable practices in their control and reporting processes and systems do they need to invest in to prepare for the future?
Being globally active, of course, only ramps up the uncertainty because the requirements from multiple jurisdictions are layered on the responsibility of senior Finance executives for U.S. compliance. It is nonetheless possible to synthesize thematically many of the global requirements because, at least topically, they have much in common. What is most important is that the different parts of a global organization can talk to one another and that the human beings who must extract information from the IT systems that collect and disgorge that information can rationalize it. In particular, companies that address these changes in ways that are adaptable and flexible will have a clear advantage. Companies that fail to manage the process in this way will likely find themselves non-compliant and their risk management practices called into question – not only by regulators, but also by their shareholders and their customers.
Do you think that the passage of Dodd-Frank will reduce systemic risk and improve stability in our financial services industry?
Unfortunately, the Dodd-Frank Act failed to address several of the most significant causes of instability in the financial system and sources of systemic risk. The first is the status of the government-sponsored enterprises (GSEs), Fannie Mae and Freddie Mac. Their current status in federal conservatorship is unsustainable. The government’s ongoing ownership and use of these GSEs as instruments of policy to stimulate the housing market is inconsistent with the ostensible aim of the legal conservatorships into which they’ve been placed, which is to restore them to financial health.
This is particularly salient, as the conservatorships have required the GSEs to engage in practices that support housing at the expense of their financial well-being. Likewise, the government’s completely unjustifiable practice of keeping these two GSEs off the federal balance sheet, even as they are under government ownership, makes a mockery of financial reporting norms and honest accounting. Addressing this glaring omission in the Dodd-Frank Act remains a top priority of financial reform.
Next in importance is the inadequacy of bank capital and liquidity standards. Dodd-Frank did not adequately address the obvious failure of the Basel standards in the financial crisis. Those standards continue to create powerful incentives for asset concentration in mortgages and a reliance on credit ratings, and of course both of those had a role in generating the mortgage bubble that led to the financial crisis.
So the short answer to that question would be “No.”
Correct. I’d also say that Dodd-Frank has given the Financial Stability Oversight Council a strong incentive to protect competitors rather than to protect competition, which might take market share from the dominant firms. The systemically important designation implies government readiness to support those firms in a crisis, perversely encouraging more risky behavior despite the more stringent capital and other requirements and thus deepening moral hazard.
Can you discuss some of the best practices for boards of directors with regard to risk oversight? Do you think that changes in proxy disclosure with regard to risk governance have had an impact on risk management practices?
Yes. In 2010, the SEC added requirements for proxy statement discussion of a company’s board leadership structure and its role in risk oversight. Now companies are required to disclose in their annual reports the extent of the board’s role in risk oversight, and they’re required to address such topics as how the board administers its oversight function, the effect that risk oversight has on the board’s processes, and whether and how the board or one of its committees monitors risk. That increased focus on risk management has had considerable and very earnest take-up across the corporate community.
There are several types of actions that companies and their appropriate committees have been taking to step up their focus on risk management. Without question, they are spending more time with management, and isolating the categories of risk that the company faces – focusing on risk concentrations and interrelationships, the likelihood that these risks might materialize, and the effectiveness of the company’s potential mitigating measures.
Many companies have created risk management committees. Financial companies, of course, that are covered by Dodd-Frank must have designated risk management committees, but boards of other companies have carefully considered the appropriateness of a dedicated risk committee, and many of them have found it prudent to create one. In other cases, boards have delegated oversight of risk management to the audit committee, which is consistent with the New York Stock Exchange rule that requires the audit committee to discuss policies with respect to risk assessment and risk management.
For large-cap companies that have a Big Board listing, that has continued to be another way to address these heightened concerns. I think boards are carefully bearing in mind that different kinds of risks may be better-suited to the expertise of different kinds of committees, so they may not always wish to stovepipe responsibility for risk in a single committee.
Above all, best practices today are focused on the fact that regardless of how the board subdivides its responsibilities, the full board has the responsibility to satisfy itself that the activities of its various committees are co-ordinated and that the company has adequate risk management processes in place.
It’s a fascinating world. I can see why if you’re a controller or CFO it’s an exciting but intense place to be.
I think that’s absolutely right. All of these changes we’ve discussed – in particular in the US – mean that we are entering an era of unprecedented demand on companies’ governance, risk, and compliance processes and IT infrastructures. I think that companies have dealt with regulatory changes over the past half-century largely incrementally. They’ve made adjustments to their enterprise-wide systems as needed to comply with what have been modest changes from year to year. But given the enormous scope of changes in these forthcoming new regulations, companies will find it necessary to take a comprehensive and holistic approach to at least regulatory reporting – and, in my view, their management control as well.
Companies have traditionally relied on different processes to gather enterprise data to help management run the business, on the one hand, and to gather data in order to satisfy regulators, on the other. In part, that was sustainable because the information that regulators were requiring was historical and post facto. But things are rapidly changing under these new frameworks. Regulators including the SEC are now requiring information that is risk-based and predictive. While that is a big change, it’s also a significant silver lining in that this will align the process of collecting and gathering information more closely with what management needs. That means that CIOs should be looking for ways to integrate their regulatory and their management reporting processes. For that reason, regulatory reporting doesn’t have to be viewed as sheer cost or a necessary evil. Instead, there can be significant efficiencies and productivity gains for the enterprise by merging the requirements of management and regulatory data gathering.
This convergence will also allow companies to restructure their data in a way that will feed predictive analytical systems. That, in turn, can lead to an improvement in both risk management at the board level, and risk-based decision-making processes at the management level.
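The interview’s title points to XBRL, and the interactive-data reforms discussed here rest on a simple idea: each financial fact in a filing becomes a machine-readable, tagged value rather than text on a page. As a minimal sketch of what that looks like – the entity identifier, taxonomy date, and figures below are invented for illustration, and a real filing declares units and draws on the full us-gaap taxonomy – a program can pull facts straight out of an XBRL-style instance document:

```python
import xml.etree.ElementTree as ET

# A simplified, illustrative XBRL instance fragment. Real filings are far
# richer; the point is that each financial fact carries a concept tag, a
# reporting context, and a precision (decimals) attribute.
INSTANCE = """<?xml version="1.0"?>
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://fasb.org/us-gaap/2011-01-31">
  <context id="FY2010">
    <entity><identifier scheme="http://www.sec.gov/CIK">0000123456</identifier></entity>
    <period><startDate>2010-01-01</startDate><endDate>2010-12-31</endDate></period>
  </context>
  <us-gaap:Revenues contextRef="FY2010" unitRef="USD" decimals="-3">2500000</us-gaap:Revenues>
  <us-gaap:NetIncomeLoss contextRef="FY2010" unitRef="USD" decimals="-3">300000</us-gaap:NetIncomeLoss>
</xbrl>"""

US_GAAP = "{http://fasb.org/us-gaap/2011-01-31}"

def extract_facts(instance_xml):
    """Pull every us-gaap fact into a {concept: value} dict."""
    root = ET.fromstring(instance_xml)
    facts = {}
    for el in root:
        if el.tag.startswith(US_GAAP):       # skip contexts, units, etc.
            concept = el.tag[len(US_GAAP):]  # strip the namespace prefix
            facts[concept] = float(el.text)
    return facts

facts = extract_facts(INSTANCE)
print(facts)  # {'Revenues': 2500000.0, 'NetIncomeLoss': 300000.0}
```

Once facts are structured this way, the same extraction can feed both a regulatory filing pipeline and management dashboards – which is the convergence of regulatory and management reporting described above.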
About Christopher Cox, Former Chairman, United States Securities and Exchange Commission (SEC)
Beginning in 1988, when he was elected to the House of Representatives, Christopher Cox established a record of legislative accomplishments that elevated him to the top of the Congressional leadership. His wide range of expertise in a variety of complex issues gives him the ability to take the long view of the economic future, predicting both the actions of Congress and the effects those actions will have on the marketplace. The author of the Internet Tax Freedom Act, which protects Internet users from multiple and discriminatory taxation, Cox held leadership positions ranging from chairmanships on committees and taskforces overseeing everything from budget process reform and policy to homeland security and financial services. During his tenure as chairman of the Securities and Exchange Commission, he continued this fight for justice and transparency in the world of finance.
An Accomplished Lawmaker and Reformer. During his seventeen years in Congress, Cox served in the majority leadership of the U.S. House of Representatives. He authored the Private Securities Litigation Reform Act, which protects investors from fraudulent lawsuits, and his legislative efforts to eliminate the double tax on shareholder dividends led to legislation that cut the double tax by more than half. In addition, he served in a leadership capacity as a senior member of every committee with jurisdiction over investor protection and U.S. capital markets, including the Energy and Commerce Committee, the Financial Services Committee, and the Joint Economic Committee.
An Advocate for Investors. At the SEC, Cox focused on securities law enforcement, bringing a variety of groundbreaking cases against market abuses such as hedge fund insider trading, stock options backdating, and municipal securities fraud. He also helped make the Internet a more secure environment, freer of securities scams, and he worked to halt fraud aimed at senior citizens. As SEC chairman, he was one of the world’s leaders in the effort to integrate U.S. and overseas regulatory policies in this era of global capital markets, making international securities exchanges safe, profitable, and transparent. As part of an overall focus on the needs of individual investors, Cox reinvigorated the SEC’s initiative to provide important investor information in plain English, championing the investor’s right to transparency. His reforms included transforming the SEC’s system of mandated disclosure from a static, form-based approach to one that taps the power of interactive data to give investors qualitatively better information about companies, mutual funds, and investments of all kinds.
In 1994 Cox was appointed by President Clinton to the Bipartisan Commission on Entitlement and Tax Reform, which published its unanimous report in 1995. From 1986 until 1988, he served as senior associate counsel to President Reagan. From 1978 to 1986, he specialized in venture capital and corporate finance with Latham & Watkins. Cox received an M.B.A. from Harvard Business School and a J.D. from Harvard Law School, where he was an editor of the Harvard Law Review.
We know the banks and related mortgage service organizations have been under fire for their role in the financial system’s near meltdown and ensuing foreclosure fiasco. JPMorgan Chase’s CEO Jamie Dimon reportedly owned up to taking some responsibility, saying “Some of the mistakes were egregious, and they’re embarrassing . . . but we made a mistake, and we’re going to pay for that mistake.” The 50 state attorneys general and the SEC, among others, are pushing for changes in how the banks and servicers operate, and there’s little doubt changes are coming.
In the interim, a report emanating from investigations by the Office of the Comptroller of the Currency, Federal Reserve Board, Office of Thrift Supervision, and Federal Deposit Insurance Corporation is expected to form a basis for a settlement under which the financial institutions would make fundamental changes in operations and controls. The banks and other servicers would, for instance, have to:
- Set up a single contact point within the organization, enabling homeowners to avoid what’s often a maze of different departments
- Take steps to ensure there will be no action to foreclose while borrowers are pursuing loan modifications
- Improve training of staff handling foreclosures
- Establish more layers of management oversight over the process
- Engage an independent consultant to review foreclosures over the past two years, and compensate homeowners who were treated improperly.
One wonders why adequate business process design and basics of internal control weren’t in place long ago, even if the volume of foreclosures wasn’t anticipated. The sloppiness has caused tremendous problems for both the banks and servicers on the one hand and their customers on the other – and executives should know by now that if a large swath of consumers is damaged, then laws and regulations will surely follow.
This of course is not the end for the banks and servicers – not by a long shot. They still need to deal with the state attorneys general and other regulators, and we can expect more required changes to be forthcoming, along with large financial payments for past misdeeds. Oh, if only the risks had been identified earlier and better managed, with appropriately designed business processes, and basic and supervisory controls and compliance in place.
The sea of blue suits at the OpRisk North America conference being held in New York City this week provides a stark contrast to the cold rain falling in Times Square. The conference kicked off with a keynote address from Mitsutoshi Adachi, director and deputy division chief at the Bank of Japan. Mr. Adachi, who also serves as chair of the SIG Operational Risk Subgroup for the Basel Committee, noted that his travel plans had to be moved up a few days in order to account for the continued travel delays out of Japan.
His keynote highlighted a recent report published by the Basel Committee on Banking Supervision titled “Operational Risk – Supervisory Guidelines for the Advanced Measurement Approaches,” which found that “Operational risk capital for non-AMA banks is higher than for AMA banks, regardless of the exposure indicator used for scaling.” Mr. Adachi also noted in his address that AMA firms showed “only modest increases in losses during the financial crisis period.” Certainly not an unexpected result, but what was telling was his finding that for the period of 2008 to 2009 (during the financial crisis), operational risk losses for all banks were “2 to 3 times fold” compared with the previous Basel Committee internal loss data collection period of 2005 to 2007. Mr. Adachi declined to field a question on which business lines contributed the most to the losses, per his obligation to keep such information confidential.
He concluded by saying that “the Basel Committee finds it even more important to engage with the industry” moving forward. I’m looking forward to the plenary address, “Reforming U.S. financial markets: reflections before and beyond Dodd-Frank.”
Unless you’ve escaped to a remote island with no communication capability, you know about the serious issues facing banks, mortgage originators, and servicing companies surrounding the foreclosure fiasco. For background, you might want to refer back to my October 15 blog which outlines some of the problems stemming from shortcomings in risk management and related internal control.
Well, the lawsuits have begun, with tens of billions of dollars at stake. State courts already have issued rulings, with the Supreme Judicial Court of Massachusetts, the State’s highest court, deciding that two major banks didn’t have the appropriate documentation when they foreclosed, and returning the properties to the borrowers. New York State’s chief judge, noting “it’s such an uneven playing field [where] banks wind up with the property and the homeowner winds up over the cliff [not serving] anyone’s interest, including the banks,” set forth procedures to ensure all homeowners facing foreclosure have legal representation. The impact in human terms is illustrated by recent reports that two large banks took action against active-duty servicemembers, overcharging 4,000 service personnel and reportedly failing to follow the Servicemembers Civil Relief Act, which allows mortgage rate reductions and outlaws foreclosures. More lawsuits are on the way, led by a former prosecutor driving a class action.
Not only might other states become more proactive, but no less than three federal government agencies have begun investigations – the Department of Justice’s Executive Office for U.S. Trustees, the Federal Housing Administration, and the Federal Reserve. And none of this has been lost on a coalition of all 50 state attorneys general, which recently presented the five largest banks with a set of game-changing demands. Reports say these include:
- A prohibition against beginning foreclosure proceedings while a borrower is actively seeking a loan modification
- A requirement that a borrower making three payments under a temporary loan modification agreement be granted a permanent modification
- Automatic review of modification turn-downs by an ombudsman or independent review panel
- Compensation programs that reward employees for pursuing loan modification rather than foreclosure
- Curtailment of late fees
- Where banks engage in misconduct, compensation of borrowers from a pre-established fund and reduction of mortgage balances
While some analysts say these changes would drag out the foreclosure process and delay stabilization of the housing market, this attorneys general plan is reportedly supported by the newly formed Consumer Financial Protection Bureau, along with the Departments of Treasury, Justice, and Housing and Urban Development, and the Federal Trade Commission.
We continue to wonder how major banks dealt with the basics of risk identification and analysis – the risk that reliable documents would be needed in the foreclosure process – and establishing control activities to ensure document processing was accurate and complete, with files intact and readily accessible when needed, and accountability in carrying out control procedures. And we can wonder about due diligence in selecting and using outsourcing firms.
Do risk management and related internal control matter? Unfortunately, learning too late may cost financial institutions billions of dollars.
Chief audit executives do a lot of things really well, adding value to the companies they serve. What is especially interesting is how well many, especially CAEs of larger companies, gain information and insight through networking. Many are involved with their peers in industry or geographically based discussion groups, sharing through blogs, conferences, and internet-based information exchanges. And of course there’s still the opportunity to communicate via email or text or pick up the phone to talk with a valued colleague.
I’m a member of one internet-based group – though I tend to read rather than write – and am struck by several themes that are the subject of intense discussion and debate. Among them is the extent to which internal audit can and should become more actively involved in their company’s “governance” activities, however the term is defined. There’s an emerging consensus that yes, they should, and with their insights and skill sets they can add significant value, with an eye toward moving up the organization scale from process to senior management’s and the board’s activities. Another topic is the transition from providing risk and assurance services to performing more consultative services. The debate is heated, recognizing that IIA Standards speak to and enable both, with strong views expressed regarding the opportunities to add value while keeping in mind the need to maintain independence and objectivity. A related subject under discussion involves opportunities for internal audit personnel to move within their companies to other staff or operating units, into any number of management positions. There’s recognition of the benefits to the internal audit function’s recruiting and development and ability to add value, though caveats are expressed and concerns exist regarding retaining objectivity.
Relevant is the IIA Research Foundation’s 2010 Common Body of Knowledge Global Internal Audit Survey, called the “most comprehensive global study conducted on the practice of internal auditing.” Of particular interest is where practitioners focus attention now versus where they see internal audit five years from now. The study shows that while current attention is centered on operational and compliance audits, auditing financial risks, fraud investigations and internal control evaluations, the focus will shift. Going forward internal audit is expected to be looking more closely at corporate governance, enterprise risk management, linkage of strategy and corporate performance, ethics, migration to IFRS, social and sustainability issues, and disaster recovery testing and support. Other topics are mentioned, so readers might want to take a look at the report.
I marvel at the internal auditor networks, where practitioners are benefiting from the exchange of information and thought. If you’re not already involved in one, you might consider looking into how you can do so.
We had the opportunity to host a panel on operational risk at GARP this week in New York. The panel, “Using Operational Risk Management to Gain Competitive Edge”, included moderator Christopher Donohue, Managing Director, Research and Educational Programs, (GARP), and panelists Marcelo Cruz, Global Head of Operational Risk Management and Metrics, Morgan Stanley, Patrick McDermott, Senior Director, Enterprise Operational Risk, Freddie Mac, and Mairtin Brady, Head of Operational Risk Management, TIAA-CREF, as well as me, Gordon Burnes.
At the beginning of the panel, McDermott outlined the basic set of questions that operational risk managers have to answer:
- What can go wrong?
- How bad can it get?
- How likely is it to happen?
- What are we going to do about it?
This is a great way to frame the essence of an operational risk manager’s job, and those new to the discipline will do well to make sure that their program addresses these fundamental questions.
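The four questions map naturally onto the fields of a simple risk-register record. Here is a minimal, hypothetical sketch – the class, field names, and figures are illustrative assumptions, not any panelist’s actual framework:

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One risk-register record, one field per question."""
    event: str                 # What can go wrong?
    worst_case_loss: float     # How bad can it get? (severity, in dollars)
    annual_probability: float  # How likely is it to happen?
    mitigation: str            # What are we going to do about it?

    def expected_annual_loss(self) -> float:
        # Simple expected-loss estimate: severity times likelihood.
        return self.worst_case_loss * self.annual_probability

entry = RiskEntry(
    event="Customer account data mishandled by a third-party processor",
    worst_case_loss=5_000_000.0,
    annual_probability=0.02,
    mitigation="Vendor due diligence plus quarterly control testing",
)
print(entry.expected_annual_loss())  # 100000.0
```

Even this toy record forces the discipline the panel described: every entry must answer all four questions before it can be prioritized.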
This was an interesting panel in that each panelist represented a different perspective on managing operational risk programs. The starkest contrasts were between Cruz, representing the quants, and McDermott, representing the value and importance of qualitative information. Cruz took particular issue with scenario analysis but did acknowledge the limitations of models as expressed in confidence levels. It’s clear that there’s a wide range of practice in the industry on this topic, with some banks relying heavily on scenarios to model their capital, others relying more on internal data.
All panelists agreed that the operational risk function is in the ascendancy and is increasingly being brought to the table to weigh in on strategic matters, such as acquisitions or new product launches. One of the key takeaways was that operational risk information can help businesses better define their risk profile, allowing business managers to make better decisions about where to invest, and where to focus mitigation efforts.
When organizations choose to shift their corporate mission and redefine organizational goals, it is vital that they carefully evaluate the potential risks and fallout from redefined core value propositions and tactics. A case in point is Toyota – a company that built its reputation on the quality of its product, but in recent years set its sights on profits.
With the introduction of the Prius to the U.S. market in 2000, it appeared that a strategic risk had paid off: Toyota had created a hybrid engine for the mass market that was a clear success, one even celebrated in the press by a drove of Hollywood celebrity drivers including Leonardo DiCaprio, Cameron Diaz, Larry David, Billy Joel, David Duchovny, and more.
However, in recent years Toyota has been plagued by a series of escalating vehicle malfunctions. While the entire scope of the financial loss is currently unclear, since 2009 the company has initiated recalls of over 14 million vehicles worldwide and paid more than $48.8 million in fines in the U.S. alone. The world’s number one automaker has also temporarily suspended U.S. sales of eight of its top models and halted production in five U.S. plants, an unprecedented step that clearly demonstrates the effort being made to maintain Toyota’s once solid reputation for customer satisfaction.
Overwhelming growth and the pressure to match increasing demand with production have stifled Toyota’s promise of reliability. It is yet unclear what effect these recalls will have on Toyota’s global standing in years to come, but potential customers will certainly approach the automaker’s brand more tentatively than in decades past.
The lesson here is that all corporations must be prepared to mitigate risk, especially when taking such a precarious step as redefining their core vision and business strategy. Toyota now faces the huge challenge of recreating its customer brand loyalty while at the same time maintaining the momentum that its swollen infrastructure investments require.
Or more to the point, was he thinking at all? We’re talking about Rajat Gupta, operating at the highest echelons of multinational business, who finds himself charged by the Securities and Exchange Commission with illegally passing inside information to Raj Rajaratnam, the Galleon Group founder about to go on trial on charges of insider trading. Mr. Gupta, a Harvard Business School graduate and former head of McKinsey & Co., has been a board member of the likes of Goldman Sachs, Procter & Gamble, and American Airlines.
What did he do? Well, he of course is innocent until proven guilty, and according to media reports, his lawyer says he has done nothing wrong. But the SEC says otherwise. It alleges Gupta gave Rajaratnam advance information about earnings at both Goldman and P&G. On top of that, the SEC maintains that Gupta called the Galleon head with the inside scoop on the Goldman board’s approval of Warren Buffett’s $5 billion investment in the firm. The allegations speak to multiple phone calls between the two men, enabling Galleon to reap millions in profits. What must be particularly troubling for both is that the SEC says it has recordings of numerous telephone conversations.
Let’s presume for a moment that the allegations are factual. A relevant question is: is this a black eye on the companies on whose boards Gupta sat? (By the way, reports say he resigned months ago from the Goldman board, and recently from P&G.) My answer, based on the information available, is “no.” Certainly, if the allegations are true, a statement by the SEC Director of Enforcement is on point: “Mr. Gupta was honored with the highest trust of leading public companies, and he betrayed that trust by disclosing their most sensitive and valuable secrets.” But what could or should have been done to prevent wrongdoing at the board level?
We know well the importance of a company’s board of directors in keeping a close eye on what the CEO and senior management team do, and on the company’s system of internal control. We recognize the importance of compliance officers, risk officers and internal audit functions. But who keeps an eye on the board, especially when their actions are outside the inner workings of the company itself? We can look to what happened years ago at HP, when a board member leaked information to the media, which resulted in the pretexting fiasco.
There are no immediate answers, other than to continue to ensure full vetting of director candidates, and maintaining effective board and internal audit processes to best identify and manage potential misbehavior. With the thousands of directors of major companies acting with extraordinary integrity and ethics and in the best interests of their companies and shareholders, I believe we don’t have much to worry about. But it is worth more thought going forward.
Fueled by a global audience that is desperately looking for disclosure in the wake of the economic crisis and mature digital computing technologies that make it more and more difficult to contain sensitive information, WikiLeaks has emerged as a viable new threat to data security.
Until now the United States government has been the central target of WikiLeaks attacks. However, with WikiLeaks founder Julian Assange’s recent claim to be ready to release corporate secrets in early 2011, organizations everywhere are faced with a looming risk management challenge that is not likely to dissipate anytime soon.
Experts agree, and Assange himself has suggested, that the information that will be leaked is more likely to consist of internal communications between executives and other employees rather than the personal data protected by privacy compliance laws. However, the threat of any kind of exposure means that corporations need to tighten data security and evaluate areas of potential vulnerability.
Unfortunately, WikiLeaks has highlighted a liability that persists across all corporations and government agencies that technology and compliance measures alone simply cannot contain: the human factor. The increasing number of compliance and regulatory mandates that have been put in place in recent years have not proven enough to combat the risk posed by employees leaking sensitive information.
A recent poll by Harris Interactive reports that only 9% of companies have adequate crisis protocols in place to protect themselves from a potential onslaught. In this period of uncertainty, with virtually all large enterprises on the WikiLeaks radar, it is vital that organizations devise an adaptable enterprise risk management strategy to identify and manage areas of weakness without sacrificing business performance.
Just as a sharp increase in regulatory compliance mandates has created a necessary shift in industry risk management tactics, so has WikiLeaks spawned the recognition of new vulnerabilities that face companies in the modern digital age. The organizations that are well prepared to assess and mitigate against untested threats, like the one posed by WikiLeaks, are those that combine deep domain expertise with powerful and flexible tools to analyze and weigh the probability and cost associated with any given challenge.
Last week we announced the availability of OpenPages version 6.0, which marks a major milestone in the evolution of the GRC market – from convergence to insight. It also represents the completion of the first phase of our technical integration with IBM. And the new release will help prepare our customers for managing through regulatory change in the post-Dodd-Frank environment.
Several industry experts have had positive things to say about the news:
- In IT Business Edge, Mike Vizard wrote:
“But there is a significant gap between collecting data and actually making it usable. The release of version 6.0 of the OpenPages GRC platform, which IBM acquired last year, is a significant step forward in terms of closing that gap by tightening the integration between OpenPages and the business intelligence (BI) software from Cognos that IBM also acquired back in 2007.”
- Compliance Week covered the news in a blog post
- Industry Analyst Guillermo Kopp wrote a report on 6.0, which details the key benefits and opportunities for the combined solutions of OpenPages and IBM. In regards to integrated risk management he says:
“A centralized governance, risk, and compliance (GRC) platform will help large companies manage various risks across client, location, product, and service domains. For financial firms, integrating financial risk dimensions (e.g., credit, market) will augment the challenge substantially.”
- 6.0 was also featured as the top story in CMS Wire’s GRC Roll-up
We’re happy to see the positive reaction to 6.0!
In the wake of Dodd-Frank’s passage, Chris McClean of Forrester Research commented that there are nearly 200 regulatory changes still on the U.S. federal agenda, spanning industry verticals such as finance, healthcare, and consumer protection.
As regulatory pressures continue to mount, organizations that adopt a more practical regulatory management approach across the enterprise will be able to react more quickly to regulatory change and decrease costs and complexity, while gaining valuable insight into the risks that could affect corporate performance in the form of legal action, fines and penalties, or a decline in company/brand loyalty.
The recently announced OpenPages 6.0 includes significant enhancements to the Policy and Compliance Management (PCM) module that allow organizations to react quickly to changes in regulatory mandates and to manage regulator interactions effectively:
- Regulatory Change Management — lets users easily communicate, track, and manage regulatory change and enables quicker reactions.
- Regulator Interaction Management — provides workflow enablement to help users prepare for and manage complex regulator interactions.
- Policy Lifecycle Management — offers a new user-friendly view to consolidate policy details with configurable field/template definitions.
Policy and compliance management software is playing an increasingly important role in the business by allowing companies to easily communicate changes in laws and regulations and enable quicker reactions by the business.
My last posting spoke to one of COSO’s two recently issued guidance reports on enterprise risk management. The first provides approaches for getting started on an ERM initiative, and while it’s based on good intentions and provides useful information, especially to smaller companies, in Olympic games terms with only two entrants, that report gets the silver. The second report, Developing Key Risk Indicators to Strengthen Enterprise Risk Management – How Key Risk Indicators Can Sharpen Focus on Emerging Risk wins the gold – by a good margin.
COSO’s ERM report Application Techniques volume touches on the topic of key risk indicators, use of which was not commonplace at the time. Since then, along with key performance indicators, which focus primarily on past performance, more organizations have incorporated forward looking key risk indicators into their ERM processes, further enhancing risk management effectiveness. This new report does a good job of explaining KRIs and how they can be of benefit. A couple of simple examples include:
- For customer credit, where a common KPI includes data about customer delinquencies and write-offs, KRIs are developed to help anticipate future collection issues – for example, by analyzing the reported financial results of a company’s 25 largest customers, or general collection challenges throughout the industry, to see what emerging trends among customers could potentially signal collection challenges going forward.
- Management of a chain of family-style restaurants sought to avoid a negative earnings event that could arise with unexpected market conditions. Recognizing that restaurant traffic is directly affected by customers’ discretionary income – as discretionary income falls, customers are less likely to dine out – management establishes average gasoline prices at the pump as a KRI. This is based on the premise that when gasoline prices rise, discretionary income for the individuals and families representing the chain’s core customer base decreases, and customer traffic begins to drop.
As such, KRIs enable management to take quicker action in dealing with the risks. In the latter example, management is positioned to adjust marketing and promotion events to reduce the impact of the risk.
The report explains how KRIs are most effective when closest to the ultimate root cause of the risk event, providing more time for management to act proactively. And multiple KRIs can provide still more relevant information, keeping in mind that a close relationship between the KRI and the risk, and accuracy of the information used, are both critical. Another benefit is the ability to readily track trend lines with dashboards or exception reports, quickly and easily communicating where action may be needed.
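The exception-report idea can be sketched in a few lines of code. The example below is a hypothetical KRI monitor: it flags periods where the rolling average of an indicator (here, the pump price of gasoline from the restaurant-chain example) breaches a threshold. The function name, window, and threshold are illustrative assumptions, not anything prescribed by the COSO report:

```python
def kri_exceptions(readings, threshold, window=3):
    """Return (index, rolling_avg) pairs where the rolling average
    of the last `window` readings breaches the threshold."""
    exceptions = []
    for i in range(window - 1, len(readings)):
        avg = sum(readings[i - window + 1:i + 1]) / window
        if avg > threshold:
            exceptions.append((i, round(avg, 2)))
    return exceptions

# Monthly average gas prices ($/gallon); act when the
# three-month rolling average tops $4.00.
prices = [3.60, 3.75, 3.90, 4.10, 4.30, 4.25]
print(kri_exceptions(prices, threshold=4.00))  # [(4, 4.1), (5, 4.22)]
```

Using a rolling average rather than a single reading smooths out noise, so the exception report flags a sustained trend rather than a one-month spike – the kind of early, root-cause-adjacent signal the report recommends.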
With KRIs continuing to gain recognition as important elements of enterprise risk management, this COSO report provides readily usable information and is definitely worth the read.
Today we announced the availability of OpenPages 6.0. This release represents a significant new phase in the evolution of GRC and provides organizations with the insight needed to drive business outcomes as well as the ability to manage effectively through the changing regulatory environment. We’re also excited to have completed the first phase of technical integration with IBM with the release of AIX support.
The GRC market developed out of the tactical, departmental deployment of SOX and other compliance and risk management solutions. Companies realized that they could leverage their control testing and risk assessment activities across multiple different oversight functions by consolidating their risk and compliance efforts on a common technology platform. Indeed, we’ve seen very strong ROIs for Enterprise GRC platforms, ROIs driven by this efficiency. The next phase in the evolution of GRC is about insight, using the GRC data to help drive business outcomes.
Here’s an example of how GRC data can be used to drive business outcomes. Imagine a multinational bank that has a subsidiary in France. The compliance team has identified some procedure violations with regard to the handling of customer account data. The audit team has found some major control weaknesses surrounding customer account data, and the operational risk team has observed some KRIs above threshold. Any one of those functions may not escalate their particular findings, but, taken as a whole, the GM in France would be able to see that the business is at great risk of a significant loss. This is the kind of insight that can help drive business performance, in this case avoiding a fine and loss of brand stature.
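A toy version of that aggregation logic makes the point concrete. In the sketch below – where the function names, severity weights, and escalation threshold are all illustrative assumptions – no single finding clears the bar on its own, but the combined score across oversight functions does:

```python
# Map qualitative finding severities to numeric scores.
SEVERITY = {"low": 1, "medium": 2, "high": 3}

def should_escalate(findings, threshold=6):
    """Sum severity scores across oversight functions for one
    business entity; escalate when the total clears the threshold."""
    score = sum(SEVERITY[severity] for _function, severity in findings)
    return score >= threshold

france_findings = [
    ("compliance", "medium"),    # procedure violations on account data
    ("internal_audit", "high"),  # major control weaknesses
    ("op_risk", "medium"),       # KRIs above threshold
]
print(should_escalate(france_findings))  # True
```

Each function’s finding scores below the threshold in isolation; only the consolidated view – the platform’s job – surfaces the combined exposure to the GM.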
OpenPages 6.0 will provide better insight through enhanced business intelligence. The power user will benefit from easier report building and in context data presentation through Cognos mash-up services. The business user will benefit from interactive dashboards, and the executive from data syndication through Office and mobile devices. We’ll discuss some of the other new capabilities in 6.0 in subsequent blogs.
COSO recently released reports providing guidance in two areas related to risk management. One is Embracing Enterprise Risk Management – Practical Approaches for Getting Started, which suggests ways in which companies, especially smaller ones, can begin a risk management initiative with the objective of ultimately moving to an ERM process. It puts forth “keys to success” in terms of a number of “themes,” beginning with being sure to have support from the top. Theme 2 is building on incremental steps, which includes implementing key practices to gain immediate and tangible results. Theme 3 continues with focusing first on a small number of “top” risks, and theme 4 is leveraging existing resources by utilizing the capabilities of the chief audit executive, chief financial officer or other executive as a catalyst to begin the initiative.
The guidance continues with theme 5, building on existing risk management activities already being performed, for example, by internal audit, insurance or compliance functions, fraud protection/detection measures, or credit or treasury functions. Theme 6 involves embedding risk management into the fabric of the business, and concludes with theme 7’s continuing to update and educate senior management and the board on evolving ERM practices.
The guidance also provides seven “action steps” to support development of an ERM initiative: Seeking board and top management leadership, involvement and oversight; selecting a strong leader for the ERM initiative; establishing a risk committee or working group; conducting an enterprise wide risk assessment and developing a related action plan; inventorying existing risk management practices; developing a communication and reporting process; and developing the next phase of action plans and communication.
As stated in the report, the suggested incremental step-by-step approach may be particularly useful to smaller companies, and importantly, the suggested approach is only a starting point for moving to an enterprise risk management process. I believe the report is well meaning, looking to break down barriers and resistance to embarking on building an ERM process, and as such may be useful to companies considering taking a first step. But that’s all it is. It doesn’t provide guidance on how to design an ERM process, or how it can be effectively implemented throughout an organization. Yes, some of the “steps” are a start, but my concern is that, despite the warnings, companies going down this path will somehow believe they will have installed ERM in their organizations.
In Olympic games terms, with only two entrants, this report gets the silver. The second report on key risk indicators wins the gold – by a good margin. I’ll speak to that report in my next blog posting.
A recent client discussion reminds me of an article I came across a few years back with important implications for dealing with risk – or rather a risk that materializes into a major problem. The article, “What Organizations Don’t Want To Know Can Hurt,” focuses on events surrounding the College Board when it learned of extensive errors in scoring its SAT tests, and provides a good example of what not to do.
The company’s president reportedly said that finding the specific cause of the failure “did not really matter,” but rather what’s important is to ensure that improved controls catch future problems. His position was supported by the engagement leader of a consulting firm hired by the company, who said that dissecting past problems is not necessary either to ensure that the scoring system works better in the future or to ensure there is a good safety net to catch errors. He goes on, “You can do both without knowing whether it was rain that made the papers wet, or whether someone spilled a cup of coffee…[and] if we tried to brainstorm everything that could go wrong, we’d be here for years – for a lifetime. But if controls are in place to identify problems, and rescore tests that were misscored, that’s what you’re really looking for.”
These statements are fascinating – that there’s no need either to look back at why something went wrong because it’s unnecessary, or to dig deeply into what could go wrong because it would take too long. It suggests that problems in test scoring – which would certainly seem to be central to the company’s credibility and indeed its sustainability – are okay as long as they ultimately are found and test results rescored. Simply “catching future problems” by “rescoring tests” means that the company is satisfied detecting major problems with scoring after they occur, rather than taking steps to prevent such problems in the first place. I wonder what users of SAT scores think about that!
If you’re smiling at this you’ve got company. Clearly, looking neither backward nor forward is not a viable option. And doing one or the other also is not the answer. Rather, it’s necessary to do both. Only by getting behind what went so wrong can management feel comfortable it understands what risks continue to exist, and only then is it positioned to look at what additional risks need to be the focus of its attention going forward.
It doesn’t take a genius to know that when a problem rears its ugly head it is essential to find out why. The article talks about fields like aviation and medicine that conduct investigations to find out exactly what went wrong, to learn from often deadly mistakes and to improve processes and protocols. The National Transportation Safety Board does so focusing primarily not on casting blame but on making things better. Similarly, many hospitals hold mortality and morbidity conferences to analyze and learn from mistakes. Many businesses do that as well, learning from what went wrong. They don’t choose between learning from the past and working to make things better. They do both, with one supporting the other. And no, it doesn’t take “a lifetime” to find out what caused a major problem or to identify the source of the next potential disaster.