Tim O'Bryan | Tags: timobryan financialperformancemanag... businessanalyticstoday businessanalytics | 0 Comments | 3,053 Visits
Finance teams are the performance management captains of the corporate ship, empowered to give the executive team and the business and support units real insight into past, present, and future performance, and to guide each constituent on what the information means and how it can be interpreted for decision making. That is their most strategic value to the business. However, the corporation's most important analytical asset remains mired in just completing the basic week-to-week tasks instead of providing this strategic guidance. The explosion of information, the speed of business, systems proliferating through M&A, the unique information-capture needs of different areas of the business, and growing business practice demands, such as regulatory compliance with regional, national, and global reporting requirements, all conspire to keep finance heads-down. As a result, emerging trends, exploitable opportunities, and efficiency gains go unnoticed.
In the 1960s, IBM was the 800-pound gorilla of the mainframe business, its technological supremacy unchallenged and its superior performance continuing virtually unabated into the 1970s. They were the blue suits bearing information-based mainframes that helped companies use data to run their large, sometimes multinational, businesses with greater know-how about customers, products, operations, and financial performance than any other technology could provide. Yes, they were considered the masters of product innovation, largely due to world-class business practices and industry expertise. However, Big Blue got complacent and far too comfortable in its long-held pole position. Its inflated confidence and market share eventually disintegrated as it missed the advent of a new information-based technology it could and should have seen coming had the right analytic capabilities been in place: the minicomputer. Minicomputers were technologically simpler than mainframes, delivered strong computing power for their size, and required fewer resources to run. To be fair, it wasn't just IBM that missed the advent of the minicomputer; it was virtually every mainframe company in existence at the time. This new technology gutted the mainframe business so thoroughly that no mainframe company went on to become a major player in the minicomputer business.
What happened? What was missed? Who screwed up?
In my opinion, there were many failures that caused this emerging technology to go unaddressed by IBM and others, but the chief culprit who could and should have been prepared for it was finance. Yes, I think it's up to finance, as the owner of business performance (past, present, and future), to fundamentally understand the business climate, internally and externally, and then advise its corporate constituents on what the analyzed information means to them. For this to happen, finance needs to get a handle on its core responsibilities before it can begin to spot the performance-sapping icebergs that can turn into business-shuttering threats.
Let’s get back to the technology story for a minute if that’s okay. Then, I’ll finish my point.
Where were we? IBM's out because the mainframe business has gone south, way south, by way of the minicomputer. Exit IBM. Enter Digital Equipment Corporation. DEC virtually created the minicomputer business, along with a few other aggressively managed companies like Data General, Prime, Wang Laboratories, Hewlett-Packard, and Nixdorf. Did DEC and the others in the minicomputer business learn any lessons from IBM's big miss on the minicomputer market, so as not to repeat the same mistake? Of course not. The story of DEC's demise rings almost too true to IBM's mainframe debacle of the 1970s. In fact, the management gurus and business journals missed it too. Digital Equipment Corporation was considered by everyone with insight into the company's operations to be the ultimate technology company for decades to come. It was even a featured company in the McKinsey study that became the stellar 1980s management book In Search of Excellence. DEC seemed destined for monster success. Still, despite all this fanfare, DEC missed the next wave in computing technology: the desktop computer market.
Again, where was finance watching past performance by measuring and monitoring it, to get analytical insight into what the future might look like to then advise their constituents across the business on what all of this means to each one of them? Answer: Heads down.
The desktop computing market was predictably seized not by DEC or one of its minicomputer compadres but by Apple Computer, Tandy, Commodore, and IBM’s PC-division. (Yes, IBM can’t be held down for long!)
What happened next? To borrow the oft-misquoted line from Casablanca: "Play it again, Sam." Apple, Tandy, IBM, and the rest of the desktop computer gang focused on making the best desktop computers they could but ended up missing the next new, new thing. Apple Computer and IBM lagged five years behind in bringing the latest-and-greatest technology rage to market: engineering workstations. That market was owned by Silicon Graphics, Sun, and Apollo, all newcomers to this market.
In each case, the leading companies mentioned were regarded as the gold standard for product excellence and operational execution, only to be quickly pushed aside by an out-of-nowhere, technologically superior solution that reset the market's expectations and rendered the prior leader's offering frumpy and stale. Missing emerging trends in the marketplace, and failing to adapt to them quickly enough, can ring the death knell for most companies. Think Wang, Silicon Graphics, Apollo. For others, the misstep sets them back five or even ten years before they're back on their feet again.
As a note, I chose the technology sector for the example above, but we could just as easily have used retail merchandising (think Sears vs. Nordstrom), retail books (think Barnes & Noble vs. Amazon), or automotive (think GM vs. Toyota). Each is an example of a failure to see the changing landscape, and I believe finance is mostly at fault for squandering these opportunities.
Why finance? I think finance failed its company in each instance because it wasn't effective enough at managing the day-to-day, low-value tasks. With those under control, it would have had greater leverage to spend time on higher-value practices like forecasting and business analytics, uncovering data points that help the entire business spot emerging market forces before it's too late to respond. The responsibility to identify these threats and opportunities lies squarely with finance. If not finance, then who? Be careful, because whomever you name will probably expect finance to provide them with meaningful insight into performance results across the business, as well as external information, which, again, means it's incumbent on finance.
So how does finance get to the point where it can provide this kind of insight with the resources it has? Because, Lord knows, it's not going to get additional headcount. It all starts with finding a way to better leverage existing resources. That requires finance teams to automate the lower-value tasks as much as they can, off-loading those process management steps to free up capacity for these analytic practices.
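One way to picture that kind of automation (a minimal sketch, assuming two exported account extracts with hypothetical column names rather than any particular ERP or tool): a script that totals a general-ledger extract and a subledger extract and surfaces only the accounts that don't tie out, so reconciliation effort goes to exceptions instead of line-by-line review.

```python
import csv
from collections import defaultdict

def load_balances(path):
    """Read an extract with 'account' and 'amount' columns into per-account totals."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["account"]] += float(row["amount"])
    return dict(totals)

def reconcile(gl, sub):
    """Return only the accounts whose GL and subledger totals differ."""
    breaks = {}
    for account in set(gl) | set(sub):
        diff = round(gl.get(account, 0.0) - sub.get(account, 0.0), 2)
        if diff != 0.0:
            breaks[account] = diff
    return breaks
```

The point isn't the code itself; it's that a task like this runs unattended every close, leaving the team to investigate only the breaks.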
What are world-class finance teams doing to be the analytic leaders in their industries? They have made their core practices repeatable and consistent from end to end.
These practices are the foundational elements required for finance to be the advisor that provides guidance to the business. Excel at them and you'll soon be positioning the analytic experts on your finance team to do the real analysis they're supposed to be doing. It's incumbent upon the CFO's finance department to provide this guidance and leadership, given finance's role as the company's performance managers. It is therefore finance's job to provide insight not only into past, present, and future performance but also into the trends, anomalies, and market opportunities that become visible only after thorough analysis of business results gleaned from systems (ERP, CRM, SCM, etc.) and processes (forecasting, what-if scenario analytics, etc.).
Finance is looked to not only to explain past business performance and its financial effects but also to advise and guide strategy in determining where to invest the resources at hand. The CFO's analytics team needs to spend its time not just on the everyday execution of basic, low-value process steps, like compiling, validating, and reconciling data for various internal and external reporting needs, but on analyzing past, present, and future to provide guidance on what's happened, what's happening now, and what could happen. Only with an infrastructure in place to easily manage the basic elements of the finance team's mandate can the real value-added analytic insights come to light. Otherwise, their companies will continue to drive through the business climate with a perpetual blind spot for what's coming, rendering them the next Tandy Computer, Silicon Graphics, or Apollo.
It’s up to you, finance, not to let this happen.
Check out more blogs by Tim O'Bryan by clicking here!
Tim O'Bryan | Tags: ibmcognos timobryan businessanalytics financialconsolidation provenpractices businessanalyticstoday cognoscontroller | 0 Comments | 2,543 Visits
If asked to describe their financial consolidation process, most corporate finance teams might spit out a few unprintable adjectives as they attempt to explain their effectiveness in harnessing all of the moving parts in this bear of a process. Their primary issue is managing all of the inputs and one-offs at each step, which are difficult to track or audit given the lack of transparency, visibility, and ultimate control over the process. Disconnected systems with major control risks requiring manual intervention and maintenance (think spreadsheet-based systems), plus other standalone technologies lacking any audit control, make it an administrative nightmare. Financial consolidation isn't a stationary target, either, given the ever-growing mountain of new regulations, report filings, and financial governance procedures to which teams must adapt (think Dodd-Frank, IFRS, and XBRL, to name a few). As I mentioned in another blog post, "Close, Consolidate, Report & File: Automation & Embedded Controls Else It's A House of Cards", there is a great deal to manage throughout the financial consolidation process: manual inputs, offline adjustments, in-process reports, and stakeholders. We're not talking about nice-to-have reports here. We're talking about reported balance sheets, cash flow statements, 10-Ks and 10-Qs, and other monthly, quarterly, and annual reports that go to shareholders and The Street. And, by the way, these results are the primary drivers of the business decisions being made across the organization. It's hard to believe more organizations haven't adopted an end-to-end solution that produces credible, timely, and reliable results while allowing finance teams to focus on the important stuff: analyzing the results, not compiling them.
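To make the moving parts concrete, here is a deliberately tiny sketch of the arithmetic core of consolidation; the subsidiary names, currencies, and rates are illustrative, not from any real close. It translates each subsidiary's trial balance into the reporting currency and sums by account. A real close adds intercompany eliminations, minority interests, and audit trails, which is exactly where end-to-end tooling earns its keep.

```python
def consolidate(subsidiaries, fx_rates):
    """Translate each subsidiary trial balance into the reporting currency
    and sum by account. `subsidiaries` maps name -> (currency, {account: amount});
    `fx_rates` maps currency -> rate into the reporting currency."""
    consolidated = {}
    for name, (currency, balances) in subsidiaries.items():
        rate = fx_rates[currency]
        for account, amount in balances.items():
            consolidated[account] = consolidated.get(account, 0.0) + amount * rate
    return consolidated
```

Run this in a spreadsheet by hand every month across dozens of entities and late-arriving adjustments, and the "unprintable adjectives" above start to make sense.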
Drive Better Performance Through Greater Finance Integration with the Sales & Operations Planning Process
Tim O'Bryan | Tags: businessanalyticstoday businessanalytics salesandoperationsplannin... provenpractices | 0 Comments | 2,516 Visits
Tim O'Bryan | Tags: provenpractices kpis timobryan businessanalytics strategymanagement businessanalyticstoday | 0 Comments | 2,159 Visits
Tim O'Bryan | Tags: timobryan provenpractices businessanalytics businessanalyticstoday | 0 Comments | 2,085 Visits
Tim O'Bryan | Tags: bichampions businessanalytics | 6 Comments | 2,014 Visits
Ever been to San Francisco? Great place, huh? Well, even if you haven't, I'm sure you'd recognize an image of the Golden Gate Bridge without fail. Among its many unique characteristics, it's got that unmistakable burnt-orange exterior and seems to blend perfectly into the environment around it. Stunning. Built AND assembled in America, soup to nuts. A feat of engineering representing man's force of will in de facto equilibrium with nature. The same can be said about the Golden Gate Bridge's lesser-known sister bridge, the Bay Bridge. But the Made in America stamp on the Bay Bridge is about to change. As reported by David Barboza in yesterday's New York Times, the massive outsourcing project underway to build the eastern span of the San Francisco Bay Bridge, from Yerba Buena Island to Oakland, is being done not in America, or Canada, or nearby Mexico (hello, NAFTA!) but in... (wait for it)... China. Shocked? Doubtful.
This 21st-century go-to-production model is certainly not a new wave in business strategy, but it is a trend that's becoming the rule more than the exception. Think Apple and its product development and manufacturing strategy, for starters. Many others are right there too, but this is the first multi-billion-dollar municipal project farmed out to a foreign country, thousands of miles away in Shanghai, China. As Bob Dylan sang, "The Times They Are A-Changin'"... and are gonna keep on...
I am not looking to discuss the virtues of outsourcing, nor do I want to discuss the potential risks of a heavily weighted strategy that leverages a long line of supply chain partners in a corporate build strategy. What interests me more is that there are changes in the way we run our businesses, adaptations to this new outsourcing-driven, global supply-chain-dependent strategy, that we need to consider. The United States and many, many other countries around the globe are becoming more and more knowledge-based in their corporate philosophy and less and less manufacturing-based.
Given these shifting sands, we can ask whether we now have the proper instruments in place to measure and monitor how these businesses operate and perform. Currently, our external reporting requirements here in America are based on Generally Accepted Accounting Principles, or GAAP, created in the early twentieth century and designed to reveal important information about the companies (think railroads, automobile makers, and manufacturing enterprises of all kinds) that then dominated the U.S. economy. These companies came from the old-school industrialized economy model, where their value was inherently based on the assets on their balance sheets (think plant, machinery, equipment, rolling stock, and other units of production purchased from suppliers). WYSIWYG: "what you see is what you get" transparency, where the costs of these assets were recorded on balance sheets and depreciated over time as they were used to generate revenues. Balance sheet values thus roughly reflected the cost of replacing the company as a whole, and the company's earnings reflected the costs of producing the goods and services responsible for its revenues. Pretty simple, huh? Nice. Well, that reporting model works nicely in a simple manufacturing-based economy, but what about now, when we've changed the game and added complexities like outsourcing, deeply entrenched supply chains, and global operations doing mostly knowledge-based work, where the real building is done in far-flung locations by third parties?
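The old model's bookkeeping really is that simple. As a minimal sketch (the cost, salvage value, and useful life below are made-up numbers), straight-line depreciation is the arithmetic behind those WYSIWYG balance sheet values: expense = (cost - salvage) / useful life, charged evenly each year.

```python
def straight_line_depreciation(cost, salvage, useful_life_years):
    """Annual straight-line depreciation expense plus year-end book values.
    Book value declines from cost to salvage over the useful life."""
    annual = (cost - salvage) / useful_life_years
    book_values = [cost - annual * year for year in range(1, useful_life_years + 1)]
    return annual, book_values
```

A $100,000 machine with a $10,000 salvage value over five years yields an $18,000 annual charge, and the balance sheet tells you exactly what the firm is made of. Try writing an equally transparent formula for the value of a design team or a supply-chain relationship; that gap is the point of this post.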
This is the Knowledge Economy, a change as profound as the Industrial Revolution in eighteenth-century Britain. We're at a new juncture, and yet we don't have the right instruments in place to manage, report, and analyze our businesses differently given this monumental change in global business strategy. Do we really look at the business with an up-to-date view, or are we just doing what we're told? Frankly, I think the entire finance department, in partnership with its IT counterparts, should rethink the reports it generates and ask how viable those reports really are. I love this one: "We do things this way because it's what we've always done." Imagine if that's how the Beatles or Elvis or even the Rolling Stones thought. Heck, forget them. Imagine if that's how David Bowie thought. He's masterminded the art of adaptation over the years. Ch-ch-changes.
As Winston Churchill said, "Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning." This is your strategy, and a strategy is never unbending or unwavering. A good strategy is adaptable, with risk assessments for all sorts of business-impacting events. As a result, the workforce needs to be set up to adapt as the strategy changes. This Strategy Execution Framework can help get people doing the right things as quickly and purposefully as possible.
For more information on strategy execution frameworks, please visit this URL with some superior resources for you to access.
Delaney Turner | Tags: ibmsoftware business_analytics iod11 | 0 Comments | 1,765 Visits
We're heading into the home stretch before Information On Demand (you have registered, right?), so I'm sharing a few of my favorite blog posts to help frame the discussions at the premier conference for IT and business professionals. Feel free to bookmark, read and add to your own social media reading list. Also, feel free to comment on or disagree with these posts right here, as each is bound to raise a hackle or two.
1. GOOD Magazine: The Data Issue: GOOD calls itself an "integrated media platform for people who want to live well and do good" and "a company and community for the people, businesses, and NGOs moving the world forward." Its latest issue looks at areas of our lives that aren't typically associated with (or driven by) data and finds some surprising insights. Yes, data is everywhere and facts can be comforting, but when it comes to our own lives, it's the questions we ask ourselves that lead to true wisdom. As illustrator Andrew Kuo writes: When we search the numbers, we find reflections of ourselves, glimmers of the world we live in and the lives we lead. We may learn immense amounts from this data, but make no mistake: our search is what gives it meaning. In The Information Arms Race, William Wheeler explores the increasingly effective use of microtargeting in political campaigns, as well as the repercussions for democratic debate. The issue is also chock full of cheeky infographics and gets meta on data with a chart entitled "Which kinds of people like which charts?"
2. Numerati Baseball = Rope-a-dope, by Stephen Baker: Is winning boring? I suppose entertaining the fans is a secondary concern when you're buried beneath "fifty feet of crap," as Oakland A's general manager Billy Beane (Brad Pitt) observes in the movie trailer below. Still, Baker (author of The Numerati, chronicler of the Watson story, and a baseball fan himself) considers the implications of the analytical approach for the length of the average baseball game and its effect on viewer patience: I love baseball, and I defend it stoutly against all those who complain that it's boring. But anyone who can sit through a Yankees-Red Sox game without a fast-forward button deserves some kind of medal ... For someone who is not passionate about the Yankees or the Red Sox, it was torturous. The game dragged on for 4 hours and 21 minutes. What's your take on taking a lot more walks? Moneyball pioneer Billy Beane and Moneyball author Michael Lewis will share their take on the analytical approach to winning an unfair game when they take the stage as our keynote speakers.
3. Desert Island Datasets: Over on The Guardian's Datablog, Charles Arthur applies the "Desert Island Album" concept to datasets by asking, "Which set of open data would you like to get from the UK government so as to have the maximum impact on the open data movement?" Arthur's goal is twofold: first, to protect and advance the open data movement overall, and second, to focus on the datasets that can make the biggest improvement in public policy: I recently met some people inside government who are trying to push the open data idea, of getting anonymised, publicly-collected data out there for developers to be able to build applications which will have both financial and societal benefits. It is taken seriously at the top levels of government; they aren't just paying it lip service. The problem though is that there's only so much time available to anyone to push the agenda through.
Bonus feature: IOD Housekeeping Details
A few details to keep in mind as you prep your week and pack your bags:
Tracy Harris | Tags: cognos intelligence business iod11 ba-strategy baforum ibmaq analytics | 0 Comments | 1,649 Visits
Have you created your BI Strategy? If not, you can get started with lessons learned in a podcast from the team that brought you the book “BI Strategy: A Practical Guide to Achieving BI Excellence”. In a three-part series over the next few weeks, you can hear about the experiences of:
- John Boyer, Manager, BI Center of Excellence, at The Nielsen Company
- Bill Frank, Technology Manager, BI Practice, at Johnson & Johnson
- Brian Green, Manager of BI and Performance Management at Blue Cross Blue Shield of Tennessee
- Kay Van De Vanter, Enterprise BI Architect and BICC Lead at The Boeing Company
And myself (the fifth author) as we share practical advice on how to get more strategic in the use of analytics to help your organization outperform.
In this series, you’ll hear about how organizations can design a strategy that promotes business alignment, get practical advice on organizational design and culture as well as benefit from their pooled knowledge on technology strategy.
In this first episode, the team takes a deep dive into the "Business Alignment Strategy," which sets the stage for how you define your stakeholders, map to the business needs of the organization, and prioritize the many different projects that come to light once success is realized.
Delaney Turner | Tags: baforum iod11 | 0 Comments | 1,532 Visits
How did you greet the new and improved Facebook? If you took to Facebook to complain about Facebook and demanded the old Facebook come back, you certainly weren't alone. Last Tuesday's rollout drew what may be a record number of complaints from the site's 750 million users.
Don't blame Mark Zuckerberg, though. Blame your ancestors instead.
Our brain is a "prediction machine"
According to Huffington Post blogger Michael Taft, the reasons for this response date back millions of years, when our cave-dwelling ancestors faced a daily fight to survive:
Life evolved to gather energy resources, and the purpose of our advanced brains is to predict availability of resources (e.g., benefits) and possible loss of energy resources (e.g., threats). If we think of the brain as a prediction machine (a reductive but useful model), it follows that the brain likes to be correct about its predictions and dislikes being incorrect. [...] Failing at prediction is actually perceived as a threat to the organism (however slightly or subconsciously), and so any surprises or unanticipated changes seem menacing.
In short: we depend on predictions to survive. Being right makes us happy. And we get awfully cranky when our predictions turn out wrong. The fact that so many people (myself included) expressed so much frustration illustrates just how deeply embedded Facebook has become in our daily lives. Facebook's front page is a window on our world. Overnight, many felt that window had been replaced with a cruel hall of mirrors.
This dynamic doesn't just apply to Facebook. Taft sees the same phenomenon at play in the fear that often greets new ideas. It also helps explain why we derive so much pleasure from watching movies we know by heart:
Our brains are highly optimized to anticipate outcomes and feel satisfaction and joy when we are proven right. This is why we like to re-encounter favorite movies and books again and again over the years and derive pleasure from them each time.
Finding our "sweet spot"
Taft isn't recommending we rely entirely on predictable events, which would leave us incapable of responding to change of any kind. Instead, he points to a "sweet spot" of challenge and ability where we can operate at our peak powers. Psychologist Mihaly Csikszentmihalyi refers to it as flow:
Csikszentmihalyi defines flow as "a state in which people are so involved in an activity that nothing else seems to matter; the experience is so enjoyable that people will continue to do it even at great cost, for the sheer sake of doing it," and he identifies a number of different elements involved in achieving flow.
The analytics-driven organization
Csikszentmihalyi is one of the pioneers of the scientific study of happiness. But to my ears this happy state of flow also sounds a lot like that of an analytics-driven organization - the kind you can build with the IBM solutions you're going to see down at Information On Demand next month in Las Vegas. In the hundreds upon hundreds of breakout sessions and EXPO demonstrations you'll see how you can add the capabilities to make your outcomes a little more predictable. You'll see how to enable your workforce to better manage the relentless pace of change. And you'll see how turning insight into action makes everyone that much happier.
Change may not be pleasant, but it's the only way we know of to move forward. On a smarter planet, smarter software can mitigate the pain. And besides, we've been through all of this before and come out the better for it. The question for you now is: are you ready to move forward as well?
Tim O'Bryan | Tags: timobryan provenpractices businessanalyticstoday | 2 Comments | 1,517 Visits
We often hear Business Analytics described as so many different things that it feels nearly impossible to get a handle on what it really is. I'm sure you were just getting used to the idea of Performance Management, and now we throw Business Analytics into the equation. To make matters worse, a great many prognosticators, thought leaders, and industry analysts are still married to calling the space Business Intelligence. I thought it might make sense to pass along a simple explanation of each, without all of the Big 5 consulting-speak that usually goes with it. So, here you are.
Business Intelligence (“The Historian”)
BI is where the historian in all of us comes out. This is rear-view-mirror analysis: querying, reporting with enabled alerts, real-time monitoring, dashboards, scorecards, and visualization focused on past performance. This is your investigative practice area, asking "What happened?" and "How are we doing?" followed by thorough analysis of the detail behind the answers, i.e., why are we on- or off-track?
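A toy illustration of the Historian's two questions (the periods, plan numbers, and 5% tolerance are all invented for the example): roll transaction detail up by period for "what happened?", then compare against plan for "how are we doing?"

```python
from collections import defaultdict

def actuals_by_period(transactions):
    """'What happened?' -- roll transaction-level (period, amount) detail up by period."""
    totals = defaultdict(float)
    for period, amount in transactions:
        totals[period] += amount
    return dict(totals)

def variance_report(actuals, plan, tolerance=0.05):
    """'How are we doing?' -- flag periods more than `tolerance` off plan."""
    report = {}
    for period, target in plan.items():
        variance = (actuals.get(period, 0.0) - target) / target
        report[period] = "on-track" if abs(variance) <= tolerance else "off-track"
    return report
```

Everything downstream of the off-track flag, i.e., why we're off-track, is the investigative analysis the paragraph above describes.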
Performance Management (“The Pragmatist”)
Performance Management builds on "The Historian" to include the following: planning, budgeting, forecasting, and scenario modeling; customer and product profitability (profitability modeling and optimization); strategy management; governance, risk, and compliance; and financial consolidation and external reporting. Performance Management is "The Pragmatist," who not only monitors and analyzes past performance ("The Historian") but also uses that past performance to help determine expected future outcomes (think budget/forecast), weighing past results against current conditions and intuitive insight, i.e., the knowns and unknowns about today and the foreseeable future.
Also included is the practice of governance, risk, and compliance. Of course, there needs to be rigor and accountability around these processes, including stringent compliance controls to meet all regulatory requirements. In addition, risk assessments are a necessary component of performance management, covering cases where your performance assumptions prove wrong (think risk-adjusted forecasting) as well as the business's other risk elements: strategic risk, market risk, credit risk, IT risk, operational risk, and so on. The practice of governance, risk, and compliance enables organizations to identify, manage, monitor, and report on risk and compliance initiatives across the enterprise, helping businesses reduce losses, improve decision-making about things like resource allocation, and, ultimately, optimize business performance.
Performance Management = [Business Intelligence] + [Planning, Budgeting & Forecasting, Profitability Modeling & Optimization, Governance, Risk, and Compliance, Strategy Management, and Financial Management & Control]
Business Analytics (“The Futurist”)
“The Futurist” looks at everything “The Pragmatist” does but then runs Predictive Analytics against the Performance Management data you already have to uncover unexpected patterns and associations and to develop models that guide what should be done next. It turns the human element in planning, budgeting, and forecasting on its head by applying user-enabled algorithms and customizable statistical analysis to provide data-driven answers. More simply, with predictive analytics companies are able to prevent high-value customers from leaving, sell additional services to current customers, develop successful products more efficiently, and identify and minimize fraud and risk. This is being done by businesses all over the world today. Predictive analytics is just what its name suggests: it gives you the knowledge to predict. Business Analytics = [Performance Management] + [Predictive Analytics]
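As a back-of-the-envelope illustration of the idea, not of any product's algorithms: even the simplest predictive model, an ordinary least-squares trend fitted to a historical series, replaces gut-feel extrapolation with a data-driven forecast.

```python
def fit_trend(series):
    """Ordinary least-squares fit of y = a + b*t to an evenly spaced series."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    intercept = y_mean - slope * t_mean
    return intercept, slope

def forecast_next(series):
    """Project the next value on the fitted trend line."""
    intercept, slope = fit_trend(series)
    return intercept + slope * len(series)
```

Real predictive analytics layers far richer models (classification, clustering, risk scoring) on top of this same principle: let the historical data, not intuition alone, drive the number.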
Tim O'Bryan | Tags: businessanalytics openpages clarity ibmcognos | 0 Comments | 1,359 Visits
Leveraging IBM OpenPages & Cognos Clarity for Risk Management, Disclosure Management and XBRL
Tim O'Bryan | Tags: predictiveanalytics timobryan forecasting businessanalytics | 0 Comments | 1,317 Visits
Tim O'Bryan | Tags: provenpractices timobryan businessanalyticstoday businessanalytics | 0 Comments | 1,279 Visits
Imagine entering the cockpit of a modern jet airplane and seeing only a single instrument there. How would you feel about boarding the plane after the following conversation with the pilot?
Q: I’m surprised to see you operating a plane with only a single instrument. What does it measure?
Q: That’s good. Airspeed certainly seems important. But what about altitude? Wouldn’t an altimeter be helpful?
We suspect you wouldn’t board the plane after this discussion. Even if the pilot did an exceptional job on airspeed, you would be worried about colliding with tall mountains or running low on fuel. Clearly, such a conversation is a fantasy, since no pilot would dream of guiding a complex vehicle like a jet airplane through crowded airspace with only a single instrument.
This story is often cited by business strategists and other management prognosticators; I will attribute it to Drs. David Norton and Robert Kaplan, pioneers of the Balanced Scorecard. It’s intended to show how critical the indicators we set up are, not only for pilots but for your entire workforce, because those indicators will serve as the guiding force behind their decision-making.
Why is this so important? For many reasons, starting with a business environment that has changed substantially: a company can no longer operate rudderless, without a core set of metrics to steer each of its employees, individually and as a collective unit, in the right direction. That right direction is the enterprise strategy. The speed at which decisions are made seems to have increased exponentially in just the past five years. The days of top-down, command-and-control authority over decision-making are over, giving way to a more nimble, decentralized execution hierarchy intended to keep pace with the velocity of competition and customer expectations. The need to get relevant, actionable information to business users has never been more pronounced. If you can’t react fast enough to market realities, your customers will go elsewhere. We live in a world where product and brand loyalties are increasingly a thing of the past. It’s about execution, and good execution is about making smarter, more informed decisions that support the organization’s goals.
These decisions are being made across all levels, geographies, and functional areas of the business every day. For this post I want to zero in on the first question asked, which falls under measuring and monitoring the business: how are we doing?
Sure, the executive suite is constantly measuring and monitoring overall business performance to ensure the company is on track to meet its strategic targets. The function leads in marketing, sales, finance, HR, and development, all the way down to individual contributors, are measuring and monitoring the performance of their areas of the business too. But how does everyone know they’re doing the right things at all times? What are the real priorities that help the organization achieve its goals? Is it guesswork? Is it trust that the entire workforce will naturally make the right decisions in support of top-line goals? How can we be so sure?
The fictional story at the beginning of this post is really about measuring and monitoring, not an aircraft, but your business, through a tool called a scorecard.

There are personal, departmental, and enterprise scorecards. A scorecard includes the key performance indicators, or KPIs, for which an employee (in the case of a personal scorecard) is responsible. If these KPIs are correctly defined, they include measurements that, viewed in aggregate, support the enterprise’s top-line strategic goals and objectives. Inevitably, some KPIs in a personal scorecard will carry shared targets, either within a specific functional area of the business (think of a Marketing Director and a Marketing Associate having similar campaign targets) or as KPIs shared across functional groups like marketing, sales, procurement, and development.
The actual KPIs – typically about 6-10 per individual – are critical because they define the actions for which the individual is responsible. The ultimate alignment, via scorecards composed of KPIs across these business groups, departments, divisions, and business units, is the embodiment of what we call a company’s strategy execution framework.
A Harvard Business School study of this framework found that “a 35% increase in Strategy Execution leads to 30% gain in shareholder value.” That’s a pretty strong argument for at least taking a harder look at it.
How do you deploy such a framework, you ask? In theory it’s very simple: translate the business strategy and its related goals into a set of performance indicators that outline the targets for which each department, and each employee within it, is responsible, and away you go, right? Yes, it sounds easy in theory. In practice it’s a little more work.
The key is working top-down with each business and support unit to translate its contribution toward these higher-level targets, so that the lower-level, cascaded measurements, or KPIs, when rolled up in total, tie directly to the enterprise’s strategic goals. This ensures proper alignment of the organization while providing an ongoing set of metrics by which the workforce can measure itself.
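As a rough illustration of how cascaded KPIs can roll up, here is a small sketch. The scorecard names, KPIs, targets, and equal weighting are all hypothetical; real scorecard tools handle weighting, thresholds, and time periods far more richly.

```python
# Sketch: cascaded scorecards where each card holds KPIs (actual vs.
# target) plus child scorecards, and attainment rolls up as a
# weighted average to the enterprise level. All figures are made up.

def kpi(name, actual, target, weight=1.0):
    return {"name": name, "attainment": actual / target, "weight": weight}

def scorecard(name, kpis, children=()):
    return {"name": name, "kpis": list(kpis), "children": list(children)}

def attainment(card):
    """Weighted average of this card's KPIs and its children's rollups."""
    items = [(k["attainment"], k["weight"]) for k in card["kpis"]]
    items += [(attainment(c), 1.0) for c in card["children"]]
    total_weight = sum(w for _, w in items)
    return sum(a * w for a, w in items) / total_weight

marketing = scorecard("Marketing", [kpi("Qualified leads", 900, 1000)])
sales     = scorecard("Sales",     [kpi("New revenue ($M)", 11.0, 10.0)])
enterprise = scorecard("Enterprise",
                       [kpi("Operating margin", 0.18, 0.20)],
                       children=[marketing, sales])

print(round(attainment(enterprise), 3))   # → 0.967
```

The point of the sketch is the shape, not the numbers: each lower-level scorecard measures its own targets, yet the enterprise card can always answer "how are we doing?" from the rollup.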
Even more important in defining the right KPIs is understanding that whatever the indicators are, they will determine the individual’s behavior, so take care as you define them. Something else that makes this framework so effective is that it makes it much easier to reset the workforce when top-level strategies change: the infrastructure is already in place to restructure the scorecards. This allows the company to adapt more quickly.
Think about deploying such a framework for your organization. The best incentive I can give you for taking on this effort is that going through the KPI definition process for each set of scorecards forces discussions across functions, within departments, and at the executive level that expose how achievable these targets really are with the resources in place today, and who is ultimately responsible for what. This is just about the most important exercise a company can go through to make sure it isn’t setting itself up for failure because its strategy isn’t attainable given current resources. Once the KPI definition process is complete and everyone knows who’s doing what and where the synergies lie, it’s all about execution. This framework sets companies up to execute well because they’ve already identified their needs and the resources at their disposal; now it’s a matter of delivering. It’s go time.
If done right this will be the outcome for your organization:
More coming on this subject. Stay tuned. In my next post I’ll tell you some of the best practices in defining the right KPIs for personal scorecards.
Follow Business Analytics Forum, our annual users conference in Las Vegas, NV, October 23-26th, on Twitter @ #baforum!
Delaney Turner 270003RQ8K Delaney.Turner@ca.ibm.com | | Tags:  deloitte ibmbao | 0 Comments | 1,253 Visits
If using analytics in the Office of Finance isn’t particularly new, the kinds of analytics now available to finance professionals most certainly are. Finance still builds budgets and closes the books, but now it’s in areas such as model-based forecasting, advanced fraud detection and portfolio optimization where Finance professionals are finding new sources of value and competitive advantage. Here, I speak to Miles Ewing and Scott Wallace of Deloitte. (Download the podcast version)
Miles is a partner in Deloitte’s Finance practice and leads Deloitte’s integrated performance management practice in the U.S. Scott is a Director in Deloitte’s Risk Information practice and leads the U.S. Cognos Alliance relationship.
Analytics can mean different things to different people because you can do so many things with them. Can you explain how Deloitte defines analytics for its clients?
Miles Ewing: Analytics is a very broad term, and from our perspective it’s been going on since humanity created fire and decided it was warmer to stand next to it than far away from it. But when we think about what’s different today, there are three aspects. First is the sheer volume of data that’s available today: there will be more information created this year than in the past 5,000 years. Next is the speed at which we can analyze this data: if it took us 10 years to code the genome a decade ago, we can do it in a week with today’s processing power. Third is the reach and breadth of the data: from social networks to sensing technologies, there’s a dramatically broader reach.
These combine to give us an enhanced capability to look at both patterns in data and advise on specific individual transaction-level data. Because of this we can make decisions either at a higher level or lower level that we weren’t able to do in the past. And it’s that combined capability and bringing those disciplines to business that is really where Deloitte defines analytics.
Scott Wallace: More tactically speaking, it’s really bringing what used to be back-office functions – performed by your statisticians and actuaries – into the front office, where Finance professionals can use these capabilities to do the analysis on their own. There’s an ability to do more with analytics tactically than before, and that’s bringing it to life.
Deloitte has different analytical disciplines. Can you provide us with some examples?
Miles Ewing: We break analytics into three areas. The first is core analytics – from basic variance analysis in your budget to the analysis that goes into your external reporting. It’s not just your traditional FP&A group, but also the analytics in your tax department, treasury, investor relations, and operations. Companies have been doing that for a long time and will continue to do so.
There are two things that are new. The first is where Finance teams are taking advanced analytic methods such as model-based forecasting - algorithmic-based forecasting, advanced fraud detection or portfolio optimization - and bringing those capabilities to their core, either to improve the efficiency and accuracy of these functions, or to add a different way of looking at it and get more bang for their buck on the core analytic side.
The second area is what we would call Finance-supported analytics. And these are areas where Finance is bringing its cross-functional capabilities to the problems faced by other parts of the business, be they in supply chain, procurement, IT or sales and marketing. What we see here is Finance taking a cross-functional view of the situation and coming out to support things like pricing, or vendor spend analysis or technology investment prioritization. These are areas where because of the reach and speed of data, Finance can support decisions at the micro level and provide better, more effective decision-making in those functions in a way that they couldn’t in the past.
Scott Wallace: It’s been core to Finance for a long time to have access and visibility across the organization. The CFO and his or her team need to be aware of what’s happening in other parts of the organization. What you’re seeing with analytics is that coming together and becoming more meaningful and more impactful to the organization. Lately we’ve had a lot of requests from clients asking how to integrate their sales or operational planning with their financial planning. So not only has Finance typically taken a cross-functional view; now there’s demand pull for that view across organizations because of the capabilities of the tools and the availability of data.
What areas of Finance need the most help?
Scott Wallace: As you read the literature around Finance and analytics from firms like ours and from academics, they’re really pushing the envelope on how to become a more value-added function using analytics; yet many organizations are still fundamentally trying to fix core processes. I do see continuing demand and convergence in the area of forecasting. That’s where the analytic capabilities come together. Think back to what Miles said about the different kinds of analytics: the ability to have insight into other functional information and data, and then to move that information into predictive forecasting – identifying the real key drivers of the business across functions that can be modeled on historical and external data, and starting to have more confidence in the ability to predict the future financial performance of the company. That’s where we’re asked to provide help.
Miles Ewing: Companies are at very different places. Some are still trying to get the core right, and they need to get that set first. Organizations that have been unable to get the core right over the past decade will find it difficult to advance into that supporting role; they may lack credibility as analytical leaders in their company. Focusing on the core becomes increasingly urgent for them.
Where does the demand for analytics come from? Is it from a CFO setting out a new vision, or does it come from the bottom up? What trends are you seeing?
Scott Wallace: Right now we’re experiencing a lot of top-down demand from the CEO and CFO. A lot of it is born of frustration: despite all the data they have in their ERP and their more advanced operational systems, they still don’t feel they’re getting the right levels of transparency and insight. Also, because of the influx of information about analytics, tools, methodologies, and the success stories they’ve seen, CFOs are really asking themselves how they can continue to grow their relevance within their organizations. They’re really pushing on analytics.
Deloitte has six guiding principles for getting started with analytics. Can you outline them?
Scott Wallace: First off, link your goals and objectives with clear business drivers. If you’re going to use analytics, make sure they tie to your existing strategies or other initiatives you have inside and outside Finance. Ask yourself: What am I really trying to do? What are the competitive differentiators I’m trying to find in my data set?
The second is to know your data. Many of our clients have a good vision. They know what they want to do and how to tie their analytics together, but they run into data issues because the data isn’t in a single location or it’s not clean enough to provide the right insights.
The third is to start simple. Analytics needs to be something that can be accepted by your organization. Pick an area where there’s a need or pent-up demand. Stay focused on that area, get the numbers right and get them delivered properly. Build the confidence within your leadership team that the predictive capabilities and outcomes you’re providing make sense.
The fourth is to leverage existing insights. If you’ve got programs under way – customer analysis programs, working capital analysis programs, for example – look for ways to enhance them using insights you can get from analytics. How can you better project things that are already being looked at by the organization? You’re adding insight to a point of view that’s already being used in the organization.
Tim O'Bryan 270001NMX7 firstname.lastname@example.org | | Tags:  provenpractices businessanalyticstoday businessanalytics timobryan | 0 Comments | 1,251 Visits
Tim O'Bryan 270001NMX7 email@example.com | | Tags:  financialperformancemanag... charlottelocke timobryan fpm | 0 Comments | 1,250 Visits
As organizations seek the best ways to respond to a volatile marketplace that can change on a dime, functions that were once the purview of finance organizations, such as enterprise planning, budgeting, forecasting, and analysis, have spread to other parts of the business, such as business units and support organizations. This is because financial performance management – led by Finance – has become increasingly strategic in organizations, regardless of their size or market sector.
While initial deployments might once have focused on Finance, companies are tending to deploy performance management solutions more broadly. Performance management is rapidly migrating from finance to executives and everyday business users, who are taking on more and more responsibility for financials, analytics, planning and budgeting, risk analytics, and reporting on these processes, such as profitability analysis. Additionally, many companies that have successfully implemented a financial performance management (FPM) solution, such as planning or financial controls, would now like to integrate it with other FPM software and technology for a more complete solution.
To maximize the value obtained from either putting financial analytics in the hands of this new, wider audience with a common planning platform or from greater FPM solution integration, finance departments are challenged with managing and supporting these new tools and capabilities for numerous divisions, regions and functions and making sure that they work together. (See related article in this blog: “Financial Performance Management & The Agile Enterprise: Two Sides of the Same Coin,” by Tim O'Bryan) Processes that were already in place to manage spreadsheet sharing and review and manual processes are no longer sufficient. Developing an enterprise-wide initiative with standard technologies and processes that allows for extensions of current implementations is critical, and a Finance Center of Excellence (FCOE) can provide the reusable knowledge, disciplines and best practices to make these financial performance management initiatives possible.
Look for upcoming posts on this topic, which will feature:
-- Charlotte Locke
Tim O'Bryan 270001NMX7 firstname.lastname@example.org | | Tags:  businessanalytics businessanalyticstoday timobryan leadership | 1 Comments | 1,229 Visits
“He who hesitates loses.”
The ugly truth of this phrase rings true in the case of battles that never occurred in the American Civil War. For those unfamiliar with it, the American Civil War was fought on U.S. soil from 1861 to 1865 between the North (the Union states) and the South (the Confederate states), and was triggered primarily by the election of Abraham Lincoln in 1860. Its definitive starting point came on April 12, 1861, at 4:30 a.m., when the first Confederate shot hurtled into Fort Sumter, a Union stronghold at the entrance to the harbor of Charleston, South Carolina.
The conflict continued until peace was made at Appomattox Court House on April 9, 1865, between Union General Ulysses S. Grant and Confederate General Robert E. Lee. At the outset of the war, the North was better organized and better equipped, with a much larger army and far more resources at its disposal than its southern counterpart. Despite the South’s opening victory at Fort Sumter, largely a symbolic one, many predicted the North would prevail swiftly and decisively.

In 1861, President Lincoln was resolutely confident that a string of Union victories on the march from Washington, D.C. to the Confederate capital of Richmond, Virginia would be enough of a blow to force the Confederates into total surrender. The key was that the Northern army needed to move quickly from Washington down to Richmond to dismantle the South’s capital and central command post: without the shepherd (Richmond), the sheep (the Southerners) would lack direction, forcing an end to the war. All pointed to an assured victory for the North. Or so it seemed. “On to Richmond!” was the northern cry.

Lincoln gave the job of leading the Union army to General George B. McClellan. McClellan was revered for his many talents. Smart, pedigreed, and more than capable, McClellan had the trust of all Northerners. March to Richmond, plant the Union flag, and McClellan was assured to be a hero. But a battle never fought can never be won. Despite overwhelming odds in his favor and indisputable evidence that his army far outnumbered the Southerners, General McClellan repeatedly hesitated to march his troops against the Confederate army that stood between him and his ultimate objective, Richmond.
The Confederate capital was literally within McClellan’s sights, but he never made it there because he failed to act. Instead, after a few defensive skirmishes with Southern forces, and despite multiple requests from Lincoln to fight, McClellan all too eagerly retreated hat in hand to Washington, D.C., giving up ground to the South’s General Robert E. Lee, who made an aggressive march northward to Antietam, into Northern territory. What a turnabout. I wonder if this is what prompted Lincoln to later say, “I can make more generals, but horses cost money.” Obviously, Abe had to mind his dollars and cents now that this would be a long and protracted war.
My point isn’t to denigrate the character of General McClellan. Not at all. He did many exceptional things in his life of which he should be proud. What I mean to do is illustrate that sitting idle, even when there’s overwhelming evidence to support taking action, is a missed opportunity. We’ve all done this in one way or another. We wait to act. It happens to the best of us. A lot of times, taking action means a change from the norm, and regardless of the ultimate benefits we can resist that change because, well, it’s change. These reasons alone cause a lot of people to hit their personal pause button and do nothing.
Let’s look at this from a different perspective. Have you ever had your master bathroom redone? If so, you’ll know what I’m talking about. There are tons of benefits: maybe a new jacuzzi tub, a nice steam shower, more cabinet space, not to mention your own sink this time around. Still, most people resist a project like this because it means losing your bathroom for quite some time before it’s ready for prime time. There are big benefits to upgrading your bathroom, but there are certainly going to be some inconveniences (read: change) before it’s usable.
No matter how necessary the project is, whether it’s redoing your master bathroom or marching regiments of 100,000 men into enemy territory, don’t let the forces of hesitation get the best of you. Yes, there will be initial adjustment pains as you go through the process, but keep your eye on the ultimate prize – and, when applicable, make sure you’re keeping everyone else’s eye on the prize too.
If you see measurable benefits justifying an investment in something, try to see the benefits beyond the initial ramp up time and just go for it.
(Shameless plug) If you’re thinking of deploying Business Analytics solutions to enable critical business processes, at a minimum take stock of what the possibilities are. Look at processes such as the following:
…take stock of what the best practices are in one or two of these areas and see how your company measures up. Perhaps this is an opportunity to drive a planning and analytics optimization initiative across finance and the rest of the organization (check out IBM Cognos TM1). Or maybe you want to automate your financial statement reporting practices (check out IBM Cognos FSR), or review your risk management practices (check out IBM OpenPages). Look at the processes first. See what’s preventing these processes from reaching a best-practice level; there could be many reasons. Then look at the enabling solutions out there that address these processes. Whatever it is, don’t hesitate just because there might be some additional work in the investigation phase or the technology implementation phase; once it’s up and running, you’ll be glad you did.
The measurable benefits of adopting these solutions – automation, embedded controls, workflow management, minimal administration, and more – allow for higher-frequency forecasting, stronger analytic capabilities, real-time reporting, effective scenario analytics, best-in-class predictive analytics, and rigorous statutory reporting and risk management. With you championing the project, they can be your path to success while the organization benefits from the bottom-line ROI. Everybody wins. And because you acted on it, you can become the figurehead for it too.
“Fortune favors the bold.”
Tim O'Bryan 270001NMX7 email@example.com | | Tags:  businessforecasting businessanalyticstoday forecast planning timobryan provenpractices strategy businessanalytics | 0 Comments | 1,197 Visits
What if you knew tomorrow’s winning lottery number? Imagine the possibilities. Quit your job? Travel the world? Buy that convertible Bentley you’ve always wanted? An addition to the house? Pay off those nagging debts? Think about the impact of knowing what a stock price will be next week, or when your car is going to break down, or exactly when your roof will start leaking. Better yet, what if you had early insight into your future health?

Now, wait a minute. Something seems different here. The winning lottery number is far more unpredictable than, say, picking a stock or determining when your car will break down, not to mention forecasting potential health concerns. I’m sure you can guess the difference, but I’ll state it anyway: in these other examples we can draw from historical data, analytical research, individuals’ input based on their experience, and a vast array of data to determine more accurately what is likely to happen. Once you know this, you can begin to plan for these possibilities or scenarios. Seems pretty logical, right?

We know how much information companies capture today about their customers, employees’ insights, internal operations, and external market conditions; lack of data is obviously not the problem. Yet in a lot of companies this practice does not happen with regularity. Companies aren’t using their most valuable resources for forecasting – their people and their data – to develop this in-house capability.
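As a toy example of drawing on historical data, the sketch below fits a straight trend line to five made-up quarters of revenue with ordinary least squares and projects the next quarter. Real forecasting would blend many business drivers and external data, not a single trend; the figures here are invented for illustration.

```python
# Sketch: project next quarter's revenue by fitting a least-squares
# trend line to historical quarters. All figures are made up.

def fit_trend(series):
    """Least-squares slope and intercept of y over t = 0..n-1."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    return slope, y_mean - slope * t_mean

revenue = [10.0, 11.0, 12.5, 13.0, 14.5]   # last five quarters ($M)
slope, intercept = fit_trend(revenue)
next_quarter = slope * len(revenue) + intercept
print(round(next_quarter, 2))   # → 15.5
```

Even a model this simple turns "we hope next quarter is better" into a stated, testable expectation that people and their experience can then challenge and refine.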
by Larry Bossidy & Ram Charan
by Bjarte Bogsnes
by Chip & Dan Heath
Financial Performance Management for the Empowered CFO (using IBM Cognos TM1, IBM Cognos Controller & IBM Cognos BI for Scorecarding)
Tim O'Bryan 270001NMX7 firstname.lastname@example.org | | Tags:  timobryan businessanalytics businessanalyticstoday provenpractices | 0 Comments | 1,148 Visits
The Close, Consolidate, Report & File process: Without Automation & Embedded Controls It’s A House Of Cards
Tim O'Bryan 270001NMX7 email@example.com | | Tags:  businessanalytics financialperformancemanag... businessanalyticstoday ccrf close-consolidate-report-... timobryan | 0 Comments | 1,108 Visits