Tim O'Bryan 270001NMX7 TIM_OBRYAN@yahoo.com | | Tags:  timobryan financialperformancemanag... businessanalyticstoday businessanalytics | 0 Comments | 2,591 Visits
Finance teams are the performance management captains of the corporate ship, empowered to provide the executive team and the business and support units with real insight into past, present, and future performance, and to guide each constituent on what the information means and how it can be interpreted for decision making. That is their most strategic value to the business. However, the explosion of information, the speed of business, systems sprawl through M&A, the unique information-capture needs of different areas of the business, and growing business-practice demands, such as regulatory compliance with regional, national, and global reporting requirements, have left the corporation's most important analytical asset, the finance department, mired in just completing basic week-to-week tasks rather than providing this strategic guidance. As a result, emerging trends, exploitable opportunities, and efficiency gains go unnoticed.
In the 1960s, IBM was the 800-pound gorilla of the mainframe business, its technology supremacy unchallenged and its superior performance continuing virtually unabated into the 1970s. They were the blue suits bearing information-based mainframes that helped companies use data to run their large, sometimes multinational, businesses with greater know-how about customers, products, operations, and financial performance than any other technology could. Yes, they were considered the masters of product innovation, largely due to world-class business practices and industry expertise. However, Big Blue grew complacent and far too comfortable in its long-held pole position. Its inflated confidence and market share eventually disintegrated as it missed the advent of a new information-based technology which, with the right analytic capabilities in place, it would have seen coming: the minicomputer. Minicomputers were technologically simpler and less powerful than mainframes, but they were far cheaper to buy and required fewer resources to run. To be fair, it wasn't just IBM that missed the advent of minicomputers; it was virtually every mainframe company in existence at the time. The new technology so disrupted the mainframe business that no mainframe maker became a major player in the minicomputer market.
What happened? What was missed? Who screwed up?
In my opinion, many failures caused this emerging technology to go unaddressed by IBM and others, but the chief culprit, the function that could and should have been prepared for it, was finance. Yes, I think it's up to finance, as the owner of business performance (past, present, and future), to fundamentally understand the business climate, internally and externally, and then advise its corporate constituents on what the analyzed information means to them. For this to happen, finance needs to get a handle on its core responsibilities before it can begin to spot the performance-sapping icebergs that can turn into business-shuttering threats.
Let’s get back to the technology story for a minute if that’s okay. Then, I’ll finish my point.
Where were we? IBM's out because the mainframe business has gone south, way south, by way of the minicomputer. Exit IBM. Enter Digital Equipment Corporation. DEC virtually created the minicomputer business, along with a few other aggressively managed companies like Data General, Prime, Wang, Hewlett-Packard, and Nixdorf. Did DEC and the others learn any lessons from IBM's big miss on the minicomputer market, so as not to repeat the same mistake? Of course not. The story of DEC's demise echoes IBM's mainframe debacle of the 1970s almost too closely. In fact, the management gurus and business journals missed it too. Digital Equipment Corporation was considered, by everyone with insight into the company's operations, the ultimate technology company for decades to come. Indeed, it was a featured company in the McKinsey study that became the stellar 1980s management book In Search of Excellence. DEC seemed destined for monster success. Still, despite all this fanfare, DEC missed the next wave in computing technology: the desktop computer market.
Again, where was finance, watching past performance by measuring and monitoring it, to gain analytical insight into what the future might look like, and then advising its constituents across the business on what all of this means to each of them? Answer: heads down.
The desktop computing market was seized not by DEC or one of its minicomputer compadres but by Apple Computer, Tandy, Commodore, and IBM's PC division. (Yes, IBM can't be held down for long!)
What happened next? As the line so often misattributed to Casablanca goes, "Play it again, Sam." Apple, Tandy, IBM, and the rest of the desktop computer gang focused on making the best desktop computers they could but ended up missing the next new, new thing. Apple Computer and IBM lagged five years behind in bringing the latest technology rage to market: the engineering workstation. That market was owned by Silicon Graphics, Sun, and Apollo, all newcomers to the business.
In each case, the leading companies mentioned were regarded as the gold standard for their product excellence and operational execution, only to be quickly pushed aside by an out-of-nowhere, technologically superior solution that reset the market's expectations and rendered the prior leader's offering frumpy and stale. Missing emerging trends in the marketplace and failing to adapt quickly enough can sound the death knell for most companies. Think Wang, Silicon Graphics, Apollo. For others, the misstep can set them back five or even ten years before they're back on their feet.
As a note, in the above example I simply chose the technology sector, but we could just as easily have used retail merchandising (think Sears vs. Nordstrom), retail books (think Barnes & Noble vs. Amazon), or automotive (think GM vs. Toyota). Each is an example of a failure to see the changing landscape, and in each, I believe, finance bears much of the blame for the squandered opportunity.
Why finance? I think finance failed its company in each instance because it wasn't effective enough at managing the day-to-day, low-value tasks. With those under control, it would have had far greater leverage to spend time on higher-value practices like forecasting and business analytics, uncovering data points that help the entire business spot emerging market forces before it's too late to respond. The responsibility to identify these threats and opportunities lies squarely with finance. If not finance, then who? Be careful: whomever you name will probably expect finance to provide them with meaningful insight into performance results across the business, as well as external information, which, again, means it's incumbent on finance.
So how does finance get to the point where it can provide this kind of insight with the resources it has, because Lord knows it's not going to get additional headcount? It all starts with finding a way to better leverage those resources. Finance teams need to automate the lower-value tasks as much as they can, off-loading those process-management steps to free up capacity for analytic practices.
What are the world-class finance teams doing to be the analytic leaders in their industries? They have made these foundational practices consistent and repeatable from end to end:
These practices are the foundational elements required for finance to be the advisor providing guidance to the business. Excel at them and you will soon be positioning the analytic experts on your finance team to do the real analysis they're supposed to be doing. It's incumbent upon the CFO's finance department to provide this guidance and leadership, given finance's role as the performance manager for the company. It is therefore finance's job to provide insight not only into past, present, and future performance but also into the trends, anomalies, and market opportunities that become visible only after thorough analysis of business results gleaned from systems (ERP, CRM, SCM, etc.) and processes (forecasting, what-if scenario analytics, etc.).
This finance role is expected not only to explain past business performance and its financial effects but also to advise and guide strategy in determining where to invest the resources at hand. The CFO's analytics team, finance, needs to spend its time not just on the everyday execution of basic, low-value process steps, like compiling, validating, and reconciling data for various internal and external reporting needs, but on analyzing the past, present, and future to offer guidance on what happened, what's happening now, and what could happen. Only with an infrastructure in place to easily manage these basic elements of the finance team's mandate can the real value-added analytic insights come to light. Otherwise, their companies will continue to drive through the business climate with a perpetual blind spot for what's coming, rendering them the next Tandy, Silicon Graphics, or Apollo.
It's up to you, finance, to not let this happen.
Learn the value of the new IBM Cognos Planning 10.1.1 (GA November 22). You will be pleased to learn of this release, as it affirms IBM's continued commitment to ongoing support of, and value-added enhancements to, the IBM Cognos Planning solution. The release fulfills our customers' latest requests with:
- features for greater ease and speed;
IBM Cognos Planning 10.1.1 delivers additional functionality for contributors (end users), faster access to data for reporting, improved installation features, and conformance with IBM Cognos BI 10.1.1, Microsoft Excel 2010, and other key solutions.
Financial Performance Management for the Empowered CFO (using IBM Cognos TM1, IBM Cognos Controller & IBM Cognos BI for Scorecarding)
Drive Better Performance Thru Greater Finance Integration with the Sales & Operations Planning Process
Imagine entering the cockpit of a modern jet airplane and seeing only a single instrument there. How would you feel about boarding the plane after the following conversation with the pilot?
Q: I'm surprised to see you operating a plane with only a single instrument. What does it measure?
A: Airspeed. I'm working on airspeed this flight.
Q: That's good. Airspeed certainly seems important. But what about altitude? Wouldn't an altimeter be helpful?
A: Altitude can wait. For the next few flights I'm concentrating on airspeed.
We suspect you wouldn't board the plane after this discussion. Even if the pilot did an exceptional job on airspeed, you would be worried about colliding with tall mountains or running low on fuel. Clearly, such a conversation is a fantasy, since no pilot would dream of guiding a complex vehicle like a jet airplane through crowded airspace with only a single instrument.
This is an often-cited story among business strategists and other management prognosticators, which I will attribute to Drs. Robert Kaplan and David Norton, pioneers of the Balanced Scorecard. It's intended to show how critical the indicators we set up really are, not only for pilots but for your entire workforce, because those indicators serve as the guiding force behind their decision-making.
Why is this so important? For many reasons, starting with a substantially changed business environment: no longer can a company operate rudderless, without a core set of metrics to steer each of its employees, individually and as a collective unit, in the right direction. That right direction is the enterprise strategy. The speed at which decisions are made seems to have increased exponentially in just the past five years. The days of top-down, command-and-control authority over decision-making are all but over, giving way to a more nimble, decentralized execution hierarchy intended to keep pace with the velocity of competition and customer expectations. The need to get relevant, actionable information to business users has never been more pronounced than it is today. If you can't react fast enough to market realities, your customers will go elsewhere. We live in a world where product and brand loyalties are increasingly a thing of the past. It's about execution. Good execution is about making smarter, more informed decisions that support the organization's goals.
These decisions are being made across all levels, geographies, and functional areas of the business every day. For this post I want to zero in on the first question asked under measuring and monitoring the business: how are we doing?
Sure, the executive suite is constantly measuring and monitoring overall business performance to ensure the company is on track to meet its strategic targets. The function leads in marketing, sales, finance, HR, and development, all the way down to the individual-contributor levels of the organization, are measuring and monitoring the performance of their areas too. But how does everyone know they're doing the right things at all times? What are their real priorities in helping the organization achieve its goals? Is it guesswork? Is it trust that the entire workforce will naturally make the right decisions in support of top-line goals? How can we be so sure?
The fictional story referenced at the beginning of this post is really about measuring and monitoring, not an aircraft, but your business, through a tool called a scorecard.
There are personal, departmental, and enterprise scorecards. A scorecard contains the key performance indicators, or KPIs, for which, in the case of a personal scorecard, an employee is responsible. If these KPIs are correctly defined, they include measurements that, viewed in aggregate, support the enterprise's top-line strategic goals and objectives. Inevitably, some KPIs on a personal scorecard will carry shared targets, either within a specific functional area of the business (think of a Marketing Director and Marketing Associate with similar campaign targets) or as KPIs shared across functional groups like marketing, sales, procurement, and development.
The actual KPIs, typically about 6 to 10 per individual, are critical because they define the actions the individual will take to meet the targets they're responsible for. The ultimate alignment, via scorecards composed of KPIs across business groups, departments, divisions, business units, etc., is the embodiment of what we call a company's strategy execution framework.
A Harvard Business School study of this framework found that "a 35% increase in Strategy Execution leads to 30% gain in shareholder value". That's a pretty strong argument for at least taking a harder look.
How do you deploy such a framework, you ask? In theory it's very simple. You translate the business strategy and its related goals into a set of performance indicators that set the targets for which each department, and each employee within each department, is responsible, and away you go, right? Yes, I know. It sounds easy in theory. In practice it's a little more work.
The key is working top-down with each business and support unit to translate its contribution toward the higher-level targets, so that the lower-level, cascaded measurements, or KPIs, will, when rolled up in total, tie directly to the enterprise's top-level strategic goals. This ensures proper alignment of the organization while providing an ongoing set of metrics by which the workforce can measure itself.
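To make the cascade concrete, here is a minimal sketch in Python of how departmental KPI attainment might roll up into an enterprise score. Every KPI name, target, and weight below is a hypothetical example, not a recommended set of metrics; real scorecarding tools model this far more richly.

```python
# Sketch of a cascaded KPI rollup. All KPI names, actuals,
# targets, and weights are hypothetical illustrations.

def attainment(actual, target):
    """Attainment as a fraction of target, capped at 120% to limit outliers."""
    return min(actual / target, 1.2)

# Each entry: (kpi_name, actual, target, weight within its scorecard).
# Weights within a scorecard sum to 1.0.
marketing_kpis = [
    ("qualified_leads", 930, 1000, 0.6),
    ("campaign_roi_pct", 14, 12, 0.4),
]
sales_kpis = [
    ("new_bookings_usd", 4.6e6, 5.0e6, 0.7),
    ("win_rate_pct", 22, 25, 0.3),
]

def scorecard_score(kpis):
    """Weighted attainment for one departmental scorecard."""
    return sum(w * attainment(actual, target) for _, actual, target, w in kpis)

# Enterprise scorecard: weighted rollup of the departmental scorecards.
enterprise = {"marketing": (marketing_kpis, 0.4), "sales": (sales_kpis, 0.6)}
overall = sum(w * scorecard_score(kpis) for kpis, w in enterprise.values())
print(f"overall strategic attainment: {overall:.1%}")
```

The design point this illustrates is the one in the text: because each lower-level score is defined in terms of its parent's weights, changing the top-level strategy only means re-weighting and re-targeting, not rebuilding the whole measurement structure.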
Even more important in defining the right KPIs is understanding that whatever the indicators are, they will determine the individual's behavior, so take care as you define them. Something else that makes this framework so effective: when top-level strategies change, it is that much easier to reset the workforce, because the infrastructure is already in place to restructure the scorecards. This allows the company to adapt more quickly.
Think about deploying such a framework for your organization. The best incentive I can give you for taking on the effort is that the KPI definition process for each set of scorecards forces discussions across functions, within departments, and at the executive level that expose how achievable the targets really are with the resources in place today, and who is ultimately responsible for what. This is just about the most important exercise a company can go through to make sure it isn't setting itself up for failure because its strategy isn't attainable with current resources. Once the KPI definition process is complete and everyone knows who's doing what and where the synergies lie, it's all about execution. This framework sets companies up to execute well because they've already identified their needs and the resources at their disposal; now it's a matter of delivering. It's go time.
If done right this will be the outcome for your organization:
More coming on this subject. Stay tuned. In my next post I’ll tell you some of the best practices in defining the right KPIs for personal scorecards.
Follow Business Analytics Forum, our annual users conference in Las Vegas, NV, October 23-26th, on Twitter @ #baforum!
“He who hesitates loses.”
The ugly truth of this phrase rings true in the case of battles that never occurred during the American Civil War. For those unfamiliar with it, the American Civil War was fought on U.S. soil from 1861 to 1865 between the North (the Union states) and the South (the Confederate states), and was triggered primarily by the election of Abraham Lincoln in 1860. Its definitive starting point came on April 12, 1861, at 4:30 a.m., when the first Confederate shot hurtled into Fort Sumter, a Union stronghold at the entrance to the harbor of Charleston, South Carolina.
The conflict continued until Confederate General Robert E. Lee surrendered to Union General Ulysses S. Grant at Appomattox Court House on April 9, 1865. At the outset of the war, the North was better organized and better equipped, with a much larger pool of troops, not to mention far more resources at its disposal than its southern counterpart. Despite the South's opening win at Fort Sumter, a largely symbolic victory, many predicted the North would prevail swiftly and decisively. In 1861, President Lincoln was resolutely confident that a string of Union victories on the march from Washington, D.C. to the Confederate capital of Richmond, Virginia would be enough of a blow to force the Confederates into total surrender. The key was that the Northern army needed to move quickly from Washington down to Richmond to dismantle the South's capital and central command post: without the shepherd (Richmond), the sheep (the Southerners) would lack direction, forcing an end to the war. Everything pointed to an assured victory for the North. Or so it seemed. "On to Richmond!" was the Northern cry. Lincoln gave the job of leading the Union army to General George B. McClellan. McClellan was revered for his many talents. Smart, pedigreed, and more than capable, he had the trust of all Northerners. March to Richmond, plant the Union flag, and McClellan was assured to be a hero. But no battle never fought was ever won. Despite overwhelming odds in his favor and indisputable evidence that his army far outnumbered the Southerners, General McClellan repeatedly hesitated to march his troops into battle against the Confederate army that stood between him and his ultimate objective, Richmond.
The Confederate capital was literally within McClellan's sights during these hesitations, but he never made it there because he failed to act. Instead, after a few defensive skirmishes with Southern forces, and despite multiple requests from Lincoln to fight, McClellan all too eagerly retreated hat in hand to Washington, D.C., giving up ground to the South's General Robert E. Lee, who then marched aggressively north into Union territory at Antietam. What a turnabout. I wonder if this is what prompted Lincoln to later quip, "I can make more generals, but horses cost money." Obviously, Abe had to mind his dollars and cents, given that this would be a long and protracted war.
My point isn't to denigrate the character of General McClellan. Not at all. He did many exceptional things in his life of which he should be proud. What I mean to do is illustrate that sitting idle, even when there's overwhelming evidence in favor of acting, is a missed opportunity. We've all done this in one way or another. We wait to act. It happens to the best of us. Often, taking action means changing from the norm, and regardless of the ultimate benefits, we resist that change because, well, it's change. That alone causes a lot of people to hit their personal pause button and do nothing.
Let's look at this from a different perspective. Have you ever had your master bathroom redone? If so, you'll know what I'm talking about. Tons of benefits here: maybe a new Jacuzzi tub, a nice steam shower, more cabinet space, not to mention your own sink this time around. Still, most people resist a project like this because it means losing the bathroom for quite some time before it's ready for prime time. Big benefits to upgrading your bathroom, but there will certainly be some inconveniences (read: change) before it's usable.
No matter how necessary the project, whether it's redoing your master bathroom or marching regiments of 100,000 men into enemy territory, don't let the forces of hesitation get the best of you. Yes, there will be initial adjustment pains as you go through the process, but keep your eye on the ultimate prize, and, when applicable, make sure you're keeping everyone else's eye on the prize too.
If you see measurable benefits justifying an investment in something, look past the initial ramp-up time and just go for it.
(Shameless plug.) If you're thinking of deploying business analytics solutions to enable critical business processes, at a minimum take stock of the possibilities. Look at processes such as the following:
…take stock of the best practices in one or two of these areas and see how your company measures up. Perhaps this is an opportunity to drive a planning and analytics optimization initiative across finance and the rest of the organization (check out IBM Cognos TM1). Maybe you want to automate your financial statement reporting practices (check out IBM Cognos FSR), or review your risk management practices (check out IBM OpenPages). Look at the processes first. See what's preventing them from reaching a best-practice level; there could be many reasons. Then look at the enabling solutions out there that address those processes. Whatever you do, don't hesitate just because there may be extra work in the investigation or implementation phase. Once it's up and running, you'll be glad you did.
The measurable benefits of adopting these solutions, such as automation, embedded controls, workflow management, and minimal administration, allow for higher-frequency forecasting, stronger analytic capabilities, real-time reporting, effective scenario analytics, best-in-class predictive analytics, and rigorous statutory reporting and risk management. With you championing the project, they can be your path to success while the organization benefits from the bottom-line ROI. Everybody wins. And because you acted, you can become the figurehead for it too.
"Fortune favors the bold."
IBM Cognos TM1 is like a Swiss Army knife: it can do a lot of different things extremely well. One practice where it's being used with incredible results is an emerging, but not so well understood, one called profitability modeling and optimization. The power of IBM Cognos TM1 is a primary reason this practice has emerged, and companies adopting it as their technology of choice for the practice are thriving, moving from also-rans to industry leaders. Still, this post is less about IBM Cognos TM1 and more about the process itself: profitability modeling and optimization.
When talking about profitability modeling and optimization, it's most illustrative to start with the ever-important strategic element of price: the set amount that can make or break a product or service, or even the company's acceptance by the marketplace. Price, of course, is most often set by what the market is willing to pay. Many factors can go into your pricing strategy, including the brand equity and image of the product or service (think Tiffany's), expected sales volumes (think economies of scale), competitive factors (are you the only game in town?), or whether a new category is being created. We know that companies profit by selling the product or service at a margin greater than its cost; then a profit is turned. What you do with those profits is up to the company. Do you reinvest them? Most do. If so, how do you allocate them across the business for the most profitable return? Do you release some to shareholders via dividends? What about making acquisitions? These are important decisions, driven by the future needs of the business. It's a good problem to have.
Enter profitability modeling and optimization.
To sustain competitiveness in the marketplace, organizations must be able to model their profitability, maximizing profits and optimizing the cost-incurring components of profitability. Most companies perform some form of this every day. The BIG difference is that some are much better at it than others. Why? What separates them from the rest? It comes down to best-in-class organizations investing in their people, the related processes, and a capable technology.
Ask yourself the following questions:
Now, how would your colleagues in Sales, HR, Marketing, IT, Operations, etc. answer these questions?
It's critical to know that every area of the business is looking at some element of profitability. Running scenario, what-if, and profitability analyses with a 360-degree view of the data is the lifeblood of value-added analysis for making smarter decisions, which, done effectively, translates into better corporate performance. Without better insight into the information derived from profitability modeling and optimization, business users are forced to go on gut feel and/or out-of-date information, or, worse, they wait until they get the information they need before they can act…tick, tick, tick. Time is money.
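As a toy illustration of the kind of what-if question this practice answers, here is a minimal sketch in Python. The product figures and the scenario assumptions (a 5% price cut lifting volume 12%) are entirely made up; real profitability models in a tool like TM1 span many more drivers and dimensions.

```python
# Toy what-if profitability model. All figures are hypothetical.

def profit(units, price, unit_cost, fixed_cost):
    """Gross profit for one product line."""
    return units * (price - unit_cost) - fixed_cost

base = dict(units=10_000, price=49.0, unit_cost=28.0, fixed_cost=120_000)

# Scenario: cut price 5% and assume volume rises 12%. Does margin improve?
scenario = dict(base, price=base["price"] * 0.95, units=int(base["units"] * 1.12))

base_profit = profit(**base)
scenario_profit = profit(**scenario)
print(f"base: {base_profit:,.0f}  scenario: {scenario_profit:,.0f}  "
      f"delta: {scenario_profit - base_profit:+,.0f}")
```

In this made-up case the volume lift does not cover the margin given up by the price cut, which is exactly the sort of non-obvious answer that scenario analysis surfaces before, rather than after, the decision is made.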
If you and the rest of the workforce don't have this information at your fingertips, don't fret: you're not alone. But things are changing. Companies are investing more and more in this practice area as technology has caught up, allowing massive data volumes to be sliced and diced in seconds for this very purpose.
Now is the time to act. The technology is there, so I'd ask why your organization isn't there too. Hustle up. I know a lot of companies doing this practice very well, and, unfortunately, some that aren't; the laggards are the ones now looking to leverage IBM Cognos TM1 to put them on the path to better profitability modeling and optimization.
If you want to know more about this subject feel free to email me to discuss further.