
Management decision-making typically involves a three-step process of inform, analyze and act. In the earliest days of what came to be known as business intelligence, developers created decision support systems that provided information and analytics to help executives and high-level managers choose the best course of action. Working with numbers rather than gut instinct is still viewed as a best practice. After all, a pilot who doesn’t trust his or her instruments is heading for an accident.

Over the past couple of decades, crude, expensive corporate information systems have become more comprehensive and affordable as organizations collect broader sets of data from a wider range of functional organizations and processes and apply more sophisticated analyses to these data sets. Today, dashboards and scorecards are pervasive even in midsize and smaller companies. Yet alongside the explosion of data available to executives and managers and the tools to make sense of it, there is a stark reality: It’s not enough to know; action is required. This isn’t news to people who have to make those decisions, but most of the software demonstrations I see stop with the “inform” and the most basic part of the “analyze” stages of the three-step process. These sorts of vendor presentations assume that this is all decision-makers need to make consistently good decisions, but from my perspective this level of capability falls under the heading of “necessary but insufficient.”

More is needed and, happily, more is possible. Most organizations use analytics to make sense of the past. While this is useful, it also is not enough to be actionable. Addressing shortfalls and capitalizing on successes are a good first step, but these are essentially reactive approaches. Having a better understanding of what might happen in the future and weighing the implications of potential actions is more valuable to a line manager or an executive looking to steal a march on the competition. Today, to address these shortcomings, more powerful business analytics are becoming accessible to generalists. New computing architecture (notably in-memory processing) is making it easier for companies to do more interactive and collaborative contingency analysis and planning to explore the implications of future actions.

For several years Ventana Research has been providing research on and stressing the importance of using predictive analytics. Predictive analytics helps enhance the accuracy of forecasts, often by detecting unseen drivers of results, giving companies greater precision in projecting future sales, expenses and operating results. Predictive analytics also provides a baseline set of expectations that can be used for early detection of departures from expected results. At the start of a holiday selling season, for instance, such departures could signal the need for early discounting or (if still feasible) allocating a product in high demand to the best customers. Predictive analytics also can be used to manage supply chains and to project cash flows more accurately.
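To make the baseline-and-departure idea concrete, here is a minimal Python sketch (all figures and the 15 percent tolerance are hypothetical): a trailing average stands in for a real predictive model, and actual results that fall outside the tolerance band around the expected value are flagged for review.

```python
# Hypothetical sketch: a trailing average stands in for a predictive model;
# actuals that depart from the expected value by more than a tolerance are
# flagged, e.g., as an early signal that holiday-season demand is soft.

def baseline_forecast(history, window=4):
    """Expected value: mean of the last `window` observations."""
    return sum(history[-window:]) / window

def flag_departures(actuals, expected, tolerance=0.15):
    """Indices where actual results deviate from expectations by > tolerance."""
    return [i for i, (a, e) in enumerate(zip(actuals, expected))
            if abs(a - e) / e > tolerance]

history = [100, 104, 98, 102]                 # prior weeks' unit sales
expected = [baseline_forecast(history)] * 3   # flat expectation, next 3 weeks
actuals = [101, 84, 99]                       # week 2 falls well short

print(flag_departures(actuals, expected))     # -> [1]
```

In practice the baseline would come from a fitted model rather than a moving average, but the alerting logic — comparing actuals to modeled expectations — is the same.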

Leading indicators are another tool that companies use less well than they could. Especially in the areas of demand analysis, supply chain risk and cost projection, such indicators can enable companies to anticipate future changes in markets and gain additional time to develop contingency plans or alter strategy.
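The lead-lag relationship behind such indicators can be illustrated with a short Python sketch (both series are invented): shift a candidate indicator forward by each plausible lag and keep the lag at which it correlates most strongly with later outcomes.

```python
# Invented data: find the lag at which a candidate leading indicator best
# predicts a business outcome, using plain Pearson correlation.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def best_lead(indicator, outcome, max_lag=4):
    """Correlate the indicator, shifted by each lag, with later outcomes."""
    scores = {lag: pearson(indicator[:-lag], outcome[lag:])
              for lag in range(1, max_lag + 1)}
    return max(scores, key=scores.get), scores

indicator = [1, 3, 2, 5, 4, 6, 5, 7]   # e.g., an order-inquiry index
outcome = [0, 0, 1, 3, 2, 5, 4, 6]     # shipments echo it two periods later

lag, scores = best_lead(indicator, outcome)
print(lag)  # -> 2
```

A two-period lead like this is what buys planners the extra time to adjust contingency plans before the change shows up in results.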

Contingency planning and what-if analysis give people the ability to make better decisions more consistently and with greater agility. Thinking about pilots again, they use simulator training to help make the right decisions faster in the event of an emergency. This sort of planning can enable executives and managers to react sooner and with greater confidence when conditions change and even help determine how best to modify strategy or tactics in a volatile business climate. However, most midsize or larger companies have found it challenging to do contingency planning because technology limitations have made it impractical except at a very high level. This sort of planning is best done interactively in a collaborative setting. Before in-memory computing systems were applied to this process, response times on many enterprise systems were typically too slow. If desktop spreadsheets are employed, it can take hours or days to recreate a scenario with a couple of changes in major assumptions.
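The driver-based what-if analysis that in-memory systems make interactive can be sketched in a few lines of Python (drivers and values are hypothetical). The point is that a scenario is just a different set of driver values, so alternatives can be recomputed and compared instantly rather than rebuilt by hand.

```python
# Hypothetical driver-based model: operating income as a function of a few
# drivers. A scenario is simply an overridden set of driver values.

def operating_income(drivers):
    revenue = drivers["units"] * drivers["price"]
    cogs = drivers["units"] * drivers["unit_cost"]
    return revenue - cogs - drivers["fixed_opex"]

base = {"units": 10_000, "price": 25.0, "unit_cost": 14.0, "fixed_opex": 60_000}

scenarios = {
    "base": base,
    "soft_demand": {**base, "units": 8_500},
    "discounting": {**base, "price": 23.0, "units": 11_000},
}

for name, drivers in scenarios.items():
    print(f"{name}: {operating_income(drivers):,.0f}")
# base: 50,000 / soft_demand: 33,500 / discounting: 39,000
```

A real planning model has far more drivers and line items, which is exactly why sub-second recalculation matters for keeping a group discussion moving.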

It would be great if action-oriented decision support systems also provided the framework and capabilities for people to work collaboratively to follow through on decisions once they are made and to track the necessary follow-ups to see the process through to completion. Only with these capabilities could a system rightly be characterized as “closed loop.”

I see performance management, business intelligence and analytic applications becoming increasingly action-oriented over the next three years. Information technology can be of greater value to executives and managers in supporting their assessments and decision-making, and ensuring follow-through once a decision is made. But I question how long it will take before using the technology to do this becomes a standard feature of business. Some technology-inspired changes in behavior have happened rapidly over the past decade, but these have mainly been on the consumer side. While business computing has spurred changes in how organizations work, most businesses have been focused on efficiency by automating processes or eliminating the need for middle managers to coordinate activities. The kinds of changes to, for example, corporate planning that are now feasible require companies to set new expectations for planning and to alter how they plan. Although I believe the need for change is compelling, I fear adoption will take longer than it should.

Regards,

Robert D. Kugel – SVP Research

At its annual Influencer Summit in Boston, SAP offered multiple perspectives on where the company’s strategy and products are heading. Overall, I was struck by the essential similarities to its message on its strategic direction a decade ago. The overarching objective in its roadmap now, as then, is to have information technology increasingly adapt to the needs of individual users and how they choose to execute established/repetitive or ad-hoc processes, rather than forcing them to adapt to the limitations of the technologies they are using. Back then the idea was to create a comprehensive process framework – a closely coupled approach. Today, it’s essentially the opposite, as SAP products run on an architecture that enables flexibility – a loosely coupled approach – both in how the computing infrastructure is organized and how people execute their tasks. It seems to me that this reflects the impact of having choices between cloud-based software as a service (SaaS) and on-premises systems and the need to enable access through a variety of devices (from desktops to mobile handhelds and tablets). Mobility is important both for people whose roles take them beyond the firewall (in sales, service and logistics, for example) and executives and managers who often find themselves managing by walking around. Tablets, smartphones and similar devices are attractive also because people consider them personal items and associate them with fun, whereas desktops and notebooks are corporate and work-related.

Architecture drives product design, and SAP continues to stress HANA and the ability of its in-memory system to expand the scope and capabilities of applications that run on it. That makes sense since any in-memory computing platform can transform how software is used. The challenge then becomes transforming the habits of users. For example, I’ve noted the need for more contingency planning. One reason it’s not used more is that the latency between thought and answer in complex scenario analyses on disk-based systems is often too long to be useful in promoting a collaborative dialogue around possible situations and their potential outcomes. “Too long” is a relative thing, of course, but based on my experience, the outer edge in this case may be 10 seconds to 1 minute. Once the technology foundation is in place, the hard work begins. Companies have to understand what is technologically feasible and that they need to adopt better planning techniques, notably driver-based planning. From my perspective, few technology advances have immediately led to forehead-slapping, “aha!” moments. The spread of in-memory technology into business processes is therefore never automatic, so one of SAP’s challenges is to create demand for HANA by promoting improved management techniques that are supported by in-memory computing. So the technology is different, but the business issue is much the same.

Since it offers differentiation in an increasingly commodity-like business computing product market, advanced analytics was a key theme at the summit. Analytics are an increasingly important capability for organizations, enabling companies to manage more effectively, not just efficiently. Predictive analytics, for example, should play a role in more than the 13 percent of organizations that our research shows are using them. They have become much more accessible but aren’t well understood. Predictive analytics certainly help in forecasting, but they’re also handy for spotting exceptions from expected results, especially when companies have to work with large data sets. Departures from expected results can provide the basis for management alerts and notifications, as when an order from a regular customer or an invoice payment is not received within the normal period. Both situations can indicate customer issues. A follow-up to the former might uncover that a competitor is offering promotional deals to gain market share, and the company could counter the move sooner. The latter might be the result of some issue that occurred in fulfilling the order, in which case it would be better to have the first communication with the customer be an immediate note of concern rather than a dunning notice weeks later. Analytics is an important theme in business computing, and SAP will need to focus on it in its product efforts.
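The late-order alert described above amounts to a simple statistical exception test, sketched here in Python with invented figures: flag a regular customer when the gap since the last order stretches well beyond that customer’s historical ordering rhythm.

```python
# Invented figures: alert when the current gap since a customer's last order
# exceeds the mean of past gaps by more than k standard deviations.

from statistics import mean, stdev

def is_overdue(order_gaps_days, days_since_last, k=2.0):
    """True when the current gap is an outlier versus the customer's history."""
    return days_since_last > mean(order_gaps_days) + k * stdev(order_gaps_days)

gaps = [28, 30, 31, 29, 30]   # this customer usually orders about monthly

print(is_overdue(gaps, 45))   # -> True: worth a proactive call
print(is_overdue(gaps, 31))   # -> False: still within normal variation
```

The same per-customer threshold logic applies to overdue invoice payments; the value is in surfacing the exception early enough for the first contact to be a note of concern rather than a dunning notice.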

Risk management is yet another area where real-time data can be an important enabler of capabilities that are not practical without the ability to crunch substantial amounts of data rapidly. Outside of financial services, few industries manage risk comprehensively, and even financial services can put more of their operations into a risk management framework. In part this situation reflects the fact that few industries have developed a framework for measuring risk objectively. Part of this is historical: Financial services have always been about numbers, and it’s straightforward to use these numbers to measure risk. Financial ratio analysis therefore can be applied to assessing many operational risks in that industry. And since financial services was one of the first industries to use computers to handle operations, these companies have long experience in using software to manage risk. By contrast, only within the past few decades have other types of companies begun using IT systems to manage operations. Using data to identify and track operational risks is still nascent, so any discussion of how far it has developed strikes me as premature. Yet I believe risk management in consumer, industrial and business services should be on the radar screens of executives, especially because it addresses the agency dilemma. SAP’s risk management software portfolio could expand substantially over the next several years to address the need. (I expect the same from Oracle and IBM, along with many smaller vendors.) But here again businesses must become aware of the need before a market will grow.

SAP Business ByDesign is a topic worthy of its own blog, so I’ll post one on this topic shortly.

It struck me that this Influencer Summit demonstrated SAP’s understanding of what it needs to accomplish over the next few years to be competitive with its substantially larger rivals – IBM, Oracle and, in applications for small and midsize businesses, Microsoft. Strategy is one thing but execution – as ever – is probably more important. Here, SAP must demonstrate that it can operate at faster clock speed than it has in the past to maintain its top-tier position in business computing.

Regards,

Robert D. Kugel – SVP Research
