IBM hosted the Big Data and Analytics Analyst Insights conference in Toronto recently to emphasize the strategic importance of this topic to the company and to highlight recent and forthcoming advancements in its big data and analytics software. Our firm followed the presentations with interest. My colleagues Mark Smith and Tony Cosentino have commented on IBM’s execution of its big data strategy and its approach to analytics. As well, Ventana Research has conducted benchmark research on challenges in big data.

The perennial challenge for the IT industry observer is to be skeptical enough to avoid being taken in by overblown claims (often, the next big thing isn’t) without missing turning points in the evolution of technology. “Big data” has always been with us – it’s the amount that constitutes “big” that has changed. A megabyte was considered big data in the 1970s but is a trivial amount today. There’s no question that the hype around big data is excessive, but beneath the noise is considerable substance. Big data is of particular interest today because the scope and depth of the data that can be analyzed rapidly and turned into useful information are so large as to enable transformations in business management. The effect will be to raise computing – especially business computing – to a higher level.

The IBM event demonstrated that the technology for handling big data analytics to support more effective business computing is falling into place. It’s not all there yet but should improve strongly over the next several years. Yet while technological barriers are falling, there are other issues organizations will need to resolve. For example, the conversation in the conference sessions frequently turned to the people element of successfully deploying big data and analytics. These discussions confirmed an important finding in our big data research, shown in the chart, that more than three-fourths of organizations find staffing and training people for big data roles to be a challenge. It’s useful to remind ourselves that there will be the usual lag between what technology makes possible and the diffusion of new information-driven management techniques.

The conference focused mostly on the technology supporting big data and analytics but included examples of conceivable use cases for that technology. For example, there was a session on better management of customer profitability in the banking industry. Attempts to use software to optimize customer profitability go back to the 1980s, but results have been mixed at best, and a great many users continue to perform analysis in spreadsheets using data stored in business silos. IBM speakers described a generic offering incorporating Cognos TM1 to automate the fusion of multiple bank data sources that feed a range of profitability-related analytic applications. The aim is to enable more precise decisions on pricing, rates and fees, among other factors. This application enables consistent costing methodologies, including activity-based ones for indirect and shared expenses, to promote a more accurate assessment of the economic profitability of offers to customers. A good deal of the value in this offering is that it puts the necessary data in one place, giving executives and managers a consistent and more complete data set than they typically have. As well, the product’s use of in-memory analytic processing enables much faster calculations. Faster processing of a more complete data set enables more iterative, what-if analyses that can be used to explore the impact of different strategies in setting objectives for a new product or service or examining alternatives to exploit market opportunities or address threats.
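To make the costing logic concrete, here is a minimal sketch of the kind of activity-based allocation such an application performs, written in Python purely for illustration. The activity rates, figures and the customer_profit function are hypothetical assumptions, not IBM’s implementation; a TM1-based system would run this sort of calculation in memory against the fused bank data sources.

```python
# Minimal, hypothetical sketch of activity-based customer profitability.
# All rates and figures are invented for illustration; they are not
# drawn from IBM's offering.

# Cost per occurrence of each activity, derived in practice by dividing
# total indirect/shared expense by total activity volume (the core of
# activity-based costing).
ACTIVITY_RATES = {
    "branch_visit": 4.50,
    "call_center_call": 6.25,
    "online_transaction": 0.08,
}

def customer_profit(revenue: float, direct_cost: float,
                    activity_volumes: dict[str, int]) -> float:
    """Economic profit: revenue less direct cost and the customer's
    share of indirect expense, allocated by actual activity usage."""
    allocated = sum(ACTIVITY_RATES[a] * n for a, n in activity_volumes.items())
    return revenue - direct_cost - allocated

# What-if iteration: does a fee change that shifts a customer's usage
# from branch to online improve economic profitability?
base = customer_profit(1200.0, 700.0,
                       {"branch_visit": 24, "call_center_call": 10,
                        "online_transaction": 150})
shifted = customer_profit(1150.0, 700.0,
                          {"branch_visit": 4, "call_center_call": 10,
                           "online_transaction": 400})
print(f"base: {base:.2f}, after shift: {shifted:.2f}")  # 317.50 vs. 337.50
```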

As the Internet did, big data will change business models and drive the creation of new products and services. The dramatic drop in the cost of instrumenting machines of all types and connecting them to a data network (part of the concept of the Internet of Things) is already changing how companies manage their productive assets, for example, by optimizing maintenance using sensors and telematics to enhance uptime while minimizing repair expense. Decades ago, companies monitored production parameters to ensure quality. More recent technologies can extend the speed and scope of what’s monitored and provide greater visibility into production conditions and trends. Representatives from BMW were on hand at the conference to talk about their experience in improving operations with predictive maintenance. Centrally monitoring the in-service performance of equipment and capital assets is old hat for airlines and jet engine manufacturers. For them, the economic benefits of optimizing maintenance to maximize availability and uptime were significant enough to warrant the large investment they started making decades ago. The same basic techniques can be used for early detection of warranty issues, such as identifying specific vehicles subject to “lemon law” provisions. From IBM’s perspective, these new technologies will enhance the value of Maximo, its asset management software, by extending its functionality to incorporate monitoring and analytics that help users explore options and optimize responses to specific conditions.
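As a rough illustration of the analytic core of predictive maintenance, here is a minimal sketch that flags an asset when a sensor reading drifts well outside its recent operating baseline. The sensor, window size and threshold are hypothetical assumptions; production systems in the Maximo class apply far more sophisticated statistical and machine-learning models.

```python
# Hypothetical early-warning check for predictive maintenance: flag an
# asset when the latest sensor reading departs sharply from its recent
# operating baseline. Window and threshold values are illustrative only.
from statistics import mean, stdev

def drift_alert(readings: list[float], window: int = 20, k: float = 3.0) -> bool:
    """Return True if the latest reading lies more than k standard
    deviations from the mean of the preceding `window` readings."""
    if len(readings) <= window:
        return False  # not enough history to form a baseline
    baseline = readings[-window - 1:-1]
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and abs(readings[-1] - mu) > k * sigma

# Vibration telemetry (mm/s); the spike at the end should trigger
# an inspection before the component actually fails.
vibration = [2.1, 2.0, 2.2, 2.1, 2.3, 2.2, 2.1, 2.0, 2.2, 2.1,
             2.3, 2.2, 2.1, 2.2, 2.0, 2.1, 2.2, 2.3, 2.1, 2.2, 4.9]
print(drift_alert(vibration))  # True -> schedule maintenance
```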

IBM Watson is the company’s poster child for the transformative potential of big data and analytics for how organizations operate. The company’s objective is to enable customers to achieve better performance and outcomes by having systems that learn through interactions, providing evidence-based responses to queries. My colleagues Mark Smith and Richard Snow have written about Watson in the contexts of cognitive computing and its application to customer service. And we awarded IBM Watson Engagement Advisor our 2012 Technology Innovation Award. Conference presenters gave an extensive review of progress to date with Watson, featuring innovative ways to use it for diagnosis and treatment in medicine as well as to provide customer support.

Although this was not directly related to big data, IBM also used the conference to announce the availability of Cognos Disclosure Management (CDM) 10.2.1, which will be available both on-premises and in a cloud-based SaaS version. CDM facilitates the creation, editing and publishing of highly structured enterprise documents that combine text and numbers and are created repeatedly and collaboratively, including ones that incorporate eXtensible Business Reporting Language (XBRL) tags. The new version offers improvements in scalability and tagging over the earlier FSR offering. The SaaS version, available on a per-seat subscription basis, will make this technology feasible for midsize companies, enabling them to save the time of highly skilled individuals as well as enhance the accuracy and consistency of, for example, regulatory filings and board books. A SaaS option also will help IBM address the requirements of larger companies that prefer a cloud option.

Most of the use cases presented at the conference were extensions and enhancements of well-worn uses for information technology. However, when it comes to business, the bottom line is what matters, not novelty. Adoption of technology occurs fastest when “new” elements of its use are kept to a minimum. The rapid embrace of the Internet in North America and other developed regions of the world was a function of the substantial investment that had been made over the previous decades in personal computers and local- and wide-area communications networks as well as the training and familiarity of people with these technologies. Big data and the analytics that enable us to apply it have a similar base of preparation. Over the next five years we can expect advances in practical use that benefit businesses and their customers.

Regards,

Robert Kugel – SVP Research

I recently attended Vision 2013, IBM’s annual conference for users of its financial governance, risk management and sales performance management software. These three groups have little in common operationally, but they share software infrastructure needs and basic supporting software components such as reporting and analytics. Moreover, while some other major vendors’ user group meetings concentrate on IT departments, Vision focuses on business users and their needs, which is a welcome difference. For me, there were three noteworthy features related to the finance portion of the program. First, IBM continues to advance its financial performance management (FPM) suite and emphasizes its Cognos TM1 platform to support a range of finance department tasks. Second, the user-led sessions illustrated changes that finance departments can make to their core processes today, ones that improve the quality of those processes and go a long way toward enabling Finance to play a more strategic role in the company it serves. Third, the Cognos Disclosure Management product has better performance and useful new features to support the management of a full range of internal and external disclosure documents and the extended close, which I have discussed.

It’s customary for companies to produce a slew of press releases to coincide with big conferences or user group events. Thus it’s interesting that IBM made no such announcements in this case: Product releases either happened months ago or are scheduled for later this year. This was probably incidental, but the lack of hoopla also reflects a good read of the audience attending this event (which tends to be skeptical, especially of anything that smacks of sales and marketing bombast) as well as recognition that the market is still catching up with FPM suite capabilities that have been available for years. From a user or potential user’s perspective, what’s old is still new.

Our Financial Performance Management Value Index, in which IBM was rated Hot in 2012, evaluates suites of financial performance software rather than individual components. There is a long-running debate on whether companies should buy suites of software or individual components. I advise companies to take the suite approach unless components fall short of business requirements, because a suite can be – this isn’t guaranteed – less expensive to buy and maintain. It also may facilitate training and operations if there is a common interface and a single sign-on capability. A core element of IBM’s FPM product strategy is to emphasize its unified architecture to support a range of core finance department activities. This point was rarely stated explicitly at the conference, probably because people working in non-IT roles are more focused on the benefits that come with this approach. IBM’s architecture facilitates the integration of specific finance functions such as planning, budgeting, forecasting, statutory consolidation and creation of disclosure documents, as well as providing complementary capabilities such as performance management (including scorecards, dashboards and reporting) and analytics.

In particular, Cognos TM1 is enterprise planning software that helps manage the full planning cycle: business modeling, strategic and long-range planning, target setting, operational planning, budgeting and reviewing, along with the reporting and analytic functionality needed at each step. TM1 serves both midsize and larger companies. For the former, Cognos Express offers an integrated platform with standardized reporting, ad hoc analysis and planning, with an in-memory analytic server that uses a Microsoft Excel interface. Express is designed for smaller organizations with very limited IT capabilities. The in-memory architecture facilitates all planning and forward-looking activities, enabling organizations to run even complex, detailed models against large data sets quickly. The ability to rapidly iterate scenarios built on specific assumptions (as opposed to simplistic base, upside and downside cases) gives senior executives and line managers more forward visibility, letting them anticipate the consequences of specific business scenarios and the impact of potential responses to each.

Software that uses in-memory processing can change the design of planning, making it possible to create models that work as easily with the things used in running a business (units of materials or parts, hours of labor or purchase orders processed, to name just three) as with the financial and accounting aspects. In-memory systems could become the tipping point in how companies plan and budget in the future. When weekly or monthly operational reviews are able to focus more on assessing future operational alternatives and their financial consequences and less on historical accounting data, it will enable a fundamental shift in corporate management. The planning-budgeting-review-reforecast cycle will become more useful to those running a company in adapting to the changing currents of markets and economic conditions, and the emphasis will shift from budget conformity to achieving business success.
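To illustrate what a driver-based model of this sort looks like, here is a minimal sketch in Python. The drivers, rates and scenario names are hypothetical assumptions, and the code merely stands in for what a TM1 cube and its rules would compute in memory at far greater detail.

```python
# Hypothetical driver-based planning model: operational drivers (units,
# labor hours) flow through to financial outcomes, and whole scenarios
# re-run instantly. All drivers and rates are invented for illustration.

def plan(units: int, price: float, hours_per_unit: float,
         wage: float, fixed_cost: float) -> dict[str, float]:
    """Translate operational drivers into a simple income statement."""
    revenue = units * price
    labor = units * hours_per_unit * wage
    return {"revenue": revenue, "labor": labor,
            "operating_income": revenue - labor - fixed_cost}

scenarios = {
    "base": dict(units=10_000, price=45.0, hours_per_unit=0.60,
                 wage=28.0, fixed_cost=120_000),
    "softer demand": dict(units=8_500, price=43.0, hours_per_unit=0.60,
                          wage=28.0, fixed_cost=120_000),
    "automation investment": dict(units=10_000, price=45.0, hours_per_unit=0.45,
                                  wage=28.0, fixed_cost=150_000),
}

for name, assumptions in scenarios.items():
    print(f"{name}: {plan(**assumptions)}")
```

The point is not the arithmetic, which is trivial, but that an in-memory engine lets planners run this kind of recalculation across thousands of line items and many scenarios interactively rather than overnight.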

Another potential advantage of a suite is that it can simplify data management. (But keep in mind here that other data management strategies can achieve the same aim, and in practice organizations can do a poor job of managing data even with the best system architecture.) In the future, it probably won’t matter where a finance department collects its data, keeps its applications and stores its reports. For the moment, though, there is value in having a single system dedicated to the needs of the department to ensure data accessibility, consistency, timeliness and accuracy. A unified architecture also facilitates the creation and maintenance of a unified data set that keeps everyone working off the same numbers, which are more accurate, consistent and contemporaneous. In addition, IBM’s recent acquisition of Star Analytics has made Essbase data sources, such as those used by Oracle Hyperion, readily available to FPM users. Our research finds that data issues are a common impediment to the execution of business functions. For example, as the chart illustrates, fewer than one-third (31%) of participants in our long-range planning research said the data they work with is adequate.

User-led sessions at Vision 2013 focused on the nuts and bolts of achieving success in deploying FPM software. These demonstrations are a main reason why people attend user conferences. Here the sessions underscored the disparity in maturity we find in how companies approach financial performance management. They often pointed out the data and IT infrastructure challenges most companies face when attempting a transformative change in a finance organization. The common thread in these success stories was the need to change how some core process is managed. For this reason, the first key to success typically is executive buy-in coupled with repeated communication of the objectives. Promoting accountability is often an important motivation for FPM initiatives, and that requires accurate and consistent data to ensure buy-in. Although plenty of companies are proving the value of software in managing more effectively, we had enough conversations with those in the trenches to confirm that in areas like planning and budgeting, analytics and scorecarding, maturity levels are still low.

Today’s FPM suites are designed to require as little IT involvement as possible, a feature that all vendors emphasize. One session at the conference, however, served as a reminder that for most larger companies these systems are never “hands free.” Even well-designed software can be configured improperly or require modification as use evolves. For example, unless TM1 is properly configured, senior executives reviewing corporate plans as a deadline approaches could experience frustration because numerous last-minute adjustments to individual plans can bog down a system’s performance. The circumstances and fixes for these sorts of issues differ between software packages and companies. However, a universal best practice is to maintain an ongoing dialog between Finance and IT to address issues as they arise, along with an emphasis in IT organizations on uncovering these sorts of problems and addressing them quickly.

Turning to a specific product, I see Cognos Disclosure Management (CDM) as a welcome upgrade to Cognos FSR. Both products are designed to automate and streamline the process of composing and editing disclosure documents such as the Form 10-K annual report filed with the U.S. Securities and Exchange Commission (SEC) as well as tagging the documents using eXtensible Business Reporting Language (XBRL). I have been enthusiastic about this product category from the start because it facilitates the production of external disclosures, eliminates the need for people to handle repetitive mechanical tasks and promotes accuracy. It allows organizations to focus more on what goes into the disclosure by cutting the effort required to assemble the multiple components. (In general, however, our anecdotal sampling indicates that XBRL tagging is universally viewed as a compliance requirement without benefit to the company.) CDM makes it easier for public companies to handle the tagging process internally rather than having a third party provide this service. My conversations with users confirm that this approach gives them more time to complete their disclosure documents, provides greater flexibility in managing the process (especially in incorporating last-minute changes) and gives the CFO much greater control over decisions about which XBRL tags to use. The revamped CDM is able to handle more users, which is increasingly important as companies use it for more extensive reporting and disclosure activities such as internal reports (for example, board books) and external compliance filings that require the integration of text and numbers.
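For readers unfamiliar with what tagging entails, here is a deliberately simplified sketch of a single XBRL fact, generated with Python’s standard library. The element, context, rounding and figure are hypothetical, and real SEC filings involve full US GAAP taxonomies, namespace declarations, contexts and units that this fragment omits.

```python
# Deliberately simplified illustration of an XBRL-tagged fact. Real
# filings declare namespaces and define contexts/units; this fragment
# only shows what "tagging" adds to a number in a disclosure document.
import xml.etree.ElementTree as ET

fact = ET.Element("us-gaap:Revenues", {
    "contextRef": "FY2013",  # hypothetical reporting-period context
    "unitRef": "USD",
    "decimals": "-3",        # value rounded to thousands
})
fact.text = "1845000000"     # invented figure for illustration

print(ET.tostring(fact, encoding="unicode"))
# -> <us-gaap:Revenues contextRef="FY2013" unitRef="USD" decimals="-3">1845000000</us-gaap:Revenues>
```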

Well-designed and smoothly run user group meetings are a useful and efficient way for people who have made considerable investments in software to see what others have accomplished and to network with their peers to understand how best to implement change. The software is a consistent topic, of course, but for attendees the people, process and project management elements are equally important. Our benchmark research shows that a majority of finance departments have room (often considerable) to improve the quality and efficiency of their core processes and to better support their company’s strategic objectives. Software by itself is only one element, but it can be either an enabler or an impediment in efforts to improve finance department performance.

Regards,

Robert Kugel – SVP Research
