
Business software is beginning to undergo a design revolution comparable to the seismic shift from the green screen to the graphical user interface (GUI) that began in the mid-1980s. Three forces are at work. One is the retirement of large numbers of members of the baby-boom generation and the rise of a generation that grew up with computers and computer games from a young age. Another is that software and technology vendors have recognized the need to “consumerize” business applications by adopting mobile device interactions, gestures and other newer user interface (UI) conventions, and are incorporating these innovations into their stodgy products. I commented on this in my assessment of Tidemark early this year. A third factor, “gamification,” is all the rage in business consulting circles. The idea is to engage younger employees more completely by transforming dull, routine chores into more entertaining pursuits. I join with those skeptical of just how fun one can make clerical tasks. But software can – and should – be made less tedious (and therefore more productive), especially for a new generation of users.

I saw evidence that the generational shift is upon us while walking around this fall’s Dreamforce and Oracle OpenWorld conferences, which were held just weeks apart. It struck me that the crowd at the 2012 Dreamforce was younger and its energy level higher. While the focus of Dreamforce has shifted more to those working in IT rather than line-of-business roles (and in this respect the event is increasingly like OpenWorld), I found there a greater sense of fundamental change in the design and use of business computing, even (and maybe especially) when you stripped away the cloud ballyhoo. The applications I saw demonstrated seemed more fluid, agile and easier to use. By contrast, many of the newer business applications on offer from Oracle have a stale look and feel.

Supporting the design revolution over the next several years will be evolving information technology. One element is HTML5, which enables richer, more capable web-based and mobile software. A second is the growing number of vendors offering low-cost computing infrastructure in the form of platform-as-a-service, which significantly reduces barriers to entry for startup software companies because up-front costs are low and businesses face fewer constraints in scaling up. Third, there are the ongoing gains in the price and performance of computer processing power and memory. It is – and will continue to be – difficult to say which of these is most important, since each will feed off of and drive the others.

Until the 1990s, the market for business applications was pretty small. The client-server revolution had a profound impact on the design of business software, making it easier to work with and more flexible than the mainframe applications that preceded it. Much of this change was driven by a switch to relational and multidimensional databases, easier-to-use development tools, and the adoption of graphical user interfaces, which permitted event-driven programming. These underlying technologies made it easier for people to do computer-related jobs. With that came a change in how users expected to work with business software and the information they expected to get from their systems. The new generation of technology was easier to use, more flexible and more powerful. This, along with the increasing homogenization of infrastructure elements and buyers’ demand for open standards, contributed to the decline in the cost of developing applications and drove demand for more off-the-shelf applications.

Today we’re on the cusp of a similar generational change as the evolution of underlying information technology drives a major redesign of business software. To oldsters, the coming shift in business computing may be welcome, disconcerting or both. If you’re a baby boomer, you probably can remember what things were like before the client-server era and maybe what life was like before personal computers. The major shifts that took place in the 1990s are about to be repeated in ways that are subtle individually but fundamental in aggregate. People entering the workforce today and people who are taking leadership positions in companies have a different set of experiences and expectations in dealing with computing devices and technology than the boomers do. CIOs who came of age in the 1990s – and even some millennials – will need to unlearn old certainties, or the organizations that employ them will be forced to replace them.

I’m pretty sure that accounts receivable or order entry will never be fun for all but the chosen few. For the rest, I’m equally certain these processes can be far less painful to execute. More fluid interactions with the software, less burdensome training, easier collaboration and greater adaptability to personal preferences are all feasible and increasingly essential for the coming releases of business applications. I also foresee increased automation to improve efficiency and reduce processing errors in these sorts of purely mechanical tasks. The result will be that, more than ever, executives and managers will need to rethink how work is performed in their parts of the business. Corporations must automate as much of the purely mechanical aspects of work as they can. This is especially true in the finance function, which still grinds out work that can and should be handled hands-free by software. In the process, companies must shift the focus of what people do to tasks that require knowledge, insight, perspective and judgment – things that (for now at least) are not easily supplied by IT. That would make back office work “funner,” if not exactly fun.


Robert Kugel – SVP Research

For the past several years Ventana Research has focused more on analytics and their importance to improving business performance. We’ve done extensive benchmark research in business analytics, detailing how they are used in business generally, in major functional areas of companies and in specific industries. We adopted this focus because technology advances are changing the landscape of analytics. Their use in business management, for example, is getting new scrutiny these days because of three important changes in information technology.

One change is the increasing wealth of data that companies can use. It’s not just the data now available in the cloud. Over the past decades, organizations have implemented a range of systems for managing core business processes and collecting the data that go with these processes. ERP and CRM systems were among the first, but especially in larger companies, just about every function and every department uses some system that collects data. Almost all of these systems store this information in ways that make it feasible to access it. Second, so-called big data is making it possible for organizations to process much larger data sets than ever before to gain intelligence and insight into business operations and markets. Third, in-memory data processing is enabling companies to get immediate answers to queries, even through complex analyses of very large data sets, rather than having to wait minutes, hours or days. This accessibility changes the dynamics of planning and review meetings, for one thing, because it enables a far more fluid and interactive dialog around the questions “Why did we get the results we got?” and “What should we do next?” than has been the case in the past.

Yet all of this progress shouldn’t obscure the enduring value of simple ratio analysis. This technique for understanding business performance predates even the adding machine, going back centuries. Although it is widely used in the finance function, I think most companies today underutilize ratio analysis. Our benchmark research in finance analytics shows that finance groups do a good job with basic, well-established metrics such as profit margins or days sales outstanding (DSO) as well as debt and liquidity measures. But they – and the rest of the organization – do less well in monitoring and reviewing ratios that combine financial and nonfinancial data, especially where these involve key performance measures. Ratio analysis can help here.
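To make the arithmetic behind one of these well-established metrics concrete, here is a minimal sketch of a DSO calculation. The function name and all of the dollar figures are hypothetical illustrations, not figures from the research:

```python
def days_sales_outstanding(receivables, credit_sales, period_days=365):
    """DSO: roughly how many days of credit sales are tied up in
    accounts receivable over the period."""
    return receivables / credit_sales * period_days

# Hypothetical figures: $1.2M in receivables against $9.6M in annual credit sales
dso = days_sales_outstanding(1_200_000, 9_600_000)
print(round(dso, 1))  # → 45.6
```

The same division works for any of the basic finance ratios the research mentions; only the numerator, denominator and period change.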

Ratio analysis is particularly useful for assessing the efficiency of processes and the effectiveness of results because, at its core, business is about converting inputs into outputs. Efficiency can be measured in ratios such as pounds or kilos of steel or direct labor hours per completed product unit. Indirect cost efficiency also can be measured as a ratio, such as the number of full-time equivalents (FTEs) employed per 1,000 invoices processed. Effectiveness can be measured by, for instance, the percentage of repeat customers, manufacturing defects per 100 units or, in customer support, the percentage of first-call resolutions.
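All of these efficiency and effectiveness measures share one shape: a numerator of inputs or outcomes divided by a denominator of outputs, optionally rescaled. A sketch, using made-up operating figures purely for illustration:

```python
def per_unit(numerator, denominator, per=1):
    """Generic input/output ratio, optionally scaled (e.g. per 1,000 invoices)."""
    return numerator * per / denominator

# Efficiency: 6 FTEs processing 48,000 invoices a year
ftes_per_1000_invoices = per_unit(6, 48_000, per=1_000)   # 0.125

# Effectiveness: 18 defects across 7,200 units produced
defects_per_100_units = per_unit(18, 7_200, per=100)      # 0.25

# Effectiveness: 4,320 of 5,400 support calls resolved on the first call
first_call_resolution_pct = per_unit(4_320, 5_400, per=100)  # 80.0
```

The point of the common shape is that the same monitoring and review machinery can track financial and nonfinancial ratios side by side.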

Finance departments tend to focus on financial ratios and overlook operational ones, which may be viewed only by that specific part of the business. Thus, a periodic assessment of the profitability of a particular retail store may include only revenue and costs. However, without looking at gross profit per sales employee and/or average revenue per sales employee, it’s difficult to distinguish between the direct and indirect factors that determine store profitability.
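Layering the per-employee ratios on top of the plain revenue-and-cost view might look like the following sketch; every number here is hypothetical:

```python
# Hypothetical monthly figures for one retail store
revenue = 600_000
cost_of_goods = 420_000
sales_employees = 12

# The plain profitability view stops here
gross_profit = revenue - cost_of_goods                     # 180_000

# The per-employee ratios add the operational dimension
gross_profit_per_employee = gross_profit / sales_employees  # 15_000.0
revenue_per_employee = revenue / sales_employees            # 50_000.0
```

Comparing these per-employee figures across stores is what lets a reviewer separate staffing-driven effects from other drivers of store profitability.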

Because they measure the relationship between inputs and results, ratios are especially useful as quantitative performance metrics. Potentially, there are thousands of these ratios that a company can use for setting objectives, monitoring results and assessing performance. However, it’s important to focus on the “key” performance ratios – those that have the greatest impact on the results of individuals, business units and the company as a whole. Companies can have a difficult time identifying their key factors. This is where driver-based modeling and planning come in because the process of creating these models sorts out the important from the marginal measures.

The use of advanced analytics is growing in importance as technology provides companies another way to achieve an edge on their competitors. At the same time, it’s critical that executives and managers build on the basics. If an organization cannot formulate the most important ratios that define business performance, and if it cannot readily access the data needed to perform this simple division, it’s unlikely to be able to handle large sets of data effectively and benefit from more advanced analytic techniques. Instead it is likely to wind up experiencing the “big garbage in, big garbage out” syndrome.


Robert Kugel – SVP Research
