
Right after I posted my blog about the dearth of useful content for the line-of-business and finance audience at this year’s Oracle OpenWorld, I attended a truly useful session. (Of course, it had been shunted to the next-to-last time slot on the final day of the event.) It was a case study presented by AT&T’s tax group, discussing its use of Oracle Hyperion Financial Management to manage the corporation’s tax data.

The session focused on two of what I see as necessary ingredients for effective tax management – the people and information dimensions. While technology and process are also important to promoting more effective management of the tax function, I’ve found that in midsize and larger organizations, deeper functional integration of tax activities into finance processes and better tax data management are the bigger obstacles.

When it comes to the tax function, the “people” dimension almost always needs addressing. In most large organizations, the tax department is not well integrated with the rest of the organization – an issue I’ve commented on before. In this instance, I think one reason behind AT&T’s success in improving tax management was that the CFO (who previously was the controller) had a tax background, so the tax function was not ignored. From the start, the effort was driven by a cross-functional group including Tax, Finance and IT. Moreover, the integration of Tax into finance processes is deep: To eliminate unnecessary data complexity, AT&T tightly manages its chart of accounts (COA), and Tax participates in those decisions.

Data is another major barrier companies run up against when trying to improve tax processes and operations such as provisioning and planning. AT&T made data governance a key part of the project to streamline data collection for the tax function. Rationalization and maintenance of the chart of accounts are keys to managing finance department data. AT&T took the important step of making this investment in its data infrastructure, which pays off in time and money saved in just about any finance department process, not just tax. By aggressively consolidating wherever possible, AT&T was able to cut the number of accounts by about two-thirds. Moreover, whenever a new account is proposed, there is a tight review process in which Tax participates.

Another data-related rationalization took place at the legal-entity level, where the corporation had unnecessary complexity. Many midsize companies and most larger ones have complex legal-entity structures, meaning they have many different types of entities (such as corporations, limited liability companies and partnerships) with different ownership structures (full or partial ownership and potentially cross-shareholdings).

The combination of a complex legal structure and poor data governance puts a heavy burden on the tax department to execute filings and audit defense because of the manual data collection and multiple reconciliations required to accurately translate the data into a taxable entity structure. This applies to direct taxes (income), indirect taxes (such as sales and use or value-added tax) and property taxes.

In this case, both the complexity of the legal structure and the chart of accounts issues were partly the legacy of the corporate amalgamation of several of the regional Bell operating companies (RBOCs) and the “old” AT&T. Still, it’s a problem that many larger companies face. As with the chart of accounts, creating a new legal entity now requires a review that balances the needs of various parts of the business.

In addition to addressing structural issues, creating a tax data warehouse of record is another component that companies may find useful. (AT&T did not set one up specifically for its project.)

The AT&T presenters indicated that the project enabled the team to achieve several important benefits. By streamlining data collection and making appropriate use of their technology infrastructure, they were able to accelerate the closing process and cut the burden on the tax department. Reconciliations were minimized, and Tax gained greater visibility into operations. Even little things like naming conventions and adjustment techniques were standardized to ensure consistency, improve accuracy and facilitate later analysis and justification of tax positions. And now because executing the mechanics of tax functions is easier, the company has been able to redeploy tax resources to higher-value planning and compliance efforts.

I don’t want to take anything away from the value that a proper application of Oracle’s software can generate, but I think the same benefits that AT&T achieved could be realized using other vendors’ products to automate the financial consolidation, close and external reporting functions. What’s most important is understanding and dealing with the data challenges that tax departments routinely grapple with as well as integrating Tax with the overall finance function. I believe virtually any large corporation (those with 1,000 or more employees, by our definition) will benefit substantially from a project to rationalize its tax, chart of accounts and legal entity data.

Regards,

Robert Kugel – SVP Research

Risk has always been an integral part of business, but dealing effectively with risk is a progression. Indeed, history shows businesses adapting and coping better with risk through innovation. The importance of using information technology to manage risk is growing because today’s systems can automatically measure and analyze a much broader set of risk factors than individuals can, and do so more reliably. But a key challenge companies face in implementing enterprise risk management is developing a process for defining and measuring risk.

The objective of enterprise risk management is to optimize risk. By that I mean defining an organization’s risk tolerance and taking steps to minimize risk within that tolerance. Ideally, optimization is accomplished through a formal process of seven steps:

  1. Identification lists the relevant risks and defines their precursors. It answers these questions: What usually goes wrong? What is the source of the risk? What usually happens before something goes wrong?
  2. Analysis and quantification define the consequences of the risk, where the impact falls and who controls it under which circumstances, and estimate the cost and probability of the risk (a simple quantification sketch follows this list).
  3. Risk integration, a step specific to enterprise risk management, lists risks that are correlated across business units, identifies portfolio effects (where risks in individual business units may cancel each other out) and aggregates the risks within business units and across the enterprise.
  4. Assessment initially arrays the risks at the business unit level based on their cost and probability, refines those priorities based on management objectives, and then further refines them at the corporate level.
  5. Response requires a company to address each of the identified risks. Some risks it may take steps to eliminate entirely because the probability of occurrence is high and the consequences are severe. In other cases, it can reduce the impact of a risk by narrowing the probability that it will occur or by having responses in place to mitigate its effects. Or, based on a cost/benefit calculation, it can insure the risk fully or in part with third parties or self-insure it.
  6. Monitoring involves implementing continuous and consistent methods of tracking risks, reporting and alerting when these risk events (or their precursors) occur and measuring and assessing responses to them.
  7. Review is a periodic, fact-based secondary assessment because risks themselves are not static and all organizations learn from their successes and failures in identifying and dealing with risks.
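
To make steps 2 and 3 concrete, here is a minimal sketch in Python of probability-weighted quantification and a simple roll-up by business unit. The risk items, field names and figures are hypothetical, and the plain sum deliberately ignores the correlation and portfolio effects a real integration step would have to model.

```python
# Illustrative sketch only: a minimal model of steps 2 and 3 above
# (quantification and integration). The Risk class, its fields and the
# example figures are hypothetical simplifications, not a standard.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    business_unit: str
    probability: float  # estimated likelihood over the planning horizon (0-1)
    impact: float       # estimated cost if the risk event occurs

    @property
    def expected_cost(self) -> float:
        # Step 2: quantification as probability-weighted cost
        return self.probability * self.impact

def aggregate_by_unit(risks: list[Risk]) -> dict[str, float]:
    # Step 3 (simplified): roll expected costs up within each business unit.
    # A real integration step would also model correlations and portfolio
    # effects, which a simple sum ignores.
    totals: dict[str, float] = {}
    for r in risks:
        totals[r.business_unit] = totals.get(r.business_unit, 0.0) + r.expected_cost
    return totals

risks = [
    Risk("Key supplier failure", "Manufacturing", 0.10, 5_000_000),
    Risk("FX exposure on euro receivables", "Treasury", 0.30, 1_200_000),
    Risk("Data center outage", "IT", 0.05, 8_000_000),
]
print(aggregate_by_unit(risks))
```

Ranking the resulting expected costs is one simple way to begin the prioritization described in step 4.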

This is a comprehensive model, but, alas, few corporations undertake this sort of rigorous risk management effort. Most set their risk parameters through a potpourri of explicit policies or more often by less formal means. And even in those cases, most companies don’t establish the appropriate metrics for these risks and therefore have a difficult time monitoring them.

Short of the major effort of overhauling a corporation’s attitudes and practices, the next best way to improve enterprise risk management is to focus on establishing key risk indicators on a bottom-up basis (defining risks and their appropriate metrics) and incorporating risk explicitly in performance management processes. Even without a rigorous, company-wide effort, companies should create key risk metrics for individuals and business units. Using them, executives and managers can assess the performance of individuals or business units in a way that explicitly takes risk into account.

“Risk-adjusted performance” is a concept central to investment management. Portfolio managers are assessed on their risk-adjusted returns, not their absolute returns, because they can show superior results by taking above-average risks – but usually only for a while. Risk adjustment is a way of handicapping performance so that the returns of managers taking average or below-average risk are measured on a common scale with those making chancier bets.
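
As an illustration of this handicapping idea, here is a small sketch using one common risk-adjusted measure, the Sharpe ratio (excess return per unit of return volatility). The post does not prescribe a particular measure, and the portfolio returns below are invented.

```python
# Illustrative sketch: the Sharpe ratio as one common risk-adjusted measure.
# The return series are made up for the example.
import statistics

def sharpe_ratio(returns: list[float], risk_free_rate: float) -> float:
    excess = [r - risk_free_rate for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Manager A earns more in absolute terms but takes far more risk.
manager_a = [0.12, -0.08, 0.25, -0.05, 0.18]  # volatile, higher average return
manager_b = [0.07, 0.05, 0.08, 0.04, 0.06]    # steadier, lower average return

for name, rets in [("A", manager_a), ("B", manager_b)]:
    print(name, round(sharpe_ratio(rets, risk_free_rate=0.02), 2))
```

In this made-up example the steadier manager B shows the higher risk-adjusted figure despite the lower absolute return, which is exactly the kind of handicapping described above.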

Similarly, focusing only on business objectives without explicitly considering risk can produce results that are not in the best interest of senior executives, the business owners or employees as a whole, as I pointed out in an earlier blog.

Another factor contributing to the neglect of enterprise risk management is that purveyors of balanced scorecards typically leave risk out. The balanced scorecard emerged as a way to address the unintended negative consequences of simplistic performance measurement systems that focus on one or a few metrics. The scorecards are “balanced” because they incorporate metrics that model the kinds of trade-offs that intelligent executives or managers would want their direct reports to make. If, for example, a call center measures only call times, customer satisfaction will suffer because agents will attempt to get callers off the phone as soon as possible, regardless of whether their questions have been answered or their issues addressed. A balanced scorecard therefore would include first-call-resolution percentage as a compensating metric to call times. Similarly, risk should be considered in assessing how well an individual or business unit has done. Doing so provides a more balanced evaluation of performance and focuses individuals on key risks and their relative importance.
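
To show how a risk metric can sit alongside the compensating operational metrics, here is a hypothetical call-center scorecard sketch. The metric names, targets and weights are invented for illustration and are not drawn from any particular scorecard product.

```python
# Illustrative sketch of a call-center scorecard that adds a key risk
# indicator alongside operational metrics. All figures are hypothetical.
scorecard = {
    # metric: (actual, target, weight, higher_is_better)
    "avg_call_time_min":         (6.5, 7.0, 0.3, False),
    "first_call_resolution_pct": (82, 85, 0.4, True),
    "compliance_exceptions":     (3, 2, 0.3, False),  # key risk indicator
}

def composite_score(card: dict) -> float:
    score = 0.0
    for actual, target, weight, higher_is_better in card.values():
        # Score each metric as performance relative to target, capped at 120%.
        ratio = actual / target if higher_is_better else target / actual
        score += weight * min(ratio, 1.2)
    return score

print(round(composite_score(scorecard), 3))
```

Weighting a key risk indicator such as compliance exceptions the same way as call times and first-call resolution keeps a team from improving the operational numbers at the expense of risk.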

Most companies don’t need new software to implement enterprise risk management. Whatever systems they use to collect and report data will do the job of collecting and disseminating risk data and risk metrics. If they have a scorecard application, they can incorporate key risks into it. Implementing risk management requires executives to participate so the appropriate attention is paid to defining key risks, determining how to measure and monitor them, and ensuring complete data is available for this purpose. In good times, disasters and scandals only briefly raise awareness of dangers to the business. Challenging economic environments, such as the one we’re in today, tend to focus executives’ attention on risk. There’s no better time to deal with its implications.

Best regards,

Robert Kugel – SVP Research
