Leading the Acceleration of Digital Transformation – Part 4A: How Do We Get The Value For Our Investments?

This is the continuation of the fourth part of an ongoing series, where I’m using the framework of WHY, WHAT, HOW, WHO from my book Blind Spot: A Leader’s Guide To IT-Enabled Business Transformation – enhanced with stories and analogies – to help business leaders understand the Digital and IT transformation journey in very simple business terms.

The crisis of the last few months should have clarified WHY we need to change, and that the time is NOW (Part 1). Only a few senior leaders are still wondering about or resisting that premise. One real challenge, for the entire executive team, is to think about WHAT will make us faster and more flexible, which is “Business Agility” (Part 2). Of equal importance this time, as I wrote in Part 3A and Part 3B of this series, HOW we architect and build modern systems matters – a lot – because we can’t have Business Agility if we keep adding to our Technical Rigidity. And, in order to build it right and see the Business Agility value come to fruition, we have to finance and manage our IT investments the right way.

Part 4A: How Do We Get The Value For Our Investments?

Most big, industry-leading companies have been around for many decades and have been embedding technology in their operations for 30-40 years. The way we historically planned, justified, and architected our IT investments has played a huge role in WHAT we built, HOW we built it, and whether or not we realized the business value. Those financial practices have also contributed, in many cases, to our Business and Technical Rigidity, which is sometimes referred to as “Technical Debt”. Nobody set out to build this complexity. It just happened one era, one idea, one project, one downturn at a time.

A brief history drawn from my “55 years of hard road” as an active, operating CIO, and from working with many other CIOs and CFOs, can illustrate the journey that got us here and show what we learned that can be helpful going forward.

Generally Accepted Accounting Principles (GAAP) and Functional Excellence

Early in my career, I was very fortunate to work for two great companies, IBM and PepsiCo/Frito-Lay, that really invested in training and broadening the perspectives of their leaders and people in general. When I worked for IBM, they sent me to a Harvard Business School class to understand how CEOs and CFOs think about growing sales, reducing expenses, and creating shareholder value. Later, when I was CIO at Frito-Lay, I had to take a required (for all executives) class called “Finance for Non-Financial Leaders”. These two training events helped me understand, appreciate, and become very thoughtful about how to frame IT investments as part of a larger picture of how the business makes money.

The history of Finance and Accounting goes back thousands of years to ancient times. And, every few centuries, there were innovations in practices that, gradually and steadily, became more common. However, it wasn’t until the 1930s that Generally Accepted Accounting Principles (GAAP) were created, aimed at the wrongdoing and inconsistent practices that led up to the market collapse of 1929. GAAP was meant to establish principles (consistency, prudence, good faith, materiality, etc.) and to require publicly traded companies to follow them.

For decades since then, GAAP has been consistently applied and measured. During the era of the great industrial expansion in the 20th century, among other benefits, these common practices and principles enabled companies to invest in building plants, distribution centers, fleets, and new product innovation in advance of actually seeing the return or value of those investments realized. This approach to investments was perfect for that era. But, to make it work, each investment also required “a deal” between the CEO, the CFO, the Board of Directors, and the functional/departmental leadership – “If I give you this capital investment now, how will you give me these financial and value returns in the future?”

That was a straightforward question, easily answered by most of the “physical” functions – like Manufacturing, Distribution, Operations, and Stores – when they presented their business cases for new or modernized plants, distribution centers, locations, track, trucks, or planes. In addition to asking for investment in very obvious and tangible ASSETS, these functions had the advantages of being mature and having evolved over centuries. They were very familiar and comfortable concepts for senior executives, Boards of Directors, and shareholders to think about and bet on.

However, the same questions were very difficult to answer for Research & Development and Marketing investments. Although there were some facts and data involved, there was also a great deal of intuition and risk tolerance required in those areas. For these and other functions or professions that were new in the 20th century and still evolving, results – or returns on investments – were hard to predict and quantify. The same was true for the area of Information Technology as it emerged in the corporate world in the 1960s.

How Accounting Practices Shaped Our Legacy IT “Hairball”

When IT (or Data Processing as it was known at the time) was first introduced, it was treated differently than the investments in the durable, “physical” assets I just described. IT was declared an “overhead expense”, placed in the General & Administrative line of the P&L, and treated as an annually budgeted item. That made sense at the time because, in the 1960s and 1970s, the mainframe computer was basically the Accounting machine for most companies. For example, we were writing Payroll, Accounts Receivable, and Accounts Payable systems to replace, in many cases, entire floors, or even buildings full of clerical staff. The “deal” (“I’ll give you this now if you’ll give me that in the future”) was perfect and self-contained to G&A. The investments came from and the benefits accrued to the G&A part of the P&L and the CFO/Finance/Accounting part of the organization. Life was good and IT, in most companies, reported to the CFO. But this annually budgeted, G&A “overhead” expense treatment started to cause problems pretty soon thereafter.

The scope of IT quickly got wider and started to impact multiple departments across the business. By the late 1970s and throughout the 1980s, the mid-tier computer was introduced to the corporate market and began to be applied in Manufacturing, Distribution, and Sales to drive productivity in Cost of Goods Sold (COGS) and Selling Expense. The world got more complicated for IT investing because those P&L categories didn’t have an underlying expense line item for spending on technology. So, those department heads had to come to the IT department to request an investment in technology from the G&A “bucket” while expecting returns to show up in the COGS and/or Selling Expense “buckets”.

Many companies just got in the habit of saying “no”, during budget season, to more IT spend, because “overhead” (G&A) was generally perceived as “bad” by Wall Street – and therefore the Board of Directors. In many of those companies, the IT “overhead” group was excluded from the conversation about technology in, for example, Manufacturing because “we can’t let the overhead expenses grow”. This financial approach caused the proliferation of technology. Groups like Manufacturing (and others) were influenced and enabled by the financial system in their companies to make the investments themselves – one decision at a time for one plant at a time. And, in the short term, most were successful at bending the labor and facility cost curve for each of their largely stand-alone plants.

But, with each department making its own technology decisions and purchases within its “budgets” or P&L line items, and IT getting left out of even the technical or vendor conversations, those early moves created an unintended consequence: the unmanaged proliferation of heterogeneous systems and data.

That led to real complexity by the mid-1980s and the 1990s when, for example at Frito-Lay, nationwide growth and increasing desire for operational and financial leverage required all plants to be connected to each other, to all distribution centers, and to all sales locations in a network. And, everything had to also be connected to the original accounting systems. This was just an example of a pattern that started to emerge in big corporations. Although still important and necessary, excellence within one function or department was no longer sufficient. Continuing to gain efficiency, expand margins, compete, and grow was beginning to also require cross-functional and cross-business-unit integration. The challenge with that requirement was that no department heads really owned, or were held accountable for, anything across functions. The only exceptions were Finance and IT. The pressure to integrate systems across functional silos was mounting, but our approach to budgeting and accounting – departmental cost centers, IT as G&A “overhead”, and Annual Operating Plans (AOP) – made it difficult to architect and deliver those systems the right way.

In that same timeframe, the Personal Computer (PC) was introduced, and impatient business teams hired vendors and built “lightweight, little” PC-based systems. They could innovate quickly and inexpensively because, now, they could use their own budgets to buy their own computers for just a few thousand dollars. With the much lower price point of PCs and LANs (Local Area Networks), many more departments and even smaller teams got into the game with technology independence. Now, they didn’t even have to go to Finance for permission, and they certainly didn’t have to collaborate with IT on those urgent, smaller-scope technology solutions they wanted. There was a rapidly growing autonomy and expectation of speed.

But, again, the problem or collateral effect was caused by their eventual need to “plug into” (share data into and pull data from) the main IT systems – which were now becoming a big tangled “hairball” of one point-to-point connection at a time. The evolution, one decision at a time, in this era, created the first round of legacy complexity, cost, and rigidity.

New Technology + Same Accounting Practices = More Complexity, Rigidity, Risk

That legacy complexity continued to grow over time through subsequent eras as we experienced and tried to benefit from accelerating technology innovation with very little change in our financial approaches rooted in GAAP and short-term P&L and budget management.

By the mid-1990s, it became clear that most companies were leveraging technology in ways that went way beyond automating and streamlining other G&A functions. The big money in IT was now being spent on Operations, Supply Chain, and Distribution. We had systems to run those functions and make them each, independently, functionally excellent in their silos. But we had gotten there one good idea at a time, from one vendor’s product at a time, with one project at a time. Then, in order to really gain efficiency and scale from the cross-functional integration that had now become a standard business strategy for most companies, we had attempted to connect all of these independently created systems with a point-to-point (or “hairball”) architecture. The resulting rigidity, difficulty of change, fragility, and cost were mounting.

Also, as we approached the late 1990s, Y2K fears began to really bring the risks we had inadvertently created to the forefront of everyone’s mind. This spawned the “let’s replace everything with a big ERP that promises the world” era. Those big systems vendors promised to provide a silver-bullet solution to the Y2K transition AND a seamlessly integrated cross-functional solution for the operations and financial parts of the business. But those big ERP system replacement investments didn’t work out so well for most companies. Those projects themselves were beasts – cost overruns, timeline delays, short-cuts to meet deadlines, too much customization to try to match rich IP embedded in old home-grown systems, and more “hotwiring” to connect the big new systems with the existing functional systems and point-to-point connections that didn’t go away.

As we moved into the 2000s, the focus of much of the new technology investment in most companies shifted externally to customers and suppliers. With the growth of the internet and (B2C and B2B) e-commerce, faster networks, and smaller devices, the application of technology needed to expand beyond the four walls of a corporation. This meant we had to now “plug in” or “hotwire” the diverse systems and data from other companies and even from consumers. Also, as the competitive battleground shifted more and more to customer experience and service, we had to continue to get better and better at cross-functional integration.

Technology had become pervasive within companies and across company lines into ecosystems. Yet, most CIOs and CFOs continued to deal with financial planning with the same short-term (annual) and bottom-up (project-by-project, department-by-department) approaches and with the same accounting classifications (IT = G&A “overhead”) that started and made sense in the 1960s. The short-cuts kept happening; the “hairball” kept growing; the speed-to-market got slower; and, the cost of IT kept rising.

To combat the rising costs of technology, many companies formed Shared Services organizations and moved to offshore and/or outsource (“your mess for less”) IT. They were looking for leverage and productivity, but they just got temporary reductions in cost that eventually grew back to their original levels and then even began to escalate again. This happened because those financial and organizational solutions – on their own – didn’t deal with the underlying systems complexity that had been caused by decades of “architecture by project and AOP budgeting” and continued to grow because the approach to IT investments had not fundamentally changed.

The real breakout for technology came with the introduction of the iPhone in 2007. This signaled the end of the Industrial Era and the beginning of the Digital Era. Now, “sense-and-respond” IT has become truly pervasive and is critical to thriving in the future. Most big strategies – Customer Experience, Omnichannel, New Product Innovation, M&A, Geographic Expansion, Bending the Cost Curve, etc. – are all about how a company works and creates value across functions. The global, consumer- and ecosystem-driven economy is being built on sensors, robotics, big data, GPS, and smart devices. We are also in an era of continuous transformation, and we need to sustain investment in ongoing change over multiple years, indefinitely into the future.

Digital native companies – like Uber – didn’t have to build up complex, expensive-to-maintain, rigid, and risky legacy bases because they didn’t live and grow through the eras I’ve described here. But the larger, more mature, incumbent industry leaders have that history and that burden to deal with. So, we have to do things differently as we build this next era of modern, digital capabilities. We can’t integrate across functional silos with the “hairball” architecture of the past, nor can we build those new things on top of our old legacy bases without modernizing what we have so that it all works together – at speed and scale. The simplification and modernization we must do will take some investment, but that investment should, like any other investment, show returns over time. If done the right way, dealing with our legacy systems should yield IT productivity (speed, cost efficiency) that will reduce Run or Baseline costs and free up more of our IT dollars to focus on truly net-new and strategic capabilities.

The big strategies are “cross-functional”, business and technical “agility”, and “continuous transformation”. Our systems need to reflect and enable those strategies. In order for that to happen, we have to continue to evolve our financial approach to HOW we think about investing in IT and realizing the expected value of those investments.

Technology Change Starts with New Approaches for HOW to Invest in IT

If we keep doing what we’re doing, we’re going to keep getting what we’re getting.

It’s past time to evolve our Industrial Era version of accounting for IT as “overhead” and of short-term, project-by-project, bottom-up IT budgeting on an annual basis. As IT has moved from a subset of G&A focused on automating Finance and Accounting systems, to an underlying “Digital Fabric” enabling modern, digital business, I see a lot fewer CFOs using the old G&A model of thought for IT spending. Progressive CFOs, CEOs, and Boards are open to thinking about IT investments in a new way.

However, most companies we meet today are still struggling to find the right modern economic or financial model for IT or technology. Maybe the pendulum has even swung too far to the other side. In fact, I see CEOs and CFOs willing to make big IT investments, kick off several major initiatives, ask and try to answer the tough questions about what “digital” means to their business, and build organizational capabilities. Many companies are adding Chiefs (Chief Digital Officer, Chief Product Officer, Chief Customer Officer) to try to find their way. As we work with clients, and meet business and IT leaders in our leadership development classes, we often hear stories of as many as 40-50 concurrent “transformations” underway – Omnichannel, Blockchain, Cloud, Scaled Agile, Artificial Intelligence, Offshoring, Outsourcing, and so on – but not planned or organized in any coordinated, properly architected, or sequenced way. So, those executives who are willing to spend still have concerns about whether they are spending the money the right or best way and whether or not they will see the value they really need.

This is where the true partnership of the CIO and the CFO is so critical. This is not about the CIO talking the CFO into spending more money on IT – most companies spend enough or even more than enough. We’re just not spending it in the right way.

I believe that we already have a model for HOW to invest in IT so that we can achieve the Business Outcomes, Architectural/Technical Outcomes, and Organizational/Productivity Outcomes that we need in the Modern Era. We just need to look at our tried and true approaches to investing and getting value from our traditional, durable, core assets like plants, trucks, stores, planes, and facilities. IT, now and going forward, is and should be planned and invested in as a durable, core, pervasive-fabric-of-the-business asset. In my next blog post in this series, I’ll share ideas, proven practices, stories, and analogies that should help CIOs, CFOs, CEOs, and Boards align on this better approach.

Author: Charlie Feld, Founder and CEO, The Feld Group Institute

Further Learning

If you are interested in learning more about this topic, additional resources can be found on our site, including Leading The Acceleration Of Digital Transformation – Part 1: The Time Is Now, Leading The Acceleration Of Digital Transformation: The Only Real Strategy Is Sense-And-Respond At Speed And Scale – Part 2: The WHAT Is Business Agility, Leading the Acceleration of Digital Transformation – Part 3A: The How – Built to Last Because It’s Designed for Change, and Leading the Acceleration of Digital Transformation – Part 3B: The How – Built to Last Because It’s Designed for Change.


