The Future of Software

By Michael Tanner, Managing Director, The Chasm Group

Congratulations! By adapting to another post-bubble year and thriving through a difficult economic period, you have undoubtedly become stronger in many ways. By making thoughtful and serious changes to your business, you’ve set the stage for continued growth as the economy recovers. If you are like a lot of people I speak with daily, you are now optimistic, energized and hopeful about the year ahead! But you also have some trepidation about the short-term impact of STINKO (Security, Terrorism, Iraq, North Korea and Oil), and wonder when IT spending and the outlook for the software industry overall will improve. When will things just start to get easier?

Putting aside economic uncertainty for a moment, there is a macro-level change occurring in the software industry that we all need to be thinking about. A transformation is happening before our eyes, but because it is occurring relatively slowly it is difficult to see. I believe that many executives in our industry have yet to reckon with this change. Understanding its implications will start to answer the question of when things will get easier. Unlike STINKO, the likely results are much more predictable, and in fact can already be seen affecting most of our businesses right now. The transformation I am referring to is the maturing of our industry from the “go-go” hyper-growth stage to one that is beginning to look increasingly like other mainstream businesses. The stage we are performing on during the next few years will be increasingly defined by this change.

In the past, each time we approached a semi-stable state, Moore’s Law thankfully saved us by throwing the software industry back into total chaos. Faster processors, storage and memory created opportunity for each new technology wave to re-invent the software business from the bottom of the IT stack upwards. Computing power was rarely enough to solve the business problems of the moment, so each new wave created an opportunity to expand into yet another new IT infrastructure using the latest technology. When customers implemented a new solution, they necessarily built upon some level of proprietary user interfaces, databases, middleware and applications. So the cost to switch vendors after each wave of change became increasingly high, and the more users who adopted, the higher the customer’s cost to switch became. More importantly, the control that each vendor maintained over the customer’s IT environment, and the assurance that a customer would not easily change vendors, kept software margins and prices high - even though the actual deliverable to the customer was largely a tape, disk or CD with a unit cost of just a few dollars. This in turn fueled the software vendor community’s ability to spend on R&D, completing the cycle of growth.

Unfortunately, in the transformed world to come, Moore’s Law may not save us. With rare exception, we’ve reached the point where faster processors, faster displays and more disk space are no longer an urgent barrier to getting our jobs done. For most of us, the computing horsepower we already have is increasingly just fine. Beyond this, as my colleague Geoffrey Moore recently pointed out in an article for “Global Agenda - The Magazine of the World Economic Forum,” the economics of chip/semiconductor design may be slowing the Moore’s Law phenomenon down faster than the technology itself. According to Moore (Geoffrey), the cost of designing masks for newer, faster chips has reached the point where it has become a barrier to bringing out new semiconductor designs. There just are not that many markets big enough to support investment in ever-faster processing when most of us already have pretty darn good solutions. So even if the technology behind Moore’s (Gordon) Law will go on for another 10 years, the economics are an increasing impediment.

Next, the level of abstraction in software architecture, reusability and standardization has reached the point where increasing commoditization of software is almost certainly predictable. It used to be that the user interface itself was a big switching cost and inspired customer loyalty. Today, in a web environment, the user interface is fairly consistent from an end-user perspective. It used to be that developing an enterprise application required substantial investment in a scalable platform architecture for deployment. Today, the emerging oligarchy among just a few industry titans, coupled with standards for application development and data integration, continues to reduce the number of platform choices dramatically. Moreover, the emergence of just a few market-leading platform vendors who maintain a complete enabling-technology stack has lowered the cost of entry for new application developers relative to just a few years ago.

As vendors in the ruling oligarchy continue to fold integration, user interface, database, wireless, business process management, development tools, middleware, etc. into their core offerings, both the entry cost for application developers and their relative differentiation from one another have dwindled. And as the application providers themselves march head-on toward a web-services world, it becomes increasingly easy for end-users to switch in and out of the modules/services provided, which continues to make technology differentiation harder to see than in the past.

Another major change is on the end-user side of the business. The very switching costs that protected the software industry in the past now make IT professionals want to leverage the investments they’ve already made rather than replace them with the newest technology. What saved us in the recent past was a Y2K budget that allowed IT departments to bury scads of money that could be repurposed in smaller chunks for new programs. Then “the Internet” saved us, as investor hype fueled corporate mandates to “get it,” which in turn fueled increased IT budgets. But today (with the possible exception of a few categories like security and storage) there is no major external forcing function for such technology replacement other than good old-fashioned return on investment. So entrenched suppliers are gaining a larger share of the budget while new vendors compete aggressively for what’s left over.

The software world has traditionally viewed “killer apps” as the way to deploy infrastructure. The bet was that the customer would undoubtedly standardize over time and that the killer app would be just the first of many applications built on a single infrastructure going forward. This “stickiness” would ensure a relatively high-margin business on an ongoing basis. But customers, unlike vendors, have always seen their environment as consisting of multiple vendors, each vying to become the “enterprise bus” that ties things together, and each trying to get its own versions of interfaces adopted so as to increase its own pervasiveness. As web-services architecture and open standards become an increasing reality, services and applications will become increasingly interchangeable. The ability to develop new web services based upon reusable technology will also increase. Basically, the ongoing march toward standardized enterprise software IT architecture will remove much of the stickiness that protected software vendors and kept their margins high.
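To make the interchangeability point concrete, here is a minimal, hypothetical sketch in Java (the OrderService contract and the vendor class names are illustrative assumptions, not drawn from any actual product): once two vendors implement the same standardized service contract, the buyer’s code depends only on the contract, so swapping one provider for another no longer means rewriting the application - which is exactly the loss of stickiness described above.

// A minimal, hypothetical sketch: a standardized service contract that two
// competing vendors both implement. The client is written against the
// contract, not the vendor, so switching providers is a one-line change.

interface OrderService {
    String placeOrder(String customerId, String sku, int quantity);
}

// Vendor A's implementation behind the shared contract.
class VendorAOrderService implements OrderService {
    public String placeOrder(String customerId, String sku, int quantity) {
        return "VendorA accepted order: " + customerId + "/" + sku + " x" + quantity;
    }
}

// Vendor B's implementation of the very same contract.
class VendorBOrderService implements OrderService {
    public String placeOrder(String customerId, String sku, int quantity) {
        return "VendorB accepted order: " + customerId + "/" + sku + " x" + quantity;
    }
}

public class SwitchingCostDemo {
    public static void main(String[] args) {
        // The buyer's application only ever sees the OrderService contract.
        OrderService service = new VendorAOrderService();
        System.out.println(service.placeOrder("C42", "SKU-7", 3));

        // Replacing the vendor requires changing only this construction line.
        service = new VendorBOrderService();
        System.out.println(service.placeOrder("C42", "SKU-7", 3));
    }
}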

While all this seems like great news for users and buyers, it should send shivers up the spines of many software company CEOs; I believe it already has for many in the investment community. Lower entry costs, lower switching costs, less willingness to replace existing IT investments, common and interchangeable user interfaces, increasing standardization of the enabling technology around a few oligarchs, open data standards, and common application integration capabilities all point to maturation and consolidation. For many new software businesses, this means the opportunities have become increasingly niche-market oriented and hence less attractive as venture investments.


