Industry Article - Feb 06 Issue
When Best Practices Aren't Good Enough
By Kay Hammer, Founder and Former President & CEO, Evolutionary Technologies International, Inc.
To the average person, it seems like gross negligence that FEMA can't figure out how to route ice to the site of a hurricane when the navigational system in a rental car can adapt in real time when a driver takes a wrong turn. Yet it is important to notice that the most dramatic breakthroughs in software over the past decade -- e.g., Google or GPS technology -- have not taken place in traditional IT shops, despite the fact that there are pressing needs like Homeland Security and SOX compliance that would constitute a huge market for new solutions.
In fact, the need for innovative solutions in these more "traditional" IT shops is serious enough that integration is now seen as one of the hottest areas in enterprise software, with service-oriented architecture (SOA) constituting the most promising solution at hand. SOA is appealing because it can take advantage of both the speed and ubiquity of the Internet and the "publish and subscribe" model. With an SOA, all software applications are treated as different types of services that can be consumed by other applications.
SOA is an elegant architecture and, if successful, should maximize a more conventional organization's ability to re-use its existing software and greatly improve the auditability of how its IT systems enforce business processes. However, for SOA to work, developers must understand an organization's legacy systems, and according to IDC, this lack of understanding has historically been one of the largest reasons for the failure of IT initiatives.
Since integration is required for every IT initiative -- conventional or innovative -- it is unfortunate that none of the large consulting or systems-integration firms has developed a set of "best practices" for enterprise integration, in the same way that best practices exist for data warehousing or for implementing packaged applications like SAP.
There are a number of reasons why integration has eluded the creation of a discipline, and only a few of them have to do with software; the rest derive from the fact that organizations consistently choose to make the same trade-offs that created the integration problem in the first place. To understand why these choices continue to be made, we need to consider a broad array of topics such as the nature of computing and how that relates to the way people think; the business drivers that force "penny-wise and pound-foolish decisions;" and how people are compensated for the work they do.
Finally, it is up to the organizations that suffer from these challenges to develop an internal discipline that constrains IT funding to solutions that ultimately will make integration as straightforward as other types of system administration.
The Nature of Computing
Computers represent symbolic information. In other words, the meaning of data stored on a disk or tape is not inherent, but depends on the meaning assigned to that data by the program or human using it. For example, if a database field is labeled SALARY and has a value of 120,000, what does this mean? Is it an annual salary? That might be the best guess for a US reader, but what if the currency isn't the US dollar or euro, but the Iraqi dinar?
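The point about symbolic data can be made concrete with a small sketch. The currency codes and exchange rates below are illustrative assumptions, not real figures; the point is only that identical stored bytes yield wildly different meanings depending on the interpretation applied to them:

```python
# Illustrative only: rates are assumptions, not real exchange rates.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.08, "IQD": 0.00076}

def salary_in_usd(amount, currency):
    """Interpret the same stored number under a given currency assumption."""
    return amount * RATES_TO_USD[currency]

stored_value = 120_000  # the bytes on disk are identical in every case
salary_in_usd(stored_value, "USD")  # a comfortable annual salary
salary_in_usd(stored_value, "IQD")  # under a hundred US dollars
```

Nothing in the database itself tells the program which interpretation is correct; that knowledge lives outside the data.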
In short, before developers can integrate an application, they must understand the meaning of the data in one or more source databases so that they can correlate it to the meaning of the data expected by some other application(s). Sometimes this information is documented in a data dictionary or within comments embedded in the applications themselves, but as often as not it is tribal -- e.g., "Oh, we just put in a -1 if we don't know the person's job code."
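The -1 convention above is exactly the kind of tribal knowledge an integration developer must capture explicitly. A minimal sketch, assuming a hypothetical field layout, of translating such a sentinel into an explicit "unknown" before loading a target system:

```python
# The -1 sentinel comes from the example above; field names are hypothetical.
def normalize_job_code(raw_code):
    """Map the legacy sentinel to None so the target schema sees an
    explicit 'unknown' rather than a plausible-looking number."""
    return None if raw_code == -1 else raw_code

source_rows = [{"emp": "A12", "job_code": 4400},
               {"emp": "B07", "job_code": -1}]   # -1: "we don't know"

cleaned = [{**row, "job_code": normalize_job_code(row["job_code"])}
           for row in source_rows]
```

Until someone writes the rule down, code like this cannot be written, and the sentinel silently pollutes every downstream system that treats -1 as a real job code.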
Moreover, for an integration initiative to be successful, the developers' understanding of the source data must be precise. The reason that we use computers is for their speed and accuracy -- the fact they can handle levels of complexity (both in volume and algorithm) far beyond human capability. In fact, most IT applications in large organizations are extremely complex, sometimes involving thousands of data elements.
The Way People Think: Limitations to the Interaction of Man and Machine
The major difficulty with integration derives from the fact that humans and computers differ in their processing skills. The easiest way to characterize this difference is to say that humans are analog and computers are digital. As evidenced by findings in developmental psychology and psycholinguistics, humans are inherently good at pattern matching and limited in their short-term memory unless they can assign structure to information (called "chunking"). Computers, on the other hand, are better at computation (no surprise) than pattern recognition. Consequently, the bulk of the analysis required for integration falls on the human.
This analysis is complicated by the fact that most integration initiatives also entail a cultural divide between the people who know and understand the (usually more traditional) data sources and those who are familiar with the data model and new technology on the target side.
This skills gap is significant in two ways:
- First, many of the individuals who understand legacy environments are now at retirement age, while many recent college graduates have little or no expertise in anything but Java and SQL.
- Second, because many senior technical consultants and system integrators are not well grounded in legacy environments, they fail to anticipate many of the difficulties in data integration. This often results in schedule and cost overruns in delivering strategic applications.
Penny-wise and Pound-foolish
Organizations are driven by budget constraints. Consequently, though operational IT is a necessary overhead, strategic or innovative initiatives are usually undertaken only when a situation either has reached critical proportions (e.g., when eroding profits require reductions) or the utilization of some disruptive technology is required to stay competitive (e.g., maintaining a web presence). Meanwhile, the task of upgrading and consolidating information systems adds risk and cost to the IT initiative but brings no short-term benefits to the business.
As a result, unless it's absolutely required, most organizations favor doing as little as possible to disturb what is currently working. Even in the midst of a major acquisition -- often undertaken to benefit from economy of scale through the elimination of redundancies -- companies tend to favor using a data warehouse for a consolidated view of the combined organization rather than consolidating applications and data centers to reduce operating expenses. While this kind of decision makes sense in the moment, over time it is analogous to the gradual weight gain that leads to a dangerous level of obesity.
For example, consider the pharmaceutical company that -- after a number of mergers and acquisitions -- had 15,000 UNIX servers in its combined R&D organizations, with an average utilization of 30% per server. Retiring one third of those servers would eliminate the cost of maintaining them, along with the redundant software on those machines (both in vendor fees and in internal system-administration time), and still leave more than twice the processing capacity the actual workload requires.
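The consolidation figures above can be checked with back-of-the-envelope arithmetic, using only the numbers given (15,000 servers, 30% average utilization, one third of the fleet retired):

```python
servers = 15_000
avg_utilization = 0.30
workload = servers * avg_utilization   # ~4,500 servers' worth of actual work

retired = servers // 3                 # retire one third of the fleet
remaining = servers - retired          # 10,000 servers left running
new_utilization = workload / remaining # rises to 45%
headroom = remaining / workload        # capacity relative to actual need
```

Even after the cut, remaining capacity exceeds the measured workload by more than a factor of two, so the savings carry essentially no capacity risk.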
Compensation and Conflicts of Interest
There's another reason that companies favor the status quo: conflicts of interest that grow out of the way employees are recognized and rewarded. Individual technical contributors, who these days are often either on-site or off-shore consultants, are rewarded for delivering a project on time with little or no concern for whether the application will scale effectively or be easy to maintain. Because of this reward system, they tend to stick to what they've done traditionally rather than advocate a different approach or innovative technology that would provide superior long-term benefits.