Software Obsolescence – Is it Inevitable?

Every item of commercial software produced is the result of a sustained and expensive development process and yet it is sold in a volatile market in which the sales window may be measured in months.  Customers demand upgrades and enhancements and competitors are ever-willing to encroach on market share.  In no time at all, software products can become obsolete.  But does it have to be that way?

All software designers are keen to future-proof their products as much as possible, but they all recognize the difficulty of predicting the future.  There are very many possible futures, and building in future-proofing can also mean backing the wrong one.  And the costs involved in making the product adaptable, capable of evolution in the market and responsive to changing customer demands, can mean that the product's life has to be greatly extended to recoup the investment through sales.


The technical problem is one of abstraction.  How can we separate out those aspects of the product which are likely to change frequently, perhaps in substantial ways in response to customer and market needs, from those which can be kept stable and unchanging during the longer life of the product?
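One common way to make that separation concrete, sketched here in Python with hypothetical names, is to put the stable contract in an abstract interface and keep each volatile detail behind it:

```python
from abc import ABC, abstractmethod

# Stable side of the split: this contract is designed to outlive
# any individual output format.
class Exporter(ABC):
    @abstractmethod
    def export(self, records: list[dict]) -> str:
        ...

# Volatile side: formats come and go with customer demand, so each
# one lives behind the stable interface.  (CsvExporter is illustrative.)
class CsvExporter(Exporter):
    def export(self, records: list[dict]) -> str:
        if not records:
            return ""
        header = ",".join(records[0])
        rows = [",".join(str(v) for v in r.values()) for r in records]
        return "\n".join([header, *rows])

# Core logic depends only on the abstraction, never on a concrete format.
def run_report(records: list[dict], exporter: Exporter) -> str:
    return exporter.export(records)
```

When the market demands a new format, a new `Exporter` subclass is added; `run_report` and everything built on it remain untouched.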

Building software using a framework in which components can be placed to meet changing customer needs is far from a new idea.  Component technology has been around for decades but it has always failed to live up to its promises.  Components inevitably need to interact and the coupling we end up with often compromises those beautifully modular designs.

So even building on a framework is no guarantee of a simple solution to the problem of obsolescence.  The framework itself suffers from obsolescence pressure.  Libraries and collections of components evolve as the problem domains are better understood and the technologies are fine-tuned for better performance, and there is a significant cost even in maintaining a stable application framework.

And the frameworks themselves also have to undergo changes, often in response to the capabilities of new hardware.  Sometimes it is the manufacturers of the frameworks themselves who drive the change.  The various incarnations of the .NET platform, and the attempts to preserve interoperability with previous generations of code, have led to substantially increased complexity.

By modifying the existing architecture and code, we will meet current customer needs, but at the expense of gradually increasing complexity.  Despite the good intentions, constant refactoring of the underlying code is not a commercial reality, and so there is an understandable tendency to add rather than substitute.  Interfaces proliferate, additional components appear, supplementary functionality is introduced.

Brian Foote and Joseph Yoder described such a system (from a suggestion by Brian Marick) as a “big ball of mud”, showing how, with the best will in the world, constant updates and fixes can obscure an elegant design under layers of code that eventually make it impossible to repair and develop further.

But there are some strategies which can reduce the risk of a system becoming a big ball of mud.

Evolving Architecture

We might think that a complex piece of software has to have a clear, well-defined design on which the functionality can rest and to a point that is true.  But it is a mistake to think that the architecture is somehow immune from the process of change.

Often no one knows the correct architecture for a large-scale application until a substantial amount of code is working.  Only when the problem domain is already very well understood will there be that knowledge of architectures that work, so in many cases it is premature to settle on a fixed architecture in a novel environment.

So architectures will change and the design of the architecture of the initial development should permit that.  High cohesion and low coupling between modules will assist this process.  In the early stages, complexity is the enemy because although highly skilled engineers can create complexity rapidly, they generally do not have time to explain it.
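A minimal illustration of low coupling, assuming a hypothetical publish/subscribe seam: two cohesive modules communicate through an event bus instead of importing one another, so either can be replaced as the architecture evolves.

```python
# Low coupling via a simple publish/subscribe seam: modules register
# callbacks for topics instead of depending on each other directly.
class EventBus:
    def __init__(self):
        self._handlers = {}

    def subscribe(self, topic, handler):
        self._handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        for handler in self._handlers.get(topic, []):
            handler(payload)

# Two cohesive pieces of functionality that know nothing about
# one another; both react to the same hypothetical event.
bus = EventBus()
log = []
bus.subscribe("order.placed", lambda order: log.append(f"invoice for {order}"))
bus.subscribe("order.placed", lambda order: log.append(f"ship {order}"))
bus.publish("order.placed", "A42")
```

Swapping out the invoicing module later means changing one `subscribe` call, not rewriting the shipping code.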

Where the architecture is expected to evolve, it has to be simple enough for this process to be managed.

Layering to Represent Abstraction

Abstraction is used to remove the specific in order to focus more effectively on the general, and in software that means hiding more and more specific aspects of the system.  The user interface should not care about the machine's processor or, ideally, even the operating system.

By abstracting the details away from the higher levels of the software, any machine or OS-specific details are insulated from the upper levels of functionality.  This has the significant benefit that changes in user requirements can often be implemented at the higher levels without changing the lower levels.
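As a sketch of that insulation, using hypothetical application names: platform-specific details are confined to the bottom layer, and the user-facing layer never mentions the operating system.

```python
import sys

# Bottom layer: OS-specific detail, confined to one place.
class PosixPaths:
    def config_dir(self) -> str:
        return "~/.config/myapp"        # "myapp" is illustrative

class WindowsPaths:
    def config_dir(self) -> str:
        return r"%APPDATA%\myapp"

# Boundary: the platform is chosen exactly once, at the lowest layer.
def platform_paths():
    return WindowsPaths() if sys.platform.startswith("win") else PosixPaths()

# Top layer: user-facing code that is unaware of the operating system.
def settings_location() -> str:
    return f"Settings are stored in {platform_paths().config_dir()}"
```

A change in user requirements at the top (different wording, a new screen) touches only `settings_location`; a new operating system touches only the bottom layer.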

A layered approach to software development has been common for many years, but frequently the demands of performance and usability have required compromises.  The promise of cross-platform development has often been tempered by the real-world need to gain a performance advantage by tapping into native OS functionality.  True platform independence has still not been reached, even with the much-vaunted .NET Framework and the Java language.

Hardware-induced Obsolescence

Advances in hardware design, such as touch-screen technology, drive changing user expectations.  Users expect to be able to use very different methods for controlling software, and expect it to run on a variety of devices.  Their paradigms of computer control shift noticeably from year to year, so frameworks have to be portable, and so must the functionality.

By abstracting the framework on which applications rest from the applications themselves, portability is made easier, but this also imposes constraints on the underlying code.

A framework relies on uniform facilities from the underlying operating system, and these change rapidly with developing technology.  We therefore see a repetition of the same problem: how do we prevent the underlying frameworks from becoming obsolete as well?

The layering and abstraction which temporarily solved application software's problem are now also essential for the underlying frameworks.

Break Away from the Hardware

With Web 2.0 and the increasing use of cloud computing, we see a separation of the services provided to applications from the platform on which they run.  A small device can run an application making use of the prodigious power of a large collection of other machines, all delivering their services on demand.

Does this mean that the localized computer application is condemned to obsolescence?  Probably not, because there will always be a need for high-performance local applications able to work independently of any connectivity.  But it does mean that the necessarily rapid response to user demands is more likely to come from distributed computing and the provision of services from remote sources.

The abstraction which led to layering in application architecture has reached the hardware level as well, in what may seem almost like a full circle.  Fifty years ago it was common for a dumb terminal to connect to a mainframe computer which would provide all of the processing power.  Now a simple but reasonably powerful handheld device can connect to an abstract cloud of processing resources.

Obsolescence in software may be kept at bay by the adoption of new ways of delivering functionality, the abstraction no longer being simply the separation of layers of software, but the lifting of those layers onto a cloud of disparate platforms in different locations.  Platform independence might yet become a reality.

About The Writer

Henrik Larsson is the owner of Emoinstaller, an application that enables users to install and use additional Facebook chat emoticons.

Tags: #Theory