
Hope for the Future

Developer Market News - February 2003 reprint

Thursday, August 07, 2003

I authored the following article for CMP Media (c) 2002-2003; it appeared in the February 2003 issue of Developer Market News. Access to the newsletter is limited, but the article is reprinted here with permission.

Software’s Quest for Quality

by Jim Sherburne


Hope for the Future


What’s Missing?

What we are talking about is a need for fundamental changes in how software is written, changes that will allow organizations to survive as businesses. The weak link in all the efforts to date has been the need for human intervention in an inherently complex and ever-changing process.

So what’s the answer? At least two possibilities recommend themselves:

· Autonomic software – software that corrects itself, again with no human intervention. This has received some attention fairly recently, but is it an answer?
· Intelligent, automated tools or processes that remove, or at least minimize, human intervention.

Autonomic Software

Webster defines autonomic as “acting or occurring involuntarily…” This is a very new concept for software, but one that holds a lot of promise for the future. Late last year IBM formed something called the Autonomic Computing Organization. The concept introduces self-configuring, self-healing, self-optimizing, and self-protecting capabilities to an entire IT system. Releases from IBM’s Tivoli division in identity and storage management at around the same time introduced some limited self-healing properties. This sounds great, but I think it potentially has a lot of baggage attached.
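
To make the idea concrete, here is a minimal sketch, in Python, of the simplest autonomic behavior described above: a self-healing control loop that probes a service and restarts it with no human in the loop. The probe and restart commands are invented placeholders of my choosing, not anything from IBM’s products.

    # A self-healing control loop: probe a service periodically and
    # restart it when the probe fails, with no human intervention.
    # HEALTH_CHECK and RESTART_CMD are invented placeholders.
    import subprocess
    import time

    HEALTH_CHECK = ["true"]                       # stand-in probe command
    RESTART_CMD = ["echo", "restarting service"]  # stand-in restart action
    CHECK_INTERVAL_SECONDS = 30

    def is_healthy() -> bool:
        # Monitor: run the probe; exit code 0 means the service is fine.
        return subprocess.run(HEALTH_CHECK).returncode == 0

    def heal() -> None:
        # Act: attempt an automatic restart -- the "self" in self-healing.
        subprocess.run(RESTART_CMD)

    def control_loop() -> None:
        while True:
            if not is_healthy():
                heal()
            time.sleep(CHECK_INTERVAL_SECONDS)

    if __name__ == "__main__":
        control_loop()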

Like every other new software concept since the dawn of the information age, it has a long way to go. Even if we assume that the concept is eventually perfected, it will never be a complete answer. I actually think it represents a very real area for concern. If a self-healing, self-correcting, self-optimizing mechanism were functioning with a set of flawed assumptions, then I would submit you’re actually worse off than you were before. There is nothing autonomic about the software being used to create this software in the first place. (If there were, that would raise some intriguing new possibilities!) That means you could end up with a piece of software that you assume is going to somehow take care of any “issues” that it encounters, but doesn’t. Not good.
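
To see why flawed assumptions are so dangerous here, consider a deliberately contrived sketch in the same vein as the loop above: if the monitor encodes the wrong expected value, the mechanism “heals” a perfectly healthy service on every cycle, and the automation itself becomes the outage. Every name and number below is invented for illustration.

    # Contrived illustration: the monitor's expected value is wrong, so a
    # healthy service gets "healed" (restarted) on every control cycle.
    ACTUAL_STATUS = 200     # the service really is healthy
    EXPECTED_STATUS = 204   # flawed assumption baked into the monitor

    def monitor_says_healthy() -> bool:
        # Monitor built on a wrong premise -- always reports failure.
        return ACTUAL_STATUS == EXPECTED_STATUS

    restarts = 0
    for cycle in range(5):              # five control-loop cycles
        if not monitor_says_healthy():
            restarts += 1               # needless "healing" each time

    print(f"healthy service restarted {restarts} times")   # prints 5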

IBM is certainly not alone in this area. There is little question that the competition will be joined by other major players like Sun and HP. If we’re not careful, we could end up with a whole new category of software to get our customers in trouble.

Intelligent Tools

I think the only real answer is a set of intelligent tools and processes. The good news is that there are actually a few out there today that can legitimately be categorized this way. A number of software vendors play in this space with varying degrees of success, and I’ve had personal experience with several of them, but I am not going to pass judgment or try to rate the validity of their claims here.

The real issue is not with the tools. The fundamental problem with any tool, whether it’s a hacksaw or software, is that if you don’t use it, and use it properly, it’s not going to do you a lot of good. That has been the reality in our industry since the beginning when it comes to systematic quality management. The difference is that the problem is more solvable today than ever: the latest generation of some of these tools is actually beginning to deliver on the promise of automation and “intelligence”. The advances have been incremental, but they have reached a level of some maturity.
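
As a rough illustration of what even a simple automated quality process looks like, here is a sketch of a build gate that runs a test suite and a static checker and refuses to pass unless both succeed. The tools named (pytest, pyflakes) are stand-ins of my choosing, not the commercial products the article alludes to.

    # A build gate: run the test suite and a static checker, and refuse
    # to pass unless both succeed. pytest and pyflakes are stand-ins for
    # whatever tools an organization actually adopts.
    import subprocess
    import sys

    CHECKS = [
        ["python", "-m", "pytest", "--quiet"],    # automated tests
        ["python", "-m", "pyflakes", "."],        # static analysis
    ]

    def main() -> int:
        for cmd in CHECKS:
            if subprocess.run(cmd).returncode != 0:
                print("quality gate failed:", " ".join(cmd))
                return 1                          # block the release
        print("quality gate passed")
        return 0

    if __name__ == "__main__":
        sys.exit(main())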

Why Now?

So why don’t IT departments use these tools and processes if they’re out there? Historically the reason was fairly straightforward – customers didn’t demand it. It’s akin to someone buying an automobile and saying, “Oh well, so what if the transmission doesn’t work, at least it runs!” It sounds silly when described this way, but I’ve seen this kind of thing happen over and over. The demand for technology solutions was so great that as long as you addressed even the smallest element of the customer’s issues, they were willing to forgive an enormous number of omissions or failures in other areas. Besides, the IT vendors were always promising a seemingly unending stream of new software upgrades, new tools and new processes. Some of them even worked!

The good news here is that we’re in a recession. I know that seems perverse, but in this case it’s true. When economic prospects are gloomy, business tends to retrench. New spending on things like IT is scarce, and the concentration is on optimizing existing resources. If there is any new spending, it will need to be cost-justified in the strongest terms. Faulty product is not going to be tolerated in this kind of economic climate – and that’s exactly what’s happening. Customers are pushing back, in many cases for the first time ever, and the industry is paying attention.

Previous attempts to apply rigorous analysis and discipline to the task of creating software applications have met with limited success. As I discussed in previous installments of this series, efforts like CMM and ISO are themselves labor-intensive and introduce additional levels of complexity into an already complex process.

Mature industries like manufacturing found their answers in statistical quality control, first published by Walter Shewhart in 1931 and later championed by W. Edwards Deming, who led a quality management revolution in production processes in Japan and the Western world.

I’m not aware of anyone who approaches the stature of a Deming in the software industry. What we do have today is a collection of tools and nascent processes that provide a solid foundation for bringing a reasonable level of quality management to our industry. Some will consider it kind of boring, but we’re actually growing up!



