"Web services is yet another example of potential complexity headed your way." ... "I have perused a number of books on the subject, but the one thing I never see discussed is how one makes a virtual application highly available. What are the implications for backup and recovery of data? How do we manage security permissions and authorizations? Clearly, with the added flexibility, we have also created a new set of obstacles to overcome."
He goes on to state:
I suspect we will continue to see a growing trend towards infrastructure rationalization (fewer kinds of things) and consolidation (fewer numbers of things) as a means to reduce complexity. This trend is both useful and necessary if we have any hope of utilizing some of these new computing paradigms.
Charles believes that Web services increase complexity. By no stretch of the imagination do I consider myself an expert in complexity, chaos, systemics or related concepts. However, I believe the conclusion Charles reaches is incorrect.
His assumption is that we are adding new elements to a set, and as the set grows, so does its complexity. His answer is to reduce the number of elements (rationalization and consolidation), and this is where we differ.
Enterprise software is going through a 'specialization' and 'externalization' movement. We are taking pieces of code that seemed unique and abstracting them into reusable concepts. We do this with design patterns, architectural patterns, domain specific languages, platforms, libraries and specialized engines. Each time we see these reusable items, we 'write them down' (or externalize them). The complexity in the system already existed; we merely found the pattern, factored out the common substrate and documented it. Externalization is a well-known mechanism for reducing complexity; you take that which was not understood and make it understood.
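To make "factoring out the common substrate" concrete, here is a minimal sketch in Java (my choice of illustration; the retry scenario and names are invented, not drawn from Charles or any particular product). Two callers that once each carried their own hand-rolled retry loop now share one named, documented, reusable element:

    import java.util.concurrent.Callable;

    // Hypothetical example: retry logic that once lived, duplicated and
    // slightly different, inside every caller is externalized into a
    // single named abstraction.
    final class Retry {
        // Run the task, retrying up to maxAttempts times on failure.
        static <T> T withRetry(Callable<T> task, int maxAttempts) throws Exception {
            Exception last = null;
            for (int attempt = 1; attempt <= maxAttempts; attempt++) {
                try {
                    return task.call();
                } catch (Exception e) {
                    last = e;  // remember the failure and try again
                }
            }
            throw last;  // all attempts failed
        }
    }

    public class Demo {
        public static void main(String[] args) throws Exception {
            // Callers no longer re-invent the loop; they name the pattern.
            String result = Retry.withRetry(() -> "order-submitted", 3);
            System.out.println(result);
        }
    }

The complexity of retrying was already in the system; writing it down merely made it a known element instead of an unknown one.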
As we take our mountains of unintelligible 3GL code and move much of it into common infrastructure, we increase the number of 'known' computing elements and decrease the number of 'unknown' ones. This process has an odd effect. As the inventory of known elements grows, so does the concern about 'having too much'. This leads to a human fear of 'large-set complexity'. And the natural reaction is, yes, you guessed it: rationalization and consolidation. This is a human tendency to diminish complexity by hiding it. We take those first-order computing elements that made our inventory list and reduce them back into second-order elements, thus removing them from our list. And somehow, we feel better that we defeated the enemy.
Specialization of vocabularies, models and elements is a natural progression of every science. However, specialization must be followed by abstraction, taxonomy and containment. In essence, we must not hide our newfound computational elements; rather, we must manage them.
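As a toy illustration of managing rather than hiding (the categories and entries below are invented for the sketch, not taken from any real inventory), containment can be as simple as keeping every element in a visible, queryable taxonomy:

    import java.util.List;
    import java.util.Map;
    import java.util.TreeMap;

    // Hypothetical inventory: elements are classified and kept visible,
    // not consolidated away. Categories and entries are illustrative only.
    public class ElementTaxonomy {
        public static void main(String[] args) {
            Map<String, List<String>> taxonomy = new TreeMap<>();
            taxonomy.put("messaging", List.of("queue", "topic", "event-bus"));
            taxonomy.put("integration", List.of("adapter", "transform", "route"));
            taxonomy.put("service", List.of("credit-check", "order-entry"));

            // The point is containment: every first-order element has a
            // named place, so the set can grow without becoming opaque.
            taxonomy.forEach((category, elements) ->
                System.out.println(category + ": " + elements));
        }
    }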
Web services are an additional element in our computing infrastructure. From one vantage point, you can claim they are just another addition to the set, increasing its complexity. However, Web services will help reduce system complexity by adding a new abstraction over our computing environment. Although the total number of elements increases, we find ourselves commoditizing those lower in the stack and working with those closer to the business value.
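A minimal sketch of what that abstraction looks like in practice, using the JAX-WS API (bundled with Java 8; an external dependency on newer JDKs). The service name and business rule are hypothetical; the point is that the consumer sees only the business-level contract, while SOAP, WSDL and HTTP plumbing are commoditized elements lower in the stack:

    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Hypothetical service: callers work with a business-level operation,
    // not with transport code.
    @WebService
    public class CreditCheckService {
        public boolean approve(String customerId, double amount) {
            // stand-in rule for whatever backend systems do the real work
            return amount < 10000.0;
        }

        public static void main(String[] args) {
            // One line exposes the element; the infrastructure beneath it
            // is a known, managed substrate rather than hand-written plumbing.
            Endpoint.publish("http://localhost:8080/credit", new CreditCheckService());
        }
    }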
The enemy of SOA isn't complexity; it is a lack of understanding. We must not retreat from our sophisticated computing infrastructure. Instead, we must use cost-effective means to manage the piece parts, identify the abstractions, create common groupings and externalize the system.