The Cloud Effect – Enterprises Adopting Internet Strategies

1 – The Changing Character Of Enterprise IT

1.1 – Tear Down This Wall

The last time a major wall came down, it was Berlin in 1989; the wall was known simply as the ‘Berlin Wall’; and it had divided a country, its resident families, and friends for over 28 years. Just prior to the destruction of the wall there was euphoria all around, a whiff of freedom in the air, and a tear in just about everyone’s eye. The impact of the wall’s takedown was enormous and affected over 75 million people living all over the country. The consequence of this historical moment was unheard of as Germany rose from the ashes to become a major world power.

It’s been nearly 22 years since that monumental event, and now it’s time for another. This time however, the wall is the ‘enterprise firewall’, and it has separated the enterprise computing systems from the outside world of the web for far too long. This time too, in anticipation of the wall’s teardown, the air is electric, the possibilities seem endless and the potential feels completely untapped. The impact of this wall’s takedown will be humongous and will affect billions of people living all over the world. The consequence of this upcoming occurrence – brace yourselves, because we are about to find out!

1.2 – A Giant Leap For Mankind

Lately, the blogosphere has been hijacked by an era of ‘Acronym Anarchy’, and with good reason. SOA/WOA/ROA… SAAS/PAAS/IAAS… all point to but one thing – the simple fact that two great worlds – the enterprise and the internet – are about to collide in what shall be the biggest bang ever.

For years, the common enterprise architect had no choice but to direct his troops to live within the confines of the enterprise firewall, to build secluded information silos, and then to somehow connect them all together; while at the same time keeping costs at a minimum, performance at a maximum, and achieving ROIs set by CIOs to scarcely believable figures. Then arrived the age of architecture nomenclature, which introduced the enterprise systems to a whole lot of design patterns, which led to a whole lot of expenditure, without a whole lot of return. Shockingly however, this virus of architecture overexposure did not attack the thousands of web based startups springing up every year, and they seemed to do just fine without it, as rags to riches became a common Silicon Valley story. Clearly, the enterprises had a lot to learn, and now it seems like the class is finally in session.

With the arrival of web 2.0, a number of highly promising new concepts, ideologies and technologies have emerged which shall forever reshape the scenery of enterprise IT, while at the same time creating a seamless world of integration between two widely different computing platforms. For years it was believed that the dynamic, light-weight, volatile web patterns could not possibly be applied to the stiff, heavy, rule-based enterprise systems. It was believed that the flaky web constructs could never perform the workhorse role of enterprise applications. It was believed that the implementation, sustenance and delivery mechanisms of the web had no place in an enterprise architecture governed by the iron fist of its ivory tower bound architect. However the recent success of web based SAAS/PAAS/IAAS in the world of the enterprise tells a very different story. The wise folks who overcame their fear, let down their barriers, and let the web based services in past their enterprise firewall were immediately rewarded with cost savings, state of the art functionalities and frequently updated specialized software. This may have been a small step for a high rise CIO, but it was most definitely a giant leap for mankind.

1.3 – On The Origin Of Species

The invasion of the enterprise world by the ever conquering web warlords has however started a new chain of events, one whose history dates back to the origins of the world, and one whose reach far surpasses human imagination. This enigmatic natural occurrence was first discovered by a man named ‘Charles Darwin’, who then proceeded to enlighten the rest of mankind about its existence in his publication ‘On the Origin of Species’, which established the theory of ‘evolutionary adaptation by natural selection’. The two distinct computing worlds fighting for existence in the same space shall see certain technologies taking a ‘rudimentary form’, certain ideologies eliminated by the ‘survival of the fittest’ and certain concepts diminished through disuse by the ‘inheritance of acquired characteristics’. However the most exciting prospect of this invasion is the gradual change of various existing systems to adapt to their changing environments, ultimately accumulating over time to form a new species. It is by this factor that we shall soon find ourselves talking about the Enternet, a new breed of IT spawned by the successful evolution of the enterprise and the web by natural selection, and its stride as one combined unit out to rule all of machine kind.

This progression however requires adaptation. Consequently, this junction is an important one as it splits into two roads, one leading to evolution, and the other leading to extinction. The enterprise systems would be forced to shed weight, to become more flexible, and to embrace a vision of unified open standards. This journey is being dismissed as impossible by many, especially given the enormous investments made by CIOs in the existing enterprise architectures, chiefly SOA – which has gripped the attention of enterprise architects worldwide with its numerous long term promises. However, what shall make this job a lot easier is the fact that SOA must not be viewed as a hindrance, but as a significant enabler of migration to more progressive systems, as it is widely believed that the web itself is the world’s largest SOA in existence.

Keeping the upcoming challenge in mind, the ideology of IT architects everywhere must manifest itself in a new form when constructing a modern application, and their vision of the enterprise system must be updated with a measure of reality. The internet has grown over time into the IT world’s greatest ever success story, and has done so by accumulating a set of personality traits which have helped it develop into the beast it is today. Many of the notable internet traits can in fact also be leveraged in the enterprise world to create a new breed of IT constructs capable of surviving in their new habitat. The enterprises have been learning from their past experiences and have been developing increasingly better systems; however, as the following comparisons of their various aspects with those of the web shall prove, the future lies in adopting many of the successful internet attributes to some degree within the enterprise systems, in order to help them keep up with the technological advancements in the ever growing IT sector.

2 – Software Development In The New Era – Comparisons And Conclusions

2.1 – Ready, Steady, Fight!

2.1.1 – Integration vs Consumption

Design Consideration:


The SOA philosophy heavily promotes the concept of service reuse. However the scope of this concept is limited to reuse by other systems within the enterprise IT ecosystem. Due to this limited scope, the architectures usually employ a highly complicated set of specifications, which in turn enforce numerous pre-conditions that must be met by all interfacing systems in order to use its services. This over time leads to multiple layers of unnecessary abstraction, which ends up restricting the very integration that it was originally meant to offer.


The web overcomes this obstacle by favoring consumption over integration. By following highly open standards, using widely adopted protocols and providing easily reachable web APIs/services, the netizens are able to outdo their enterprise counterparts by following one of the computing world’s oldest rules – KISS. As a consequence, all web based services are easily available for consumption by a wide variety of audiences, all of which can be catered to by simply not forcing the clients to use a set of complicated specifications. Additionally, a highly consumable service is easily absorbed by the masses, which may lead to further development by crowdsourcing. Crowdsourcing, which is now viewed as a viable business strategy by numerous web based companies, has the potential to considerably increase the value of a service by means of community based value addition.

Technical Implementation:


Although SOA is a mere design philosophy which does not specify a set of technologies for its implementation, over the years the enterprise communities have standardized its technical elements, and generated a list of unofficial best practices which are now considered to be the law of the land. Prominent among these is the SOAP – WSDL – UDDI combination, which is meant to serve as a global standard for exposing application services. This standard however enforces constraints on all interacting systems, by forcing them to adopt dozens of heavyweight WS-* specifications, which results in the SOAP services being harder to consume, and consequently seeming less attractive to prospective clients.


The web overcomes this obstacle by favoring REST over SOAP as a method of exposing application services. REST offers various advantages over SOAP simply by being based on the most basic of internet protocols – HTTP, which leads to rapid adoption due to its ‘ease of use’ appeal, and ultimately results in extensive consumption. Additionally, the contract for a RESTful service may be implicit, and consequently it is easily handled by thin clients (primarily the browser – which has a harder time dealing with the clearly defined WSDLs mandated by a SOAP based service), which further contributes to its reputation as a widely applicable service exposure technique. The easy integration of REST with popular client side scripting languages such as Javascript serves only to enhance its charm. The success of REST in the World Wide Web becomes obvious when you consider the fact that every single web page on the internet is in fact also a read-only REST service.
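To make the contrast concrete, here is a minimal sketch (in Python, with an invented ‘orders’ resource – not any particular framework’s API) of the RESTful idea described above: a single resource collection addressed by URI, with create/read/update/delete mapped directly onto the basic HTTP verbs instead of a SOAP envelope and WSDL contract.

```python
# A toy RESTful dispatcher: CRUD operations expressed as HTTP verbs on URIs.
# The /orders resource and status codes follow ordinary HTTP conventions.

store = {}      # in-memory "orders" collection, keyed by id
next_id = 1

def handle(verb, path, body=None):
    """Dispatch an HTTP-style request against the /orders resource."""
    global next_id
    if verb == "POST" and path == "/orders":        # create -> 201 + new URI
        store[next_id] = body
        uri = "/orders/%d" % next_id
        next_id += 1
        return 201, uri
    oid = int(path.rsplit("/", 1)[1])
    if verb == "GET":                               # read -> 200 or 404
        return (200, store[oid]) if oid in store else (404, None)
    if verb == "PUT":                               # update -> 200
        store[oid] = body
        return 200, body
    if verb == "DELETE":                            # delete -> 204
        store.pop(oid, None)
        return 204, None
    return 405, None                                # verb not allowed
```

For example, `handle("POST", "/orders", {"item": "book"})` returns `(201, "/orders/1")`, and a plain `GET` on that URI reads it back – no client stub or specification stack required.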

2.1.2 – Discoverable vs Searchable

Design Consideration:


SOA’s service reuse principle depends heavily upon service discovery as a way to provide visibility of an enterprise’s services and of the elements that make up those services. As enterprise systems continue to grow more complex, with thousands of services being offered by numerous applications internally, architects have been hard at work trying to find a way to organize these services in a methodical manner in order to comply with the SOA norms of making these services discoverable. The idea at play is to allow a service from an application to be easily located by other applications based on certain criteria, which would permit loose coupling by preventing these services from being hard wired between applications, and in turn create a highly flexible system with maximum service reuse capabilities. However, the enterprise’s implementation of the service discovery principle has been extremely flawed, as it further complicates matters by introducing new registries of service offerings, which must be built from the ground up, and have their APIs integrated with all applications intending to use them.
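The registry concept itself is simple – the enterprise pain comes from the integration overhead built around it. A toy sketch of the idea (illustrative names only, not the UDDI API) might look like:

```python
# A minimal service registry: publish a service under a category, then
# discover matching endpoints by criterion instead of hard-wiring addresses.

registry = []

def publish(name, category, endpoint):
    """Register a service offering with the criteria used to locate it."""
    registry.append({"name": name, "category": category, "endpoint": endpoint})

def discover(category):
    """Return the endpoints of every service matching the given criterion."""
    return [s["endpoint"] for s in registry if s["category"] == category]
```

The hard part, as the text notes, is not these ten lines – it is wiring every existing and future application to this lookup.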


The web overcomes this obstacle of providing service visibility by using one of its most successful features till date – the internet search. Even with gazillions of web-pages available over the internet, one can keep track of all of them by using one of the many high quality searches offered by numerous vendors, which enable their users to find even the most obscure web resources using a variety of search criteria. Additionally, the scope of these searches may vary, with websites developing their own private search mechanisms and implementing site-maps to allow discovery of resources internal to the website. With every web-page on the internet having a URI, the web follows a resource centric approach to locating the various requested items, and easily returns the search result in the form of the resource’s unique address. What further helps the web in its quest to organize its resources for discoverability is the fact that one web-page may refer to another by a hyperlink, which helps in creating a tightly woven net, where everything over the web may be accessed simply by traversing through its endless paths.

Technical Implementation:


The dominant mechanism of enabling service discovery in SOA is by building a repository in the form of a UDDI. However, this poses numerous challenges for the architects in the form of expenses, integration with applications and scope of use, which leads to it offering very little (if anything at all) in the form of ROI. The UDDI must be constructed from scratch, which leads to development costs. Additionally, it must have its search services integrated with all existing and future systems, which leads to compatibility issues. Finally, the fact that UDDI is a registry for only SOAP based services leads to its applicability being highly limited, and its operations not extending to most legacy constructs.


The above issues seem prehistoric in the world of the web, which has successfully implemented an amazing resource discovery mechanism. The same concepts may be applied to the world of exposed application services using REST web-services. This is due to the fact that each REST service can be published as a URI, which leads to it being indexable by the backend web crawlers of numerous internet search companies, and consequently to it being searchable in the same way as a web-page. Further, each REST service may be a referenced resource in another REST service, which leads to a deeply connected set of services, much like the web itself, and consequently a REST based network of service offerings is capable of being easily traversed. Additionally, the adoption of RDF, RDF Schema and OWL as W3C standards has once again fueled the movement towards creating a machine traversable semantic web. Consequently, an architecture based on REST services is capable of taking advantage of these technologies by associating machine readable meta-data with the URI of each of its published services, thereby facilitating highly advanced searches based on extremely specific criteria.
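The ‘deeply connected set of services’ idea can be sketched as follows: each resource is a URI whose representation embeds links to related resources, so a crawler can traverse the service graph exactly as it crawls hyperlinked pages. The URIs and payloads below are invented for illustration.

```python
# Linked REST resources: each representation carries the URIs of its
# neighbours, making the whole service graph discoverable by traversal.

resources = {
    "/customers/7":        {"name": "Acme", "links": ["/customers/7/orders"]},
    "/customers/7/orders": {"count": 1,     "links": ["/orders/42"]},
    "/orders/42":          {"total": 99.5,  "links": []},
}

def crawl(start):
    """Breadth-first traversal of linked resources, like a web crawler."""
    seen, queue = set(), [start]
    while queue:
        uri = queue.pop(0)
        if uri in seen:
            continue
        seen.add(uri)
        queue.extend(resources[uri]["links"])
    return seen
```

Starting from a single entry-point URI, the crawler reaches every published service – the same property that lets search engines index the hyperlinked web.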

2.1.3 – Heavy-Weight vs Feather-Weight

Design Consideration:


The enterprises have a long history of following a strict set of standards, basing applications on complicated specifications, and creating multiple layers of abstraction which lead to the creation of a giant heavy-weight system. This over time leads to a very heavy consumption tax being charged on service composition and invocation by all IT constructs within the enterprise system. The use of these heavy technologies ends up adding weight to the messaging systems used within the enterprise, which leads to the cost of communication between applications being extremely high, and results in huge amounts of expenses being incurred by the business in terms of performance, hardware utilization and network load.


The web overcomes this obstacle by favoring feather-weight technologies in all its constructs. This is made possible by offerings which incorporate only the most used features of all related primetime technologies and chip away the unnecessary flab. The web developers have long since realized that not all features offered by a highly packed specification are utilized by all functionalities, offered by all applications, all of the time; and consequently apply a ‘pay as you go’ technique to building services, where the choice of the implementing technology is based on only the required aspects of that particular service. This leads to more efficient messaging systems due to the reduced fat of the messages passing between them. Additionally, these systems are easier to implement and integrate, which results in a higher ROI. The internet, being the world’s largest network, channeling the world’s highest amount of traffic, and creating the world’s most cost effective solutions, is clearly the frontrunner when it comes to implementing messaging systems.

Technical Implementation:


The dominant messaging format adopted by enterprises in their implementation of SOA is the SOAP based web-service, which is associated with a fully fledged stack of WS-* specifications that are seldom used, and hence does not justify being treated as a ‘one size fits all’ data transmission solution in every application. SOAP is defined to be transport independent; however this mostly results in performance degradation as it does not take advantage of certain HTTP aspects, such as RESTful usage of URLs and methods. Additionally, it bypasses existing TCP/IP mechanisms such as ordering management, flow control and service discovery, which leads to a highly inefficient transport system. Furthermore, it mandates messages to be passed in the form of XMLs, which bloats up the message size and hence increases the cost of serializing/deserializing each message. Also, a SOAP header is attached to each transmitted XML, which results in any system intending to produce/consume it needing a SOAP library. It is due to these issues that an enterprise messaging system ends up being extremely obese.
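The size argument is easy to demonstrate: the same one-field payload wrapped in a typical minimal SOAP 1.1 envelope versus sent as bare JSON. The operation name and field below are invented; the point is only the relative framing overhead.

```python
# Comparing the on-the-wire size of one logical message in two encodings.
import json

payload = {"orderId": 42}

# A minimal SOAP 1.1 envelope carrying the same single field.
soap = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Header/>"
    "<soap:Body><GetOrder><orderId>42</orderId></GetOrder></soap:Body>"
    "</soap:Envelope>"
)

rest = json.dumps(payload)          # bare JSON body of a RESTful response
overhead = len(soap) - len(rest)    # bytes spent on pure framing
```

Even before any WS-* headers are added, the envelope alone dwarfs the 15-byte JSON body – and that framing cost is paid on every serialization and deserialization.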


The web is biased towards RESTful technologies, which allow it to maintain quick response times, primarily due to a highly efficient inter-application communication system. The web developers hold REST web-services in high regard as they allow them to implement JSON as a data container. This format employs a ‘size zero’ approach to structuring data, which leads to minimal overhead in terms of message size, and consequently results in optimal serialization/deserialization times. Additionally, JSON shines as a programming language-independent representation of typical programming language data structures, especially with a dynamic programming language where a reasonable in-memory representation of a JSON object can be obtained simply by calling a library function, which leads to swifter parsing due to reduced external data restructuring logic. Furthermore, REST allows its GET function calls to be cached, which greatly contributes to its ability to provide rapid transmission of information. Also, the use of a cache with REST services reduces the load on the backend hardware, thereby improving application performance.
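Both claims above – one library call from JSON text to native data structures, and cacheable GET reads – can be sketched in a few lines. The `fetch_order` function is a stand-in that fakes the HTTP call; the names are illustrative.

```python
# json.loads turns a JSON document straight into native dicts/lists,
# and GET responses can be memoised so repeated reads never hit the backend.
import json

def fetch_order(order_id):
    # stand-in for a real HTTP GET; returns the JSON text of the response
    fetch_order.calls += 1
    return '{"id": %d, "status": "shipped"}' % order_id
fetch_order.calls = 0

_cache = {}

def get_order(order_id):
    """Cached RESTful read: safe because GET is a safe, cacheable method."""
    if order_id not in _cache:
        _cache[order_id] = json.loads(fetch_order(order_id))
    return _cache[order_id]
```

Two calls to `get_order(1)` return the same native dict while the backend is contacted only once – exactly the load reduction the paragraph describes.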

2.1.4 – Scalability vs Agility

Design Consideration:


The functionalities offered by an enterprise system are delivered to the end user in the form of a standard GUI, which in turn calls a modular application service. The services presented by the GUI are enhanced in batches, and are dependent upon the corresponding backend system’s release cycles, which are usually spaced at around 3 month intervals. It is due to these restrictions that there is a significant lag in translating business requirements to technical implementation, which may cause an organization to lose its competitive edge, and consequently is far from being an optimal solution. Additionally, the enterprises have historically built applications in the form of giant grounded garrisons, which heavily trade off the ‘time to market’ aspect of these IT constructs in return for their long term capabilities. It’s the inner complexities of the technical elements used to build these applications which cause their development to proceed at a snail’s pace.


The web overcomes this obstacle by relying on the independent delivery of the various services offered by its various applications. The web developers pay prime importance to the loose coupling of services within the application, which enables them to upgrade one operation without affecting another. Additionally, the services may be exposed to the end user in the form of various small portlets, where each portlet is a GUI for one or more functionalities provided by the backend systems. Following this approach, based loosely on grid computing, makes it possible for web establishments to enhance their service offerings without requiring a complete application release. This form of a service delivery model also enables the possibility of building mashups, where various different portlet functionalities offered by various different vendors may be aggregated into one combined portal, which may add value to the native services in expected or unexpected ways. Furthermore, the web based applications are built using lightweight technologies, where aspects like ‘learning curve’, ‘development speed’ and ‘time to market’ take precedence over most other considerations. It is due to these factors that the web poises itself to be the more nimble of the two computing worlds.

Technical Implementation:


The enterprise has over the years developed a standard way of building frontend representations of their backend services using HTML, CSS and Javascript. These frontend systems are extremely static, providing very little in the form of client interaction, and thereby limiting their functionality to simply the ‘display of data’. Additionally, the use of such technologies results in the GUI having to reload the complete page each time a new operation is requested, which in turn leads to inefficient performance as the unaffected data must also be re-requested from the backend database, resulting in higher network/database/processor loads. A new operation invoked on these frontend systems results in a call to the services offered by their backend counterparts, which are primarily built using workhorse languages such as C/C++/Java etc. Applications built using these languages, although highly scalable, take extremely long to design, develop and deploy. Consequently most enterprise applications follow an iterative development model, with quarterly release cycles, which leads to slow business growth. Additionally, these applications do not offer an alternate delivery model for each of their independent services, thereby mandating a complete application release in order to provide the upgraded functionality to the end user.


The web has seen tremendous growth over the past few years in terms of ‘rich internet applications’. These applications are built in the form of widgets, and provide a rich user experience in terms of client interaction, leading to higher productivity as the GUIs are capable of offering a wide range of functionalities. This is achieved using modern web technologies, such as Ajax – which requires no third party libraries, or various custom frameworks such as Flash, Java FX and Silverlight – which require the installation of third party libraries prior to their use. Additionally, various widgets may be embedded together into a combined web page, leading to higher flexibility. Each of these embedded GUIs communicates asynchronously with its corresponding backend systems, which results in the operation invoked on one widget not affecting another, thereby boosting application performance while at the same time reducing hardware costs. It is also this ability of widgets to act as independent applications by themselves which allows them to be upgraded one at a time without requiring a complete release of their corresponding backend applications. These widgets may also be distributed to the masses as they are designed to be highly portable, allowing themselves to be added by users to any existing web page running HTML without knowledge of their technical aspects, which allows an organization to extend its reach well beyond the formal boundaries. The distribution of these widgets may also give rise to the creation of mashups, either by using a format such as EMML, or by the various online frameworks available to serve this purpose. Additionally, these widgets receive data from their backend systems in a variety of formats such as RSS/ATOM/JSON, which are easily consumable by various client side scripting languages. The backend applications providing these services are in turn built using dynamic scripting languages, which offer rapid development times, and consequently help transform business requirements into technical implementations in short time periods.
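As a small illustration of the feed formats mentioned above, a widget consuming an RSS feed needs nothing more than a standard XML parser to pull out the item titles it renders – the feed fragment below is invented sample data.

```python
# Parsing a minimal RSS 2.0 fragment to extract the item titles a widget
# would display, without any full-page reload or third-party library.
import xml.etree.ElementTree as ET

rss = """<rss version="2.0"><channel>
  <title>Order updates</title>
  <item><title>Order 42 shipped</title></item>
  <item><title>Order 43 delayed</title></item>
</channel></rss>"""

# iter("item") walks every <item> element; findtext pulls its <title> text
titles = [item.findtext("title") for item in ET.fromstring(rss).iter("item")]
```

Each widget refreshing its own feed this way is exactly the asynchronous, independent delivery the paragraph describes.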

2.2 – And The Winner Is

Had the above exhibition of enterprise and web philosophies been a boxing match, the former would have been knocked out in the first round, with nothing to show for its effort. Clearly, the web has learnt to “float like a butterfly, sting like a bee”, a technique which has allowed it to infiltrate the IT departments of most businesses. IT development has suddenly shifted gears and moved into a new era, a dramatic change which has been led by the web at its forefront, at a speed which the enterprises are finding difficult to keep up with. Never before has the world witnessed innovation at a rate so explosive, fuelled by a community so expansive, resulting in the realization of solutions so elite. As large, mighty and powerful as the enterprise systems may be, the ‘K-T extinction event’ which led to the decline of the dinosaurs has proven that failure to adapt quickly to a rapidly evolving ecosystem can lead to the instant collapse of even the most brilliant of species, a theory that may well be applied to the enterprise systems in a few years’ time if they fail to step on the gas soon.

3 – The Road Ahead

3.1 – The Future Looks Cloudy

We have come a long way since the introduction of computers as a way to solve core business problems, with the demands of companies moving well beyond using computers to simply automate repetitive business processes, and the expectations of their customers skyrocketing way higher than receiving efficient services. In this new era of computing, organizations are expected to focus their IT resources on core value addition activities; a task made much harder by the fact that the world is facing extremely hard economic times, with CIOs reducing capital budgets, and CFOs being forced to cut operational expenses. It is this time of need that has led to the rise of a new form of IT service, namely ‘cloud computing’, where SAAS, PAAS, and IAAS attempt to satisfy the demands of the companies and meet the expectations of their customers.

Cloud computing is seen as a gateway for organizations to focus IT on driving the business and not on maintenance, creating new applications with minimal upfront provisioning costs, extending the capabilities of current applications without new infrastructure, increasing system capacity dynamically, and providing a better disaster recovery plan. The sheer amount of transparent ROI made visible by these factors is enabling companies to look past the risks and shift their existing applications to external IAAS offerings, stage the development of new applications on external PAAS offerings, and abandon the development of certain in-house applications in favor of external SAAS offerings. This embrace of public cloud services marks the beginning of the web infiltrating the previously firewall-guarded enterprise.

The cloud brings with it several promises, which shall usher in a new era of computing, the impact of which shall be far greater than the cloud itself. The web based companies offering these cloud services bring with them their expertise in various internet technologies, many of which shall be implemented in their SAAS/PAAS/IAAS offerings, and now be integrated with the enterprise systems opting to use these services. This shall help the enterprises realize the true value of the various web technologies without facing the risk of having to experiment with implementing them themselves. Additionally, the move towards the public cloud shall bring the enterprise closer to the internet, consequently opening up new avenues for the businesses to analyze, and take advantage of, the various opportunities unique to the web culture. It is consequently in the enterprise’s best interest to align their IT technologies with the grain of the web, in order to facilitate easy integration with the various internet services in the near future, which shall allow them to expand their business at an unheard-of speed by exploiting new prospects.

3.2 – The Imminent Merger

Today, most enterprise architects don’t think of an application service’s direct consumption by the outside world as a significant criterion while developing its structure; however the move to the public cloud shall mark a paradigm shift in the thought process which goes into designing even the most backend of applications, due to the endless possibilities available for organizations to grow the business by exposing their services to the end user over the web. This, coupled with the countless pre-existing services available over the internet, which the enterprise systems may consume in order to provide a range of new age value added services to their customers, shall ensure that the enterprises pave the way for the web technologies to be a part of their systems. The coming years shall see the enterprise applications becoming consumable, their services – searchable, their architecture – feather-weight, their delivery – nimble and their culture – collaborative. The IT world has seen many great mergers over the years; however this unification of the enterprise and web worlds shall forever raise the bar, and open up the doors to unimaginable possibilities.

The story of the enterprise and the web is much like a movie script, with two long lost brothers reuniting near the climax to take on the bad guy, and the audience cheering their every move. The enterprise and the web were separated at childhood, raised by separate communities, and went on to develop greatly different personalities. However, after years of living in ignorance, they have now finally rediscovered each other. The end is in sight, the dream is alive, and the whole world is watching, hoping, praying for a happy ending.
