Why standards in information technology are critical

By Mark Skilton, Capgemini

See the next article in Mark’s series on standards here.

Information technology as an industry is at the center of communications and the exchange of information, and increasingly of fully digitized products and services. Its span of influence and control is enabled by protocols, syntax and nomenclatures that are defined and shared between consumers and providers. The Internet is testament to this: HTTP, TCP/IP, HTML, URLs, MAC addressing and XML have become universal languages that enable its very existence. These “universal common standards” are an example of homogeneous, all-pervasive standards on which resources and connections are constructed and used.
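To make the idea of a “universal language” concrete, the short Python sketch below (illustrative only, not from the original article; example.com and port 80 are placeholder assumptions) sends a hand-written HTTP request over a TCP/IP socket. Because both protocols are open standards, the same few lines can talk to essentially any web server in the world.

    # Illustrative sketch: a raw HTTP GET over TCP/IP.
    # Assumption: "example.com" is a placeholder; any standards-compliant
    # web server listening on port 80 would answer the same request.
    import socket

    HOST = "example.com"

    with socket.create_connection((HOST, 80)) as conn:    # TCP/IP standard
        request = (
            f"GET / HTTP/1.1\r\n"
            f"Host: {HOST}\r\n"
            "Connection: close\r\n\r\n"                    # HTTP standard syntax
        )
        conn.sendall(request.encode("ascii"))
        response = b""
        while chunk := conn.recv(4096):
            response += chunk

    print(response.split(b"\r\n")[0].decode())             # e.g. "HTTP/1.1 200 OK"

Any client written against these standards, in any language and on any hardware, could issue the same request and be understood.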

These “building blocks” are a necessary foundation for more advanced languages and exchange interactions to become possible. It can be argued that every new technology advance needs a new language to express and drive it. Prior to the Internet, earlier standards for time-shared mainframes, virtual memory, the ISA chip architecture and fiber optics established scale and growing capacity to carry out ever more complex tasks, but there were simply no universal protocol-based standards that could support a huge network of wired and wireless communications. Commercial-scale computing was locked and limited inside mainframes and PCs.

With federated distributed computing standards, all that changed. The client-server era enabled clustered intranets and peer-to-peer networks. Email exchange, web access and database access evolved to span multiple computers and to connect groups of computers for shared resource services. The web browser, running as a client program on the user’s computer, gives access to information on any web server in the world. So standards come and go, and evolve in cycles as existing technology matures and new technologies and capabilities emerge, much like the cycles of innovation described in “The Machine That Changed the World” by James Womack (1990), “Clockspeed” by Charles Fine (1998) and “The Innovator’s Dilemma” by Clayton Christensen (1997).
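To complement the client sketch above, here is a minimal server-side illustration (again an assumption-laden sketch, not from the article; port 8000 is arbitrary) of the client-server pattern: a tiny web server built on Python’s standard http.server module that any browser or HTTP client can reach, precisely because both sides speak standard HTTP.

    # Illustrative sketch of the server half of the client-server pattern.
    # Assumption: port 8000 is free on the local machine.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class HelloHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Answer any GET request with a small HTML page.
            body = b"<html><body><h1>Served over standard HTTP</h1></body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Any browser pointed at http://localhost:8000/ can read this page,
        # regardless of vendor, operating system or hardware.
        HTTPServer(("", 8000), HelloHandler).serve_forever()

The point is not the code itself but the interoperability: client and server can be written by different parties, in different languages, and still exchange information because they agree on the standard.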

The challenge is to position standards, and the policies for using them, in a way that establishes and enables products, services and markets to be created and developed. The Open Group does just that.

Mark Skilton will be presenting on “Building A Cloud Computing Roadmap View To Your Enterprise Planning” at The Open Group Conference, Austin, July 18-22. Join us for best practices and case studies on Enterprise Architecture, Cloud, Security and more, presented by preeminent thought leaders in the industry.

Mark Skilton, Director, Capgemini, is the Co-Chair of The Open Group Cloud Computing Work Group. He has been involved in advising clients and developing strategic portfolio services in Cloud Computing and business transformation. His recent contributions include widely syndicated Return on Investment models for Cloud Computing that achieved 50,000 hits on CIO.com and appeared in the British Computer Society 2010 Annual Review. His current activities include the development of new Cloud Computing model standards and best practices on the impact of Cloud Computing on outsourcing and off-shoring models; he also contributed to the second edition of the Handbook of Global Outsourcing and Off-shoring through his involvement with the Warwick Business School UK Specialist Masters Degree Program in Information Systems Management.

4 comments

  1. IT plays a huge role in any successful business, especially now that so many companies have taken their business online, with websites and social pages that represent their image.

    In all this, IT consulting companies play a big role in making sure everything goes as smoothly as possible for their clients. If the clients are happy, they’re happy.
