Tag Archives: Cloud Computing Work Group

Future Technologies

By Dave Lounsbury, The Open Group

The Open Group is looking toward the future – what will happen in the next five to ten years?

Those who know us think of The Open Group as being all about consensus, creating standards that are useful to the buy and supply side by creating a stable representation of industry experience – and they would be right. But in order to form this consensus, we must keep an eye on the horizon for areas we should be talking about now. The Open Group needs to watch the future in order to keep pace with businesses looking to gain business advantage by incorporating emerging technologies. According to the McKinsey Global Institute[1], “leaders need to plan for a range of scenarios, abandoning assumptions about where competition and risk could come from and not to be afraid to look beyond long-established models.”

To make sure we have this perspective, The Open Group has started a series of Future Technologies workshops. We initiated this at The Open Group Conference in Philadelphia with the goal of identifying emerging business and technical trends that change the shape of enterprise IT.  What are the potential disruptors? How should we be preparing?

As always at The Open Group, we look to our membership to guide us. We assembled a fantastic panel of experts on the topic who offered up insights into the future:

  • Dr. William Lafontaine, VP High Performance Computing, Analytics & Cognitive Markets at IBM Research: Global Technology Outlook 2013.
  • Mike Walker, Strategy and Enterprise Architecture Advisor at HP: An Enterprise Architect’s Journey to 2020.

If you were not able to join us in Philadelphia, you can view the Livestream session on-demand.

Dr. William Lafontaine shared aspects of the company’s Global Technology Outlook 2013, naming the top trends that the company is keeping top of mind, starting with a confluence of social, mobile, analytics and cloud.

According to Lafontaine and his colleagues, businesses must prepare for not “mobile also” but “mobile first.” In fact, there will be companies that will exist in a mobile-only environment.

  • Growing scale/lower barrier of entry – More data is being created, but more people are also able to create ways of taking advantage of it, such as companies that excel at personal interfaces. Multimedia analytics will become a growing concern for businesses receiving swells of information, video and images.
  • Increasing complexity – The confluence of Social, Mobile, Cloud and Big Data / Analytics will result in masses of data coming from newer, more “complex” places, such as scanners, mobile devices and other “Internet of Things” endpoints. Yet these complex and varied streams of data are more consumable and will have an end product that is more easily delivered to clients or users. Smaller businesses are also moving closer toward enterprise complexity. For example, when you swipe your credit card, you will also be shown additional purchasing opportunities based on your past spending habits. These can range from alerts about nearby coffee shops that serve your favorite tea to local bookstores that sell mysteries or whatever your favorite genre may be.
  • Fast pace – According to Lafontaine, ideas will be coming to market faster than ever. He introduced the concept of the Minimum Buyable Product: take an idea (sometimes barely formed) to inventors to test its capabilities and evaluate it as quickly as possible. Processes that once took months or years can now take weeks. Lafontaine used the MOOC innovator Coursera as an example: eighteen months ago, it had no clients and existed in no countries; now it serves over 4 million students in over 29 countries. Deployment of open APIs will become a strategic tool for the creation of value.
  • Contextual overload – Businesses have more data than they know what to do with: our likes and dislikes, how we like to engage with our mobile devices, our ages and our locations, along with traditional data of record. Over the next five years, businesses will be attempting to make sense of it all.
  • Machine learning – Cognitive systems will form the “third era” of computing. We will see businesses using machines capable of complex reasoning and interaction to extend human cognition. Examples include a “medical sieve” for medical imaging diagnosis, systems that suggest defense and prosecution arguments to legal firms, and next-generation call centers.
  • IT shops need to be run as a business – Mike Walker spoke about how the business of IT is fundamentally changing and that end-consumers are driving corporate behaviors.  Expectations have changed and the bar has been raised.  The tolerance for failure is low and getting lower.  It is no longer acceptable to tell end-consumers that they will be receiving the latest product in a year.  Because customers want their products faster, EAs and businesses will have to react in creative ways.
  • Build a BRIC house: According to Forrester, $2.1 trillion will be spent on IT in 2013 with “apps and the US leading the charge.” Walker emphasized the importance of building information systems, products and services that support the BRIC areas of the world (Brazil, Russia, India and China) since they comprise nearly a third of the global GDP. Hewlett-Packard is banking big on “The New Style of IT”: Cloud, risk management and security and information management.  This is the future of business and IT, says Meg Whitman, CEO and president of HP. All of the company’s products and services presently pivot around these three concepts.
  • IT is the business: Gartner found that 67% of all EA organizations are either starting (39%), restarting (7%) or renewing (21%). There’s a shift from legacy EA, with 80% of organizations focused on how they can leverage EA to either align business and IT standards (25%), deliver strategic business and IT value (39%) or enable major business transformation (16%).

Good as these views are, they only represent two data points on a line that The Open Group wants to draw out toward the end of the decade. So we will be continuing these Future Technologies sessions to gather additional views, with the next session being held at The Open Group London Conference in October.  Please join us there! We’d also like to get your input on this blog.  Please post your thoughts on:

  • Perspectives on what business and technology trends will impact IT and EA in the next 5-10 years
  • Points of potential disruption – what will change the way we do business?
  • What actions should we be taking now to prepare for this future?

[1] McKinsey Global Institute, Disruptive technologies: Advances that will transform life, business, and the global economy. May 2013

Dave Lounsbury is The Open Group’s Chief Technology Officer, previously VP of Collaboration Services. Dave holds three U.S. patents and is based in the U.S.

1 Comment

Filed under Cloud, Enterprise Architecture, Future Technologies, Open Platform 3.0

Thinking About Big Data

By Dave Lounsbury, The Open Group

“We cannot solve our problems with the same level of thinking that created them.”

- Albert Einstein

The growing consumerization of technology and convergence of technologies such as the “Internet of Things”, social networks and mobile devices are causing big changes for enterprises and the marketplace. They are also generating massive amounts of data related to behavior, environment, location, buying patterns and more.

Having massive amounts of data readily available is invaluable. More data means greater insight, which leads to more informed decision-making. So far, we are keeping ahead of this data through smarter analytics and improvements in the way we handle it. The question is, how long can we keep up? The rate of data production is increasing; as an example, an IDC report[1] predicts that the production of data will increase 50X in the coming decade. To magnify this problem, there’s an accompanying explosion of data about the data – cataloging information, metadata, and the results of analytics are all data in themselves. At the same time, data scientists and engineers who can deal with such data are already a scarce commodity, and the number of such people is expected to grow only by 1.5X in the same period.

It isn’t hard to draw the curve. Turning data into actionable insight is going to be a challenge – data flow is accelerating at a faster rate than the available humans can absorb, and our databases and data analytic systems can only help so much.
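Drawing that curve takes only a line of arithmetic. Using the IDC figures cited above as the only inputs (data up 50X over the decade, data scientists up only 1.5X), the amount of data each specialist must cover grows by their ratio:

```python
# Inputs taken from the IDC projection cited above.
data_growth = 50.0    # growth in data volume over the decade
talent_growth = 1.5   # growth in the pool of data scientists

# Data per available data scientist grows by the ratio of the two.
data_per_person_growth = data_growth / talent_growth
print(f"Data per data scientist grows roughly {data_per_person_growth:.0f}X")
```

In other words, each data professional would face roughly 33 times as much data by the end of the decade – the gap that smarter tooling has to close.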

Markets never leave gaps like this unfilled, and because of this we should expect to see a fundamental shift in the IT tools we use to deal with the growing tide of data. In order to solve the challenges of managing data with the volume, variety and velocities we expect, we will need to teach machines to do more of the analysis for us and help to make the best use of scarce human talents.

The Study of Machine Learning

Machine Learning, sometimes called “cognitive computing”[2] or “intelligent computing”, is the study of building computers with the capability to learn and perform tasks based on experience. Experience in this context includes looking at vast data sets, using multiple “senses” or types of media, recognizing patterns from past history or precedent, and extrapolating this information to reason about the problem at hand. An example of machine learning currently underway in the healthcare sector is medical decision aids that learn to predict therapies or to help with patient management, based on correlating a vast body of medical and drug experience data with information about the patients under treatment.
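The “reason from precedent” idea can be sketched with a nearest-neighbour classifier, one of the simplest machine learning techniques: predict an outcome for a new case from the most similar past cases. The toy “patient” records below are invented purely for illustration, not drawn from any real system:

```python
import math
from collections import Counter

def knn_predict(history, new_case, k=3):
    """Predict a label for new_case from the k most similar past cases.

    history: list of (feature_vector, label) pairs drawn from past experience.
    """
    by_distance = sorted(
        history,
        key=lambda case: math.dist(case[0], new_case),
    )
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Invented 'patient' records: (age, severity score) -> therapy that worked.
history = [
    ((25, 1.0), "therapy-A"),
    ((30, 1.5), "therapy-A"),
    ((60, 7.0), "therapy-B"),
    ((65, 8.0), "therapy-B"),
]
print(knn_predict(history, (62, 7.5)))  # -> therapy-B
```

Real medical decision aids correlate vastly larger bodies of experience data, but the principle – generalizing from recorded precedent rather than from hand-written rules – is the same.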

A well-known example of this is Watson, a machine learning system IBM unveiled a few years ago. While Watson is best known for winning Jeopardy, that was just the beginning. IBM has since built six Watsons to assist with their primary objective: to help health care professionals find answers to complex medical questions and help with patient management[3]. Watson’s sophistication is a direct response to all the data activity going on around us. Watson of course isn’t the only example in this field, with others ranging from Apple’s Siri intelligent voice-operated assistant to DARPA’s SyNAPSE program[4].

Evolution of the Technological Landscape

As the consumerization of technology continues to grow and converge, our way of constructing business models and systems needs to evolve as well. We need to let data drive the business process, and incorporate intelligent machines like Watson into our infrastructure to help us turn data into actionable results.

There is an opportunity for information technology and companies to help drive this forward. However, in order for us to properly teach computers how to learn, we first need to understand the environments in which they will be asked to learn – Cloud, Big Data, etc. Ultimately, though, any full consideration of these problems will require a look at how machine learning can help us make decisions – machine learning systems may be the real platform in these areas.

The Open Group is already laying the foundation to help organizations take advantage of these convergent technologies with its new forum, Platform 3.0. The forum brings together a community of industry thought leaders to analyze the use of Cloud, Social, Mobile computing and Big Data, and describe the business benefits that enterprises can gain from them. We’ll also be looking at trends like these at our Philadelphia conference this summer.  Please join us in the discussion.


2 Comments

Filed under Cloud, Cloud/SOA, Data management, Enterprise Architecture

Welcome to Platform 3.0

By Dave Lounsbury, The Open Group

The space around us is forever changing.

As I write now, the planet’s molten core is in motion far beneath my feet, and way above my head, our atmosphere and the universe are in constant flux too.

Man makes his own changes as well. Innovation in technology and business constantly creates new ways to work together and create economic value.

Over the past few years, we have witnessed the birth, evolution and use of a number of such changes, each of which has the potential to fundamentally change the way we engage with one another. These include: Mobile, Social (both Social Networks and Social Enterprise), Big Data, the Internet of Things, Cloud Computing as well as devices and application architectures.

Now however, these once disparate forces are converging – united by the growing Consumerization of Technology and the resulting evolution in user behavior – to create new business models and system designs.

You can see evidence of this convergence of trends in the following key architectural shifts:

  • Exponential growth of data inside and outside organizations converging with end point usage in mobile devices, analytics, embedded technology and Cloud hosted environments
  • The speed of technology and business innovation is rapidly changing the focus from asset ownership to the usage of services, and driving more agile architecture models that can adapt to new technology changes and offerings
  • New value networks resulting from the interaction and growth of the Internet of Things and multi-devices and connectivity targeting specific vertical industry sector needs
  • Performance and security implications involving cross technology platforms, cache and bandwidth strategies, existing across federated environments
  • Social behavior and market channel changes resulting in multiple ways to search and select IT and business services
  • Cross device and user-centric driven service design and mainstream use of online marketplace platforms for a growing range of services

The analyst community was the first to recognize and define this evolution in the technological landscape which we are calling Platform 3.0.

At Gartner’s Symposium conference, the keynote touched on the emergence of what it called a ‘Nexus of Forces,’ warning that it would soon render existing Business Architectures “obsolete.”

However, for those organizations who could get it right, Gartner called the Nexus a “key differentiator of business and technology management” and recommended that “strategizing on how to take advantage of the Nexus should be a top priority for companies around the world.”[i]

Similarly, according to IDC Chief Analyst, Frank Gens, “Vendors’ ability (or inability) to compete on the 3rd Platform [Platform 3.0] right now — even at the risk of cannibalizing their own 2nd Platform franchises — will reorder leadership ranks within the IT market and, ultimately, every industry that uses IT.”[ii]

Of course, while organizations will be looking to make use of Platform 3.0 to create innovative new products and services, this will not be an easy transition for many. Significantly, there will be architectural issues and structural considerations to overcome when using and combining these convergent technologies. Accomplishing this will in turn require cooperation among suppliers and users of these products and services.

That is why we’re excited to announce the formation of a new – as yet unnamed – forum, specifically designed to advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to take advantage of these convergent technologies. This will be accomplished by identifying a set of new platform capabilities, and architecting and standardizing an IT platform by which enterprises can reap the business benefits of Platform 3.0. It is our intention that these capabilities will enable enterprises to:

  • Process data “in the Cloud”
  • Integrate mobile devices with enterprise computing
  • Incorporate new sources of data, including social media and sensors in the Internet of Things
  • Manage and share data that has high volume, velocity, variety and distribution
  • Turn the data into usable information through correlation, fusion, analysis and visualization

The forum will bring together a community of industry experts and thought leaders whose purpose it will be to meet these goals, initiate and manage programs to support them, and promote the results. Given its scope, the forum is expected to leverage work already underway in this area by The Open Group’s existing Cloud Work Group, and to coordinate with other forums on overlapping or cross-cutting activities.

Looking ahead, the first deliverables will analyze the use of Cloud, Social, Mobile Computing and Big Data, and describe the business benefits that enterprises can gain from them. The forum will then proceed to describe the new IT platform in the light of this analysis.

If this area is as exciting and important to you and your organization as it is to us, please join us in the discussion. We will use this blog and other communication channels of The Open Group to let you know how you can participate, and we’d of course welcome your comments and thoughts on this idea.

21 Comments

Filed under Enterprise Architecture, Professional Development

How Should we use Cloud?

By Chris Harding, The Open Group

How should we use Cloud? This is the key question at the start of 2013.

The Open Group® conferences in recent years have thrown light on, “What is Cloud?” and, “Should we use Cloud?” It is time to move on.

Cloud as a Distributed Processing Platform

The question is an interesting one, because the answer is not necessarily, “Use Cloud resources just as you would use in-house resources.” Of course, you can use Cloud processing and storage to replace or supplement what you have in-house, and many companies are doing just that. You can also use the Cloud as a distributed computing platform, on which a single application instance can use multiple processing and storage resources, perhaps spread across many countries.

It’s a bit like contracting a company to do a job, rather than hiring a set of people. If you hire a set of people, you have to worry about who will do what when. Contract a company, and all that is taken care of. The company assembles the right people, schedules their work, finds replacements in case of sickness, and moves them on to other things when their contribution is complete.

This doesn’t only make things easier, it also enables you to tackle bigger jobs. Big Data is the latest technical phenomenon, and it can be processed effectively by parceling the work out to multiple computers. Cloud providers are beginning to make the tools to do this available, using distributed file systems and map-reduce. We do not yet have “Distributed Processing as a Service” – but that will surely come.
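The map-reduce pattern mentioned above splits a job into a map step that can run on many machines at once and a reduce step that merges the partial results. A toy single-machine word count sketches the idea (purely illustrative – real Cloud tools distribute these phases across a cluster and a distributed file system):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map step: emit a (word, 1) pair for every word in one document chunk.
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    # Reduce step: group the pairs by word and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Each "document" could live on a different machine in a real cluster.
documents = ["big data big analytics", "big cloud"]
word_counts = reduce_phase(chain.from_iterable(map_phase(d) for d in documents))
print(word_counts)  # {'big': 3, 'data': 1, 'analytics': 1, 'cloud': 1}
```

Because each map call touches only its own chunk, the map phase parallelizes freely; the reduce phase is the only place where results from different machines must meet.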

Distributed Computing at the Conference

Big Data is the main theme of the Newport Beach conference. The plenary sessions have keynote presentations on Big Data, including the crucial aspect of security, and there is a Big Data track that explores in depth its use in Enterprise Architecture.

There are also Cloud tracks that explore the business aspects of using Cloud and the use of Cloud in Enterprise Architecture, including a session on its use for Big Data.

Service orientation is generally accepted as a sound underlying principle for systems using both Cloud and in-house resources. The Service Oriented Architecture (SOA) movement focused initially on its application within the enterprise. We are now looking to apply it to distributed systems of all kinds. This may require changes to specific technology and interfaces, but not to the fundamental SOA approach. The Distributed Services Architecture track contains presentations on the theory and practice of SOA.

Distributed Computing Work in The Open Group

Many of the conference presentations are based on work done by Open Group members in the Cloud Computing, SOA and Semantic Interoperability Work Groups, and in the Architecture, Security and Jericho Forums. The Open Group enables people to come together to develop standards and best practices for the benefit of the architecture community. We have active Work Groups and Forums working on artifacts such as a Cloud Computing Reference Architecture, a Cloud Portability and Interoperability Guide, and a Guide to the use of TOGAF® framework in Cloud Ecosystems.

The Open Group Conference in Newport Beach

Our conferences provide an opportunity for members and non-members to discuss ideas together. This happens not only in presentations and workshops, but also in informal discussions during breaks and after the conference sessions. These discussions benefit future work at The Open Group. They also benefit the participants directly, enabling them to bring to their enterprises ideas that they have sounded out with their peers. People from other companies can often bring new perspectives.

Most enterprises now know what Cloud is. Many have identified specific opportunities where they will use it. The challenge now for enterprise architects is determining how best to do this, either by replacing in-house systems, or by using the Cloud’s potential for distributed processing. This is the question for discussion at The Open Group Conference in Newport Beach. I’m looking forward to an interesting conference!

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.

1 Comment

Filed under Cloud, Conference

2013 Open Group Predictions, Vol. 1

By The Open Group

A big thank you to all of our members and staff who have made 2012 another great year for The Open Group. There were many notable achievements this year, including the release of ArchiMate 2.0, the launch of the Future Airborne Capability Environment (FACE™) Technical Standard and the publication of the SOA Reference Architecture (SOA RA) and the Service-Oriented Cloud Computing Infrastructure Framework (SOCCI).

As we wrap up 2012, we couldn’t help but look towards what is to come in 2013 for The Open Group and the industries we’re a part of. Without further ado, here they are:

Big Data
By Dave Lounsbury, Chief Technical Officer

Big Data is on top of everyone’s mind these days. Consumerization, mobile smart devices, and expanding retail and sensor networks are generating massive amounts of data on behavior, environment, location, buying patterns and more – producing what is being called “Big Data”. In addition, as the use of personal devices and social networks continues to gain popularity, so does the expectation to have access to such data and the computational power to use it anytime, anywhere. Organizations will turn to IT to restructure their services so they meet the growing expectation of control and access to data.

Organizations must embrace Big Data to drive their decision-making and to provide the optimal mix of services to customers. Big Data is becoming so big that the challenge is how to use it to make timely decisions. IT naturally focuses on collecting data, so Big Data itself is not the issue. To allow humans to keep on top of this flood of data, industry will need to move away from programming computers for storing and processing data and toward teaching computers how to assess large amounts of uncorrelated data and draw inferences from this data on their own. We also need to start thinking about the skills that people need in the IT world to not only handle Big Data, but to make it actionable. Do we need “Data Architects” and if so, what would their role be?
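One concrete form of “drawing inferences on their own” is unsupervised clustering, where the machine groups raw observations without being given the categories in advance. A minimal k-means sketch in pure Python (the data points are invented for illustration):

```python
import math

def kmeans(points, k, iterations=10):
    """Group points into k clusters without any human-supplied labels."""
    # For simplicity, seed the centers with the first k points;
    # real implementations choose smarter random starting points.
    centers = list(points[:k])
    for _ in range(iterations):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centers[i]))
            clusters[nearest].append(p)
        # Update step: move each center to the mean of its cluster.
        centers = [
            tuple(sum(c) / len(cluster) for c in zip(*cluster)) if cluster else centers[i]
            for i, cluster in enumerate(clusters)
        ]
    return centers

# Two obvious groups of 2-D observations; the algorithm separates them unaided.
points = [(1, 1), (1.2, 0.8), (0.9, 1.1), (8, 8), (8.2, 7.9), (7.8, 8.1)]
centers = sorted(kmeans(points, k=2))
print(centers)  # roughly [(1.03, 0.97), (8.0, 8.0)]
```

The same assignment-and-update loop scales to millions of uncorrelated records; the machine proposes the groupings, leaving the scarce human analysts to interpret them.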

In 2013, we will see the beginning of the Intellectual Computing era. IT will play an essential role in this new era and will need to help enterprises look at uncorrelated data to find the answer.

Security

By Jim Hietala, Vice President of Security

As 2012 comes to a close, some of the big developments in security over the past year include:

  • Continuation of hacktivism attacks.
  • Increase of significant and persistent threats targeting government and large enterprises.
  • The notable U.S. National Strategy for Trusted Identities in Cyberspace started to make progress in the second half of the year in terms of industry and government movement to address fundamental security issues.
  • Security breaches were discovered by third parties, where the organizations affected had no idea that they were breached. Data from the 2012 Verizon report suggests that 92 percent of companies breached were notified by a third party.
  • Acknowledgement from senior U.S. cybersecurity professionals that organizations fall into two groups: those that know they’ve been penetrated, and those that have been penetrated, but don’t yet know it.

In 2013, we’ll no doubt see more of the same on the attack front, plus increased focus on mobile attack vectors. We’ll also see more focus on detective security controls, reflecting greater awareness of the threat and on the reality that many large organizations have already been penetrated, and therefore responding appropriately requires far more attention on detection and incident response.

We’ll also likely see the U.S. move forward with cybersecurity guidance from the executive branch, in the form of a Presidential directive. New national cybersecurity legislation seemed to come close to happening in 2012, and when it failed to become a reality, there were many indications that the administration would make something happen by executive order.

Enterprise Architecture

By Leonard Fehskens, Vice President of Skills and Capabilities

Preparatory to my looking back at 2012 and forward to 2013, I reviewed what I wrote last year about 2011 and 2012.

Probably the most significant thing from my perspective is that so little has changed. In fact, I think in many respects the confusion about what Enterprise Architecture (EA) and Business Architecture are about has gotten worse.

The stress within the EA community continues to grow as both the demands being placed on it and the diversity of opinion within it increase. This year, I saw a lot more concern about the value proposition for EA, but not a lot of (read “almost no”) convergence on what that value proposition is.

Last year I wrote “As I expected at this time last year, the conventional wisdom about Enterprise Architecture continues to spin its wheels.”  No need to change a word of that. What little progress at the leading edge was made in 2011 seems to have had no effect in 2012. I think this is largely a consequence of the dust thrown in the eyes of the community by the ascendance of the concept of “Business Architecture,” which is still struggling to define itself.  Business Architecture seems to me to have supplanted last year’s infatuation with “enterprise transformation” as the means of compensating for the EA community’s entrenched IT-centric perspective.

I think this trend and the quest for a value proposition are symptomatic of the same thing — the urgent need for Enterprise Architecture to make its case to its stakeholder community, especially to the people who are paying the bills. Something I saw in 2011 that became almost epidemic in 2012 is conflation — the inclusion under the Enterprise Architecture umbrella of nearly anything with the slightest taste of “business” to it. This has had the unfortunate effect of further obscuring the unique contribution of Enterprise Architecture, which is to bring architectural thinking to bear on the design of human enterprise.

So, while I’m not quite mired in the slough of despond, I am discouraged by the community’s inability to advance the state of the art. In a private communication to some colleagues I wrote, “the conventional wisdom on EA is at about the same state of maturity as 14th century cosmology. It is obvious to even the most casual observer that the earth is both flat and the center of the universe. We debate what happens when you fall off the edge of the Earth, and is the flat earth carried on the back of a turtle or an elephant?  Does the walking of the turtle or elephant rotate the crystalline sphere of the heavens, or does the rotation of the sphere require the turtlephant to walk to keep the earth level?  These are obviously the questions we need to answer.”

Cloud

By Chris Harding, Director of Interoperability

2012 has seen the establishment of Cloud Computing as a mainstream resource for enterprise architects and the emergence of Big Data as the latest hot topic, likely to be mainstream for the future. Meanwhile, Service-Oriented Architecture (SOA) has kept its position as an architectural style of choice for delivering distributed solutions, and the move to ever more powerful mobile devices continues. These trends have been reflected in the activities of our Cloud Computing Work Group and in the continuing support by members of our SOA work.

The use of Cloud, Mobile Computing, and Big Data to deliver on-line systems that are available anywhere at any time is setting a new norm for customer expectations. In 2013, we will see the development of Enterprise Architecture practice to ensure the consistent delivery of these systems by IT professionals, and to support the evolution of creative new computing solutions.

IT systems are there to enable the business to operate more effectively. Customers expect constant on-line access through mobile and other devices. Business organizations work better when they focus on their core capabilities, and let external service providers take care of the rest. On-line data is a huge resource, so far largely untapped. Distributed, Cloud-enabled systems, using Big Data, and architected on service-oriented principles, are the best enablers of effective business operations. There will be a convergence of SOA, Mobility, Cloud Computing, and Big Data as they are seen from the overall perspective of the enterprise architect.

Within The Open Group, the SOA and Cloud Work Groups will continue their individual work, and will collaborate with other forums and work groups, and with outside organizations, to foster the convergence of IT disciplines for distributed computing.

3 Comments

Filed under Business Architecture, Cloud, Cloud/SOA, Cybersecurity, Enterprise Architecture

San Francisco Conference Observations: Enterprise Transformation, Enterprise Architecture, SOA and a Splash of Cloud Computing

By Chris Harding, The Open Group 

This week I have been at The Open Group conference in San Francisco. The theme was Enterprise Transformation which, in simple terms, means changing how your business works to take advantage of the latest developments in IT.

Evidence of these developments is all around. I took a break and went for coffee and a sandwich, to a little cafe down on Pine and Leavenworth that seemed to be run by and for the Millennial generation. True to type, my server pulled out a cellphone with a device attached through which I swiped my credit card; an app read my screen-scrawled signature and the transaction was complete.

Then dinner. We spoke to the hotel concierge, she tapped a few keys on her terminal and, hey presto, we had a window table at a restaurant on Fisherman’s Wharf. No lengthy phone negotiations with the Maitre d’. We were just connected with the resource that we needed, quickly and efficiently.

The power of ubiquitous technology to transform the enterprise was the theme of the inspirational plenary presentation given by Andy Mulholland, Global CTO at Capgemini. Mobility, the Cloud, and big data are the three powerful technical forces that must be harnessed by the architect to move the business to smarter operation and new markets.

Jeanne Ross of the MIT Sloan School of Management shared her recipe for architecting business success, with examples drawn from several major companies. Indomitable and inimitable, she always challenges her audience to think through the issues. This time we responded with, “Don’t small companies need architecture too?” Of course they do, was the answer, but the architecture of a big corporation is very different from that of a corner cafe.

Corporations don’t come much bigger than Nissan. Celso Guiotoko, Corporate VP and CIO at the Nissan Motor Company, told us how Nissan are using enterprise architecture for business transformation. Highlights included the concept of information capitalization, the rationalization of the application portfolio through SOA and reusable services, and the delivery of technology resource through a private cloud platform.

The set of stimulating plenary presentations on the first day of the conference was completed by Lauren States, VP and CTO Cloud Computing and Growth Initiatives at IBM. Everyone now expects business results from technical change, and there is huge pressure on the people involved to deliver results that meet these expectations. IT enablement is one part of the answer, but it must be matched by business process excellence and values-based culture for real productivity and growth.

My role in The Open Group is to support our work on Cloud Computing and SOA, and these activities took all my attention after the initial plenary. If you had thought, five years ago, that no technical trend could possibly generate more interest and excitement than SOA, Cloud Computing would now be proving you wrong.

But interest in SOA continues, and we had a SOA stream including presentations of forward thinking on how to use SOA to deliver agility, and on SOA governance, as well as presentations describing and explaining the use of key Open Group SOA standards and guides: the Service Integration Maturity Model (OSIMM), the SOA Reference Architecture, and the Guide to using TOGAF for SOA.

We then moved into the Cloud, with a presentation by Mike Walker of Microsoft on why Enterprise Architecture must lead Cloud strategy and planning. The “why” was followed by the “how”: Zapthink’s Jason Bloomberg described Representational State Transfer (REST), which many now see as a key foundational principle for Cloud architecture. But perhaps it is not the only principle; a later presentation suggested a three-tier approach with the client tier, including mobile devices, accessing RESTful information resources through a middle tier of agents that compose resources and carry out transactions (ACT).
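For readers who have not met REST before, its core idea (a uniform interface over uniquely identified resources, with every request self-contained) can be shown in a toy sketch. The Python below is purely illustrative: the class, URIs, and status codes are my own toy model, not any vendor's API.

```python
# Toy illustration of REST's uniform interface: resources are identified
# by URIs and manipulated only through standard verbs, and each request
# carries everything needed to process it (statelessness).

class ResourceStore:
    """In-memory resource store keyed by URI path, e.g. '/orders/42'."""

    def __init__(self):
        self._resources = {}

    def handle(self, method, uri, body=None):
        if method == "GET":
            if uri in self._resources:
                return 200, self._resources[uri]
            return 404, None
        if method == "PUT":
            created = uri not in self._resources
            self._resources[uri] = body
            return (201 if created else 200), body
        if method == "DELETE":
            if uri in self._resources:
                del self._resources[uri]
                return 204, None
            return 404, None
        return 405, None  # verb outside the uniform interface

store = ResourceStore()
print(store.handle("PUT", "/orders/42", {"item": "coffee"}))  # (201, {'item': 'coffee'})
print(store.handle("GET", "/orders/42"))                      # (200, {'item': 'coffee'})
print(store.handle("DELETE", "/orders/42"))                   # (204, None)
print(store.handle("GET", "/orders/42"))                      # (404, None)
```

Whatever sits in a middle tier of agents, the client-facing contract stays this simple: verbs against URIs.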

In the evening we had a CloudCamp, hosted by The Open Group and conducted as a separate event by the CloudCamp organization. The original CloudCamp concept was of an “unconference” where early adopters of Cloud Computing technologies exchange ideas. Its founder, Dave Nielsen, is now planning to set up a demo center where those adopters can experiment with setting up private clouds. This transition from idea to experiment reflects the changing status of mainstream cloud adoption.

The public conference streams were followed by a meeting of the Open Group Cloud Computing Work Group. This is currently pursuing nine separate projects to develop standards and guidance for architects using cloud computing. The meeting in San Francisco focused on one of these – the Cloud Computing Reference Architecture. It compared submissions from five companies, also taking into account ongoing work at the U.S. National Institute of Standards and Technology (NIST), with the aim of creating a base from which to create an Open Group reference architecture for Cloud Computing. This gave a productive finish to a busy week of information gathering and discussion.

Ralph Hitz of Visana, a health insurance company based in Switzerland, made an interesting comment on our reference architecture discussion. He remarked that we were not seeking to change or evolve the NIST service and deployment models. This may seem boring, but it is true, and it is right. Cloud Computing is now where the automobile was in 1920. We are pretty much agreed that it will have four wheels and be powered by gasoline. The business and economic impact is yet to come.

So now I’m on my way to the airport for the flight home. I checked in online, and my boarding pass is on my cellphone. Big companies, as well as small ones, now routinely use mobile technology, and my airline has a frequent-flyer app. It’s just a shame that they can’t manage a decent cup of coffee.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. Before joining The Open Group, he was a consultant, and a designer and development manager of communications software. With a PhD in mathematical logic, he welcomes the current upsurge of interest in semantic technology, and the opportunity to apply logical theory to practical use. He has presented at Open Group and other conferences on a range of topics, and contributes articles to on-line journals. He is a member of the BCS, the IEEE, and the AOGEA, and is a certified TOGAF practitioner.


Filed under Cloud, Cloud/SOA, Conference, Enterprise Architecture, Enterprise Transformation, Service Oriented Architecture, Standards

What does the Amazon EC2 downtime mean?

By Mark Skilton, Capgemini

The announcement of the Amazon EC2 outage in April this year prompts some thoughts on a very high-profile topic in Cloud Computing: how secure and available is your data in the Cloud?

While the outage was more to do with the service level agreement (SLA) for availability of data and services from your Cloud provider, the recent and potentially more concerning theft of Epsilon e-mail data (and, as I write this, the Sony data theft is breaking news) further highlights this big topic in Cloud Computing.

My initial reaction on hearing about the outage was that it was due to over-allocation caused by high demand in the US East region, which led to a cascading system failure. I subsequently read that Amazon attributed it to a network glitch, which triggered automatic re-mirroring of storage volumes that created more replicas than needed, consuming the available Elastic Block Store capacity. This, I theorized, is what created the supply unavailability problem.
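To see how an automated response can turn a glitch into a shortage, here is a thought experiment with made-up numbers (mine, not Amazon's): once pending re-mirroring requests exceed what the control plane can serve per cycle, knock-on timeouts generate new requests faster than old ones complete, and the backlog grows instead of draining.

```python
# Toy model of a "re-mirroring storm"; illustrative numbers only.
# serve_rate: requests the control plane completes per time step.
# knockon_rate: new requests created per pending request per step, as
#               healthy volumes time out against a saturated control plane.

def remirror_backlog(disconnected, spare_slots, serve_rate, knockon_rate, steps):
    """Return the backlog of pending re-mirror requests at each step."""
    backlog = max(0, disconnected - spare_slots)  # spare capacity absorbs some
    history = [backlog]
    for _ in range(steps):
        new_requests = int(backlog * knockon_rate)
        backlog = max(0, backlog + new_requests - serve_rate)
        history.append(backlog)
    return history

# A small failure fits within spare capacity and never queues:
print(remirror_backlog(100, 200, 50, 0.2, 3))   # [0, 0, 0, 0]

# A mass disconnection overwhelms the spare pool, and the feedback loop
# makes the backlog grow each step, a cascade rather than a simple crash:
print(remirror_backlog(1000, 200, 50, 0.2, 5))  # [800, 910, 1042, 1200, 1390, 1618]
```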

From a business perspective, this focuses attention on the risks of relying on a single primary Cloud provider. Businesses like Quora.com and foursquare.com that were affected "live in the Cloud," so backup and secondary Cloud support are clearly important. Some of these are economic decisions, trade-offs between loss of business and the cost of business continuity. The outage highlights the vulnerability of these enterprises, even though a highly successful operator like Amazon makes such an event rare. Consumers of Cloud services should consider mitigating actions such as disruption insurance, secondary backups, and assurances of SLAs, which are largely out of the hands of SMB-market users. One result of Cloud provider outages has been the emergence of a new market, "Cloud backup," which is starting to gain favor with customers and providers as an added level of protection through service failover.

While these are concerning issues, I believe most outage issues can be addressed by exercising due diligence in the procurement and use of any service that involves a third party. I've expanded the definition of due diligence in Cloud Computing to include at least six key processes that any prospective Cloud buyer should be aware of and make contingency plans for, as with any purchase of a business-critical service:

  • Security management
  • Compliance management
  • Service Management (ITSM and License controls)
  • Performance management
  • Account management
  • Ecosystem standards management

I don’t think publishing a bill of rights for consumers is enough to insure against failure. One thing that Cloud Computing design has taught me is that part of the architectural shift brought about by Cloud is the emergence of automation as an implicit part of the operating-model design, there to enable elasticity. Ironically, this automation may have been a factor in the Amazon situation, but overall the benefits of Cloud far outweigh the downsides, which can be re-engineered and resolved.
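To make that concrete, here is a minimal sketch of such an elasticity rule. The thresholds, limits, and function name are illustrative assumptions of mine, not any provider's actual policy.

```python
# Minimal sketch of an automated elasticity rule: no human in the loop.
# All thresholds and limits below are illustrative, not a real provider's.

def scale_decision(current_instances, utilisation,
                   low=0.30, high=0.75, min_instances=1, max_instances=20):
    """Return the instance count after one evaluation cycle."""
    if utilisation > high and current_instances < max_instances:
        return current_instances + 1   # scale out under load
    if utilisation < low and current_instances > min_instances:
        return current_instances - 1   # scale in when idle
    return current_instances           # steady state

print(scale_decision(4, 0.90))  # 5, scale out
print(scale_decision(4, 0.10))  # 3, scale in
print(scale_decision(4, 0.50))  # 4, no change
```

The same rule that keeps capacity matched to demand will happily amplify a fault if the signal it reacts to is itself wrong, which is exactly the irony of automation noted above.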

A useful guide to addressing some of the business impact can be found in a new book on Cloud Computing for Business that The Open Group plans to publish this quarter. The book addresses many of these challenges, explaining the value of Cloud Computing in the language of business; its chapters cover business use of the Cloud, including risk management. Check The Open Group website for more information on The Open Group Cloud Computing Work Group and the Cloud publications in the bookstore at http://www.opengroup.org.

Cloud Computing is a key topic of discussion at The Open Group Conference, London, May 9-13, which is currently underway.

Mark Skilton, Director, Capgemini, is the Co-Chair of The Open Group Cloud Computing Work Group. He has been involved in advising clients and developing strategic portfolio services in Cloud Computing and business transformation. His recent contributions include widely syndicated Return on Investment models for Cloud Computing, which achieved 50,000 hits on CIO.com and appeared in the British Computer Society 2010 Annual Review. His current activities include the development of new Cloud Computing standards and best practices on the impact of Cloud Computing on outsourcing and off-shoring models, and he contributed to the second edition of the Handbook of Global Outsourcing and Off-shoring through his involvement with the Warwick Business School UK Specialist Masters Degree Program in Information Systems Management.


Filed under Cloud/SOA

PODCAST: Examining the current state of Enterprise Architecture with The Open Group’s Steve Nunn

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-Open Group COO Steve Nunn on EA Professional Groups

The following is the transcript of a sponsored podcast panel discussion on the state of EA, from The Open Group Conference, San Diego 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion in conjunction with The Open Group Conference held in San Diego, the week of February 7, 2011. We’re here with an executive from The Open Group to examine the current state of enterprise architecture (EA). We’ll hear about how EA is becoming more business-oriented and how organizing groups for the EA profession are consolidating and adjusting. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

We’ll get an update on The Association of Open Group Enterprise Architects (AOGEA) and learn more about its recent merger with the Association of Enterprise Architects. What’s more, we’ll get an assessment of the current maturity levels and overall professionalism drive of EA, and we’re going to learn more about what to expect from the EA field and these organizing groups over the next few years.

Here to help us delve into the current state of EA, please join me now in welcoming Steve Nunn, Chief Operating Officer of The Open Group and CEO of The Association of Open Group Enterprise Architects.

Welcome back, Steve.

Steve Nunn: Hi, Dana. Good to be back.

Gardner: We’re hearing an awful lot these days about EA being dead, outmoded, or somehow out of sync. I know there’s a lot more emphasis on the business issues, rather than just the technical or IT issues, but what’s going on with that? Are we at a point where this topic, this professional category, is in some danger?

Nunn: Absolutely not. EA is very much the thing of the moment, but it’s also something that’s going to be with us for the foreseeable future too. Both inside The Open Group and the AOGEA, we’re seeing significant growth and interest in the area of EA. In the association, it’s individuals becoming certified and wanting to join a professional body for their own purposes and to help the push to professionalize EA.

Within The Open Group, it’s entities and organizations. Whether commercial, government, or academic, they are regularly joining The Open Group Architecture Forum. So, it’s far from dead, and in terms of overall business importance, EA remains directly relevant to the business.

Tomorrow’s plenary session here at the Conference is a good example. It’s about using EA for business transformation. It’s about using EA to tie IT into the business. There is no point in doing IT for IT’s sake. It’s there to support the business, and people are finding that one way of doing that is EA.

Gardner: I would think too, Steve, that some of the major trends around mobile, security, and cyber risk would augment the need for a more holistic governing role, and the architect seems to fit that bill quite nicely. So is there wind in your sails around some of these trends?

Central to the organization

Nunn: Absolutely. We’re seeing increasingly that you can’t just look at EA in some kind of silo. It’s more about how it fits. It’s so central to an organization and the way that organizations are built that it has all of the factors that you mentioned. Security is a good one, as well as cloud. They’re all impacted by EA. EA has a role to play in all of those.

Inside The Open Group, what’s happening is a lot of cross-functional work between the Architecture Forum, the Security Forum, and the Cloud Work Group, which is just recognition of that fact. But the central tool in all of it is EA.

Gardner: In addition to recognizing that the function of the EA is important, you can’t just have people walking in the door and saying, “Well, I’m an enterprise architect.” It’s hard to define the role, but it seems necessary. Tell me about the importance of certification, so that we really know what an enterprise architect is.

Nunn: That’s right. Everyone seems to want to be an enterprise architect or an IT architect right now. It’s that label to have on your business card. What we’re trying to do is separate the true architects from those who merely claim the title, and certification is a key part of that.

If you’re an employer and you’re looking to take somebody on to help in the EA role, then it’s having some means to assess whether somebody really has any experience of EA, whether they know any frameworks, and what projects they’ve led that involve EA. All those things are obviously important to know.

There are various certification programs, particularly in The Open Group, that help with that. The TOGAF® Certification Program is focused on the TOGAF® framework. At the other end of the spectrum is the ITAC Program, which is a skills- and experience-based program that assesses by peer review an individual’s experience in EA.

There are those, there are others out there, and there are more coming. One of the great things we see is the general acceptance of certification as a means to telling the wood from the trees.

Gardner: So, we certainly have a need. We have some major trends that are requiring this role and we have the ability to begin certifying. Looking at this whole professionalism of EA, we also have these organizations. It was three years ago at this very event that the AOGEA was officially launched. Maybe you could tell us what’s happened over the past three years and set the stage for what’s driving the momentum in the organization itself?

Nunn: Three years ago, we launched the association with 700 members. We were delighted to have that many at the start. As we sit here today, we have over 18,000 members. Over that period, we added members through more folks becoming certified, not only through The Open Group’s programs but through other programs as well. For example, we acknowledged the FIAC Certification Program as a valid path to full membership of the association.

We also embraced the Global Enterprise Architecture Organization (GEAO), and those folks, relevant to your earlier question, have a particular business focus. We’ve also embraced the Microsoft Certified Architect individuals. Microsoft stopped its own program about a year ago now, and one of the things it encouraged its certified individuals to do was to join the association. In fact, Microsoft would help them pay to be members of the association, which was good.

So, the growth in membership reflects the interest in the area of EA, and the interest of individuals in advancing their own careers by being part of a profession.

Valuable resource

Enterprise architects are a highly valuable resource inside an organization, and we are promoting that message to the outside world. For our members as individuals, what we’re focusing on is delivering the latest thinking in EA, moving towards best practices and whitepapers, and trying to give them, at this stage, a largely virtual community in which to engage with each other.

Where we have turned it into a real community is through local chapters. We now have about 20 local chapters around the world. The members have formed those. They meet at varying intervals, but the idea is to get face time with each other and talk about issues that concern enterprise architects and the advancement of the profession. It’s all good stuff. It’s growing by the week, by the month, in terms of the number of folks who want to do that. We’re very happy with what has happened in three years.

Gardner: We’ve got a little bit of alphabet soup out there. There are several organizations, several communities, that have evolved around them, but now you are working to bring that somewhat together.

As I alluded to earlier, the AOGEA has just announced its merger with the Association of Enterprise Architects (AEA). What’s the difference now? How does that shape up? Is this simply a melding of the two or is there something more to it?

Nunn: Well, it is certainly a melding of the two. The two organizations actually became one in late fall last year, and obviously we have the usual post-merger integration things to take care of.

But I think it’s not just a melding; the whole is greater than the sum of the parts. We have two different communities. We have the AOGEA folks, who have come primarily through the certification route, and we also have the AEA folks, who haven’t been so focused on certification but bring something very important to the table. They have chapters in different areas than the AOGEA folks, by and large.

Also, they have a highly respected quarterly publication called The Journal of Enterprise Architecture, along the lines of an academic journal but with a leaning towards practitioners as well. The great thing is that it is now a membership benefit for the merged association’s membership of over 18,000, rather than just the subscriber base it had before the merger.

As we develop, we’re getting closer to our goal of being able to promote the profession of EA in a coherent way. There are other groups beyond that, and there are early signs of cooperation and working together to try to achieve one voice for the profession going forward.

Gardner: And this also followed, about a year ago, the GEAO merger with the AOGEA. So, it seems as if we’re getting the definitive global organization, with variability in how it can deal with communities, but also that common central organizing principle. Tell me about this new über organization: what are you going to call it, and what is the reach? How big is it going to be?

Nunn: Well, the first part of that is the easy part. We have consulted the membership multiple times now, and we are going to name the merged organization The Association of Enterprise Architects. That will keep things nice and simple, and that will be the name going forward. So far, it encompasses the GEAO, AOGEA, and AEA. It’s fair to say that, as a membership organization, it is the leading organization for enterprise architects.

Role to play

There are other organizations in the ecosystem who are, for example, advocacy groups, training organizations, or certification groups, and they all have a role to play in the profession. But, where we’re going with AEA in the future is to make that the definitive professional association for enterprise architects. It’s a non-profit 501(c)(6) incorporated organization, which is there to act as the professional body for its members.

Gardner: You have been with The Open Group for well over 15 years now. You’ve seen a lot of the evolution and maturity. Let’s get back to the notion of the enterprise architect as an entity. As you said, we have now had a process where we recognize the need. We’ve got major trends and dynamics in the marketplace. We have organizations that are out there helping to corral people and manage the whole notion of EA better.

What is it about the maturity? Where are we on a scale of 1 to 10? And what does that mean for what is left to go? This isn’t cooked yet; you can’t take it out of the oven quite yet.

Nunn: No, absolutely not. There’s a long way to go, and to measure it on a scale of 1 to 10, I’d like to say higher, but it’s probably about 2 right now, because a lot of the things that need to be done to create a profession are partly done by one group or another, but not done in a unified way or with anything like one voice for the profession.

It’s interesting. We did some research on how long we might expect it to take to achieve the status of a profession. Certainly, in the US at least, the shortest period of time taken so far was 26 years, by librarians, but typically it was closer to 100 years, and the longest was 170-odd years. So, we’re doing pretty well. We’re moving quickly compared to those professions.

We’re trying to do it on a global basis, which to my knowledge is the first time that’s been done for any profession. If anything, that will make things a little more complicated, but I think there is a lot of will in the EA world to make this happen, and a lot of support from all sorts of groups. Judging from the talks we’ve had and the articles we’ve read, the press and analysts are keen to see it happen too. So, where there is a will there is a way. There’s a long way to go, but we’ve made good progress in a short number of years, really.

Gardner: So, there’s a great deal of opportunity coming up. We’ve talked about how this is relevant to the individual. This is something good for their career. They recognize a path where they can be beneficial, appreciated, and valued. But, what’s in it for the enterprise, for the organizations that are trying to run their businesses dealing with a lot of change already? What does a group like the AEA do for them?

Nunn: It’s down to giving them the confidence that the folks that they are hiring or the folks that they are developing to do EA work within their enterprise are qualified to do that, knowledgeable to do that, or on a path to becoming true professionals in EA.

Certainly if you were hiring into your organization an accountant or a lawyer, you’d be looking to hire one that was a member of the relevant professional body with the appropriate certifications. That’s really what we’re promoting for EA. That’s the role that the association can play.

Confidence building

We will have achieved success with the association when folks hiring enterprise architects will only look at candidates who are members of the association, because doing anything else would be like hiring an unqualified lawyer or accountant. It’s about risk minimization and confidence building in your staff.

Gardner: Now, you wear two hats. You’re the Chief Operating Officer at The Open Group and you’re the CEO of the AEA. How do these two groups relate? You’re in the best position to tell us the relationship, or the context, that listeners should appreciate in terms of how this shakes out.

Nunn: That’s a good point. It’s something I do get asked periodically. The fact is that the association, whilst a separately incorporated body, was started by The Open Group. With these things, somebody has to start them, and The Open Group’s membership made this happen. So the association very much has its roots in The Open Group, and today it still works closely with The Open Group in terms of how it operates, and certain infrastructure for the association is provided by The Open Group.

The support is still there, but increasingly the association is becoming a separate body. I mentioned the journal published in the association’s name; the association has its own website and its own membership.

So, little by little, there will be more separation between the two, but the aims and interests of both are served by EA becoming recognized as a profession. It just couldn’t have happened without The Open Group, and we intend to pay a lot of attention to what goes on inside The Open Group in EA. It’s one of the leading organizations in the EA space, and a group the association would be foolish not to pay attention to, in terms of the direction of certifications and what its members, who are enterprise architects, are saying, experiencing, and needing for the future.

Gardner: So, I suppose we should expect an ongoing partnership between them for quite some time.

Nunn: Absolutely. A very close partnership and along with partnerships with other groups. The association is not looking to take anyone’s turf or tread on anyone’s toes, but to partner with the other groups that are in the ecosystem. Because if we work together, we’ll get to this profession status a lot quicker, but certainly a key partner will be The Open Group.

Gardner: Well, very good. We have been looking at the current state of EA as profession, learning about the organizing groups around that effort and the certification process that they support. We’ve been talking with Steve Nunn, the Chief Operating Officer at The Open Group and also the CEO of the newly named Association of Enterprise Architects. Thank you so much, Steve.

Nunn: Thank you, Dana.

Gardner: You’ve been listening to a sponsored BriefingsDirect podcast coming to you in conjunction with The Open Group Conference here in San Diego, the week of February 7, 2011. This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks for joining, and come back next time.

Copyright The Open Group and Interarbor Solutions, LLC, 2005-2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.


Filed under Enterprise Architecture

Cloud Conference — and Unconference

By Dr. Chris Harding, The Open Group

The Wednesday of The Open Group Conference in San Diego included a formal Cloud Computing conference stream. This was followed in the evening by an unstructured CloudCamp, which made an interesting contrast.

The Cloud Conference Stream

The Cloud conference stream featured presentations on Architecting for Cloud and Cloud Security, and included a panel discussion on the considerations that must be made when choosing a Cloud solution.

In the first session of the morning, we had two presentations on Architecting for Cloud. Both considered TOGAF® as the architectural context. The first, from Stuart Boardman of Getronics, explored the conceptual difference that Cloud makes to enterprise architecture, and the challenge of communicating an architecture vision and discussing the issues with stakeholders in the subsequent TOGAF® phases. The second, from Serge Thorn of Architecting the Enterprise, looked at the considerations in each TOGAF® phase, but in a more specific way. The two presentations showed different approaches to similar subject matter, which proved a very stimulating combination.

This session was followed by a presentation from Steve Else of EA Principals in which he shared several use cases related to Cloud Computing. Using these, he discussed solution architecture considerations, and put forward the lessons learned and some recommendations for more successful planning, decision-making, and execution.

We then had the first of the day’s security-related presentations. It was given by Omkhar Arasaratnam of IBM and Stuart Boardman of Getronics. It summarized the purpose and scope of the Security for the Cloud and SOA project that is being conducted in The Open Group as a joint project of The Open Group’s Cloud Computing Work Group, the SOA Work Group, and Security Forum. Omkhar and Stuart described the usage scenarios that the project team is studying to guide its thinking, the concepts that it is developing, and the conclusions that it has reached so far.

The first session of the afternoon was started by Ed Harrington, of Architecting the Enterprise, who gave an interesting presentation on current U.S. Federal Government thinking on enterprise architecture, showing clearly the importance of Cloud Computing to U.S. Government plans. The U.S. is a leader in the use of IT for government and administration, so we can expect that its conclusions – that Cloud Computing is already making its way into the government computing fabric, and that enterprise architecture, instantiated as SOA and properly governed, will provide the greatest possibility of success in its implementation – will have a global impact.

We then had a panel session, moderated by Dana Gardner with his usual insight and aplomb, that explored the considerations that must be made when choosing a Cloud solution — custom or shrink-wrapped — and whether different forms of Cloud Computing are appropriate to different industry sectors. The panelists represented different players in the Cloud solutions market – customers, providers, and consultants – so that the topic was covered in depth and from a variety of viewpoints. They were Penelope Gordon of 1Plug Corporation, Mark Skilton of Capgemini, Ed Harrington of Architecting the Enterprise, Tom Plunkett of Oracle, and TJ Virdi of the Boeing Company.

In the final session of the conference stream, we returned to the topic of Cloud Security. Paul Simmonds, a member of the Board of the Jericho Forum®, gave an excellent presentation on de-risking the Cloud through effective risk management, in which he explained the approach that the Jericho Forum has developed. The session was then concluded by Andres Kohn of Proofpoint, who addressed the question of whether data can be more secure in the Cloud, considering public, private and hybrid Cloud environments.

CloudCamp

The CloudCamp was hosted by The Open Group but run as a separate event, facilitated by CloudCamp organizer Dave Nielsen. There were around 150-200 participants, including conference delegates and other people from the San Diego area who happened to be interested in the Cloud.

Dave started by going through his definition of Cloud Computing. Perhaps he should have known better – starting a discussion on terminology and definitions can be a dangerous thing to do with an Open Group audience. He quickly got into a good-natured argument from which he eventually emerged a little bloodied, metaphorically speaking, but unbowed.

We then had eight “lightning talks”. These were five-minute presentations covering a wide range of topics, including how to get started with Cloud (Margaret Dawson, Hubspan), supplier/consumer relationship (Brian Loesgen, Microsoft), Cloud-based geographical mapping (Ming-Hsiang Tsou, San Diego University), a patterns-based approach to Cloud (Ken Klingensmith, IBM), efficient large-scale data processing (Alex Rasmussen, San Diego University), using desktop spare capacity as a Cloud resource (Michael Krumpe, Intelligent Technology Integration), cost-effective large-scale data processing in the Cloud (Patrick Salami, Temboo), and Cloud-based voice and data communication (Chris Matthieu, Tropo).

The participants then split into groups to discuss topics proposed by volunteers. There were eight topics altogether. Some of these were simply explanations of particular products or services offered by the volunteers’ companies. Others related to areas of general interest such as data security and access control, life-changing Cloud applications, and success stories relating to “big data”.

I joined the groups discussing Cloud software development on Amazon Web Services (AWS) and Microsoft Azure. These sessions delivered excellent information that would be valuable to anyone wishing to get started in – or already engaged in – software development on these platforms. They also brought out two points of general interest. The first is that the dividing line between IaaS and PaaS can be very thin. AWS and Azure are in theory on opposite sides of this divide; in practice they provide the developer with broadly similar capabilities. The second is that, in practice, your preferred programming language and software environment is likely to be the determining factor in your choice of Cloud development platform.

Overall, the CloudCamp was a great opportunity for people to absorb the language and attitudes of the Cloud community, to discuss ideas, and to pick up specific technical knowledge. It gave an extra dimension to the conference, and we hope that this can be repeated at future events by The Open Group.

Cloud and SOA are topics of discussion at The Open Group Conference, San Diego, which is currently underway.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. Before joining The Open Group, he was a consultant, and a designer and development manager of communications software. With a PhD in mathematical logic, he welcomes the current upsurge of interest in semantic technology, and the opportunity to apply logical theory to practical use. He has presented at Open Group and other conferences on a range of topics, and contributes articles to on-line journals. He is a member of the BCS, the IEEE, and the AOGEA, and is a certified TOGAF® practitioner.


Filed under Cloud/SOA

The golden thread of interoperability

By Dr. Chris Harding, The Open Group

There are so many things going on at every Conference by The Open Group that it is impossible to keep track of all of them, and this week’s Conference in San Diego, California, is no exception. The main themes are Cybersecurity, Enterprise Architecture, SOA and Cloud Computing. Additional topics range from Real-Time and Embedded Systems to Quantum Lifecycle Management. But there are a number of common threads running through all of those themes, relating to value delivered to IT customers through open systems. One of those threads is Interoperability.

Interoperability Panel Session

The interoperability thread showed strongly in several sessions on the opening day of the conference, Monday Feb. 7, starting with a panel session on Interoperability Challenges for 2011 that I was fortunate to have been invited to moderate.

The panelists were Arnold van Overeem of Capgemini, chair of the Architecture Forum’s Interoperability project, Ron Schuldt, the founder of UDEF-IT and chair of the Semantic Interoperability Work Group’s UDEF project, TJ Virdi of Boeing, co-chair of The Open Group Cloud Computing Work Group, and Bob Weisman of Build-the-Vision, chair of The Open Group Architecture Forum’s Information Architecture project. The audience was drawn from many companies, both members and non-members of The Open Group, and made a strong contribution to the debate.

What is interoperability? The panel described several essential characteristics:

  • Systems with different owners and governance models work together;
  • They exchange and understand data automatically;
  • They form an information-sharing environment in which business information is available in the right context, to the right person, and at the right time; and
  • This environment enables processes, as well as information, to be shared.

Interoperability is not just about the IT systems. It is also about the ecosystem of user organizations, and their cultural and legislative context.

Semantics is an important component of interoperability. It is estimated that 65% of data warehouse projects fail because they cannot cope with the huge number of differently defined data elements they must reconcile.
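
The reconciliation problem can be made concrete with a small sketch. In the spirit of the UDEF approach discussed later in this post, each system-local element name is mapped to a shared semantic identifier, after which records from different systems become directly comparable. All the system names, field names, and identifiers below are invented for illustration; they are not actual UDEF IDs.

```python
# Two systems name the same business fact differently. Mapping each
# local element to a shared semantic identifier lets the data be
# reconciled automatically. All names and IDs here are invented.

SEMANTIC_MAP = {
    ("crm", "cust_nm"): "party.name",
    ("billing", "customer_name"): "party.name",
    ("crm", "cust_id"): "party.identifier",
    ("billing", "acct_holder_id"): "party.identifier",
}

def normalize(system: str, record: dict) -> dict:
    """Translate a system-local record into shared semantic terms."""
    return {SEMANTIC_MAP[(system, key)]: value for key, value in record.items()}

crm_rec = {"cust_nm": "Acme Corp", "cust_id": "42"}
billing_rec = {"customer_name": "Acme Corp", "acct_holder_id": "42"}

# Once normalized, records from different systems are directly comparable.
assert normalize("crm", crm_rec) == normalize("billing", billing_rec)
```

The hard part in real deployments is not the lookup but agreeing on, and governing, the shared identifiers – which is exactly the gap a framework like the UDEF aims to fill.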

There is a constant battle for interoperability. Systems that lock customers in by refusing to interoperate with those of other vendors can deliver strong commercial profit. This strategy is locally optimal but globally disastrous; it gives benefits to both vendors and customers in the short term, but leads in the longer term to small markets and siloed systems.  The front line is shifting constantly. There are occasional resounding victories – as with the introduction of the Internet – but the normal state is trench warfare with small and painful gains and losses.

Blame for lack of interoperability is often put on the vendors, but this is not really fair. Vendors must work within what is commercially possible. Customer organizations can help the growth of interoperability by applying pressure and insisting on support for standards. This is in their own interests: integration made necessary by lack of interoperability is currently estimated to account for over 25% of IT spend.

SOA has proved a positive force for interoperability. By embracing SOA, a customer organization can define its data model and service interfaces, and tender for competing solutions that conform to its interfaces and meet its requirements. Services can then become shared processing units within the wider ecosystem.

The latest IT phenomenon is Cloud Computing. This is in some ways reinforcing SOA as an interoperability enabler. Shared services can be available on the Cloud, and the ease of provisioning services in a Cloud environment speeds up the competitive tendering process.

But there is one significant area in which Cloud computing gives cause for concern: lack of interoperability between virtualization products. Virtualization is a core enabling technology for Cloud Computing, and virtualization products form the basis for most private Cloud solutions. These products are generally vendor-specific and without interoperable interfaces, so that it is difficult for a customer organization to combine different virtualization products in a private Cloud, and easy for it to become locked in to a single vendor.
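Until interoperable virtualization interfaces become standard, one common mitigation is for the customer to put a thin provider-neutral layer in front of the vendor-specific APIs, so that application code never depends on a single vendor directly. The sketch below illustrates the idea only; the `Hypervisor` interface and the vendor classes are invented, and a real adapter would call each vendor's proprietary API where the comments indicate.

```python
# Illustrative sketch: a provider-neutral interface in front of
# vendor-specific virtualization APIs, one way customers limit lock-in
# while standards mature. All classes and methods here are invented.

from abc import ABC, abstractmethod

class Hypervisor(ABC):
    @abstractmethod
    def create_vm(self, name: str, cpus: int, ram_mb: int) -> str: ...

class VendorA(Hypervisor):
    def create_vm(self, name, cpus, ram_mb):
        # a real adapter would call vendor A's proprietary API here
        return f"vendorA:{name}"

class VendorB(Hypervisor):
    def create_vm(self, name, cpus, ram_mb):
        # a real adapter would call vendor B's proprietary API here
        return f"vendorB:{name}"

def provision(h: Hypervisor, name: str) -> str:
    # application code depends only on the neutral interface
    return h.create_vm(name, cpus=2, ram_mb=4096)

print(provision(VendorA(), "db01"))   # vendorA:db01
print(provision(VendorB(), "db01"))   # vendorB:db01
```

The adapter layer does not remove the lock-in risk – it only contains it in one place – which is why an agreed standard interface remains the better long-term answer.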

There is a need for an overall interoperability framework within which standards can be positioned, to help customers express their interoperability requirements effectively. This framework should address cultural and legal aspects, and architectural maturity, as well as purely technical aspects. Semantics will be a crucial element.

Such a framework could assist the development of interoperable ecosystems, involving multiple organizations. But it will also help the development of architectures for interoperability within individual organizations – and this is perhaps of more immediate concern.

The Open Group can play an important role in the development of this framework, and in establishing it with customers and vendors.

SOA/TOGAF Practical Guide

SOA is an interoperability enabler, but establishing SOA within an enterprise is not easy to do. There are many stakeholders involved, with particular concerns to be addressed. This presents a significant task for enterprise architects.

TOGAF® has long been established as a pragmatic framework that helps enterprise architects deliver better solutions. The Open Group is developing a practical guide to using TOGAF® for SOA, as a joint project of its SOA Work Group and The Open Group Architecture Forum.

This work is now nearing completion. Ed Harrington of Architecting the Enterprise took on the considerable task of assembling, and adding to, the material created by the project to form a solid draft. This was discussed in detail by a small group, with some participants joining by teleconference. As well as Ed, the group included Mats Gejnevall of Capgemini and Steve Bennett of Oracle, and it was led by project co-chairs Dave Hornford of Integritas and Awel Dico of the Bank of Montreal.

The discussion resolved all the issues, enabling the preparation of a draft for review by The Open Group, and we can expect to see this valuable guide published at the conclusion of the review process.

UDEF Deployment Workshop

The importance of semantics for interoperability was an important theme of the interoperability panel discussion. The Open Group is working on a specific standard that is potentially a key enabler for semantic interoperability: the Universal Data Element Framework (UDEF).

It had been decided at the previous conference, in Amsterdam, that the next stage of UDEF development should be a deployment workshop. This was discussed by a small group, under the leadership of UDEF project chair Ron Schuldt, again with some participation by teleconference.

The group included Arnold van Overeem of Capgemini, Jayson Durham of the US Navy, and Brand Niemann of the Semantic Community. Jayson is a key player in the Enterprise Lexicon Services (ELS) initiative, which aims to provide critical information interoperability capabilities through common lexicon and vocabulary services. Brand is a major enthusiast for semantic interoperability with connections to many US semantic initiatives, and currently to the Air Force OneSource project in particular, which is evolving a data analysis tool used internally by the USAF Global Cyberspace Integration Center (GCIC) Vocabulary Services Team and made available to the general data management community. The participation of Jayson and Brand provided an important connection between the UDEF and other semantic projects.

As a result of the discussions, Ron will draft an interoperability scenario that can be the basis of a practical workshop session at the next conference, which is in London.

Complex Cloud Environments

Cloud Computing is the latest hot technology, and its adoption is having some interesting interoperability implications, as came out clearly in the Interoperability panel session. In many cases, an enterprise will use, not a single Cloud, but multiple services in multiple Clouds. These services must interoperate to deliver value to the enterprise. The Complex Cloud Environments conference stream included two very interesting presentations on this.

The first, by Mark Skilton and Vladimir Baranek of Capgemini, showed how new notations for Cloud can foster better understanding and adoption of Cloud-enabled services, and clarify the impact of social and business networks. As Cloud environments become increasingly complex, the need to describe them clearly grows. Consumers and vendors of Cloud services must be able to communicate. Stakeholders in consumer organizations must be able to discuss their concerns about the Cloud environment. The work presented by Mark and Vladimir grew from discussions at a CloudCamp held at a previous Conference by The Open Group. We hope that it can now be developed by The Open Group Cloud Computing Work Group into a powerful and sophisticated language that addresses this communication need.

The second presentation, from Soobaek Jang of IBM, addressed the issue of managing and coordinating across a large number of instances in a Cloud Computing environment. He explained an architecture for “Multi-Node Management Services” that acts as a framework for auto-scaling in a SaaS lifecycle, putting structure around self-service activity, and providing a simple and powerful web service orientation that allows providers to manage and orchestrate deployments in logical groups.
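The core of such an auto-scaling framework is a policy that compares each logical group's load against thresholds and adjusts its instance count within agreed bounds. The sketch below is my own minimal illustration of that idea, not IBM's actual Multi-Node Management Services; the `Group` class, thresholds, and group names are all invented.

```python
# Minimal sketch of an auto-scaling policy over logical instance groups,
# in the spirit of the multi-node management idea described above.
# The Group class, thresholds, and group names are invented.

from dataclasses import dataclass

@dataclass
class Group:
    name: str
    instances: int
    load: float          # average utilization, 0.0 - 1.0
    min_size: int = 1
    max_size: int = 10

def scale_decision(g: Group, high: float = 0.8, low: float = 0.2) -> int:
    """Return the target instance count for a group, based on its load."""
    if g.load > high and g.instances < g.max_size:
        return g.instances + 1        # scale out
    if g.load < low and g.instances > g.min_size:
        return g.instances - 1        # scale in
    return g.instances                # steady state

web = Group("web-tier", instances=3, load=0.9)
batch = Group("batch-tier", instances=4, load=0.1)

print(scale_decision(web))    # 4: overloaded, add an instance
print(scale_decision(batch))  # 3: idle, remove an instance
```

Grouping instances logically, as above, is what lets a provider apply one policy per tier rather than managing every instance individually – the "structure around self-service activity" that the presentation described.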

SOA Conference Stream

The principal presentation in this stream picked up on one of the key points from the Interoperability panel session in a very interesting way. It showed how a formal ontology can be a practical basis for common operation of SOA repositories. Semantic interoperability is at the cutting edge of interoperability, and is more often the subject of talk than of action. The presentation included a demonstration, and it was great to see the ideas put to real use.

The presentation was given jointly by Heather Kreger, SOA Work Group Co-chair, and Vince Brunssen, Co-chair of SOA Repository Artifact Model and Protocol (S-RAMP) at OASIS. Both presenters are from IBM. S-RAMP is an emerging OASIS standard that enables interoperability between tools and repositories for SOA. It uses the formal SOA Ontology developed by The Open Group, with extensions that enable a common service model as well as an interoperability protocol.

This presentation illustrated how S-RAMP and the SOA Ontology work in concert with The Open Group SOA Governance Framework to enable governance across vendors. It contained a demonstration that included defining new service models with the S-RAMP extensions in one SOA repository and communicating with another repository to augment its service model.

To conclude the session, I gave a brief presentation on SOA in the Cloud – the Next Challenge for Enterprise Architects. This discussed how the SOA architectural style is widely accepted as the style for enterprise architecture, and how Cloud Computing offers technical capabilities that enterprise architectures can exploit. Architectures using Cloud Computing should be service-oriented, but this poses some key questions for the architect. Architecture governance must change in the context of Cloud-based ecosystems. It may take some effort to keep to the principles of the SOA style – but it will be important to do so. And the organization of the infrastructure – which may migrate from the enterprise to the Cloud – will present an interesting challenge.

Enabling Semantic Interoperability Through Next Generation UDEF

The day was rounded off by an evening meeting, held jointly with the local chapter of the IEEE, on semantic interoperability. The meeting featured a presentation by Ron Schuldt, UDEF Project Chair, on the history, current state, and future goals of the UDEF.

The importance of semantics as a component of interoperability was clear in the morning’s panel discussion. In this evening session, Ron explained how the UDEF can enable semantic interoperability, and described the plans of the UDEF Project Team to expand the framework to meet the evolving needs of enterprises today and in the future.

This meeting was arranged through the good offices of Jayson Durham, and it was great that local IEEE members could join conference participants for an excellent session.

Cloud is a topic of discussion at The Open Group Conference, San Diego, which is currently underway.



Filed under Cloud/SOA, TOGAF®