Tag Archives: cloud computing

NASCIO Defines State of Enterprise Architecture at The Open Group Conference in Philadelphia

By E.G. Nadhan, HP

I have attended and blogged about many Open Group conferences. The keynotes at these conferences provide valuable insight into the key messages and the underlying theme of the event, which for The Open Group Conference in Philadelphia is Enterprise Architecture and Enterprise Transformation. It is therefore no surprise that Eric Sweden, Program Director, Enterprise Architecture & Governance, NASCIO, will be delivering one of the keynotes, “State of the States: NASCIO on Enterprise Architecture.” Sweden asserts that “Enterprise Architecture” provides an operating discipline for the creation, operation, continual re-evaluation and transformation of an “Enterprise.” Not only do I agree with this assertion, but I would add that the proper creation, operation and continuous evaluation of the “Enterprise” systemically drive its transformation. Let’s see how.

Creation. This phase involves the definition of the Enterprise Architecture (EA) in the first place. Most often, this means defining an architecture that factors in what is in place today while taking into account the future direction. TOGAF® (The Open Group Architecture Framework) provides a framework for developing this architecture from a business, application, data, infrastructure and technology standpoint, in alignment with the overall Architecture Vision and under the associated architectural governance.

Operation. EA is not a done deal once it has been defined. It is vital that the defined EA is sustained on a consistent basis through the advent of new projects, initiatives, technologies, and paradigms. As the abstract states, EA is a comprehensive business discipline that drives business and IT investments. In addition to driving investments, the operation phase also includes making the requisite changes to the EA as a result of those investments.

Continuous Evaluation. We live in a landscape of continuous change with innovative solutions and technologies constantly emerging. Moreover, the business objectives of the enterprise are constantly impacted by market dynamics, mergers and acquisitions. Therefore, the EA defined and in operation must be continuously evaluated against the architectural principles, while exercising architectural governance across the enterprise.

Transformation. EA is an operating discipline for the transformation of an enterprise. Enterprise Transformation is not a destination — it is a journey that needs to be managed — as characterized by Twentieth Century Fox CIO, John Herbert. To Forrester Analyst Phil Murphy, Transformation is like the Little Engine That Could — focusing on the business functions that matter. (Big Data – highlighted in another keynote at this conference by Michael Cavaretta — is a paradigm gaining a lot of ground for enterprises to stay competitive in the future.)

Global organizations are enterprises of enterprises, undergoing transformation while faced with the challenges of systemic architectural governance. NASCIO has valuable insight into the challenges faced by the 50 “enterprises” represented by the United States: challenges that pit the need for healthy co-existence of these states against the desire to retain a degree of autonomy. Therefore, I look forward to this keynote to see how EA done right can drive the transformation of the Enterprise.

By the way, remember when Enterprise Architecture was done wrong close to the venue of another Open Group conference?

How does Enterprise Architecture drive the transformation of your enterprise? Please let me know.

A version of this blog post originally appeared on the HP Journey through Enterprise IT Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. 


Filed under Business Architecture, Cloud, Cloud/SOA, Conference, Enterprise Architecture, Enterprise Transformation, TOGAF®

As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into the The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we’re here now to learn more about how to leverage Platform 3.0 as more than an IT shift: as a business game-changer. It will be a big topic at next week’s conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes, or technical changes, going on that are bringing together a lot of factors. They’re turning into this sort of super-saturated solution of ideas and possibilities, and this emerging idea that this represents a new platform. I think it’s a pretty fundamental change.


If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and presenting the information in a way such that people can make decisions on it. Given all that we’re starting to realize, we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing, trying to bring Platform 3.0 together, is to try to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt and evolve what business processes they need to put in place in order to take maximum advantage of this to see change in the way that we look at the information.

Harding: Enterprises have to keep up with the way that things are moving in order to keep their positions in their industries. Enterprises can’t afford to be working with yesterday’s technology. It’s a case of being able to understand the information that they’re presented, and make the best decisions.


We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale, the whole three Vs — volume, velocity, and variety — on its own could perhaps be a game-changing shift in the market. The drive of mobile devices into the lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.


It has to do with not just one solution or one subscription model, because we’re now into this subscription-model era … the subscription economy, as one group tends to describe it. Now, we’re looking at not only providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

Platform 3.0 addresses this point by bringing these together. Just look at the numbers. Look at the scale we’re dealing with: 1.7 billion mobile devices sold in 2012, and 6.8 billion mobile subscriptions estimated by the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We have had massive growth in the scale of mobile data traffic and internet data expansion. Mobile data traffic is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data center traffic, combining network- and internet-based storage, will reach 6.6 zettabytes annually, and that nearly two thirds of this will be cloud based by 2016. This is only going to grow: social networking is reaching nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a global audience of 2.55 billion, by 2017, another extraordinary figure from an eMarketing.com study.

It is not surprising that many industry analysts see growth in the converging technologies of mobility, social computing, big data and cloud at 30 to 40 percent, and that the shift to B2C commerce, which passed $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases emerging in the open-source world and things of that nature, but there are also the analytics and the data-stewardship aspects of it.

We need to bring in mechanisms so the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of enterprise IT or where end users drive the selection, to decide what to do with analytic tools and recommendation tools to take the data and turn it into information. One of the things you can’t do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally — how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points about how the shape of data has changed, and about why IT should get involved. We’re seeing a shift in the constituency of who is using this data.

We have the Chief Marketing Officer and the Chief Procurement Officer and other key line of business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We’ve got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be sorted into new types of mobile devices, new types of data intelligence, and ways of delivering this kind of service.

I recently read an article in MIT Sloan Management Review that asked what the role of the CIO is. There is still the critical role of managing the security, compliance, and performance of these systems. But there is also a socialization of IT, and this is where positioning architectures that are cross-platform is key to delivering real value to business users in the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it’s not only that a whole set of different people are getting involved with different kinds of information; there’s also a step change in the speed with which all this is delivered. It’s no longer the case that you can say, “Oh well, we need some kind of information system to manage this information. We’ll procure it and get a program written,” and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It’s really a case of having the platform, both the standard technology and the business processes for using it, understood and in place.

Then you can do all these things quickly and build on learning from what people have done in the past, rather than going off into all sorts of new experimental things that might not lead anywhere. It’s a case of building up the standard platform and the industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practices and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. First is platform scalability: understanding what you can do with your current legacy systems when introducing cloud computing or big data, and the infrastructure that gives you what we call multiplexing of resources. We’re very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

Going on to networking and network scalability, a lot of customers who inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it’s all about connectivity in the field. I meet a number of clients who say, “We’ve got this cloud service,” or “This service is in a certain area of my country. If I move to another part of the country, or I’m traveling, I can’t get connectivity.” That’s the big issue of scaling.

Another one is application programming interfaces (APIs). What we’re seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities of what Chris Anderson of Wired used to call the “long tail effect.” It is now a reality in terms of building that kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. These four levels are where customers need to start thinking about IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formational stages of the “third platform,” or Platform 3.0, for The Open Group and for the industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 Forum. We’re leveraging a lot of the components that have been done previously by the members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of Things.

First step

Our first step is to bring those things together to make sure that we’ve got a foundation to depart from. The next step is that, through our Platform 3.0 Forum and its Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from the technological aspects and what standards are needed, to taking a cue from our previous cloud working group: what are the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them?

What we’re really working toward in Philadelphia is to set up an exchange of ideas among people who can bring in their use cases from the buy side and their ideas about the technology possibilities from the supply side, bring those together, and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We’ve heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, we can see how powerful the data will be when processed properly, when recommendations can be delivered to the right place at the right time. But we also recognize that there are limits to a manual, or even human-level, approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, there are already some early examples of bringing cloud, data, social, mobile, and granular interactions together, where we’ve begun to see how a recommendation engine could be brought to bear. I’m thinking about the Siri capability from Apple and even some of the examples of the Watson technology at IBM.

So to our panel: are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a supercomputer or a data center of supercomputers, brought to bear on almost any problem instantly, with the result delivered directly to a center, a smartphone, any number of endpoints?

It seems that the potential here is mind-boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet. The advent of IPv6 and the explosion in multimedia services will start to drive it.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, insight into what’s happening, and the predictive nature of these services, it becomes something much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing, data, traffic, devices, and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it is something that’s really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly, in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but for businesses alone, that would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to gain a significant advantage and first-mover benefits?

Harding: Businesses always are taking stock. They understand their environments. They understand how the world that they live in is changing and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, “So now this is where we could make a change to our business.” It’s the vision moment where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and, I’m very happy to say, it’s something that computers aren’t going to be able to do for a very long time yet. It’s really going to be down to business people to do this, as they have been doing for centuries and millennia: to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling the gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years but in the next couple of technology cycles, that we’ll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data, and all of the medical textbooks ever written are data. Pick your industry, and there is a huge knowledge base that humans must currently keep on top of.

These advances in recommendation engines, driven by the availability of big data, are going to produce profound changes in the way knowledge workers do their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.


Filed under ArchiMate®, Business Architecture, Cloud, Cloud/SOA, Conference, Data management, Enterprise Architecture, Platform 3.0, Professional Development, TOGAF®

Why is Cloud Adoption Taking so Long?

By Chris Harding, The Open Group

At the end of last year, Gartner predicted that cloud computing would become an integral part of IT in 2013 (http://www.gartner.com/DisplayDocument?doc_cd=230929). This looks a pretty safe bet. The real question is, why is it taking so long?

Cloud Computing

Cloud computing is a simple concept. IT resources are made available, within an environment that enables them to be used, via a communications network, as a service. It is used within enterprises to enable IT departments to meet users’ needs more effectively, and by external providers to deliver better IT services to their enterprise customers.

There are established vendors of products to fit both of these scenarios. The potential business benefits are well documented. There are examples of real businesses gaining those benefits, such as Netflix as a public cloud user (see http://www.zdnet.com/the-biggest-cloud-app-of-all-netflix-7000014298/ ), and Unilever and Lufthansa as implementers of private cloud (see http://www.computerweekly.com/news/2240114043/Unilever-and-Lufthansa-Systems-deploy-Azure-Private-cloud ).

Slow Pace of Adoption

Yet we are still talking of cloud computing becoming an integral part of IT. In the 2012 Open Group Cloud ROI survey, less than half of the respondents’ organizations were using cloud computing, although most of the rest were investigating its use. (See http://www.opengroup.org/sites/default/files/contentimages/Documents/cloud_roi_formal_report_12_19_12-1.pdf ). Clearly, cloud computing is not being used for enterprise IT as a matter of routine.

Cloud computing is now at least seven years old. Amazon’s “Elastic Compute Cloud” was launched in August 2006, and there are services that we now regard as cloud computing, though they may not have been called that, dating from before then. Other IT revolutions – personal computers, for example – have reached the point of being an integral part of IT in half the time. Why has it taken Cloud so long?

The Reasons

One reason is that using Cloud requires a high level of trust. You can lock your PC in your office, but you cannot physically secure your cloud resources. You must trust the cloud service provider. Such trust takes time to earn.

Another reason is that, although it is a simple concept, cloud computing is described in a rather complex way. The widely-accepted NIST definition (see http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf ) has three service models and four deployment models, giving a total of twelve distinct delivery combinations. Each combination has different business drivers, and the three service models are based on very different technical capabilities. Real products, of course, often do not exactly correspond to the definition, and their vendors describe them in product-specific terms. This complexity often leads to misunderstanding and confusion.
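To make that arithmetic concrete, the NIST taxonomy can be enumerated in a few lines of Python. The service and deployment model names come from the NIST definition; the code itself is only an illustrative sketch:

```python
from itertools import product

# The three NIST service models and four deployment models.
service_models = ["SaaS", "PaaS", "IaaS"]
deployment_models = ["public", "private", "community", "hybrid"]

# Each (service, deployment) pair is one distinct delivery combination,
# with its own business drivers and technical capabilities.
combinations = list(product(service_models, deployment_models))

print(len(combinations))  # 12
for service, deployment in combinations:
    print(f"{deployment} {service}")
```

Twelve combinations is a lot of vocabulary for what began as a simple concept, which is one reason real products rarely map cleanly onto the definition.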

A third reason is that you cannot “mix and match” cloud services from different providers. The market is consolidating, with a few key players emerging as dominant at the infrastructure and platform levels. Each of them has its own proprietary interfaces. There are no real vendor-neutral standards. A recent Information Week article on Netflix (http://www.informationweek.co.uk/cloud-computing/platform/how-netflix-is-ruining-cloud-computing/240151650 ) describes some of the consequences. Customers are beginning to talk of “vendor lock-in” in a way that we haven’t seen since the days of mainframes.

The Portability and Interoperability Guide

The Open Group Cloud Computing Portability and Interoperability Guide addresses this last problem, by providing recommendations to customers on how best to achieve portability and interoperability when working with current cloud products and services. It also makes recommendations to suppliers and standards bodies on how standards and best practice should evolve to enable greater portability and interoperability in the future.

The Guide tackles the complexity of its subject by defining a simple Distributed Computing Reference Model. This model shows how cloud services fit into the mix of products and services used by enterprises in distributed computing solutions today. It identifies the major components of cloud-enabled solutions, and describes their portability and interoperability interfaces.

Platform 3.0

Cloud is not the only new game in town. Enterprises are looking at mobile computing, social computing, big data, sensors, and controls as new technologies that can transform their businesses. Some of these – mobile and social computing, for example – have caught on faster than Cloud.

Portability and interoperability are major concerns for these technologies too. There is a need for a standard platform to enable enterprises to use all of the new technologies, individually and in combination, and “mix and match” different products. This is the vision of the Platform 3.0 Forum, recently formed by The Open Group. The distributed computing reference model is an important input to this work.

The State of the Cloud

It is now at least becoming routine to consider cloud computing when architecting a new IT solution. The chances of it being selected, however, appear to be less than fifty-fifty, in spite of its benefits. The reasons include those mentioned above: lack of trust, complexity, and potential lock-in.

The Guide removes some of the confusion caused by the complexity, and helps enterprises assess their exposure to lock-in, and take what measures they can to prevent it.

The growth of cloud computing is starting to be constrained by lack of standards to enable an open market with free competition. The Guide contains recommendations to help the industry and standards bodies produce the standards that are needed.

Let’s all hope that the standards do appear soon. Cloud is, quite simply, a good idea. It is an important technology paradigm that has the potential to transform businesses, to make commerce and industry more productive, and to benefit society as a whole, just as personal computing did. Its adoption really should not be taking this long.

The Open Group Cloud Computing Portability and Interoperability Guide is available from The Open Group bookstore at https://www2.opengroup.org/ogsys/catalog/G135

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

3 Comments

Filed under Platform 3.0

Driving Boundaryless Information Flow in Healthcare

By E.G. Nadhan, HP

I look forward with great interest to the upcoming Open Group conference on EA & Enterprise Transformation in Finance, Government & Healthcare in Philadelphia in July 2013. In particular, I am interested in the sessions planned on topics related to the healthcare industry. This industry faces multiple challenges: uncontrolled medical costs, legislative pressures, increased plan participation, and the longer lifespans of individuals. Come to think of it, these challenges are not that different from those faced when defining a comprehensive enterprise architecture. So can the fundamental principles of Enterprise Architecture be applied to the resolution of these challenges in the healthcare industry? The Open Group certainly thinks so.

Enterprise Architecture is a discipline, methodology, and practice for translating business vision and strategy into the fundamental structures and dynamics of an enterprise at various levels of abstraction. As defined by TOGAF®, enterprise architecture is developed through multiple phases, covering the Business, Application, Data (Information), and Technology Architectures, all in alignment with the overall Architecture Vision. The TOGAF Architecture Development Method enables a systematic approach to addressing these challenges while simplifying the problem domain.

This approach to the development of Enterprise Architecture can be applied to the complex problem domain that manifests itself in healthcare. Thus, it is no surprise that The Open Group is sponsoring the Population Health Working Group, which has a vision of enabling “boundary-less information flow” between the stakeholders that participate in healthcare delivery. Check out the presentation delivered by Larry Schmidt, Chief Technologist, Health and Life Sciences Industries, HP, US at The Open Group conference in Philadelphia.

As a Platinum member of The Open Group, HP has co-chaired the development of multiple standards, including the first technical cloud standard. The Open Group is also leading the definition of the Cloud Governance Framework. Having co-chaired these projects, I look forward to the launch of the Population Health Working Group with great interest.

Given the role of information in today’s landscape, “boundary-less information flow” between the stakeholders that participate in healthcare delivery is vital. At the same time, how about injecting a healthy dose of innovation? Enterprise architects are well positioned to provide it, a point I made in a post triggered by Forrester analyst Brian Hopkins’s thoughts on the topic. The Open Group, with its multifaceted representation from a wide array of enterprises, provides incredible opportunities for innovation in the context of the complex landscape of the healthcare industry. Take a look at the steps taken by HP Labs to innovate and improve patient care one day at a time.

I would strongly encourage you to attend Schmidt’s session, as well as the Healthcare Transformation Panel moderated by Open Group CEO, Allen Brown at this conference.

How about you? What are some of the challenges that you are facing within the Healthcare industry today? Have you applied Enterprise Architecture development methods to problem domains in other industries? Please let me know.

Connect with Nadhan on: Twitter, Facebook, Linkedin and Journey Blog.

A version of this blog post originally appeared on the HP Enterprise Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and the founding co-chair for The Open Group Cloud Computing Governance project.

3 Comments

Filed under Business Architecture, Cloud, Cloud/SOA, Conference, Enterprise Architecture, Healthcare, TOGAF®

The Interconnectedness of All Things

By Stuart Boardman, KPN

My admiration for Douglas Adams only seems to increase with the years.

Adams, in his quiet way, conveyed quite a few useful insights into both human behavior and how the world (and the universe) works – or seems to work – or seems at times not to work. One of his little masterpieces was “the interconnectedness of all things,” which was the insight that inspired the work of Dirk Gently, owner and sole operative of the Holistic Detective Agency. This wasn’t some piece of cosmic mysticism, but essentially a rather practical insistence on looking at the pieces of the puzzle as an interconnected whole, even when one doesn’t yet know what the completed puzzle will look like. Here’s how Dirk expressed it:

“I’m very glad you asked me that, Mrs. Rawlinson. The term `holistic’ refers to my conviction that what we are concerned with here is the fundamental interconnectedness of all things. I do not concern myself with such petty things as fingerprint powder, telltale pieces of pocket fluff and inane footprints. I see the solution to each problem as being detectable in the pattern and web of the whole. The connections between causes and effects are often much more subtle and complex than we with our rough and ready understanding of the physical world might naturally suppose, Mrs. Rawlinson.

Let me give you an example. If you go to an acupuncturist with toothache, he sticks a needle instead into your thigh. Do you know why he does that, Mrs. Rawlinson?

No, neither do I, Mrs. Rawlinson, but we intend to find out. A pleasure talking to you, Mrs. Rawlinson. Goodbye.”

Cloud, SOA, Enterprise Mobility, Social Media/Enterprise/Business, The Internet of Things, Big Data (you name it) – each in its own way is part of an overall tendency. The general trend is for enterprises to become increasingly involved in increasingly broad ecosystems. As a trend, it predates that list of Internet phenomena but it’s clear that they are dramatically accelerating the pace. Not only do they individually contribute to that trend but collectively they add another factor of both complexity and urgency to the picture. They are interconnected by cause and effect and by usage. Unfortunately that interconnectedness doesn’t (yet) involve very much interoperability.

Readers of this blog will know that The Open Group is starting a new initiative, Platform 3.0, which will be looking at these technologies as a whole and at how they might be considered to collectively represent some new kind of virtual computing platform. There is an ongoing discussion of what the scope of such an initiative should be: to what extent it should concentrate on the technologies, to what extent on purely business aspects, and to what extent we should concentrate on the whole as opposed to the sum of the parts. One can also see this as one overarching phenomenon in which making a distinction between business and technology may not actually be meaningful.

Although no one (as far as I know) denies that each of these has its own specifics and deserves individual examination, people are starting to understand that we need to go with Dirk Gently and look at the “pattern and web of the whole”.

Open Group members and conference presenters have been pointing this out for a couple of years now but, like it or not, it often takes an analyst firm like Gartner to notice something for everyone else to start taking it seriously. What these organizations like to do is pin labels on things. Give it a name, and you can kid yourself that you know what it is. The label in and of itself makes things easier for people, especially those who don’t like dealing with stuff you actually have to think about. It’s an example of the 42 problem I wrote about elsewhere.

Gartner frequently talks about the “Nexus of Forces.” Those of you who are not Trekkies may not understand why I fall over laughing at that one. For your benefit: the Nexus was a sort of cloud thing which, if you were able to jump into it, enabled you to live out your most treasured but unrealistic dreams. And in the Star Trek movie this was a big problem, because out there in the real world everything was going seriously pear-shaped.

In my view, it’s crucial to tackle the general tendency. Organizations, and in particular commercial organizations, become part of what Jack Martin Leith calls a “Business Ecosystem” (jump to slide 11 in the link for the definition). If one goes back, say, ten years (maybe less), this tendency already manifested itself on the business side through the “outsourcing” of significant parts of the organization’s business processes to other organizations – partners. The result wasn’t simply a value chain but a value network, sometimes known as the Extended Enterprise. Ten years later, we see that Cloud can have the same effect on how even the processes retained within the organization are carried out. Social and mobile take this further, out into the wider enterprise and into that business ecosystem. Cloud, social, and mobile involve technological interconnectedness. Social and mobile also involve business interconnectedness (one could argue that Cloud does too, and I wouldn’t feel the need to disagree). The business of an enterprise becomes increasingly bound up with the business of other enterprises and, as a result, can be affected by changes and developments well outside its own range of control.

We know that the effects of these various technologies are interconnected at multiple levels, so it becomes increasingly important to understand how they will work together – or fail to work together. Or, to put it more constructively, we need strategies and standards to ensure that they do work together, to the extent that we can control them. We also need to understand all the things that we can’t control but that might just jump out and bite us. There are already enough anti-patterns for the use of social media. Add to that the multi-channel implications of mobility, stir in a dose of Cloud and a bunch of machines exchanging messages without being able to ask each other, “excuse me, what did you mean by that?”, and it’s easy to see how things might go pear-shaped while we’re having fun in the Nexus.

Does this lead to an unmanageable scope for Platform 3.0? I don’t think so. We’ll probably have to prioritize the work. Everyone has their own knowledge, experience and interests, so we may well do things of different granularity in parallel. That all needs to be discussed. But from my perspective, one of the first priorities will be to understand that interconnectedness, so we can work out where the needle needs to go to get rid of the pain.

Stuart Boardman is a Senior Business Consultant with KPN where he co-leads the Enterprise Architecture practice as well as the Cloud Computing solutions group. He is co-lead of The Open Group Cloud Computing Work Group’s Security for the Cloud and SOA project and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by the Information Security Platform (PvIB) in The Netherlands and of his previous employer, CGI. He is a frequent speaker at conferences on the topics of Cloud, SOA, and Identity. 

3 Comments

Filed under Enterprise Architecture, Platform 3.0

Thinking About Big Data

By Dave Lounsbury, The Open Group

“We cannot solve our problems with the same level of thinking that created them.”

– Albert Einstein

The growing consumerization of technology and convergence of technologies such as the “Internet of Things”, social networks and mobile devices are causing big changes for enterprises and the marketplace. They are also generating massive amounts of data related to behavior, environment, location, buying patterns and more.

Having massive amounts of data readily available is invaluable. More data means greater insight, which leads to more informed decision-making. So far, we are keeping ahead of this data through smarter analytics and better ways of handling it. The question is, how long can we keep up? The rate of data production is increasing; as an example, an IDC report[1] predicts that the production of data will increase 50X in the coming decade. To magnify this problem, there’s an accompanying explosion of data about the data: cataloging information, metadata, and the results of analytics are all data in themselves. At the same time, data scientists and engineers who can deal with such data are already a scarce commodity, and the number of such people is expected to grow only 1.5X in the same period.
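
The arithmetic behind that gap is simple enough to sketch. In this back-of-the-envelope calculation the growth factors are the cited projections, not measurements:

```python
# Back-of-the-envelope view of the gap: data volume is projected to
# grow 50X over the decade, while the pool of people who can analyze
# it is projected to grow only 1.5X.
data_growth = 50.0    # IDC projection for data volume
talent_growth = 1.5   # projected growth in data scientists/engineers

# The amount of data each specialist must cover grows by the ratio.
gap = data_growth / talent_growth
print(f"Data per specialist grows roughly {gap:.0f}X over the decade")
```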

It isn’t hard to draw the curve. Turning data into actionable insight is going to be a challenge – data flow is accelerating at a faster rate than the available humans can absorb, and our databases and data analytic systems can only help so much.

Markets never leave gaps like this unfilled, and because of this we should expect to see a fundamental shift in the IT tools we use to deal with the growing tide of data. In order to solve the challenges of managing data with the volume, variety and velocities we expect, we will need to teach machines to do more of the analysis for us and help to make the best use of scarce human talents.

The Study of Machine Learning

Machine Learning, sometimes called “cognitive computing”[2] or “intelligent computing”, is the study of building computers with the capability to learn and to perform tasks based on experience. Experience in this context includes looking at vast data sets, using multiple “senses” or types of media, recognizing patterns from past history or precedent, and extrapolating this information to reason about the problem at hand. An example of machine learning currently underway in the healthcare sector is medical decision aids that learn to predict therapies or to help with patient management, based on correlating a vast body of medical and drug experience data with information about the patients under treatment.
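
As a purely illustrative sketch, and emphatically not how any real clinical system works, such a decision aid can be reduced to its simplest form: suggest the therapy used in the most similar past case. Every feature, value, and therapy name below is invented:

```python
import math

# Toy nearest-neighbour "decision aid": suggest the therapy that was
# used for the most similar historical case. Cases and therapy names
# are hypothetical.
past_cases = [
    ([63, 140, 1], "therapy_A"),  # [age, systolic BP, prior condition]
    ([25, 118, 0], "therapy_B"),
    ([71, 155, 1], "therapy_A"),
    ([34, 122, 0], "therapy_B"),
]

def suggest_therapy(patient):
    """Return the therapy of the past case nearest to this patient."""
    nearest = min(past_cases, key=lambda case: math.dist(case[0], patient))
    return nearest[1]

print(suggest_therapy([68, 150, 1]))  # -> therapy_A (closest to older cases)
```

A real system would correlate far more features, weight them, and learn the similarity measure itself; the point here is only the shape of the idea – new decisions grounded in a body of past experience.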

A well-known example of this is Watson, a machine learning system IBM unveiled a few years ago. While Watson is best known for winning Jeopardy!, that was just the beginning. IBM has since built six Watsons to assist with their primary objective: helping healthcare professionals find answers to complex medical questions and manage patient care[3]. Watson’s sophistication is a direct response to the data explosion described above. Watson, of course, isn’t the only example in this field; others range from Apple’s Siri intelligent voice-operated assistant to DARPA’s SyNAPSE program[4].

Evolution of the Technological Landscape

As the consumerization of technology continues to grow and converge, our way of constructing business models and systems needs to evolve as well. We need to let data drive the business process, and incorporate intelligent machines like Watson into our infrastructure to help us turn data into actionable results.

There is an opportunity for information technology companies to help drive this forward. However, in order for us to properly teach computers how to learn, we first need to understand the environments in which they will be asked to learn – Cloud, Big Data, etc. Ultimately, though, any full consideration of these problems will require a look at how machine learning can help us make decisions – machine learning systems may be the real platform in these areas.

The Open Group is already laying the foundation to help organizations take advantage of these convergent technologies with its new forum, Platform 3.0. The forum brings together a community of industry thought leaders to analyze the use of Cloud, Social, Mobile computing and Big Data, and to describe the business benefits that enterprises can gain from them. We’ll also be looking at trends like these at our Philadelphia conference this summer. Please join us in the discussion.


2 Comments

Filed under Cloud, Cloud/SOA, Data management, Enterprise Architecture

Why Business Needs Platform 3.0

By Chris Harding, The Open Group

The Internet gives businesses access to ever-larger markets, but it also brings more competition. To prosper, they must deliver outstanding products and services. Often, this means processing the ever-greater, and increasingly complex, data that the Internet makes available. The question they now face is how to do this without spending all their time and effort on information technology.

Web Business Success

The success stories of giants such as Amazon are well-publicized, but there are other, less well-known companies that have profited from the Web in all sorts of ways. Here’s an example. In 2000, an English illustrator called Jacquie Lawson tried creating greetings cards on the Internet. People liked what she did, and she started an e-business whose website is now ranked by Alexa at #2712 in the world and #1879 in the USA. These rankings are based on website traffic and are comparable, to take a better-known company, with those of toyota.com, which ranks slightly higher in the USA (#1314) but somewhat lower globally (#4838).

A company with a good product can grow fast. This also means, though, that a company with a better product, or even just better marketing, can eclipse it just as quickly. Social networking site Myspace was once the most visited site in the US. Now it is ranked by Alexa as #196, way behind Facebook, which is #2.

So who ranks as #1? You guessed it – Google. Which brings us to the ability to process large amounts of data, where Google excels.

The Data Explosion

The World-Wide Web probably contains over 13 billion pages, yet you can often find the information that you want in seconds. This is made possible by technology that indexes this vast amount of data – measured in petabytes (millions of gigabytes) – and responds to users’ queries.
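
Stripped of the engineering that makes it work at petabyte scale, the core mechanism is an inverted index: map each word to the set of pages containing it, and answer a query by intersecting those sets rather than scanning every page. A toy sketch, with invented pages:

```python
from collections import defaultdict

# Invented example pages; a real index covers billions of documents.
pages = {
    "page1": "cloud computing standards for the enterprise",
    "page2": "greetings cards on the web",
    "page3": "cloud standards and interoperability",
}

# Build the inverted index: word -> set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return pages containing every word of the query."""
    postings = [index[word] for word in query.split()]
    return set.intersection(*postings) if postings else set()

print(sorted(search("cloud standards")))  # -> ['page1', 'page3']
```

Real search engines add ranking, crawling, and massive distribution on top, but the intersection of postings is still the heart of answering a query in seconds.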

The data on the World-Wide Web originally came mostly from people typing it in by hand. In future, we will often use data that is generated by sensors in inanimate objects. Automobiles, for example, can generate data that can be used to optimize their performance or assess the need for maintenance or repair.
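
To make the automobile example concrete, here is a hypothetical sketch in which invented temperature readings feed a simple maintenance rule. Real vehicle diagnostics are far more sophisticated; this illustrates only the pattern of sensor data driving a decision:

```python
# Invented engine-temperature readings (degrees C) from successive trips.
engine_temp_log = [88, 90, 91, 96, 102, 107]

def needs_service(readings, limit=100, window=3):
    """Flag a service when the average of recent readings exceeds a limit."""
    recent = readings[-window:]
    return sum(recent) / len(recent) > limit

print(needs_service(engine_temp_log))  # -> True: recent average is about 101.7
```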

The world population is measured in billions. It is estimated that the Internet of Things, in which data is collected from objects, could enable us to track 100 trillion objects in real time – ten thousand times as many things as there are people, tirelessly pumping out information. The amount of available data of potential value to businesses is set to explode yet again.

A New Business Generation

It’s not just the amount of data to be processed that is changing. We are also seeing changes in the way data is used, the way it is processed, and the way it is accessed. Following The Open Group conference in January, I wrote about the convergence of social, Cloud, and mobile computing with Big Data. These are the new technical trends that are taking us into the next generation of business applications.

We don’t yet know what all those applications will be – who in the 1990s would have predicted greetings cards as a Web application? – but there are some exciting ideas. They range from using social media to produce market forecasts to alerting hospital doctors via tablets and cellphones when monitors detect patient emergencies. All this, and more, is possible with technology that we have now, if we can use it.

The Problem

But there is a problem. Although there is technology that enables businesses to use social, Cloud, and mobile computing, and to analyze and process massive amounts of data of different kinds, it is not necessarily easy to use. A plethora of products is emerging, with different interfaces and no ability to work with each other. This is fine for geeks who love to play with new toys, but not so good for someone who wants to realize a new business idea and make money.

The new generation of business applications cannot be built on a mish-mash of unstable products, each requiring a different kind of specialist expertise. It needs a solid platform, generally understood by enterprise architects and software engineers, who can translate the business ideas into technical solutions.

The New Platform

Former VMware CEO and current Pivotal Initiative leader Paul Maritz describes the situation very well in his recent blog on GigaOM. He characterizes as “consumer grade” the new breed of enterprises that give customers what they want, when they want it and where they want it, by exploiting the opportunities provided by new technologies. Paul says that, “Addressing these opportunities will require new underpinnings; a new platform, if you like. At the core of this platform, which needs to be Cloud-independent to prevent lock-in, will be new approaches to handling big and fast (real-time) data.”

The Open Group has announced its new Platform 3.0 Forum to help the industry define a standard platform to meet this need. As The Open Group CTO Dave Lounsbury says in his blog, the new Forum will advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to take advantage of these convergent technologies. This will be accomplished by identifying a set of new platform capabilities, and architecting and standardizing an IT platform by which enterprises can reap the business benefits of Platform 3.0.

Business Focus

A business set up to design greetings cards should not spend its time designing communications networks and server farms; it cannot afford to. If it does, someone else will focus on the core business and take its market.

The Web provided a platform that businesses of its generation could build on to do what they do best without being overly distracted by the technology. Platform 3.0 will do this for the new generation of businesses.

Help It Happen!

To find out more about the Platform 3.0 Forum, and to take part in its formation, watch out for the Platform 3.0 web meetings that will be announced by e-mail and Twitter, and on our home page.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.

2 Comments

Filed under Platform 3.0