
Business Benefit from Public Data

By Dr. Chris Harding, Director for Interoperability, The Open Group

Public bodies worldwide are making a wealth of information available, and encouraging its commercial exploitation. This sounds like a bonanza for the private sector at the public expense, but entrepreneurs are holding back. A healthy market for products and services that use public-sector information would provide real benefits for everyone. What can we do to bring it about?

Why Governments Give Away Data

The EU directive of 2003 on the reuse of public sector information encourages the Member States to make as much information available for reuse as possible. This directive was revised and strengthened in 2013. The U.S. Open Government Directive of 2009 provides similar encouragement, requiring US government agencies to post at least three high-value data sets online and register them on its data.gov portal. Other countries have taken similar measures to make public data publicly available.

Why are governments doing this? There are two main reasons.

One is that it improves the societies that they serve and the governments themselves. Free availability of information about society and government makes people more effective citizens and makes government more efficient. It illuminates discussion of civic issues, and points a searchlight at corruption.

The second reason is that it has a positive effect on the wealth of nations and their citizens. The EU directive highlights the ability of European companies to exploit the potential of public-sector information, and contribute to economic growth and job creation. Information is not just the currency of democracy. It is also the lubricant of a successful economy.

Success Stories

There are some big success stories.

If you drive a car, you probably use satellite navigation to find your way about, and this may use public-sector information. In the UK, for example, map data that can be used by sat-nav systems is supplied for commercial use by a government agency, the Ordnance Survey.

When you order something over the web for delivery to your house, you often enter a postal code and see most of the address auto-completed by the website. Postcode databases are maintained by national postal authorities, which are generally either government departments or regulated private corporations, and made available by them for commercial use. Here, the information is not directly supporting a market, but is contributing to the sale of a range of unrelated products and services.
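As an illustration of the mechanism (the postcodes and addresses below are invented; a real site would query a licensed postcode database), the auto-complete feature amounts to a simple lookup keyed on a normalized postcode:

```python
# Sketch of postcode-based address completion.
# The postcodes and addresses are invented for illustration;
# a real service would load a licensed postcode database.

POSTCODE_DB = {
    "AB1 2CD": {"town": "Exampleton", "county": "Exampleshire"},
    "EF3 4GH": {"town": "Sampleville", "county": "Demoshire"},
}

def complete_address(postcode):
    """Return the town and county for a postcode, or None if unknown."""
    return POSTCODE_DB.get(postcode.strip().upper())

print(complete_address("ab1 2cd"))  # {'town': 'Exampleton', 'county': 'Exampleshire'}
```

The value to the retailer is not the data itself but the reduction in typing errors and abandoned checkouts, which is exactly the indirect contribution described above.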

The data may not be free. There are commercial arrangements for supply of map and postcode data. But it is available, and is the basis for profitable products and for features that make products more competitive.

The Bonanza that Isn’t

These successes are, so far, few in number. The economic benefits of open government data could be huge. The McKinsey Global Institute estimates a potential of between 3 and 5 trillion dollars annually. Yet the direct impact of Open Data on the EU economy in 2010, seven years after the directive was issued, was estimated by Capgemini at only about 1% of that, although the EU accounts for nearly a quarter of world GDP.

The business benefits to be gained from using map and postcode data are obvious. There are other kinds of public sector data, where the business benefits may be substantial, but they are not easy to see. For example, data is or could be available about public transport schedules and availability, about population densities, characteristics and trends, and about real estate and land use. These are all areas that support substantial business activity, but businesses in these areas seldom make use of public sector information today.

Where are the Products?

Why are entrepreneurs not creating these potentially profitable products and services? There is one obvious reason. The data they are interested in is not always available and, where it is available, it is provided in different ways, and comes in different formats. Instead of a single large market, the entrepreneur sees a number of small markets, none of which is worth tackling. For example, the market for an application that plans public transport journeys across a single town is not big enough to justify substantial investment in product development. An application that could plan journeys across any town in Europe would certainly be worthwhile, but is not possible unless all the towns make this data available in a common format.
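The transit example is one where a common format already exists: the General Transit Feed Specification (GTFS) publishes schedules as CSV files such as stop_times.txt, so a single parser can serve any town that adopts it. A minimal sketch (the sample rows are invented for illustration):

```python
import csv
import io

# GTFS-style stop_times rows: the same parser works for any town
# that publishes its schedule in this common format.
# The sample rows are invented for illustration.
SAMPLE_FEED = """trip_id,arrival_time,departure_time,stop_id,stop_sequence
T1,08:00:00,08:00:30,STOP_A,1
T1,08:07:00,08:07:30,STOP_B,2
T1,08:15:00,08:15:30,STOP_C,3
"""

def departures_from(feed_text, stop_id):
    """List departure times from the given stop, in feed order."""
    reader = csv.DictReader(io.StringIO(feed_text))
    return [row["departure_time"] for row in reader if row["stop_id"] == stop_id]

print(departures_from(SAMPLE_FEED, "STOP_B"))  # ['08:07:30']
```

Code like this only becomes a viable product when many towns publish feeds in the same format; that is the point of the single-large-market argument.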

Public sector information providers often do not know what value their data has, or understand its applications. Working within tight budgets, they cannot afford to spend large amounts of effort on assembling and publishing data that will not be used. They follow the directives but, without common guidelines, they simply publish whatever is readily to hand, in whatever form it happens to be.

The data that could support viable products is not available everywhere and, where it is available, it comes in different formats. (One that is often used is PDF, which is particularly difficult to process as an information source.) The result is that the cost of product development is high, and the expected return is low.
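To make the integration cost concrete, here is a sketch of what format fragmentation means in practice: two publishers expose the same logical data in different forms, and each needs its own adapter before the records can be combined (all data and field names are invented):

```python
import csv
import io
import json

# Two publishers expose the same logical data (population by district)
# in different formats, so each source needs its own adapter.
# The formats, field names, and figures are invented for illustration.

CSV_SOURCE = "district,population\nNorth,51200\nSouth,47800\n"
JSON_SOURCE = '{"districts": [{"name": "East", "pop": 39100}]}'

def from_csv(text):
    """Adapter for the CSV publisher."""
    return [(r["district"], int(r["population"]))
            for r in csv.DictReader(io.StringIO(text))]

def from_json(text):
    """Adapter for the JSON publisher."""
    return [(d["name"], d["pop"]) for d in json.loads(text)["districts"]]

records = from_csv(CSV_SOURCE) + from_json(JSON_SOURCE)
print(records)  # [('North', 51200), ('South', 47800), ('East', 39100)]
```

Every additional format multiplies the adapter-writing effort, and data trapped in PDF is worse still, because there is no reliable structure to parse at all.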

Where is the Market?

There is a second reason why entrepreneurs hesitate. The shape of the market is unclear. In a mature market, everyone knows who the key players are, understands their motivations, and can predict to some extent how they will behave. The market for products and services based on public sector information is still taking shape. No one is even sure what kinds of organization will take part, or what they will do. How far, for example, will public-sector bodies go in providing free applications? Can large corporations buy future dominance with loss-leader products? Will some unknown company become an overnight success, like Facebook? With these unknowns, the risks are very high.

Finding the Answers

Public sector information providers and standards bodies are tackling these problems. The Open Group participates in SHARE-PSI, the European network for the exchange of experience and ideas around implementing open data policies in the public sector. The experience gained by SHARE-PSI will be used by the World Wide Web Consortium as a basis for standards and guidelines for publication of public sector information. These standards and guidelines may be used not just by the public sector, but by not-for-profit bodies and even commercial corporations, many of which have information that they want to make freely available.

The Open Group is making a key contribution by helping to map the shape of the market. It is using the Business Scenario technique from its well-known Enterprise Architecture methodology TOGAF® to identify the kinds of organization that will take part, and their objectives and concerns.

There will be a preview of this on October 22 at The Open Group event in London, which will feature a workshop session on Open Public Sector Data. This workshop will look at how Open Data can help business, present a draft of the Business Scenario, and take input from participants to help develop its conclusions.

The developed Business Scenario will be presented at the SHARE-PSI workshop in Lisbon on December 3-4. The theme of this workshop is encouraging open data usage by commercial developers. It will bring a wide variety of stakeholders together to discuss and build the relationship between the public and private sectors. It will also address, through collaboration with the EU LAPSI project, the legal framework for use of open public sector data.

Benefit from Participation!

If you are thinking about publishing or using public-sector data, you can benefit from these workshops by gaining an insight into the way that the market is developing. In the long term, you can influence the common standards and guidelines that are being developed. In the short term, you can find out what is happening and network with others who are interested.

The social and commercial benefits of open public-sector data are not being realized today. They can be realized through a healthy market in products and services that process the data and make it useful to citizens. That market will emerge when public bodies and businesses clearly understand the roles that they can play. Now is the time to develop that understanding and begin to profit from it.

Register for The Open Group London 2014 event at http://www.opengroup.org/london2014/registration.

Find out how to participate in the Lisbon SHARE-PSI workshop at http://www.w3.org/2013/share-psi/workshop/lisbon/#Participation


Dr. Chris Harding is Director for Interoperability at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0™ Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.


The Open Group Boston 2014 to Explore How New IT Trends are Empowering Improvements in Business

By The Open Group

The Open Group Boston 2014 will be held on July 21-22 and will cover the major issues and trends surrounding Boundaryless Information Flow™. Thought-leaders at the event will share their outlook on IT trends, capabilities, best practices and global interoperability, and how this will lead to improvements in responsiveness and efficiency. The event will feature presentations from representatives of prominent organizations on topics including Healthcare, Service-Oriented Architecture, Security, Risk Management and Enterprise Architecture. The Open Group Boston will also explore how cross-organizational collaboration and trends such as big data and cloud computing are helping to make enterprises more effective.

The event will consist of two days of plenaries and interactive sessions that will provide in-depth insight on how new IT trends are leading to improvements in business. Attendees will learn how industry organizations are seeking large-scale transformation and some of the paths they are taking to realize that.

The first day of the event will bring together subject matter experts in the Open Platform 3.0™, Boundaryless Information Flow™ and Enterprise Architecture spaces. The day will feature thought-leaders from organizations including Boston University, Oracle, IBM and Raytheon. One of the keynotes will be delivered by Marshall Van Alstyne, Professor at Boston University School of Management and Researcher at the MIT Center for Digital Business, who will reveal the secrets of internet-driven marketplaces. Other content includes:

• The Open Group Open Platform 3.0™ focuses on new and emerging technology trends converging with each other and leading to new business models and system designs. These trends include mobility, social media, big data analytics, cloud computing and the Internet of Things.
• Cloud security and the key differences in securing cloud computing environments vs. traditional ones as well as the methods for building secure cloud computing architectures
• Big Data as a service framework as well as preparing to deliver on Big Data promises through people, process and technology
• Integrated Data Analytics and using them to improve decision outcomes

The second day of the event will have an emphasis on Healthcare, with keynotes from Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health, and Proteus Duxbury, CTO of Connect for Health Colorado. The day will also showcase speakers from Hewlett Packard and Blue Cross Blue Shield, multiple tracks on a wide variety of topics such as Risk and Professional Development, and ArchiMate® tutorials. Key learnings include:

• Improving healthcare’s information flow is a key enabler to improving healthcare outcomes and implementing efficiencies within today’s delivery models
• Identifying the current state of IT standards and future opportunities which cover the healthcare ecosystem
• How ArchiMate® can be used by Enterprise Architects for driving business innovation with tried and true techniques and best practices
• Security and Risk Management evolving as software applications become more accessible through APIs – which can lead to vulnerabilities and the potential need to increase security while still understanding the business value of APIs

Member meetings will also be held on Wednesday and Thursday, July 23-24.

Don’t wait, register now to participate in these conversations and networking opportunities during The Open Group Boston 2014: http://www.opengroup.org/boston2014/registration

Join us on Twitter – #ogchat #ogBOS


The Open Group Open Platform 3.0™ Starts to Take Shape

By Dr. Chris Harding, Director for Interoperability, The Open Group

The Open Group published a White Paper on Open Platform 3.0™ at the start of its conference in Amsterdam in May 2014. This article, based on a presentation given at the conference, explains how the definition of the platform is beginning to emerge.

Introduction

Amsterdam is a beautiful place. Walking along the canals is like moving through a set of picture postcards. But as you look up at the houses beside the canals, and you see the cargo hoists that many of them have, you are reminded that the purpose of the arrangement was not to give pleasure to tourists. Amsterdam is a great trading city, and the canals were built as a very efficient way of moving goods around.

This is also a reminder that the primary purpose of architecture is not to look beautiful, but to deliver business value, though surprisingly, the two often seem to go together quite well.

When those canals were first thought of, it might not have been obvious that this was the right thing to do for Amsterdam. Certainly the right layout for the canal network would not be obvious. The beginning of a project is always a little uncertain, and seeing the idea begin to take shape is exciting. That is where we are with Open Platform 3.0 right now.

We started with the intention to define a platform to enable enterprises to get value from new technologies including cloud computing, social computing, mobile computing, big data, the Internet of Things, and perhaps others. We developed an Open Group business scenario to capture the business requirements. We developed a set of business use-cases to show how people are using and wanting to use those technologies. And that leads to the next step, which is to define the platform. All these new technologies and their applications sound wonderful, but what actually is Open Platform 3.0?

The Third Platform

Looking historically, the first platform was the computer operating system. A vendor-independent operating system interface was defined by the UNIX® standard. The X/Open Company and the Open Software Foundation (OSF), which later combined to form The Open Group, were created because companies everywhere were complaining that they were locked into proprietary operating systems. They wanted applications portability. X/Open specified the UNIX® operating system as a common application environment, and the value that it delivered was to prevent vendor lock-in.

The second platform is the World Wide Web. It is a common services environment, for services used by people browsing web pages or for web services used by programs. The value delivered is universal deployment and access. Any person or company anywhere can create a services-based solution and deploy it on the web, and every person or company throughout the world can access that solution.

Open Platform 3.0 is developing as a common architecture environment. This does not mean it is a replacement for TOGAF®. TOGAF is about how you do architecture and will continue to be used with Open Platform 3.0. Open Platform 3.0 is about what kind of architecture you will create. It will be a common environment in which enterprises can do architecture. The big business benefit that it will deliver is integrated solutions.


Figure 1: The Third Platform

With the second platform, you can develop solutions. Anyone can develop a solution based on services accessible over the World Wide Web. But independently-developed web service solutions will very rarely work together “out of the box”.

There is an increasing need for such solutions to work together. We see this need when looking at The Open Platform 3.0 technologies. People want to use these technologies together. There are solutions that use them, but they have been developed independently of each other and have to be integrated. That is why Open Platform 3.0 has to deliver a way of integrating solutions that have been developed independently.

Common Architecture Environment

The Open Group has recently published its first thoughts on Open Platform 3.0 in the Open Platform 3.0 White Paper. This lists a number of things that will eventually be in the Open Platform 3.0 standard. Many of these are common architecture artifacts that can be used in solution development. They will form a common architecture environment. They are:

  • Statement of need, objectives, and principles – this is not part of that environment of course; it says why we are creating it.
  • Definitions of key terms – clearly you must share an understanding of the key terms if you are going to develop common solutions or integrable solutions.
  • Stakeholders and their concerns – an understanding of these is an important aspect of an architecture development, and something that we need in the standard.
  • Capabilities map – this shows what the products and services that are in the platform do.
  • Basic models – these show how the platform components work with each other and with other products and services.
  • Explanation of how the models can be combined to realize solutions – this is an important point and one that the white paper does not yet start to address.
  • Standards and guidelines that govern how the products and services interoperate – The Open Group is unlikely to produce these standards itself; they will almost certainly come from other bodies. But we need to identify the appropriate ones and, in some cases, coordinate with those bodies to see that they are developed.

The Open Platform 3.0 White Paper contains an initial statement of needs, objectives and principles, definitions of some key terms, a first-pass list of stakeholders and their concerns, and half a dozen basic models. The basic models are in an analysis of the business use-cases for Open Platform 3.0 that were developed earlier.

These are just starting points. The white paper is incomplete: each of the sections is incomplete in itself, and of course the white paper does not contain all the sections that will be in the standard. And it is all subject to change.

An Example Basic Model

The figure shows a basic model that could be part of the Open Platform 3.0 common architecture environment.


Figure 2: Mobile Connected Device Model

This is the Mobile Connected Device Model: one of the basic models that we identified in the White Paper. It comes up quite often in the use-cases.

The stack on the left is a mobile device. It has a user, it has apps, it has a platform which would probably be Android or iOS, it has infrastructure that supports the platform, and it is connected to the World Wide Web, because that’s part of the definition of mobile computing.

On the right you see (and this is a frequently encountered pattern) that you don’t just use your mobile device for running apps. Maybe you connect it to a printer, maybe to your headphones, maybe to somebody’s payment terminal; you can connect it to many things. You might do this through a Universal Serial Bus (USB). You might do it through Bluetooth. You might do it by Near Field Communication (NFC). You might use other kinds of local connection.

The device you connect to may be operated by yourself (e.g. if it is headphones), or by another organization (e.g. if it is a payment terminal). In the latter case you typically have a business relationship with the operator of the connected device.

That is an example of the basic models that came up in the analysis of the use-cases. It is captured in the White Paper. It is fundamental to mobile computing and is also relevant to the Internet of Things.

Access to Technologies

This figure captures our understanding of the need to obtain information from the new technologies (social media, mobile devices, sensors, and so on), to process that information, perhaps in the cloud, to manage it and, ultimately, to deliver it in a form that supports the analysis and reasoning that enable enterprises to take business decisions.


Figure 3: Access to Technologies

The delivery of information to improve the quality of decisions is the source of real business value.

User-Driven IT

The next figure captures a requirement that we picked up in the development of the business scenario.


Figure 4: User-Driven IT

Traditionally, the business use of IT sat in the business departments of an enterprise, and pretty much everything else in the IT department. But we are seeing two big changes. One is that business users are getting smarter and more able to use technology. The other is that they want to use technology themselves, or to have business technologists working closely with them, rather than accessing it indirectly through the IT department.

Systems provisioning and management are now often done by cloud service providers, and programming, integration and helpdesk support by cloud brokers, or by an IT department that plays a broker role, rather than working in the traditional way.

The business still needs to retain responsibility for the overall architecture and for compliance. If you do something against your company’s principles, your customers will hold you responsible. It is no defense to say, “Our broker did it that way.” Similarly, if you break the law, your broker does not go to jail, you do. So those things will continue to be more associated with the business departments, even as the rest is devolved.

In short, businesses have a new way of using IT that Open Platform 3.0 must and will accommodate.

Integration of Independently-Developed Solutions

The next figure illustrates how the integration of independently developed solutions can be achieved.


Figure 5: Architecture Integration

It shows two solutions, which come from the analysis of different business use-cases. They share a common model, which makes it much easier to integrate them. That is why the Open Platform 3.0 standard will define common models for access to the new technologies.

The Open Platform 3.0 standard will have other common artifacts: architectural principles, stakeholder definitions and descriptions, and so on. Independently-developed architectures that use them can be integrated more easily.

Enterprises develop their architectures independently, but engage with other enterprises in business ecosystems that require shared solutions. Increasingly, business relationships are dynamic, and there is no time to develop an agreed ecosystem architecture from scratch. Use of the same architecture platform, with a common architecture environment including elements such as principles, stakeholder concerns, and basic models, enables the enterprise architectures to be integrated, and shared solutions to be developed quickly.

Completing the Definition

How will we complete the definition of Open Platform 3.0?

The Open Platform 3.0 Forum recently published a set of 22 business use-cases – the Nexus of Forces in Action. These use-cases show the application of Social, Mobile and Cloud Computing, Big Data, and the Internet of Things in a wide variety of business areas.


Figure 6: Business Use-Cases

The figure comes from that White Paper and shows some of those areas: multimedia, social networks, building energy management, smart appliances, financial services, medical research, and so on.

Use-Case Analysis

We have started to analyze those use-cases. This is an ArchiMate model showing how our first business use-case, The Mobile Smart Store, could be realized.


Figure 7: Use-Case Analysis

As you look at it you see common models. Outlined on the left is a basic model that is pretty much the same as the original TOGAF Technical Reference Model. The main difference is the addition of a business layer (which shows how enterprise architecture has moved in the business direction since the TRM was defined).

But you also see that the same model appears in the use-case in a different place, as outlined on the right. It appears many times throughout the business use-cases.

Finally, you can see that the Mobile Connected Device Model has appeared in this use-case (outlined in the center). It appears in other use-cases too.

As we analyze the use-cases, we find common models, as well as common principles, common stakeholders, and other artifacts.

The Development Cycle

We have a development cycle: understanding the value of the platform by considering use-cases, analyzing those use-cases to derive common features, and documenting the common features in a specification.


Figure 8: The Development Cycle

The Open Platform 3.0 White Paper represents the very first pass through that cycle. Further passes will result in further White Papers, a snapshot, and ultimately the Open Platform 3.0 standard, no doubt in more than one version.

Conclusions

Open Platform 3.0 provides a common architecture environment. This enables enterprises to derive business value from social computing, mobile computing, big data, the Internet of Things, and potentially other new technologies.

Cognitive computing, for example, has been suggested as another technology that Open Platform 3.0 might in due course accommodate. What would that lead to? There would be additional use-cases, which would lead to further analysis, which would no doubt identify some basic models for cognitive computing, which would be added to the platform.

Open Platform 3.0 enables enterprise IT to be user-driven. There is a revolution in the way that businesses use IT. Users are becoming smarter and more able to use technology, and want to do so directly, rather than through a separate IT department. Business departments are taking in business technologists who understand how to use technology for business purposes. Some companies are closing their IT departments and using cloud brokers instead. In other companies, the IT department is taking on a broker role, sourcing technology that business people use directly. Open Platform 3.0 will be part of that revolution.

Open Platform 3.0 will deliver the ability to integrate solutions that have been independently developed. Businesses typically exist within one or more business ecosystems. Those ecosystems are dynamic: partners join, partners leave, and businesses cannot standardize the whole architecture across the ecosystem; it would be nice to do so but, by the time it was done, the business opportunity would be gone. Integration of independently developed architectures is crucial to the world of business ecosystems and delivering value within them.

Call for Input

The platform will deliver a common architecture environment, user-driven enterprise IT, and the ability to integrate solutions that have been independently developed. The Open Platform 3.0 Forum is defining it through an iterative process of understanding the content, analyzing the use-cases, and documenting the common features. We welcome input and comments from other individuals within and outside The Open Group and from other industry bodies.

If you have comments on the way Open Platform 3.0 is developing or input on the way it should develop, please tell us! You can do so by sending mail to platform3-input@opengroup.org or share your comments on our blog.

References

The Open Platform 3.0 White Paper: https://www2.opengroup.org/ogsys/catalog/W147

The Nexus of Forces in Action: https://www2.opengroup.org/ogsys/catalog/W145

TOGAF®: http://www.opengroup.org/togaf/


Dr. Chris Harding is Director for Interoperability at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0™ Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.



The Onion & The Open Group Open Platform 3.0™

By Stuart Boardman, Senior Business Consultant, KPN Consulting, and Co-Chair of The Open Group Open Platform 3.0™


The onion is widely used as an analogy for complex systems – from IT systems to mystical world views.


It’s a good analogy. From the outside it’s a solid whole, but each layer you peel off reveals a new onion (new information) underneath.

And a slice through the onion looks quite different from the whole…

What (and how much) you see depends on where and how you slice it.


The Open Group Open Platform 3.0™ is like that. Use-cases for Open Platform 3.0 reveal multiple participants and technologies (Cloud Computing, Big Data Analytics, Social Networks, Mobility and the Internet of Things) working together to achieve goals that vary by participant. Each participant’s goals represent a different slice through the onion.

The Ecosystem View
We commonly use the idea of peeling off layers to understand large ecosystems, which could be Open Platform 3.0 systems like the energy smart grid but could equally be the workings of a large cooperative or the transport infrastructure of a city. We want to know what is needed to keep the ecosystem healthy, and what effects the actions of individual participants could have on the whole and therefore on each other. So we start from the whole thing and work our way in.


The Service at the Centre of the Onion

If you’re the provider or consumer (or both) of an Open Platform 3.0 service, you’re primarily concerned with your slice of the onion. You want to be able to obtain and/or deliver the expected value from your service(s). You need to know as much as possible about the things that can positively or negatively affect that. So your concern is not the onion (ecosystem) as a whole but your part of it.

Right in the middle is your part of the service. The first level out from that consists of other participants with whom you have a direct relationship (contractual or otherwise). These are the organizations that deliver the services you consume directly to enable your own service.

One level out from that (level 2) are participants with whom you have no direct relationship but on whose services you are still dependent. It’s common in Open Platform 3.0 that your partners too will consume other services in order to deliver their services (see the use cases we have documented). You need to know as much as possible about this level, because whatever happens here can have a positive or negative effect on you.

One level further from the centre we find indirect participants who don’t necessarily deliver any part of the service but whose actions may well affect the rest. They could just be indirect materials suppliers. They could also be part of a completely different value network in which your level 1 or 2 “partners” participate. You can’t expect to understand this level in detail, but you know that how that value network performs can affect your partners’ strategy or even their very existence. The knock-on impact on your own strategy can be significant.

We can conceive of more levels but pretty soon a law of diminishing returns sets in. At each level further from your own organization you will see less detail and more variety. That in turn means that there will be fewer things you can actually know (with any certainty) and not much more that you can even guess at. That doesn’t mean that the ecosystem ends at this point. Ecosystems are potentially infinite. You just need to decide how deep you can usefully go.

Limits of the Onion
At a certain point one hits the limits of an analogy. If everybody sees their own organization as the centre of the onion, what we actually have is a bunch of different, overlapping onions.


And you can’t actually make onions overlap, so let’s not take the analogy too literally. Just keep it in mind as we move on. Remember that our objective is to ensure the value of the service we’re delivering or consuming. What we need to know therefore is what can change that’s outside of our own control and what kind of change we might expect. At each visible level of the theoretical onion we will find these sources of variety. How certain of their behaviour we can be will vary – with a tendency to the less certain as we move further from the centre of the onion. We’ll need to decide how, if at all, we want to respond to each kind of variety.

But that will have to wait for my next blog. In the meantime, here are some ways people look at the onion.



Stuart Boardman is a Senior Business Consultant with KPN Consulting, where he leads the Enterprise Architecture practice and consults to clients on Cloud Computing, Enterprise Mobility and The Internet of Everything. He is Co-Chair of The Open Group Open Platform 3.0™ Forum, was Co-Chair of the Cloud Computing Work Group’s Security for the Cloud and SOA project, and was a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by KPN, the Information Security Platform (PvIB) in The Netherlands and his previous employer, CGI, as well as several Open Group white papers, guides and standards. He is a frequent speaker at conferences on the topics of Open Platform 3.0 and Identity.


Filed under Cloud, Cloud/SOA, Conference, Enterprise Architecture, Open Platform 3.0, Service Oriented Architecture, Standards, Uncategorized

Secure Integration of Convergent Technologies – a Challenge for Open Platform 3.0™

By Dr. Chris Harding, The Open Group

The results of The Open Group Convergent Technologies survey point to secure integration of the technologies as a major challenge for Open Platform 3.0. This input, along with others, forms the basis for the definition of the platform, which was discussed at The Open Group conference in London.

Survey Highlights

Here are some of the highlights from The Open Group Convergent Technologies survey.

  • 95% of respondents felt that the convergence of technologies such as social media, mobility, cloud, big data, and the Internet of Things represents an opportunity for business.
  • Mobility currently has the greatest take-up of these technologies, and the Internet of Things has the least.
  • 84% of those from companies creating solutions want to deal with two or more of the technologies in combination.
  • Developing the understanding of the technologies by potential customers is the first problem that solution creators must overcome. This is followed by integrating with products, services and solutions from other suppliers, and using more than one technology in combination.
  • Respondents saw security, vendor lock-in, integration and regulatory compliance as the main problems for users of software that enables use of these convergent technologies for business purposes.
  • When users are considered separately from other respondents, security and vendor lock-in show particularly strongly as issues.

The full survey report is available at: https://www2.opengroup.org/ogsys/catalog/R130

Open Platform 3.0

Analysts forecast that convergence of technical phenomena including mobility, cloud, social media, and big data will drive the growth in use of information technology through 2020. Open Platform 3.0 is an initiative that will advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to use these technologies.

The survey confirms the value of an open platform to protect users of these technologies from vendor lock-in. It also shows that security is a key concern that must be addressed, that the platform must make the technologies easy to use, and that it must enable them to be used in combination.

Understanding the Requirements

The Open Group is conducting other work to develop an understanding of the requirements of Open Platform 3.0. This includes:

  • The Open Platform 3.0 Business Scenario, that was recently published, and is available from https://www2.opengroup.org/ogsys/catalog/R130
  • A set of business use cases, currently in development
  • A high-level round-table meeting to gain the perspective of CIOs, who will be key stakeholders.

The requirements input has been part of the discussion at The Open Group Conference, which took place in London this week. Monday’s keynote presentation by Andy Mulholland, former Global CTO at Capgemini, on “Just Exactly What Is Going on in Business and Technology?” included the conclusions from the round-table meeting. This week’s presentation and panel discussion on the requirements for Open Platform 3.0 covered all the inputs.

Delivering the Platform

Review of the inputs in the conference was followed by a members meeting of the Open Platform 3.0 Forum, to start developing the architecture of Open Platform 3.0, and to plan the delivery of the platform definition. The aim is to have a snapshot of the definition early in 2014, and to deliver the first version of the standard a year later.

Meeting the Challenge

Open Platform 3.0 will be crucial to establishing openness and interoperability in the new generation of information technologies. This is of first importance for everyone in the IT industry.

Following the conference, there will be an opportunity for everyone to input material and ideas for the definition of the platform. If you want to be part of the community that shapes the definition, to work on it with like-minded people in other companies, and to gain early insight of what it will be, then your company must join the Open Platform 3.0 Forum. (For more information on this, contact Chris Parnell – c.parnell@opengroup.org)

Providing for secure integration of the convergent technologies, and meeting the other requirements for Open Platform 3.0, will be a difficult but exciting challenge. I’m looking forward to continuing to tackle it with the Forum members.

Dr. Chris Harding

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.


Filed under Cloud/SOA, Conference, Data management, Future Technologies, Open Platform 3.0, Semantic Interoperability, Service Oriented Architecture, Standards

As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we’re here now to learn more about how to leverage Platform 3.0 as more than an IT shift — and as a business game-changer. It will be a big topic at next week’s conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes or technical changes going on that are bringing together a lot of factors. They’re turning into this sort of super-saturated solution of ideas and possibilities, and this emerging idea that this represents a new platform. I think it’s a pretty fundamental change.


If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and present the information in a way that people can make decisions on it. Given all that, we’re starting to realize we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in bringing Platform 3.0 together is to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt, and what business processes they need to put in place, in order to take maximum advantage of this change in the way that we look at information.

Harding: Enterprises have to keep up with the way that things are moving in order to keep their positions in their industries. Enterprises can’t afford to be working with yesterday’s technology. It’s a case of being able to understand the information that they’re presented, and make the best decisions.


We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale, the whole three Vs — velocity, volume, and value — on its own could perhaps be a game-changing shift in the market. The drive of mobile devices into the lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.


It has to do with not just one solution, not one subscription model — because we’re now into this subscription-model era … the subscription economy, as one group tends to describe it. Now, we’re looking for not only just providing the security, the infrastructure, to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 is addressing this point by bringing all of this together. Just look at the numbers. Look at the scale that we’re dealing with — 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion mobile subscriptions according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We have had massive growth in the scale of mobile data traffic and internet data expansion. Mobile data traffic is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data center traffic, combining network and internet-based storage, will reach 6.6 zettabytes annually, and nearly two thirds of this will be cloud based by 2016. This is only going to grow, as social networking is reaching nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, or a 2.55 billion global audience, by 2017, which is another extraordinary figure from an eMarketing.com study.

It is not surprising that many industry analysts see growth of 30 to 40 percent in technologies for mobility, social computing, big data and cloud convergence, and the shift to B2C commerce passing $1 trillion in 2012 is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.
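As a quick sanity check, the growth figures above reduce to simple compound-growth arithmetic. The following sketch is purely illustrative; the input numbers are the ones quoted in the studies above:

```python
# Compound annual growth rate implied by an overall growth multiple.
def cagr(multiple: float, years: int) -> float:
    return multiple ** (1 / years) - 1

# 18-fold growth in mobile data traffic from 2011 to 2016 spans 5 years.
mobile_growth = cagr(18, 5)
print(f"Implied annual growth in mobile data: {mobile_growth:.0%}")  # about 78%

# 130 exabytes of annual mobile traffic against the predicted 6.6 zettabytes
# (6,600 exabytes) of data center traffic: mobile is roughly 2% of that total.
print(f"Mobile share of predicted data center traffic: {130 / 6600:.1%}")
```

In other words, an 18-fold rise over five years implies mobile data volumes nearly doubling every year, which is what makes the scale argument so compelling.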

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases that have emerged in the open-source world and things of that nature, but there are also the analytics and the data stewardship aspects of it.

We need to bring in mechanisms so that the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of the enterprise IT organization or where end users drive the selection, to provide analytic tools and recommendation tools that take the data and turn it into information. One of the things you can’t do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally — how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points about how the shape of data has changed, but also about why IT should get involved. We’re seeing that there’s a shift in the constituency of who is using this data.

We have the Chief Marketing Officer and the Chief Procurement Officer and other key line of business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We’ve got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be sorted into new types of mobile devices, new types of data intelligence, and ways of delivering this kind of service.

I read recently in MIT Sloan Management Review an article that asked what the role of the CIO is. There is still the critical role of managing the security, compliance, and performance of these systems. But there’s also a socialization of IT, and this is where positioning architectures that are cross-platform is key to delivering real value to the business users in the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it’s not only that a whole set of different people are getting involved with different kinds of information, but there’s also a step change in the speed with which all this is delivered. It’s no longer the case that you can say, “Oh well, we need some kind of information system to manage this information. We’ll procure it and get a program written,” and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It’s really a case of having the platform, both the standard technology and the business processes for using it, understood and in place.

Then, you can do all these things quickly and build on learning from what people have done in the past, rather than going off into all sorts of new experimental things that might not lead anywhere. It’s a case of building up the standard platform and the industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practice and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. It’s the platform scalability of understanding what you can do with your current legacy systems in introducing cloud computing or big data, and the infrastructure that gives you this, what we call multiplexing of resources. We’re very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

Going into networking and network scalability, a lot of the customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it’s all about connectivity in the field. I meet a number of clients who are saying, “We’ve got this cloud service,” or “This service is in a certain area of my country. If I move to another part of the country, or I’m traveling, I can’t get connectivity.” That’s the big issue of scaling.

Another one is application programming interfaces (APIs). What we’re seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities for what Chris Anderson of Wired used to call the “long tail effect.” It is now a reality in terms of building the kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are where they need to start thinking for IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formational stages of the “third platform,” or Platform 3.0, for The Open Group as an industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 Forum. We’re leveraging a lot of the components that have been done previously by the work of the members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of Things.

First step

Our first step is to bring those things together to make sure that we’ve got a foundation to depart from. The next thing is that, through our Platform 3.0 Forum and the Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from things like the technological aspects and what standards are needed to, taking a cue from our previous cloud working group, the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them.

What we’re really working toward in Philadelphia is to set up an exchange of ideas among the people who can bring in their use cases from the buy side and their ideas about the technology possibilities from the supply side, bring those together, and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We’ve heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, we can see how powerful the data becomes when processed properly, with recommendations delivered to the right place at the right time. But we also recognize that there are limits to a manual, or even human-level, approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, there are already some early examples of bringing cloud, data, social, mobile, and granular interactions together, where we’ve begun to see how a recommendation engine can be brought to bear. I’m thinking about the Siri capability at Apple and even some of the examples of the Watson technology at IBM.

So to our panel, are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a supercomputer or data center of supercomputers, brought to bear on almost any problem instantly, with the result delivered directly to a center, a smart phone, any number of end points?

It seems that the potential here is mind boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet. The advent of IPv6 and the explosion in multimedia services will start to drive the next generation of the Internet.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, the insight into what’s happening, and the predictive nature of these, it becomes something much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing, data, traffic, devices and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it’s something that’s really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly, in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but for businesses alone, that to me would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to gain significant advantage and first-mover benefits?

Harding: Businesses always are taking stock. They understand their environments. They understand how the world that they live in is changing and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, “So now this is where we could make a change to our business.” It’s the vision moment where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and I’m very happy to say this, it’s something that computers aren’t going to be able to do for a very long time yet. It’s going to really be down to business people to do this as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling that gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years, but in the next couple of technology cycles, that we’ll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data, and all of the medical textbooks ever written are data. Pick your industry, and there is a huge amount of knowledge that humans must currently keep on top of.

This approach, and these advances in recommendation engines driven by the availability of big data, are going to produce profound changes in the way knowledge workers do their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.


Filed under ArchiMate®, Business Architecture, Cloud, Cloud/SOA, Conference, Data management, Enterprise Architecture, Platform 3.0, Professional Development, TOGAF®

Beyond Big Data

By Chris Harding, The Open Group

The big bang that started The Open Group Conference in Newport Beach was, appropriately, a presentation related to astronomy. Chris Gerty gave a keynote on Big Data at NASA, where he is Deputy Program Manager of the Open Innovation Program. He told us how visualizing deep space and its celestial bodies created understanding and enabled new discoveries. Everyone who attended felt inspired to explore the universe of Big Data during the rest of the conference. And that exploration – as is often the case with successful space missions – left us wondering what lies beyond.

The Big Data Conference Plenary

The second presentation on that Monday morning brought us down from the stars to the nuts and bolts of engineering. Mechanical devices require regular maintenance to keep functioning. Processing the mass of data generated during their operation can improve safety and cut costs. For example, airlines can overhaul aircraft engines when it needs doing, rather than on a fixed schedule that has to be frequent enough to prevent damage under most conditions, but might still fail to anticipate failure in unusual circumstances. David Potter and Ron Schuldt lead two of The Open Group initiatives, Quantum Lifecycle Management (QLM) and the Universal Data Element Framework (UDEF). They explained how a semantic approach to product lifecycle management can facilitate the big-data processing needed to achieve this aim.

Chris Gerty was then joined by Andras Szakal, vice-president and chief technology officer at IBM US Federal IMT, Robert Weisman, chief executive officer of Build The Vision, and Jim Hietala, vice-president of Security at The Open Group, in a panel session on Big Data that was moderated by Dana Gardner of Interarbor Solutions. As always, Dana facilitated a fascinating discussion. Key points made by the panelists included: the trend to monetize data; the need to ensure veracity and usefulness; the need for security and privacy; the expectation that data warehouse technology will exist and evolve in parallel with map/reduce “on-the-fly” analysis; the importance of meaningful presentation of the data; integration with cloud and mobile technology; and the new ways in which Big Data can be used to deliver business value.
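The contrast the panel drew between data warehouse technology and map/reduce “on-the-fly” analysis is easier to see with a minimal sketch of the map/reduce pattern. This toy runs in a single process; real systems distribute the same two phases across a cluster using frameworks such as Hadoop:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record: str):
    # The map step emits (key, value) pairs from each input record;
    # here, a count of 1 per word in a line of text.
    for word in record.lower().split():
        yield word, 1

def reduce_phase(pairs):
    # The reduce step groups the emitted pairs by key and aggregates them.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

records = ["big data needs big tools", "data beats opinion"]
counts = reduce_phase(chain.from_iterable(map_phase(r) for r in records))
print(counts)  # {'big': 2, 'data': 2, 'needs': 1, 'tools': 1, 'beats': 1, 'opinion': 1}
```

Because the map step is independent per record and the reduce step only needs grouped keys, both phases parallelize naturally, which is what lets this style of analysis run “on the fly” over very large data sets.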

More on Big Data

In the afternoons of Monday and Tuesday, and on most of Wednesday, the conference split into streams. These have presentations that are more technical than the plenary, going deeper into their subjects. It’s a pity that you can’t be in all the streams at once. (At one point I couldn’t be in any of them, as there was an important side meeting to discuss the UDEF, which is in one of the areas that I support as forum director). Fortunately, there were a few great stream presentations that I did manage to get to.

On the Monday afternoon, Tom Plunkett and Janet Mostow of Oracle presented a reference architecture that combined Hadoop and NoSQL with traditional RDBMS, streaming, and complex event processing, to enable Big Data analysis. One application that they described was to trace the relations between particular genes and cancer. This could have big benefits in disease prediction and treatment. Another was to predict the movements of protesters at a demonstration through analysis of communications on social media. The police could then concentrate their forces in the right place at the right time.

Jason Bloomberg, president of Zapthink – now part of Dovel – is always thought-provoking. His presentation featured the need for governance vitality to cope with ever-changing tools to handle Big Data of ever-increasing size, “crowdsourcing” to channel the efforts of many people into solving a problem, and business transformation that is continuous rather than a one-time step from “as is” to “to be.”

Later in the week, I moderated a discussion on Architecting for Big Data in the Cloud. We had a well-balanced panel made up of TJ Virdi of Boeing, Mark Skilton of Capgemini and Tom Plunkett of Oracle. They made some excellent points. Big Data analysis provides business value by enabling better understanding, leading to better decisions. The analysis is often an iterative process, with new questions emerging as answers are found. There is no single application that does this analysis and provides the visualization needed for understanding, but there are a number of products that can be used to assist. The role of the data scientist in formulating the questions and configuring the visualization is critical. Reference models for the technology are emerging but there are as yet no commonly-accepted standards.

The New Enterprise Platform

Jogging is a great way of taking exercise at conferences, and I was able to go for a run most mornings before the meetings started at Newport Beach. Pacific Coast Highway isn’t the most interesting of tracks, but on Tuesday morning I was soon up in Castaways Park, pleasantly jogging through the carefully-nurtured natural coastal vegetation, with views over the ocean and its margin of high-priced homes, slipways, and yachts. I reflected as I ran that we had heard some interesting things about Big Data, but it is now an established topic. There must be something new coming over the horizon.

The answer to what this might be was suggested in the first presentation of that day’s plenary. Mary Ann Mezzapelle, security strategist for HP Enterprise Services, talked about the need to get security right for Big Data and the Cloud. But her scope was actually wider. She spoke of the need to secure the “third platform” – the term coined by IDC to describe the convergence of social, cloud and mobile computing with Big Data.

Securing Big Data

Mary Ann’s keynote was not about the third platform itself, but about what should be done to protect it. The new platform brings with it a new set of security threats, and the increasing scale of operation makes it increasingly important to get the security right. Mary Ann presented a thoughtful analysis founded on a risk-based approach.

She was followed by Adrian Lane, chief technology officer at Securosis, who pointed out that Big Data processing using NoSQL has a different architecture from traditional relational data processing, and requires different security solutions. This does not necessarily mean new techniques; existing techniques can be used in new ways. For example, Kerberos may be used to secure inter-node communications in map/reduce processing. Adrian’s presentation completed the Tuesday plenary sessions.
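To make Adrian’s point about reusing existing techniques concrete: in Hadoop, switching inter-node authentication from the default “simple” mode to Kerberos is a configuration change rather than new machinery. A minimal fragment of the standard `core-site.xml` might look like this (the property names are Hadoop’s own; a real cluster also needs Kerberos principal and keytab settings, omitted here):

```xml
<configuration>
  <!-- Authenticate nodes and clients with Kerberos instead of
       trusting asserted identities -->
  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
  </property>
  <!-- Enable service-level authorization checks -->
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
  </property>
</configuration>
```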

Service Oriented Architecture

The streams continued after the plenary. I went to the Distributed Services Architecture stream, which focused on SOA.

Bill Poole, enterprise architect at JourneyOne in Australia, described how to use the graphical architecture modeling language ArchiMate® to model service-oriented architectures. He illustrated this using a case study of a global mining organization that wanted to consolidate its two existing bespoke inventory management applications into a single commercial off-the-shelf application. It’s amazing how a real-world case study can make a topic come to life, and the audience certainly responded warmly to Bill’s excellent presentation.

Ali Arsanjani, chief technology officer for Business Performance and Service Optimization, and Heather Kreger, chief technology officer for International Standards, both at IBM, described the range of SOA standards published by The Open Group and available for use by enterprise architects. Ali was one of the brains that developed the SOA Reference Architecture, and Heather is a key player in international standards activities for SOA, where she has helped The Open Group’s Service Integration Maturity Model and SOA Governance Framework to become international standards, and is working on an international standard SOA reference architecture.

Cloud Computing

To start Wednesday’s Cloud Computing streams, TJ Virdi, senior enterprise architect at The Boeing Company, discussed use of TOGAF® to develop an Enterprise Architecture for a Cloud ecosystem. A large enterprise such as Boeing may use many Cloud service providers, enabling collaboration between corporate departments, partners, and regulators in a complex ecosystem. Architecting for this is a major challenge, and The Open Group’s TOGAF for Cloud Ecosystems project is working to provide guidance.

Stuart Boardman of KPN gave a different perspective on Cloud ecosystems, with a case study from the energy industry. An ecosystem may not necessarily be governed by a single entity, and the participants may not always be aware of each other. Energy generation and consumption in the Netherlands is part of a complex international ecosystem involving producers, consumers, transporters, and traders of many kinds. A participant may be involved in several ecosystems in several ways: a farmer, for example, might consume energy, have wind turbines to produce it, and also participate in food production and transport ecosystems.

Penelope Gordon of 1-Plug Corporation explained how choice and use of business metrics can impact Cloud service providers. She worked through four examples: a start-up Software-as-a-Service provider requiring investment, an established company thinking of providing its products as cloud services, an IT department planning to offer an in-house private Cloud platform, and a government agency seeking budget for a government Cloud.

Mark Skilton, director at Capgemini in the UK, gave a presentation titled “Digital Transformation and the Role of Cloud Computing.” He covered a very broad canvas of business transformation driven by technological change, and illustrated his theme with a case study from the pharmaceutical industry. New technology enables new business models, giving competitive advantage. Increasingly, the introduction of this technology is driven by the business, rather than the IT side of the enterprise, and it has major challenges for both sides. But what new technologies are in question? Mark’s presentation had Cloud in the title, but also featured social and mobile computing, and Big Data.

The New Trend

On Thursday morning I took a longer run, to and round Balboa Island. With only one road in or out, its main street of shops and restaurants is not a through route and the island has the feel of a real village. The SOA Work Group Steering Committee had found an excellent, and reasonably priced, Italian restaurant there the previous evening. There is a clear resurgence of interest in SOA, partly driven by the use of service orientation – the principle, rather than particular protocols – in Cloud Computing and other new technologies. That morning I took the track round the shoreline, and was reminded a little of Dylan Thomas’s “fishingboat-bobbing sea.” Fishing here is for leisure rather than livelihood, but I suspected that the fishermen, like those of Thomas’s little Welsh village, spend more time in the bar than on the water.

I thought about how the conference sessions had indicated an emerging trend. This is not a new technology but the combination of four current technologies to create a new platform for enterprise IT: Social, Cloud, and Mobile computing, and Big Data. Mary Ann Mezzapelle’s presentation had referenced IDC’s “third platform.” Other discussions had mentioned Gartner’s “Nexus of Forces,” the combination of Social, Cloud and Mobile computing with information that Gartner says is transforming the way people and businesses relate to technology, and will become a key differentiator of business and technology management. Mark Skilton had included these same four technologies in his presentation. Great minds, and analyst corporations, think alike!

I thought also about the examples and case studies in the stream presentations. Areas as diverse as healthcare, manufacturing, energy and policing are using the new technologies. Clearly, they can deliver major business benefits. The challenge for enterprise architects is to maximize those benefits through pragmatic architectures.

Emerging Standards

On the way back to the hotel, I remarked again on what I had noticed before, how beautifully neat and carefully maintained the front gardens bordering the sidewalk are. I almost felt that I was running through a public botanical garden. Is there some ordinance requiring people to keep their gardens tidy, with severe penalties for anyone who leaves a lawn or hedge unclipped? Is a miserable defaulter fitted with a ball and chain, not to be removed until the untidy vegetation has been properly trimmed, with nail clippers? Apparently not. People here keep their gardens tidy because they want to. The best standards are like that: universally followed, without use or threat of sanction.

Standards are an issue for the new enterprise platform. Apart from the underlying standards of the Internet, there really aren’t any. The area isn’t even mapped out. Vendors of Social, Cloud, Mobile, and Big Data products and services are trying to stake out as much valuable real estate as they can. They have no interest yet in boundaries with neatly-clipped hedges.

This is a stage that every new technology goes through. Then, as it matures, the vendors understand that their products and services have much more value when they conform to standards, just as properties have more value in an area where everything is neat and well-maintained.

It may be too soon to define those standards for the new enterprise platform, but it is certainly time to start mapping out the area, to understand its subdivisions and how they inter-relate, and to prepare the way for standards. Following the conference, The Open Group has announced a new Forum, provisionally titled Open Platform 3.0, to do just that.

The SOA and Cloud Work Groups

Thursday was my final day of meetings at the conference. The plenary and streams presentations were done. This day was for working meetings of the SOA and Cloud Work Groups. I also had an informal discussion with Ron Schuldt about a new approach for the UDEF, following up on the earlier UDEF side meeting. The conference hallways, as well as the meeting rooms, often see productive business done.

The SOA Work Group discussed a certification program for SOA professionals, and an update to the SOA Reference Architecture. The Open Group is working with ISO and the IEEE to define a standard SOA reference architecture that will have consensus across all three bodies.

The Cloud Work Group had met earlier to further the TOGAF for Cloud ecosystems project. Now it worked on its forthcoming white paper on business performance metrics. It also – though this was not on the original agenda – discussed Gartner’s Nexus of Forces, and the future role of the Work Group in mapping out the new enterprise platform.

Mapping the New Enterprise Platform

At the start of the conference we looked at how to map the stars. Big Data analytics enables people to visualize the universe in new ways, reach new understandings of what is in it and how it works, and point to new areas for future exploration.

As the conference progressed, we found that Big Data is part of a convergence of forces. Social, mobile, and Cloud Computing are being combined with Big Data to form a new enterprise platform. The development of this platform, and its roll-out to support innovative applications that deliver more business value, is what lies beyond Big Data.

At the end of the conference we were thinking about mapping the new enterprise platform. This will not require sophisticated data processing and analysis. It will take discussions to create a common understanding, and detailed committee work to draft the guidelines and standards. This work will be done by The Open Group’s new Open Platform 3.0 Forum.

The next Open Group conference is in the week of April 15, in Sydney, Australia. I’m told that there’s some great jogging there. More importantly, we’ll be reflecting on progress in mapping Open Platform 3.0, and thinking about what lies ahead. I’m looking forward to it already.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.
