Tag Archives: information technology

Business Benefit from Public Data

By Dr. Chris Harding, Director for Interoperability, The Open Group

Public bodies worldwide are making a wealth of information available, and encouraging its commercial exploitation. This sounds like a bonanza for the private sector at the public's expense, but entrepreneurs are holding back. A healthy market for products and services that use public-sector information would provide real benefits for everyone. What can we do to bring it about?

Why Governments Give Away Data

The EU directive of 2003 on the reuse of public sector information encourages the Member States to make as much information available for reuse as possible. This directive was revised and strengthened in 2013. The U.S. Open Government Directive of 2009 provides similar encouragement, requiring US government agencies to post at least three high-value data sets online and register them on the data.gov portal. Other countries have taken similar measures to make public data publicly available.

Why are governments doing this? There are two main reasons.

One is that it improves the societies that they serve and the governments themselves. Free availability of information about society and government makes people more effective citizens and makes government more efficient. It illuminates discussion of civic issues, and points a searchlight at corruption.

The second reason is that it has a positive effect on the wealth of nations and their citizens. The EU directive highlights the ability of European companies to exploit the potential of public-sector information, and contribute to economic growth and job creation. Information is not just the currency of democracy. It is also the lubricant of a successful economy.

Success Stories

There are some big success stories.

If you drive a car, you probably use satellite navigation to find your way about, and this may use public-sector information. In the UK, for example, map data that can be used by sat-nav systems is supplied for commercial use by a government agency, the Ordnance Survey.

When you order something over the web for delivery to your house, you often enter a postal code and see most of the address auto-completed by the website. Postcode databases are maintained by national postal authorities, which are generally either government departments or regulated private corporations, and made available by them for commercial use. Here, the information is not directly supporting a market, but is contributing to the sale of a range of unrelated products and services.
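
A rough sketch of how that auto-completion feature can work is shown below. The postcodes, addresses, and function names are invented for illustration; a real implementation would sit on a licensed, regularly updated postcode database from the postal authority rather than a hard-coded table.

```python
# Minimal sketch of postcode-based address auto-completion.
# The postcode table here is invented; in practice it would be a licensed,
# regularly refreshed dataset from the national postal authority.
from typing import Optional

POSTCODE_TABLE = {
    "AB1 2CD": {"street": "High Street", "town": "Sampleton", "country": "UK"},
    "EF3 4GH": {"street": "Mill Lane", "town": "Sampleton", "country": "UK"},
}

def autocomplete_address(postcode: str) -> Optional[dict]:
    """Return the address fields for a postcode, or None if it is unknown."""
    return POSTCODE_TABLE.get(postcode.strip().upper())

if __name__ == "__main__":
    print(autocomplete_address("ab1 2cd"))
    # -> {'street': 'High Street', 'town': 'Sampleton', 'country': 'UK'}
```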

The data may not be free. There are commercial arrangements for supply of map and postcode data. But it is available, and is the basis for profitable products and for features that make products more competitive.

The Bonanza that Isn’t

These successes are, so far, few in number. The economic benefits of open government data could be huge. The McKinsey Global Institute estimates a potential of between 3 and 5 trillion dollars annually. Yet the direct impact of Open Data on the EU economy in 2010, seven years after the directive was issued, was estimated by Capgemini at only about 1% of that, although the EU accounts for nearly a quarter of world GDP.

The business benefits to be gained from using map and postcode data are obvious. There are other kinds of public-sector data where the business benefits may be substantial but not easy to see. For example, data is or could be available about public transport schedules and availability, about population densities, characteristics and trends, and about real estate and land use. These are all areas that support substantial business activity, but businesses in these areas seldom make use of public sector information today.

Where are the Products?

Why are entrepreneurs not creating these potentially profitable products and services? There is one obvious reason. The data they are interested in is not always available and, where it is available, it is provided in different ways, and comes in different formats. Instead of a single large market, the entrepreneur sees a number of small markets, none of which is worth tackling. For example, the market for an application that plans public transport journeys across a single town is not big enough to justify substantial investment in product development. An application that could plan journeys across any town in Europe would certainly be worthwhile, but is not possible unless all the towns make this data available in a common format.
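
To make the "common format" point concrete, here is a minimal sketch, with invented town names, stops, and times, of what a shared schedule format buys a developer: a single query function works across every town that publishes its data in the agreed shape. In practice, an existing common transit format such as GTFS plays this role.

```python
# One query function works across towns only because every town publishes
# departures in the same agreed format. Towns, stops, and times are invented.
COMMON_FORMAT_FEEDS = {
    "Anytown": [
        {"route": "12", "from_stop": "Station", "to_stop": "Market", "departs": "08:05"},
        {"route": "12", "from_stop": "Station", "to_stop": "Market", "departs": "08:35"},
    ],
    "Otherville": [
        {"route": "3", "from_stop": "Harbour", "to_stop": "Old Town", "departs": "08:10"},
    ],
}

def next_departure(town, from_stop, after):
    """First departure from `from_stop` in `town` at or after time `after` (HH:MM)."""
    candidates = [d for d in COMMON_FORMAT_FEEDS.get(town, [])
                  if d["from_stop"] == from_stop and d["departs"] >= after]
    return min(candidates, key=lambda d: d["departs"]) if candidates else None

print(next_departure("Anytown", "Station", "08:15"))
# -> {'route': '12', 'from_stop': 'Station', 'to_stop': 'Market', 'departs': '08:35'}
```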

Public sector information providers often do not know what value their data has, or understand its applications. Working within tight budgets, they cannot afford to spend large amounts of effort on assembling and publishing data that will not be used. They follow the directives but, without common guidelines, they simply publish whatever is readily to hand, in whatever form it happens to be.

The data that could support viable products is not available everywhere and, where it is available, it comes in different formats. (One that is often used is PDF, which is particularly difficult to process as an information source.) The result is that the cost of product development is high, and the expected return is low.

Where is the Market?

There is a second reason why entrepreneurs hesitate. The shape of the market is unclear. In a mature market, everyone knows who the key players are, understands their motivations, and can predict to some extent how they will behave. The market for products and services based on public sector information is still taking shape. No one is even sure what kinds of organization will take part, or what they will do. How far, for example, will public-sector bodies go in providing free applications? Can large corporations buy future dominance with loss-leader products? Will some unknown company become an overnight success, like Facebook? With these unknowns, the risks are very high.

Finding the Answers

Public sector information providers and standards bodies are tackling these problems. The Open Group participates in SHARE-PSI, the European network for the exchange of experience and ideas around implementing open data policies in the public sector. The experience gained by SHARE-PSI will be used by the World Wide Web Consortium (W3C) as a basis for standards and guidelines for publication of public sector information. These standards and guidelines may be used, not just by the public sector, but by not-for-profit bodies and even commercial corporations, many of which have information that they want to make freely available.

The Open Group is making a key contribution by helping to map the shape of the market. It is using the Business Scenario technique from its well-known Enterprise Architecture framework TOGAF® to identify the kinds of organization that will take part, and their objectives and concerns.

There will be a preview of this on October 22 at The Open Group event in London, which will feature a workshop session on Open Public Sector Data. This workshop will look at how Open Data can help business, present a draft of the Business Scenario, and take input from participants to help develop its conclusions.

The developed Business Scenario will be presented at the SHARE-PSI workshop in Lisbon on December 3-4. The theme of this workshop is encouraging open data usage by commercial developers. It will bring a wide variety of stakeholders together to discuss and build the relationship between the public and private sectors. It will also address, through collaboration with the EU LAPSI project, the legal framework for use of open public sector data.

Benefit from Participation!

If you are thinking about publishing or using public-sector data, you can benefit from these workshops by gaining an insight into the way that the market is developing. In the long term, you can influence the common standards and guidelines that are being developed. In the short term, you can find out what is happening and network with others who are interested.

The social and commercial benefits of open public-sector data are not being realized today. They can be realized through a healthy market in products and services that process the data and make it useful to citizens. That market will emerge when public bodies and businesses clearly understand the roles that they can play. Now is the time to develop that understanding and begin to profit from it.

Register for The Open Group London 2014 event at http://www.opengroup.org/london2014/registration.

Find out how to participate in the Lisbon SHARE-PSI workshop at http://www.w3.org/2013/share-psi/workshop/lisbon/#Participation

 

Dr. Chris Harding is Director for Interoperability at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0™ Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.



The Open Group Panel: Internet of Things – Opportunities and Obstacles

Below is the transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data.

Listen to the podcast.

Dana Gardner: Hello, and welcome to a special BriefingsDirect thought leadership interview series coming to you in conjunction with the recent The Open Group Boston 2014 conference, held on July 21 in Boston.

I’m Dana Gardner, principal analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow.

We’re going to now specifically delve into the Internet of Things with a panel of experts. The conference has examined how Open Platform 3.0™ leverages the combined impacts of cloud, big data, mobile, and social. But to each of these now we can add a new cresting wave of complexity and scale as we consider the rapid explosion of new devices, sensors, and myriad endpoints that will be connected using internet protocols, standards and architectural frameworks.

This means more data, more cloud connectivity and management, and an additional tier of “things” that are going to be part of the mobile edge — and extending that mobile edge ever deeper into even our own bodies.

When we think about inputs to these social networks — that’s going to increase as well. Not only will people be tweeting; your devices could very well tweet, too, using social networks to communicate. Perhaps your toaster will soon be sending you a tweet about your English muffins being ready each morning.

The Internet of Things is more than the “things” – it means a higher order of software platforms. For example, if we are going to operate data centers with new dexterity thanks to software-defined networking (SDN) and storage (SDS) — indeed the entire data center being software-defined (SDDC) — then why not a software-defined automobile, or factory floor, or hospital operating room — or even a software-defined city block or neighborhood?

And so how does this all actually work? Does it easily spin out of control? Or does it remain under proper management and governance? Do we have unknown unknowns about what to expect with this new level of complexity, scale, and volume of input devices?

Will architectures arise that support the numbers involved, interoperability, and provide governance for the Internet of Things — rather than just letting each type of device do its own thing?

To help answer some of these questions, The Open Group assembled a distinguished panel to explore the practical implications and limits of the Internet of Things. So please join me in welcoming Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC, and a primary representative to the Industrial Internet Consortium; Penelope Gordon, Emerging Technology Strategist at 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM, and Dave Lounsbury, Chief Technical Officer at The Open Group.

Jean-Francois, we have heard about this notion of “cities as platforms,” and I think the public sector might offer us some opportunity to look at what is going to happen with the Internet of Things, and then extrapolate from that to understand what might happen in the private sector.

Hypothetically, the public sector has a lot to gain. It doesn’t have to operate within the confines of commercial market development, the profit motive, and that sort of thing. Tell us a little bit about what the opportunity is in the public sector for smart cities.

Jean-Francois Barsoum: It’s immense. The first thing I want to do is link to something that Marshall Van Alstyne (Professor at Boston University and Researcher at MIT) had talked about, because I was thinking about his way of approaching platforms and thinking about how cities represent an example of that.

You don’t have customers; you have citizens. Cities are starting to see themselves as platforms, as ways to communicate with their customers, their citizens, to get information from them and to communicate back to them. But the complexity with cities is that, as good a platform as they could be, they’re relatively rigid. They’re legislated into existence and what they’re responsible for is written into law. It’s not really a market.

Chris Harding (Forum Director of The Open Group Open Platform 3.0) earlier mentioned, for example, water and traffic management. Cities could benefit greatly by managing traffic a lot better.

Part of the issue is that you might have a state or provincial government that looks after highways. You might have the central part of the city that looks after arterial networks. You might have a borough that would look after residential streets, and these different platforms end up not talking to each other.

They gather their own data. They put in their own widgets to collect information that concerns them, but do not necessarily share with their neighbor. One of the conditions that Marshall said would favor the emergence of a platform had to do with how much overlap there would be in your constituents and your customers. In this case, there’s perfect overlap. It’s the same citizen, but they have to carry an Android and an iPhone, despite the fact it is not the best way of dealing with the situation.

The complexities are proportional to the amount of benefit you could get if you could solve them.

Gardner: So more interoperability issues?

Barsoum: Yes.

More hurdles

Gardner: More hurdles, and when you say commensurate, you’re saying that the opportunity is huge, but the hurdles are huge and we’re not quite sure how this is going to unfold.

Barsoum: That’s right.

Gardner: Let’s go to an area where the opportunity outstrips the challenge: manufacturing. Said, what is the opportunity for the software-defined factory floor to realize huge efficiencies and apply algorithmic benefits to how management occurs across the domains of supply chain, distribution, and logistics? It seems to me that this is a no-brainer. It’s such an opportunity that the solution must be found.

Said Tabet: When it comes to manufacturing, the opportunities are probably much bigger. It’s where a lot of progress has already been made, and work is still going on. There are two ways to look at it.

One is the internal side of it, where you have improvements of business processes. For example, similar to what Jean-Francois said, in a lot of the larger companies that have factories all around the world, you’ll see such improvements at the individual factory level. You still have those silos at that level.

Now with this new technology, with this connectedness, those improvements are going to be made across factories, and there’s a learning aspect to it in terms of trying to manage that data. In fact, they do a better job. We still have to deal with interoperability, of course, and additional issues that could be jurisdictional, etc.

However, there is that learning that allows them to improve their processes across factories. Maintenance is one of them, as well as creating new products, and connecting better with their customers. We can see a lot of examples in the marketplace. I won’t mention names, but there are lots of them out there with the large manufacturers.

Gardner: We’ve had just-in-time manufacturing and lean processes for quite some time, trying to compress the supply chain and distribution networks, but these haven’t necessarily been done through public networks, the internet, or standardized approaches.

But if we’re to benefit, we’re going to need to be able to be platform companies, not just product companies. How do you go from being a proprietary set of manufacturing protocols and approaches to this wider, standardized interoperability architecture?

Tabet: That’s a very good question, because now we’re talking about that connection to the customer. With the airline and the jet engine manufacturer, for example, when the plane lands and there has been some monitoring of the activity during the whole flight, at that moment, they’ll get that data made available. There could be improvements and maybe solutions available as soon as the plane lands.

Interoperability

That requires interoperability. It requires Platform 3.0 for example. If you don’t have open platforms, then you’ll deal with the same hurdles in terms of proprietary technologies and integration in a silo-based manner.

Gardner: Penelope, you’ve been writing about the obstacles to decision-making that might become apparent as big data becomes more prolific and people try to capture all the data about all the processes and analyze it. That’s a little bit of a departure from the way we’ve made decisions in organizations, public and private, in the past.

Of course, one of the bigger tenets of the Internet of Things is all this great data that will be available to us from so many different points. Is there a conundrum of some sort? Is there an unknown obstacle for how we, as organizations and individuals, can deal with that data? Is this going to be chaos, or is this going to live up to all the promises that many organizations have led us to believe about big data and the Internet of Things?

Penelope Gordon: It’s something that has just been accelerated. This is not a new problem in terms of the decision-making styles not matching the inputs that are being provided into the decision-making process.

Former US President Bill Clinton was known for delaying making decisions. He’s a head-type decision-maker, and so he would always want more data and more data. That just gets into a never-ending loop, because as people collect data for him, there is always more data that you can collect, particularly on the quantitative side. Whereas, if it is distilled down and presented very succinctly and then balanced with the qualitative, that allows intuition to come to the fore, and you can make optimal decisions in that fashion.

Conversely, if you have someone who is a heart-type or gut-type decision-maker and you present them with a lot of data, their first response is to ignore the data. It’s just too much for them to take in. Then you end up going entirely with whatever you feel is correct, whatever your instinct says is the correct decision. If you’re talking about strategic decisions, where you’re making a decision that’s going to influence your direction five years down the road, that could be a very wrong decision to make, a very expensive decision, and as you said, it could be chaos.

It just brings to mind Dr. Seuss’s The Cat in the Hat, with Thing One and Thing Two. So, as we talk about the Internet of Things, we need to keep in mind that we need some sort of structure that we tie this back to, and an understanding of what we are trying to do with these things.

Gardner: Openness is important, and governance is essential. Then, we can start moving toward higher-order business platform benefits. But, so far, our panel has been a little bit cynical. We’ve heard that the opportunity and the challenges are commensurate in the public sector and that in manufacturing we’re moving into a whole new area of interoperability, when we think about reaching out to customers and having a boundary that is managed between internal processes and external communications.

And we’ve heard that an overload of data could become a very serious problem and that we might not get benefits from big data through the Internet of Things, but perhaps even stumble and have less quality of decisions.

So Dave Lounsbury of The Open Group, will the same level of standardization work? Do we need a new type of standards approach, a different type of framework, or is this a natural path, a continuation of what we have done in the past?

Different level

Dave Lounsbury: We need to look at the problem at a different level than the one at which we institutionally think about an interoperability problem. The Internet of Things is riding two very powerful waves. One is Moore’s Law: these sensors, actuators, and networks get smaller and smaller. Now we can put Ethernet in a light switch, a tag, or something like that.

The other is Metcalfe’s Law, which says that the value of all this connectivity goes up with the square of the number of connected points. That applies both to the connection of the things and, more importantly, to the connection of the data.

The trouble is, as we have said, that there’s so much data here. The question is how do you manage it and how do you keep control over it so that you actually get business value from it. That’s going to require this new concept of a platform that not only connects the data, but aggregates it, correlates it, as you said, and presents it in ways that let people make decisions however they want.

Also, because of the raw volume, we have to start thinking about machine agency. We have to think about the system actually making the routine decisions or giving advice to the humans who are actually doing it. Those are important parts of the solution beyond just a simple “How do we connect all the stuff together?”

Gardner: We might need a higher order of intelligence, now that we have reached this border of what we can do with our conventional approaches to data, information, and process.

Thinking about where this works best first in order to then understand where it might end up later, I was intrigued again this morning by Professor Van Alstyne. He mentioned that in healthcare, we should expect major battles, that there is a turf element to this, that the organization, entity or even commercial corporation that controls and manages certain types of information and access to that information might have some very serious platform benefits.

The openness element now is something to look at, and I’ll come back to the public sector. Is there a degree of openness that we could legislate or regulate, with enough control to prevent the next generation of lock-in, which might not be to a platform but to access to data, information, and endpoints? Where in the public sector might we look for a leadership position to establish needed openness, and not just interoperability?

Barsoum: I’m not even sure where to start answering that question. To take healthcare as an example, I certainly didn’t write the bible on healthcare IT systems and if someone did write that, I think they really need to publish it quickly.

We have a single-payer system in Canada, and you would think that would be relatively easy to manage. There is one entity that manages paying the doctors, and everybody gets covered the same way. Therefore, the data should be easily shared among all the players and it should be easy for you to go from your doctor, to your oncologist, to whomever, and maybe to your pharmacy, so that everybody has access to this same information.

We don’t have that and we’re nowhere near having that. If I look to other areas in the public sector, areas where we’re beginning to solve the problem are ones where we face a crisis, and so we need to address that crisis rapidly.

Possibility of improvement

In the transportation infrastructure, we’re getting to that point where the infrastructure we have just doesn’t meet the needs. There’s a constraint in terms of money, and we can’t put much more money into the structure. Then, there are new technologies that are coming in. Chris had talked about driverless cars earlier. They’re essentially throwing a wrench into the works or may be offering the possibility of improvement.

On any given piece of infrastructure, you could fit twice as many driverless cars as cars with human drivers in them. Given that set of circumstances, the governments are going to find they have no choice but to share data in order to be able to manage those. Are there cases where we could go ahead of a crisis in order to manage it? I certainly hope so.

Gardner: How about allowing some of the natural forces of marketplaces, behavior, groups, maybe even chaos theory, where if sufficient openness is maintained there will be some kind of a pattern that will emerge? We need to let this go through its paces, but if we have artificial barriers, that might be thwarted or power could go to places that we would regret later.

Barsoum: I agree. People often focus on structure: the governance doesn’t work, so we should find some way to change the governance of transportation. London has done a very good job of that. They’ve created something called Transport for London, which manages everything related to transportation. It doesn’t matter if it’s taxis, bicycles, pedestrians, boats, cargo trains, or whatever; they manage it.

You could do that, but it requires a lot of political effort. The other way to go about doing it is saying, “I’m not going to mess with the structures. I’m just going to require you to open and share all your data.” So, you’re creating a new environment where the governance, the structures, don’t really matter so much anymore. Everybody shares the same data.

Gardner: Said, to the private sector example of manufacturing, you still want to have a global fabric of manufacturing capabilities. This is requiring many partners to work in concert, but with a vast new amount of data and new potential for efficiency.

How do you expect that openness will emerge in the manufacturing sector? How will interoperability play when you don’t have to wait for legislation, but you do need to have cooperation and openness nonetheless?

Tabet: It comes back to the question you asked Dave about standards. I’ll just give you some examples. For example, in the automotive industry, there have been some activities in Europe around specific standards for communication.

The Europeans came to the US and started to have discussions, and the Japanese have interest, as well as the Chinese. That shows, because there is a common interest in creating these new models from a business standpoint, that these challenges have to be dealt with together.

Managing complexity

When we talk about the amounts of data, what we now call big data, what we are going to see in about five years or so is more than you can even imagine. How do we manage that complexity, which is multidimensional? We talked about this sort of platform and, further, the capability and the data that will be there. From that point of view, openness is the only way to go.

There’s no way that we can stay away from it and still be able to work in silos in that new environment. There are lots of things that we take for granted today. I invite some of you to go back and read articles from 10 years ago that tried to predict the future of technology in the 21st century. Look at your smartphones. Adoption is there, because the business models are there, and we can see that progress moving forward.

Collaboration is a must, because it works at a multidimensional level. It’s not just manufacturing, such as jet engines, cars, or agriculture, where you have very specific areas. They really have to work with their customers and the customers of their customers.


Gardner: Dave, I have a question for both you and Penelope. I’ve seen some instances where there has been a cooperative endeavor for accessing data, but then making it available as a service, whether it’s an API, a data set, access to a data library, or even an analytics application set. The Ocean Observatories Initiative is one example: it has created a sensor network across the oceans and makes the resulting data available.

Do you expect to see an intermediary organization level that gets between the sensors and the consumers, or even the controllers of the processes? Is there a model inherent in that that we might look to, something like that cooperative data structure, which in some ways creates structure and governance but also allows for freedom? It’s sort of an entity that we don’t have yet in many organizations or ecosystems, and that needs to evolve.

Lounsbury: We’re already seeing that in the marketplace. If you look at the commercial and social Internet of Things area, we’re starting to see intermediaries or brokers cropping up that will connect the silo of my Android ecosystem to the ecosystem of package tracking or something like that. There are dozens and dozens of these cropping up.

In fact, you now see APIs even into a silo of what you might consider a proprietary system, and what people are doing is building a layer on top of those APIs that intermediates the data.

This is happening on a point-to-point basis now, but you can easily see the path forward. That’s going to expand to large amounts of data that people will share through a third party. I can see this being a whole new emerging market much as what Google did for search. You could see that happening for the Internet of Things.

Gardner: Penelope, do you have any thoughts about how that would work? Is there a mutually assured benefit that would allow people to want to participate and cooperate with that third entity? Should they have governance and rules about good practices, best practices for that intermediary organization? Any thoughts about how data can be managed in this sort of hierarchical model?

Nothing new

Gordon: First, I’ll contradict it a little bit. To me, a lot of this is nothing new, particularly coming from a marketing strategy perspective, with business intelligence (BI). Having various types of intermediaries who not only collect the data, but then do what we call data hygiene, synthesis, and even correlation of the data, has been around for a long time.

It was interesting, when I looked at a recent listing of big-data companies, that some notable companies were excluded from that list — companies like Nielsen. Nielsen’s been collecting data for a long time. Harte-Hanks is another one that collects a tremendous amount of information and sells that to companies.

That leads into another part of it that I think there’s going to be. We’re seeing an increasing amount of opportunity that involves taking public sources of data and then providing synthesis on top of it. What remains to be seen is how much of the output of that is going to be provided for “free”, as opposed to for a “fee”. We’re going to see a lot more companies figuring out creative ways of extracting more value out of data and then charging directly for that, rather than using it as an indirect way of generating traffic.

Gardner: We’ve seen examples of how this has been put in place. Does it scale, and does the governance, or lack of governance, that might be in the market now sustain us through the transition into Platform 3.0 and the Internet of Things?

Gordon: That aspect leads on to “you get what you pay for”. If you’re using a free source of data, you don’t have any guarantee that it comes from authoritative sources. Often, what we’re getting now is something somebody put in a blog post, which then gets referenced elsewhere, but there is nothing to go back to. It’s a shaky supply chain for data.

You need to think about the data supply and that is where the governance comes in. Having standards is going to increasingly become important, unless we really address a lot of the data illiteracy that we have. A lot of people do not understand how to analyze data.

One aspect of that is that a lot of people expect that we have to do full population surveys, as opposed to representative sampling, which gives much more accurate and much more cost-effective collection of data. That’s just one example, and we do need a lot more in governance and standards.

Gardner: What would you like to see changed most in order for the benefits and rewards of the Internet of Things to develop and overcome the drawbacks, the risks, the downside? What, in your opinion, would you like to see happen to make this a positive, rapid outcome? Let’s start with you Jean-Francois.

Barsoum: There are things that I have seen cities start to do now. There are a couple of examples: Philadelphia is one, and Barcelona does this too. Rather than do the typical request for proposal (RFP), where they say, “This is the kind of solution we’re looking for, and here are our parameters. Can you tell us how much it is going to cost to build?” they come to you with the problem and say, “Here is the problem I want to fix. Here are my priorities, and you’re at liberty to decide how best to fix the problem, but tell us how much that would cost.”

If you do that and you combine it with access to the public data that is available — if public sector opens up its data — you end up with a very powerful combination that liberates a lot of creativity. You can create a lot of new business models. We need to see much more of that. That’s where I would start.

More education

Tabet: I agree with Jean-Francois on that. What I’d like to add is that I think we need to push the relation a little further. We need more education, to your point earlier, around the data and the capabilities.

We need these platforms that we can leverage a little bit further with the analytics, with machine learning, and with all of these capabilities that are out there. We have to also remember, when we talk about the Internet of Things, it is things talking to each other.

So it is not just human-to-machine communication. Machine-to-machine automation will go further than that, and we need more innovation and more work in this area, particularly more activity from governments. We’ve seen some, but it is a little bit frail from that point of view right now.

Gardner: Dave Lounsbury, thoughts about what needs to happen in order to keep this on track?

Lounsbury: We’ve touched on lot of them already. Thank you for mentioning the machine-to-machine part, because there are plenty of projections that show that it’s going to be the dominant form of Internet communication, probably within the next four years.

So we need to start thinking of that and moving beyond our traditional models of humans talking through interfaces to set of services. We need to identify the building blocks of capability that you need to manage, not only the information flow and the skilled person that is going to produce it, but also how you manage the machine-to-machine interactions.

Gordon: I’d like to see not so much focus on data management, but focus on what the data is helping us to do. Focusing on the machine-to-machine and the devices is great, but the focus should be not on the devices or the machines themselves; it should be on what they can accomplish by communicating, what you can accomplish with the devices, and then reverse-engineer from that.

Gardner: Let’s go to some questions from the audience. The first one asks about the higher order of intelligence we mentioned earlier. It could be artificial intelligence, perhaps, but they ask whether that’s really the issue. Is the nature of the data substantially different, or are we just creating more of the same, so that it is a storage, plumbing, and processing problem? What, if anything, is lacking in our current analytics capabilities that is holding us back from exploiting the Internet of Things?

Gordon: I’ve definitely seen that. That has a lot to do with not setting your decision objectives and your decision criteria ahead of time so that you end up collecting a whole bunch of data, and the important data gets lost in the mix. There is a term “data smog.”

Most important

The solution is to figure out, before you go collecting data, what data is most important to you. If you can’t collect certain kinds of data that are important to you directly, then think about how to indirectly collect that data and how to get proxies. But don’t try to go and collect all the data for that. Narrow in on what is going to be most important and most representative of what you’re trying to accomplish.

Gardner: Does anyone want to add to this idea of understanding what current analytics capabilities are lacking, if we have to adopt and absorb the Internet of Things?

Barsoum: There is one element around projection into the future. We’ve been very good at analyzing historical information to understand what’s been happening in the past. We need to become better at projecting into the future, and obviously we’ve been doing that for some time already.

But so many variables are changing. Just to take the driverless car as an example. We’ve been collecting data from loop detectors, radar detectors, and even Bluetooth antennas to understand how traffic moves in the city. But we need to think harder about what that means and how we understand the city of tomorrow is going to work. That requires more thinking about the data, a little bit like what Penelope mentioned, how we interpret that, and how we push that out into the future.

Lounsbury: I have to agree with both. It’s not about statistics. We can use historical data. It helps with lot of things, but one of the major issues we still deal with today is the question of semantics, the meaning of the data. This goes back to your point, Penelope, around the relevance and the context of that information – how you get what you need when you need it, so you can make the right decisions.

Gardner: Our last question from the audience goes back to Jean-Francois’s comments about the Canadian healthcare system. I imagine it applies to almost any healthcare system around the world. But it asks why interoperability is so difficult to achieve, when we have the power of the purse, that is, the market. We also supposedly have the power of legislation and regulation. You would think that, with one or the other or both, interoperability would happen, because the stakes are so high. What’s holding it up?

Barsoum: There are a couple of reasons. One, in the particular case of healthcare, is privacy, but that is one that you could see going elsewhere. As soon as you talk about interoperability in the health sector, people start wondering where their data is going to go, how accessible it is going to be, and to whom.

You need to put a certain number of controls over top of that. What is happening in parallel is that you have people who own some data, who believe they have some power from owning that data, and that they will lose that power if they share it. That can come from doctors, hospitals, anywhere.

So there’s a certain amount of change management you have to get beyond. Everybody has to focus on the welfare of the patient. They have to understand that there has to be a priority, but you also have to understand the welfare of the different stakeholders in the system and make sure that you do not forget about them, because if you forget about them they will find some way to slow you down.

Use of an ecosystem

Lounsbury: To me, that’s a perfect example of what Marshall Van Alstyne talked about this morning. It’s the change from focus on product to a focus on an ecosystem. Healthcare traditionally has been very focused on a doctor providing product to patient, or a caregiver providing a product to a patient. Now, we’re actually starting to see that the only way we’re able to do this is through use of an ecosystem.

That’s a hard transition. It’s a business-model transition. I will put in a plug here for The Open Group Healthcare vertical, which is looking at that from an architecture perspective. I see that our Forum Director Jason Lee is over here. So if you want to explore that more, please see him.

Gardner: I’m afraid we will have to leave it there. We’ve been discussing the practical implications of the Internet of Things and how it is now set to add a new dimension to Open Platform 3.0 and Boundaryless Information Flow.

We’ve heard how new thinking about interoperability will be needed to extract the value and bring order out of the chaos with such vast new scales of inputs and whole new categories of information.

So with that, a big thank you to our guests: Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC; Penelope Gordon, Emerging Technology Strategist at 1Plug Corp.; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM, and Dave Lounsbury, Chief Technology Officer at The Open Group.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow at The Open Group Conference, recently held in Boston. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript.

Transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2014. All rights reserved.




Enterprise Architecture: A Practitioner View

By Prasad Palli and Dr. Gopala Krishna Behara, Wipro

Overview of Enterprise Architecture

IT organizations, as usual, are always ready to take on challenges and start the journey of defining and refining their IT strategies and aligning them with business strategies. During this journey, enterprises adopt a framework, methodology, best practice, pattern, and process called “Enterprise Architecture”, which helps them structure their processes and address growth together.

The effective management and exploitation of information through IT is a key factor in business success, and an indispensable means of achieving competitive advantage. Enterprise Architecture addresses this need by providing a strategic context for the evolution of IT systems in response to the constantly changing needs of the business environment.

Without Enterprise Architecture

Based on our experience in Enterprise Architecture consulting, we highlight below the common mistakes and frequent issues faced by organizations in the absence of Enterprise Architecture.

Strategy

  • No link to business strategic planning and budget process
  • Slow and ineffective decision-making
  • Inability to rapidly respond to changes driven by business challenges
  • Lack of focus on enterprise requirements
  • Lack of common direction and synergies
  • Focusing on the art or language of EA rather than outcomes
  • Incomplete visibility of the current and future target Enterprise Architecture vision

Governance

  • Inability to predict impacts of future changes
  • Confusing “IT Architecture” with “Enterprise Architecture”
  • Lack of governance
  • Overly rigid adherence to EA frameworks
  • “Ivory Tower” approach
  • Lack of communication and feedback
  • Limiting the EA team to IT resources
  • Lack of performance measures
  • No measurement criteria for EA metrics
  • Picking a tool before understanding your business needs

Technology

  • Increased gaps and architecture conflicts
  • Lack of commonality and consistency due to the absence of standards
  • Dilution and dissipation of critical information and knowledge of the deployed solutions
  • Rigidity, redundancy and lack of scalability and flexibility in the deployed solutions
  • Over-standardization
  • Non-adoption of Next Generation Technologies
  • Lack of integration, compatibility and interoperability between applications
  • Complex, fragile and costly interfaces between incongruent applications

Enterprise Architecture Perspective

The main drivers of Enterprise Architecture for an enterprise are:

  • Highly optimized and flexible processes (Business & IT)
  • Ability to integrate seamlessly with systems within the enterprise and partners
  • Highly optimized and shared IT infrastructure
  • Loosely coupled systems that can quickly respond to a new process, product, or channel – business value generation
  • Clear mapping of business processes to applications, information, and technology
  • Strict adherence to regulatory and compliance factors

This article highlights our framework for Enterprise Architecture and its roadmap for the development and management of the various components. It depicts how these components work together, and what the various measures of business units and the enterprise are, together with their outcomes. The framework includes putting in place the proper organizational structure and hybrid business/IT roles, consolidating and standardizing information and data stores, and integrating applications and infrastructure to support the right business processes across the enterprise.

The key components of Enterprise Architecture are depicted below.

[Figure: Key components of Enterprise Architecture]

EA – Practical Experience

Enterprise Architecture is not a one-time event, nor limited to specific projects or business units. EA is an on-going, iterative process that provides:

  • A common vision of the future shared by business and IT; business aware of IT and vice-versa
  • Guidance in the selection, creation and implementation of solutions driven by business requirements
  • Support for the various enterprise business lines through improved information sharing – it provides a plan for the integration of information and services at the design level across business lines
  • A means to control the growing complexity of technology by setting enterprise-wide, leverageable standards for information technology
  • An approach for the evaluation, consideration and assimilation of new and emerging technology innovations to meet business requirements

Some of the key aspects that teams will come across during EA execution:

  • EA is NOT a project: This is one of the most common mistakes that enterprises make. Enterprise Architecture is NOT a project that can be delivered within a specified timeframe. Enterprise Architecture is more of a culture that enterprises must adopt, like the SDLC process.
  • EA is NOT about review: Generally, people tend to think that EA exists to review and police team or individual performance and to provide review reports to higher management. Instead, EA is about bringing in standards and making the enterprise flexible enough to address changes as needed for business growth.
  • EA is NOT a one-time activity: The success of EA is possible only when enterprises adopt it as part of their culture. For this to happen, Enterprise Architecture should be executed as an iterative, ongoing process that educates all stakeholders (business, portfolio managers, architects, program/project managers, designers, developers, operations, partners, etc.) about the initiative and makes them responsible for EA success.
  • EA is NOT for IT: Most of the time, the Enterprise Architecture initiative is driven by IT organizations without much involvement from the business. This is the first step towards a big failure. Depending upon the approach (whether it is top-down or bottom-up), the business should be aware of what’s happening in the Enterprise Architecture initiative and should actively participate in the program when needed. The business is equally as responsible as IT for the success of an EA initiative.
  • EA is NOT a strategy: There is a common view across organizations that Enterprise Architecture is more of a strategy, and that teams such as solution architecture, portfolio management, design and development, and operations streams don’t have a role to play. In fact, these teams are key contributors to the Enterprise Architecture definition and its success, by inculcating EA standards and best practices in their day-to-day activities.
  • EA is NOT all about cost reduction: Most enterprises look at EA from a cost-savings perspective, which puts a lot of pressure on IT to show immediate benefits in terms of savings. Under this kind of pressure, EA gets off track and is seen as a tactical initiative rather than a strategic one. Enterprises should start looking at EA more in terms of business-IT alignment, agility, innovation, and other strategic outcomes, along with cost savings.
  • EA is NOT a one-man show: Enterprise Architecture is not the job of the CIO, the CFO, or any single CXO. It’s everybody’s job within an enterprise. During the EA strategy definition phase, more leadership involvement is probably needed; at the EA implementation stage, all stakeholders have a role to play and contribute one way or another.
  • EA is all about communication: One of the common mistakes that enterprises make during an EA program is that the team works in silos and builds a huge pile of documents without proper communication sessions across the enterprise. At a minimum, the EA team should spend 50% of its effort communicating EA artifacts to stakeholders, and the most successful medium is meetings rather than emails or a website.
  • Measure EA: During the initial stages of an EA program, the team should define the measurement criteria and factors for EA (for example: customer satisfaction, time to market, agility, cost savings, standardization, resource skills, training/certification, etc.). Without these factors defined, EA ends up in ad-hoc planning, which leads to chaos and frustrates leadership.
  • Adoption of Latest Technology Trends in EA: Traditional EA takes more of an “Ivory Tower” approach, modeled as framework-centered and tool-driven. Most of the EA function is technology-centric and defined as a one-time initiative. Applications built on Traditional EA principles are business-constrained before they are even completed. Next Generation Enterprise Architecture (NGEA) is business-centric, global, agile, continuous, and built around the social digital network. Organizations are also adopting the latest digital capabilities, such as the social web, SOA, big data analytics, omni-channel customer management, cloud computing, virtualization, the Internet of Things, and so on. These technologies are interrelated and fit together to define the Next Generation Enterprise Architecture for an organization.

The vision of an enterprise is shifting from Traditional EA to Digital Architecture, which addresses networked community capabilities (interacting with users through social media), globalization (the borderless enterprise), innovation of products and services (open, closed and virtual innovation), collaboration (enabling employees in decision-making, location flexibility, schedule flexibility), and flexibility (the flexibility to choose technologies, infrastructure, and applications).

The following diagram shows the Next Generation EA Model.

[Figure: Next Generation EA Model]

  • Network-centric enterprise: Online communities, workforce (network/social collaboration), business partners, customers and the marketplace
  • Enterprise resources: Teams, project-centric, process-based work conducted by communities
  • Business partners: Strategic partners and suppliers can be engaged together in operations
  • Customers: Customer care communities
  • Outside enterprise: Regulators, influencers, crowdsourcing participants, software developers and other interested parties
  • Third-party vendors: Packaged-software vendors such as SAP, Oracle ERP, etc.
  • New channels: Web, mobile devices, social business environments (communities of all functional types and audiences), and CRM

Conclusions

This article attempts to demonstrate the practical views of an Enterprise Architect on improving the success rate of EA across organizations. There is no hard and fast rule that enterprises should adopt one particular framework, standard, or approach. They can choose any industry framework and customize it to the needs of the enterprise; EA programs should not be force-fit to any industry framework. The deliverables of EA should integrate with business planning, focus on business architecture, and define and streamline business outcome metrics.

EA program definition should not span years; it should deliver business value in months or weeks. The program output should also be actionable. Always measure impact, not activity.

Apart from these steps, enterprises should think about the following other key aspects:

  • Secure strong leadership commitment
  • Do not always start from the as-is state; EA can instead begin by defining the future state
  • Start with the highest-priority business outcomes

  • Use the right diagnostic tools: EAs must have a broad set of tools to choose from

  • Ensure the program outputs are actionable
  • Measure impact, not activity
  • Adopt Next Generation Enterprise Architecture patterns
  • Socialize, listen, crowd source and be transparent
  • Do not re-architect legacy systems for the sake of re-architecting: most old systems should be wrapped, then replaced
  • Prepare to measure degree of success before starting on with the new architecture initiative
  • Do not over-design your systems of innovation or under-design the systems of differentiation or record


Acknowledgements

The authors would like to thank Hari Kishan Burle and Raju Alluri of the Architecture Group of Wipro Technologies for giving us the required time and support in many ways in bringing out this article as part of the Enterprise Architecture Practice efforts.

Authors

Prasad Palli is a Practice Partner in the Enterprise Architecture division of Wipro. He has a total of 17 years of IT experience. He can be reached at prasad.palli@wipro.com.

 

Dr. Gopala Krishna Behara is a Senior Enterprise Architect in the Enterprise Architecture division of Wipro. He has a total of 18 years of IT experience. He can be reached at gopalkrishna.behra@wipro.com.

 

Disclaimer

The views expressed in this article/presentation are those of the authors, and Wipro does not subscribe to the substance, veracity or truthfulness of the said opinions.



Using The Open Group Standards – O-ISM3 with TOGAF®

By Jose Salamanca, UST Global, and Vicente Aceituno, Inovement

In order to prevent duplication of work and maximize the value provided by the Enterprise Architecture and Information Security disciplines, it is necessary to find ways to communicate and take advantage of each other’s work. We have been examining the relationship between O-ISM3 and TOGAF®, both Open Group standards, and have found that, terminology differences aside, there are quite a number of ways to use these two standards together. We’d like to share our findings with The Open Group’s audience of Enterprise Architects, IT professionals, and Security Architects in this article.

Any ISMS manager needs to understand what the Security needs of the business are, how IT can cater for these needs, and how Information Security can contribute the most with the least amount of resources possible. Conversely, Enterprise Architects are challenged to build Security into the architectures deployed in the business in such a way that Security operations may be managed effectively.

There are parts of Enterprise Architecture that make the process of understanding the dependencies between the business and IT pretty straightforward. For example:

  • The TOGAF® 9 document “Business Principles – Goals – Drivers” will tell the O-ISM3 practitioner what the business is about, in other words, what needs to be protected.
  • The TOGAF 9 Architecture Definition document contains the Application, Technology and Data Domains, and the Business Domain. Because a TOGAF service is a subdivision of an application used by one or several business functions, the O-ISM3 practitioner can understand the needs of the business, developed and expressed as O-ISM3 Security objectives and Security targets, by interviewing the business process owners (found in the TOGAF Architecture Definition).
  • To determine how prepared applications are to meet those Security objectives and Security targets, the O-ISM3 practitioner can interview the owner (found in the TOGAF Application Portfolio Catalog) of each application.
  • To check the location of the Components (the parts of an application from the point of view of IT), which can have licensing and privacy-protection implications, the O-ISM3 practitioner can interview the data owners (found in the TOGAF Architecture Definition) of each application.
  • To check the different Roles of use of an application, which will direct how access control is designed and operated, the O-ISM3 practitioner can interview the business process owners (found in the TOGAF Architecture Definition).
  • To understand how Components depend on each other, which has broad-reaching implications for Security and business continuity, the O-ISM3 practitioner can examine the TOGAF Logical Application Components Map.

TOGAF practitioners, in turn, can find Security constraints (documented in “TOGAF 9 Architecture Vision” and “Data Landscape”), which are equivalent to O-ISM3 Security objectives, in the O-ISM3 documents TSP-031 Information Security Targets and TSP-032 Information Requirements and Classification.

The Application Portfolio artifact in TOGAF is especially suitable for documenting how applications are categorized from the point of view of security. That categorization makes it possible to prioritize how they are protected.

The Security requirements created in O-ISM3, namely Security objectives and Security targets, should be included in the “Requirements TOGAF 9 Template – Architecture Requirements Specification” document, which contains all of the requirements, constraints, and assumptions.
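To make that hand-off concrete, here is a minimal sketch of how the mapping could be recorded programmatically. The class and field names (SecurityObjective, SecurityTarget, ArchitectureRequirement) are illustrative assumptions for this sketch only, not terms defined by the O-ISM3 or TOGAF metamodels.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SecurityTarget:
    """An O-ISM3-style measurable target attached to a security objective."""
    metric: str          # e.g. "access review interval"
    threshold: str       # e.g. "90 days"

@dataclass
class SecurityObjective:
    """An O-ISM3-style security objective derived from business needs."""
    identifier: str      # e.g. "SO-01"
    statement: str       # what must be protected, and from what
    targets: List[SecurityTarget] = field(default_factory=list)

@dataclass
class ArchitectureRequirement:
    """One entry in a TOGAF Architecture Requirements Specification."""
    source: str          # which security objective produced this entry
    category: str        # requirement, constraint, or assumption
    description: str

def to_architecture_requirements(objective: SecurityObjective) -> List[ArchitectureRequirement]:
    """Translate a security objective and its targets into requirement
    entries that can be appended to the Architecture Requirements
    Specification, keeping traceability back to the objective."""
    entries = [ArchitectureRequirement(
        source=objective.identifier,
        category="requirement",
        description=objective.statement)]
    for target in objective.targets:
        entries.append(ArchitectureRequirement(
            source=objective.identifier,
            category="constraint",
            description=f"{target.metric} shall not exceed {target.threshold}"))
    return entries

# Example: a confidentiality objective with one measurable target
objective = SecurityObjective(
    identifier="SO-01",
    statement="Customer records are accessible only to authorised claim handlers",
    targets=[SecurityTarget("access review interval", "90 days")])
for req in to_architecture_requirements(objective):
    print(req.category, "-", req.description)
```

In practice the same mapping would live in the architecture repository or requirements tooling; the point of the sketch is simply that every Security objective and target ends up traceable to an entry in the Architecture Requirements Specification.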

What are your views and experiences of aligning your ISMS + Enterprise Architecture methods? We’d love to hear your thoughts.

 

José Salamanca is Regional Head of Solutions & Services at UST Global Spain. He is certified in TOGAF® 9, Project Management Professional (PMP®), and EFQM®. José also holds an Executive MBA from the Business European School (Spain) and earned his BSc at Universidad Complutense de Madrid. He is Vice President of the Spanish chapter of the Association of Enterprise Architects and Master Teacher at Universidad Antonio de Nebrija in Madrid. José has built his professional career with repeated successes in Europe and the Middle East.

 

 

Vicente Aceituno is the principal author of O-ISM3 and an experienced Information Security Manager and consultant with broad experience in the outsourcing of security services and in research. His focus is information security outsourcing, management, and related fields such as metrics and certification of ISMS. Vicente is President of the Spanish chapter of the Information Security Systems Association, a member of The Open Group Security Forum Steering Committee, Secretary of the Spanish chapter of the Association of Enterprise Architects, and an ISMS Forum member.


Filed under Enterprise Architecture, Enterprise Transformation, Information security, Security, Security Architecture, Standards, TOGAF®, Uncategorized

Case Study – ArchiMate®, An Open Group Standard: Public Research Centre Henri Tudor and Centre Hospitalier de Luxembourg

By The Open Group

The Public Research Centre Henri Tudor is an institute of applied research aimed at reinforcing the innovation capacity of organizations and companies and at providing support for national policies and international recognition of Luxembourg's scientific community. Its activities include applied and experimental research; doctoral research; the development of tools, methods, labels, certifications and standards; technological assistance; consulting and watch services; and knowledge and competency transfer. Its main technological domains are advanced materials, environmental technologies, Healthcare, and information and communication technologies, as well as business organization and management. The Centre applies its competencies across a number of industries including Healthcare, industrial manufacturing, mobile, transportation and financial services, among others.

In 2012, the Centre Hospitalier de Luxembourg allowed Tudor to experiment with an access rights management system modeled using ArchiMate®, an Open Group standard. This model was tested by CRP Tudor to confirm the approach used by the hospital’s management to grant employees, nurses and doctors permission to access patient records.

Background

The Centre Hospitalier de Luxembourg is a public hospital that focuses on severe pathologies, medical and surgical emergencies and palliative care. The hospital also has an academic research arm. It employs a staff of approximately 2,000, including physicians, medical specialists, nurses, specialized employees and administrative staff. On average, the hospital handles more than 450,000 outpatient services, 30,000 inpatient services and more than 60,000 adult and pediatric emergency services per year.

Unlike many hospitals throughout the world, the Centre Hospitalier de Luxembourg is open and accessible 24 hours a day, seven days a week, so access to patient records is required at any time of day, on any day of the week. In addition, the Grand Duchy of Luxembourg has a system in which weekend medical emergencies are allocated to a single hospital in each of the country's three regions. In other words, every two weeks one hospital within a given region is responsible for all incoming medical emergencies on its assigned weekend, which affects patient volume and activity.

Access rights management

As organizations have become not only increasingly global but also increasingly digital, access rights management has become a critical component of keeping institutional information secure so that it does not fall into the wrong hands. Managing access to internal information is a critical component of every company’s security strategy, but it is particularly important for organizations that deal with sensitive information about consumers, or in the case of the Centre Hospitalier de Luxembourg, patients.

Modeling an access rights management system was important for the hospital for a number of reasons. First, European privacy laws dictate that only the people who require information regarding patient medical files should be allowed access to those files. Although privacy laws may restrict access to patient records, a rights management system must be flexible enough to grant access to the correct individuals when necessary.

In the case of a hospital such as the Centre Hospitalier de Luxembourg, access to information may be critical to the life of the patient. For instance, if a patient is admitted to the emergency room, the emergency room physician will be able to treat the patient better if he or she can access the patient's records, even without being the patient's primary care physician. Admitting personnel may also need access to records at the time of admittance. A successful access rights management system must therefore strike a balance between restricting information and providing flexible access as necessary, giving the right access at the right time without placing an administrative burden on doctors or staff.
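One common way to strike that balance is a “break-glass” rule: routine access follows assigned roles, while emergency access is granted immediately but logged for later review. The sketch below illustrates the idea only; the roles and permissions are invented for the example and do not describe the hospital's actual mechanism.

```python
from datetime import datetime, timezone

# Hypothetical role-to-permission table; real systems derive this from
# job responsibilities rather than hard-coding it.
ROLE_PERMISSIONS = {
    "primary_care_physician": {"read_record", "update_record"},
    "er_physician": {"read_record"},
    "admitting_clerk": {"read_demographics"},
}

audit_log = []

def can_access(role: str, action: str, emergency: bool = False) -> bool:
    """Grant access by role; in an emergency, grant immediately but record
    the override so it can be reviewed afterwards."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    if not allowed and emergency:
        audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "role": role,
            "action": action,
            "note": "break-glass override, review required",
        })
        return True
    return allowed

# An ER physician reads a record normally; a clerk reading a full record
# outside their role triggers a logged override instead of a refusal.
print(can_access("er_physician", "read_record"))                     # True
print(can_access("admitting_clerk", "read_record", emergency=True))  # True, logged
```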

The project

Prior to the experiment in which the Public Research Centre Henri Tudor tested this access rights management model, the Centre Hospitalier de Luxembourg had not experienced any problems with its information sharing system. However, its access rights were still being managed by a primarily paper-based system. As part of the scope of the project, the hospital was also looking to become compliant with existing privacy laws. Developing an access rights management model was intended to close the gap between restricting access to patient information overall and granting new rights, as necessary, so that employees could do their work without endangering patient lives. From a technical perspective, the access rights management system needed not only to work in conjunction with the applications already used within the hospital, such as the ERP system, but also to support rights management at the business layer.

Most current access rights management systems grant individuals access to information based on a combination of the functional requirements employees need to do their jobs and governance rights, which provide the protections that keep the organization and its information safe and secure. Most access control models and rights engineering methods, however, do not adequately represent both sides of this equation, so determining the correct level of access for different employees within an organization can be difficult.

Modeling access rights management

Within the Centre Hospitalier de Luxembourg, employee access rights were defined based on individual job responsibilities and job descriptions. To determine how best to grant access rights across a hospital, the Public Research Centre Henri Tudor needed to create a system that could take these responsibilities into account, rather than rely solely on functional or governance requirements.

To create an access rights management model that would work with the hospital’s existing processes and ERP software, the Public Research Centre Henri Tudor first needed to come up with a way to model responsibility requirements instead of just functional or governance requirements. According to Christophe Feltus, Research Engineer at the Public Research Centre, defining a new approach based on actor or employee responsibilities was the first step in creating a new model for the hospital.
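The responsibility-first idea can be sketched in a few lines of code. The structures below are a deliberate simplification, not the ReMMo metamodel itself, but they show how permissions could be derived from what an employee is accountable for rather than being assigned directly.

```python
from dataclasses import dataclass
from typing import Dict, Set

@dataclass(frozen=True)
class Responsibility:
    """Something an actor is accountable for, and the access it implies."""
    name: str
    required_permissions: frozenset

# Illustrative responsibilities only; a real model is elicited from job
# descriptions and business process owners.
RESPONSIBILITIES: Dict[str, Responsibility] = {
    "triage_emergency_patients": Responsibility(
        "triage_emergency_patients", frozenset({"read_record", "update_triage_notes"})),
    "schedule_appointments": Responsibility(
        "schedule_appointments", frozenset({"read_demographics", "update_schedule"})),
}

def derive_permissions(assigned: Set[str],
                       catalogue: Dict[str, Responsibility] = RESPONSIBILITIES) -> Set[str]:
    """Derive an actor's permissions from the responsibilities assigned to them."""
    permissions: Set[str] = set()
    for name in assigned:
        permissions |= catalogue[name].required_permissions
    return permissions

# A nurse responsible for triage receives exactly the rights that duty implies.
print(derive_permissions({"triage_emergency_patients"}))
```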

Although existing architecture modeling languages provide views for many different types of stakeholders within organizations—from executives to IT and project managers—no modeling language had previously been used to develop a view dedicated to access rights management, Feltus says. As such, that view needed to be created and modeled anew for this project.

To develop this new view, the Public Research Centre needed to find an architecture modeling language that was flexible enough to accommodate such an extension. After evaluating three separate modeling languages, they chose ArchiMate®, an Open Group Standard and open and independent modeling language, to help them visualize the relationships among the hospital’s various employees in an unambiguous way.

Much like architectural drawings are used in building architecture to describe the various aspects of construction and building use, ArchiMate provides a common language for describing how to construct business processes, organizational structures, information flows, IT systems and technical infrastructures. By providing a common language and visual representation of systems, ArchiMate helps stakeholders within organizations design, assess and communicate how decisions and changes within business domains will affect the organization.

According to Feltus, ArchiMate provided a well-formalized language for the Public Research Centre to portray the architecture needed to model the access rights management system they wanted to propose for the Centre Hospitalier. Because ArchiMate is a flexible and open language, it also provided an extension mechanism that could accommodate the responsibility modeling language (ReMMo) that the engineering team had developed for the hospital.

In addition to providing the tools and extensions necessary for the engineering team to properly model the hospital's access rights system, the Public Research Centre also chose ArchiMate because it is an open and vendor-neutral modeling language. As a publicly funded institution, it was important for the Public Research Centre to avoid vendor-specific tools that would lock it into a potentially costly cycle of constant version upgrades.

“What was very interesting [about ArchiMate] was that it was an open and independent solution. This is very important for us. As a public company, it’s preferable not to use private solutions. This was something very important,” said Feltus.

Feltus notes that using ArchiMate to model the access rights project was also a relatively easy and intuitive process. “It was rather easy,” Feltus said. “The concepts are clear and recommendations are well done, so it was easy to explore the framework.” The most challenging part of the project was selecting which extension mechanism would best portray the design and model they wanted to use.

Results

After the access rights model had been developed using ArchiMate, the Public Research Centre Henri Tudor presented the responsibility metamodel to the hospital's IT staff. The Public Research Centre team believes that the responsibility model created using ArchiMate allows for better alignment between the hospital's business processes, defined at the business layer, and the IT applications run at the application layer. The team also believes the model could both enhance the provisioning of access rights to employees and improve the hospital's performance. For example, using the proposed responsibility model, the team found that some employees in the reception department had been assigned more permissions than they required in practice. Comparing the research findings with the reality on the ground at the hospital has shown the Public Research Centre team that ArchiMate is an effective tool for modeling and determining both responsibilities and access rights within organizations.
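That kind of finding can be reproduced mechanically: given the permissions an employee actually holds and the permissions their modeled responsibilities imply, the difference is the excess to review. The snippet below is a minimal sketch of that comparison, with invented reception-desk data rather than the hospital's real entitlements.

```python
def excess_permissions(assigned: set, required: set) -> set:
    """Permissions an employee holds beyond what their responsibilities imply."""
    return assigned - required

# Hypothetical reception-desk example, echoing the kind of result described above.
assigned = {"read_demographics", "update_schedule", "read_record"}
required = {"read_demographics", "update_schedule"}

print(excess_permissions(assigned, required))   # {'read_record'} -> flag for review
```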

Due to the ease of use and success the Public Research Centre Henri Tudor experienced in using ArchiMate to create the responsibility model and the access rights management system for the hospital, Tudor also intends to continue to use ArchiMate for other public and private research projects as appropriate.

Follow The Open Group @theopengroup, #ogchat, and/or let us know your thoughts on the blog.

 


Filed under ArchiMate®, Healthcare, Standards, Uncategorized

The Open Group Boston 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Enabling Boundaryless Information Flow™ continued in Boston on Tuesday, July 22. Allen Brown, CEO and President of The Open Group, welcomed attendees with an overview of the organization's second-quarter results.

The Open Group membership is at 459 organizations in 39 countries, including 16 new membership agreements in 2Q 2014.

Membership value is highlighted by the collaboration Open Group members experience. For example, over 4,000 individuals attended Open Group events, physically and virtually, whether at member meetings, webinars, podcasts, or tweet jams. The Open Group website had more than 1 million page views, and over 105,000 publication items were downloaded by members in 80 countries.

Brown also shared highlights from The Open Group Forums, which featured status updates on many upcoming white papers, snapshots, reference models and standards, as well as individual Forum roadmaps. The Forums are busy developing and reviewing projects such as the next version of TOGAF®, an Open Group standard, an ArchiMate® white paper, The Open Group Healthcare Forum charter and treatise, Standard MILS™ APIs and Open FAIR. Many publications are translated into multiple languages, including Chinese and Portuguese. Also, a new Forum will be announced in the third quarter at The Open Group London 2014, so stay tuned for that launch news!

Our first keynote of the day was Making Health Addictive by Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health.

Dr. Kvedar described how Healthcare delivery is changing, with mobile technology being a big part of that change. Other factors pushing change are reimbursement paradigms and caregivers being paid to be more efficient and to keep people healthy and out of hospitals. The goal of Healthcare providers is to integrate care into the day-to-day lives of patients, and Healthcare also aims for better technologies and architecture.

Mobile is a game-changer in Healthcare because people are “always on and connected”. Mobile technology allows for in-the-moment messaging, the ability to capture health data (GPS, accelerometer, etc.) and the display of information in real time as needed. Bottom line: smartphones are addictive, so they are excellent tools for communication and engagement.

But there is a need to understand and address the implications of automating Healthcare: security, privacy, accountability, economics.

The plenary continued with Proteus Duxbury, CTO, Connect for Health Colorado, who presented From Build to Run at the Colorado Health Insurance Exchange – Achieving Long-term Sustainability through Better Architecture.

Duxbury stated that the keys to his organization's success are the leadership's and team's shared vision; a flexible vendor that stays agile amid rapidly changing regulatory requirements; and a COTS solution that required minimal customization and custom development while providing a resilient architecture and security. Connect for Health faces many challenges, including budget constraints, regulation, and operating in a “fish bowl”. Yet they are on track with their three-year ‘build to run’ roadmap, stabilizing their foundation and gaining efficiencies.

During the Q&A with Allen Brown following each presentation, both speakers emphasized the need for standards, architecture and data security.

Allen Brown and Proteus Duxbury

During the afternoon, track sessions consisted of Healthcare, Enterprise Architecture (EA) & Business Value, Service-Oriented Architecture (SOA), Security & Risk Management, Professional Development and ArchiMate Tutorials. Chris Armstrong, President, Armstrong Process Group, Inc. discussed Architecture Value Chain and Capability Model. Laura Heritage, Principal Solution Architect / Enterprise API Platform, SOA Software, presented Protecting your APIs from Threats and Hacks.

The evening culminated with a reception at the historic Old South Meeting House, where the Boston Tea Party began in 1773.


Networking Reception at Old South Meeting House

A special thank you to our sponsors and exhibitors at The Open Group Boston 2014: BiZZdesign, Black Duck, Corso, Good e-Learning, Orbus and AEA.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years of experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.


Filed under Accreditations, Boundaryless Information Flow™, Business Architecture, COTS, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, Information security, Open FAIR Certification, OTTF, RISK Management, Service Oriented Architecture, Standards, Uncategorized

New Health Data Deluges Require Secure Information Flow Enablement Via Standards, Says The Open Group’s New Healthcare Director

By The Open Group

Below is the transcript of The Open Group podcast on how new devices and practices have the potential to expand the information available to Healthcare providers and facilities.

Listen to the podcast here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview coming to you in conjunction with The Open Group’s upcoming event, Enabling Boundaryless Information Flow™ July 21-22, 2014 in Boston.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator for the series of discussions from the conference on Boundaryless Information Flow, Open Platform 3.0™, Healthcare, and Security issues.

One area of special interest is the Healthcare arena, and Boston is a hotbed of innovation and adaptation for how technology, Enterprise Architecture, and standards can improve the communication and collaboration among Healthcare ecosystem players.

And so, we’re joined by a new Forum Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry to rapid change.

With that, please join me now in welcoming our guest. We’re here with Jason Lee, Healthcare and Security Forums Director at The Open Group. Welcome, Jason.

Jason Lee: Thank you so much, Dana. Good to be here.

Gardner: Great to have you. I’m looking forward to the Boston conference and want to remind our listeners and readers that it’s not too late to sign up. You can learn more at http://www.opengroup.org.

Jason, let’s start by talking about the relationship between Boundaryless Information Flow, which is a major theme of the conference, and healthcare. Healthcare perhaps is the killer application for Boundaryless Information Flow.

Lee: Interesting, I haven’t heard it referred to that way, but healthcare is 17 percent of the US economy. It’s upwards of $3 trillion. The costs of healthcare are a problem, not just in the United States, but all over the world, and there are a great number of inefficiencies in the way we practice healthcare.

We don’t necessarily intend to be inefficient, but there are so many places and people involved in healthcare, it’s very difficult to get them to speak the same language. It’s almost as if you’re in a large house with lots of different rooms, and every room you walk into they speak a different language. To get information to flow from one room to the other requires some active efforts and that’s what we’re undertaking here at The Open Group.

Gardner: What is it about the current collaboration approaches that don’t work? Obviously, healthcare has been around for a long time and there have been different players involved. What’s the hurdle? What prevents a nice, seamless, easy flow and collaboration in information that gets better outcomes? What’s the holdup?

Lee: There are many ways to answer that question, because there are many barriers. Perhaps the simplest is the transformation of healthcare from a paper-based industry to a digital industry. Everyone has walked into an office, looked behind the people at the front desk, and seen file upon file and row upon row of folders, information that’s kept in a written format.

When there’s been movement toward digitizing that information, not everyone has used the same system. It’s almost like trains running on a different gauge track. Obviously if the track going east to west is a different gauge than going north to south, then trains aren’t going to be able to travel on those same tracks. In the same way, healthcare information does not flow easily from one office to another or from one provider to another.

Gardner: So not only do we have disparate strategies for collecting and communicating health data, but we’re also seeing much larger amounts of data coming from a variety of new and different places. Some of them now even involve sensors inside of patients themselves or devices that people will wear. So is the data deluge, the volume, also an issue here?

Lee: Certainly. I heard recently that an integrated health plan, which has multiple hospitals involved, contains more elements of data than the Library of Congress. As information is collected at multiple points in time, over a relatively short period of time, you really do have a data deluge. Figuring out how to find your way through all the data and look at the most relevant for the patient is a great challenge.

Gardner: I suppose the bad news is that there is this deluge of data, but it’s also good news, because more data means more opportunity for analysis, a better ability to predict and determine best practices, and also provide overall lower costs with better patient care.

So it seems like the stakes are rather high here to get this right, to not just crumble under a volume or an avalanche of data, but to master it, because it’s perhaps the future. The solution is somewhere in there too.

Lee: No question about it. At The Open Group, our focus is on solutions. We, like others, put a great deal of effort into describing the problems, but our real work is figuring out how to bring IT technologies to bear on business problems, how to encourage different parts of organizations to speak to one another, and how to get organizations to speak the same language and operate using common standards. That’s really what we’re all about.

And it is, in a large sense, part of the process of helping to bring healthcare into the 21st Century. A number of industries are a couple of decades ahead of healthcare in the way they use large datasets — big data, some people refer to it as. I’m talking about companies like big department stores and large online retailers. They really have stepped up to the plate and are using that deluge of data in ways that are very beneficial to them, and healthcare can do the same. We’re just not quite at the same level of evolution.

Gardner: And to your point, the stakes are so much higher. Retail is, of course, a big deal in the economy, but as you pointed out, healthcare is such a much larger segment and portion. So just making modest improvements in communication, collaboration, or data analysis can reap huge rewards.

Lee: Absolutely true. There is the cost side of things, but there is also the quality side. So there are many ways in which healthcare can improve through standardization and coordinated development, using modern technology that cannot just reduce cost, but improve quality at the same time.

Gardner: I’d like to get into a few of the hotter trends, but before we do, it seems that The Open Group has recognized the importance here by devoting the entire second day of their conference in Boston, that will be on July 22, to Healthcare.

Maybe you could give us a brief overview of what participants, and even those who come in online and view recorded sessions of the conference at http://new.livestream.com/opengroup, should expect? What’s going to go on July 22nd?

Lee: We have a packed day. We’re very excited to have Dr. Joe Kvedar, a physician at Partners HealthCare and Founding Director of the Center for Connected Health, as our first plenary speaker. The title of his presentation is “Making Health Addictive.” Dr. Kvedar is a widely respected expert on mobile health, which is currently the Healthcare Forum’s top work priority. As mobile medical devices become ever more available and diversified, they will enable consumers to know more about their own health and wellness. A great deal of potentially useful health data will be generated. How this information can be used, not just by consumers but also by the healthcare establishment that takes care of them as patients, will become a question of increasing importance. It will become an area where standards development and The Open Group can be very helpful.

Our second plenary speaker, Proteus Duxbury, Chief Technology Officer at Connect for Health Colorado, will discuss a major feature of the Affordable Care Act, the health insurance exchanges, which are designed to bring health insurance to tens of millions of people who previously did not have access to it. Mr. Duxbury is going to talk about how Enterprise Architecture, which is really about getting to solutions by helping the IT folks talk to the business folks and vice versa, has helped the State of Colorado develop its Health Insurance Exchange.

After the plenaries, we will break up into three tracks, one of which is Healthcare-focused. In this track there will be three presentations, all of which discuss how Enterprise Architecture and the approach to Boundaryless Information Flow can help healthcare and healthcare decision-makers become more effective and efficient.

One presentation will focus on the transformation of care delivery at the Visiting Nurse Service of New York. Another will address stewarding healthcare transformation using Enterprise Architecture, focusing on one of our Platinum members, Oracle, and a company called Intelligent Medical Objects, and how they’re working together in a productive way, bringing IT and healthcare decision-making together.

Then, the final presentation in this track will focus on the development of an Enterprise Architecture-based solution at an insurance company. The payers, or insurers (the big companies that are responsible for paying bills and collecting premiums), have a very important role in the healthcare system that extends beyond administration of benefits. Yet, payers are not always recognized for their key responsibilities and capabilities in the area of clinical improvements and cost improvements.

With the increase in payer data brought on in large part by the adoption of a new coding system, the ICD-10, which will come online this year, there will be a huge amount of additional data, including clinical data, that becomes available. At The Open Group, we consider payers, the health insurance companies (some of which are integrated with providers), to be very important stakeholders in the big picture.

In the afternoon, we’re going to switch gears a bit and have a speaker talk about the challenges, the barriers, the “pain points” in introducing new technology into healthcare systems. The focus will return to remote or mobile medical devices and the predictable but challenging barriers to getting newly generated health information to flow to doctors’ offices and into patients’ records, electronic health records, and hospitals’ data-keeping and data-sharing systems.

We’ll have a panel of experts that responds to these pain points, these challenges, and then we’ll draw heavily from the audience, who we believe will be very, very helpful, because they bring a great deal of expertise in guiding us in our work. So we’re very much looking forward to the afternoon as well.

Gardner: It’s really interesting. A couple of these different plenaries and discussions in the afternoon come back to this user-generated data. Jason, we really seem to be on the cusp of a whole new level of information that people will be able to develop from themselves through their lifestyle, new devices that are connected.

We hear from folks like Apple, Samsung, Google, and Microsoft. They’re all pulling together information and making it easier for people to not only monitor their exercise, but their diet, and maybe even start to use sensors to keep track of blood sugar levels, for example.

In fact, a new Flurry Analytics survey showed a 62 percent increase in the use of health and fitness applications over the last six months on popular mobile devices. This compares to a 33 percent increase in other applications in general, so there’s an 87 percent faster uptick in the use of health and fitness applications.

Tell me a little bit how you see this factoring in. Is this a mixed blessing? Will so much data generated from people in addition to the electronic medical records, for example, be a bad thing? Is this going to be a garbage in, garbage out, or is this something that could potentially be a game-changer in terms of how people react to their own data and then bring more data into the interactions they have with care providers?

Lee: It’s always a challenge to predict what the market is going to do, but I think that’s a remarkable statistic that you cited. My prediction is that the increased volume of person-generated data from mobile health devices is going to be a game-changer. This view also reflects how the members of the Healthcare Forum (which includes Capgemini, Philips, IBM, Oracle and HP) view the future.

The commercial demand for mobile medical devices, things that can be worn, embedded, or swallowed, as in pills, as you mentioned, is growing ever larger. The software and the applications that will be developed to be used with these devices are going to grow by leaps and bounds. As you say, there are big players getting involved. Already some of the pedometer-type devices that measure the number of steps taken in a day have captured the interest of many, many people. Even David Sedaris, serious guy that he is, was writing about it recently in ‘The New Yorker’.

What we will find is that many of the health indicators that we used to have to go to the doctor or nurse or lab to get information on will become available to us through these remote devices.

There will be a question, of course, as to the reliability and validity of the information, to your point about garbage in, garbage out, but I think standards development will help here. This, again, is where The Open Group comes in. We might also see the FDA exercising its role in ensuring safety here, as well as other organizations, in determining which devices are reliable.

The Open Group is working in the area of mobile data and information systems that are developed around them, and their ability to (a) talk to one another and (b) talk to the data devices/infrastructure used in doctors’ offices and in hospitals. This is called interoperability and it’s certainly lacking in the country.

There are already problems around interoperability and connectivity of information in the healthcare establishment as it is now. When patients and consumers start collecting their own data, and the patient is put at the center of the nexus of healthcare, then the question becomes how does that information that patients collect get back to the doctor/clinician in ways in which the data can be trusted and where the data are helpful?

After all, if a patient is wearing a medical device, there is the opportunity to collect data, about blood sugar level let’s say, throughout the day. And this is really taking healthcare outside of the four walls of the clinic and bringing information to bear that can be very, very useful to clinicians and beneficial to patients.

In short, the rapid market dynamic in mobile medical devices and in the software and hardware that facilitates interoperability begs for standards-based solutions that reduce costs and improve quality, and all of which puts the patient at the center. This is The Open Group’s Healthcare Forum’s sweet spot.

Gardner: It seems to me a real potential game-changer as well, and that something like Boundaryless Information Flow and standards will play an essential role. Because one of the big question marks with many of the ailments in a modern society has to do with lifestyle and behavior.

So often, the providers of the care only really have the patient’s responses to questions, but imagine having a trove of data at their disposal, a 360-degree view of the patient to then further the cause of understanding what’s really going on, on a day-to-day basis.

But then, it’s also having a two-way street, being able to deliver perhaps in an automated fashion reinforcements and incentives, information back to the patient in real-time about behavior and lifestyles. So it strikes me as something quite promising, and I look forward to hearing more about it at the Boston conference.

Any other thoughts on this issue about patient flow of data, not just among and between providers and payers, for example, or providers in an ecosystem of care, but with the patient as the center of it all, as you said?

Lee: As more mobile medical devices come to the market, we’ll find that consumers own multiple types of devices at least some of which collect multiple types of data. So even for the patient, being at the center of their own healthcare information collection, there can be barriers to having one device talk to the other. If a patient wants to keep their own personal health record, there may be difficulties in bringing all that information into one place.

So the interoperability issue, the need for standards, guidelines, and voluntary consensus among stakeholders about how information is represented becomes an issue, not just between patients and their providers, but for individual consumers as well.

Gardner: And also the cloud providers. There will be a variety of large organizations with cloud-modeled services, and they are going to need to be, in some fashion, brought together, so that a complete 360-degree view of the patient is available when needed. It’s going to be an interesting time.

Of course, we’ve also looked at many other industries and tried to have a cloud synergy, a cloud-of-clouds approach to data and also the transaction. So it’s interesting how what’s going on in multiple industries is common, but it strikes me that, again, the scale and the impact of the healthcare industry makes it a leader now, and perhaps a driver for some of these long overdue structured and standardized activities.

Lee: It could become a leader. There is no question about it. Moreover, there is a lot Healthcare can learn from other companies, from mistakes that other companies have made, from lessons they have learned, from best practices they have developed (both on the content and process side). And there are issues, around security in particular, where Healthcare will be at the leading edge in trying to figure out how much is enough, how much is too much, and what kinds of solutions work.

There’s a great future ahead here. It’s not going to be without bumps in the road, but organizations like The Open Group are designed and experienced to help multiple stakeholders come together and have the conversations that they need to have in order to push forward and solve some of these problems.

Gardner: Well, great. I’m sure there will be a lot more about how to actually implement some of those activities at the conference. Again, that’s going to be in Boston, beginning on July 21, 2014.

We’ll have to leave it there. We’re about out of time. We’ve been talking with a new Director at The Open Group to learn how an expected continued deluge of data and information about patients and providers, outcomes and efficiencies are all working together to push the Healthcare industry to rapid change. And, as we’ve heard, that might very well spill over into other industries as well.

So we’ve seen how innovation and adaptation around technology, Enterprise Architecture and standards can improve the communication and collaboration among Healthcare ecosystem players.

It’s not too late to register for The Open Group Boston 2014 (http://www.opengroup.org/boston2014) and join the conversation via Twitter #ogchat #ogBOS, where you will be able to learn more about Boundaryless Information Flow, Open Platform 3.0, Healthcare and other relevant topics.

So a big thank you to our guest. We’ve been joined by Jason Lee, Healthcare and Security Forums Director at The Open Group. Thanks so much, Jason.

Lee: Thank you very much.

Filed under Boundaryless Information Flow™, Cloud, Conference, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, Information security, Interoperability, Open Platform 3.0, Standards, Uncategorized