
The Open Group Panel: Internet of Things – Opportunities and Obstacles

Below is the transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data.

Listen to the podcast.

Dana Gardner: Hello, and welcome to a special BriefingsDirect thought leadership interview series coming to you in conjunction with the recent The Open Group Boston 2014 conference, held on July 21 in Boston.

I’m Dana Gardner, principal analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow.

We’re going to now specifically delve into the Internet of Things with a panel of experts. The conference has examined how Open Platform 3.0™ leverages the combined impacts of cloud, big data, mobile, and social. But to each of these now we can add a new cresting wave of complexity and scale as we consider the rapid explosion of new devices, sensors, and myriad endpoints that will be connected using internet protocols, standards and architectural frameworks.

This means more data, more cloud connectivity and management, and an additional tier of “things” that are going to be part of the mobile edge — and extending that mobile edge ever deeper into even our own bodies.

When we think about inputs to these social networks, those are going to increase as well. Not only will people be tweeting; your devices could very well tweet, too, using social networks to communicate. Perhaps your toaster will soon be sending you a tweet about your English muffins being ready each morning.

The Internet of Things is more than the “things” – it means a higher order of software platforms. For example, if we are going to operate data centers with new dexterity thanks to software-defined networking (SDN) and storage (SDS) — indeed the entire data center being software-defined (SDDC) — then why not a software-defined automobile, or factory floor, or hospital operating room — or even a software-defined city block or neighborhood?

And so how does this all actually work? Does it easily spin out of control? Or does it remain under proper management and governance? Do we have unknown unknowns about what to expect with this new level of complexity, scale, and volume of input devices?

Will architectures arise that support the numbers involved, interoperability, and provide governance for the Internet of Things — rather than just letting each type of device do its own thing?

To help answer some of these questions, The Open Group assembled a distinguished panel to explore the practical implications and limits of the Internet of Things. So please join me in welcoming Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC, and a primary representative to the Industrial Internet Consortium; Penelope Gordon, Emerging Technology Strategist at 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM, and Dave Lounsbury, Chief Technical Officer at The Open Group.

Jean-Francois, we have heard about this notion of “cities as platforms,” and I think the public sector might offer us some opportunity to look at what is going to happen with the Internet of Things, and then extrapolate from that to understand what might happen in the private sector.

Hypothetically, the public sector has a lot to gain. It doesn’t have to go through the same confines of a commercial market development, profit motive, and that sort of thing. Tell us a little bit about what the opportunity is in the public sector for smart cities.

Barsoum_Jean-FrancoisJean-Francois Barsoum: It’s immense. The first thing I want to do is link to something that Marshall Van Alstyne (Professor at Boston University and Researcher at MIT) had talked about, because I was thinking about his way of approaching platforms and thinking about how cities represent an example of that.

You don’t have customers; you have citizens. Cities are starting to see themselves as platforms, as ways to communicate with their customers, their citizens, to get information from them and to communicate back to them. But the complexity with cities is that as good a platform as they could be, they’re relatively rigid. They’re legislated into existence and what they’re responsible for is written into law. It’s not really a market.

Chris Harding (Forum Director of The Open Group Open Platform 3.0) earlier mentioned, for example, water and traffic management. Cities could benefit greatly by managing traffic a lot better.

Part of the issue is that you might have a state or provincial government that looks after highways. You might have the central part of the city that looks after arterial networks. You might have a borough that would look after residential streets, and these different platforms end up not talking to each other.

They gather their own data. They put in their own widgets to collect information that concerns them, but do not necessarily share it with their neighbor. One of the conditions that Marshall said would favor the emergence of a platform had to do with how much overlap there would be in your constituents and your customers. In this case, there’s perfect overlap. It’s the same citizen, but they have to carry an Android and an iPhone, even though that is not the best way of dealing with the situation.

The complexities are proportional to the amount of benefit you could get if you could solve them.

Gardner: So more interoperability issues?

Barsoum: Yes.

More hurdles

Gardner: More hurdles, and when you say commensurate, you’re saying that the opportunity is huge, but the hurdles are huge and we’re not quite sure how this is going to unfold.

Barsoum: That’s right.

Gardner: Let’s go to an area where the opportunity outstrips the challenge: manufacturing. Said, what is the opportunity for the software-defined factory floor to realize huge efficiencies and apply algorithmic benefits to how management occurs across the domains of supply chain, distribution, and logistics? It seems to me that this is a no-brainer. It’s such an opportunity that the solution must be found.

Said Tabet: When it comes to manufacturing, the opportunities are probably much bigger. It’s where we can see a lot of progress that has already been made, and work is still going on. There are two ways to look at it.

One is the internal side of it, where you have improvements of business processes. For example, similar to what Jean-Francois said, in a lot of the larger companies that have factories all around the world, you’ll see such improvements at the factory level. You still have those silos at that level.

Now with this new technology, with this connectedness, those improvements are going to be made across factories, and there’s a learning aspect to it in terms of trying to manage that data. In fact, they do a better job. We still have to deal with interoperability, of course, and additional issues that could be jurisdictional, etc.

However, there is that learning that allows them to improve their processes across factories. Maintenance is one of them, as well as creating new products, and connecting better with their customers. We can see a lot of examples in the marketplace. I won’t mention names, but there are lots of them out there with the large manufacturers.

Gardner: We’ve had just-in-time manufacturing and lean processes for quite some time, trying to compress the supply chain and distribution networks, but these haven’t necessarily been done through public networks, the internet, or standardized approaches.

But if we’re to benefit, we’re going to need to be able to be platform companies, not just product companies. How do you go from being a proprietary set of manufacturing protocols and approaches to this wider, standardized interoperability architecture?

Tabet: That’s a very good question, because now we’re talking about that connection to the customer. With the airline and the jet engine manufacturer, for example, when the plane lands and there has been some monitoring of the activity during the whole flight, at that moment, they’ll get that data made available. There could be improvements and maybe solutions available as soon as the plane lands.

Interoperability

That requires interoperability. It requires Platform 3.0 for example. If you don’t have open platforms, then you’ll deal with the same hurdles in terms of proprietary technologies and integration in a silo-based manner.

Gardner: Penelope, you’ve been writing about the obstacles to decision-making that might become apparent as big data becomes more prolific and people try to capture all the data about all the processes and analyze it. That’s a little bit of a departure from the way we’ve made decisions in organizations, public and private, in the past.

Of course, one of the bigger tenets of the Internet of Things is all this great data that will be available to us from so many different points. Is there a conundrum of some sort? Is there an unknown obstacle for how we, as organizations and individuals, can deal with that data? Is this going to be chaos, or is this going to deliver on all the promises many organizations have led us to believe about big data and the Internet of Things?

Penelope Gordon: It’s something that has just been accelerated. This is not a new problem in terms of the decision-making styles not matching the inputs that are being provided into the decision-making process.

Former US President Bill Clinton was known for delaying making decisions. He’s a head-type decision-maker, so he would always want more data and more data. That just gets into a never-ending loop, because as people collect data for him, there is always more data that you can collect, particularly on the quantitative side. Whereas, if it is distilled down and presented very succinctly and then balanced with the qualitative, that allows intuition to come to the fore, and you can make optimal decisions in that fashion.

Conversely, if you have someone who is a heart-type or gut-type decision-maker and you present them with a lot of data, their first response is to ignore the data. It’s just too much for them to take in. Then you end up going entirely with whatever you feel is correct, with whatever instinct tells you is the correct decision. If you’re talking about strategic decisions, where you’re making a decision that’s going to influence your direction five years down the road, that could be a very wrong decision to make, a very expensive decision, and as you said, it could be chaos.

It just brings to mind Dr. Seuss’s The Cat in the Hat, with Thing One and Thing Two. So, as we talk about the Internet of Things, we need to keep in mind that we need some sort of structure that we are tying this back to, and an understanding of what we are trying to do with these things.

Gardner: Openness is important, and governance is essential. Then, we can start moving toward higher-order business platform benefits. But, so far, our panel has been a little bit cynical. We’ve heard that the opportunity and the challenges are commensurate in the public sector and that in manufacturing we’re moving into a whole new area of interoperability, when we think about reaching out to customers and having a boundary that is managed between internal processes and external communications.

And we’ve heard that an overload of data could become a very serious problem and that we might not get benefits from big data through the Internet of Things, but perhaps even stumble and have less quality of decisions.

So, Dave Lounsbury of The Open Group, will the same level of standardization work? Do we need a new type of standards approach, a different type of framework, or is this a natural path and course from what we have done in the past?

Different level

Dave Lounsbury: We need to look at the problem at a different level than we institutionally think about an interoperability problem. The Internet of Things is riding two very powerful waves. One is Moore’s Law: these sensors, actuators, and networks get smaller and smaller. Now we can put Ethernet in a light switch, a tag, or something like that.

The other is Metcalfe’s Law, which says that the value of all this connectivity goes up with the square of the number of connected points, and that applies not only to the connection of the things but, more importantly, to the connection of the data.
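As a rough numeric illustration of the two effects Dave cites, here is a minimal, hypothetical Python sketch: the device counts are invented, and the quadratic term is the usual statement of Metcalfe's Law with its constant factor omitted.

```python
# Illustrative only: Metcalfe's Law puts the relative value of a network
# roughly proportional to the square of its connected endpoints, while
# Moore's Law keeps multiplying (and shrinking) those endpoints.

def metcalfe_value(n_endpoints: int) -> int:
    """Relative network value, proportional to n^2 (constant factor omitted)."""
    return n_endpoints ** 2

# Hypothetical device counts doubling across successive hardware generations.
for generation, devices in enumerate([1_000, 2_000, 4_000, 8_000]):
    print(f"gen {generation}: {devices:>5} devices -> relative value {metcalfe_value(devices):,}")
```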

The trouble is, as we have said, that there’s so much data here. The question is how you manage it and how you keep control over it so that you actually get business value from it. That’s going to require this new concept of a platform not just to connect the data, but to aggregate it, correlate it, as you said, and present it in ways that people can make decisions however they want.

Also, because of the raw volume, we have to start thinking about machine agency. We have to think about the system actually making the routine decisions or giving advice to the humans who are actually doing it. Those are important parts of the solution beyond just a simple “How do we connect all the stuff together?”

Gardner: We might need a higher order of intelligence, now that we have reached this border of what we can do with our conventional approaches to data, information, and process.

Thinking about where this works best first in order to then understand where it might end up later, I was intrigued again this morning by Professor Van Alstyne. He mentioned that in healthcare, we should expect major battles, that there is a turf element to this, that the organization, entity or even commercial corporation that controls and manages certain types of information and access to that information might have some very serious platform benefits.

The openness element is now something to look at, and I’ll come back to the public sector. Is there a degree of openness that we could legislate or regulate to require enough control to prevent the next generation of lock-in, which might not be lock-in to a platform, but to access to data, information, and endpoints? Where in the public sector might we look for a leadership position to establish needed openness, and not just interoperability?

Barsoum: I’m not even sure where to start answering that question. To take healthcare as an example, I certainly didn’t write the bible on healthcare IT systems and if someone did write that, I think they really need to publish it quickly.

We have a single-payer system in Canada, and you would think that would be relatively easy to manage. There is one entity that manages paying the doctors, and everybody gets covered the same way. Therefore, the data should be easily shared among all the players and it should be easy for you to go from your doctor, to your oncologist, to whomever, and maybe to your pharmacy, so that everybody has access to this same information.

We don’t have that and we’re nowhere near having that. If I look to other areas in the public sector, areas where we’re beginning to solve the problem are ones where we face a crisis, and so we need to address that crisis rapidly.

Possibility of improvement

In the transportation infrastructure, we’re getting to the point where the infrastructure we have just doesn’t meet the needs. There’s a constraint in terms of money, and we can’t put much more money into the structure. Then, there are new technologies that are coming in. Chris had talked about driverless cars earlier. They’re essentially throwing a wrench into the works, or maybe offering the possibility of improvement.

On any given piece of infrastructure, you could fit twice as many driverless cars as cars with human drivers in them. Given that set of circumstances, the governments are going to find they have no choice but to share data in order to be able to manage those. Are there cases where we could go ahead of a crisis in order to manage it? I certainly hope so.

Gardner: How about allowing some of the natural forces of marketplaces, behavior, groups, maybe even chaos theory, where if sufficient openness is maintained there will be some kind of a pattern that will emerge? We need to let this go through its paces, but if we have artificial barriers, that might be thwarted or power could go to places that we would regret later.

Barsoum: I agree. People often focus on structure: the governance doesn’t work, so we should find some way to change the governance of transportation. London has done a very good job of that. They’ve created something called Transport for London that manages everything related to transportation. It doesn’t matter if it’s taxis, bicycles, pedestrians, boats, cargo trains, or whatever, they manage it.

You could do that, but it requires a lot of political effort. The other way to go about doing it is saying, “I’m not going to mess with the structures. I’m just going to require you to open and share all your data.” So, you’re creating a new environment where the governance, the structures, don’t really matter so much anymore. Everybody shares the same data.

Gardner: Said, to the private sector example of manufacturing, you still want to have a global fabric of manufacturing capabilities. This is requiring many partners to work in concert, but with a vast new amount of data and new potential for efficiency.

How do you expect that openness will emerge in the manufacturing sector? How will interoperability play when you don’t have to wait for legislation, but you do need to have cooperation and openness nonetheless?

Tabet: It comes back to the question you asked Dave about standards. I’ll just give you an example. In the automotive industry, there have been some activities in Europe around specific standards for communication.

The Europeans came to the US and started to have discussions, and the Japanese have interest, as well as the Chinese. That shows that, because there is a common interest in creating these new models from a business standpoint, these challenges have to be dealt with together.

Managing complexity

When we talk about the amounts of data, what we now call big data, and what we are going to see in about five years or so, you can’t even imagine it. How do we manage that complexity, which is multidimensional? We talked about this sort of platform and, further, the capability and the data that will be there. From that point of view, openness is the only way to go.

There’s no way that we can stay away from it and still be able to work in silos in that new environment. There are lots of things that we take for granted today. I invite some of you to go back and read articles from 10 years ago that tried to predict the future of technology in the 21st century. Look at your smartphones. Adoption is there, because the business models are there, and we can see that progress moving forward.

Collaboration is a must, because the problem is multidimensional. It’s not just manufacturing, like jet engines, car manufacturers, or agriculture, where you have very specific areas. They really have to work with their customers and the customers of their customers.

Gardner: Dave, I have a question for both you and Penelope. I’ve seen some instances where there has been a cooperative endeavor for accessing data and then making it available as a service, whether it’s an API, a data set, access to a data library, or even an analytics application set. The Ocean Observatories Initiative is one example: it has created a sensor network across the oceans and makes the resulting data available.

Do you expect to see an intermediary organizational level that gets between the sensors and the consumers, or even the controllers of the processes? Is there a model inherent in that that we might look to — something like that cooperative data structure that in some ways creates structure and governance, but also allows for freedom? It’s the sort of entity that we don’t have yet in many organizations or many ecosystems and that needs to evolve.

Lounsbury: We’re already seeing that in the marketplace. If you look at the commercial and social Internet of Things area, we’re starting to see intermediaries or brokers cropping up that will connect the silo of my Android ecosystem to the ecosystem of package tracking or something like that. There are dozens and dozens of these cropping up.

In fact, you now see APIs even into a silo of what you might consider a proprietary system, and what people are doing is building a layer on top of those APIs that intermediates the data.

This is happening on a point-to-point basis now, but you can easily see the path forward. That’s going to expand to large amounts of data that people will share through a third party. I can see this being a whole new emerging market much as what Google did for search. You could see that happening for the Internet of Things.
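As a sketch of the kind of intermediary layer Dave describes, the hypothetical Python below sits on top of two siloed "APIs" (both faked as plain functions, not real services) and presents one combined, time-ordered feed.

```python
# Hypothetical broker layer: it pulls from two siloed "APIs" (faked here as
# plain functions), normalizes the records, and exposes one combined feed.

from typing import Dict, List


def fetch_device_events() -> List[Dict]:
    # Stand-in for a device-ecosystem API (e.g., a smart-home silo).
    return [{"device_id": "thermostat-1", "temp_c": 21.5, "ts": "2014-07-21T09:00:00Z"}]


def fetch_package_updates() -> List[Dict]:
    # Stand-in for a package-tracking API.
    return [{"tracking_no": "ZX123", "status": "out_for_delivery", "ts": "2014-07-21T08:45:00Z"}]


def broker_feed() -> List[Dict]:
    """Normalize both silos into one time-ordered feed a consumer can subscribe to."""
    combined = [{"source": "devices", "when": e["ts"], "payload": e} for e in fetch_device_events()]
    combined += [{"source": "shipping", "when": u["ts"], "payload": u} for u in fetch_package_updates()]
    return sorted(combined, key=lambda item: item["when"])  # ISO timestamps sort lexically


if __name__ == "__main__":
    for item in broker_feed():
        print(item["when"], item["source"], item["payload"])
```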

Gardner: Penelope, do you have any thoughts about how that would work? Is there a mutually assured benefit that would allow people to want to participate and cooperate with that third entity? Should they have governance and rules about good practices, best practices for that intermediary organization? Any thoughts about how data can be managed in this sort of hierarchical model?

Nothing new

Gordon: First, I’ll contradict that a little bit. To me, a lot of this is nothing new, particularly coming from a marketing strategy perspective, with business intelligence (BI). Having various types of intermediaries, who not only collect the data, but then do what we call data hygiene, synthesis, and even correlation of the data, has been around for a long time.

It was interesting, when I looked at a recent listing of big-data companies, that some notable companies were excluded from that list — companies like Nielsen. Nielsen has been collecting data for a long time. Harte-Hanks is another one that collects a tremendous amount of information and sells it to companies.

That leads into another part of it. We’re seeing an increasing amount of opportunity that involves taking public sources of data and then providing synthesis on top of them. What remains to be seen is how much of the output of that is going to be provided for “free”, as opposed to for a fee. We’re going to see a lot more companies figuring out creative ways of extracting more value out of data and then charging directly for that, rather than using it as an indirect way of generating traffic.

Gardner: We’ve seen examples of how this has been in place. Does it scale, and does the governance, or lack of governance, that might be in the market now sustain us through the transition into Platform 3.0 and the Internet of Things?

Gordon: That aspect is the lead-on to “you get what you pay for”. If you’re using a free source of data, you don’t have any guarantee that it comes from authoritative sources. Often, what we’re getting now is something somebody put in a blog post; that then gets referenced elsewhere, but there is nothing to go back to. It’s a shaky supply chain for data.

You need to think about the data supply and that is where the governance comes in. Having standards is going to increasingly become important, unless we really address a lot of the data illiteracy that we have. A lot of people do not understand how to analyze data.

One aspect of that is that a lot of people expect we have to do full population surveys, as opposed to representative sampling, which gives much more accurate and much more cost-effective collection of data. That’s just one example, and we do need a lot more in governance and standards.
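Penelope's sampling point can be made concrete with a small, hypothetical simulation: a representative random sample recovers a population statistic while touching only a fraction of the data. The population below is synthetic and the sizes are arbitrary.

```python
# Illustrative only: a representative random sample estimates a population
# statistic while touching a small fraction of the data.

import random
import statistics

random.seed(42)

# Synthetic "population": 100,000 hypothetical daily sensor readings.
population = [random.gauss(50.0, 10.0) for _ in range(100_000)]

# A simple random sample of 1,000 readings (1% of the population).
sample = random.sample(population, k=1_000)

print(f"full-population mean: {statistics.mean(population):.2f}")
print(f"1,000-reading sample mean: {statistics.mean(sample):.2f}")
```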

Gardner: What would you like to see changed most in order for the benefits and rewards of the Internet of Things to develop and overcome the drawbacks, the risks, the downside? What, in your opinion, would you like to see happen to make this a positive, rapid outcome? Let’s start with you, Jean-Francois.

Barsoum: There are things that I have seen cities start to do now. There are a couple of examples: Philadelphia is one, and Barcelona does this too. Rather than do the typical request for proposal (RFP), where they say, “This is the kind of solution we’re looking for, and here are our parameters. Can you tell us how much it is going to cost to build?” they come to you with the problem and say, “Here is the problem I want to fix. Here are my priorities, and you’re at liberty to decide how best to fix the problem, but tell us how much that would cost.”

If you do that and you combine it with access to the public data that is available — if the public sector opens up its data — you end up with a very powerful combination that liberates a lot of creativity. You can create a lot of new business models. We need to see much more of that. That’s where I would start.

More education

Tabet: I agree with Jean-Francois on that. What I’d like to add is that I think we need to push the relation a little further. We need more education, to your point earlier, around the data and the capabilities.

We need these platforms that we can leverage a little bit further with the analytics, with machine learning, and with all of these capabilities that are out there. We have to also remember, when we talk about the Internet of Things, it is things talking to each other.

So it is not just human-machine communication. Machine-to-machine automation will go further than that, and we need more innovation and more work in this area, particularly more activity from governments. We’ve seen some, but it is a little bit frail from that point of view right now.

Gardner: Dave Lounsbury, thoughts about what needs to happen in order to keep this on track?

Lounsbury: We’ve touched on a lot of them already. Thank you for mentioning the machine-to-machine part, because there are plenty of projections that show that it’s going to be the dominant form of Internet communication, probably within the next four years.

So we need to start thinking about that and moving beyond our traditional models of humans talking through interfaces to a set of services. We need to identify the building blocks of capability that you need to manage, not only the information flow and the skilled people who are going to produce it, but also how you manage the machine-to-machine interactions.

Gordon: I’d like to see not so much focus on data management, but focus on what the data is managing and helping us to do. Focusing on machine-to-machine and the devices is great, but the focus should not be on the devices or the machines; it should be on what they can accomplish by communicating, what you can accomplish with the devices, and then reverse-engineer from that.

Gardner: Let’s go to some questions from the audience. The first one asks about the higher order of intelligence we mentioned earlier. It could be artificial intelligence, perhaps, but they ask whether that’s really the issue. Is the nature of the data substantially different, or are we just creating more of the same, so that it is a storage, plumbing, and processing problem? What, if anything, is lacking in our current analytics capabilities that is holding us back from exploiting the Internet of Things?

Gordon: I’ve definitely seen that. That has a lot to do with not setting your decision objectives and your decision criteria ahead of time so that you end up collecting a whole bunch of data, and the important data gets lost in the mix. There is a term “data smog.”

Most important

The solution is to figure out, before you go collecting data, what data is most important to you. If you can’t collect certain kinds of data that are important to you directly, then think about how to indirectly collect that data and how to get proxies. But don’t try to go and collect all the data for that. Narrow in on what is going to be most important and most representative of what you’re trying to accomplish.

Gardner: Does anyone want to add to this idea of understanding what current analytics capabilities are lacking, if we have to adopt and absorb the Internet of Things?

Barsoum: There is one element around projection into the future. We’ve been very good at analyzing historical information to understand what’s been happening in the past. We need to become better at projecting into the future, and obviously we’ve been doing that for some time already.

But so many variables are changing. Just to take the driverless car as an example. We’ve been collecting data from loop detectors, radar detectors, and even Bluetooth antennas to understand how traffic moves in the city. But we need to think harder about what that means and how we understand the city of tomorrow is going to work. That requires more thinking about the data, a little bit like what Penelope mentioned, how we interpret that, and how we push that out into the future.

Lounsbury: I have to agree with both. It’s not about statistics. We can use historical data. It helps with a lot of things, but one of the major issues we still deal with today is the question of semantics, the meaning of the data. This goes back to your point, Penelope, around the relevance and the context of that information – how you get what you need when you need it, so you can make the right decisions.

Gardner: Our last question from the audience goes back to Jean-Francois’s comments about the Canadian healthcare system. I imagine it applies to almost any healthcare system around the world. But it asks why interoperability is so difficult to achieve, when we have the power of the purse, that is the market. We also supposedly have the power of the legislation and regulation. You would think between one or the other or both that interoperability, because the stakes are so high, would happen. What’s holding it up?

Barsoum: There are a couple of reasons. One, in the particular case of healthcare, is privacy, but that is one that you could see going elsewhere. As soon as you talk about interoperability in the health sector, people start wondering where their data is going to go, how accessible it is going to be, and to whom.

You need to put a certain number of controls on top of that. What is happening in parallel is that you have people who own some data, who believe they have some power from owning that data, and that they will lose that power if they share it. That can come from doctors, hospitals, anywhere.

So there’s a certain amount of change management you have to get beyond. Everybody has to focus on the welfare of the patient. They have to understand that that has to be the priority, but you also have to understand the welfare of the different stakeholders in the system and make sure that you do not forget about them, because if you forget about them, they will find some way to slow you down.

Use of an ecosystem

Lounsbury: To me, that’s a perfect example of what Marshall Van Alstyne talked about this morning. It’s the change from a focus on product to a focus on an ecosystem. Healthcare traditionally has been very focused on a doctor providing a product to a patient, or a caregiver providing a product to a patient. Now, we’re actually starting to see that the only way we’re able to do this is through the use of an ecosystem.

That’s a hard transition. It’s a business-model transition. I will put in a plug here for The Open Group Healthcare vertical, which is looking at that from an architecture perspective. I see that our Forum Director Jason Lee is over here. So if you want to explore that more, please see him.

Gardner: I’m afraid we will have to leave it there. We’ve been discussing the practical implications of the Internet of Things and how it is now set to add a new dimension to Open Platform 3.0 and Boundaryless Information Flow.

We’ve heard how new thinking about interoperability will be needed to extract the value and orchestrate out the chaos with such vast new scales of inputs and whole new categories of information.

So with that, a big thank you to our guests: Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC; Penelope Gordon, Emerging Technology Strategist at 1Plug Corp.; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM, and Dave Lounsbury, Chief Technology Officer at The Open Group.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow at The Open Group Conference, recently held in Boston. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript.

Transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2014. All rights reserved.


The Open Group Boston 2014 – Q&A with Proteus Duxbury, Connect for Health Colorado

By The Open Group

The U.S. healthcare industry is undergoing a major sea change right now due in part to the Affordable Care Act, as well as the need to digitize legacy systems that have remained largely paper-based in order to better facilitate information exchange.

Proteus Duxbury, the CTO for the state of Colorado’s health insurance exchange, Connect for Health Colorado, has a wide and varied background in healthcare IT, ranging from IT consulting and helping to lead a virtual health medicine group to his current position running the supporting technologies operating the Colorado exchange. Duxbury joined Connect for Health Colorado in early 2014, as the exchange was going through its first major enrollment period.

We spoke to Duxbury in advance of his keynote on July 22 at The Open Group Boston 2014 conference about the current state of healthcare IT and how Enterprise Architecture will play an integral part in the Connect for Health Colorado exchange moving forward, as the organization transitions from a start-up culture to a maintenance and run mode.

Below is a transcript of that conversation.

What factors went into making the roll-out of Connect for Health Colorado healthcare exchange a success?

There were three things. The first is we have an exceptional leadership team. The CEO, especially, is a fantastic leader and was able to create a strong vision and have her team rally quickly behind it. The executive team was empowered to make decisions quickly and there was a highly dedicated work force and a powerful start-up culture. In addition, there was a uniformly shared passion to see healthcare reform successfully implemented in Colorado.

The second reason for success was the flexibility and commitment of our core vendor—which is CGI—and their ability to effectively manage and be agile with rapidly changing regulatory requirements and rapidly changing needs. These systems had never been built anywhere else before; it really was a green field program of work. There was a shared commitment to achieving success and very strong contracting in place ensuring that we were fully protected throughout the whole process.

The third is our COTS (Commercial Off-The-Shelf) solution that was selected. Early on, we established an architecture principle of deploying out-of-the-box products rather than trying to build from scratch, so there was minimal customization and development effort. Scope control was tight. We implemented the hCentive package, which is one of the leading health insurance exchange software packages. Best-of-breed solutions were implemented around the edges where it was necessary to meet a niche need, but we try to build as much into the single product as we can. We have a highly resilient and available architecture. The technical architecture scales well and has been very robust and resilient through a couple of very busy periods at the end of open enrollment, particularly on March 31st and toward the end of December, as the deadline for enrollment in 2014 plans approached.

Why are you putting together an Enterprise Architecture for the exchange?

We’re extremely busy right now with a number of critical projects. We’re still implementing core functionality but we do have a bit of a breather on the horizon. Going into next year things will get lighter, and now is the time for a clear roadmap to achieve the IT strategic objectives that I have set for the organization.

We are trying to achieve a reduction in our M&O (maintenance and operations) expense because we need to be self-sustaining from a budgetary point of view. Our federal funding will be going away starting in 2015, so we need to consolidate architecture and systems and gain additional efficiencies. We need to continue to meet our key SLAs, specifically around availability—we have a very public-facing set of systems. IT needs to be operationalized. We need to move from the existing start-up culture to the management of IT in a CMM (Capability Maturity Model) or ITIL-type fashion. And we also need to continue to grow and hold on to our customer base, as there is always a reasonable amount of churn and competing services in a relatively uncertain marketplace. We need to continue to grow our customer base so we can be self-sustaining. To support this, we need to have a more operationalized, robust and cost-efficient IT architecture, and we need a clear roadmap to get there. If you don’t have a roadmap or design that aligns with business priorities, then those things are difficult to achieve.

Finally, I am building up an IT team. To date, we’ve been highly reliant on contractors and consultants to get us to where we are now. In order to reduce our cost base, we are building out our internal IT team and a number of key management roles. That means we need to have a roadmap and something that we can all steer towards—a shared architecture roadmap.

What benefits do you expect to see from implementing the architecture?

Growing our customer base is a critical goal—we need to stabilize the foundations of our IT solution and use that as a platform for future growth and innovation. It’s hard to grow and innovate if you haven’t got your core IT platform stabilized. By making our IT systems easier to maintain and update, we hope to see a continued reduction in IT M&O. High availability is another benefit I expect to see, as well as closer alignment with business goals and business processes and capabilities.

Are there any particular challenges in setting up an Enterprise Architecture for a statewide health exchange? What are they?

I think there are some unique challenges. The first is budget. We do need to be self-sustaining, and there is not a huge amount of budget available for additional capital investments. There is some, but it has to be very carefully allocated, managed and spent diligently. We do work within a tightly controlled federal set of regulations and constraints and are frequently under the spotlight from auditors and others.

There are CMS (Centers for Medicare & Medicaid Services) regulations that define what we can and cannot do with our technology. We have security regulations that we have to exist within and a lot of IRS requirements that we have to meet and be compliant with. We have a complex set of partners to work with in Colorado and nationally—we have to work with Colorado state agencies such as the Department of Insurance and Medicaid (HCPF), and we have to work very closely with a large number—we’ve currently got 17—of carriers. We have CMS and the federal marketplace (Federal Data Services Hub). We have one key vendor—CGI—but we are in a multi-vendor environment, and all our projects involve having to manage multiple different organizations towards success.

The final challenge is that we’re very busy still building applications and implementing functionality, so my job is to continue to be focused on successful delivery of two very large projects, while ensuring our longer term architecture planning is completed, which is going to be critical for our long-term sustainability. That’s the classic Enterprise Architecture conundrum. I feel like we’ve got a handle on it pretty well here—because they’re both critical.

What are some of the biggest challenges that you see facing the Healthcare industry right now?

Number one is probably integration—the integration of data especially between different systems. A lot of EMR (electronic medical record) systems are relatively closed to the outside world, and it can be expensive and difficult to open them up. Even though there are some good standards out there like HL7 and EDI (Electronic Data Interchange), everyone seems to be implementing them differently.
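To make the integration point concrete, here is a toy sketch that pulls two fields out of a fabricated HL7 v2 message using plain string splitting; a production system would rely on a proper HL7 library and handle escaping, encodings, and the many message variants Duxbury alludes to.

```python
# Toy HL7 v2 parsing: segments are separated by carriage returns and fields
# by "|". The message below is fabricated; real systems need a proper HL7
# library plus handling for escape sequences, repetitions, and components.

SAMPLE_HL7 = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "201407210900||ADT^A01|MSG00001|P|2.3\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F\r"
)


def parse_segments(message: str) -> dict:
    """Map each segment name (MSH, PID, ...) to its raw field list."""
    segments = {}
    for raw in filter(None, message.split("\r")):
        fields = raw.split("|")
        segments[fields[0]] = fields
    return segments


segments = parse_segments(SAMPLE_HL7)
print("message type:", segments["MSH"][8])  # ADT^A01 (admit/visit notification)
print("patient name:", segments["PID"][5])  # DOE^JANE
```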

Personal healthcare tech (mHealth and Telehealth) is not going to take off until there is more integration. For example, whatever you’re using to track your smoking, blood pressure, weight, etc. needs to be integrated seamlessly with your medical records and insurance provider. And until this data can be used for meaningful analytics and care planning—until they solve this integration nightmare—it’s going to be difficult to really make great strides.

Security is the second challenge. There’s a huge proliferation of new technology endpoints, and there are a lot of weak links around people, process and technology. The regulators are only really starting to catch up, and they’re one step behind. There’s a lot of personal data out there, and it’s not always technology that’s the weak link. We have that pretty tightly controlled here because we’re highly regulated and our technology is tightly controlled, but on the provider side especially, it’s a huge challenge, and every week we see a new data breach.

The third challenge is ROI. There’s a lot of investment being made into personal health technology but because we’re in a private insurance market and a private provider market, until someone has really cracked what the ROI is for these initiatives, whether it’s tied to re-admissions or reimbursements, it’s never going to really become mainstream. And until it becomes part of the fabric of care delivery, real value is not going to be realized and health outcomes not significantly improved.

But models are changing—once the shift to outcome-based reimbursement takes hold, providers will be more incentivized to really invest in these kinds of technologies and get them working. But that shift hasn’t really occurred yet, and I’ve yet to see really compelling ROI models for a lot of these new investments. I’m a believer that it really has to be the healthcare provider that drives and facilitates the engagement with patients on these new technologies. Ultimately, I believe, people, left to their own devices, will experiment and play with something for a while, but unless their healthcare provider is engaging them actively on it, it’s not something that they will persist in doing. A lot of the large hospital groups are dipping their toes in the water and seeing what sticks, but I don’t really see any system where these new technologies are becoming part of the norm of healthcare delivery.

Do you feel like there are other places that are seeing more success in this outside of the US?

I know in the UK, they’re having a lot of success with their Telehealth pilots. But their primary objective is to make people healthier, so it’s a lot easier in that environment to have a good idea, show that there’s some case for improving outcomes and get funding. In the US, proving outcomes currently isn’t enough. You have to prove that there’s some revenue to be made or cost to be saved. In some markets, they’ve experienced problems similar to the US and in some markets it’s probably been easier. That doesn’t mean they’ve had an easy time implementing them—the UK has had huge problems with integration and with getting EMR systems deployed and implemented nationally. But a lot of those are classical IT problems of change management, scope control and trying to achieve too much too quickly. The healthcare industry is about 20 years behind other industries. They’re going through all the pain with the EMR rollouts that most manufacturing companies went through with ERP 20 years ago and most banks went through 40 years ago.

How can organizations such as The Open Group and its Healthcare Forum better work with the Healthcare industry to help them achieve better results?

I think firstly it’s bringing a perspective from other industries. Healthcare IT conferences and organizations tend to be largely made up of people who have been in healthcare most of their working lives. The Open Group brings in perspective from other industries. Also reference architectures—there’s a shortage of good reference architectures in the healthcare space, and that’s something that is really The Open Group’s strong point. Models that span the entire healthcare ecosystem—including payers, providers, pharma and exchanges—are needed, and IT process, especially IT architecture process, can be improved in healthcare. Healthcare IT departments aren’t as mature as other industries because the investment has not been there until now. They’re in a relative start-up mode. Enterprise Architecture—if you’re a large healthcare provider and you’re growing rapidly through M&A (like so many are right now), that’s a classic use case for having a structured Enterprise Architecture process.

Within the insurance marketplace movement, things have grown very quickly; it’s been tough work. A handful of the states have been very successful, and I think we’re not unique in that we’re a start-up organization, and it’s going to be several years until we mature to a fully functional, well-measured IT organization. Architecture rigor and process is key to achieving sustainability and maturity.

Join the conversation – #ogchat #ogBOS

Proteus Duxbury joined Connect for Health Colorado as Chief Technology Officer in February 2014, directing technology strategy and operations. Proteus previously served at Catholic Health Initiatives, where he led all IT activities for Virtual Health Services, a division responsible for deploying Telehealth solutions throughout the US. Prior to that, Proteus served as a Managing Consultant at the PA Consulting Group, leading technology change programs in the US and UK, primarily in the healthcare and life science industry. He holds a Bachelor of Science in Information Systems Management from Bournemouth University.


The Power of APIs – Join The Open Group Tweet Jam on Wednesday, July 9th

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The face of technology is evolving at breakneck speed, driven by demand from consumers and businesses alike for more robust, intuitive and integrated service offerings. APIs (application programming interfaces) have made this possible by offering greater interoperability between otherwise disparate software and hardware systems. While there are clear benefits to their use, how do today’s security and value-conscious enterprises take advantage of this new interoperability without exposing themselves?

On Wednesday, July 9th at 9:00 am PT/12:00 pm ET/5:00 pm GMT, please join us for a tweet jam that will explore how APIs are changing the face of business today, and how to prepare for their implementation in your enterprise.

APIs are at the heart of how today’s technologies communicate with one another, and they have been influential in enabling new levels of development for social, mobility and beyond. The business benefits of APIs are endless, as are the opportunities to explore how they can be effectively used and developed.
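As a minimal illustration of that kind of programmatic interoperability, the sketch below calls a hypothetical JSON endpoint over HTTP using only Python's standard library; the URL and the response fields are placeholders rather than a real API.

```python
# Minimal, hypothetical API consumer using only the standard library.
# The endpoint URL and the response fields are placeholders, not a real API.

import json
import urllib.request

API_URL = "https://api.example.com/v1/devices/42/status"  # placeholder endpoint


def fetch_device_status(url: str) -> dict:
    """GET a JSON resource and return it as a Python dict."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return json.load(response)


if __name__ == "__main__":
    status = fetch_device_status(API_URL)
    print(status.get("state"), status.get("last_seen"))
```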

There is reason to maintain a certain level of caution, however, as recent security issues involving open APIs have impacted overall confidence and sustainability.

This tweet jam will look at the business benefits of APIs, as well as potential vulnerabilities and weak points that you should be wary of when integrating them into your Enterprise Architecture.

We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought-leaders from The Open Group including Jason Lee, Healthcare and Security Forums Director; Jim Hietala, Vice President of Security; David Lounsbury, CTO; and Dr. Chris Harding, Director for Interoperability and Open Platform 3.0™ Forum Director. To access the discussion, please follow the hashtag #ogchat during the allotted discussion time.

Interested in joining The Open Group Security Forum? Register your interest, here.

What Is a Tweet Jam?

A tweet jam is a 45-minute “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter who is interested in the topic) is encouraged to join the discussion.

Participation Guidance

Here are some helpful guidelines for taking part in the tweet jam:

  • Please introduce yourself (name, title and organization)
  • Use the hashtag #ogchat following each of your tweets
  • Begin your tweets with the question number to which you are responding
  • Please refrain from individual product/service promotions – the goal of the tweet jam is to foster an open and informative dialogue
  • Keep your commentary focused, thoughtful and on-topic

If you have any questions prior to the event or would like to join as a participant, please contact George Morin (@GMorin81 or george.morin@hotwirepr.com).

We look forward to a spirited discussion and hope you will be able to join!


Strategic Alignment Survey

These days, organizations operate in a dynamic and fast-changing environment, which makes formulating a consistent strategy a challenging task and executing that strategy even more difficult. More than half of the organizations surveyed in previous economic studies indicated that they have not been successful at executing strategic initiatives. Moreover, a majority of organizations face problems when executing their strategic vision.

In an environment where competition and the globalization of markets are intensifying, managing and surviving change becomes increasingly important. A business strategy determines the decisions and course of action that businesses take to achieve competitive advantage and is therefore crucial to surviving change. Nonetheless, several economic studies indicate that many organizations fail to implement strategic alternatives. Therefore, it is important to know more about the reasons underlying the difficulties organizations have in reaching strategic alignment.

Strategic alignment
Organizations develop and implement strategies to achieve (strategic) goals. The development of a strategy is about formulating what should be changed to evolve from the current situation to the desired future state. Strategy implementation is about translating the strategic plans into clear actions to execute the strategy. Strategic alignment is the ability to create a fit or synergy between the position of the organization within the environment (business) and the design of the appropriate business processes, resources and capabilities (IT) to support the execution. Strategic alignment cannot be reached when strategy development is considered to be a separate process from strategy implementation. Strategy development and strategy implementation are intertwined processes which both need to be successful for superior firm performance.


The way organizations move from strategy development to strategy implementation is influenced by many factors. Consequently, strategic alignment is influenced by several factors which all contribute to the successful development and implementation of a strategy. We distinguish three categories, each combining several factors that influence strategic alignment. How organizations manage the factors within these three categories determines whether they are able to reach strategic alignment or not. These three categories are:

  • Culture and shared beliefs: the collective thoughts and actions of employees towards the strategic orientation of the organization determine whether strategy implementation will be successful or not. Consequently, all employees must be clear on the what, why, when and how of the strategy. According to previous studies, the inability of management to overcome resistance to change is an important obstacle to strategy execution.
  • Organizational capabilities: capabilities, resources, systems and processes should be aligned with the strategy to be able to execute it properly. An organization needs to consider its existing and needed capabilities and resources during strategy development and implementation. Strategic change gets obstructed when long-term strategic goals are not translated into short-term objectives or actions.
  • Communication: creating understanding throughout the organization about the strategy, like why it is developed and how it is implemented, is essential for developing and implementing a strategy. There should be a clear definition of purpose, values and behaviors to guide the implementation process. A poor or vague strategy makes it nearly impossible to execute successfully, which makes it a killer of strategy implementation.

Strategic Alignment Survey

In order to gain a better understanding of the strategic alignment efforts of individual organizations, BiZZdesign has created a Strategic Alignment survey. We want to understand more about the way in which organizations move from strategy development to strategy implementation. The information gathered from this survey contributes to the work done on improving strategic alignment within organizations. We would like to learn from your organization’s experiences regarding strategy development and implementation and its efforts towards strategic alignment. For this reason we kindly ask you to fill in the survey: http://alignment-eng.enquete.com/.

BiZZdesign (along with our partners The Open Group, NAF and the University of Twente) would be grateful if you could complete this Strategic Alignment survey to help us get a better understanding of the strategic alignment efforts of organizations.

The survey will be available on-line until the end of June 2014. All results will be analysed and reported in an anonymous way.

The results of this survey will then be published in a White Paper by The Open Group. If you leave us your contact email, then you will also receive the e-book ‘Strategizer – The Method’, in which initial results on strategic alignment are documented, and you have a chance to win a book voucher worth €200.

We really appreciate your time and effort. Thank you in advance.

Henry Franken, M.Sc., Ph.D., is CEO of BiZZdesign and chair of The ArchiMate Forum at The Open Group. Henry is a speaker at many conferences and has co-authored several international journal and conference publications and Open Group whitepapers.

At BiZZdesign, Henry is also responsible for research and innovation. Alignment with and contribution to open standards are key. BiZZdesign has contributed to and edited the ArchiMate 2 specification. BiZZdesign is involved in the workgroup working towards the next version of TOGAF® and its further hand-in-hand alignment with ArchiMate®.

BiZZdesign offers native tooling, certification training and consultancy for TOGAF® and ArchiMate®, both standards of The Open Group. BiZZdesign offers complete and integrated solutions (tools, methods, consultancy and training) to design and improve organizations. Business models, enterprise architecture, business requirements management, and business process analysis and management are important ingredients in these solutions.
