The Internet of Things is the New Media

By Dave Lounsbury, Chief Technical Officer, The Open Group

A tip of the hat to @artbourbon for pointing out the article “Principles for Open Innovation and Open Leadership” by Peter Vander Auwera, which led me to a TED Talk by Joi Ito on his “Nine Principles of the Media Lab”. Something in this presentation struck me:

“Media is plural for Medium, Medium is something in which you can express yourself. The medium was hardware, screens, robots, etc. Now the medium is society, ecosystem, journalism,… Our work looks more like social science.”

Great changes in society often go hand-in-hand with advances in communications, which in turn are tied to improvements in the scale or portability of media. Think of the printing press, television, or even the development of paint in tubes, which allowed Impressionist painters to get out of their studios to paint water lilies and wheat fields.


We are seeing a similar advance in the next generation of the Internet. Traditionally, humans interact with computer systems and networks through visual media, like screens of varying sizes and printed material. However, this is changing: sensors and actuators are shrinking in size and price, and there has been an explosion of devices, new services and applications that network these together into larger systems to increase their value through Metcalfe’s law. We perceive the actions of these devices not just with our eyes, but with other senses as well – a simple example is the feeling of warmth as your house adjusts its temperature as you arrive home.
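Metcalfe’s law holds that a network’s value grows roughly with the square of the number of connected nodes, since each new device can potentially interact with every existing one. A toy sketch in Python (using the pairwise-connection count as a stand-in for value; this is illustrative, not a valuation model):

```python
def metcalfe_value(n: int) -> int:
    """Number of distinct pairwise connections among n networked devices.

    Metcalfe's law takes network value as roughly proportional to this
    count, which grows on the order of n**2.
    """
    return n * (n - 1) // 2

# Doubling the device count roughly quadruples the potential connections,
# which is why networking cheap sensors together multiplies their value.
print(metcalfe_value(10))   # 45
print(metcalfe_value(20))   # 190
```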

These devices, and the platforms that orchestrate their interactions, are the media in which the next generation of the Internet will be painted. We call it the Internet of Things today, or maybe the Internet of Everything – but in the long run, it will just be the Internet. The expression of connectivity through sensors and devices will soon become as commonplace as social media is today.

Join the conversation! @theopengroup #ogchat

David is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, David leads the delivery of The Open Group’s proven processes for collaboration and certification, both within the organization and in support of third-party consortia.

David holds a degree in Electrical Engineering from Worcester Polytechnic Institute and is the holder of three U.S. patents.



Q&A with Marshall Van Alstyne, Professor, Boston University School of Management and Research Scientist MIT Center for Digital Business

By The Open Group

The word “platform” has become a nearly ubiquitous term in the tech and business worlds these days. From “Platform as a Service” (PaaS) to IDC’s Third Platform to The Open Group Open Platform 3.0™ Forum, the concept of platforms and building technology frames and applications on top of them has become the next “big thing.”

Although the technology industry tends to conceive of “platforms” as the vehicle that is driving trends such as mobile, social networking, the Cloud and Big Data, Marshall Van Alstyne, Professor at Boston University’s School of Management and a Research Scientist at the MIT Center for Digital Business, believes that the radical shifts that platforms bring are not just technological.

We spoke with Van Alstyne prior to The Open Group Boston 2014, where he presented a keynote, about platforms, how they have shifted traditional business models and how they are impacting industries everywhere.

The title of your session at the Boston conference was “Platform Shift – How New Open Business Models are Changing the Shape of Industry.” How would you define both “platform” and “open business model”?

I think of “platform” as a combination of two things. One, a set of standards or components that folks can take up and use for production of goods and services. The second thing is the rules of play, or the governance model – who has the ability to participate, how do you resolve conflict, and how do you divide up the royalty streams, or who gets what? You can think of it as the two components of the platform—the open standard together with the governance model. The technologists usually get the technology portion of it, and the economists usually get the governance and legal portions of it, but you really need both of them to understand what a ‘platform’ is.

What is the platform allowing then and how is that different from a regular business model?

The platform allows third parties to conduct business using system resources so they can actually meet and exchange goods across the platform. Wonderful examples of that include AirBnB where you can rent rooms or you can post rooms, or eBay, where you can sell goods or exchange goods, or iTunes where you can go find music, videos, apps and games provided by others, or Amazon where third parties are even allowed to set up shop on top of Amazon. They have moved to a business model where they can take control of the books in addition to allowing third parties to sell their own books and music and products and services through the Amazon platform. So by opening it up to allow third parties to participate, you facilitate exchange and grow a market by helping that exchange.

How does this relate to the “third platform” concept that the technology industry is defining?

I think of it slightly differently. The tech industry uses mobile and social and cloud and data to characterize it. In some sense this view offers those as the attributes that characterize platforms, or the knowledge base that enables platforms. But we would add to that the economic forces that actually shape platforms. What we want to do is give you some of the strategic tools, the incentives, the rules that will actually help you control a platform’s trajectory by improving who participates, and then measure and improve the value they contribute to the platform. So a full ecosystem view is not just the technology and the data; it also measures the value and how you divide that value. The rules of play really become important.

I think the “third platform” offers marvelous concepts and attributes, but you also need to add the economics to it: Why do you participate? Who gets what portions of the value? Who ultimately owns control?

Who does control the platform then?

A platform has multiple parts. Determining who controls what part is the art and design of the governance model. You have to set up control in the right way to motivate people to participate. But before we get to that, let’s go back and complete the idea of what’s an ‘open platform.’

To define an open platform, consider both the right of access and the right to manipulate platform resources, then consider granting those rights to four different parties. One is the users—can they access one another, can they access data, can they access system resources? Another group is the developers—can they manipulate system resources, can they add new features to it, can they sell through the platform? The third group is the platform providers. You often think of them as the folks that facilitate access across the platform. To give you an example, iTunes is a single monolithic store, so the provider is simply Apple, but Android, in contrast, allows multiple providers, so there’s a Samsung Android store, an LG Android store, a Google Android store; there’s even an Amazon version that uses a different version of Android. So that platform has multiple providers, each with rights to access users. The fourth group is the party that controls the underlying property rights: who owns the IP. The ability to modify the underlying standard, and also the rights of access for other parties, is the bottom-most layer.

So to answer the question of what is ‘open,’ you have to consider the rights of access of all four groups—the users, developers, the providers and IP rights holders, or sponsors, underneath.

Popping back up a level, we’re trying to motivate different parties to participate in the ecosystem. So what do you give the users? Usually it’s some kind of value. What do you give developers? Usually it’s some set of SDKs and APIs, but also some level of royalties. It’s fascinating. If you look back historically, Amazon initially tried a publishing-style royalty where they kept 70% and gave only 30% back to developers. They found that didn’t fly very well and had to fall back to the app-store or software-style royalty, where they take a lower percentage. I think Apple, for example, takes 30 percent, and Amazon is now close to that. You see royalty rates ranging anywhere from a few percent—an example is credit cards—all the way up to iStockphoto, where they take roughly 70 percent. That’s an extremely high rate, and one that I don’t recommend. We were just contracting for designs at 99designs, and they take a 20 percent cut. That’s probably more realistic, but lower might perhaps be even better—you can create a stronger network effect if that’s the case.

Again, the real question of control is how you motivate third parties to participate and add value. If you allow them to use resources to create value and keep a lot of that value, then they’re more motivated to participate, to invest, to bring their resources to your platform. If you take most of the value they create, they won’t participate and they won’t add value. One of the biggest challenges for open platforms—what you might call the ‘Field of Dreams’ approach—is that most folks open their platform and assume ‘if you build it, they will come,’ but you really need to reward them to do so. Why would they want to come build with you? There are numerous instances of platforms that opened but where no developer chose to add value—the ecosystem was too small. You have to solve the chicken-and-egg problem: if you don’t have users, developers don’t want to build for you, but if you don’t have developer apps, why would users participate? So you’ve got a huge feedback problem. That is where the economics become critical: you must solve the chicken-and-egg problem to build and roll out platforms.

It’s not just a technology question; it’s also an economics and rewards question.

Then who is controlling the platform?

The answer depends on the type of platform. Giving different groups different sets of rights creates different types of platform. Consider the four parties: users, developers, providers, and sponsors. At one extreme, the Apple Mac platform of the 1980s reserved most rights, for development, for producing hardware (the provider layer), and for modifying the IP (the sponsor layer), to Apple itself. Apple controlled the platform and it remained closed. In contrast, Microsoft relaxed platform control in specific ways. It licensed to multiple providers, enabling Dell, HP, Compaq and others to sell the platform. It gave developers rights of access to SDKs and APIs, enabling them to extend the platform. These control choices gave Microsoft more than six times the number of developers and more than twenty times the market share of Apple at the high point of Microsoft’s dominance of desktop operating systems. Microsoft gave up some control in order to create a more inclusive platform and a much bigger market.

Control is not a single concept. There are many different control rights you can grant to different parties. For example, you often want to give users the ability to control their own data. You often want to give developers intellectual property rights for the apps that they create, and often over the data that their users create. You may want to give them some protections against platform misappropriation—developers resent it if you take their ideas. If the platform sees a really clever app that’s been built on top of it, what’s the guarantee that the platform doesn’t simply take it or build a competing app? You need to protect your developers in that case. The same thing is true of the platform provider—what guarantees do they provide users for the quality of content on their ecosystem? For example, the Android ecosystem is much more open than the iPhone ecosystem, which means you have more folks offering stores. Simultaneously, that means there are more viruses and more malware in Android, so what rights and guarantees do you require of the platform providers to protect the users so that they want to participate? And then at the bottom, what rights do other participants have to control the direction of the platform’s growth? In the Visa model, for example, there are multiple member banks that help influence the general direction of that credit card standard. Usually the most successful platforms have a single IP rights holder, but there are several examples of platforms that have multiple IP rights holders.

So, in the end control defines the platform as much as the platform defines control.

What is the “secret” of the Internet-driven marketplace? Is that indeed the platform?

The secret is that, in effect, the goal of the platform is to increase transaction volume and value. If you can do that—and we can give you techniques for doing it—then you can create massive scale. Increasing the transaction value and transaction volume across your platform means that the owner of the platform doesn’t have to be the sole source of content and new ideas provided on the platform. If the platform owner is the only source of value, then the owner is also the bottleneck. The goal is to consummate matches between producers and consumers of value. You want to help users find the content, find the resources, find the other people that they want to meet across your platform. In Apple’s case, you’re helping them find the music, the video, the games, and the apps that they want. In AirBnB’s case, you’re helping them find the rooms that they want; with Uber, you’re helping them find a driver. On Amazon, the book recommendations help you find the content that you want. In all the truly successful platforms, the owner of the platform is not providing all of that value. They’re enabling third parties to add that value, and that’s one reason why The Open Group’s ideas are so important—you need open systems for this to happen.

What’s wrong with current linear business models? Why is a network-driven approach superior?

The fundamental reason why the linear business model no longer works is that it does not manage network effects. Network effects allow you to build platforms where users attract other users and you get feedback that grows your system. As more users join your platform, more developers join your platform, which attracts more users, which attracts more developers. You can see it on any of the major platforms. This is also true of Google. As advertisers use Google Search, the algorithms get better, people find the content that they want, so more advertisers use it. As more drivers join Uber, more people are happier passengers, which attracts more drivers. The more merchants accept Visa, the more consumers are willing to carry it, which attracts more merchants, which attracts more consumers. You get positive feedback.
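That feedback loop can be sketched as a toy simulation. The attraction coefficients below are made up for illustration; they are not drawn from any of the platforms mentioned:

```python
def simulate_network_effects(users: float, developers: float,
                             steps: int = 5,
                             a: float = 0.01, b: float = 0.05):
    """Toy model of cross-side network effects.

    Each period, new developers arrive in proportion to the user base
    (coefficient a) and new users arrive in proportion to the developer
    base (coefficient b). Returns the (users, developers) trajectory.
    """
    history = [(users, developers)]
    for _ in range(steps):
        developers += a * users      # users attract developers
        users += b * developers      # developers attract users
        history.append((users, developers))
    return history

# Both sides grow, and each side's growth feeds the other's.
for u, d in simulate_network_effects(1000, 10):
    print(f"users={u:,.0f} developers={d:,.0f}")
```

With positive coefficients the two sides compound each other, which is the positive feedback the linear model has no way to express.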

The consequence of that is that you tend to get market concentration—you get winner-take-all markets. That’s where platforms dominate. So you have a few large firms within a given category, whether this is rides or books or hotels or auctions. Further, once network effects change your business model, the linear insights into pricing, into inventory management, into innovation, into strategy break down.

When you have these multi-sided markets, pricing breaks down because you often price differently to one side than the other, because one side attracts the other. Inventory management practices break down because you’re selling inventory that you don’t even own. Your R&D strategies break down because now you’re motivating innovation and research outside the boundaries of the firm, as opposed to inside the internal R&D group. And your strategies break down because you’re not just looking for cost leadership or product differentiation; now you’re looking to shape the network effects as you create barriers to entry.

One of the things that I really want to argue strenuously is that in markets where platforms will emerge, platforms beat products every time. So the platform business model will inevitably beat the linear, product-based business model, because you’re harnessing new forces in order to develop a different kind of business model.

Think of it the following way: imagine that value is growing as users consume your product. Think of any of the major platforms: as more folks use Google, search gets better; as more folks use Amazon, recommendations improve; as more folks use Uber, it is easier to find a ride—so more folks want to be on there. It is easier to scale network effects outside your business than inside your business. There are simply more people outside than inside. The moment that happens, the locus of control, the locus of innovation, moves from inside the firm to outside the firm. So the rules change. Pricing changes, your innovation strategies change, your inventory policies change, your R&D changes. You’re now managing resources outside the firm, rather than inside, in order to capture scale. This is different from the traditional industrial supply economies of scale.

Old systems are giving way to new systems. It’s not that the whole system breaks down; it’s simply that you’re looking to manage network effects and manage new business models. Another way to see this is that previously you were managing capital. In the industrial era, you were managing steel, you were managing large amounts of finance in banking, you were managing auto parts—huge supply economies of scale. In telecommunications, you were managing infrastructure. Now you’re managing communities, and these are managed outside the firm. Look at the value that’s been created at Facebook or WhatsApp or Instagram or any of the new acquisitions: it’s not the capital that’s critical, it’s the communities, and these are built outside the firm.

There is a lot of talk in the industry about the Nexus of Forces as Gartner calls it, or Third Platform (IDC). The Open Group calls it Open Platform 3.0. Your concept goes well beyond technology—how does Open Platform 3.0 enable new business models?

Those are the enablers—they’re, shall we say, necessary, but they’re not sufficient. You really must harness the economic forces in addition to those enablers—mobile, social, Cloud, data. You must manage communities outside the firm; that’s the mobile and the social element of it. But this also involves designing governance and setting incentives. How are you capturing users outside the organization, how are they contributing, how are they being motivated to participate, why are they spreading your products to their peers? The Cloud allows it to scale—that’s how Instagram, WhatsApp and others scale. Data allows you to “consummate the match”: you use that data to help people find what they need, to add value. So all of those things are the enablers. Then you have to harness the economics of the enablers to encourage people to do the right thing. You can see the correct intuition if you simply ask what happens if all you offer is a Cloud service and nothing more. Why will anyone use it? What’s the value of that system? If you open APIs to it, again, if you don’t have a user base, why are developers going to contribute? Developers want to reach users. Users want valuable functionality.

You must manage the motives and the value-add on the platform. New business models come from orchestrating not just the technology but also the third-party sources of value. One of the biggest challenges is to grow these businesses from scratch—you’ve got the cold-start, chicken-and-egg problem: you don’t have network effects if you don’t have a user base, and without network effects the platform doesn’t grow.

Do companies need to transform themselves into a “business platform” to succeed in this new marketplace? Are there industries immune to this shift?

There is a continuum of companies that are going to be affected. It starts at one end with companies that are highly information-intense—anything that’s an information-intensive business will be dramatically affected, and anything that’s a community- or fashion-based business will be dramatically affected. Those include companies involved in media and news, songs, music, video; all of those are going to be the canaries in the coal mine that see this first. Moving farther along will be those industries that require some sort of certification—those include law and medicine and education—and those, too, will be platformized, so the services industries will become platforms. Farther down the continuum are the ones that are heavily capital-intensive, where control of physical capital is paramount; those include trains and oil rigs and telecommunications infrastructure. Eventually those will be affected by platform business models to the extent that data helps them gain efficiencies or add value, but they will in some sense be the last to be affected. Look for the businesses where the cost side is shrinking in proportion to the delivery of value and where the network effects are rising as a proportional increase in value. Those forces will help you predict which industries will be transformed.

How can Enterprise Architecture be a part of this and how do open standards play a role?

The second part of that question is actually much easier. How do open standards play a role? Open standards make it much easier for third parties to attach to and incorporate technology and features such that they can in turn add value. Open standards are essential to that happening. You do need to ask who controls those standards—is it a completely open standard, or a proprietary one that is published but not manipulable by a third party?

There are at least three or four things that Enterprise Architects need to do. One is to design modular components that are swappable, so that as better systems become available, they can be swapped in. The second is to watch for components of value that should be absorbed into the platform itself—in operating systems, for example, web browsing and streaming have effectively been absorbed into the platform—so architects need to be aware of how that process works. A third is to talk to the legal team to see where third parties’ property rights can be protected so that they invest. One of the biggest mistakes that firms make is to assume that because they own the platform and have the rights of control, they can do what they please. If they do that, they risk alienating their ecosystems. So they should talk to their potential developers and incorporate developer concerns. One of my favorite examples is the Intel Architecture Lab, which has done a beautiful job of articulating the voices of developers in its own architectural plans. A fourth thing that can be done is an idea borrowed from SAP: articulate an 18-24 month roadmap that says these are the features that are coming, so developers can anticipate and build on them. It also gives them an idea of which features will be safe to build on, so they won’t lose the value they’ve created.

What can companies do to begin opening their business models and more easily architect that?

What they should do is consider the four groups articulated earlier—the users, the providers, the developers and the sponsors—each of which serves a different role. Firms need to understand what their own role will be so that they can open up and architect the other roles within their ecosystem. They’ll also need to choose what levels of exclusivity to give their ecosystem partners in different slices of the business. They should also figure out which components they prefer to offer themselves as unique competencies and where they need to seek third-party assistance, whether in new ideas, new resources or even new marketplaces. Those factors will help guide businesses toward different kinds of partnerships, and they’ll have to be open to those kinds of partners. In particular, they should think about where they are most likely to be missing ideas or opportunities, and open those technical and business areas so that third parties can take advantage of them and add value.

 

Professor Van Alstyne is one of the leading experts in network business models. He conducts research on information economics, covering such topics as communications markets, the economics of networks, intellectual property, social effects of technology, and productivity effects of information. As co-developer of the concept of “two-sided networks,” he has been a major contributor to the theory of network effects, a set of ideas now taught in more than 50 business schools worldwide.

Awards include two patents, National Science Foundation IOC, SGER, SBIR, iCorp and Career Awards, and six best paper awards. Articles or commentary have appeared in Science, Nature, Management Science, Harvard Business Review, Strategic Management Journal, The New York Times, and The Wall Street Journal.



Discussing Enterprise Decision-Making with Penelope Everall Gordon

By The Open Group

Most enterprises today are in the process of jumping onto the Big Data bandwagon. The promise of Big Data, as we’re told, is that if every company collects as much data as they can—about everything from their customers to sales transactions to their social media feeds—executives will have “the data they need” to make important decisions that can make or break the company. Not collecting and using your data, as the conventional wisdom has it, can have deadly consequences for any business.

As is often the case with industry trends, the hype around Big Data contains both a fair amount of truth and a fair amount of fuzz. The problem is that within most organizations, that conventional wisdom about the power of data for decision-making is usually just the tip of the iceberg when it comes to how and why organizational decisions are made.

According to Penelope Gordon, a consultant for 1Plug Corporation who was recently a Cloud Strategist at Verizon and was formerly a Service Product Strategist at IBM, that’s why big “D” (Data) needs to be put back into the context of enterprise decision-making. Gordon, who spoke at The Open Group Boston 2014, in the session titled “Putting the D Back in Decision” with Jean-Francois Barsoum of IBM, argues that a focus on collecting a lot of data has the potential to get in the way of making quality decisions. This is, in part, due to the overabundance of data that’s being collected under the assumption that you never know where there’s gold to be mined in your data, so if you don’t have all of it at hand, you may just miss something.

Gordon says that assuming the data will make decisions obvious also ignores the fact that ultimately decisions are made by people—and people usually make decisions based on their own biases. According to Gordon, there are three natural decision-making styles—heart, head and gut—corresponding to different personality types; the greater the amount of data, the less likely it is to counterbalance the decision-maker’s natural style.

Head types, Gordon says, naturally make decisions based on quantitative evidence. But what often happens is that head types often put off making a decision until more data can be collected, wanting more and more data so that they can make the best decision based on the facts. She cites former President Bill Clinton as a classic example of this type. During his presidency, he was famous for putting off decision-making in favor of gathering more and more data before making the decision, she says. Relying solely on quantitative data also can mean you may miss out on other important factors in making optimal decisions based on either heart (qualitative) or instinct. Conversely, a gut-type presented with too much data will likely just end up ignoring data and acting on instinct, much like former President George W. Bush, Gordon says.

Gordon believes part of the reason that data and decisions are more disconnected than one might think is because IT and Marketing departments have become overly enamored with what technology can offer. These data providers need to step back and first examine the decision objectives as well as the governance behind those decisions. Without understanding the organization’s decision-making processes and the dynamics of the decision-makers, it can be difficult to make optimal and effective strategic recommendations, she says, because you don’t have the full picture of what the stakeholders will or will not accept in terms of a recommendation, data or no data.

Ideally, Gordon says, you want to get to a point where you can get to the best decision outcome possible by first figuring out the personal and organizational dynamics driving decisions within the organization, shifting the focus from the data to the decision for which the data is an input.

“…what you’re trying to do is get the optimal outcome, so your focus needs to be on the outcome, so when you’re collecting the data and assessing the data, you also need to be thinking about ‘how am I going to present this data in a way that it is going to be impactful in improving the decision outcomes?’ And that’s where the governance comes into play,” she said.

Governance is of particular importance now, Gordon says, because decisions are increasingly being made by individual departments, such as when departments buy their own cloud-enabled services, such as sales force automation. In that case, an organization needs to have a roadmap in place with compensation to incent decision-makers to adhere to that roadmap and decision criteria for buying decisions, she said.

Gordon recommends that companies put in place 3-5 top criteria for each decision that needs to be made so that they can ensure the decision objectives are met. This distillation of the metrics gives decision-makers a more comprehensible picture of their data, so that their decisions don’t become either too subjective or disconnected from the data. Lower-level metrics can be used underneath each of those top-level criteria to facilitate a more nuanced valuation. For example, an organization looking for good partner candidates should score and rank potential partners (preferably in tiers) using decision criteria based on the characteristics of its most attractive partner, rather than just assuming that the companies with the best reputations or biggest brands will be the best; that way it will expeditiously identify the optimal candidates.
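A minimal sketch of that kind of criteria-based scoring and tiering, in Python. The candidate names, criteria, weights and tier thresholds below are all hypothetical illustrations, not examples from Gordon:

```python
# Hypothetical top-level criteria and their relative weights (sum to 1.0).
CRITERIA_WEIGHTS = {"strategic_fit": 0.4, "technical_capability": 0.3,
                    "market_reach": 0.2, "cultural_fit": 0.1}

# Hypothetical candidates, each scored 0-10 per criterion.
candidates = {
    "Acme Corp":    {"strategic_fit": 9, "technical_capability": 6,
                     "market_reach": 8, "cultural_fit": 7},
    "BigBrand Inc": {"strategic_fit": 5, "technical_capability": 7,
                     "market_reach": 10, "cultural_fit": 4},
    "NicheCo":      {"strategic_fit": 8, "technical_capability": 9,
                     "market_reach": 4, "cultural_fit": 9},
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of a candidate's scores over the top-level criteria."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

def tier(score: float) -> str:
    """Bucket a score into tiers (thresholds are illustrative)."""
    return "A" if score >= 8 else "B" if score >= 6 else "C"

ranked = sorted(candidates.items(),
                key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    s = weighted_score(scores)
    print(f"{name}: {s:.1f} (tier {tier(s)})")
```

Note that the biggest brand ("BigBrand Inc", which dominates only on market reach) does not come out on top once all the criteria are weighed, which is exactly the trap the criteria are meant to avoid.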

One of the reasons that companies have gotten so concerned with collecting and storing data rather than just making better decisions, Gordon believes, is that business decisions have become inherently more risky. The required size of investment is increasing in tandem with an increase in the time to return; time to return is a key determinant of risk. Data helps people feel like they are making competent decisions but in reality does little to reduce risk.

“If you’ve got lots of data, then the thinking is, ‘well, I did the best that I could because I got all of this data.’ People are worried that they might miss something,” she said. “But that’s where I’m trying to come around and say, ‘yeah, but going and collecting more data, if you’ve got somebody like President Clinton, you’re just feeding into their tendency to put off making decisions. If you’ve got somebody like President Bush and you’re feeding into their tendency to ignore it, then there may be some really good information, good recommendations they’re ignoring.’”

Gordon also says that having all the data possible to work with isn’t usually necessary—generally a representative sample will do. For example, she says the U.S. Census Bureau attempts to count every citizen; consequently, certain populations are chronically undercounted and inaccuracies go undetected. The Canadian census, on the other hand, uses representative samples and thus tends to be much more accurate—and much less expensive to conduct. Organizations should also think about how they can find representative or “proxy” data in cases where collecting data that directly addresses a top-level decision criterion isn’t practical.
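The sampling point can be demonstrated with a quick simulation: a 1% random sample of a synthetic population typically estimates the population mean nearly as well as counting every member, at a fraction of the cost. The population shape, sample size and seed below are arbitrary choices for illustration.

```python
# Illustration of representative sampling vs. a full count: estimate the
# mean of a synthetic one-million-member population from a 1% sample.

import random
import statistics

random.seed(42)
population = [random.gauss(50, 15) for _ in range(1_000_000)]

full_count_mean = statistics.mean(population)   # "count everyone"
sample = random.sample(population, 10_000)      # 1% representative sample
sample_mean = statistics.mean(sample)

print(f"full count: {full_count_mean:.2f}")
print(f"1% sample:  {sample_mean:.2f}")
# The two estimates typically agree to within a few tenths of a unit.
```

The standard error of a sample this size is a small fraction of the population's spread, which is why a well-drawn sample can out-perform an error-prone attempt at full enumeration.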

To begin shifting the focus from collecting data inputs to improving decision outcomes, Gordon recommends clearly stating the decision objectives for each major decision and then identifying and defining the 3-5 criteria that are most important for achieving those objectives. She also recommends ensuring that there is sufficient governance and a process in place for making decisions, including mechanisms for measuring the performance of both the decision-making process and the outcomes resulting from its execution. In addition, companies need to consider whether their decisions are made in a centralized or decentralized manner and adapt decision governance accordingly.

One way that Enterprise Architects can help to encourage better decision-making within the organizations in which they work is to help in developing that governance rather than just providing data or data architectures, Gordon says. They should help stakeholders identify and define the important decision criteria, determine when full population rather than representative sampling is justified, recognize better methods for data analysis, and form decision recommendations based on that analysis. By gauging the appropriate blend of quantitative and qualitative data for a particular decision maker, an Architect can moderate gut types’ reliance on instinct and stimulate head and heart types’ intuition – thereby producing an optimally balanced decision. Architects should help lead and facilitate execution of the decision process, as well as help determine how data is presented within organizations in order to support the recommendations with the highest potential for meeting the decision objectives.

Join the conversation – #ogchat

Penelope Gordon recently led the expansion of the channel and service packaging strategies for Verizon’s cloud network products. Previously she was an IBM Strategist and Product Manager bringing emerging technologies such as predictive analytics to market. She helped to develop one of the world’s first public clouds.

 

 


Case Study – ArchiMate®, An Open Group Standard: Public Research Centre Henri Tudor and Centre Hospitalier de Luxembourg

By The Open Group

The Public Research Centre Henri Tudor is an institute of applied research aimed at reinforcing the innovation capacity of organizations and companies, providing support for national policies, and promoting international recognition of Luxembourg’s scientific community. Its activities include applied and experimental research; doctoral research; the development of tools, methods, labels, certifications and standards; technological assistance; consulting and watch services; and knowledge and competency transfer. Its main technological domains are advanced materials, environmental technologies, healthcare, and information and communication technologies, as well as business organization and management. The Centre applies its competencies across a number of industries, including healthcare, industrial manufacturing, mobile, transportation and financial services, among others.

In 2012, the Centre Hospitalier de Luxembourg allowed Tudor to experiment with an access rights management system modeled using ArchiMate®, an Open Group standard. CRP Tudor tested this model to validate the approach used by the hospital’s management to grant employees, nurses and doctors permission to access patient records.

Background

The Centre Hospitalier de Luxembourg is a public hospital that focuses on severe pathologies, medical and surgical emergencies and palliative care. The hospital also has an academic research arm. The hospital employs a staff of approximately 2,000, including physicians and specialized employees, medical specialists, nurses and administrative staff. On average the hospital performs more than 450,000 outpatient services, 30,000 inpatient services and more than 60,000 adult and pediatric emergency services per year.

Unlike many hospitals throughout the world, the Centre Hospitalier de Luxembourg is open and accessible 24 hours a day, seven days a week. Staff must therefore be able to access patient records at any time, day or night, weekday or weekend. In addition, the Grand Duchy of Luxembourg has a system in which weekend medical emergencies are allocated to a single hospital in each of the country’s three regions. In other words, every two weeks one hospital within a given region is responsible for all incoming medical emergencies on its assigned weekend, which affects patient volume and activity.

Access rights management

As organizations have become not only increasingly global but also increasingly digital, access rights management has become a critical component of keeping institutional information secure so that it does not fall into the wrong hands. Managing access to internal information is a critical component of every company’s security strategy, but it is particularly important for organizations that deal with sensitive information about consumers, or in the case of the Centre Hospitalier de Luxembourg, patients.

Modeling an access rights management system was important for the hospital for a number of reasons. First, European privacy laws dictate that only the people who require information regarding patient medical files should be allowed access to those files. Although privacy laws may restrict access to patient records, a rights management system must be flexible enough to grant access to the correct individuals when necessary.

In the case of a hospital such as the Centre Hospitalier de Luxembourg, access to information may be critical to the life of the patient. For instance, if a patient is admitted to the emergency room, the emergency room physician will be better able to treat the patient if he or she can access the patient’s records, even without being the patient’s primary care physician. Admitting personnel may also need access to records at the time of admittance. A successful access rights management system must therefore strike a balance between restricting information and providing flexible access as necessary, giving the right access at the right time without placing an administrative burden on doctors or staff.

The project

Prior to the experiment in which the Public Research Centre Henri Tudor tested this access rights management model, the Centre Hospitalier de Luxembourg had not experienced any problems with its information sharing system. However, its access rights were still being managed by a primarily paper-based system. As part of the scope of the project, the hospital was also looking to become compliant with existing privacy laws. Developing an access rights management model was intended to close the gap within the hospital between restricting access to patient information overall and providing new rights, as necessary, to employees that would allow them to do their work without endangering patient lives. From a technical perspective, the access rights management system needed not only to work in conjunction with the hospital’s existing applications, such as its ERP system, but also to support rights management at the business layer.

Most current access rights management systems grant individuals access to information based on a combination of functional requirements (what employees need to do their jobs) and governance requirements (the protections that keep the organization and its information safe and secure). Many existing access control models and rights engineering methods fail to adequately represent both sides of this equation, which makes it difficult to determine the correct level of access for different employees within an organization.
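As a minimal sketch of the idea pursued in this project (deriving permissions from an employee's responsibilities rather than granting them directly to a job function), consider the following. This is not the actual ReMMo metamodel, and every role, responsibility and permission name is invented for illustration.

```python
# Sketch of responsibility-based access rights: a role carries a set of
# responsibilities, and each responsibility implies a set of permissions.
# Not the actual ReMMo metamodel; all names below are invented.

from dataclasses import dataclass, field

@dataclass
class Responsibility:
    name: str
    permissions: set = field(default_factory=set)

@dataclass
class Role:
    name: str
    responsibilities: list = field(default_factory=list)

    def permissions(self):
        """Union of the permissions implied by each responsibility."""
        granted = set()
        for r in self.responsibilities:
            granted |= r.permissions
        return granted

treat_patients = Responsibility("treat patients",
                                {"read_medical_record", "write_prescription"})
admit_patients = Responsibility("admit patients",
                                {"read_admin_record", "create_admission"})

er_physician = Role("ER physician", [treat_patients, admit_patients])
receptionist = Role("receptionist", [admit_patients])

def can(role, permission):
    return permission in role.permissions()

assert can(er_physician, "read_medical_record")
assert not can(receptionist, "read_medical_record")
```

Because permissions flow from responsibilities, auditing becomes a matter of comparing what a role can do against what its responsibilities actually require, which is how over-provisioning (such as the reception department case described later) can be detected.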

Modeling access rights management

Within the Centre Hospitalier de Luxembourg, employee access rights were defined based on individual job responsibilities and job descriptions. To best determine how to grant access rights across a hospital, the Public Research Centre Henri Tudor needed to create a system that could take these responsibilities into account, rather than rely solely on functional or governance requirements.

To create an access rights management model that would work with the hospital’s existing processes and ERP software, the Public Research Centre Henri Tudor first needed to come up with a way to model responsibility requirements instead of just functional or governance requirements. According to Christophe Feltus, Research Engineer at the Public Research Centre, defining a new approach based on actor or employee responsibilities was the first step in creating a new model for the hospital.

Although existing architecture modeling languages provide views for many different types of stakeholders within organizations—from executives to IT and project managers—no modeling language had previously been used to develop a view dedicated to access rights management, Feltus says. As such, that view needed to be created and modeled anew for this project.

To develop this new view, the Public Research Centre needed to find an architecture modeling language that was flexible enough to accommodate such an extension. After evaluating three separate modeling languages, they chose ArchiMate®, an Open Group Standard and open and independent modeling language, to help them visualize the relationships among the hospital’s various employees in an unambiguous way.

Much like architectural drawings are used in building architecture to describe the various aspects of construction and building use, ArchiMate provides a common language for describing how to construct business processes, organizational structures, information flows, IT systems and technical infrastructures. By providing a common language and visual representation of systems, ArchiMate helps stakeholders within organizations design, assess and communicate how decisions and changes within business domains will affect the organization.

According to Feltus, ArchiMate provided a well-formalized language for the Public Research Centre to portray the architecture needed to model the access rights management system they wanted to propose for the Centre Hospitalier. Because ArchiMate is a flexible and open language, it also provided an extension mechanism that could accommodate the responsibility modeling language (ReMMo) that the engineering team had developed for the hospital.

In addition to providing the tools and extensions necessary for the engineering team to properly model the hospital’s access rights system, the Public Research Centre also chose ArchiMate because it is an open and vendor-neutral modeling language. As a publicly funded institution, it was important that the Public Research Centre avoid vendor-specific tools that would lock it into a potentially costly cycle of constant version upgrades.

“What was very interesting [about ArchiMate] was that it was an open and independent solution. This is very important for us. As a public company, it’s preferable not to use private solutions. This was something very important,” said Feltus.

Feltus notes that using ArchiMate to model the access rights project was also a relatively easy and intuitive process. “It was rather easy,” Feltus said. “The concepts are clear and recommendations are well done, so it was easy to explore the framework.” The most challenging part of the project was selecting which extension mechanism would best portray the design and model they wanted to use.

Results

After developing the access rights model using ArchiMate, the Public Research Centre Henri Tudor presented the responsibility metamodel to the hospital’s IT staff. The Public Research Centre team believes that the responsibility model created using ArchiMate allows for better alignment between the hospital’s business processes, defined at the business layer, and its IT applications, which run at the application layer. The team also believes the model could both enhance the provisioning of access rights to employees and improve the hospital’s performance. For example, using the proposed responsibility model, the team found that some employees in the reception department had been assigned more permissions than they required in practice. Comparing the research findings with the reality on the ground at the hospital has shown the Public Research Centre team that ArchiMate is an effective tool for modeling and determining both responsibilities and access rights within organizations.

Due to the ease of use and success the Public Research Centre Henri Tudor experienced in using ArchiMate to create the responsibility model and the access rights management system for the hospital, Tudor also intends to continue to use ArchiMate for other public and private research projects as appropriate.

Follow The Open Group @theopengroup, #ogchat and / or let us know your thoughts on the blog here.

 


The Open Group Boston 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Enabling Boundaryless Information Flow™ continued in Boston on Tuesday, July 22. Allen Brown, CEO and President of The Open Group, welcomed attendees with an overview of the organization’s second quarter results.

The Open Group membership is at 459 organizations in 39 countries, including 16 new membership agreements in 2Q 2014.

Membership value is highlighted by the collaboration Open Group members experience. For example, over 4,000 individuals attended Open Group events, physically and virtually, whether member meetings, webinars, podcasts or tweet jams. The Open Group website had more than 1 million page views, and members in 80 countries downloaded over 105,000 publication items.

Brown also shared highlights from The Open Group Forums, which featured status updates on many upcoming white papers, snapshots, reference models and standards, as well as individual Forum Roadmaps. The Forums are busy developing and reviewing projects such as the next version of TOGAF®, an Open Group standard; an ArchiMate® white paper; The Open Group Healthcare Forum charter and treatise; Standard Mils™ APIs; and Open FAIR. Many publications are translated into multiple languages, including Chinese and Portuguese. Also, a new Forum will be announced in the third quarter at The Open Group London 2014, so stay tuned for that launch news!

Our first keynote of the day was Making Health Addictive by Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health.

Dr. Kvedar described how Healthcare delivery is changing, with mobile technology being a big part of that change. Other factors pushing change are reimbursement paradigms under which caregivers are paid to be more efficient and to keep people healthy and out of hospitals. The goal of Healthcare providers is to integrate care into the day-to-day lives of patients. Healthcare also aims for better technologies and architecture.

Mobile is a game-changer in Healthcare because people are “always on and connected”. Mobile technology allows for in-the-moment messaging, the ability to capture health data (GPS, accelerometer, etc.) and the display of information in real time as needed. Bottom line, smartphones are addictive, which makes them excellent tools for communication and engagement.

But there is a need to understand and address the implications of automating Healthcare: security, privacy, accountability, economics.

The plenary continued with Proteus Duxbury, CTO, Connect for Health Colorado, who presented From Build to Run at the Colorado Health Insurance Exchange – Achieving Long-term Sustainability through Better Architecture.

Duxbury stated that the keys to his organization’s success are the leadership team’s shared vision; a flexible vendor that stayed agile amid rapidly changing regulatory requirements; and a COTS solution that required minimal customization and custom development, with a resilient, secure architecture. Connect for Health faces many challenges, including budget constraints, regulation and operating in a “fish bowl”. Yet they are on track with their three-year ‘build to run’ roadmap, stabilizing their foundation and gaining efficiencies.

During the Q&A with Allen Brown following each presentation, both speakers emphasized the need for standards, architecture and data security.

Allen Brown and Proteus Duxbury

During the afternoon, track sessions consisted of Healthcare, Enterprise Architecture (EA) & Business Value, Service-Oriented Architecture (SOA), Security & Risk Management, Professional Development and ArchiMate Tutorials. Chris Armstrong, President, Armstrong Process Group, Inc. discussed Architecture Value Chain and Capability Model. Laura Heritage, Principal Solution Architect / Enterprise API Platform, SOA Software, presented Protecting your APIs from Threats and Hacks.

The evening culminated with a reception at the historic Old South Meeting House, where the Boston Tea Party began in 1773.


Networking reception at Old South Meeting House

A special thank you to our sponsors and exhibitors at The Open Group Boston 2014: BiZZdesign, Black Duck, Corso, Good e-Learning, Orbus and AEA.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.


The Open Group Boston 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

The Open Group kicked off Enabling Boundaryless Information Flow™ July 21 at the spectacular setting of the Hyatt Boston Harbor. Allen Brown, CEO and President of The Open Group, welcomed over 150 people from 20 countries, including attendees from as far away as Australia, Japan, Saudi Arabia and India.

The first keynote speaker was Marshall Van Alstyne, Professor at Boston University School of Management & Researcher at MIT Center for Digital Business, known as a leading expert in business models. His presentation entitled Platform Shift – How New Open Business Models are Changing the Shape of Industry posed the questions “What does ‘openness’ mean? Why do platforms beat products every time?”.

Marshall Van Alstyne

According to “InterBrand: 2014 Best Global Brands”, 13 of the top 31 companies are “platform companies”. To be a ‘platform’, a company needs embeddable functions or services and must allow third-party access. Van Alstyne noted, “products have features, platforms have communities”. Great standalone products are not sufficient. Positive changes experienced by platform companies include pricing/profitability, supply chains, internal organization, innovation, decreased industry bottlenecks and strategy.

Platforms benefit from broad contributions, as long as there is control of the top several complements. Van Alstyne commented, “If you believe in the power of community, you need to embrace the platform.”

The next presentation was Open Platform 3.0™ – An Integrated Approach to the Convergence of Technology Platforms, by Dr. Chris Harding, Director for Interoperability, The Open Group. Dr. Harding discussed how society has evolved into a digital society.

1970 was considered the dawn of an epoch that saw the first RAM chip, IBM’s introduction of System/370 and a new operating system, UNIX®. Examples of digital progress since that era include driverless cars and Smart Cities (management of traffic, energy, water and communication).

Digital society enablers are digital structural change and corporate social media. The benefits are open innovation, open access, open culture, open government and delivering more business value.

Dr. Harding also noted that standards are essential to innovation and enable markets based on integration. The Open Group Open Platform 3.0™ is using ArchiMate®, an Open Group standard, to analyze the 30+ business use cases produced by the Forum. The development cycle is: understanding, analysis, specification, iteration.

Dr. Harding emphasized the importance of Boundaryless Information Flow™, as an enabler of business objectives and efficiency through IT standards in the era of digital technology, and designed for today’s agile enterprise with direct involvement of business users.

Both sessions concluded with an interactive audience Q&A hosted by Allen Brown.

The last session of the morning’s plenary was a panel: The Internet of Things and Interoperability. Dana Gardner, Principal Analyst at Interarbor Solutions, moderated the panel. Participating in the panel were Said Tabet, CTO for Governance, Risk and Compliance Strategy, EMC; Penelope Gordon, Emerging Technology Strategist, 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant, Smarter Cities, Water & Transportation, IBM; and Dave Lounsbury, CTO, The Open Group.

IoT Panel – Gardner, Barsoum, Tabet, Lounsbury, Gordon

The panel explored the practical limits and opportunities of the Internet of Things (IoT). Areas discussed included obstacles to decision-making as big data becomes more prolific, along with the openness, governance and connectivity of things, data and people, issues that pertain to many industries such as smart cities, manufacturing and healthcare.

How do industries, organizations and individuals deal with IoT? This is not necessarily a new problem, but an accelerated one. There are new areas of interoperability but where does the data go and who owns the data? Openness is important and governance is essential.

What needs to change most to realize the benefits of the IoT? The panel agreed on the need to push for innovation, increase education, move beyond models in which humans manage the interface (i.e., toward machine-to-machine interaction) and determine which data is most important, rather than always collecting all of it.

A podcast and transcript of the Internet of Things and Interoperability panel will be posted soon.

The afternoon was divided into several tracks: Boundaryless Information Flow™, Open Platform 3.0™ and Enterprise Architecture (EA) & Enterprise Transformation. Best Practices for Enabling Boundaryless Information Flow across the Government was presented by Syed Husain, Consultant Enterprise Architecture, Saudi Arabia E-government Authority. Robert K. Pucci, CTO, Communications Practice, Cognizant Technology Solutions discussed Business Transformation Justification Leveraging Business and Enterprise Architecture.

The evening concluded with a lively networking reception at the hotel.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

 


The Open Group Boston 2014 – Q&A with Proteus Duxbury, Connect for Health Colorado

By The Open Group

The U.S. healthcare industry is undergoing a major sea change right now due in part to the Affordable Care Act, as well as the need to digitize legacy systems that have remained largely paper-based in order to better facilitate information exchange.

Proteus Duxbury, the CTO for the state of Colorado’s health insurance exchange, Connect for Health Colorado, has a wide and varied background in healthcare IT, ranging from IT consulting and helping to lead a virtual health medicine group to his current position running the supporting technologies operating the Colorado exchange. Duxbury joined Connect for Health Colorado in early 2014 as the exchange was going through its first major enrollment period.

We spoke to Duxbury in advance of his keynote on July 22 at The Open Group Boston 2014 conference about the current state of healthcare IT and how Enterprise Architecture will play an integral part in the Connect for Health Colorado exchange moving forward as the organization transitions from a start-up culture to a maintenance-and-run mode.

Below is a transcript of that conversation.

What factors went into making the roll-out of Connect for Health Colorado healthcare exchange a success?

There were three things. The first is we have an exceptional leadership team. The CEO, especially, is a fantastic leader and was able to create a strong vision and have her team rally quickly behind it. The executive team was empowered to make decisions quickly and there was a highly dedicated work force and a powerful start-up culture. In addition, there was a uniformly shared passion to see healthcare reform successfully implemented in Colorado.

The second reason for success was the flexibility and commitment of our core vendor—which is CGI—and their ability to effectively manage and be agile with rapidly changing regulatory requirements and rapidly changing needs. These systems had never been built anywhere else before; it really was a green field program of work. There was a shared commitment to achieving success and very strong contracting in place ensuring that we were fully protected throughout the whole process.

The third is our COTS (Commercial Off-The-Shelf) solution that was selected. Early on, we established an architecture principle of deploying out-of-the-box products rather than trying to build from scratch, so there was minimal customization and development effort. Scope control was tight. We implemented the hCentive package, which is one of the leading health insurance exchange software packages. Best-of-breed solutions were implemented around the edges where it was necessary to meet a niche need, but we try to build as much into the single product as we can. We have a highly resilient and available architecture. The technical architecture scales well and has been very robust and resilient through a couple of very busy periods at the end of open enrollment, particularly on March 31st and toward the end of December, as the deadline for enrollment in 2014 plans approached.

Why are you putting together an Enterprise Architecture for the exchange?

We’re extremely busy right now with a number of critical projects. We’re still implementing core functionality but we do have a bit of a breather on the horizon. Going into next year things will get lighter, and now is the time for a clear roadmap to achieve the IT strategic objectives that I have set for the organization.

We are trying to achieve a reduction in our M&O (maintenance and operations) expense because we need to be self-sustaining from a budgetary point of view. Our federal funding will be going away starting in 2015, so we need to consolidate architecture and systems and gain additional efficiencies. We need to continue to meet our key SLAs, specifically around availability—we have a very public-facing set of systems. IT needs to be operationalized. We need to move from the existing start-up culture to the management of IT in a CMM (Capability Maturity Model) or ITIL-type fashion. And we also need to continue to grow and hold on to our customer base, as there is always a reasonable amount of churn and competing services in a relatively uncertain marketplace. We need to continue to grow our customer base so we can be self-sustaining. To support this, we need to have a more operationalized, robust and cost-efficient IT architecture, and we need a clear roadmap to get there. If you don’t have a roadmap or design that aligns with business priorities, then those things are difficult to achieve.

Finally, I am building up an IT team. To date, we’ve been highly reliant on contractors and consultants to get us to where we are now. In order to reduce our cost base, we are building out our internal IT team and a number of key management roles. That means we need to have a roadmap and something that we can all steer towards—a shared architecture roadmap.

What benefits do you expect to see from implementing the architecture?

Growing our customer base is a critical goal—we need to stabilize the foundations of our IT solution and use that as a platform for future growth and innovation. It’s hard to grow and innovate if you haven’t got your core IT platform stabilized. By making our IT systems easier to be maintained and updated we hope to see continued reduction in IT M&O. High availability is another benefit I expect to see, as well as closer alignment with business goals and business processes and capabilities.

Are there any particular challenges in setting up an Enterprise Architecture for a statewide health exchange? What are they?

I think there are some unique challenges. The first is budget. We do need to be self-sustaining, and there is not a huge amount of budget available for additional capital investments. There is some, but it has to be very carefully allocated, managed and spent diligently. We do work within a tightly controlled federal set of regulations and constraints and are frequently under the spotlight from auditors and others.

There are CMS (Centers for Medicare & Medicaid Services) regulations that define what we can and cannot do with our technology. We have security regulations that we have to exist within and a lot of IRS requirements that we have to meet and be compliant with. We have a complex set of partners to work with in Colorado and nationally—we have to work with Colorado state agencies such as the Department of Insurance and Medicaid (HCPF), we have to work very closely with a large number—we’ve currently got 17—of carriers. We have CMS and the federal marketplace (Federal Data Services Hub). We have one key vendor—CGI—but we are in a multi-vendor environment and all our projects involve having to manage multiple different organizations towards success.

The final challenge is that we’re very busy still building applications and implementing functionality, so my job is to continue to be focused on successful delivery of two very large projects, while ensuring our longer term architecture planning is completed, which is going to be critical for our long-term sustainability. That’s the classic Enterprise Architecture conundrum. I feel like we’ve got a handle on it pretty well here—because they’re both critical.

What are some of the biggest challenges that you see facing the Healthcare industry right now?

Number one is probably integration—the integration of data especially between different systems. A lot of EMR (electronic medical record) systems are relatively closed to the outside world, and it can be expensive and difficult to open them up. Even though there are some good standards out there like HL7 and EDI (Electronic Data Interchange), everyone seems to be implementing them differently.

Personal healthcare tech (mHealth and Telehealth) is not going to take off until there is more integration. For example, whatever you’re using to track your smoking, blood pressure, weight, etc. needs to be integrated seamlessly with your medical records and insurance provider. Until that data can be used for meaningful analytics and care planning—until this integration nightmare is solved—it’s going to be difficult to make great strides.

Security is the second challenge. There’s a huge proliferation of new technology endpoints, and there are a lot of weak links around people, process and technology. The regulators are only really starting to catch up, and they’re one step behind. There’s a lot of personal data out there, and it’s not always technology that’s the weak link. We have that pretty tightly controlled here because we’re highly regulated and our technology is tightly controlled, but on the provider side especially, it’s a huge challenge, and every week we see a new data breach.

The third challenge is ROI. There’s a lot of investment being made into personal health technology but because we’re in a private insurance market and a private provider market, until someone has really cracked what the ROI is for these initiatives, whether it’s tied to re-admissions or reimbursements, it’s never going to really become mainstream. And until it becomes part of the fabric of care delivery, real value is not going to be realized and health outcomes not significantly improved.

But models are changing—once the shift to outcome-based reimbursement takes hold, providers will be more incentivized to really invest in these kinds of technologies and get them working. But that shift hasn’t really occurred yet, and I’ve yet to see really compelling ROI models for a lot of these new investments. I’m a believer that it really has to be the healthcare provider that drives and facilitates the engagement with patients on these new technologies. Ultimately, I believe, people, left to their own devices, will experiment and play with something for a while, but unless their healthcare provider is engaging them actively on it, it’s not something that they will persist in doing. A lot of the large hospital groups are dipping their toes in the water to see what sticks, but I don’t really see any system where these new technologies are becoming part of the norm of healthcare delivery.

Do you feel like there are other places that are seeing more success in this outside of the US?

I know in the UK, they’re having a lot of success with their Telehealth pilots. But their primary objective is to make people healthier, so it’s a lot easier in that environment to have a good idea, show that there’s some case for improving outcomes and get funding. In the US, proving outcomes currently isn’t enough. You have to prove that there’s some revenue to be made or cost to be saved. In some markets, they’ve experienced problems similar to the US and in some markets it’s probably been easier. That doesn’t mean they’ve had an easy time implementing them—the UK has had huge problems with integration and with getting EMR systems deployed and implemented nationally. But a lot of those are classical IT problems of change management, scope control and trying to achieve too much too quickly. The healthcare industry is about 20 years behind other industries. They’re going through all the pain with the EMR rollouts that most manufacturing companies went through with ERP 20 years ago and most banks went through 40 years ago.

How can organizations such as The Open Group and its Healthcare Forum better work with the Healthcare industry to help them achieve better results?

I think firstly bringing a perspective from other industries. Healthcare IT conferences and organizations tend to be largely made up of people who have been in healthcare most of their working lives. The Open Group brings in perspective from other industries. Also reference architectures—there’s a shortage of good reference architectures in the healthcare space, and that’s something that is really The Open Group’s strong point: models that span the entire healthcare ecosystem, including payers, providers, pharma and exchanges. IT process, and especially IT architecture process, can be improved in healthcare. Healthcare IT departments aren’t as mature as other industries because the investment has not been there until now. They’re in a relative start-up mode. Enterprise Architecture—if you’re a large healthcare provider and you’re growing rapidly through M&A (like so many are right now), that’s a classic use case for having a structured Enterprise Architecture process.

Within the insurance marketplace movement, things have grown very quickly; it’s been tough work. A handful of the states have been very successful, and I think we’re not unique in that we’re a start-up organization, and it’s going to be several years until we mature into a fully functional, well-measured IT organization. Architecture rigor and process is key to achieving sustainability and maturity.

Join the conversation – #ogchat #ogBOS

Proteus Duxbury joined Connect for Health Colorado as Chief Technology Officer in February 2014, directing technology strategy and operations. Proteus previously served at Catholic Health Initiatives, where he led all IT activities for Virtual Health Services, a division responsible for deploying Telehealth solutions throughout the US. Prior to that, Proteus served as a Managing Consultant at the PA Consulting Group, leading technology change programs in the US and UK, primarily in the healthcare and life science industries. He holds a Bachelor of Science in Information Systems Management from Bournemouth University.