
Q&A with Marshall Van Alstyne, Professor, Boston University School of Management, and Research Scientist, MIT Center for Digital Business

By The Open Group

The word “platform” has become a nearly ubiquitous term in the tech and business worlds these days. From “Platform as a Service” (PaaS) to IDC’s Third Platform to The Open Group Open Platform 3.0™ Forum, the concept of platforms, and of building technologies and applications on top of them, has become the next “big thing.”

Although the technology industry tends to conceive of “platforms” as the vehicle that is driving trends such as mobile, social networking, the Cloud and Big Data, Marshall Van Alstyne, Professor at Boston University’s School of Management and a Research Scientist at the MIT Center for Digital Business, believes that the radical shifts that platforms bring are not just technological.

We spoke with Van Alstyne prior to The Open Group Boston 2014, where he presented a keynote, about platforms, how they have shifted traditional business models and how they are impacting industries everywhere.

The title of your session at the Boston conference was “Platform Shift – How New Open Business Models are Changing the Shape of Industry.” How would you define both “platform” and “open business model”?

I think of a “platform” as a combination of two things. One is a set of standards or components that folks can take up and use for the production of goods and services. The second is the rules of play, or the governance model – who has the ability to participate, how do you resolve conflict, and how do you divide up the royalty streams, or who gets what? You can think of those as the two components of the platform—the open standard together with the governance model. The technologists usually get the technology portion of it, and the economists usually get the governance and legal portions, but you really need both to understand what a ‘platform’ is.

What is the platform allowing then and how is that different from a regular business model?

The platform allows third parties to conduct business using system resources so they can actually meet and exchange goods across the platform. Wonderful examples of that include AirBnB where you can rent rooms or you can post rooms, or eBay, where you can sell goods or exchange goods, or iTunes where you can go find music, videos, apps and games provided by others, or Amazon where third parties are even allowed to set up shop on top of Amazon. They have moved to a business model where they can take control of the books in addition to allowing third parties to sell their own books and music and products and services through the Amazon platform. So by opening it up to allow third parties to participate, you facilitate exchange and grow a market by helping that exchange.

How does this relate to the concept the technology industry is defining as the “third platform”?

I think of it slightly differently. The tech industry uses mobile and social and cloud and data to characterize it. In some sense this view offers those as the attributes that characterize platforms, or the knowledge base that enables platforms. But we would add to that the economic forces that actually shape platforms. What we want to do is give you some of the strategic tools, the incentives, the rules that will actually help you control a platform’s trajectory by helping you improve who participates and then measure and improve the value they contribute. So a full ecosystem view is not just the technology and the data; it also measures the value and how you divide that value. The rules of play really become important.

I think the “third platform” offers marvelous concepts and attributes but you also need to add the economics to it: Why do you participate, who gets what portions of the value, and who ultimately owns control.

Who does control the platform then?

A platform has multiple parts. Determining who controls what part is the art and design of the governance model. You have to set up control in the right way to motivate people to participate. But before we get to that, let’s go back and complete the idea of what’s an ‘open platform.’

To define an open platform, consider both the right of access and the right to manipulate platform resources, then consider granting those rights to four different parties. One is the users—can they access one another, can they access data, can they access system resources? Another group is the developers—can they manipulate system resources, can they add new features, can they sell through the platform? The third group is the platform providers. You often think of them as the folks that facilitate access across the platform. To give you an example, iTunes is a single monolithic store, so the provider is simply Apple; Android, in contrast, allows multiple providers, so there’s a Samsung Android store, an HTC Android store, a Google Android store—there’s even an Amazon version that uses a different version of Android. So that platform has multiple providers, each with rights to access users. The fourth group is the party that controls the underlying property rights—who owns the IP. The ability to modify the underlying standard, and also the rights of access for other parties, is the bottom-most layer.
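
To make this four-party structure concrete, here is a minimal sketch (ours, not Van Alstyne’s) of how a platform’s rights profile might be captured in code; the party names, rights and particular assignments are illustrative assumptions.

```python
# Hypothetical sketch: the four platform parties and the rights an "open"
# platform may grant each of them. Names and assignments are illustrative.
from dataclasses import dataclass

@dataclass
class PartyRights:
    can_access: bool = False      # right to use platform resources
    can_manipulate: bool = False  # right to extend or modify resources

# One possible openness profile, loosely modeled on the Android example above.
openness_profile = {
    "user":      PartyRights(can_access=True),
    "developer": PartyRights(can_access=True, can_manipulate=True),
    "provider":  PartyRights(can_access=True),   # e.g., multiple Android stores
    "sponsor":   PartyRights(can_access=True, can_manipulate=True),  # owns the IP
}

def open_to(profile: dict) -> list:
    """List the parties to whom the platform is open, i.e., who has access rights."""
    return [party for party, rights in profile.items() if rights.can_access]

print(open_to(openness_profile))  # ['user', 'developer', 'provider', 'sponsor']
```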

So to answer the question of what is ‘open,’ you have to consider the rights of access of all four groups—the users, developers, the providers and IP rights holders, or sponsors, underneath.

Popping back up a level, we’re trying to motivate different parties to participate in the ecosystem. So what do you give the users? Usually it’s some kind of value. What do you give developers? Usually it’s some set of SDKs and APIs, but also some level of royalties. It’s fascinating. If you look back historically, Amazon initially tried a publishing-style royalty where they took 70% and gave only 30% back to developers. They found that didn’t fly very well, and they had to fall back to the app store or software-style royalty, where they’re taking a lower percentage. Apple, for example, takes 30 percent, and Amazon is now close to that. You see royalties ranging anywhere from a few percent—credit cards are an example—all the way up to iStockphoto, where they take roughly 70 percent. That’s an extremely high rate, and one that I don’t recommend. We were just contracting for designs at 99Designs, and they take a 20 percent cut. That’s probably more realistic, but lower might perhaps be even better—you can create stronger network effects if that’s the case.
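
To make the royalty arithmetic concrete, here is a toy calculation using the take rates quoted above; the function, labels and sales figure are our illustration, not Van Alstyne’s.

```python
# Toy illustration of the take rates quoted above: what a third party keeps
# of $100,000 in gross sales under each platform's cut.
def developer_payout(gross_sales: float, platform_take: float) -> float:
    """Return what the third-party seller keeps after the platform's cut."""
    return gross_sales * (1.0 - platform_take)

sales = 100_000.0
for label, take in [
    ("credit-card style (a few percent)", 0.03),
    ("99Designs", 0.20),
    ("app-store style (Apple)", 0.30),
    ("Amazon's early publishing split", 0.70),
]:
    kept = developer_payout(sales, take)
    print(f"{label}: platform takes {take:.0%}, seller keeps ${kept:,.0f}")
```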

Again, the real question of control is: how do you motivate third parties to participate and add value? If you are allowing them to use resources to create value and keep a lot of that value, then they’re more motivated to participate, to invest, to bring their resources to your platform. If you take most of the value they create, they won’t participate and they won’t add value. One of the biggest pitfalls for open platforms—what you might call the ‘Field of Dreams’ approach—is that most folks open their platform and assume ‘if you build it, they will come,’ but you really need to reward them to do so. Why would they want to come build with you? There are numerous instances of platforms that opened but where no developer chose to add value—the ecosystem was too small. You have to solve the chicken-and-egg problem: if you don’t have users, developers don’t want to build for you, but if you don’t have developer apps, why do users participate? So you’ve got a huge feedback problem. And that is where the economics become critical—you must solve the chicken-and-egg problem to build and roll out platforms.

It’s not just a technology question; it’s also an economics and rewards question.

Then who is controlling the platform?

The answer depends on the type of platform. Giving different groups different sets of rights creates different types of platforms. Consider the four different parties: users, developers, providers, and sponsors. At one extreme, the Apple Mac platform of the 1980s reserved most rights—for development, for producing hardware (the provider layer), and for modifying the IP (the sponsor layer)—to Apple itself. Apple controlled the platform and it remained closed. In contrast, Microsoft relaxed platform control in specific ways. It licensed to multiple providers, enabling Dell, HP, Compaq and others to sell the platform. It gave developers rights of access to SDKs and APIs, enabling them to extend the platform. These control choices gave Microsoft more than six times the number of developers and more than twenty times the market share of Apple at the high point of Microsoft’s dominance of desktop operating systems. Microsoft gave up some control in order to create a more inclusive platform and a much bigger market.

Control is not a single concept. There are many different control rights you can grant to different parties. For example, you often want to give users the ability to control their own data. You often want to give developers intellectual property rights for the apps that they create, and often over the data that their users create. You may want to give them some protections against platform misappropriation. Developers resent it if you take their ideas. So if the platform owner sees a really clever app that’s been built on top of its platform, what’s the guarantee that it doesn’t simply take it or build a competing app? You need to protect your developers in that case. The same thing’s true of the platform provider—what guarantees do they provide users for the quality of content on their ecosystem? For example, the Android ecosystem is much more open than the iPhone ecosystem, which means you have more folks offering stores. Simultaneously, that means there are more viruses and more malware in Android, so what rights and guarantees do you require of the platform providers to protect the users so that they want to participate? And then at the bottom, what rights do other participants have to control the direction of the platform’s growth? In the Visa model, for example, there are multiple member banks that help to influence the general direction of that credit card standard. Usually the most successful platforms have a single IP rights holder, but there are several examples of platforms that have multiple IP rights holders.

So, in the end control defines the platform as much as the platform defines control.

What is the “secret” of the Internet-driven marketplace? Is that indeed the platform?

The secret is that, in effect, the goal of the platform is to increase transaction volume and value. If you can do that—and we can give you techniques for doing it—then you can create massive scale. Increasing the transaction value and transaction volume across your platform means that the owner of the platform doesn’t have to be the sole source of content and new ideas provided on the platform. If the platform owner is the only source of value, then the owner is also the bottleneck. The goal is to consummate matches between producers and consumers of value. You want to help users find the content, find the resources, find the other people that they want to meet across your platform. In Apple’s case, you’re helping them find the music, the video, the games, and the apps that they want. In AirBnB’s case, you’re helping them find the rooms that they want; with Uber, you’re helping them find a driver. On Amazon, the book recommendations help you find the content that you want. In all the truly successful platforms, the owner of the platform is not providing all of that value. They’re enabling third parties to add that value, and that’s one reason why The Open Group’s ideas are so important—you need open systems for this to happen.

What’s wrong with current linear business models? Why is a network-driven approach superior?

The fundamental reason why the linear business model no longer works is that it does not manage network effects. Network effects allow you to build platforms where users attract other users and you get feedback that grows your system. As more users join your platform, more developers join your platform, which attracts more users, which attracts more developers. You can see it on any of the major platforms, including Google: as advertisers use Google Search, the algorithms get better, people find the content that they want, so more advertisers use it. As more drivers join Uber, passengers are happier, which attracts more passengers, which attracts more drivers. The more merchants accept Visa, the more consumers are willing to carry it, which attracts more merchants, which attracts more consumers. You get positive feedback.
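
Van Alstyne co-developed the theory of two-sided networks, and this feedback loop can be written in a simple illustrative form (our notation, not his). With $n_U$ users and $n_D$ developers, let the utility of joining each side be

$$u_U = \alpha_U + \gamma_{DU}\, n_D, \qquad u_D = \alpha_D + \gamma_{UD}\, n_U,$$

where the cross-side coefficients $\gamma_{DU}$ and $\gamma_{UD}$ measure how much each side values the other’s presence. Whenever both coefficients are positive, growth on one side raises the other side’s utility, pulling in more participants there, which in turn feeds back to the first side; that is exactly the loop described above.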

The consequence of that is that you tend to get market concentration—you get winner-take-all markets. That’s where platforms dominate. So you have a few large firms within a given category, whether that category is rides or books or hotels or auctions. Further, once you get network effects changing your business model, the linear insights into pricing, into inventory management, into innovation, into strategy all break down.

When you have these multi-sided markets, pricing breaks down because you often price differently to one side than the other, because one side attracts the other. Inventory management practices break down because you’re selling inventory that you don’t even own. Your R&D strategies break down because now you’re motivating innovation and research outside the boundaries of the firm, as opposed to inside the internal R&D group. And your strategies break down because you’re not just looking for cost leadership or product differentiation; now you’re looking to shape the network effects as you create barriers to entry.

One of the things that I really want to argue strenuously is that in markets where platforms will emerge, platforms beat products every time. So the platform business model will inevitably beat the linear, product-based business model, because you’re harnessing new forces in order to develop a different kind of business model.

Think of it the following way: imagine that value is growing as users consume your product. Think of any of the major platforms: as more folks use Google, search gets better; the more recommendations improve on Amazon, and the easier it is to find a ride on Uber, the more folks want to be on there. It is easier to scale network effects outside your business than inside your business. There are simply more people outside than inside. The moment that happens, the locus of control, the locus of innovation, moves from inside the firm to outside the firm. So the rules change. Pricing changes, your innovation strategies change, your inventory policies change, your R&D changes. You’re now managing resources outside the firm, rather than inside, in order to capture scale. This is different from the traditional industrial supply economies of scale.

Old systems are giving way to new systems. It’s not that the whole system breaks down; it’s simply that you’re looking to manage network effects and manage new business models. Another way to see this is that previously you were managing capital. In the industrial era, you were managing steel, you were managing large amounts of finance in banking, you were managing auto parts—huge supply economies of scale. In telecommunications, you were managing infrastructure. Now, you’re managing communities, and these are managed outside the firm. The value that’s been created at Facebook or WhatsApp or Instagram or any of the new acquisitions—it’s not the capital that’s critical, it’s the communities that are critical, and these are built outside the firm.

There is a lot of talk in the industry about the Nexus of Forces as Gartner calls it, or Third Platform (IDC). The Open Group calls it Open Platform 3.0. Your concept goes well beyond technology—how does Open Platform 3.0 enable new business models?

Those are the enablers—they’re, shall we say, necessary, but they’re not sufficient. You really must harness the economic forces in addition to those enablers—mobile, social, Cloud, data. You must manage communities outside the firm; that’s the mobile and the social element of it. But this also involves designing governance and setting incentives. How are you capturing users outside the organization, how are they contributing, how are they being motivated to participate, why are they spreading your products to their peers? The Cloud allows it to scale—so Instagram and WhatsApp and others scale. Data allows you to “consummate the match.” You use that data to help people find what they need, to add value, so all of those things are the enablers. Then you have to harness the economics of the enablers to encourage people to do the right thing. You can see the correct intuition if you simply ask what happens if all you offer is a Cloud service and nothing more. Why will anyone use it? What’s the value of that system? If you open APIs to it, again, if you don’t have a user base, why are developers going to contribute? Developers want to reach users. Users want valuable functionality.

You must manage the motives and the value-add on the platform. New business models come from orchestrating not just the technology but also the third-party sources of value. One of the biggest challenges is to grow these businesses from scratch—you’ve got the cold-start chicken-and-egg problem: you don’t have network effects without a user base, and without network effects it’s hard to attract that user base.

Do companies need to transform themselves into a “business platform” to succeed in this new marketplace? Are there industries immune to this shift?

There is a continuum of companies that are going to be affected. It starts at one end with companies that are highly information-intense—anything that’s an information-intensive business will be dramatically affected, and anything that’s a community- or fashion-based business will be dramatically affected. Those include companies involved in media and news, songs, music, video; all of those are going to be the canaries in the coal mine that see this first. Moving farther along will be those industries that require some sort of certification—law and medicine and education—which will also be platformized, so the services industries will become platforms. Farther down still are the ones that are heavily capital-intensive, where control of physical capital is paramount; those include trains and oil rigs and telecommunications infrastructure. Eventually those will be affected by platform business models to the extent that data helps them gain efficiencies or add value, but they will in some sense be the last to be affected. Look for the businesses where the cost side is shrinking in proportion to the delivery of value and where the network effects are rising as a proportional increase in value. Those forces will help you predict which industries will be transformed.

How can Enterprise Architecture be a part of this and how do open standards play a role?

The second part of that question is actually much easier. How do open standards play a role? Open standards make it much easier for third parties to attach to and incorporate technology and features such that they can in turn add value. Open standards are essential to that happening. You do need to ask who controls those standards—is it completely open, or is it a proprietary standard that is published but not manipulable by a third party?

There are at least four things that Enterprise Architects need to do. One is to design modular components that are swappable, so that as better systems become available, the better systems can be swapped in. The second is to watch for components of value that should be absorbed into the platform itself. As an example, in operating systems, web browsing has effectively been absorbed into the platform, and streaming has been absorbed into the platform, so architects need to be aware of how that absorption actually works. A third thing they need to do is talk to the legal team to see where third parties’ property rights can be protected so that those parties invest. One of the biggest mistakes that firms make is to simply assume that because they own the platform, because they have the rights of control, they can do what they please. If they do that, they risk alienating their ecosystems. So they should talk to their potential developers to incorporate developer concerns. One of my favorite examples is the Intel Architecture Labs, which has done a beautiful job of articulating the voices of developers in its own architectural plans. A fourth thing, an idea borrowed from SAP’s Enterprise Architecture practice, is to articulate an 18-24 month roadmap that says these are the features that are coming, so developers can anticipate and build on them. It also gives them an idea of which features will be safe to build on, so they won’t lose the value they’ve created.

What can companies do to begin opening their business models and more easily architect that?

What they should do is consider the four groups articulated earlier—the users, the providers, the developers and the sponsors—each of which serves a different role. Firms need to understand what their own role will be so that they can open and architect the other roles within their ecosystem. They’ll also need to choose what levels of exclusivity to give their ecosystem partners in different slices of the business. They should also figure out which components they prefer to offer themselves as unique competencies and where they need to seek third-party assistance, whether in new ideas, new resources or even new marketplaces. Those factors will help guide businesses toward different kinds of partnerships, and they’ll have to be open to those kinds of partners. In particular, they should think about where they are most likely to be missing ideas or missing opportunities. Those technical and business areas should be opened so that third parties can take advantage of those opportunities and add value.


Professor Van Alstyne is one of the leading experts in network business models. He conducts research on information economics, covering such topics as communications markets, the economics of networks, intellectual property, social effects of technology, and productivity effects of information. As co-developer of the concept of “two-sided networks,” he has been a major contributor to the theory of network effects, a set of ideas now taught in more than 50 business schools worldwide.

Awards include two patents, National Science Foundation IOC, SGER, SBIR, iCorp and Career Awards, and six best paper awards. Articles or commentary have appeared in Science, Nature, Management Science, Harvard Business Review, Strategic Management Journal, The New York Times, and The Wall Street Journal.



Discussing Enterprise Decision-Making with Penelope Everall Gordon

By The Open Group

Most enterprises today are in the process of jumping onto the Big Data bandwagon. The promise of Big Data, as we’re told, is that if every company collects as much data as they can—about everything from their customers to sales transactions to their social media feeds—executives will have “the data they need” to make important decisions that can make or break the company. Not collecting and using your data, as the conventional wisdom has it, can have deadly consequences for any business.

As is often the case with industry trends, the hype around Big Data contains both a fair amount of truth and a fair amount of fuzz. The problem is that within most organizations, that conventional wisdom about the power of data for decision-making is usually just the tip of the iceberg when it comes to how and why organizational decisions are made.

According to Penelope Gordon, a consultant for 1Plug Corporation who was recently a Cloud Strategist at Verizon and was formerly a Service Product Strategist at IBM, that’s why big “D” (Data) needs to be put back into the context of enterprise decision-making. Gordon, who spoke at The Open Group Boston 2014, in the session titled “Putting the D Back in Decision” with Jean-Francois Barsoum of IBM, argues that a focus on collecting a lot of data has the potential to get in the way of making quality decisions. This is, in part, due to the overabundance of data that’s being collected under the assumption that you never know where there’s gold to be mined in your data, so if you don’t have all of it at hand, you may just miss something.

Gordon says that assuming the data will make decisions obvious also ignores the fact that, ultimately, decisions are made by people—and people usually make decisions based on their own biases. According to Gordon, there are three natural decision-making styles—heart, head and gut—corresponding to different personality types; the greater the amount of data, the more likely it is that the data will not balance the decision-maker’s natural style.

Head types, Gordon says, naturally make decisions based on quantitative evidence. But head types often put off making a decision until more data can be collected, wanting more and more data so that they can make the best decision based on the facts. She cites former President Bill Clinton as a classic example of this type: during his presidency, he was famous for putting off decision-making in favor of gathering more and more data first, she says. Relying solely on quantitative data can also mean missing out on other important factors in making optimal decisions, factors based on either heart (qualitative evidence) or instinct. Conversely, a gut type presented with too much data will likely just end up ignoring the data and acting on instinct, much like former President George W. Bush, Gordon says.

Gordon believes part of the reason that data and decisions are more disconnected than one might think is that IT and Marketing departments have become overly enamored with what technology can offer. These data providers need to step back and first examine the decision objectives as well as the governance behind those decisions. Without understanding the organization’s decision-making processes and the dynamics of the decision-makers, it can be difficult to make optimal and effective strategic recommendations, she says, because you don’t have the full picture of what the stakeholders will or will not accept in terms of a recommendation, data or no data.

Ideally, Gordon says, you want to get to a point where you can get to the best decision outcome possible by first figuring out the personal and organizational dynamics driving decisions within the organization, shifting the focus from the data to the decision for which the data is an input.

“…what you’re trying to do is get the optimal outcome, so your focus needs to be on the outcome, so when you’re collecting the data and assessing the data, you also need to be thinking about ‘how am I going to present this data in a way that it is going to be impactful in improving the decision outcomes?’ And that’s where the governance comes into play,” she said.

Governance is of particular importance now, Gordon says, because decisions are increasingly being made by individual departments, as when departments buy their own cloud-enabled services such as sales force automation. In that case, an organization needs to have a roadmap in place, with compensation to incent decision-makers to adhere to that roadmap and with decision criteria for buying decisions, she said.

Gordon recommends that companies put in place 3-5 top criteria for each decision that needs to be made so that they can ensure the decision objectives are met. This distillation of the metrics gives decision-makers a more comprehensible picture of their data so that their decisions don’t become either too subjective or disconnected from the data. Lower levels of metrics can be used underneath each of those top-level criteria to facilitate a more nuanced valuation. For example, an organization needing to find good partner candidates should score and rank potential partners (preferably in tiers) using decision criteria based on the characteristics of its most attractive partner, rather than just assuming that the companies with the best reputations or biggest brands will be the best; doing so identifies the optimal candidates far more expeditiously, as the sketch below illustrates.
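
A minimal sketch of this kind of criteria-based scoring follows; the criteria, weights and candidate names are hypothetical, not Gordon’s.

```python
# Hypothetical sketch of criteria-based partner scoring: 3-5 weighted
# top-level criteria, with candidates ranked and grouped into tiers.
CRITERIA_WEIGHTS = {
    "strategic_fit": 0.35,
    "technical_capability": 0.30,
    "cultural_alignment": 0.20,
    "financial_stability": 0.15,
}

def score(ratings: dict) -> float:
    """Weighted sum of 0-10 ratings across the top-level criteria."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

def tier(total: float) -> str:
    """Tiering avoids over-interpreting small differences in raw scores."""
    return "A" if total >= 8 else ("B" if total >= 6 else "C")

candidates = {
    "BigBrandCo": {"strategic_fit": 5, "technical_capability": 7,
                   "cultural_alignment": 4, "financial_stability": 9},
    "NicheCo":    {"strategic_fit": 9, "technical_capability": 8,
                   "cultural_alignment": 8, "financial_stability": 6},
}
for name, ratings in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
    s = score(ratings)
    print(f"{name}: {s:.2f} (tier {tier(s)})")
# Note that the well-known brand does not necessarily come out on top.
```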

One of the reasons that companies have gotten so concerned with collecting and storing data rather than just making better decisions, Gordon believes, is that business decisions have become inherently more risky. The required size of investment is increasing in tandem with an increase in the time to return; time to return is a key determinant of risk. Data helps people feel like they are making competent decisions but in reality does little to reduce risk.

“If you’ve got lots of data, then the thinking is, ‘well, I did the best that I could because I got all of this data.’ People are worried that they might miss something,” she said. “But that’s where I’m trying to come around and say, ‘yeah, but going and collecting more data, if you’ve got somebody like President Clinton, you’re just feeding into their tendency to put off making decisions.’ If you’ve got somebody like President Bush and you’re feeding into their tendency to ignore it, then there may be some really good information, good recommendations they’re ignoring.”

Gordon also says that having all the data possible to work with isn’t usually necessary—generally a representative sample will do. For example, she says the U.S. Census Bureau takes the approach of trying to count every citizen; consequently, certain populations are chronically undercounted and inaccuracies pass undetected. The Canadian census, on the other hand, uses representative samples and thus tends to be much more accurate—and much less expensive to conduct. Organizations should also think about how they can find representative or “proxy” data in cases where collecting data that directly addresses a top-level decision criterion isn’t really practical.
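
A small simulation makes the point; this sketch is ours, not Gordon’s, and uses synthetic data.

```python
# Minimal sketch: a representative (random) sample of 0.1% of the records
# recovers the population mean almost exactly, at a fraction of the cost.
import random

random.seed(42)
population = [random.gauss(50, 15) for _ in range(1_000_000)]  # "all the data"
sample = random.sample(population, 1_000)                       # 0.1% sample

pop_mean = sum(population) / len(population)
sample_mean = sum(sample) / len(sample)
print(f"population mean: {pop_mean:.2f}")
print(f"sample estimate: {sample_mean:.2f}")
```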

To begin shifting the focus from collecting data inputs to improving decision outcomes, Gordon recommends clearly stating the decision objectives for each major decision and then identifying and defining the 3-5 criteria that are most important for achieving the decision objectives. She also recommends ensuring that there is sufficient governance and a process in place for making decisions including mechanisms for measuring the performance of the decision-making process and the outcomes resulting from the execution of that process. In addition, companies need to consider whether their decisions are made in a centralized or decentralized manner and then adapt decision governance accordingly.

One way that Enterprise Architects can help to encourage better decision-making within the organizations in which they work is to help in developing that governance rather than just providing data or data architectures, Gordon says. They should help stakeholders identify and define the important decision criteria, determine when full population rather than representative sampling is justified, recognize better methods for data analysis, and form decision recommendations based on that analysis. By gauging the appropriate blend of quantitative and qualitative data for a particular decision maker, an Architect can moderate gut types’ reliance on instinct and stimulate head and heart types’ intuition – thereby producing an optimally balanced decision. Architects should help lead and facilitate execution of the decision process, as well as help determine how data is presented within organizations in order to support the recommendations with the highest potential for meeting the decision objectives.

Join the conversation – #ogchat

Penelope Gordon recently led the expansion of the channel and service packaging strategies for Verizon’s cloud network products. Previously she was an IBM Strategist and Product Manager bringing emerging technologies such as predictive analytics to market. She helped to develop one of the world’s first public clouds.




Case Study – ArchiMate®, An Open Group Standard: Public Research Centre Henri Tudor and Centre Hospitalier de Luxembourg

By The Open Group

The Public Research Centre Henri Tudor is an institute of applied research aimed at reinforcing the innovation capacity of organizations and companies and providing support for national policies and international recognition of Luxembourg’s scientific community. Its activities include applied and experimental research; doctoral research; the development of tools, methods, labels, certifications and standards; technological assistance; consulting and watch services; and knowledge and competency transfer. Its main technological domains are advanced materials, environmental technologies, Healthcare, and information and communication technologies, as well as business organization and management. The Centre applies its competencies across a number of industries including Healthcare, industrial manufacturing, mobile, transportation and financial services, among others.

In 2012, the Centre Hospitalier de Luxembourg allowed Tudor to experiment with an access rights management system modeled using ArchiMate®, an Open Group standard. This model was tested by CRP Tudor to confirm the approach used by the hospital’s management to grant employees, nurses and doctors permission to access patient records.

Background

The Centre Hospitalier de Luxembourg is a public hospital that focuses on severe pathologies, medical and surgical emergencies and palliative care. The hospital also has an academic research arm. The hospital employs a staff of approximately 2,000, including physicians and specialized employees, medical specialists, nurses and administrative staff. On average, the hospital performs more than 450,000 outpatient services, 30,000 inpatient services and more than 60,000 adult and pediatric emergency services per year.

Unlike many hospitals throughout the world, the Centre Hospitalier de Luxembourg is open and accessible 24 hours a day, seven days a week, so patient records must be accessible at any time, no matter the day or hour. In addition, the Grand Duchy of Luxembourg has a system whereby medical emergencies are allocated to one hospital each weekend within each of the country’s three regions. In other words, every two weeks, one hospital within a given region is responsible for all of the incoming medical emergencies on its assigned weekend, affecting patient volume and activity.

Access rights management

As organizations have become not only increasingly global but also increasingly digital, access rights management has become a critical component of keeping institutional information secure so that it does not fall into the wrong hands. Managing access to internal information is a critical component of every company’s security strategy, but it is particularly important for organizations that deal with sensitive information about consumers, or in the case of the Centre Hospitalier de Luxembourg, patients.

Modeling an access rights management system was important for the hospital for a number of reasons. First, European privacy laws dictate that only the people who require information regarding patient medical files should be allowed access to those files. Although privacy laws may restrict access to patient records, a rights management system must be flexible enough to grant access to the correct individuals when necessary.

In the case of a hospital such as the Centre Hospitalier de Luxembourg, access to information may be critical for the life of the patient. For instance, if a patient is admitted to the emergency room, the emergency room physician will be able to treat the patient better if he or she can access the patient’s records, even without being the patient’s primary care physician. Admitting personnel may also need access to records at the time of admittance. Therefore, a successful access rights management system must strike a balance between restricting information and providing flexible access as necessary, giving the right access at the right time without placing an administrative burden on the doctors or staff.
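
The balance described here, default restriction plus flexible emergency access, can be sketched in code. This is our illustration of the general pattern (sometimes called “break-glass” access), not the hospital’s actual model; all role names and rules are hypothetical.

```python
# Hypothetical "break-glass" sketch: access is restricted by default, but an
# emergency path grants it immediately while logging the event for review.
from datetime import datetime

TREATING_ROLES = {"primary_care_physician", "er_physician", "admitting_staff"}
audit_log = []

def may_access_record(role: str, treating_patient: bool, emergency: bool) -> bool:
    """Grant access only when responsibility for the patient justifies it."""
    if treating_patient and role in TREATING_ROLES:
        return True
    if emergency:
        # Flexibility without silently eroding privacy: granted, but audited.
        audit_log.append((datetime.now(), role, "emergency access"))
        return True
    return False

# An ER physician treating the patient gets access even though he or she is
# not the patient's primary care physician; a billing clerk does not.
assert may_access_record("er_physician", treating_patient=True, emergency=False)
assert not may_access_record("billing_clerk", treating_patient=False, emergency=False)
```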

The project

Prior to the experiment in which the Public Research Centre Henri Tudor tested this access rights management model, the Centre Hospitalier de Luxembourg had not experienced any problems with its information sharing system. However, its access rights were still being managed by a primarily paper-based system. As part of the scope of the project, the hospital was also looking to become compliant with existing privacy laws. Developing an access rights management model was intended to close the gap within the hospital between restricting access to patient information overall and providing new rights, as necessary, to employees so that they could do their work without endangering patient lives. From a technical perspective, the access rights management system needed not only to work in conjunction with existing applications used within the hospital, such as the ERP system, but also to support rights management at the business layer.

Most current access rights management systems provide information access to individuals based on a combination of the functional requirements necessary for employees to do their jobs and governance requirements, which provide the protections that keep the organization and its information safe and secure. The problem is that most access control models and rights engineering methods don’t adequately represent both sides of this equation. As such, determining the correct level of access for different employees within organizations can be difficult.

Modeling access rights management

Within the Centre Hospitalier de Luxembourg, employee access rights were defined based on individual job responsibilities and job descriptions. To best determine how to grant access rights across a hospital, the Public Research Centre Henri Tudor needed to create a system that could take these responsibilities into account, rather than just rely on functional or governance requirements.

To create an access rights management model that would work with the hospital’s existing processes and ERP software, the Public Research Centre Henri Tudor first needed to come up with a way to model responsibility requirements instead of just functional or governance requirements. According to Christophe Feltus, Research Engineer at the Public Research Centre, defining a new approach based on actor or employee responsibilities was the first step in creating a new model for the hospital.

Although existing architecture modeling languages provide views for many different types of stakeholders within organizations—from executives to IT and project managers—no modeling language had previously been used to develop a view dedicated to access rights management, Feltus says. As such, that view needed to be created and modeled anew for this project.

To develop this new view, the Public Research Centre needed to find an architecture modeling language that was flexible enough to accommodate such an extension. After evaluating three separate modeling languages, they chose ArchiMate®, an Open Group Standard and open and independent modeling language, to help them visualize the relationships among the hospital’s various employees in an unambiguous way.

Much like architectural drawings are used in building architecture to describe the various aspects of construction and building use, ArchiMate provides a common language for describing how to construct business processes, organizational structures, information flows, IT systems and technical infrastructures. By providing a common language and visual representation of systems, ArchiMate helps stakeholders within organizations design, assess and communicate how decisions and changes within business domains will affect the organization.

According to Feltus, ArchiMate provided a well-formalized language for the Public Research Centre to portray the architecture needed to model the access rights management system they wanted to propose for the Centre Hospitalier. Because ArchiMate is a flexible and open language, it also provided an extension mechanism that could accommodate the responsibility modeling language (ReMMo) that the engineering team had developed for the hospital.

In addition to providing the tools and extensions necessary for the engineering team to properly model the hospital’s access rights system, the Public Research Centre also chose ArchiMate because it is an open and vendor-neutral modeling language. As a publicly funded institution, it was important that the Public Research Centre avoid using vendor-specific tools that would lock it into a potentially costly cycle of constant version upgrades.

“What was very interesting [about ArchiMate] was that it was an open and independent solution. This is very important for us. As a public company, it’s preferable not to use private solutions. This was something very important,” said Feltus.

Feltus notes that using ArchiMate to model the access rights project was also a relatively easy and intuitive process. “It was rather easy,” Feltus said. “The concepts are clear and recommendations are well done, so it was easy to explore the framework.” The most challenging part of the project was selecting which extension mechanism would best portray the design and model they wanted to use.

Results

After the access rights model was developed using ArchiMate, the responsibility metamodel was presented to the hospital’s IT staff by the Public Research Centre Henri Tudor. The Public Research Centre team believes that the responsibility model created using ArchiMate allows for better alignment between the hospital’s business processes, defined at the business layer, and its IT applications, run at the application layer. The team also believes the model could both enhance the provisioning of access rights to employees and improve the hospital’s performance. For example, using the proposed responsibility model, the team found that some employees in the reception department had been assigned more permissions than they required in practice. Comparing the research findings with the reality on the ground at the hospital has shown the Public Research Centre team that ArchiMate is an effective tool for modeling and determining both responsibilities and access rights within organizations.

Due to the ease of use and success the Public Research Centre Henri Tudor experienced in using ArchiMate to create the responsibility model and the access rights management system for the hospital, Tudor also intends to continue to use ArchiMate for other public and private research projects as appropriate.

Follow The Open Group @theopengroup, #ogchat and / or let us know your thoughts on the blog here.




The Open Group Boston 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Enabling Boundaryless Information Flow™ continued in Boston on Tuesday, July 22. Allen Brown, CEO and President of The Open Group, welcomed attendees with an overview of the company’s second quarter results.

The Open Group membership is at 459 organizations in 39 countries, including 16 new membership agreements in 2Q 2014.

Membership value is highlighted by the collaboration Open Group members experience. For example, over 4,000 individuals attended Open Group events, physically and virtually, whether at member meetings, webinars, podcasts or tweet jams. The Open Group website had more than 1 million page views, and over 105,000 publication items were downloaded by members in 80 countries.

Brown also shared highlights from The Open Group Forums, which featured status updates on many upcoming white papers, snapshots, reference models and standards, as well as individual Forum roadmaps. The Forums are busy developing and reviewing projects such as the next version of TOGAF®, an Open Group standard, an ArchiMate® white paper, The Open Group Healthcare Forum charter and treatise, Standard Mils™ APIs and Open FAIR. Many publications are translated into multiple languages, including Chinese and Portuguese. Also, a new Forum will be announced in the third quarter at The Open Group London 2014, so stay tuned for that launch news!

Our first keynote of the day was Making Health Addictive by Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health.

Dr. Kvedar described how Healthcare delivery is changing, with mobile technology being a big part of that change. Other factors pushing changes are reimbursement paradigms and caregivers being paid to be more efficient and to keep people healthy and out of hospitals. The goal of Healthcare providers is to integrate care into the day-to-day lives of patients. Healthcare also aims for better technologies and architecture.

Mobile is a game-changer in Healthcare because people are “always on and connected”. Mobile technology allows for in-the-moment messaging, the ability to capture health data (GPS, accelerometer, etc.) and the display of information in real time as needed. Bottom line, smartphones are addictive, so they are excellent tools for communication and engagement.

But there is a need to understand and address the implications of automating Healthcare: security, privacy, accountability, economics.

The plenary continued with Proteus Duxbury, CTO, Connect for Health Colorado, who presented From Build to Run at the Colorado Health Insurance Exchange – Achieving Long-term Sustainability through Better Architecture.

Duxbury stated that the keys to his organization’s success are the leadership and team’s shared vision; a flexible vendor, agile with rapidly changing regulatory requirements; and a COTS solution that required minimal customization and custom development, with a resilient architecture and security. Connect for Health faces many challenges, including budget constraints, regulation and operating in a “fish bowl”. Yet the organization is on track with its three-year ‘build to run’ roadmap, stabilizing its foundation and gaining efficiencies.

During the Q&A with Allen Brown following each presentation, both speakers emphasized the need for standards, architecture and data security.

Allen Brown and Proteus Duxbury

During the afternoon, track sessions consisted of Healthcare, Enterprise Architecture (EA) & Business Value, Service-Oriented Architecture (SOA), Security & Risk Management, Professional Development and ArchiMate Tutorials. Chris Armstrong, President, Armstrong Process Group, Inc. discussed Architecture Value Chain and Capability Model. Laura Heritage, Principal Solution Architect / Enterprise API Platform, SOA Software, presented Protecting your APIs from Threats and Hacks.

The evening culminated with a reception at the historic Old South Meeting House, where the Boston Tea Party began in 1773.


Networking Reception at Old South Meeting House

A special thank you to our sponsors and exhibitors at The Open Group Boston 2014: BiZZdesign, Black Duck, Corso, Good e-Learning, Orbus and AEA.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.



The Open Group Boston 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

The Open Group kicked off Enabling Boundaryless Information Flow™ on July 21 at the spectacular setting of the Hyatt Boston Harbor. Allen Brown, CEO and President of The Open Group, welcomed over 150 people from 20 countries, including attendees from as far away as Australia, Japan, Saudi Arabia and India.

The first keynote speaker was Marshall Van Alstyne, Professor at Boston University School of Management and Research Scientist at the MIT Center for Digital Business, known as a leading expert in business models. His presentation, entitled Platform Shift – How New Open Business Models are Changing the Shape of Industry, posed the questions “What does ‘openness’ mean?” and “Why do platforms beat products every time?”

Marshall Van Alstyne

According to “InterBrand: 2014 Best Global Brands”, 13 of the top 31 companies are “platform companies”. To be a ‘platform’, a company needs embeddable functions or services and must allow third-party access. Van Alstyne noted, “products have features, platforms have communities”. Great standalone products are not sufficient. Positive changes experienced by a platform company include pricing/profitability, supply chains, internal organization, innovation, decreased industry bottlenecks and strategy.

Platforms benefit from broad contributions, as long as there is control of the top several complements. Van Alstyne commented, “If you believe in the power of community, you need to embrace the platform.”

The next presentation was Open Platform 3.0™ – An Integrated Approach to the Convergence of Technology Platforms, by Dr. Chris Harding, Director for Interoperability, The Open Group. Dr. Harding discussed how society has developed into a digital society.

1970 was considered the dawn of an epoch, which saw the first RAM chip, IBM’s introduction of System/370 and a new operating system, UNIX®. Examples of digital progress since that era include driverless cars and Smart Cities (management of traffic, energy, water, communication).

Digital society enablers are digital structural change and corporate social media. The benefits are open innovation, open access, open culture, open government and delivering more business value.

Dr. Harding also noted that standards are essential to innovation and enable markets based on integration. The Open Group Open Platform 3.0™ Forum is using ArchiMate®, an Open Group standard, to analyze the 30+ business use cases produced by the Forum. The development cycle is understanding, analysis, specification, iteration.

Dr. Harding emphasized the importance of Boundaryless Information Flow™, as an enabler of business objectives and efficiency through IT standards in the era of digital technology, and designed for today’s agile enterprise with direct involvement of business users.

Both sessions concluded with an interactive audience Q&A hosted by Allen Brown.

The last session of the morning’s plenary was a panel: The Internet of Things and Interoperability. Dana Gardner, Principal Analyst at Interarbor Solutions, moderated the panel. Participating in the panel were Said Tabet, CTO for Governance, Risk and Compliance Strategy, EMC; Penelope Gordon, Emerging Technology Strategist, 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant, Smarter Cities, Water & Transportation, IBM; and Dave Lounsbury, CTO, The Open Group.

IoT Panel – Gardner, Barsoum, Tabet, Lounsbury, Gordon

The panel explored the practical limits and opportunities of the Internet of Things (IoT). Areas discussed included obstacles to decision-making as Big Data becomes more prolific, along with the openness, governance and connectivity of things, data and people, all of which pertain to many industries such as smart cities, manufacturing and Healthcare.

How do industries, organizations and individuals deal with IoT? This is not necessarily a new problem, but an accelerated one. There are new areas of interoperability but where does the data go and who owns the data? Openness is important and governance is essential.

What needs to change most to see the benefits of the IoT? The panel agreed there needs to be a push for innovation, increased education, a move beyond models of humans managing the interface (i.e., toward machine-to-machine) and a determination of what data is most important, rather than always collecting all the data.

A podcast and transcript of the Internet of Things and Interoperability panel will be posted soon.

The afternoon was divided into several tracks: Boundaryless Information Flow™, Open Platform 3.0™ and Enterprise Architecture (EA) & Enterprise Transformation. Best Practices for Enabling Boundaryless Information Flow across the Government was presented by Syed Husain, Consultant Enterprise Architecture, Saudi Arabia E-government Authority. Robert K. Pucci, CTO, Communications Practice, Cognizant Technology Solutions discussed Business Transformation Justification Leveraging Business and Enterprise Architecture.

The evening concluded with a lively networking reception at the hotel.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.




The Open Group Boston 2014 Preview: Talking People Architecture with David Foote

By The Open Group

Among all the issues that CIOs, CTOs and IT departments are facing today, staffing is likely near the top of the list of what’s keeping them up at night. Sure, there’s dealing with constant (and disruptive) technological changes and keeping up with the latest tech and business trends, such as having a Big Data, Internet of Things (IoT) or a mobile strategy, but without the right people with the right skills at the right time it’s impossible to execute on these initiatives.

Technology jobs are notoriously difficult to fill–far more difficult than positions in other industries where roles and skillsets may be much more static. And because technology is rapidly evolving, the roles for tech workers are also always in flux. Last year you may have needed an Agile developer, but today you may need a mobile developer with secure coding ability and in six months you might need an IoT developer with strong operations or logistics domain experience—with each position requiring different combinations of tech, functional area, solution and “soft” skillsets.

According to David Foote, IT Industry Analyst and co-founder of IT workforce research and advisory firm Foote Partners, the mash-up of HR systems and ad hoc people management practices most companies have been using for years to manage IT workers has become frighteningly ineffective. He says that to cope in today’s environment, companies need to architect their people infrastructure much as they have been architecting their technical infrastructure.

“People Architecture” is the term Foote has coined to describe the application of traditional architectural principles and practices that may already be in place elsewhere within an organization and applying them to managing the IT workforce. This includes applying such things as strategy and capability roadmaps, phase gate blueprints, benchmarks, performance metrics, governance practices and stakeholder management to human capital management (HCM).

HCM components for People Architecture typically include job definition and design, compensation, incentives and recognition, skills demand and acquisition, job and career paths, professional development and work/life balance.

Part of the dilemma for employers right now, Foote says, is that there is very little job title standardization in the marketplace and too many job titles floating around IT departments today. “There are too many dimensions and variability in jobs now that companies have gotten lost from an HR perspective. They’re unable to cope with the complexity of defining, determining pay and laying out career paths for all these jobs, for example. For many, serious retention and hiring problems are showing up for the first time. Work-around solutions used for years to cope with systemic weaknesses in their people management systems have stopped working,” says Foote. “Recruiters start picking off their best people and candidates are suddenly rejecting offers and a panic sets in. Tensions are palpable in their IT workforce. These IT realities are pervasive.”

Twenty-five years ago, Foote says, defining roles in IT departments was easier. But then the Internet exploded and technology became far more customer-facing, shifting basic IT responsibilities from highly technical people deep within companies to roles requiring more visibility and transparency within and outside the enterprise. Large chunks of IT budgets moved into the business lines while traditional IT became more of a business itself.

According to Foote, IT roles became siloed not just by technology but by functional areas such as finance and accounting, operations and logistics, sales, marketing and HR systems, and by industry knowledge and customer familiarity. Then the IT professional services industry rapidly expanded to compete with their customers for talent in the marketplace. Even the architect role changed: an Enterprise Architect today can specialize in applications, security or data architecture among others, or focus on a specific industry such as energy, retail or healthcare.

Foote likens the fragmentation of IT jobs and skillsets that’s happening now to the emergence of IT architecture 25 years ago. Just as technical architecture practices emerged to help make sense of the disparate systems rapidly growing within companies and how best to determine the right future tech investments, a people architecture approach today helps organizations better manage an IT workforce spread through the enterprise with roles ranging from architects and analysts to a wide variety of engineers, developers and project and program managers.

“Technical architecture practices were successful because—when you did them well—companies achieved an understanding of what they have systems-wise and then connected it to where they were going and how they were going to get there, all within a process inclusive of all the various stakeholders who shared the risk in the outcome. It helped clearly define enterprise technology capabilities and gave companies more options and flexibility going forward,” according to Foote.

“Right now employers desperately need to incorporate into human capital management systems and practice the same straightforward, inclusive architecture approaches companies are already using in other areas of their businesses. This can go a long way toward not just lessening staffing shortages but also executing more predictably and being more agile in the face of constant uncertainties and the accelerating pace of change. Ultimately this translates into a more effective workforce, whether they are full-timers or the contingent workforce of part-timers, consultants and contractors.

“It always comes down to your people. That’s not a platitude but a fact,” insists Foote. “If you’re not competitive in today’s labor marketplace and you’re not an employer where people want to work, you’re dead.”

One industry that he says has gotten it right is the consulting industry. “After all, their assets walk out the door every night. Consulting groups within firms such as IBM and Accenture have been good at architecting their staffing because it’s their job to get out in front of what’s coming technologically. Because these firms must anticipate customer needs before they get the call to implement services, they have to be ahead of the curve in already identifying and hiring the bench strength needed to fulfill demand. They do many things right to hire, develop and keep the staff they need in place.”

Unfortunately, many companies take too much of a just-in-time approach to their workforce, so they are always managing staffing from a position of scarcity rather than looking ahead, Foote says. But this is changing, in part because companies are tired of never having the people they need and of being unable to execute predictably.

The key is to put a structure in place that addresses a strategy around what a company needs and when. This applies not just to the hiring process, but also to compensation, training and advancement.

“Architecting anything allows you to be able to, in a more organized way, be more agile in dealing with anything that comes at you. That’s the beauty of architecture. You plan for the fact that you’re going to continue to scale and continue to change systems, the world’s going to continue to change, but you have an orderly way to manage the governance, planning and execution of that, the strategy of that and the implementation of decisions knowing that the architecture provides a more agile and flexible modular approach,” he said.

Foote says organizations such as The Open Group can lend themselves to facilitating People Architecture in a couple of different ways: first, through extending the principles of architecture to human capital management, and second, through vendor-independent, expertise- and experience-driven certifications, such as TOGAF® or Open CA and Open CITS, that help companies define core competencies for people and that provide opportunities for training and career advancement.

“I’m pretty bullish on many vendor-independent certifications in general, particularly where a defined book of knowledge exists that’s achieved wide acceptance in the industry. And that’s what you’ve got with The Open Group. Nobody’s challenging the architectural framework supremacy of TOGAF that I’m aware of. In fact, large vendors with their own certifications participated actively in developing the framework and applying it very successfully to their business models,” he said.

Although the process of implementing People Architecture can be difficult and may take several years to master (much like Enterprise Architecture), Foote says it is making a huge difference for companies that implement it.

To learn more about People Architecture and models for implementing it, plan to attend Foote’s session at The Open Group Boston 2014 on Tuesday July 22. Foote’s session will address how architectural principles are being applied to human capital so that organizations can better manage their workforces from hiring and training through compensation, incentives and advancement. He will also discuss how career paths for EAs can be architected. Following the conference, the session proceedings will be available to Open Group members and conference attendees at www.opengroup.org.

Join the conversation – #ogchat #ogBOS

David Foote is an IT industry research pioneer, innovator, and one of the most quoted industry analysts on global IT workforce trends and the many facets of the human side of technology value creation. His two decades of groundbreaking research and analysis of IT-business cross-skilling and technology/business management integration, and his leadership in innovative IT skills demand and compensation benchmarking, have earned him a place on a short list of thought leaders in IT human capital management.

A former Gartner and META Group analyst, David leads the research and analytical practice groups at Foote Partners that reach 2,300 customers on six continents.

Leave a comment

Filed under architecture, Conference, Open CA, Open CITS, Professional Development, Standards, TOGAF®, Uncategorized

New Health Data Deluges Require Secure Information Flow Enablement Via Standards, Says The Open Group’s New Healthcare Director

By The Open Group

Below is the transcript of The Open Group podcast on how new devices and practices have the potential to expand the information available to Healthcare providers and facilities.

Listen to the podcast here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview coming to you in conjunction with The Open Group’s upcoming event, Enabling Boundaryless Information Flow™ July 21-22, 2014 in Boston.

Gardner: I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator for the series of discussions from the conference on Boundaryless Information Flow, Open Platform 3.0™, Healthcare, and Security issues.

One area of special interest is the Healthcare arena, and Boston is a hotbed of innovation and adaptation for how technology, Enterprise Architecture, and standards can improve the communication and collaboration among Healthcare ecosystem players.

And so, we’re joined by a new Forum Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry to rapid change.

With that, please join me now in welcoming our guest. We’re here with Jason Lee, Healthcare and Security Forums Director at The Open Group. Welcome, Jason.

Jason Lee: Thank you so much, Dana. Good to be here.

Gardner: Great to have you. I’m looking forward to the Boston conference and want to remind our listeners and readers that it’s not too late to sign up. You can learn more at http://www.opengroup.org.

Jason, let’s start by talking about the relationship between Boundaryless Information Flow, which is a major theme of the conference, and healthcare. Healthcare perhaps is the killer application for Boundaryless Information Flow.

Lee: Interesting, I haven’t heard it referred to that way, but healthcare is 17 percent of the US economy. It’s upwards of $3 trillion. The costs of healthcare are a problem, not just in the United States, but all over the world, and there are a great number of inefficiencies in the way we practice healthcare.

We don’t necessarily intend to be inefficient, but there are so many places and people involved in healthcare, it’s very difficult to get them to speak the same language. It’s almost as if you’re in a large house with lots of different rooms, and every room you walk into they speak a different language. To get information to flow from one room to the other requires some active efforts and that’s what we’re undertaking here at The Open Group.

Gardner: What is it about the current collaboration approaches that don’t work? Obviously, healthcare has been around for a long time and there have been different players involved. What’s the hurdle? What prevents a nice, seamless, easy flow and collaboration in information that gets better outcomes? What’s the holdup?

Lee: There are many ways to answer that question, because there are many barriers. Perhaps the simplest is the transformation of healthcare from a paper-based industry to a digital industry. Everyone has walked into an office, looked behind the people at the front desk, and seen file upon file and row upon row of folders, information that’s kept in a written format.

When there’s been movement toward digitizing that information, not everyone has used the same system. It’s almost like trains running on a different gauge track. Obviously if the track going east to west is a different gauge than going north to south, then trains aren’t going to be able to travel on those same tracks. In the same way, healthcare information does not flow easily from one office to another or from one provider to another.

Gardner: So not only do we have disparate strategies for collecting and communicating health data, but we’re also seeing much larger amounts of data coming from a variety of new and different places. Some of them now even involve sensors inside of patients themselves or devices that people will wear. So is the data deluge, the volume, also an issue here?

Lee: Certainly. I heard recently that an integrated health plan, which has multiple hospitals involved, contains more elements of data than the Library of Congress. As information is collected at multiple points in time, over a relatively short period of time, you really do have a data deluge. Figuring out how to find your way through all the data and look at the most relevant for the patient is a great challenge.

Gardner: I suppose the bad news is that there is this deluge of data, but it’s also good news, because more data means more opportunity for analysis, a better ability to predict and determine best practices, and also provide overall lower costs with better patient care.

So it seems like the stakes are rather high here to get this right, to not just crumble under a volume or an avalanche of data, but to master it, because it’s perhaps the future. The solution is somewhere in there too.

Lee: No question about it. At The Open Group, our focus is on solutions. We, like others, put a great deal of effort into describing the problems, but our real work is figuring out how to bring IT technologies to bear on business problems, how to encourage different parts of organizations, and different organizations, to speak the same language, and how to operate using common standards. That’s really what we’re all about.

And it is, in a large sense, part of the process of helping to bring healthcare into the 21st Century. A number of industries are a couple of decades ahead of healthcare in the way they use large datasets, or big data, as some people refer to it. I’m talking about companies like big department stores and large online retailers. They really have stepped up to the plate and are using that deluge of data in ways that are very beneficial to them, and healthcare can do the same. We’re just not quite at the same level of evolution.

Gardner: And to your point, the stakes are so much higher. Retail is, of course, a big deal in the economy, but as you pointed out, healthcare is such a much larger segment and portion. So just making modest improvements in communication, collaboration, or data analysis can reap huge rewards.

Lee: Absolutely true. There is the cost side of things, but there is also the quality side. So there are many ways in which healthcare can improve through standardization and coordinated development, using modern technology that can not just reduce cost but improve quality at the same time.

Gardner: I’d like to get into a few of the hotter trends, but before we do, it seems that The Open Group has recognized the importance here by devoting the entire second day of its conference in Boston, July 22, to Healthcare.

Maybe you could give us a brief overview of what participants, and even those who come in online and view recorded sessions of the conference at http://new.livestream.com/opengroup, should expect? What’s going to go on July 22nd?

Lee: We have a packed day. We’re very excited to have Dr. Joe Kvedar, a physician at Partners HealthCare and Founding Director of the Center for Connected Health, as our first plenary speaker. The title of his presentation is “Making Health Additive.” Dr. Kvedar is a widely respected expert on mobile health, which is currently the Healthcare Forum’s top work priority. As mobile medical devices become ever more available and diversified, they will enable consumers to know more about their own health and wellness, and a great deal of potentially useful health data will be generated. How this information can be used, not just by consumers but also by the healthcare establishment that takes care of them as patients, will become a question of increasing importance, and an area where standards development and The Open Group can be very helpful.

Our second plenary speaker, Proteus Duxbury, Chief Technology Officer at Connect for Health Colorado, will discuss a major feature of the Affordable Care Act: the health insurance exchanges, which are designed to bring health insurance to tens of millions of people who previously did not have access to it. Mr. Duxbury is going to talk about how Enterprise Architecture, which is really about getting to solutions by helping the IT folks talk to the business folks and vice versa, has helped the State of Colorado develop its Health Insurance Exchange.

After the plenaries, we will break into three tracks, one of which is Healthcare-focused. In this track there will be three presentations, all of which discuss how Enterprise Architecture and the Boundaryless Information Flow approach can help healthcare organizations and decision-makers become more effective and efficient.

One presentation will focus on the transformation of care delivery at the Visiting Nurse Service of New York. Another will address stewarding healthcare transformation using Enterprise Architecture, focusing on one of our Platinum members, Oracle, and a company called Intelligent Medical Objects, and how they’re working together in a productive way, bringing IT and healthcare decision-making together.

Then, the final presentation in this track will focus on the development of an Enterprise Architecture-based solution at an insurance company. The payers, or insurers (the big companies that are responsible for paying bills and collecting premiums), have a very important role in the healthcare system that extends beyond administration of benefits. Yet payers are not always recognized for their key responsibilities and capabilities in the areas of clinical improvement and cost improvement.

With the increase in payer data brought on in large part by the adoption of a new coding system, the ICD-10, which will come online this year, a huge amount of additional data, including clinical data, will become available. At The Open Group, we consider payers, meaning health insurance companies (some of which are integrated with providers), to be very important stakeholders in the big picture.

In the afternoon, we’re going to switch gears a bit and have a speaker talk about the challenges, the barriers, the “pain points” in introducing new technology into healthcare systems. The focus will return to remote or mobile medical devices and the predictable but challenging barriers to getting newly generated health information to flow to doctors’ offices and into patients’ records, electronic health records, and hospitals’ data-keeping and data-sharing systems.

We’ll have a panel of experts that responds to these pain points, these challenges, and then we’ll draw heavily from the audience, who we believe will be very, very helpful, because they bring a great deal of expertise in guiding us in our work. So we’re very much looking forward to the afternoon as well.

Gardner: It’s really interesting. A couple of these different plenaries and discussions in the afternoon come back to this user-generated data. Jason, we really seem to be on the cusp of a whole new level of information that people will be able to develop from themselves through their lifestyle, new devices that are connected.

We hear from folks like Apple, Samsung, Google, and Microsoft. They’re all pulling together information and making it easier for people to not only monitor their exercise, but their diet, and maybe even start to use sensors to keep track of blood sugar levels, for example.

In fact, a new Flurry Analytics survey showed a 62 percent increase in the use of health and fitness applications on popular mobile devices over the last six months, compared to a 33 percent increase in other applications in general. So the use of health and fitness applications is growing roughly 87 percent faster (62 versus 33 percent).

Tell me a little bit how you see this factoring in. Is this a mixed blessing? Will so much data generated from people in addition to the electronic medical records, for example, be a bad thing? Is this going to be a garbage in, garbage out, or is this something that could potentially be a game-changer in terms of how people react to their own data and then bring more data into the interactions they have with care providers?

Lee: It’s always a challenge to predict what the market is going to do, but I think that’s a remarkable statistic that you cited. My prediction is that the increased volume of person-generated data from mobile health devices is going to be a game-changer. This view also reflects how the Healthcare Forum members (which include members from Capgemini, Philips, IBM, Oracle and HP) view the future.

The commercial demand for mobile medical devices, things that can be worn, embedded, or even swallowed as pills, as you mentioned, keeps growing. The software and applications developed for use with these devices are going to grow by leaps and bounds. As you say, there are big players getting involved. Already, some of the pedometer-type devices that measure the number of steps taken in a day have captured the interest of many, many people. Even David Sedaris, serious guy that he is, was writing about it recently in The New Yorker.

What we will find is that many of the health indicators that we used to have to go to the doctor or nurse or lab to get information on will become available to us through these remote devices.

There will be a question, of course, as to the reliability and validity of the information, to your point about garbage in, garbage out, but I think standards development will help here. This, again, is where The Open Group comes in. We might also see the FDA exercising its role in ensuring safety, along with other organizations, in determining which devices are reliable.

The Open Group is working in the area of mobile data and the information systems developed around it, and their ability to (a) talk to one another and (b) talk to the devices and infrastructure used in doctors’ offices and in hospitals. This is called interoperability, and it’s certainly lacking in this country.

There are already problems around interoperability and connectivity of information in the healthcare establishment as it is now. When patients and consumers start collecting their own data, and the patient is put at the center of the nexus of healthcare, then the question becomes how does that information that patients collect get back to the doctor/clinician in ways in which the data can be trusted and where the data are helpful?

After all, if a patient is wearing a medical device, there is the opportunity to collect data, about blood sugar levels let’s say, throughout the day. And this is really taking healthcare outside the four walls of the clinic and bringing information to bear that can be very, very useful to clinicians and beneficial to patients.

In short, the rapid market dynamic in mobile medical devices and in the software and hardware that facilitates interoperability begs for standards-based solutions that reduce costs and improve quality, and all of which puts the patient at the center. This is The Open Group’s Healthcare Forum’s sweet spot.
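To make that interoperability point concrete, here is a minimal illustrative sketch, not part of the podcast, of how a single wearable reading might be packaged for exchange. The structure loosely follows the style of HL7 FHIR’s Observation resource, but the field selection and patient identifier are simplified, hypothetical choices.

```python
# A sketch of one blood-glucose reading packaged in a FHIR-style structure.
# Field names loosely follow HL7 FHIR's Observation resource; simplified here.

import json
from datetime import datetime, timezone

def glucose_observation(patient_id: str, mg_per_dl: float) -> dict:
    """Package a single device reading so a clinical system can interpret it."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"text": "Blood glucose"},
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
        "valueQuantity": {"value": mg_per_dl, "unit": "mg/dL"},
    }

reading = glucose_observation("example-123", 104.0)
print(json.dumps(reading, indent=2))  # what a device might submit to an EHR API
```

The value of a shared structure like this is exactly what Lee describes: the same reading means the same thing whether it lands in a personal health record, a doctor’s office system or a hospital’s data-sharing platform.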

Gardner: It seems to me a real potential game-changer as well, and that something like Boundaryless Information Flow and standards will play an essential role. Because one of the big question marks with many of the ailments in a modern society has to do with lifestyle and behavior.

So often, the providers of the care only really have the patient’s responses to questions, but imagine having a trove of data at their disposal, a 360-degree view of the patient to then further the cause of understanding what’s really going on, on a day-to-day basis.

But then, it’s also having a two-way street, being able to deliver perhaps in an automated fashion reinforcements and incentives, information back to the patient in real-time about behavior and lifestyles. So it strikes me as something quite promising, and I look forward to hearing more about it at the Boston conference.

Any other thoughts on this issue about patient flow of data, not just among and between providers and payers, for example, or providers in an ecosystem of care, but with the patient as the center of it all, as you said?

Lee: As more mobile medical devices come to the market, we’ll find that consumers own multiple types of devices, at least some of which collect multiple types of data. So even for the patient at the center of their own healthcare information collection, there can be barriers to having one device talk to another. If a patient wants to keep their own personal health record, there may be difficulties in bringing all that information into one place.

So the interoperability issue, the need for standards, guidelines, and voluntary consensus among stakeholders about how information is represented becomes an issue, not just between patients and their providers, but for individual consumers as well.

Gardner: And also the cloud providers. There will be a variety of large organizations with cloud-modeled services, and they are going to need to be, in some fashion, brought together, so that a complete 360-degree view of the patient is available when needed. It’s going to be an interesting time.

Of course, we’ve also looked at many other industries and tried to have a cloud synergy, a cloud-of-clouds approach to data and also the transaction. So it’s interesting how what’s going on in multiple industries is common, but it strikes me that, again, the scale and the impact of the healthcare industry makes it a leader now, and perhaps a driver for some of these long overdue structured and standardized activities.

Lee: It could become a leader. There is no question about it. Moreover, there is a lot Healthcare can learn from other industries: from mistakes other companies have made, from lessons they have learned, and from best practices they have developed, on both the content and process side. And there are issues, around security in particular, where Healthcare will be at the leading edge in trying to figure out how much is enough, how much is too much, and what kinds of solutions work.

There’s a great future ahead here. It’s not going to be without bumps in the road, but organizations like The Open Group are designed and experienced to help multiple stakeholders come together and have the conversations that they need to have in order to push forward and solve some of these problems.

Gardner: Well, great. I’m sure there will be a lot more about how to actually implement some of those activities at the conference. Again, that’s going to be in Boston, beginning on July 21, 2014.

We’ll have to leave it there; we’re about out of time. We’ve been talking with a new Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes and efficiencies is pushing the Healthcare industry to rapid change. And, as we’ve heard, that might very well spill over into other industries as well.

So we’ve seen how innovation and adaptation around technology, Enterprise Architecture and standards can improve the communication and collaboration among Healthcare ecosystem players.

It’s not too late to register for The Open Group Boston 2014 (http://www.opengroup.org/boston2014) and join the conversation via Twitter #ogchat #ogBOS, where you will be able to learn more about Boundaryless Information Flow, Open Platform 3.0, Healthcare and other relevant topics.

So a big thank you to our guest. We’ve been joined by Jason Lee, Healthcare and Security Forums Director at The Open Group. Thanks so much, Jason.

Lee: Thank you very much.

Leave a comment

Filed under Boundaryless Information Flow™, Cloud, Conference, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, Information security, Interoperability, Open Platform 3.0, Standards, Uncategorized

The Enterprise Architecture Kaleidoscope

By Stuart Boardman, Senior Business Consultant, Business & IT Advisory, KPN Consulting

Last week I attended a Club of Rome (Netherlands) debate about a draft report on sustainability and social responsibility. The author of the report described his approach as being like a kaleidoscope, because the same set of elements can form quite different pictures.


Some people had some difficulty with this. They wanted a single picture they could focus on. To me it felt quite natural, because that’s very much what we try to do in Enterprise Architecture (EA) – produce different views of the same whole for the benefit of different stakeholders. And suddenly I realized how to express the relationship between EA and a broader topic like sustainability. That matters to me, because sustainability is something I’m passionate about and I’d like my work to be some small contribution to achieving that.

Before that, I’d been thinking that EA obviously has a role to play in a sustainable enterprise but I hadn’t convinced myself that the relationship was so fundamental – it felt a bit too much like wishful thinking on my part.

When we talk about sustainability today, we need to be clear that we’re not just talking about environmental issues and we’re certainly not talking about “greenwashing”. There’s an increasing awareness that a change needs to occur (and is to some extent occurring) in how we work, how we do business, how we relate to and value each other and how we relate to and value our natural environment.

This is relevant too for The Open Group Open Platform 3.0™. Plenty is written these days about the role that the Internet of Things and Big Data analytics can play in sustainability, and a lot is actually happening. Too much of this, however, fails to take any account of the kaleidoscope and offers a purely technological and resource-centric view of a shining future, in which people are reduced to being the happy consumers of this particular soma. By bringing in other factors, in particular social media, and by locating the discussion in The Open Group’s traditions of Enterprise Architecture (see also The Open Group’s work on Identity), these rather dangerous limitations can be overcome.

[Image source: Wikipedia]

Success in any one of these areas is dependent on success in the others. That was really the message of the Club of Rome discussion.

And that’s where EA comes in – the architecture of a global enterprise. There are multiple stakeholders with multiple concerns. They range from a CEO with a company to keep afloat to a farming community, whose livelihood is threatened by a giant coal mine. They also include those whose livelihood is threatened by closing that mine and governments saddled with crippling national debt. They include the people working to achieve change. These people also have their own areas of focus within the overall picture. There are people designing the new solutions – technological or otherwise. There are the people who will have to operate the changed situation. There are the stewards for the natural environment and the non-human inhabitants of platform Earth.

Now Enterprise Architects are in a sense always concerned with sustainability, at least at the micro level of one organization or enterprise. We try to develop an architecture in which the whole enterprise (and all its parts) can achieve its goals – with a minimum of instability and with the ability to respond effectively to change. That in and of itself requires us to be aware of what’s going on in the world outside our organization’s direct sphere of influence, so it’s a small step to looking at a broader picture and wondering what the future of the enterprise might be in a non-sustainable world.

The next step is an obvious one for any Enterprise Architect – indeed for any architect in any kind of enterprise. This isn’t a political or moral question (although architects have as much right as anyone else to such considerations) but really just one of drawing conclusions that are logical and obvious – unless one is merely driven by short-term considerations. What you do with those conclusions is up to you and constrained by your own situation. You do what you can. You can take the campaigning viewpoint, or look for collateral lack of damage, or just facilitate sustainability when it’s on the agenda – look for opportunities for re-use or repair. And if your situation is one where nothing is possible, you might want to think about moving on.

Sustainability is not conservatism. Some things reach the end of their useful life or can’t survive unexpected and/or dramatic changes. Some things actually improve as a result of taking a serious knock – what Nassim Nicholas Taleb calls anti-fragility. That’s true at both micro and macro levels, and it’s particularly true in nature. It’s not surprising that the ideas of biomimicry are rapidly gaining traction in sustainability circles.

[Image: Stickybot]

In this sense, agile is really about sustainability. When we work with agile methods, we’re not trying to create something changeless. We’re trying to create a way of working in which our enterprise, or some small part of it, can change and adapt so as to continue to fulfill its mission for as long as that remains relevant in the world.

So yes, there’s a lot an (enterprise) architect can do towards achieving a sustainable world, and there are more than enough reasons why doing so is consistent with our role in the organizations and enterprises we serve.

Agreed? Not? Please comment one way or the other and let’s continue the discussion.

Stuart Boardman is a Senior Business Consultant with KPN Consulting, where he leads the Enterprise Architecture practice and consults to clients on Cloud Computing, Enterprise Mobility and the Internet of Everything. He is Co-Chair of The Open Group Open Platform 3.0™ Forum and was Co-Chair of the Cloud Computing Work Group’s Security for the Cloud and SOA project, and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by KPN, the Information Security Platform (PvIB) in the Netherlands and his previous employer, CGI, as well as several Open Group white papers, guides and standards. He is a frequent speaker at conferences on the topics of Open Platform 3.0 and Identity.

1 Comment

Filed under Enterprise Architecture, Enterprise Transformation, Identity Management, Professional Development, Uncategorized

The Open Group Boston 2014 to Explore How New IT Trends are Empowering Improvements in Business

By The Open Group

The Open Group Boston 2014 will be held on July 21-22 and will cover the major issues and trends surrounding Boundaryless Information Flow™. Thought-leaders at the event will share their outlook on IT trends, capabilities, best practices and global interoperability, and how this will lead to improvements in responsiveness and efficiency. The event will feature presentations from representatives of prominent organizations on topics including Healthcare, Service-Oriented Architecture, Security, Risk Management and Enterprise Architecture. The Open Group Boston will also explore how cross-organizational collaboration and trends such as big data and cloud computing are helping to make enterprises more effective.

The event will consist of two days of plenaries and interactive sessions that will provide in-depth insight on how new IT trends are leading to improvements in business. Attendees will learn how industry organizations are seeking large-scale transformation and some of the paths they are taking to realize that.

The first day of the event will bring together subject matter experts in the Open Platform 3.0™, Boundaryless Information Flow™ and Enterprise Architecture spaces. The day will feature thought-leaders from organizations including Boston University, Oracle, IBM and Raytheon. One of the keynotes, from Marshall Van Alstyne, Professor at Boston University School of Management and Researcher at MIT Center for Digital Business, will reveal the secret of Internet-driven marketplaces. Other content:

• The Open Group Open Platform 3.0™ focuses on new and emerging technology trends converging with each other and leading to new business models and system designs. These trends include mobility, social media, big data analytics, cloud computing and the Internet of Things.
• Cloud security and the key differences in securing cloud computing environments vs. traditional ones as well as the methods for building secure cloud computing architectures
• Big Data as a service framework as well as preparing to deliver on Big Data promises through people, process and technology
• Integrated Data Analytics and using them to improve decision outcomes

The second day of the event will have an emphasis on Healthcare, with keynotes from Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health, and Proteus Duxbury, CTO, Connect for Health Colorado. The day will also showcase speakers from Hewlett Packard and Blue Cross Blue Shield, multiple tracks on a wide variety of topics such as Risk and Professional Development, and ArchiMate® tutorials. Key learnings include:

• Improving healthcare’s information flow is a key enabler to improving healthcare outcomes and implementing efficiencies within today’s delivery models
• Identifying the current state of IT standards and future opportunities which cover the healthcare ecosystem
• How ArchiMate® can be used by Enterprise Architects for driving business innovation with tried and true techniques and best practices
• Security and Risk Management evolving as software applications become more accessible through APIs – which can lead to vulnerabilities and the potential need to increase security while still understanding the business value of APIs

Member meetings will also be held on Wednesday and Thursday, July 23-24.

Don’t wait, register now to participate in these conversations and networking opportunities during The Open Group Boston 2014: http://www.opengroup.org/boston2014/registration

Join us on Twitter – #ogchat #ogBOS

Leave a comment

Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Cloud/SOA, Conference, Enterprise Architecture, Enterprise Transformation, Healthcare, Information security, Open Platform 3.0, Professional Development, RISK Management, Service Oriented Architecture, Standards, Uncategorized

The Power of APIs – Join The Open Group Tweet Jam on Wednesday, July 9th

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The face of technology is evolving at breakneck speed, driven by demand from consumers and businesses alike for more robust, intuitive and integrated service offerings. APIs (application programming interfaces) have made this possible by offering greater interoperability between otherwise disparate software and hardware systems. While there are clear benefits to their use, how do today’s security- and value-conscious enterprises take advantage of this new interoperability without exposing themselves?

On Wednesday, July 9th at 9:00 am PT/12:00 pm ET/5:00 pm GMT, please join us for a tweet jam that will explore how APIs are changing the face of business today, and how to prepare for their implementation in your enterprise.

APIs are at the heart of how today’s technologies communicate with one another, and they have been influential in enabling new levels of development for social, mobility and beyond. The business benefits of APIs are endless, as are the opportunities to explore how they can be effectively used and developed.

There is reason to maintain a certain level of caution, however, as recent security issues involving open APIs have impacted overall confidence and sustainability.

This tweet jam will look at the business benefits of APIs, as well as potential vulnerabilities and weak points that you should be wary of when integrating them into your Enterprise Architecture.
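To make the interoperability-versus-exposure trade-off concrete, here is a minimal sketch, in Python, of calling a third-party REST API defensively. The endpoint, header name and response shape are hypothetical; the habits are the point: credentials kept out of source code, HTTPS only, a bounded timeout, and no blind trust in what comes back.

```python
# A minimal, illustrative sketch of consuming a hypothetical REST API.
# The URL, header name and payload fields are invented for this example.

import json
import os
import urllib.request

API_KEY = os.environ["EXAMPLE_API_KEY"]             # never hard-code credentials
URL = "https://api.example.com/v1/orders?limit=10"  # HTTPS only, hypothetical endpoint

request = urllib.request.Request(URL, headers={"X-Api-Key": API_KEY})
with urllib.request.urlopen(request, timeout=5) as response:  # bound the wait
    payload = json.load(response)

# Treat the API as an integration point, not a trusted source: validate before use.
orders = payload.get("orders", [])
print(f"Fetched {len(orders)} orders")
```

None of this removes the need for the governance questions the tweet jam raises; it simply shows how small the gap is between a useful integration and an exposed one.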

We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought-leaders from The Open Group, including Jason Lee, Healthcare and Security Forums Director; Jim Hietala, Vice President of Security; David Lounsbury, CTO; and Dr. Chris Harding, Director for Interoperability and Director of the Open Platform 3.0™ Forum. To access the discussion, please follow the hashtag #ogchat during the allotted discussion time.

Interested in joining The Open Group Security Forum? Register your interest here.

What Is a Tweet Jam?

A tweet jam is a 45 minute “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Here are some helpful guidelines for taking part in the tweet jam:

  • Please introduce yourself (name, title and organization)
  • Use the hashtag #ogchat following each of your tweets
  • Begin your tweets with the question number to which you are responding
  • Please refrain from individual product/service promotions – the goal of the tweet jam is to foster an open and informative dialogue
  • Keep your commentary focused, thoughtful and on-topic

If you have any questions prior to the event or would like to join as a participant, please contact George Morin (@GMorin81 or george.morin@hotwirepr.com).

We look forward to a spirited discussion and hope you will be able to join!

3 Comments

Filed under Data management, digital technologies, Enterprise Architecture, Enterprise Transformation, Information security, Open Platform 3.0, real-time and embedded systems, Standards, Strategy, Tweet Jam, Uncategorized

Brand Marketing of Standards

By Allen Brown, President and CEO, The Open Group

Today everyone is familiar with the power of brands. Managed well, they can develop strong biases amongst customers for the product or service, resulting in greatly increased revenues and profits. Managed badly, they can destroy a product or an organization.

I was sitting in San Francisco International Airport one day. A very loud couple was looking for somewhere to get coffee. The wife said, “There’s a Peet’s right here.” Angrily the husband replied, “I don’t want Peet’s, I want Starbucks!”

A jewelry retailer in the UK had grown, in six years, from having 150 stores to more than 2,000, with 25,000 staff and annual sales of £1.2 billion. Then, at the Institute of Directors conference at the Royal Albert Hall in 1991, he told an audience of 5,000 business leaders the secret of his success. Describing his company’s products, he said: ‘We also do cut-glass sherry decanters complete with six glasses on a silver-plated tray that your butler can serve you drinks on, for £4.95. People say, “How can you sell this for such a low price?” I say, because it’s total crap.’ As if that were not enough, he added that his stores’ earrings were ‘cheaper than a prawn sandwich, but probably wouldn’t last as long’.

It was a joke he had told before, but this time it got into the press. As soon as word got out, hordes of people queued at his stores to return everything from earrings to engagement rings. The company was destroyed.

The identity of a brand emerges through communication backed up by a promise to customers. That promise can be a promise of quality or service or innovation or style. Or it can be much less tangible: “people like you buy this product”, for example.

Early in my career, I worked for a company that was in the business of manufacturing and marketing edible oils and fats – margarines, cooking oils and cooking fat. When first developed, margarine was simply a substitute for the butter that was in short supply in the UK during wartime. But when butter once again became plentiful, the product needed to offer other advantages to the consumer. Research focused on methods to improve the quality of margarine, such as making it easier to spread, more flavorful and more nutritious.

At the time there were many brands, each focused on a specific niche, which together amounted to something like a 95% market share. Stork Margarine was promoted as a low-cost butter substitute for working-class households, Blue Band Margarine was positioned slightly up-market, Tomor Margarine served the kosher community, Flora Margarine was marketed as recommended by doctors as being good for the heart, and so on. Today, Unilever continues to market these brands, amongst many others, successfully, although the positioning may be a little different.

Creating, managing and communicating brands is not inexpensive but the rewards can be significant. There are three critical activities that must be done well. The brand must be protected, policed and promoted.

Protection starts with ensuring that the brand is trademarked but it does not end there. Consistent and correct usage of the brand is essential – without that, a trademark can be challenged and the value of the brand and all that has been invested in it can be lost.

Policing is about identifying and preventing unauthorized or incorrect usage of the mark by others. Unauthorized usage can range from organizations using the brand to market their own products or services all the way up to counterfeit copies of the branded products. Cellophane is a registered trademark in the UK and other countries, and the property of Innovia Films. However, in many countries “cellophane” has become a generic term, often used informally to refer to a wide variety of plastic film products, even those not made of cellulose, such as plastic wrap, thereby diminishing the value of the brand to its owner. Several other well-known and valuable marks have been lost through becoming generic, mostly due to the brand owner not insisting on correct usage.

Promotion begins with identifying the target market and articulating the brand promise, the key purchase factors and the benefits. The target market can be consumers or organizations, but at the end of the day people buy products or services, or vote for candidates seeking election, and it is important to segment and profile the target customers sufficiently and develop key messages for each segment.

Profiling has been around for a long time: the margarine example shows how it was used in the past. But today consumers, organizational buyers and voters have a plethora of messages targeted at them, through a broader variety of media than ever, so it is critical to be as precise as possible. Some of the best examples of profiling, such as “soccer moms” and “NASCAR dads,” have been popularized as a result of their usage in US presidential election campaigns.

In the mid-1990s, X/Open (now part of The Open Group) started using branding to promote the market adoption of open standards. The members of X/Open had developed a set of specifications aimed at enabling portability of applications between the UNIX® systems of competing vendors, called the X/Open Portability Guide, or XPG for short.

The target market was the buyers of UNIX systems. The brand promise was that any product that was supplied by the vendors that carried the X/Open brand conformed to the specification, would always conform and, in the event of any non-conformance being found, the vendor would, at their own cost, rectify the non-conformance for the customer within a prescribed period of time. To this day, there has only ever been one report of non-conformance, an obscure mathematical result, reported by an academic. The vendor concerned quickly rectified the issue, even though it was extremely unlikely that any customer would ever be affected by it.

The trademark license agreement signed by all vendors who used the X/Open brand carried the words “warrant and represent” in support of the brand promise. It was a significant commitment on the part of the vendors, as it also carried with it significant risk and potential liability. For these reasons, the vendors pooled their resources to fund the development of test suite software, so they could better understand the commitment they had entered into. These test suites were developed in stages and, over time, their coverage of the set of specifications grew.

It was only later that products had to be tested and certified before they could carry the X/Open brand.

The trademark was, of course, protected, policed and promoted. Procurements that could be identified, which were mostly government procurements, were recorded and totaled in excess of $50bn in a short period of time. Procurements by commerce and industry were more difficult to track, but were clearly significant.

The XPG brand program was enormously successful and has evolved to become the UNIX® brand program and, in spite of challenges from open source software, continues to deliver revenues for the vendors in excess of $30bn per annum.

When new brand programs are contemplated, an early concern of both vendors and customers is the cost. Customers worry that the vendors will pass the cost on to them; vendors worry that they will have to absorb the cost. In the case of XPG and UNIX, both sides looked not at the cost but at the benefits. For customers, even if the vendors had passed on the cost, the savings that could be achieved as a result of portability in a heterogeneous environment were orders of magnitude greater. For vendors in a competitive environment, the price they can charge customers for their products is dictated by the market, so their ability to pass the costs of the branding program directly on to the customer is limited. However, the reality is that the cost of the branding program pales into insignificance when spread over the revenue of related products. For one vendor, we estimate the cost to be less than one hundredth of 1% of related revenue. Combine that with a preference from customers for branded products, and everybody wins.

So the big question for vendors is: Do you see certification as a necessary cost to be kept as low as possible or do you see brand marketing of open standards, of which certification is a part, as a means to grow the market and your share of that market?

The big question for customers is: Do you want to negotiate and enforce a warranty with every vendor and in every contract or do you want the industry to do that for you and spread the cost over billions of dollars of procurements?

Allen Brown is President and CEO of The Open Group – a global consortium that enables the achievement of business objectives through IT standards. For over 15 years, Allen has been responsible for driving The Open Group’s strategic plan and day-to-day operations, including extending its reach into new global markets, such as China, the Middle East, South Africa and India. In addition, he was instrumental in the creation of the Association of Enterprise Architects (AEA), which was formed to increase job opportunities for all of its members and elevate their market value by advancing professional excellence.

3 Comments

Filed under Brand Marketing, Certifications, Standards, Uncategorized, UNIX

The Digital Ecosystem Paradox – Learning to Move to Better Digital Design Outcomes

By Mark Skilton, Professor of Practice, Information Systems Management, Warwick Business School

Do digital technologies raise quality and improve efficiency, yet at the same time drive higher costs of service as more advanced solutions and capabilities become available, demanding higher entry investment and maintenance costs?

Many new digital technologies introduce a step change in performance that would have been cost-prohibitive in previous technology generations. Yet in some industries the technology cost per outcome has been steadily rising.

In the healthcare market, the rising cost per treatment of healthcare technology was highlighted in an MIT Technology Review article (1). New drugs for treating depression, left-ventricular assist devices and implantable defibrillators may be raising the overall cost of healthcare; yet how do we value this if patient quality of life is improving and lives are being extended, while lower-cost drugs and vaccines may be enabling better overall patient outcomes?

In the smart city, a similar story is unfolding as governments and organizations seek to use digitization to drive improvements in job productivity, better lifestyles and environmental sustainability. While opportunities exist to reduce energy bills and improve transport and office spaces, with savings of 40% to 60% in consumption and efficiency, the complexity cost of connecting different residential, corporate, transport and other living spaces requires digital initiatives that are coordinated and managed (as in the U-city experience in South Korea (2)).

These digital paradoxes represent the digital ecosystem challenge: to maximise what these new digital technologies can do to augment objects, services, places and spaces, while taking account of the size of the addressable market that all these solutions can serve.


What we see is that technology can drive both the physical and digital economy by lowering the price per function of computer storage, compute, access and application technology and by creating new value; conversely, efforts to drive new value are meeting different degrees of success across industries.

Creating value in the digital economy

The digital economy is at a tipping point. A growing 30% of business is shifting online to search for and engage with consumers, markets and transactions, taking account of retail, mobile and the impact on supply channels (3); 80% of transport, real estate and hotelier activity is processed through websites (4); and over 70% of companies and consumers are experiencing cyber-privacy challenges (5), (6). Meanwhile digital media in social networks, mobile devices and sensors, together with the explosion of big data and cloud computing networks, is interconnecting potentially everything everywhere, amounting to a new digital “ecosystem”.

Disruptive business models across industries and new consumer innovations are increasingly built around new digital technologies such as social media, mobility, big data, cloud computing and the emerging Internet of Things of sensors, networks and machine intelligence (MISQ Digital Strategy Special Issue (7)).

These trends have significantly enhanced the relevance and significance of IT in its role and impact on business and market value at local, regional and global scale.

With IT budgets increasingly shifting from traditional IT towards the marketing functions and business users of these digital services, there is a growing need for these technologies to work together in new, connected ways.

Driving better digital design outcomes

In this new age, digital technologies are combining in new ways to drive new value for individuals, enterprises, communities and societies. The key is in understanding the value that each of these technologies can bring individually, and the mechanisms for creating additive value when they are used appropriately and cost-effectively to drive brand, manage cyber risk, and build consumer engagement and economic growth.


Value-in-use, value in contextualization

Each digital technology has the potential to enable better contextualization of the consumer experience and the value added by providers. Each industry market has emerging combinations of technologies that can be developed to enable focused value.

Examples of these include:

  • Social media networks – creating enhanced co-presence
  • Big data – providing uniqueness profiling, and targeting advice and preferences in context
  • Mobility – creating location-context services and awareness
  • Cloud – enabling access to resources and services
  • Sensors – creating real-time feedback responsiveness
  • Machine intelligence – enabling insight and higher decision quality

Together these digital technologies can build generative effects that, when applied in context, enable higher-value outcomes in digital workspaces.


Value in Contextualization

The value is not in whether these technologies, objects, consumers or providers sit inside or outside the enterprise or market; such distinctions are out of context until they are related to the situation and to consumer needs and wants. The issue is how to put the user experience and the enterprise and social environment into context, so as to best use and maximise the outcomes of a specific setting from each role’s perspective.

Take the medical roles of patient and clinician: the aim of digitization is to determine how mobile devices and wearable monitoring can be used most efficiently and effectively to raise the quality of patient outcomes and manage health service costs. Especially in developing countries and remote areas, where infrastructure and investment costs are constraints, the question is how these technologies can extend their reach and improve the quality of health at an effective price point.

This phenomenon is widespread and growing across all industry sectors: from the connected automobile, with in-car entertainment and route-planning services; to tele-health, which offers remote patient-care monitoring and personalized responses; to smart buildings and smart cities that optimize energy consumption and work environments; to smart retail, where interactive product tags provide instant customer feedback to mobile devices, in-store promotions and automated supply chains. The convergence of these technologies requires a response from all businesses.

These issues are not going to go away; analysts' statistics describe a new era of the digital industrial economy (8). Common to these predictions is that, over the next twenty to fifty years, demand for new digital technologies and their adoption will double or triple.

Platforming and designing better digital outcomes

Developing effective digital workspaces will be fundamental to the value and use of these technologies. There will be no absolute winners and losers as a result of the digital paradox. What is at stake is how the cost and innovation of these technologies can be leveraged to fit specific outcomes.

Understanding architecting practices will be essential in realizing the digital enterprise. Central to this is developing ways to contextualize digital technologies so that they deliver this value to consumers and customers (Value and Worth – creating new markets in the digital economy (9)).

Platforming will be a central IT strategy. We see it already emerging in early generations of digital marketplaces, mobile app ecosystems, and cross-connecting services in health, automotive, retail and other sectors seeking to create joined-up value.

Digital technologies will enable new forms of digital workspaces that support new outcomes. By driving contextualized offers that meet and stimulate consumer behavior and demand, a richer and more effective value experience, and greater growth potential, become possible.

The challenge ahead

The evolution of digital technologies will enable many new types of architecture and platform. How these are constructed into meaningful solutions is both the opportunity and the task ahead.

The challenge for both business and IT practitioners is how to understand the practical use and advantages, as well as the pitfalls and challenges, of these digital technologies:

  • What can be done using digital technologies to enhance customer experience, improve employee productivity and sell more products and services?
  • Where to position in a digital market, and how to create generative, self-reinforcing positive behavior and feedback for better market branding?
  • Who are the beneficiaries of the digital economy, and what is the impact on the roles and jobs of business and IT professionals?
  • Why do enterprises and industry marketplaces need to understand the disruptive effects of these digital technologies, and how can they leverage them for competitive advantage?
  • How to architect and design robust digital solutions that support the enterprise, its supply chain, and its extended consumers, customers and providers?

References

  1. http://www.technologyreview.com/news/518876/the-costly-paradox-of-health-care-technology/
  2. http://www.kyoto-smartcity.com/result_pdf/ksce2014_hwang.pdf
  3. http://www.smartinsights.com/digital-marketing-strategy/online-retail-sales-growth/
  4. http://www.statisticbrain.com/internet-travel-hotel-booking-statistics/
  5. http://www.fastcompany.com/3019097/fast-feed/63-of-americans-70-of-milennials-are-cybercrime-victims
  6. https://www.kpmg.com/Global/en/IssuesAndInsights/ArticlesPublications/Documents/cyber-crime.pdf
  7. http://www.misq.org/contents-37-2
  8. http://www.gartner.com/newsroom/id/2602817
  9. http://www2.warwick.ac.uk/fac/sci/wmg/mediacentre/wmgnews/?newsItem=094d43a23d3fbe05013d835d6d5d05c6

 

Digital Health

As rising health care costs, an increasingly aging population and medical advances enable people to live longer with improved quality of life, the health sector, together with governments and private industry, is increasingly using digital technologies to manage the rising costs of health care while improving patient survival and quality outcomes.

Digital Health Technologies

mHealth, TeleHealth and Translation-to-Bench Health services are just some of the innovative medical technology practices creating new Connected Health Digital Ecosystems.

These systems connect mobile phones, wearable health-monitoring devices and remote emergency alerts to clinician response, and feed back into big data research for new-generation health care.
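
As a rough illustration of the flow just described, the sketch below models a wearable reading triggering a clinician alert and feeding an anonymized research store; every name, threshold and structure is a hypothetical assumption, not part of any actual Connected Health system:

```python
# Illustrative sketch only: a wearable reading flows to (1) a remote clinician
# alert and (2) an anonymized research store. All names and thresholds are
# hypothetical assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Reading:
    patient_id: str
    metric: str
    value: float

ALERT_THRESHOLDS = {"heart_rate": 120.0}  # hypothetical limit
research_store: list[dict] = []           # stand-in for a big data repository

def ingest(reading: Reading) -> None:
    # 1. Remote emergency alert: notify a clinician when a threshold is crossed.
    limit = ALERT_THRESHOLDS.get(reading.metric)
    if limit is not None and reading.value > limit:
        print(f"ALERT clinician: {reading.metric}={reading.value}")
    # 2. Feed back to research: store the reading without patient identity.
    research_store.append({"metric": reading.metric, "value": reading.value})

ingest(Reading("p-001", "heart_rate", 135.0))
```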

The case for digital change

UN Department of Economic and Social Affairs

“World population is projected to reach 8.92 billion by 2050 and 9.22 billion in 2075. Life expectancy is expected to range from 66 to 97 years by 2100.”

OECD Organisation for Economic Co-operation and Development

The cost of health care ranges from 8 to 17% of GDP in developed countries. Yet overall health care spending is falling while population growth, life expectancy and aging are all increasing.

Smart Cities

The desire to use digital technologies to improve buildings, reduce pollution and crime, improve transport, create employment, deliver better education, and launch new business start-ups is at the core of the outcomes driving city growth in the “Smart Cities” digital ecosystem.

Smart city digital technologies

Embedded sensors for building energy management, smart ID badges, and mobile apps for location-based advice and services that support social media communities, improved traffic planning and citizen-service response are just some of the ways digital technologies are changing the physical city into the digital metropolis hubs of tomorrow.

The case for digital change

WHO World Health Organization

“By the middle of the 21st century, the urban population will almost double globally. By 2030, 6 out of every 10 people will live in a city, and by 2050, this proportion will increase to 7 out of 10 people.”

IPCC Intergovernmental Panel on Climate Change

“In 2010, the building sector accounted for around 32% of final energy use, with energy demand projected to approximately double and CO2 emissions to increase by 50–150% by mid-century.”

IATA International Air Transport Association

“Airline Industry Forecast 2013-2017 shows that airlines expect to see a 31% increase in passenger numbers between 2012 and 2017. By 2017, total passenger numbers are expected to rise to 3.91 billion—an increase of 930 million passengers over the 2.98 billion carried in 2012.”

Professor Mark Skilton, Professor of Practice in Information Systems Management at Warwick Business School, has over twenty years' experience in Information Technology and business consulting to many of the top Fortune 1000 companies across many industry sectors, working in over 25 countries at C-suite board level to transform their operations and IT value. Mark's career has included CIO, CTO and Director roles for several FMCG, telecoms, media and engineering organizations, and he has recently worked in global strategy office roles in the big five consulting organizations, focusing on digital strategy and new multi-sourcing innovation models for the public and private sectors. He is currently a part-time Professor of Practice at Warwick Business School, UK, where he teaches outsourcing and the intervention of new digital business models and CIO excellence practices with leading industry practitioners.

Mark's current research and industry leadership engagement interests are in digital ecosystems and the convergence of social media networks, big data, mobility, cloud computing and the M2M Internet of Things to enable digital workspaces. This has focused on defining new value models for digitizing products, workplaces, transport, and consumer and provider contextual services. He has spoken and published internationally on these subjects and is currently writing a book for the Digital Economy Series.

Since 2010, Mark has held international standards body roles in The Open Group, co-chairing its Cloud Computing Work Group and leading Open Platform 3.0™ initiatives and standards publications. Mark is active in the ISO JTC 1 SC38 distributed architecture standards and in the Hub-of-All-Things (HAT), a multi-disciplinary project funded by the Research Councils UK Digital Economy Programme. Mark is also active in cyber security forums at Warwick University, Ovum Security Summits and INFOSEC. He has spoken at the EU Commission on the Digital Ecosystems agenda and is currently an EU Commission competition judge on Smart Outsourcing Innovation.

Filed under Data management, digital technologies, Enterprise Architecture, Future Technologies, Healthcare, Open Platform 3.0, Uncategorized

Business Capabilities – Taking Your Organization into the Next Dimension

By Stuart Macgregor, Chief Executive, Real IRM Solutions

Decision-makers in large enterprises today face a number of paradoxes when it comes to implementing a business operating model and deploying Enterprise Architecture:

- How to stabilize and embed concrete systems that ensure control and predictability, but at the same time remain flexible and open to new innovations?

- How to employ new technology to improve the productivity of the enterprise and its staff in the face of continual pressures on the IT budget?

- How to ensure that Enterprise Architecture delivers tangible results today, but remains relevant in an uncertain future environment?

Answering these tough questions requires an enterprise to elevate its thinking beyond ‘business processes’ and develop a thorough understanding of its ‘business capabilities’. It demands that the enterprise optimizes and leverages these capabilities to improve every aspect of the business – from coal-face operations to blue-sky strategy.

Business capabilities articulate an organization's inner workings: the people, process, technology, tools, and content (information). Capabilities map the ways in which each component interfaces with the others, developing an intricate line-drawing of the entire organizational ecosystem at a technical and social level. By understanding one's current business capabilities, an organization is armed with a strategic planning tool. We refer to what is known as the BIDAT framework, which addresses the business, information, data, applications and technology architecture domains.
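
As an illustrative sketch only (the BIDAT framework itself prescribes no code), a single business capability could be recorded across the five architecture domains as below; all field names and example values are assumptions introduced for illustration:

```python
# Illustrative sketch only: one way a business capability might be recorded
# across the BIDAT domains (business, information, data, applications,
# technology). Field names and example values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Capability:
    name: str
    business: list[str] = field(default_factory=list)      # processes, roles
    information: list[str] = field(default_factory=list)   # content, knowledge
    data: list[str] = field(default_factory=list)          # data entities
    applications: list[str] = field(default_factory=list)  # supporting systems
    technology: list[str] = field(default_factory=list)    # infrastructure

order_to_cash = Capability(
    name="Order to Cash",
    business=["order management process", "credit control role"],
    information=["customer order", "invoice"],
    data=["Customer", "Order", "Invoice"],
    applications=["ERP", "billing system"],
    technology=["ERP cluster", "integration bus"],
)
```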

From this analysis, the journey to addressing the organization’s Enterprise Architecture estate begins. This culminates in the organization being able to dynamically optimize, add and improve on its capabilities as the external environment shifts and evolves. A BIDAT approach provides a permanent bridge between the two islands of business architecture and technology architecture.

Put another way, business capability management utilizes the right architectural solutions to deliver the business strategy. In this way, Enterprise Architecture is inextricably linked to capability management. It is the integrated architecture (combined with effective organizational change leadership) that develops the business capabilities and unleashes their power.

This can at times feel very conceptual and hard to apply to real-world environments. Perhaps the best recent example of tangible widespread implementations of a capability-based Enterprise Architecture approach is in South Africa’s minerals and mining sector.

Known as the Exploration and Mining Business Capability Reference Map, and published as part of a set of standards, this framework was developed by The Open Group Exploration, Mining, Metals and Minerals (EMMM™) Forum. Focusing on all levels of mining operations – strategic planning, portfolio planning, program enablement and project enablement – and based on the principles of open standards, this framework provides miners with a capability-based approach to information, processes, technology, and people.

The Reference Map isolates specific capabilities within mining organizations, analyzes them from multiple dimensions, and shows their various relationships to other parts of the organization. In the context of increased automation in the mining sector, this becomes an invaluable tool in determining those functions that are ripe for automation.

In this new dimension, this new era of business, there is no reason why achievements from the EMMM’s Business Capability Reference Map cannot be repeated in every industry, and in every mid- to large-scale enterprise throughout the globe.

For more information on joining The Open Group, please visit:  http://www.opengroup.org/getinvolved/becomeamember

For more information on joining The Open Group EMMM™ Forum, please visit:  http://opengroup.co.za/emmm

Stuart Macgregor is the Chief Executive of the South African company Real IRM Solutions. Through his personal achievements, he has gained the reputation of an Enterprise Architecture and IT Governance specialist, both in South Africa and internationally.

Macgregor participated in the development of the Microsoft Enterprise Computing Roadmap in Seattle. He was then invited by John Zachman to Scottsdale, Arizona to present a paper on using the Zachman framework to implement ERP systems. In addition, Macgregor was selected as a member of both the SAP AG Global Customer Council for Knowledge Management and the panel that developed the COBIT 3rd Edition Management Guidelines. He has also assisted a global life sciences manufacturer to define its IT governance framework, and a major financial institution to define its global, regional and local IT organizational designs and strategy. He was also selected as a core member of the team that developed the South African Breweries (SABMiller) plc global IT strategy.

As the lead researcher, Stuart assisted the IT Governance Institute in mapping COBIT 4.0 to TOGAF®. This mapping document was published by ISACA and The Open Group. More recently, he participated in the COBIT 5 development workshop held in London in May 2010.

Filed under EMMMv™, Enterprise Architecture, Enterprise Transformation, Standards, Uncategorized

The Onion & The Open Group Open Platform 3.0™

By Stuart Boardman, Senior Business Consultant, KPN Consulting, and Co-Chair of The Open Group Open Platform 3.0™

Onion1

The onion is widely used as an analogy for complex systems – from IT systems to mystical world views.

It’s a good analogy. From the outside it’s a solid whole but each layer you peel off reveals a new onion (new information) underneath.

And a slice through the onion looks quite different from the whole…

What (and how much) you see depends on where and how you slice it.

The Open Group Open Platform 3.0™ is like that. Use-cases for Open Platform 3.0 reveal multiple participants and technologies (Cloud Computing, Big Data Analytics, Social networks, Mobility and The Internet of Things) working together to achieve goals that vary by participant. Each participant’s goals represent a different slice through the onion.

The Ecosystem View
We commonly use the idea of peeling off layers to understand large ecosystems, which could be Open Platform 3.0 systems like the energy smart grid but could equally be the workings of a large cooperative or the transport infrastructure of a city. We want to know what is needed to keep the ecosystem healthy and what the effects could be of the actions of individuals on the whole and therefore on each other. So we start from the whole thing and work our way in.

The Service at the Centre of the Onion

If you’re the provider or consumer (or both) of an Open Platform 3.0 service, you’re primarily concerned with your slice of the onion. You want to be able to obtain and/or deliver the expected value from your service(s). You need to know as much as possible about the things that can positively or negatively affect that. So your concern is not the onion (ecosystem) as a whole but your part of it.

Right in the middle is your part of the service. The first level out from that consists of other participants with whom you have a direct relationship (contractual or otherwise). These are the organizations that deliver the services you consume directly to enable your own service.

One level out from that (level 2) are participants with whom you have no direct relationship but on whose services you are still dependent. It's common in Platform 3.0 that your partners too will consume other services in order to deliver their services (see the use cases we have documented). You need to know as much as possible about this level, because whatever happens here can have a positive or negative effect on you.

One level further from the centre we find indirect participants who don't necessarily deliver any part of the service but whose actions may well affect the rest. They could just be indirect materials suppliers. They could also be part of a completely different value network in which your level 1 or 2 “partners” participate. You can't expect to understand this level in detail but you know that how that value network performs can affect your partners' strategy or even their very existence. The knock-on impact on your own strategy can be significant.

We can conceive of more levels but pretty soon a law of diminishing returns sets in. At each level further from your own organization you will see less detail and more variety. That in turn means that there will be fewer things you can actually know (with any certainty) and not much more that you can even guess at. That doesn’t mean that the ecosystem ends at this point. Ecosystems are potentially infinite. You just need to decide how deep you can usefully go.
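
For readers who think in code, here is a minimal sketch, under stated assumptions, of the layered view just described: participants are recorded by level, and a toy certainty function captures the diminishing-returns point at which the view stops being useful. All names and the certainty formula are illustrative inventions, not part of Open Platform 3.0:

```python
# Illustrative sketch only: ecosystem participants by their distance ("level")
# from your own service, with certainty about their behaviour decreasing as
# the level increases. All names and the formula are hypothetical.
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    level: int  # 0 = your service, 1 = direct partners, 2 = their suppliers...

    def certainty(self) -> float:
        """Toy model: what you can know drops off with each level out."""
        return 1.0 / (1 + self.level)

ecosystem = [
    Participant("our service", 0),
    Participant("hosting provider", 1),       # direct contractual relationship
    Participant("their network carrier", 2),  # no direct relationship
    Participant("unrelated value network", 3),
]

# Decide how deep to usefully go: stop where certainty falls below a threshold.
useful_view = [p for p in ecosystem if p.certainty() >= 0.3]
```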

Limits of the Onion
At a certain point one hits the limits of an analogy. If everybody sees their own organization as the centre of the onion, what we actually have is a bunch of different, overlapping onions.

And you can’t actually make onions overlap, so let’s not take the analogy too literally. Just keep it in mind as we move on. Remember that our objective is to ensure the value of the service we’re delivering or consuming. What we need to know therefore is what can change that’s outside of our own control and what kind of change we might expect. At each visible level of the theoretical onion we will find these sources of variety. How certain of their behaviour we can be will vary – with a tendency to the less certain as we move further from the centre of the onion. We’ll need to decide how, if at all, we want to respond to each kind of variety.

But that will have to wait for my next blog. In the meantime, keep in mind that there are many different ways people look at the onion.


Stuart Boardman is a Senior Business Consultant with KPN Consulting, where he leads the Enterprise Architecture practice and consults to clients on Cloud Computing, Enterprise Mobility and The Internet of Everything. He is Co-Chair of The Open Group Open Platform 3.0™ Forum and was Co-Chair of the Cloud Computing Work Group's Security for the Cloud and SOA project, and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by KPN, the Information Security Platform (PvIB) in The Netherlands and his previous employer, CGI, as well as several Open Group white papers, guides and standards. He is a frequent speaker at conferences on the topics of Open Platform 3.0 and Identity.

Filed under Cloud, Cloud/SOA, Conference, Enterprise Architecture, Open Platform 3.0, Service Oriented Architecture, Standards, Uncategorized

ArchiMate® Users Group Meeting

By The Open Group

During a special ArchiMate® users group meeting on Wednesday, May 14 in Amsterdam, Andrew Josey, Director of Standards within The Open Group, presented on the ArchiMate certification program and adoption of the language. Andrew is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4.

ArchiMate®, a standard of The Open Group, is an open and independent modeling language for Enterprise Architecture that is supported by different vendors and consulting firms. ArchiMate provides instruments to enable Enterprise Architects to describe, analyze and visualize the relationships among business domains in an unambiguous way. ArchiMate is not an isolated development. The relationships with existing methods and techniques, like modeling languages such as UML and BPMN, and methods and frameworks like TOGAF and Zachman, are well-described.

In this talk, Andrew provided an overview of the ArchiMate 2 certification program, including information on the adoption of the ArchiMate modeling language. He gave an overview of the major milestones in the development of ArchiMate and referred to the Dutch origins of the language. The Dutch Telematica Instituut created the ArchiMate language in the period 2002-2004 and the language is now widespread. There have been over 41,000 downloads of different versions of the ArchiMate specification from more than 150 countries. At 52%, The Netherlands is leading the “Top 10 Certifications by country”. However, the “Top 20 Downloads by country” is dominated by the USA (19%), followed by the UK (14%) and The Netherlands (12%). One of the tools developed to support ArchiMate is Archi, a free open-source tool created by Phil Beauvoir at the University of Bolton in the UK. Since its development, Archi has grown from a relatively small, home-grown tool to become a widely used open-source resource that averages 3,000 downloads per month and whose community ranges from independent practitioners to Fortune 500 companies. It is no surprise that, again, Archi is mostly downloaded in The Netherlands (17.67%), the United States (12.42%) and the United Kingdom (8.81%).

After these noteworthy facts and figures, Henk Jonkers took a deep dive into modeling risk and security. Henk Jonkers is a senior research consultant, involved in BiZZdesign’s innovations in the areas of Enterprise Architecture and engineering. He was one of the main developers of the ArchiMate language, an author of the ArchiMate 1.0 and 2.0 Specifications, and is actively involved in the activities of the ArchiMate Forum of The Open Group. In this talk, Henk showed several examples of how risk and security aspects can be incorporated in Enterprise Architecture models using the ArchiMate language. He also explained how the resulting models could be used to analyze risks and vulnerabilities in the different architectural layers, and to visualize the business impact that they have.

First, Henk described the limitations of current approaches: existing information security and risk management methods do not systematically identify potential attacks. They are based on checklists, heuristics and experience. Security controls are applied in a bottom-up way and are not based on a thorough analysis of risks and vulnerabilities. There is no explicit definition of security principles and requirements. Existing systems focus only on IT security. They have difficulties in dealing with complex attacks on socio-technical systems, combining physical and digital access, and social engineering. Current approaches focus on preventive security controls, while corrective and curative controls are not considered. Security by Design is a must, and there is always a trade-off between risk and process criticality. Henk gave some arguments as to why ArchiMate provides the right building blocks for a solid risk and security architecture. ArchiMate is widely accepted as an open standard for modeling Enterprise Architecture and support is widely available. ArchiMate is also suitable as a basis for qualitative and quantitative analysis. And last but not least: there is a good fit with other Enterprise Architecture and security frameworks (TOGAF, Zachman, SABSA).

“The nice thing about standards is that there are so many to choose from”, emeritus professor Andrew Stuart Tanenbaum once said. Using this quote as a starting point, Gerben Wierda focused his speech on the relationship between the ArchiMate language and Business Process Model and Notation (BPMN), in particular Bruce Silver's BPMN Method and Style. He stated that ArchiMate and BPMN can exist side by side. Why would you link BPMN and ArchiMate? According to Gerben, there is a fundamental vision behind all of this: “There are unavoidably many ‘models’ of the enterprise that are used. We cannot reduce that to one single model because of fundamentally different uses. We even cannot reduce that to a single meta-model (or pattern/structure) because of fundamentally different requirements. Therefore, what we need to do is look at the documentation of the enterprise as a collection of models with different structures. And what we thus need to do is make this collection coherent.”

Gerben is Lead Enterprise Architect of APG Asset Management, one of the largest fiduciary managers (± €330 billion assets under management) in the world, with offices in Heerlen, Amsterdam, New York, Hong Kong and Brussels. He has overseen the construction of one of the largest single ArchiMate models in the world to date and is the author of the book “Mastering ArchiMate”, based on his experience in large-scale ArchiMate modeling. In his speech, Gerben showed how the leading standards ArchiMate and BPMN (an OMG standard) can be used together, creating one structured, logically coherent and automatically synchronized description that combines architecture and process details.

Marc Lankhorst, Managing Consultant and Service Line Manager Enterprise Architecture at BiZZdesign, presented on the topic of capability modeling in ArchiMate. As an internationally recognized thought leader on Enterprise Architecture, he guides the development of BiZZdesign's portfolio of services, methods, techniques and tools in this field. Marc is also active as a consultant in government and finance. In the past, he has managed the development of the ArchiMate language for Enterprise Architecture modeling, now a standard of The Open Group. Marc is a certified TOGAF 9 Enterprise Architect and holds an MSc in Computer Science from the University of Twente and a PhD from the University of Groningen in the Netherlands. In his speech, Marc discussed different notions of “capability” and outlined the ways in which these might be modeled in ArchiMate. In short, a business capability is something an enterprise does or can do, given the various resources it possesses. Marc described the use of capability-based planning as a way of translating enterprise strategy into architectural choices, and looked ahead at potential extensions of ArchiMate for capability modeling. Business capabilities provide a high-level view of the current and desired abilities of the organization, in relation to its strategy and environment. Enterprise Architecture practitioners design extensive models of the enterprise, but these are often difficult to communicate to business leaders. Capabilities form a bridge between the business leaders and the Enterprise Architecture practitioners. They are very helpful in business transformation and are the rationale behind capability-based planning, he concluded.

For more information on ArchiMate, please visit:

http://www.opengroup.org/subjectareas/enterprise/archimate

For information on the Archi tool, please visit: http://www.archimatetool.com/

For information on joining the ArchiMate Forum, please visit: http://www.opengroup.org/getinvolved/forums/archimate

 

Filed under ArchiMate®, Certifications, Conference, Enterprise Architecture, Enterprise Transformation, Professional Development, Standards, TOGAF®

The Open Group Summit Amsterdam 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On Tuesday, May 13, day two of The Open Group Summit Amsterdam, the morning plenary began with a welcome from The Open Group President and CEO Allen Brown. He presented an overview of the Forums and the corresponding Roadmaps. He described the process of standardization, from the initial work to a preliminary standard, including review documents, whitepapers and snapshots, culminating in the final publication of an open standard. Brown also announced that Capgemini is again a Platinum member of The Open Group and contributes to the realization of the organization’s objectives in various ways.

Charles Betz, Chief Architect, Signature Client Group, AT&T, and Karel van Zeeland, Lead IT4IT Architect, Shell IT International, presented the second keynote of the morning, ‘A Reference Architecture for the Business of IT’. When the IT Value Chain and IT4IT Reference Architecture are articulated, instituted and automated, the business can experience huge cost savings in IT and significantly improved response times for IT service delivery, as well as increased customer satisfaction.

Karel van Zeeland, Charles Betz and Allen Brown

In 1998, when Shell Information Technology started to restructure its IT management, the chaos was complete. There were too many tools, too many vendors, a lack of integration, no common data model, a variety of user interfaces and no standards to support rapid implementation. With more than 28 different solutions for incident management and more than 160 repositories of configuration data, the complexity was immense. An unclear relationship with Enterprise Architecture and other architectural issues made the case even worse.

Restructuring IT management turned out to be a long journey for the Shell managers. How do you manage 1,700 locations in 90 countries, 8,000 applications, 25,000 servers, dozens of global and regional datacenters and 125,000 PCs and laptops, when at the same time you are confronted with trends like BYOD, mobility, cloud computing, security, big data and the Internet of Things (IoT)? According to Betz and van Zeeland, IT4IT is a promising platform for the evolution of the IT profession, and it has the potential to become a full open standard for managing the business of IT.

Jeroen Tas, CEO of Healthcare Informatics Solutions and Services within Philips Healthcare, explained in his keynote speech, “Philips is becoming a software company”. Digital solutions connect and streamline workflow across the continuum of care to improve patient outcomes. Today, big data is supporting adaptive therapies. Smart algorithms are used for early warning and active monitoring of patients in remote locations. Tas has a dream: he wants to make a valuable contribution to a connected healthcare world for everyone.

In January 2014, Royal Philips announced the formation of Healthcare Informatics Solutions and Services, a new business group within Philips’ Healthcare sector that offers hospitals and health systems the customized clinical programs, advanced data analytics and interoperable, cloud-based platforms necessary to implement new models of care. Tas, who previously served as the Chief Information Officer of Philips, leads the group.

In January of this year, The Open Group launched The Open Group Healthcare Forum, which focuses on bringing Boundaryless Information Flow™ to the healthcare industry, enabling data to flow more easily throughout the complete healthcare ecosystem.

Ed Reynolds, HP Fellow responsible for HP Enterprise Security Services in the US, described the role of information risk in a new technology landscape. How do C-level executives think about risk? This is a relevant and urgent question, because it can take more than 243 days before a data breach is detected. Last year, the average cost associated with a data breach increased 78% to 11.9 million dollars. Critical data assets may be of strategic national importance, have massive corporate value or have huge significance to an employee or citizen, be it the secret recipe of Coca-Cola or the medical records of a patient. “Protect your crown jewels” is the motto.

Bart Seghers, Cyber Security Manager, Thales Security and Henk Jonkers, Senior Research Consultant of BiZZdesign, visualized the Business Impact of Technical Cyber Risks. Attacks on information systems are becoming increasingly sophisticated. Organizations are increasingly networked and thus more complex. Attacks use digital, physical and social engineering and the departments responsible for each of these domains within an organization operate in silos. Current risk management methods cannot handle the resulting complexity. Therefore they are using ArchiMate® as a risk and security architecture. ArchiMate is a widely accepted open standard for modeling Enterprise Architecture. There is also a good fit with other EA and security frameworks, such as TOGAF®. A pentest-based Business Impact Assessment (BIA) is a powerful management dashboard that increases the return on investment for your Enterprise Architecture effort, they concluded.

Risk Management was also a hot topic during several sessions in the afternoon. Moderator Jim Hietala, Vice President, Security at The Open Group, hosted a panel discussion on Risk Management.

In the afternoon several international speakers covered topics including Enterprise Architecture & Business Value, Business & Data Architecture and Open Platform 3.0™. In relation to social networks, Andy Jones, Technical Director, EMEA, SOA Software, UK, presented “What Facebook, Twitter and Netflix Didn’t Tell You”.

The Open Group veteran Dr. Chris Harding, Director for Interoperability at The Open Group, and panelists discussed and emphasized the importance of The Open Group Open Platform 3.0™. The session also featured a live Q&A via Twitter #ogchat, #ogop3.

The podcast is now live. Here are the links:

Briefings Direct Podcast Home Page: http://www.briefingsdirect.com/

PODCAST STREAM: http://traffic.libsyn.com/interarbor/BriefingsDirect-The_Open_Group_Amsterdam_Conference_Panel_Delves_into_How_to_Best_Gain_Business_Value_From_Platform_3.mp3

PODCAST SUMMARY: http://briefingsdirect.com/the-open-group-amsterdam-panel-delves-into-how-to-best-gain-business-value-from-platform-30

In the evening, The Open Group hosted a tour and dinner experience at the world-famous Heineken Brewery.

For those of you who attended the summit, please give us your feedback! https://www.surveymonkey.com/s/AMST2014

Filed under ArchiMate®, Boundaryless Information Flow™, Certifications, Enterprise Architecture, Enterprise Transformation, Healthcare, Open Platform 3.0, RISK Management, Standards, TOGAF®, Uncategorized

The Open Group Summit Amsterdam 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The Open Group Summit Amsterdam, held at the historic Hotel Krasnapolsky, began on Monday, May 12 by highlighting how the industry is moving further towards Boundaryless Information Flow™. After the successful introduction of The Open Group Healthcare Forum in San Francisco, the Governing Board is now considering other vertical Forums such as the airline industry and utilities sector.

The morning plenary began with a welcome from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA). He mentioned that Amsterdam has a special place in his heart, recalling the 2001 event also held in Amsterdam, just one month after the 9/11 attacks that shocked the world. Today, with almost 300 registrations and attendees from 29 different countries, The Open Group is still appealing to a wide range of nationalities.

Allen Brown, President and CEO of The Open Group, took the audience on a journey as he described the transformation process that The Open Group has been through over the last thirty years since its inception in 1984. After a radical financial reorganization and the raising of new working capital, The Open Group is flourishing more than ever and is in good financial health.

It is amazing that 40 percent of the staff of 1984 is still working for The Open Group. What is the secret? You should have the right people in the boat with shared values and commitment. “In 2014, The Open Group runs a business, but stays a not-for-profit organization, a consortium”, Brown emphasized. “Enterprise Architecture is not a commercial vehicle or a ‘trendy’ topic. The Open Group always has a positive attitude and will never criticize other organizations. Our certification programs are a differentiator compared to other organizations. We collaborate with other consortia and standard bodies like ISO and ITIL”, Brown said.

Now the world is much more complex. Technology risk is increasing. A common language based on common standards is needed more than ever. TOGAF®, an Open Group standard, was in its infancy in 1998 and now it is the common standard for Enterprise Architects all over the world. In 1984, the UNIX® platform was the first platform of The Open Group. The Open Group Open Platform 3.0™, launched last year, focuses on new and emerging technology trends like mobility, big data, cloud computing and the Internet of Things converging with each other and leading to new business models and system designs. “The Open Group is all about building relationships and networking”, Brown concluded.

Leonardo Ramirez, CEO of ARCA SG and Chair of AEA Colombia, talked about the role of interoperability and Enterprise Architecture in Latin America. Colombia is now a safe country and has the strongest economy in the region. In 2011, Colombia promoted electronic government, and TOGAF was selected as the best choice for Enterprise Architecture. Ramirez is determined to stimulate socio-economic development projects in Latin America with the help of Enterprise Architecture. There is a law in Colombia (Regulation Law 1712, 2014) that gives every citizen the right to access all public information without boundaries.

Dr. Jonas Ridderstråle, Chairman, Mgruppen and Visiting Professor, Ashridge (UK) and IE Business Schools (Spain), said in his keynote speech, “Womenomics rules, the big winners of the personal freedom movement will be women. Women are far more risk averse. What would have happened with Lehman Brothers if it was managed by women? ‘Lehman Sisters’ probably had the potential to survive. Now women can spend 80 percent of their time on other things than just raising kids.” Ridderstråle continued to discuss life-changing and game-changing events throughout his presentation. He noted that The Open Group Open Platform 3.0 for instance is a good example of a successful reinvention.

“Towards a European Interoperability Architecture” was the title of one of the afternoon sessions led by Mr. R. Abril Jimenez. Analysis during the first phase of the European Interoperability Strategy (EIS) found that, at conceptual level, architecture guidelines were missing or inadequate. In particular, there are no architectural guidelines for cross-border interoperability of building blocks. Concrete, reusable interoperability guidelines and rules and principles on standards and architecture are also lacking. Based on the results achieved and direction set in the previous phases of the action, the EIA project has moved into a more practical phase that consists of two main parts: Conceptual Reference Architecture and Cartography.

Other tracks featured Healthcare, Professional Development and Dependability through Assuredness™.

The evening concluded with a lively networking reception in the hotel’s Winter Garden ballroom.

For those of you who attended the summit, please give us your feedback!  https://www.surveymonkey.com/s/AMST2014

Filed under Boundaryless Information Flow™, Conference, Dependability through Assuredness™, Enterprise Architecture, Enterprise Transformation, Healthcare, Open Platform 3.0, Professional Development, Standards, TOGAF®, Uncategorized

Improving Patient Care and Reducing Costs in Healthcare

By Jason Lee, Director of Healthcare and Security Forums, The Open Group

Recently, The Open Group Healthcare Forum hosted a tweet jam to discuss IT and Enterprise Architecture (EA) issues as they relate to two of the most persistent problems in healthcare: reducing costs and improving patient care. Below I summarize the key points that followed from a rather unique discussion. Unique how? Unique in that rather than address these issues from the perspective of “must do” priorities (including EHR implementation, transitioning to ICD-10, and meeting enhanced HIPAA security requirements), we focused on “should do” opportunities.

We asked how stakeholders in the healthcare system can employ “Boundaryless Information Flow™” and standards development through the application of EA approaches that have proven effective in other industries to add new insights and processes to reduce costs and improve quality.

Question 1: What barriers exist for collaboration among providers in healthcare, and what can be done to improve things?
• tetradian: Huge barriers of language, terminology, mindset, worldview, paradigm, hierarchy, role and much more
• jasonsleephd: Financial, organizational, structural, lack of enabling technology, cultural, educational, professional insulation
• jim_hietala: EHRs with proprietary interfaces represent a big barrier in healthcare
• Technodad: Isn’t question really what barriers exist for collaboration between providers and patients in healthcare?
• tetradian: Communication b/w patients and providers is only one (type) amongst very many
• Technodad: Agree. Debate needs to identify whose point of view the #healthcare problem is addressing.
• Dana_Gardner: Where to begin? A Tower of Babel exists on multiple levels among #healthcare ecosystems. Too complex to fix wholesale.
• EricStephens: Also, legal ramifications of sharing information may impede sharing
• efeatherston: Patient needs provider collaboration to see any true benefit (I don’t just go to one provider)
• Dana_Gardner: Improve first by identifying essential collaborative processes that have most impact, and then enable them as secure services.
• Technodad: In US at least, solutions will need to be patient-centric to span providers- Bring Your Own Wellness (BYOW™) for HC info.
• loseby: Lack of shared capabilities & interfaces between EHRs leads to providers w/o comprehensive view of patient
• EricStephens: Are incentives aligned sufficiently to encourage collaboration? + lack of technology integration.
• tetradian: Vast numbers of stakeholder-groups, many beyond medicine – e.g. pharma, university, politics, local care (esp. outside of US)
• jim_hietala: Gap in patient-centric information flow
• Technodad: I think patents will need to drive the collaboration – they have more incentive to manage info than providers.
• efeatherston: Agreed, stakeholder list could be huge
• EricStephens: High-deductible plans will drive patients (us) to own our health care experience
• Dana_Gardner: Take patient-centric approach to making #healthcare processes better: drives adoption, which drives productivity, more adoption
• jasonsleephd: Who thinks standards development and data sharing is an essential collaboration tool?
• tetradian: not always patient-centric – e.g. epidemiology /public-health is population centric – i.e. _everything_ is ‘the centre’
• jasonsleephd: How do we break through barriers to collaboration? For one thing, we need to create financial incentives to collaborate (e.g., ACOs)
• efeatherston: Agreed, the challenge is to get them to challenge (if that makes sense). Many do not question
• EricStephens: Some will deify those in a lab coat.
• efeatherston: Still do, especially older generations, cultural
• Technodad: Agree – also displaying, fusing data from different providers, labs, monitors etc.
• dianedanamac: Online collaboration, can be cost effective & promote better quality but must financially incented
• efeatherston: Good point, unless there is a benefit/incentive for provider, they may not be bothered to try
• tetradian: “must financially incented” – often other incentives work better – money can be a distraction – also who pays?

Participants identified barriers that are not atypical: financial disincentives, underpowered technology, failure to utilize existing capability, lack of motivation to collaborate. Yet all participants viewed more collaboration as key. Consensus developed around:
• The patient (and by one commenter, the population) as the main driver of collaboration, and
• The patient as the most important stakeholder at the center of information flow.

Question 2: Does implementing remote patient tele-monitoring and online collaboration drive better and more cost-effective patient care?
• EricStephens: “Hell yes” comes to mind. Why drag yourself into a dr. office when a device can send the information (w/ video)
• efeatherston: Will it? Will those with high deductible plans have ability/understanding/influence to push for it?
• EricStephens: Driving up participation could drive up efficacy
• jim_hietala: Big opportunities to improve patient care thru remote tele-monitoring
• jasonsleephd: Tele-ICUs can keep patients (and money) in remote settings while receiving quality care
• jasonsleephd: Remote monitoring of patients admitted with CHF can reduce rehospitalization w/i 6 months @connectedhealth.org
• Dana_Gardner: Yes! Pacemakers now uplink to centralized analysis centers, communicate trends back to attending doctor. Just scratches surface
• efeatherston: Amen. Do that now, monthly uplink, annual check in with doctor to discuss any trends he sees.
• tetradian: Assumes tele-monitoring options even exist – very wide range of device-capabilities, from very high to not-much, and still not common.
• tetradian: (General request to remember that there’s more to the world, and medicine, than just the US and its somewhat idiosyncratic systems?)
• efeatherston: Yes, I do find myself looking through the lens of my own experiences, forgetting the way we do things may not translate
• jasonsleephd: Amen to point about our idiosyncrasies! Still, we have to live with them, and we can do so much better with good information flow!
• Dana_Gardner: Governments should remove barriers so more remote patient tele-monitoring occurs. Need to address the malpractice risks issue.
• TerryBlevins: Absolutely. Just want the information to go to the right place!
• Technodad: . Isn’t “right place” someplace you & all your providers can access? Need interoperability!
• TerryBlevins: It requires interoperability yes – the info must flow to those that must know.
• Technodad: Many areas where continuous monitoring can help. Improved IoT (internet of things) sensors e.g. cardio, blood chemistry coming. http://t.co/M3xw3tNvv3
• tetradian: Ethical/privacy concerns re how/with-whom that data is shared – e.g. with pharma, research, epidemiology etc
• efeatherston: Add employers to that etc. list of how/who/what is shared

Participants agreed that remote patient monitoring and telemonitoring can improve collaboration, improve patient care, and put patients more in control of their own healthcare data. However, participants expressed concerns about lack of widespread availability and the related issue of high cost. In addition, they raised important questions about who has access to these data, and they addressed nagging privacy and liability concerns.

Question 3: Can a mobile strategy improve patient experience, empowerment and satisfaction? If so, how?
• jim_hietala: mobile is a key area where patient health information can be developed/captured
• EricStephens: Example: link blood sugar monitor to iPhone to MyFitnessPal + gamification to drive adherence (and drive $$ down?)
• efeatherston: Mobile along with #InternetOfThings, wearables linked to mobile. Contact lens measuring blood sugar in recent article as ex.
• TerryBlevins: Sick people, or people getting sick are on the move. In a patient centric world we must match need.
• EricStephens: Mobile becomes a great data acquisition point. Something as simple as SMS can drive adherence with complication drug treatments
• jasonsleephd: mHealth is a very important area for innovation, better collaboration, $ reduction & quality improvement. Google recent “Webby Awards & handheld devices”
• tetradian: Mobile can help – e.g. use of SMS for medicine in Africa etc
• Technodad: Mobile isn’t option any more. Retail, prescription IoT, mobile network & computing make this a must-have. http://t.co/b5atiprIU9
• dianedanamac: Providers need to be able to receive the information mHealth
• Dana_Gardner: Healthcare should go location-independent. Patient is anywhere, therefore so is care, data, access. More than mobile, IMHO.
• Technodad: Technology and mobile demand will outrun regional provider systems, payers, regulation
• Dana_Gardner: As so why do they need to be regional? Cloud can enable supply-demand optimization regardless of location for much.
• TerryBlevins: And the caregivers are also on the move!
• Dana_Gardner: Also, more machine-driven care, i.e. IBM Watson, for managing the routing and prioritization. Helps mitigate overload.
• Technodad: Agree – more on that later!
• Technodad: Regional providers are the reality in the US. Would love to have more national/global coverage.
• Dana_Gardner: Yes, let the market work its magic by making it a larger market, when information is the key.
• tetradian: “let the market do its work” – ‘the market’ is probably the quickest way to destroy trust! – not a good idea…
• Technodad: To me, problem is coordinating among multi providers, labs etc. My health info seems to move at glacial pace then.
• tetradian: “Regional providers are the reality in the US.” – people move around: get info follow them is _hard_ (1st-hand exp. there…)
• tetradian: danger of hype/fear-driven apps – may need regulation, or at least regulatory monitoring
• jasonsleephd: Regulators, as in FDA or something similar?
• tetradian: “Regulators as in FDA” etc – at least oversight of that kind, yes (cf. vitamins, supplements, health-advice services)
• jim_hietala: mobile, consumer health device innovation moving much faster than IT ability to absorb
• tetradian: also beware of IT-centrism and culture – my 90yr-old mother has a cell-phone, but has almost no idea how to use it!
• Dana_Gardner: Information and rely of next steps (in prevention or acute care) are key, and can be mobile. Bring care to the patient ASAP.

Participants began in full agreement: mobile health is no longer an option but a given. They recognized that providers' ability to receive information is lacking, and they viewed the cloud as a means to overcome the regionalization of data storage. When the discussion turned to the further development of mHealth, there was some debate over what can be left to the market and whether some form of regulatory action is needed.

Question 4: Does better information flow and availability in healthcare reduce operation cost, and free up resources for more patient care?
• tetradian: A4: should do, but it’s _way_ more complex than most IT-folks seem to expect or understand (e.g. repeated health-IT fails in UK)
• jim_hietala: A4: removing barriers to health info flow may reduce costs, but for me it’s mostly about opportunity to improve patient care
• jasonsleephd: Absolutely. Consider claims processing alone. Admin costs in private health ins. are 20% or more. In Medicare less than 2%.
• loseby: Absolutely! ACO model is proving it. Better information flow and availability also significantly reduces hospital admissions
• dianedanamac: I love it when the MD can access my x-rays and lab results so we have more time.
• efeatherston: I love it when the MD can access my x-rays and lab results so we have more time.
• EricStephens: More info flow + availability -> less admin staff -> more med staff.
• EricStephens: Get the right info to the ER Dr. can save a life by avoiding contraindicated medicines
• jasonsleephd: EricStephens GO CPOE!!
• TerryBlevins: @theopengroup. believe so, but ask the providers. My doctor is more focused on patient by using simple tech to improve info flow
• tetradian: don’t forget link b/w information-flows and trust – if trust fails, so does the information-flow – worse than where we started!
• jasonsleephd: Yes! Trust is really key to this conversation!
• EricStephens: processing a claim, in most cases, should be no more difficult than an expense report or online order. Real-time adjudication
• TerryBlevins: Great point.
• efeatherston: Agreed should be, would love to see it happen. Trust in the data as mentioned earlier is key (and the process)
• tetradian: A4: sharing b/w patient and MD is core, yes, but who else needs to access that data – or _not_ see it? #privacy
• TerryBlevins: A4: @theopengroup can’t forget that if info doesn’t flow sometimes the consequences are fatal, so unblocked the flow.
• tetradian: .@TerryBlevins A4: “if info doesn’t flow sometimes the consequences are fatal,” – v.important!
• Technodad: . @tetradian To me, problem is coordinating among multi providers, labs etc. My health info seems to move at glacial pace then.
• TerryBlevins: A4: @Technodad @tetradian I have heard that a patient moving on a gurney moves faster than the info in a hospital.
• Dana_Gardner: A4 Better info flow in #healthcare like web access has helped. Now needs to go further to be interactive, responsive, predictive.
• jim_hietala: A4: how about pricing info flow in healthcare, which is almost totally lacking
• Dana_Gardner: A4 #BigData, #cloud, machine learning can make 1st points of #healthcare contact a tech interface. Not sci-fi, but not here either.

Starting with the recognition that this is a very complicated issue, the conversation quickly produced a consensus view that better information flow is key to cost reduction, quality improvement and increased patient satisfaction. Trust emerged as a relevant issue: information must be accurate, available and used in ways that support trust in the provider-patient relationship. Then, naturally, privacy issues surfaced. Coordination of information flow and lack of interoperability were recognized as important barriers, and the conversation finally turned somewhat abstract and technical, with mentions of big data, the cloud and pricing information flows, without much in the way of specifics on how to connect the dots.

Question 5: Do you think payers and providers are placing enough focus on using technology to positively impact patient satisfaction?
• Technodad: A5: I think there are positive signs but good architecture is lacking. Current course will end w/ provider information stovepipes.
• TerryBlevins: A5: @theopengroup Providers are doing more. I think much more is needed for payers – they actually may be worse.
• theopengroup: @TerryBlevins Interesting – where do you see opportunities for improvements with payers?
• TerryBlevins: A5: @theopengroup like was said below claims processing – an onerous job for providers and patients – mostly info issue.
• tetradian: A5: “enough focus on using tech”? – no, not yet – but probably won’t until tech folks properly face the non-tech issues…
• EricStephens: A5 No. I’m not sure patient satisfaction (customer experience/CX?) is even a factor sometimes. Patients not treated like customers
• dianedanamac: .@EricStephens SO TRUE! Patients not treated like customers
• Technodad: . @EricStephens Amen to that. Stovepipe data in provider systems is barrier to understanding my health & therefore satisfaction.
• dianedanamac: “@mclark497: @EricStephens issue is the customer is treat as only 1 dimension. There is also the family experience to consider too
• tetradian: .@EricStephens A5: “Patients not treated like customers” – who _is_ ‘the customer’? – that’s a really tricky question…
• efeatherston: @tetradian @EricStephens Trickiest question. to the provider is the patient or the payer the customer?
• tetradian: .@efeatherston “patient or payer” – yeah, though it gets _way_ more complex than that once we explore real stakeholder-relations
• efeatherston: @tetradian So true.
• jasonsleephd: .@tetradian @efeatherston Very true. There are so many diff stakeholders. But to align payers and pts would be huge
• efeatherston: @jasonsleephd @tetradian re: aligning payers and patients, agree, it would be huge and a good thing
• jasonsleephd: .@efeatherston @tetradian @EricStephens Ideally, there should be no dividing line between the payer and the patient!
• efeatherston: @jasonsleephd @tetradian @EricStephens Ideally I agree, and long for that ideal world.
• EricStephens: .@jasonsleephd @efeatherston @tetradian the payer s/b a financial proxy for the patient. and nothing more
• TerryBlevins: @EricStephens @jasonsleephd @efeatherston @tetradian … got a LOL out of me.
• Technodad: . @tetradian @EricStephens That’s a case of distorted marketplace. #Healthcare architecture must cut through to patient.
• tetradian: .@Technodad “That’s a case of distorted marketplace.” – yep. now add in the politics of consultants and their hierarchies, etc?
• TerryBlevins: A5: @efeatherston @tetradian @EricStephens in patient cetric world it is the patient and or their proxy.
• jasonsleephd: A5: Not enough emphasis on how proven technologies and architectural structures in other industries can benefit healthcare
• jim_hietala: A5: distinct tension in healthcare between patient-focus and meeting mandates (a US issue)
• tetradian: .@jim_hietala A5: “meeting mandates (a US issue)” – UK NHS (national-health-service) may be even worse than US – a mess of ‘targets’
• EricStephens: A5 @jim_hietala …and avoiding lawsuits
• tetradian: A5: most IT-type tech still not well-suited to the level of mass-uniqueness inherent in the healthcare context
• Dana_Gardner: A5 They are using tech, but patient “satisfaction” not yet a top driver. We have a long ways to go on that. But it can help a ton.
• theopengroup: @Dana_Gardner Agree, there’s a long way to go. What would you say is the starting point for providers to tie the two together?
• Dana_Gardner: @theopengroup An incentive other than to avoid lawsuits. A transparent care ratings capability. Outcomes focus based on total health
• Technodad: A5: I’d be satisfied just to not have to enter my patient info & history on a clipboard in every different provider I go to!
• dianedanamac: A5 @tetradian Better data sharing & Collab. less redundancy, lower cost, more focus on patient needs -all possible w/ technology
• Technodad: A5: The patient/payer discussion is a red herring. If the patient weren’t there, rest of the system would be unnecessary.
• jim_hietala: RT @Technodad: The patient/payer discussion is a red herring. If the patient weren’t there, rest of system unnecessary. AMEN

Very interesting conversation. Positive signs of progress were noted, but so too were indications that healthcare will remain far behind the technology curve for the foreseeable future. Providers were given higher “grades” than payers, even though claims processing would seem to be one of the easiest areas for technology-assisted improvement. One discussant noted that there will not be enough focus on technology in healthcare “until the tech folks properly face the non-tech issues”. This would seem to open a wide door for EA experts to enter the healthcare domain! The barriers (and opportunities) involved may be the topic of another tweet jam, or an Open Group White Paper.
Interestingly, partway into the discussion the topic turned to the lack of a real customer/patient focus in healthcare: not enough emphasis on patient satisfaction, and not enough attention to patient outcomes. There needs to be closer alignment between what motivates payers and what patients need.

Question 6: As some have pointed out, many of the EHR systems are highly proprietary, how can standards deliver benefits in healthcare?
• jim_hietala: A6: Standards will help by lowering the barriers to capturing data, esp. for mhealth, and getting it to point of care
• tetradian: .@jim_hietala “esp. for mhealth” – focus on mhealth may be a way to break the proprietary logjam, ‘cos it ain’t proprietary yet
• TerryBlevins: A6: @theopengroup So now I deal with at least 3 different EHR systems. All requiring me to be the info steward! Hmmm
• TerryBlevins: A6 @theopengroup following up if they shared data through standards maybe they can synchronize.
• EricStephens: A6 – Standards lead to better interoperability, increased viscosity of information which will lead to lowers costs, better outcomes.
• efeatherston: @EricStephens and greater trust in the info (as was mentioned earlier, trust in the information key to success)
• jasonsleephd: A6: Standards development will not kill innovation but rather make proprietary systems interoperable
• Technodad: A6: Metcalfe’s law rules! HC’s many providers-many patients structure means interop systems will be > cost effective in long run.
• tetradian: A6: the politics of this are _huge_, likewise the complexities – if we don’t face those issues right up-front, this is going nowhere
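
One note on the Metcalfe’s law reference above: the law holds that a network’s value grows roughly with the square of the number of connected participants, since n participants can form n(n-1)/2, or roughly n²/2, pairwise connections. In a system with many providers, payers, labs and patients, each new participant on a shared, interoperable platform adds value for everyone already connected, which is why interoperable systems should become more cost-effective in the long run.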

In his April 24, 2014 post at www.weblog.tetradian.com, Tom Graves provided a clearly stated position on the role of The Open Group in delivering standards to help healthcare improve. He wrote:

“To me, this is where The Open Group has an obvious place and a much-needed role, because it’s more than just an IT-standards body. The Open Group membership are mostly IT-type organisations, yes, which tends to guide towards IT-standards, and that’s unquestionably of importance here. Yet perhaps the real role for The Open Group as an organisation is in its capabilities and experience in building consortia across whole industries: EMMM™ and FACE are two that come immediately to mind. Given the maze of stakeholders and the minefields of vested-interests across the health-context, those consortia-building skills and experience are perhaps what’s most needed here.”

The Open Group is the ideal organization to engage in this work. There are many ways to collaborate. You can join The Open Group Healthcare Forum, follow the Forum on Twitter @ogHealthcare and connect on The Open Group Healthcare Forum LinkedIn Group.

Jason Lee, Director of Healthcare and Security Forums at The Open Group, has conducted healthcare research, policy analysis and consulting for over 20 years. He is a nationally recognized expert in healthcare organization, finance and delivery and applies his expertise to a wide range of issues, including healthcare quality, value-based healthcare, and patient-centered outcomes research. Jason worked for the legislative branch of the U.S. Congress from 1990-2000 — first at GAO, then at CRS, then as Health Policy Counsel for the Chairman of the House Energy and Commerce Committee (in which role the National Journal named him a “Top Congressional Aide” and he was profiled in the Almanac of the Unelected). Subsequently, Jason held roles of increasing responsibility with non-profit organizations — including AcademyHealth, NORC, NIHCM, and NEHI. Jason has published quantitative and qualitative findings in Health Affairs and other journals and his work has been quoted in Newsweek, the Wall Street Journal and a host of trade publications. He is a Fellow of the Employee Benefit Research Institute, was an adjunct faculty member at the George Washington University, and has served on several boards. Jason earned a Ph.D. in social psychology from the University of Michigan and completed two postdoctoral programs (supported by the National Science Foundation and the National Institutes of Health). He is the proud father of twins and lives outside of Boston.

Filed under Boundaryless Information Flow™, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, Professional Development, Standards

The Open Group Summit Amsterdam – ArchiMate® Day – May 14, 2014

By Andrew Josey, Director of Standards, The Open Group

The Open Group Summit 2014 Amsterdam features an all-day track on the ArchiMate® modeling language, followed by an ArchiMate Users Group meeting in the evening. Attendees will include the core developers of the ArchiMate language, as well as users and tool developers.

The sessions include tutorials, a panel session on the past, present and future of the language, and case studies. The Users Group meeting that follows in the evening is free and open to all — whether attending the rest of the conference or not — and starts at 6pm with free beer and pizza!

The timetable for ArchiMate Day is as follows:

• Tutorials (09:00 – 10:30), Henry Franken, CEO, BiZZdesign, and Alan Burnett, COO & Consulting Head, Corso

Henry Franken will show how the TOGAF® and ArchiMate® standards can be used to provide an actionable EA capability. Alan Burnett will present on how the ArchiMate language can be extended to support roadmapping, which is a fundamental part of strategic planning and enterprise architecture.

• Panel Discussion (11:00 – 12:30), Moderator: Henry Franken, Chair of The Open Group ArchiMate Forum

The topic for the Panel Discussion is the ArchiMate Language — Past, Present and Future. The panel comprises key developers and users of the ArchiMate language, including Marc Lankhorst and Henk Jonkers from the ArchiMate core team, Jan van Gijsen from SNS REAAL (a Dutch financial institution), and Gerben Wierda, author of Mastering ArchiMate. The session will include brief status updates from the panel members (30 minutes), followed by a 60-minute panel discussion with questions from the moderator and the audience.

• Case Studies (14:00 – 16:00), Geert Van Grootel, Senior Researcher, Department of Economy, Science & Innovation, Flemish Government; Patrick Derde, Consultant, Envizion; Pieter De Leenheer, Co-Founder and Research Director, Collibra; Walter Zondervan, Member, Architectural Board, ASL-BiSL Foundation; and Adina Aldea, BiZZdesign.

There are three case studies:

Geert Van Grootel, Patrick Derde, and Pieter De Leenheer will present on how you can manage your business metadata through data model patterns and an Integrated Information Architecture approach, supported by ArchiMate, a standard formal architecture language.

Walter Zondervan will present an ArchiMate reference architecture for governance, based on BiSL.

Adina Aldea will present on how high-level strategic models can be modelled and used, based on the Strategizer method.

• ArchiMate Users Group Meeting (18:00 – 21:00)

Invited speakers for the Users Group Meeting include Andrew Josey, Henk Jonkers, Marc Lankhorst and Gerben Wierda:

• Andrew Josey will present on the ArchiMate certification program and adoption of the language
• Henk Jonkers will present on modeling risk and security
• Marc Lankhorst will present on capability modeling in ArchiMate
• Gerben Wierda will present on relating ArchiMate and BPMN

Why should you attend?
• Spend time interacting directly with other ArchiMate users and tool providers in a relaxed, engaging environment
• Hear how ArchiMate can be used to develop solutions to common industry problems
• Learn about future directions and meet key users and developers of the language and tools
• Interact with peers to broaden your expertise and knowledge of the ArchiMate language

For detailed information, see the ArchiMate Day agenda at http://www.opengroup.org/amsterdam2014/archimate or watch our YouTube event video at http://youtu.be/UVARza3uZZ4.

How to register

Registration for the ArchiMate® Users Group meeting is independent of The Open Group Conference registration. There is no fee, but registration is required. Please register here, select the one-day pass as the pass type, insert the promotion code (AMST14-AUG), tick the box for Wednesday May 14th and select ArchiMate Users Group from the conference session list. You will then be registered for the event and should not be charged. Please note that this promotion code should be used only by those attending just the evening meeting from 6:00 p.m.; anyone attending the conference or the full ArchiMate Day will have to pay the applicable registration fee. User Group members who want to attend The Open Group conference and who are not members of The Open Group can register using the affiliate code AMST14-AFFIL.

 Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Filed under ArchiMate®, Enterprise Architecture, Professional Development, Standards, TOGAF®, Uncategorized

Heartbleed: Tips and Lessons Learned

By Jim Hietala, VP, Security, The Open Group

During our upcoming event May 12-14, The Open Group Summit 2014 Amsterdam, Enabling Boundaryless Information Flow™, one of the discussions will be around risk management and the development of open methodologies for managing risk.

Managing risk is an essential component of an information security program. Risk management is fundamental to effectively securing information, IT assets, and critical business processes. Risk management is also a challenge to get right. With numerous risk management frameworks and standards available, it can be difficult for practitioners to know where to start, and what methodologies to employ.

Recently, the Heartbleed bug has been wreaking havoc not only for major websites and organizations, but also for the public’s confidence in online security generally. Even as patches are applied, systems will remain vulnerable for an extended period of time. Taking proactive steps and learning how to manage risk is imperative to protecting your privacy.

With impacts on an estimated 60-70% of websites, Heartbleed is arguably the security vulnerability with the broadest potential impact to date. Helpful guidance is available on what end users can do to insulate themselves from the consequences.

Large organizations obviously need to determine where they have vulnerable websites and network equipment so they can remediate rapidly. Scanning your IP address ranges (both internal addresses and addresses exposed to the Internet) should be done as soon as possible, to identify all sites, servers and other equipment using OpenSSL that need immediate patching.
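
As a rough illustration of that inventory step, here is a minimal Python sketch that enumerates the hosts in an address range which accept connections on common TLS ports, producing a candidate list to feed into a dedicated Heartbleed test tool. The address range and port list are placeholders for your own, and the sketch only finds TLS listeners; it does not itself test for the vulnerability.

```python
# Sketch: list hosts in a CIDR range that accept connections on common
# TLS ports, as candidates for testing with a proper Heartbleed scanner.
# The 192.0.2.0/28 range and the port list below are placeholders.
import socket
from ipaddress import ip_network

TLS_PORTS = (443, 8443)   # add any others your environment uses
TIMEOUT = 1.0             # seconds per connection attempt

def tls_listeners(cidr):
    """Yield (host, port) pairs that accept a TCP connection."""
    for host in ip_network(cidr).hosts():
        for port in TLS_PORTS:
            try:
                with socket.create_connection((str(host), port), TIMEOUT):
                    yield str(host), port
            except OSError:
                continue  # closed, filtered or unreachable

if __name__ == "__main__":
    for host, port in tls_listeners("192.0.2.0/28"):
        print("check {0}:{1} with a Heartbleed test tool".format(host, port))
```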

In the last few days, it has become clear that we are not just talking about websites and web servers. Numerous network equipment vendors have used OpenSSL in their networking products. Look closely at your routers, switches and firewalls, and determine which of them also run vulnerable versions of OpenSSL. The impact of Heartbleed on these infrastructure components is likely to be a bigger problem for organizations, as the top router manufacturers all have products affected by this vulnerability.

Taking a step back from the immediate frenzy of finding vulnerable OpenSSL deployments and patching websites and network infrastructure, it is pretty clear that we as a security community have a lot of work to do on numerous fronts:

• Open source security components that gain widespread use need much more serious attention in terms of finding and fixing software vulnerabilities.
• For IT hardware and software vendors, and for the organizations that consume their products, OpenSSL and Heartbleed will become the poster child for why we need more rigorous supply chain security mechanisms generally, and specifically for commonly used open source software.
• The widespread impacts from Heartbleed should also focus attention on the need for radically improved security for the emerging Internet of Things (IoT). As bad as Heartbleed is, try to imagine a similar situation when there are billions of IP devices connected to the internet. This is precisely where we are headed absent big changes in software assurance/supply chain security for IoT devices.

Finally, there is a deeper issue here: CIOs and IT people should realize that fundamental security barriers, such as SSL, are under constant attack – and these security walls won’t hold forever. So it is important not simply to patch OpenSSL and reissue your certificates, but to rethink your strategy for security defense in depth, including increased protection of critical data and multiple independent levels of security.
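
To make the point about reissuing certificates concrete, here is a small sketch (in Python, with a placeholder host list and patch date) that retrieves a server’s certificate and flags any certificate issued before the date the server was patched. A certificate that predates patching may still carry a private key that was exposed, so patching alone is not enough. In practice you would run this against the same inventory produced by the scan above.

```python
# Sketch: flag server certificates issued before your OpenSSL patch date.
# A certificate that predates the patch may still use a key exposed via
# Heartbleed and should be reissued. The hosts and date are placeholders.
import socket
import ssl

PATCHED_ON = ssl.cert_time_to_seconds("Apr 10 00:00:00 2014 GMT")
HOSTS = ["www.example.com"]   # replace with your own server inventory

def cert_issued_at(host, port=443):
    """Return the notBefore timestamp of a host's TLS certificate."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return ssl.cert_time_to_seconds(tls.getpeercert()["notBefore"])

for host in HOSTS:
    if cert_issued_at(host) < PATCHED_ON:
        print("{0}: certificate predates patching - reissue it".format(host))
    else:
        print("{0}: certificate issued after patching".format(host))
```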

You also need to ensure that your suppliers are implementing security practices that are at least as good as yours – how many websites got caught out by Heartbleed because of something an upstream supplier did?

Discussions during the Amsterdam Summit will outline important areas to be aware of when managing security risk, including how to respond more effectively to copycat bugs. Be sure to sign up now for our summit: http://www.opengroup.org/amsterdam2014.

For more information on The Open Group Security Forum, please visit http://www.opengroup.org/subjectareas/security.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Boundaryless Information Flow™, Cybersecurity, Information security, RISK Management