
The Open Group Boston 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Enabling Boundaryless Information Flow™ continued in Boston on Tuesday, July 22. Allen Brown, CEO and President of The Open Group, welcomed attendees with an overview of the organization’s second-quarter results.

The Open Group membership is at 459 organizations in 39 countries, including 16 new membership agreements in 2Q 2014.

Membership value is highlighted by the collaboration Open Group members experience. For example, over 4,000 individuals attended Open Group events, physically and virtually, whether member meetings, webinars, podcasts or tweet jams. The Open Group website had more than 1 million page views, and over 105,000 publication items were downloaded by members in 80 countries.

Brown also shared highlights from The Open Group Forums, which featured status updates on many upcoming white papers, snapshots, reference models and standards, as well as individual Forum roadmaps. The Forums are busy developing and reviewing projects such as the next version of TOGAF®, an Open Group standard, an ArchiMate® white paper, The Open Group Healthcare Forum charter and treatise, Standard Mils™ APIs and Open FAIR. Many publications are translated into multiple languages, including Chinese and Portuguese. Also, a new Forum will be announced in the third quarter at The Open Group London 2014, so stay tuned for that launch news!

Our first keynote of the day was Making Health Addictive by Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health.

Dr. Kvedar described how Healthcare delivery is changing, with mobile technology playing a big part. Other factors pushing change are reimbursement paradigms under which caregivers are paid to be more efficient and to keep people healthy and out of hospitals. The goal of Healthcare providers is to integrate care into the day-to-day lives of patients. Healthcare also aims for better technologies and architecture.

Mobile is a game-changer in Healthcare because people are “always on and connected”. Mobile technology allows for in-the-moment messaging, the ability to capture health data (GPS, accelerometer, etc.) and the display of information in real time as needed. Bottom line: smartphones are addictive, so they are excellent tools for communication and engagement.

But there is a need to understand and address the implications of automating Healthcare: security, privacy, accountability, economics.

The plenary continued with Proteus Duxbury, CTO, Connect for Health Colorado, who presented From Build to Run at the Colorado Health Insurance Exchange – Achieving Long-term Sustainability through Better Architecture.

Duxbury stated that the keys to his organization’s success are the leadership and team’s shared vision; a flexible vendor that stays agile amid rapidly changing regulatory requirements; and a COTS solution that required minimal customization and custom development, with resilient architecture and security. Connect for Health faces many challenges, including budget constraints, regulation and operating in a “fish bowl”. Yet they are on track with their three-year ‘build to run’ roadmap, stabilizing their foundation and gaining efficiencies.

During the Q&A with Allen Brown following each presentation, both speakers emphasized the need for standards, architecture and data security.

Allen Brown and Proteus Duxbury

During the afternoon, track sessions covered Healthcare, Enterprise Architecture (EA) & Business Value, Service-Oriented Architecture (SOA), Security & Risk Management, Professional Development and ArchiMate® tutorials. Chris Armstrong, President, Armstrong Process Group, Inc., discussed the Architecture Value Chain and Capability Model. Laura Heritage, Principal Solution Architect / Enterprise API Platform, SOA Software, presented Protecting your APIs from Threats and Hacks.

The evening culminated with a reception at the historic Old South Meeting House, where the Boston Tea Party began in 1773.


Networking Reception at Old South Meeting House

A special thank you to our sponsors and exhibitors at The Open Group Boston 2014: BiZZdesign, Black Duck, Corso, Good e-Learning, Orbus and AEA.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.


The Open Group Boston 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

The Open Group kicked off Enabling Boundaryless Information Flow™ on July 21 at the spectacular setting of the Hyatt Boston Harbor. Allen Brown, CEO and President of The Open Group, welcomed over 150 people from 20 countries, including attendees from as far away as Australia, Japan, Saudi Arabia and India.

The first keynote speaker was Marshall Van Alstyne, Professor at Boston University School of Management & Researcher at MIT Center for Digital Business, known as a leading expert in business models. His presentation, entitled Platform Shift – How New Open Business Models are Changing the Shape of Industry, posed the questions “What does ‘openness’ mean? Why do platforms beat products every time?”

Marshall Van Alstyne

According to “InterBrand: 2014 Best Global Brands”, 13 of the top 31 companies are “platform companies”. To be a ‘platform’, a company needs embeddable functions or services and must allow third-party access. Van Alstyne noted, “products have features, platforms have communities”. Great standalone products are not sufficient. Positive changes experienced by a platform company include pricing/profitability, supply chains, internal organization, innovation, decreased industry bottlenecks and strategy.

Platforms benefit from broad contributions, as long as there is control of the top several complements. Van Alstyne commented, “If you believe in the power of community, you need to embrace the platform.”

The next presentation was Open Platform 3.0™ – An Integrated Approach to the Convergence of Technology Platforms, by Dr. Chris Harding, Director for Interoperability, The Open Group. Dr. Harding discussed how society has evolved into a digital society.

1970 is considered the dawn of an epoch that saw the first RAM chip, IBM’s introduction of the System/370 and a new operating system, UNIX®. Examples of digital progress since that era include driverless cars and Smart Cities (management of traffic, energy, water and communication).

Digital society enablers are digital structural change and corporate social media. The benefits are open innovation, open access, open culture, open government and delivering more business value.

Dr. Harding also noted that standards are essential to innovation and enable markets based on integration. The Open Group Open Platform 3.0™ is using ArchiMate®, an Open Group standard, to analyze the 30+ business use-cases produced by the Forum. The development cycle is understanding, analysis, specification and iteration.

Dr. Harding emphasized the importance of Boundaryless Information Flow™, as an enabler of business objectives and efficiency through IT standards in the era of digital technology, and designed for today’s agile enterprise with direct involvement of business users.

Both sessions concluded with an interactive audience Q&A hosted by Allen Brown.

The last session of the morning’s plenary was a panel: The Internet of Things and Interoperability. Dana Gardner, Principal Analyst at Interarbor Solutions, moderated the panel. Participating in the panel were Said Tabet, CTO for Governance, Risk and Compliance Strategy, EMC; Penelope Gordon, Emerging Technology Strategist, 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant, Smarter Cities, Water & Transportation, IBM; and Dave Lounsbury, CTO, The Open Group.

IoT Panel – Gardner, Barsoum, Tabet, Lounsbury, Gordon

The panel explored the practical limits and opportunities of the Internet of Things (IoT). Areas discussed included obstacles to decision-making as big data becomes more prolific, along with the openness, governance and connectivity of things, data and people, which pertain to many industries such as smart cities, manufacturing and healthcare.

How do industries, organizations and individuals deal with IoT? This is not necessarily a new problem, but an accelerated one. There are new areas of interoperability but where does the data go and who owns the data? Openness is important and governance is essential.

What needs to change most to see the benefits of the IoT? The panel agreed there needs to be a push for innovation, increased education, a move beyond models of humans managing the interface (i.e., toward machine-to-machine) and a focus on determining which data are most important, rather than always collecting all the data.

A podcast and transcript of the Internet of Things and Interoperability panel will be posted soon.

The afternoon was divided into several tracks: Boundaryless Information Flow™, Open Platform 3.0™ and Enterprise Architecture (EA) & Enterprise Transformation. Best Practices for Enabling Boundaryless Information Flow across the Government was presented by Syed Husain, Consultant Enterprise Architecture, Saudi Arabia E-government Authority. Robert K. Pucci, CTO, Communications Practice, Cognizant Technology Solutions discussed Business Transformation Justification Leveraging Business and Enterprise Architecture.

The evening concluded with a lively networking reception at the hotel.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

 


New Health Data Deluges Require Secure Information Flow Enablement Via Standards, Says The Open Group’s New Healthcare Director

By The Open Group

Below is the transcript of The Open Group podcast on how new devices and practices have the potential to expand the information available to Healthcare providers and facilities.

Listen to the podcast here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview coming to you in conjunction with The Open Group’s upcoming event, Enabling Boundaryless Information Flow™ July 21-22, 2014 in Boston.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator for the series of discussions from the conference on Boundaryless Information Flow, Open Platform 3.0™, Healthcare, and Security issues.

One area of special interest is the Healthcare arena, and Boston is a hotbed of innovation and adaptation in how technology, Enterprise Architecture, and standards can improve communication and collaboration among Healthcare ecosystem players.

And so, we’re joined by a new Forum Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry to rapid change.

With that, please join me now in welcoming our guest. We’re here with Jason Lee, Healthcare and Security Forums Director at The Open Group. Welcome, Jason.

Jason Lee: Thank you so much, Dana. Good to be here.

Gardner: Great to have you. I’m looking forward to the Boston conference and want to remind our listeners and readers that it’s not too late to sign up. You can learn more at http://www.opengroup.org.

Jason, let’s start by talking about the relationship between Boundaryless Information Flow, which is a major theme of the conference, and healthcare. Healthcare perhaps is the killer application for Boundaryless Information Flow.

Lee: Interesting, I haven’t heard it referred to that way, but healthcare is 17 percent of the US economy. It’s upwards of $3 trillion. The costs of healthcare are a problem, not just in the United States, but all over the world, and there are a great number of inefficiencies in the way we practice healthcare.

We don’t necessarily intend to be inefficient, but there are so many places and people involved in healthcare, it’s very difficult to get them to speak the same language. It’s almost as if you’re in a large house with lots of different rooms, and every room you walk into they speak a different language. To get information to flow from one room to the other requires some active efforts and that’s what we’re undertaking here at The Open Group.

Gardner: What is it about the current collaboration approaches that don’t work? Obviously, healthcare has been around for a long time and there have been different players involved. What’s the hurdle? What prevents a nice, seamless, easy flow and collaboration in information that gets better outcomes? What’s the holdup?

Lee: There are many ways to answer that question, because there are many barriers. Perhaps the simplest is the transformation of healthcare from a paper-based industry to a digital industry. Everyone has walked into an office, looked behind the people at the front desk, and seen file upon file and row upon row of folders, information that’s kept in a written format.

When there’s been movement toward digitizing that information, not everyone has used the same system. It’s almost like trains running on a different gauge track. Obviously if the track going east to west is a different gauge than going north to south, then trains aren’t going to be able to travel on those same tracks. In the same way, healthcare information does not flow easily from one office to another or from one provider to another.

Gardner: So not only do we have disparate strategies for collecting and communicating health data, but we’re also seeing much larger amounts of data coming from a variety of new and different places. Some of them now even involve sensors inside of patients themselves or devices that people will wear. So is the data deluge, the volume, also an issue here?

Lee: Certainly. I heard recently that an integrated health plan, which has multiple hospitals involved, contains more elements of data than the Library of Congress. As information is collected at multiple points in time, over a relatively short period of time, you really do have a data deluge. Figuring out how to find your way through all the data and look at the most relevant for the patient is a great challenge.

Gardner: I suppose the bad news is that there is this deluge of data, but it’s also good news, because more data means more opportunity for analysis, a better ability to predict and determine best practices, and also provide overall lower costs with better patient care.

So it seems like the stakes are rather high here to get this right, to not just crumble under a volume or an avalanche of data, but to master it, because it’s perhaps the future. The solution is somewhere in there too.

Lee: No question about it. At The Open Group, our focus is on solutions. We, like others, put a great deal of effort into describing the problems, but our real work is figuring out how to bring IT technologies to bear on business problems, how to encourage different parts of organizations to speak to one another and organizations across an industry to speak the same language, and how to operate using common standards. That’s really what we’re all about.

And it is, in a large sense, part of the process of helping to bring healthcare into the 21st Century. A number of industries are a couple of decades ahead of healthcare in the way they use large datasets — big data, some people refer to it as. I’m talking about companies like big department stores and large online retailers. They really have stepped up to the plate and are using that deluge of data in ways that are very beneficial to them, and healthcare can do the same. We’re just not quite at the same level of evolution.

Gardner: And to your point, the stakes are so much higher. Retail is, of course, a big deal in the economy, but as you pointed out, healthcare is a much larger segment. So just making modest improvements in communication, collaboration, or data analysis can reap huge rewards.

Lee: Absolutely true. There is the cost side of things, but there is also the quality side. So there are many ways in which healthcare can improve through standardization and coordinated development, using modern technology that cannot just reduce cost, but improve quality at the same time.

Gardner: I’d like to get into a few of the hotter trends, but before we do, it seems that The Open Group has recognized the importance here by devoting the entire second day of its conference in Boston, July 22, to Healthcare.

Maybe you could give us a brief overview of what participants, including those who come in online and view recorded sessions of the conference at http://new.livestream.com/opengroup, should expect? What’s going to happen on July 22nd?

Lee: We have a packed day. We’re very excited to have Dr. Joe Kvedar, a physician at Partners HealthCare and Founding Director of the Center for Connected Health, as our first plenary speaker. The title of his presentation is “Making Health Addictive.” Dr. Kvedar is a widely respected expert on mobile health, which is currently the Healthcare Forum’s top work priority. As mobile medical devices become ever more available and diversified, they will enable consumers to know more about their own health and wellness. A great deal of potentially useful health data will be generated. How this information can be used, not just by consumers but also by the healthcare establishment that takes care of them as patients, will become a question of increasing importance. It will become an area where standards development and The Open Group can be very helpful.

Our second plenary speaker, Proteus Duxbury, Chief Technology Officer at Connect for Health Colorado, will discuss a major feature of the Affordable Care Act, the health insurance exchanges, which are designed to bring health insurance to tens of millions of people who previously did not have access to it. Mr. Duxbury is going to talk about how Enterprise Architecture, which is really about getting to solutions by helping the IT folks talk to the business folks and vice versa, has helped the State of Colorado develop its Health Insurance Exchange.

After the plenaries, we will break up into three tracks, one of which is Healthcare-focused. In this track there will be three presentations, all of which discuss how Enterprise Architecture and the approach to Boundaryless Information Flow can help healthcare and healthcare decision-makers become more effective and efficient.

One presentation will focus on the transformation of care delivery at the Visiting Nurse Service of New York. Another will address stewarding healthcare transformation using Enterprise Architecture, focusing on one of our Platinum members, Oracle, and a company called Intelligent Medical Objects, and how they’re working together in a productive way, bringing IT and healthcare decision-making together.

Then, the final presentation in this track will focus on the development of an Enterprise Architecture-based solution at an insurance company. The payers, or insurers (the big companies that are responsible for paying bills and collecting premiums), have a very important role in the healthcare system that extends beyond administration of benefits. Yet payers are not always recognized for their key responsibilities and capabilities in the area of clinical and cost improvements.

With the increase in payer data brought on in large part by the adoption of a new coding system, ICD-10, which will come online this year, a huge amount of additional data, including clinical data, will become available. At The Open Group, we consider payers, meaning health insurance companies (some of which are integrated with providers), to be very important stakeholders in the big picture.

In the afternoon, we’re going to switch gears a bit and have a speaker talk about the challenges, the barriers, the “pain points” in introducing new technology into healthcare systems. The focus will return to remote or mobile medical devices and the predictable but challenging barriers to getting newly generated health information to flow to doctors’ offices and into patients’ records, electronic health records, and hospitals’ data-keeping and data-sharing systems.

We’ll have a panel of experts that responds to these pain points, these challenges, and then we’ll draw heavily from the audience, who we believe will be very, very helpful, because they bring a great deal of expertise in guiding us in our work. So we’re very much looking forward to the afternoon as well.

Gardner: It’s really interesting. A couple of these different plenaries and discussions in the afternoon come back to this user-generated data. Jason, we really seem to be on the cusp of a whole new level of information that people will be able to develop from themselves through their lifestyle, new devices that are connected.

We hear from folks like Apple, Samsung, Google, and Microsoft. They’re all pulling together information and making it easier for people to not only monitor their exercise, but their diet, and maybe even start to use sensors to keep track of blood sugar levels, for example.

In fact, a new Flurry Analytics survey showed a 62 percent increase in the use of health and fitness applications over the last six months on popular mobile devices. This compares to a 33 percent increase in other applications in general. So there’s an 87 percent faster uptick in the use of health and fitness applications.

Tell me a little bit how you see this factoring in. Is this a mixed blessing? Will so much data generated from people in addition to the electronic medical records, for example, be a bad thing? Is this going to be a garbage in, garbage out, or is this something that could potentially be a game-changer in terms of how people react to their own data and then bring more data into the interactions they have with care providers?

Lee: It’s always a challenge to predict what the market is going to do, but I think that’s a remarkable statistic that you cited. My prediction is that the increased volume of person-generated data from mobile health devices is going to be a game-changer. This view also reflects how the Healthcare Forum members (which include Capgemini, Philips, IBM, Oracle and HP) view the future.

The commercial demand for mobile medical devices, things that can be worn, embedded, or swallowed, as in pills, as you mentioned, is growing ever stronger. The software and the applications that will be developed to be used with these devices are going to grow by leaps and bounds. As you say, there are big players getting involved. Already some of the pedometer-type devices that measure the number of steps taken in a day have captured the interest of many, many people. Even David Sedaris, serious guy that he is, was writing about it recently in ‘The New Yorker’.

What we will find is that many of the health indicators that we used to have to go to the doctor or nurse or lab to get information on will become available to us through these remote devices.

There will be a question, of course, as to the reliability and validity of the information, to your point about garbage in, garbage out, but I think standards development will help here. This, again, is where The Open Group comes in. We might also see the FDA exercising its role in ensuring safety here, as well as other organizations, in determining which devices are reliable.

The Open Group is working in the area of mobile data and the information systems that are developed around it, and their ability to (a) talk to one another and (b) talk to the data devices and infrastructure used in doctors’ offices and in hospitals. This is called interoperability, and it’s certainly lacking in this country.

There are already problems around interoperability and connectivity of information in the healthcare establishment as it is now. When patients and consumers start collecting their own data, and the patient is put at the center of the nexus of healthcare, the question becomes: how does the information that patients collect get back to the doctor or clinician in ways in which the data can be trusted and are helpful?

After all, if a patient is wearing a medical device, there is the opportunity to collect data, about blood sugar levels let’s say, throughout the day. And this is really taking healthcare outside the four walls of the clinic and bringing information to bear that can be very, very useful to clinicians and beneficial to patients.

In short, the rapid market dynamic in mobile medical devices, and in the software and hardware that facilitate interoperability, begs for standards-based solutions that reduce costs and improve quality, all of which puts the patient at the center. This is The Open Group Healthcare Forum’s sweet spot.

Gardner: It seems to me a real potential game-changer as well, and that something like Boundaryless Information Flow and standards will play an essential role. Because one of the big question marks with many of the ailments in a modern society has to do with lifestyle and behavior.

So often, the providers of the care only really have the patient’s responses to questions, but imagine having a trove of data at their disposal, a 360-degree view of the patient to then further the cause of understanding what’s really going on, on a day-to-day basis.

But then, it’s also having a two-way street, being able to deliver perhaps in an automated fashion reinforcements and incentives, information back to the patient in real-time about behavior and lifestyles. So it strikes me as something quite promising, and I look forward to hearing more about it at the Boston conference.

Any other thoughts on this issue about patient flow of data, not just among and between providers and payers, for example, or providers in an ecosystem of care, but with the patient as the center of it all, as you said?

Lee: As more mobile medical devices come to the market, we’ll find that consumers own multiple types of devices, at least some of which collect multiple types of data. So even for the patient at the center of their own healthcare information collection, there can be barriers to having one device talk to another. If a patient wants to keep their own personal health record, there may be difficulties in bringing all that information into one place.

So the interoperability issue, the need for standards, guidelines, and voluntary consensus among stakeholders about how information is represented becomes an issue, not just between patients and their providers, but for individual consumers as well.

Gardner: And also the cloud providers. There will be a variety of large organizations with cloud-modeled services, and they are going to need to be, in some fashion, brought together, so that a complete 360-degree view of the patient is available when needed. It’s going to be an interesting time.

Of course, we’ve also looked at many other industries and tried to achieve cloud synergy, a cloud-of-clouds approach to data and transactions. So it’s interesting how what’s going on in multiple industries is common, but it strikes me that, again, the scale and the impact of the healthcare industry make it a leader now, and perhaps a driver for some of these long-overdue structured and standardized activities.

Lee: It could become a leader. There is no question about it. Moreover, there is a lot Healthcare can learn from other industries: from the mistakes other companies have made, from the lessons they have learned, and from the best practices they have developed (on both the content and process sides). And there are issues, around security in particular, where Healthcare will be at the leading edge in trying to figure out how much is enough, how much is too much, and what kinds of solutions work.

There’s a great future ahead here. It’s not going to be without bumps in the road, but organizations like The Open Group are designed and experienced to help multiple stakeholders come together and have the conversations that they need to have in order to push forward and solve some of these problems.

Gardner: Well, great. I’m sure there will be a lot more about how to actually implement some of those activities at the conference. Again, that’s going to be in Boston, beginning on July 21, 2014.

We’ll have to leave it there. We’re about out of time. We’ve been talking with a new Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes and efficiencies is pushing the Healthcare industry to rapid change. And, as we’ve heard, that might very well spill over into other industries as well.

So we’ve seen how innovation and adaptation around technology, Enterprise Architecture and standards can improve the communication and collaboration among Healthcare ecosystem players.

It’s not too late to register for The Open Group Boston 2014 (http://www.opengroup.org/boston2014) and join the conversation via Twitter #ogchat #ogBOS, where you will be able to learn more about Boundaryless Information Flow, Open Platform 3.0, Healthcare and other relevant topics.

So a big thank you to our guest. We’ve been joined by Jason Lee, Healthcare and Security Forums Director at The Open Group. Thanks so much, Jason.

Lee: Thank you very much.



The Open Group Boston 2014 to Explore How New IT Trends are Empowering Improvements in Business

By The Open Group

The Open Group Boston 2014 will be held on July 21-22 and will cover the major issues and trends surrounding Boundaryless Information Flow™. Thought-leaders at the event will share their outlook on IT trends, capabilities, best practices and global interoperability, and how this will lead to improvements in responsiveness and efficiency. The event will feature presentations from representatives of prominent organizations on topics including Healthcare, Service-Oriented Architecture, Security, Risk Management and Enterprise Architecture. The Open Group Boston will also explore how cross-organizational collaboration and trends such as big data and cloud computing are helping to make enterprises more effective.

The event will consist of two days of plenaries and interactive sessions that will provide in-depth insight on how new IT trends are leading to improvements in business. Attendees will learn how industry organizations are seeking large-scale transformation and some of the paths they are taking to realize that.

The first day of the event will bring together subject matter experts in the Open Platform 3.0™, Boundaryless Information Flow™ and Enterprise Architecture spaces. The day will feature thought-leaders from organizations including Boston University, Oracle, IBM and Raytheon. One of the keynotes, from Marshall Van Alstyne, Professor at Boston University School of Management & Researcher at MIT Center for Digital Business, will reveal the secrets of internet-driven marketplaces. Other content:

• The Open Group Open Platform 3.0™ focuses on new and emerging technology trends converging with each other and leading to new business models and system designs. These trends include mobility, social media, big data analytics, cloud computing and the Internet of Things.
• Cloud security and the key differences in securing cloud computing environments vs. traditional ones as well as the methods for building secure cloud computing architectures
• Big Data as a service framework as well as preparing to deliver on Big Data promises through people, process and technology
• Integrated Data Analytics and using them to improve decision outcomes

The second day of the event will have an emphasis on Healthcare, with keynotes from Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health, and Connect for Health Colorado CTO Proteus Duxbury. The day will also showcase speakers from Hewlett Packard and Blue Cross Blue Shield, multiple tracks on a wide variety of topics such as Risk and Professional Development, and ArchiMate® tutorials. Key learnings include:

• Improving healthcare’s information flow is a key enabler to improving healthcare outcomes and implementing efficiencies within today’s delivery models
• Identifying the current state of IT standards and future opportunities which cover the healthcare ecosystem
• How ArchiMate® can be used by Enterprise Architects for driving business innovation with tried and true techniques and best practices
• Security and Risk Management evolving as software applications become more accessible through APIs – which can lead to vulnerabilities and the potential need to increase security while still understanding the business value of APIs

Member meetings will also be held on Wednesday and Thursday, July 23-24.

Don’t wait, register now to participate in these conversations and networking opportunities during The Open Group Boston 2014: http://www.opengroup.org/boston2014/registration

Join us on Twitter – #ogchat #ogBOS


Brand Marketing of Standards

By Allen Brown, President and CEO, The Open Group

Today everyone is familiar with the power of brands. Managed well, they can develop strong biases amongst customers for the product or service, resulting in greatly increased revenues and profits. Managed badly, they can destroy a product or an organization.

I was sitting in San Francisco International Airport one day. A very loud couple was looking for somewhere to get coffee. The wife said, “There’s a Peet’s right here.” Angrily the husband replied, “I don’t want Peet’s, I want Starbucks!”

A jewelry retailer in the UK had grown, in six years, from having 150 stores to more than 2,000, with 25,000 staff and annual sales of £1.2 billion. Then, at the Institute of Directors conference at the Royal Albert Hall in 1991, its chief executive told an audience of 5,000 business leaders the secret of his success. Describing his company’s products, he said: ‘We also do cut-glass sherry decanters complete with six glasses on a silver-plated tray that your butler can serve you drinks on, for £4.95. People say, “How can you sell this for such a low price?” I say, because it’s total crap.’ As if that were not enough, he added that his stores’ earrings were ‘cheaper than a prawn sandwich, but probably wouldn’t last as long’.

It was a joke that he had told before, but this time it got into the press. As soon as word got out, hordes of people queued at his stores to return everything from earrings to engagement rings. The company was destroyed.

The identity of a brand emerges through communication backed up by a promise to customers. That promise can be a promise of quality or service or innovation or style. Or it can be much less tangible: “people like you buy this product”, for example.

Early in my career, I worked for a company that was in the business of manufacturing and marketing edible oils and fats – margarines, cooking oils and cooking fat.   When first developed, margarine was simply a substitute for the butter that was in short supply in the UK during wartime. But when butter once again became plentiful, the product needed to offer other advantages to the consumer. Research focused on methods to improve the quality of margarine–such as making it easier to spread, more flavorful and more nutritious.

At the time there were many brands, each focused on a specific niche, which together amounted to something like a 95% market share. Stork Margarine was promoted as a low-cost butter substitute for working-class households, Blue Band Margarine was positioned slightly up-market, Tomor Margarine was aimed at the kosher community, Flora Margarine was marketed as recommended by doctors as being good for the heart, and so on. Today, Unilever continues to market these brands, amongst many others, successfully, although the positioning may be a little different.

Creating, managing and communicating brands is not inexpensive but the rewards can be significant. There are three critical activities that must be done well. The brand must be protected, policed and promoted.

Protection starts with ensuring that the brand is trademarked but it does not end there. Consistent and correct usage of the brand is essential – without that, a trademark can be challenged and the value of the brand and all that has been invested in it can be lost.

Policing is about identifying and preventing unauthorized or incorrect usage of the mark by others. Unauthorized usage can range from organizations using the brand to market their own products or services, all the way up to counterfeit copies of the branded products. Cellophane is a registered trademark in the UK and other countries, and the property of Innovia Films. However, in many countries “cellophane” has become a generic term, often used informally to refer to a wide variety of plastic film products, even those not made of cellulose, such as plastic wrap, thereby diminishing the value of the brand to its owner. Several other well-known and valuable marks have been lost through becoming generic, mostly because the brand owner did not insist on correct usage.

Promotion begins with identifying the target market and articulating the brand promise, the key purchase factors and the benefits. The target market can be consumers or organizations, but at the end of the day, people buy products or services, or vote for candidates seeking election, so it is important to segment and profile the target customers sufficiently and develop key messages for each segment.

Profiling has been around for a long time: the margarine example shows how it was used in the past. But today consumers, organizational buyers and voters have a plethora of messages targeted at them, through a broader than ever variety of media, so it is critical to be as precise as possible. Some of the best examples of profiling, such as “soccer moms” and “NASCAR dads”, were popularized through their usage in US presidential election campaigns.

In the mid-1990s, X/Open (now part of The Open Group) started using branding to promote the market adoption of open standards. The members of X/Open had developed a set of specifications aimed at enabling portability of applications between the UNIX® systems of competing vendors, called the X/Open Portability Guide, or XPG for short.

The target market was the buyers of UNIX systems. The brand promise was that any product that was supplied by the vendors that carried the X/Open brand conformed to the specification, would always conform and, in the event of any non-conformance being found, the vendor would, at their own cost, rectify the non-conformance for the customer within a prescribed period of time. To this day, there has only ever been one report of non-conformance, an obscure mathematical result, reported by an academic. The vendor concerned quickly rectified the issue, even though it was extremely unlikely that any customer would ever be affected by it.

The trademark license agreement signed by all vendors who used the X/Open brand carried the words “warrant and represent” in support of the brand promise. It was a significant commitment on the part of the vendors as it also carried with it significant risk and potential liability.   For these reasons, the vendors pooled their resources to fund the development of test suite software, so they could better understand the commitment they had entered into. These test suites were developed in stages and, over time, their coverage of the set of specifications grew.

It was only later that products had to be tested and certified before they could carry the X/Open brand.

The trademark was, of course, protected, policed and promoted. Procurements that could be identified, which were mostly government procurements, were recorded and totaled in excess of $50bn in a short period of time. Procurements by commerce and industry were more difficult to track, but were clearly significant.

The XPG brand program was enormously successful and has evolved to become the UNIX® brand program and, in spite of challenges from open source software, continues to deliver revenues for the vendors in excess of $30bn per annum.

When new brand programs are contemplated, an early concern of both vendors and customers is the cost. Customers worry that the vendors will pass the cost on to them; vendors worry that they will have to absorb the cost. In the case of XPG and UNIX, both sides looked not at the cost but at the benefits. For customers, even if the vendors had passed on the cost, the savings that could be achieved as a result of portability in a heterogeneous environment were orders of magnitude greater. For vendors, in a competitive environment, the price that they can charge customers for their products is dictated by the market, so their ability to pass the costs of the branding program directly on to the customer is limited. However, the reality is that the cost of the branding program pales into insignificance when spread over the revenue of related products. For one vendor we estimate the cost to be less than one hundredth of 1% of related revenue. Combine that with a preference from customers for branded products and everybody wins.
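
To put that fraction in concrete terms, here is a rough worked illustration; the $10bn of related revenue below is a hypothetical round number, not any particular vendor's figure:

\[
\frac{\text{branding cost}}{\text{related revenue}} < \frac{1}{100} \times 1\% = 10^{-4},
\qquad 10^{-4} \times \$10\,\text{bn} = \$1\,\text{m}
\]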

So the big question for vendors is: Do you see certification as a necessary cost to be kept as low as possible or do you see brand marketing of open standards, of which certification is a part, as a means to grow the market and your share of that market?

The big question for customers is: Do you want to negotiate and enforce a warranty with every vendor and in every contract or do you want the industry to do that for you and spread the cost over billions of dollars of procurements?

Allen Brown is President and CEO of The Open Group, a global consortium that enables the achievement of business objectives through IT standards. For over 15 years, Allen has been responsible for driving The Open Group’s strategic plan and day-to-day operations, including extending its reach into new global markets, such as China, the Middle East, South Africa and India. In addition, he was instrumental in the creation of the Association of Enterprise Architects (AEA), which was formed to increase job opportunities for all of its members and elevate their market value by advancing professional excellence.



The Open Group Open Platform 3.0™ Starts to Take Shape

By Dr. Chris Harding, Director for Interoperability, The Open Group

The Open Group published a White Paper on Open Platform 3.0™ at the start of its conference in Amsterdam in May 2014. This article, based on a presentation given at the conference, explains how the definition of the platform is beginning to emerge.

Introduction

Amsterdam is a beautiful place. Walking along the canals is like moving through a set of picture postcards. But as you look up at the houses beside the canals, and you see the cargo hoists that many of them have, you are reminded that the purpose of the arrangement was not to give pleasure to tourists. Amsterdam is a great trading city, and the canals were built as a very efficient way of moving goods around.

This is also a reminder that the primary purpose of architecture is not to look beautiful, but to deliver business value, though surprisingly, the two often seem to go together quite well.

When those canals were first thought of, it might not have been obvious that this was the right thing to do for Amsterdam. Certainly the right layout for the canal network would not be obvious. The beginning of a project is always a little uncertain, and seeing the idea begin to take shape is exciting. That is where we are with Open Platform 3.0 right now.

We started with the intention to define a platform to enable enterprises to get value from new technologies including cloud computing, social computing, mobile computing, big data, the Internet of Things, and perhaps others. We developed an Open Group business scenario to capture the business requirements. We developed a set of business use-cases to show how people are using and wanting to use those technologies. And that leads to the next step, which is to define the platform. All these new technologies and their applications sound wonderful, but what actually is Open Platform 3.0?

The Third Platform

Looking historically, the first platform was the computer operating system. A vendor-independent operating system interface was defined by the UNIX® standard. The X/Open Company and the Open Software Foundation (OSF), which later combined to form The Open Group, were created because companies everywhere were complaining that they were locked into proprietary operating systems. They wanted applications portability. X/Open specified the UNIX® operating system as a common application environment, and the value that it delivered was to prevent vendor lock-in.

The second platform is the World Wide Web. It is a common services environment, for services used by people browsing web pages or for web services used by programs. The value delivered is universal deployment and access. Any person or company anywhere can create a services-based solution and deploy it on the web, and every person or company throughout the world can access that solution.

Open Platform 3.0 is developing as a common architecture environment. This does not mean it is a replacement for TOGAF®. TOGAF is about how you do architecture and will continue to be used with Open Platform 3.0. Open Platform 3.0 is about what kind of architecture you will create. It will be a common environment in which enterprises can do architecture. The big business benefit that it will deliver is integrated solutions.


Figure 1: The Third Platform

With the second platform, you can develop solutions. Anyone can develop a solution based on services accessible over the World Wide Web. But independently-developed web service solutions will very rarely work together “out of the box”.

There is an increasing need for such solutions to work together. We see this need when looking at the Open Platform 3.0 technologies. People want to use these technologies together. There are solutions that use them, but they have been developed independently of each other and have to be integrated. That is why Open Platform 3.0 has to deliver a way of integrating solutions that have been developed independently.

Common Architecture Environment

The Open Group has recently published its first thoughts on Open Platform 3.0 in the Open Platform 3.0 White Paper. This lists a number of things that will eventually be in the Open Platform 3.0 standard. Many of these are common architecture artifacts that can be used in solution development. They will form a common architecture environment. They are:

  • Statement of need, objectives, and principles – this is not part of that environment of course; it says why we are creating it.
  • Definitions of key terms – clearly you must share an understanding of the key terms if you are going to develop common solutions or integrable solutions.
  • Stakeholders and their concerns – an understanding of these is an important aspect of an architecture development, and something that we need in the standard.
  • Capabilities map – this shows what the products and services that are in the platform do.
  • Basic models – these show how the platform components work with each other and with other products and services.
  • Explanation of how the models can be combined to realize solutions – this is an important point and one that the white paper does not yet start to address.
  • Standards and guidelines that govern how the products and services interoperate – these are not standards that The Open Group is likely to produce, they will almost certainly be produced by other bodies, but we need to identify the appropriate ones and probably in some cases coordinate with the appropriate bodies to see that they are developed.

The Open Platform 3.0 White Paper contains an initial statement of needs, objectives and principles, definitions of some key terms, a first-pass list of stakeholders and their concerns, and half a dozen basic models. The basic models are in an analysis of the business use-cases for Open Platform 3.0 that were developed earlier.

These are just starting points. The white paper is incomplete: each of the sections is incomplete in itself, and of course the white paper does not contain all the sections that will be in the standard. And it is all subject to change.

An Example Basic Model

The figure shows a basic model that could be part of the Open Platform 3.0 common architecture environment.


Figure 2: Mobile Connected Device Model

This is the Mobile Connected Device Model, one of the basic models that we identified in the White Paper. It comes up quite often in the use-cases.

The stack on the left is a mobile device. It has a user, it has apps, it has a platform which would probably be Android or iOS, it has infrastructure that supports the platform, and it is connected to the World Wide Web, because that’s part of the definition of mobile computing.

On the right you see, and this is a frequently encountered pattern, that you don’t just use your mobile device for running apps. Maybe you connect it to a printer, maybe you connect it to your headphones, maybe you connect it to somebody’s payment terminal, you can connect it to many things. You might do this through a Universal Serial Bus (USB). You might do it through Bluetooth. You might do it by Near Field Communications (NFC). You might use other kinds of local connection.

The device you connect to may be operated by yourself (e.g. if it is headphones), or by another organization (e.g. if it is a payment terminal). In the latter case you typically have a business relationship with the operator of the connected device.

That is an example of the basic models that came up in the analysis of the use-cases. It is captured in the White Paper. It is fundamental to mobile computing and is also relevant to the Internet of Things.
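
To make the shape of this model concrete, here is a minimal sketch in Python. It is purely illustrative: the class and field names are our own, not part of the White Paper or any standard. It simply encodes the elements described above, the device stack, its web connection, and the locally connected devices with their connection types and operators.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class LocalConnection(Enum):
    """Local connection types named in the model."""
    USB = "USB"
    BLUETOOTH = "Bluetooth"
    NFC = "NFC"


@dataclass
class ConnectedDevice:
    """A device the mobile device talks to locally (printer, headphones, payment terminal)."""
    name: str
    connection: LocalConnection
    operator: str  # the device's own user, or another organization such as a merchant


@dataclass
class MobileDevice:
    """The stack on the left of Figure 2: user, apps, platform, infrastructure."""
    user: str
    apps: List[str]
    platform: str               # e.g. "Android" or "iOS"
    infrastructure: str         # hardware and services supporting the platform
    web_connected: bool = True  # web connectivity is part of the definition of mobile computing
    connected_devices: List[ConnectedDevice] = field(default_factory=list)


# Example: a phone paying at a merchant's terminal over NFC, which implies
# a business relationship with the operator of the connected device.
phone = MobileDevice(
    user="consumer",
    apps=["payments", "music"],
    platform="Android",
    infrastructure="smartphone hardware",
    connected_devices=[
        ConnectedDevice("payment terminal", LocalConnection.NFC, operator="merchant"),
        ConnectedDevice("headphones", LocalConnection.BLUETOOTH, operator="consumer"),
    ],
)
```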

Access to Technologies

This figure captures our understanding of the need to obtain information from the new technologies (social media, mobile devices, sensors and so on), the need to process that information, maybe in the cloud, to manage it and, ultimately, to deliver it in a form where analysis and reasoning enable enterprises to take business decisions.


Figure 3: Access to Technologies

The delivery of information to improve the quality of decisions is the source of real business value.

User-Driven IT

The next figure captures a requirement that we picked up in the development of the business scenario.


Figure 4: User-Driven IT

Traditionally, you would have had the business users in the business departments of an enterprise, and pretty much everything else in the IT department. But we are seeing two big changes. One is that the business users are getting smarter and more able to use technology. The other is that they want to use technology themselves, or to have business technologists working closely with them, rather than accessing it indirectly through the IT department.

The systems provisioning and management is now often done by cloud service providers, and the programming and integration and helpdesk by cloud brokers, or by an IT department that plays a broker role, rather than working in the traditional way.

The business still needs to retain responsibility for the overall architecture and for compliance. If you do something against your company’s principles, your customers will hold you responsible. It is no defense to say, “Our broker did it that way.” Similarly, if you break the law, your broker does not go to jail, you do. So those things will continue to be more associated with the business departments, even as the rest is devolved.

In short, businesses have a new way of using IT that Open Platform 3.0 must and will accommodate.

Integration of Independently-Developed Solutions

The next figure illustrates how the integration of independently developed solutions can be achieved.


Figure 5: Architecture Integration

It shows two solutions, which come from the analysis of different business use-cases. They share a common model, which makes it much easier to integrate them. That is why the Open Platform 3.0 standard will define common models for access to the new technologies.

The Open Platform 3.0 standard will have other common artifacts: architectural principles, stakeholder definitions and descriptions, and so on. Independently-developed architectures that use them can be integrated more easily.

Enterprises develop their architectures independently, but engage with other enterprises in business ecosystems that require shared solutions. Increasingly, business relationships are dynamic, and there is no time to develop an agreed ecosystem architecture from scratch. Use of the same architecture platform, with a common architecture environment including elements such as principles, stakeholder concerns, and basic models, enables the enterprise architectures to be integrated, and shared solutions to be developed quickly.
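
By way of a loose analogy in code (our own illustration; the platform's common models are architecture artifacts, not programming interfaces), sharing a common model plays much the same role as two independently developed programs being written against the same abstract interface: a component built for one solution can then serve the other without a bespoke adapter.

```python
from abc import ABC, abstractmethod


class SensorModel(ABC):
    """A shared 'common model' that both solutions agree on (name is illustrative)."""

    @abstractmethod
    def read(self) -> float:
        """Return the sensor's current value."""


# Solution A, developed independently: a mobile smart-store stock monitor.
class StockMonitor:
    def low_stock(self, shelf_sensor: SensorModel, threshold: float) -> bool:
        return shelf_sensor.read() < threshold


# Solution B, developed independently: a building energy manager.
class EnergyManager:
    def over_budget(self, meter: SensorModel, budget_kwh: float) -> bool:
        return meter.read() > budget_kwh


# Because both solutions were written against the same common model,
# one concrete component can serve either of them unchanged.
class WeightSensor(SensorModel):
    def read(self) -> float:
        return 12.5  # stub reading for the sketch


sensor = WeightSensor()
print(StockMonitor().low_stock(sensor, threshold=20.0))    # True
print(EnergyManager().over_budget(sensor, budget_kwh=10))  # True
```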

Completing the Definition

How will we complete the definition of Open Platform 3.0?

The Open Platform 3.0 Forum recently published a set of 22 business use-cases – the Nexus of Forces in Action. These use-cases show the application of Social, Mobile and Cloud Computing, Big Data, and the Internet of Things in a wide variety of business areas.


Figure 6: Business Use-Cases

The figure comes from that White Paper and shows some of those areas: multimedia, social networks, building energy management, smart appliances, financial services, medical research, and so on.

Use-Case Analysis

We have started to analyze those use-cases. This is an ArchiMate model showing how our first business use-case, The Mobile Smart Store, could be realized.


Figure 7: Use-Case Analysis

As you look at it you see common models. Outlined on the left is a basic model that is pretty much the same as the original TOGAF Technical Reference Model. The main difference is the addition of a business layer (which shows how enterprise architecture has moved in the business direction since the TRM was defined).

But you also see that the same model appears in the use-case in a different place, as outlined on the right. It appears many times throughout the business use-cases.

Finally, you can see that the Mobile Connected Device Model has appeared in this use-case (outlined in the center). It appears in other use-cases too.

As we analyze the use-cases, we find common models, as well as common principles, common stakeholders, and other artifacts.

The Development Cycle

We have a development cycle: understanding the value of the platform by considering use-cases, analyzing those use-cases to derive common features, and documenting the common features in a specification.

Figure 8: The Development Cycle

The Open Platform 3.0 White Paper represents the very first pass through that cycle. Further passes will result in further White Papers, a snapshot, and ultimately The Open Platform 3.0 standard, no doubt in more than one version.

Conclusions

Open Platform 3.0 provides a common architecture environment. This enables enterprises to derive business value from social computing, mobile computing, big data, the Internet-of-Things, and potentially other new technologies.

Cognitive computing, for example, has been suggested as another technology that Open Platform 3.0 might in due course accommodate. What would that lead to? There would be additional use-cases, which would lead to further analysis, which would no doubt identify some basic models for cognitive computing, which would be added to the platform.

Open Platform 3.0 enables enterprise IT to be user-driven. There is a revolution in the way that businesses use IT. Users are becoming smarter and more able to use technology, and want to do so directly, rather than through a separate IT department. Business departments are taking on business technologists who understand how to use technology for business purposes. Some companies are closing their IT departments and using cloud brokers instead. In other companies, the IT department is taking on a broker role, sourcing technology that business people use directly. Open Platform 3.0 will be part of that revolution.

Open Platform 3.0 will deliver the ability to integrate solutions that have been independently developed. Businesses typically exist within one or more business ecosystems. Those ecosystems are dynamic: partners join, partners leave, and businesses cannot standardize the whole architecture across the ecosystem; it would be nice to do so but, by the time it was done, the business opportunity would be gone. Integration of independently developed architectures is crucial to the world of business ecosystems and delivering value within them.

Call for Input

The platform will deliver a common architecture environment, user-driven enterprise IT, and the ability to integrate solutions that have been independently developed. The Open Platform 3.0 Forum is defining it through an iterative process of understanding the content, analyzing the use-cases, and documenting the common features. We welcome input and comments from other individuals within and outside The Open Group and from other industry bodies.

If you have comments on the way Open Platform 3.0 is developing or input on the way it should develop, please tell us! You can do so by sending mail to platform3-input@opengroup.org or share your comments on our blog.

References

The Open Platform 3.0 White Paper: https://www2.opengroup.org/ogsys/catalog/W147

The Nexus of Forces in Action: https://www2.opengroup.org/ogsys/catalog/W145

TOGAF®: http://www.opengroup.org/togaf/

Dr. Chris Harding is Director for Interoperability at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0™ Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

2 Comments

Filed under architecture, Boundaryless Information Flow™, Cloud, Cloud/SOA, digital technologies, Open Platform 3.0, Service Oriented Architecture, Standards, TOGAF®, Uncategorized

Business Capabilities – Taking Your Organization into the Next Dimension

By Stuart Macgregor, Chief Executive, Real IRM Solutions

Decision-makers in large enterprises today face a number of paradoxes when it comes to implementing a business operating model and deploying Enterprise Architecture:

- How to stabilize and embed concrete systems that ensure control and predictability, but at the same time remain flexible and open to new innovations?

- How to employ new technology to improve the productivity of the enterprise and its staff in the face of continual pressures on the IT budget?

- How to ensure that Enterprise Architecture delivers tangible results today, but remains relevant in an uncertain future environment?

Answering these tough questions requires an enterprise to elevate its thinking beyond ‘business processes’ and develop a thorough understanding of its ‘business capabilities’. It demands that the enterprise optimizes and leverages these capabilities to improve every aspect of the business – from coal-face operations to blue-sky strategy.

Business capabilities articulate an organization’s inner workings: the people, processes, technology, tools, and content (information). Capabilities map the ways in which these components interface with one another, developing an intricate line-drawing of the entire organizational ecosystem at a technical and social level. By understanding its current business capabilities, an organization is armed with a strategic planning tool. We refer to what is known as the BIDAT framework, which addresses the business, information, data, applications and technology architecture domains.
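
As an illustration of how a capability description can serve as that planning tool, here is a deliberately simplified sketch in Python; the capability, roles and systems named are invented, and the fields only loosely echo the BIDAT domains.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a business capability record. Names and
# fields are hypothetical, not part of any Real IRM artifact.
@dataclass
class Capability:
    name: str
    people: list = field(default_factory=list)        # roles that exercise it
    processes: list = field(default_factory=list)     # processes that realize it
    applications: list = field(default_factory=list)  # supporting applications
    technology: list = field(default_factory=list)    # underlying platforms
    information: list = field(default_factory=list)   # content it depends on

ore_tracking = Capability(
    name="Ore Movement Tracking",
    people=["Dispatcher", "Mine Planner"],
    processes=["Load-and-haul scheduling"],
    applications=["Fleet management system"],
    technology=["GPS telemetry", "Mobile network"],
    information=["Haul cycle records"],
)

# A capability map is then a catalogue that can be queried, e.g. to
# find every capability that depends on a given application.
capability_map = [ore_tracking]
dependents = [c.name for c in capability_map
              if "Fleet management system" in c.applications]
print(dependents)  # ['Ore Movement Tracking']
```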

From this analysis, the journey to addressing the organization’s Enterprise Architecture estate begins. This culminates in the organization being able to dynamically optimize, add and improve on its capabilities as the external environment shifts and evolves. A BIDAT approach provides a permanent bridge between the two islands of business architecture and technology architecture.

Put another way, business capability management utilizes the right architectural solutions to deliver the business strategy. In this way, Enterprise Architecture is inextricably linked to capability management. It is the integrated architecture (combined with effective organizational change leadership) that develops the business capabilities and unleashes their power.

This can at times feel very conceptual and hard to apply to real-world environments. Perhaps the best recent example of tangible widespread implementations of a capability-based Enterprise Architecture approach is in South Africa’s minerals and mining sector.

Known as the Exploration and Mining Business Capability Reference Map, and published as part of a set of standards, this framework was developed by The Open Group Exploration, Mining, Metals and Minerals (EMMM™) Forum. Covering all levels of mining operations, from strategic planning and portfolio planning to program and project enablement, and based on the principles of open standards, this framework provides miners with a capability-based approach to information, processes, technology, and people.

The Reference Map isolates specific capabilities within mining organizations, analyzes them from multiple dimensions, and shows their various relationships to other parts of the organization. In the context of increased automation in the mining sector, this becomes an invaluable tool in determining those functions that are ripe for automation.

In this new dimension, this new era of business, there is no reason why achievements from the EMMM’s Business Capability Reference Map cannot be repeated in every industry, and in every mid- to large-scale enterprise throughout the globe.

For more information on joining The Open Group, please visit:  http://www.opengroup.org/getinvolved/becomeamember

For more information on joining The Open Group EMMM™ Forum, please visit:  http://opengroup.co.za/emmm

Stuart Macgregor is the Chief Executive of the South African company, Real IRM Solutions. Through his personal achievements, he has gained the reputation of an Enterprise Architecture and IT Governance specialist, both in South Africa and internationally.

Macgregor participated in the development of the Microsoft Enterprise Computing Roadmap in Seattle. He was then invited by John Zachman to Scottsdale, Arizona to present a paper on using the Zachman framework to implement ERP systems. In addition, Macgregor was selected as a member of both the SAP AG Global Customer Council for Knowledge Management, and of the panel that developed COBIT 3rd Edition Management Guidelines. He has also assisted a global Life Sciences manufacturer to define its IT Governance framework, and a major financial institution to define its global, regional and local IT organizational designs and strategy. He was also selected as a core member of the team that developed the South African Breweries (SABMiller) plc global IT strategy.

Stuart, as the lead researcher, assisted the IT Governance Institute in mapping COBIT 4.0 to TOGAF®. This mapping document was published by ISACA and The Open Group. More recently, he participated in the COBIT 5 development workshop held in London during May 2010.

Leave a comment

Filed under EMMMv™, Enterprise Architecture, Enterprise Transformation, Standards, Uncategorized

The Onion & The Open Group Open Platform 3.0™

By Stuart Boardman, Senior Business Consultant, KPN Consulting, and Co-Chair of The Open Group Open Platform 3.0™

The onion is widely used as an analogy for complex systems – from IT systems to mystical world views.

It’s a good analogy. From the outside it’s a solid whole but each layer you peel off reveals a new onion (new information) underneath.

And a slice through the onion looks quite different from the whole…

What (and how much) you see depends on where and how you slice it.

The Open Group Open Platform 3.0™ is like that. Use-cases for Open Platform 3.0 reveal multiple participants and technologies (Cloud Computing, Big Data Analytics, Social networks, Mobility and The Internet of Things) working together to achieve goals that vary by participant. Each participant’s goals represent a different slice through the onion.

The Ecosystem View
We commonly use the idea of peeling off layers to understand large ecosystems, which could be Open Platform 3.0 systems like the energy smart grid but could equally be the workings of a large cooperative or the transport infrastructure of a city. We want to know what is needed to keep the ecosystem healthy and what the effects could be of the actions of individuals on the whole and therefore on each other. So we start from the whole thing and work our way in.

The Service at the Centre of the Onion

If you’re the provider or consumer (or both) of an Open Platform 3.0 service, you’re primarily concerned with your slice of the onion. You want to be able to obtain and/or deliver the expected value from your service(s). You need to know as much as possible about the things that can positively or negatively affect that. So your concern is not the onion (ecosystem) as a whole but your part of it.

Right in the middle is your part of the service. The first level out from that consists of other participants with whom you have a direct relationship (contractual or otherwise). These are the organizations that deliver the services you consume directly to enable your own service.

One level out from that (level 2) are participants with whom you have no direct relationship but on whose services you are still dependent. It’s common in Platform 3.0 that your partners too will consume other services in order to deliver their services (see the use cases we have documented). You need to know as much as possible about this level, because whatever happens here can have a positive or negative effect on you.

One level further from the centre we find indirect participants who don’t necessarily deliver any part of the service but whose actions may well affect the rest. They could just be indirect materials suppliers. They could also be part of a completely different value network in which your level 1 or 2 “partners” participate. You can’t expect to understand this level in detail but you know that how that value network performs can affect your partners’ strategy or even their very existence. The knock-on impact on your own strategy can be significant.

We can conceive of more levels but pretty soon a law of diminishing returns sets in. At each level further from your own organization you will see less detail and more variety. That in turn means that there will be fewer things you can actually know (with any certainty) and not much more that you can even guess at. That doesn’t mean that the ecosystem ends at this point. Ecosystems are potentially infinite. You just need to decide how deep you can usefully go.
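
Those levels can be made precise: a participant’s level is simply its shortest dependency distance from your own service. A minimal sketch (with an invented ecosystem) computes the levels by breadth-first search:

```python
from collections import deque

# Illustrative only: participants and dependencies are made up.
# Edges point from a participant to the services it consumes.
depends_on = {
    "my-service":        ["payment-provider", "logistics-partner"],
    "payment-provider":  ["card-network"],
    "logistics-partner": ["fuel-supplier"],
    "card-network":      [],
    "fuel-supplier":     [],
}

# The "onion level" of each participant is its shortest dependency
# distance from the centre (your own service) - a breadth-first search.
def onion_levels(centre: str) -> dict:
    levels = {centre: 0}
    queue = deque([centre])
    while queue:
        node = queue.popleft()
        for neighbour in depends_on.get(node, []):
            if neighbour not in levels:
                levels[neighbour] = levels[node] + 1
                queue.append(neighbour)
    return levels

print(onion_levels("my-service"))
# {'my-service': 0, 'payment-provider': 1, 'logistics-partner': 1,
#  'card-network': 2, 'fuel-supplier': 2}
```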

Limits of the Onion
At a certain point one hits the limits of an analogy. If everybody sees their own organization as the centre of the onion, what we actually have is a bunch of different, overlapping onions.

And you can’t actually make onions overlap, so let’s not take the analogy too literally. Just keep it in mind as we move on. Remember that our objective is to ensure the value of the service we’re delivering or consuming. What we need to know therefore is what can change that’s outside of our own control and what kind of change we might expect. At each visible level of the theoretical onion we will find these sources of variety. How certain of their behaviour we can be will vary – with a tendency to the less certain as we move further from the centre of the onion. We’ll need to decide how, if at all, we want to respond to each kind of variety.

But that will have to wait for my next blog.

Stuart Boardman is a Senior Business Consultant with KPN Consulting, where he leads the Enterprise Architecture practice and consults to clients on Cloud Computing, Enterprise Mobility and The Internet of Everything. He is Co-Chair of The Open Group Open Platform 3.0™ Forum and was Co-Chair of the Cloud Computing Work Group’s Security for the Cloud and SOA project, and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by KPN, the Information Security Platform (PvIB) in The Netherlands and his previous employer CGI, as well as several Open Group white papers, guides and standards. He is a frequent speaker at conferences on the topics of Open Platform 3.0 and Identity.

2 Comments

Filed under Cloud, Cloud/SOA, Conference, Enterprise Architecture, Open Platform 3.0, Service Oriented Architecture, Standards, Uncategorized

ArchiMate® Users Group Meeting

By The Open Group

During a special ArchiMate® users group meeting on Wednesday, May 14 in Amsterdam, Andrew Josey, Director of Standards within The Open Group, presented on the ArchiMate certification program and adoption of the language. Andrew is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4.

ArchiMate®, a standard of The Open Group, is an open and independent modeling language for Enterprise Architecture that is supported by different vendors and consulting firms. ArchiMate provides instruments to enable Enterprise Architects to describe, analyze and visualize the relationships among business domains in an unambiguous way. ArchiMate is not an isolated development. The relationships with existing methods and techniques, like modeling languages such as UML and BPMN, and methods and frameworks like TOGAF and Zachman, are well-described.

In this talk, Andrew provided an overview of the ArchiMate 2 certification program, including information on the adoption of the ArchiMate modeling language. He gave an overview of the major milestones in the development of ArchiMate and referred to the Dutch origins of the language. The Dutch Telematica Institute created the ArchiMate language in the period 2002-2004, and the language is now widespread. There have been over 41,000 downloads of different versions of the ArchiMate specification from more than 150 countries. At 52%, The Netherlands is leading the “Top 10 Certifications by country”. However, the “Top 20 Downloads by country” is dominated by the USA (19%), followed by the UK (14%) and The Netherlands (12%). One of the tools developed to support ArchiMate is Archi, a free open-source tool created by Phil Beauvoir at the University of Bolton in the UK. Since its development, Archi has grown from a relatively small, home-grown tool to become a widely used open-source resource that averages 3,000 downloads per month and whose community ranges from independent practitioners to Fortune 500 companies. It is no surprise that, again, Archi is mostly downloaded in The Netherlands (17.67%), the United States (12.42%) and the United Kingdom (8.81%).

After these noteworthy facts and figures, Henk Jonkers took a deep dive into modeling risk and security. Henk Jonkers is a senior research consultant, involved in BiZZdesign’s innovations in the areas of Enterprise Architecture and engineering. He was one of the main developers of the ArchiMate language, an author of the ArchiMate 1.0 and 2.0 Specifications, and is actively involved in the activities of the ArchiMate Forum of The Open Group. In this talk, Henk showed several examples of how risk and security aspects can be incorporated in Enterprise Architecture models using the ArchiMate language. He also explained how the resulting models could be used to analyze risks and vulnerabilities in the different architectural layers, and to visualize the business impact that they have.

First Henk described the limitations of current approaches – existing information security and risk management methods do not systematically identify potential attacks. They are based on checklists, heuristics and experience. Security controls are applied in a bottom-up way and are not based on a thorough analysis of risks and vulnerabilities. There is no explicit definition of security principles and requirements. Existing systems focus only on IT security. They have difficulties in dealing with complex attacks on socio-technical systems, combining physical and digital access, and social engineering. Current approaches focus on preventive security controls, and corrective and curative controls are not considered. Security by Design is a must, and there is always a trade-off between risk and process criticality. Henk gave some arguments as to why ArchiMate provides the right building blocks for a solid risk and security architecture. ArchiMate is widely accepted as an open standard for modeling Enterprise Architecture and support is widely available. ArchiMate is also suitable as a basis for qualitative and quantitative analysis. And last but not least: there is a good fit with other Enterprise Architecture and security frameworks (TOGAF, Zachman, SABSA).
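
The kind of layered analysis Henk described can be sketched in miniature: a vulnerability in a technology-layer element is propagated upward along dependency relationships to find the exposed business processes. The model below is invented and deliberately ignores ArchiMate’s much richer relationship semantics; it only illustrates the principle:

```python
# Illustrative sketch of layered impact analysis: a vulnerability in a
# technology-layer element is propagated upward along "supports"
# relationships to see which business processes are exposed.
# Elements and relations are hypothetical.
supports = {
    "db-server":       ["claims-app"],       # technology -> application
    "claims-app":      ["claims-handling"],  # application -> business
    "claims-handling": [],
}

def business_impact(vulnerable_element: str) -> set:
    impacted, frontier = set(), [vulnerable_element]
    while frontier:
        element = frontier.pop()
        for dependent in supports.get(element, []):
            if dependent not in impacted:
                impacted.add(dependent)
                frontier.append(dependent)
    return impacted

print(business_impact("db-server"))  # {'claims-app', 'claims-handling'}
```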

“The nice thing about standards is that there are so many to choose from”, emeritus professor Andrew Stuart Tanenbaum once said. Using this quote as a starting point, Gerben Wierda focused his speech on the relationship between the ArchiMate language and Business Process Model and Notation (BPMN). In particular, he discussed Bruce Silver’s BPMN Method and Style. He stated that ArchiMate and BPMN can exist side by side. Why would you link BPMN and ArchiMate? According to Gerben, there is a fundamental vision behind all of this. “There are unavoidably many ‘models’ of the enterprise that are used. We cannot reduce that to one single model because of fundamentally different uses. We even cannot reduce that to a single meta-model (or pattern/structure) because of fundamentally different requirements. Therefore, what we need to do is look at the documentation of the enterprise as a collection of models with different structures. And what we thus need to do is make this collection coherent.”

Gerben is Lead Enterprise Architect of APG Asset Management, one of the largest fiduciary managers (± €330 billion assets under management) in the world, with offices in Heerlen, Amsterdam, New York, Hong Kong and Brussels. He has overseen the construction of one of the largest single ArchiMate models in the world to date and is the author of the book “Mastering ArchiMate”, based on his experience in large-scale ArchiMate modeling. In his speech, Gerben showed how the leading standards ArchiMate and BPMN (an OMG standard) can be used together, creating one structured, logically coherent and automatically synchronized description that combines architecture and process details.
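
One lightweight way to picture such a coherent collection of models is an explicit correspondence table between the ArchiMate and BPMN model sets that can be checked automatically. This is a sketch only; the identifiers are invented, and real tooling is far richer:

```python
# A minimal sketch of the "coherent collection of models" idea:
# ArchiMate and BPMN models are kept separate, and coherence is made
# explicit in a correspondence table checked by a simple report.
# All identifiers are hypothetical.
archimate_processes = {"bp-001": "Handle Claim", "bp-002": "Onboard Client"}
bpmn_processes      = {"proc-claim": "Handle Claim"}

correspondence = {"bp-001": "proc-claim"}  # ArchiMate id -> BPMN id

def coherence_report():
    for am_id, name in archimate_processes.items():
        bpmn_id = correspondence.get(am_id)
        if bpmn_id is None:
            print(f"'{name}' ({am_id}) has no BPMN detail model")
        elif bpmn_processes.get(bpmn_id) != name:
            print(f"'{name}' ({am_id}) and {bpmn_id} have diverged")

coherence_report()  # flags 'Onboard Client' as lacking a BPMN model
```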

Marc Lankhorst, Managing Consultant and Service Line Manager Enterprise Architecture at BiZZdesign, presented on the topic of capability modeling in ArchiMate. As an internationally recognized thought leader on Enterprise Architecture, he guides the development of BiZZdesign’s portfolio of services, methods, techniques and tools in this field. Marc is also active as a consultant in government and finance. In the past, he has managed the development of the ArchiMate language for Enterprise Architecture modeling, now a standard of The Open Group. Marc is a certified TOGAF 9 Enterprise Architect and holds an MSc in Computer Science from the University of Twente and a PhD from the University of Groningen in the Netherlands. In his speech, Marc discussed different notions of “capability” and outlined the ways in which these might be modeled in ArchiMate. In short, a business capability is something an enterprise does or can do, given the various resources it possesses. Marc described the use of capability-based planning as a way of translating enterprise strategy into architectural choices, and looked ahead at potential extensions of ArchiMate for capability modeling. Business capabilities provide a high-level view of the current and desired abilities of the organization, in relation to strategy and environment. Enterprise Architecture practitioners design extensive models of the enterprise, but these are often difficult to communicate to business leaders. Capabilities form a bridge between the business leaders and the Enterprise Architecture practitioners. They are very helpful in business transformation and are the rationale behind capability-based planning, he concluded.

For more information on ArchiMate, please visit:

http://www.opengroup.org/subjectareas/enterprise/archimate

For information on the Archi tool, please visit: http://www.archimatetool.com/

For information on joining the ArchiMate Forum, please visit: http://www.opengroup.org/getinvolved/forums/archimate

1 Comment

Filed under ArchiMate®, Certifications, Conference, Enterprise Architecture, Enterprise Transformation, Professional Development, Standards, TOGAF®

The Open Group Summit Amsterdam 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On Tuesday, May 13, day two of The Open Group Summit Amsterdam, the morning plenary began with a welcome from The Open Group President and CEO Allen Brown. He presented an overview of the Forums and the corresponding Roadmaps. He described the process of standardization, from the initial work to a preliminary standard, including review documents, whitepapers and snapshots, culminating in the final publication of an open standard. Brown also announced that Capgemini is again a Platinum member of The Open Group and contributes to the realization of the organization’s objectives in various ways.

Charles Betz, Chief Architect, Signature Client Group, AT&T and Karel van Zeeland, Lead IT4IT Architect, Shell IT International, presented the second keynote of the morning, ‘A Reference Architecture for the Business of IT’. When the IT Value Chain and IT4IT Reference Architecture are articulated, instituted and automated, the business can experience huge cost savings in IT and significantly improved response times for IT service delivery, as well as increased customer satisfaction.

Karel van Zeeland, Charles Betz and Allen Brown

In 1998, when Shell Information Technology started to restructure its IT management, the chaos was complete. There were too many tools, too many vendors, a lack of integration, no common data model, a variety of user interfaces and no standards to support rapid implementation. With more than 28 different solutions for incident management and more than 160 repositories of configuration data, the complexity was immense. An unclear relationship with Enterprise Architecture and other architectural issues made the case even worse.

Restructuring IT management turned out to be a long journey for the Shell managers. How do you manage 1,700 locations in 90 countries, 8,000 applications, 25,000 servers, dozens of global and regional datacenters and 125,000 PCs and laptops, when at the same time you are confronted with trends like BYOD, mobility, cloud computing, security, big data and the Internet of Things (IoT)? According to Betz and van Zeeland, IT4IT is a promising platform for the evolution of the IT profession, and it has the potential to become a full open standard for managing the business of IT.

Jeroen Tas, CEO of Healthcare Informatics Solutions and Services within Philips Healthcare, explained in his keynote speech, “Philips is becoming a software company”. Digital solutions connect and streamline workflow across the continuum of care to improve patient outcomes. Today, big data is supporting adaptive therapies. Smart algorithms are used for early warning and active monitoring of patients in remote locations. Tas has a dream: he wants to make a valuable contribution to a connected healthcare world for everyone.

In January 2014, Royal Philips announced the formation of Healthcare Informatics Solutions and Services, a new business group within Philips’ Healthcare sector that offers hospitals and health systems the customized clinical programs, advanced data analytics and interoperable, cloud-based platforms necessary to implement new models of care. Tas, who previously served as the Chief Information Officer of Philips, leads the group.

In January of this year, The Open Group launched The Open Group Healthcare Forum, which focuses on bringing Boundaryless Information Flow™ to the healthcare industry, enabling data to flow more easily throughout the complete healthcare ecosystem.

Ed Reynolds, HP Fellow, responsible for HP Enterprise Security Services in the US, described the role of information risk in a new technology landscape. How do C-level executives think about risk? This is a relevant and urgent question, because it can take more than 243 days before a data breach is detected. Last year, the average cost associated with a data breach increased 78% to 11.9 million dollars. Critical data assets may be of strategic national importance, have massive corporate value or have huge significance to an employee or citizen, be it the secret recipe of Coca-Cola or the medical records of a patient. “Protect your crown jewels” is the motto.

Bart Seghers, Cyber Security Manager, Thales Security and Henk Jonkers, Senior Research Consultant at BiZZdesign, visualized the business impact of technical cyber risks. Attacks on information systems are becoming increasingly sophisticated. Organizations are increasingly networked and thus more complex. Attacks combine digital and physical access with social engineering, while the departments responsible for each of these domains within an organization operate in silos. Current risk management methods cannot handle the resulting complexity. Therefore they are using ArchiMate® as the basis for a risk and security architecture. ArchiMate is a widely accepted open standard for modeling Enterprise Architecture. There is also a good fit with other EA and security frameworks, such as TOGAF®. A pentest-based Business Impact Assessment (BIA) is a powerful management dashboard that increases the return on investment for your Enterprise Architecture effort, they concluded.

Risk Management was also a hot topic during several sessions in the afternoon. Moderator Jim Hietala, Vice President, Security at The Open Group, hosted a panel discussion on Risk Management.

In the afternoon several international speakers covered topics including Enterprise Architecture & Business Value, Business & Data Architecture and Open Platform 3.0™. In relation to social networks, Andy Jones, Technical Director, EMEA, SOA Software, UK, presented “What Facebook, Twitter and Netflix Didn’t Tell You”.

The Open Group veteran Dr. Chris Harding, Director for Interoperability at The Open Group, and panelists discussed and emphasized the importance of The Open Group Open Platform 3.0™. The session also featured a live Q&A via Twitter #ogchat, #ogop3.

The podcast is now live. Here are the links:

Briefings Direct Podcast Home Page: http://www.briefingsdirect.com/

PODCAST STREAM: http://traffic.libsyn.com/interarbor/BriefingsDirect-The_Open_Group_Amsterdam_Conference_Panel_Delves_into_How_to_Best_Gain_Business_Value_From_Platform_3.mp3

PODCAST SUMMARY: http://briefingsdirect.com/the-open-group-amsterdam-panel-delves-into-how-to-best-gain-business-value-from-platform-30

In the evening, The Open Group hosted a tour and dinner experience at the world-famous Heineken Brewery.

For those of you who attended the summit, please give us your feedback! https://www.surveymonkey.com/s/AMST2014

Leave a comment

Filed under ArchiMate®, Boundaryless Information Flow™, Certifications, Enterprise Architecture, Enterprise Transformation, Healthcare, Open Platform 3.0, RISK Management, Standards, TOGAF®, Uncategorized

Improving Patient Care and Reducing Costs in Healthcare

By Jason Lee, Director of Healthcare and Security Forums, The Open Group

Recently, The Open Group Healthcare Forum hosted a tweet jam to discuss IT and Enterprise Architecture (EA) issues as they relate to two of the most persistent problems in healthcare: reducing costs and improving patient care. Below I summarize the key points that followed from a rather unique discussion. Unique how? Unique in that rather than address these issues from the perspective of “must do” priorities (including EHR implementation, transitioning to ICD-10, and meeting enhanced HIPAA security requirements), we focused on “should do” opportunities.

We asked how stakeholders in the healthcare system can employ “Boundaryless Information Flow™” and standards development through the application of EA approaches that have proven effective in other industries to add new insights and processes to reduce costs and improve quality.

Question 1: What barriers exist for collaboration among providers in healthcare, and what can be done to improve things?
• tetradian: Huge barriers of language, terminology, mindset, worldview, paradigm, hierarchy, role and much more
• jasonsleephd: Financial, organizational, structural, lack of enabling technology, cultural, educational, professional insulation
• jim_hietala: EHRs with proprietary interfaces represent a big barrier in healthcare
• Technodad: Isn’t question really what barriers exist for collaboration between providers and patients in healthcare?
• tetradian: Communication b/w patients and providers is only one (type) amongst very many
• Technodad: Agree. Debate needs to identify whose point of view the #healthcare problem is addressing.
• Dana_Gardner: Where to begin? A Tower of Babel exists on multiple levels among #healthcare ecosystems. Too complex to fix wholesale.
• EricStephens: Also, legal ramifications of sharing information may impede sharing
• efeatherston: Patient needs provider collaboration to see any true benefit (I don’t just go to one provider)
• Dana_Gardner: Improve first by identifying essential collaborative processes that have most impact, and then enable them as secure services.
• Technodad: In US at least, solutions will need to be patient-centric to span providers- Bring Your Own Wellness (BYOW™) for HC info.
• loseby: Lack of shared capabilities & interfaces between EHRs leads to providers w/o comprehensive view of patient
• EricStephens: Are incentives aligned sufficiently to encourage collaboration? + lack of technology integration.
• tetradian: Vast numbers of stakeholder-groups, many beyond medicine – e.g. pharma, university, politics, local care (esp. outside of US)
• jim_hietala: Gap in patient-centric information flow
• Technodad: I think patients will need to drive the collaboration – they have more incentive to manage info than providers.
• efeatherston: Agreed, stakeholder list could be huge
• EricStephens: High-deductible plans will drive patients (us) to own our health care experience
• Dana_Gardner: Take patient-centric approach to making #healthcare processes better: drives adoption, which drives productivity, more adoption
• jasonsleephd: Who thinks standards development and data sharing is an essential collaboration tool?
• tetradian: not always patient-centric – e.g. epidemiology /public-health is population centric – i.e. _everything_ is ‘the centre’
• jasonsleephd: How do we break through barriers to collaboration? For one thing, we need to create financial incentives to collaborate (e.g., ACOs)
• efeatherston: Agreed, the challenge is to get them to challenge (if that makes sense). Many do not question
• EricStephens: Some will deify those in a lab coat.
• efeatherston: Still do, especially older generations, cultural
• Technodad: Agree – also displaying, fusing data from different providers, labs, monitors etc.
• dianedanamac: Online collaboration, can be cost effective & promote better quality but must financially incented
• efeatherston: Good point, unless there is a benefit/incentive for provider, they may not be bothered to try
• tetradian: “must financially incented” – often other incentives work better – money can be a distraction – also who pays?

Participants identified barriers that are not atypical: financial disincentives, underpowered technology, failure to utilize existing capability, lack of motivation to collaborate. Yet all participants viewed more collaboration as key. Consensus developed around:
• The patient (and by one commenter, the population) as the main driver of collaboration, and
• The patient as the most important stakeholder at the center of information flow.

Question 2: Does implementing remote patient tele-monitoring and online collaboration drive better and more cost-effective patient care?
• EricStephens: “Hell yes” comes to mind. Why drag yourself into a dr. office when a device can send the information (w/ video)
• efeatherston: Will it? Will those with high deductible plans have ability/understanding/influence to push for it?
• EricStephens: Driving up participation could drive up efficacy
• jim_hietala: Big opportunities to improve patient care thru remote tele-monitoring
• jasonsleephd: Tele-ICUs can keep patients (and money) in remote settings while receiving quality care
• jasonsleephd: Remote monitoring of patients admitted with CHF can reduce rehospitalization w/i 6 months @connectedhealth.org
• Dana_Gardner: Yes! Pacemakers now uplink to centralized analysis centers, communicate trends back to attending doctor. Just scratches surface
• efeatherston: Amen. Do that now, monthly uplink, annual check in with doctor to discuss any trends he sees.
• tetradian: Assumes tele-monitoring options even exist – very wide range of device-capabilities, from very high to not-much, and still not common.
• tetradian: (General request to remember that there’s more to the world, and medicine, than just the US and its somewhat idiosyncratic systems?)
• efeatherston: Yes, I do find myself looking through the lens of my own experiences, forgetting the way we do things may not translate
• jasonsleephd: Amen to point about our idiosyncrasies! Still, we have to live with them, and we can do so much better with good information flow!
• Dana_Gardner: Governments should remove barriers so more remote patient tele-monitoring occurs. Need to address the malpractice risks issue.
• TerryBlevins: Absolutely. Just want the information to go to the right place!
• Technodad: . Isn’t “right place” someplace you & all your providers can access? Need interoperability!
• TerryBlevins: It requires interoperability yes – the info must flow to those that must know.
• Technodad: Many areas where continuous monitoring can help. Improved IoT (internet of things) sensors e.g. cardio, blood chemistry coming. http://t.co/M3xw3tNvv3
• tetradian: Ethical/privacy concerns re how/with-whom that data is shared – e.g. with pharma, research, epidemiology etc
• efeatherston: Add employers to that etc. list of how/who/what is shared

Participants agreed that remote patient monitoring and telemonitoring can improve collaboration, improve patient care, and put patients more in control of their own healthcare data. However, participants expressed concerns about lack of widespread availability and the related issue of high cost. In addition, they raised important questions about who has access to these data, and they addressed nagging privacy and liability concerns.

Question 3: Can a mobile strategy improve patient experience, empowerment and satisfaction? If so, how?
• jim_hietala: mobile is a key area where patient health information can be developed/captured
• EricStephens: Example: link blood sugar monitor to iPhone to MyFitnessPal + gamification to drive adherence (and drive $$ down?)
• efeatherston: Mobile along with #InternetOfThings, wearables linked to mobile. Contact lens measuring blood sugar in recent article as ex.
• TerryBlevins: Sick people, or people getting sick are on the move. In a patient centric world we must match need.
• EricStephens: Mobile becomes a great data acquisition point. Something as simple as SMS can drive adherence with complication drug treatments
• jasonsleephd: mHealth is a very important area for innovation, better collaboration, $ reduction & quality improvement. Google recent “Webby Awards & handheld devices”
• tetradian: Mobile can help – e.g. use of SMS for medicine in Africa etc
• Technodad: Mobile isn’t option any more. Retail, prescription IoT, mobile network & computing make this a must-have. http://t.co/b5atiprIU9
• dianedanamac: Providers need to be able to receive the information mHealth
• Dana_Gardner: Healthcare should go location-independent. Patient is anywhere, therefore so is care, data, access. More than mobile, IMHO.
• Technodad: Technology and mobile demand will outrun regional provider systems, payers, regulation
• Dana_Gardner: As so why do they need to be regional? Cloud can enable supply-demand optimization regardless of location for much.
• TerryBlevins: And the caregivers are also on the move!
• Dana_Gardner: Also, more machine-driven care, i.e. IBM Watson, for managing the routing and prioritization. Helps mitigate overload.
• Technodad: Agree – more on that later!
• Technodad: Regional providers are the reality in the US. Would love to have more national/global coverage.
• Dana_Gardner: Yes, let the market work its magic by making it a larger market, when information is the key.
• tetradian: “let the market do its work” – ‘the market’ is probably the quickest way to destroy trust! – not a good idea…
• Technodad: To me, problem is coordinating among multi providers, labs etc. My health info seems to move at glacial pace then.
• tetradian: “Regional providers are the reality in the US.” – people move around: get info follow them is _hard_ (1st-hand exp. there…)
• tetradian: danger of hype/fear-driven apps – may need regulation, or at least regulatory monitoring
• jasonsleephd: Regulators, as in FDA or something similar?
• tetradian: “Regulators as in FDA” etc – at least oversight of that kind, yes (cf. vitamins, supplements, health-advice services)
• jim_hietala: mobile, consumer health device innovation moving much faster than IT ability to absorb
• tetradian: also beware of IT-centrism and culture – my 90yr-old mother has a cell-phone, but has almost no idea how to use it!
• Dana_Gardner: Information and rely of next steps (in prevention or acute care) are key, and can be mobile. Bring care to the patient ASAP.

Participants began in full agreement. Mobile health is not even an option but a “given” now. There was recognition that providers’ ability to receive information is lacking. The cloud was viewed as a means to overcome the regionalization of data storage. When the discussion turned to further development of mHealth, there was some debate on what can be left to the market and whether some form of regulatory action is needed.

Question 4: Does better information flow and availability in healthcare reduce operation cost, and free up resources for more patient care?
• tetradian: A4: should do, but it’s _way_ more complex than most IT-folks seem to expect or understand (e.g. repeated health-IT fails in UK)
• jim_hietala: A4: removing barriers to health info flow may reduce costs, but for me it’s mostly about opportunity to improve patient care
• jasonsleephd: Absolutely. Consider claims processing alone. Admin costs in private health ins. are 20% or more. In Medicare less than 2%.
• loseby: Absolutely! ACO model is proving it. Better information flow and availability also significantly reduces hospital admissions
• dianedanamac: I love it when the MD can access my x-rays and lab results so we have more time.
• EricStephens: More info flow + availability -> less admin staff -> more med staff.
• EricStephens: Get the right info to the ER Dr. can save a life by avoiding contraindicated medicines
• jasonsleephd: EricStephens GO CPOE!!
• TerryBlevins: @theopengroup. believe so, but ask the providers. My doctor is more focused on patient by using simple tech to improve info flow
• tetradian: don’t forget link b/w information-flows and trust – if trust fails, so does the information-flow – worse than where we started!
• jasonsleephd: Yes! Trust is really key to this conversation!
• EricStephens: processing a claim, in most cases, should be no more difficult than an expense report or online order. Real-time adjudication
• TerryBlevins: Great point.
• efeatherston: Agreed should be, would love to see it happen. Trust in the data as mentioned earlier is key (and the process)
• tetradian: A4: sharing b/w patient and MD is core, yes, but who else needs to access that data – or _not_ see it? #privacy
• TerryBlevins: A4: @theopengroup can’t forget that if info doesn’t flow sometimes the consequences are fatal, so unblocked the flow.
• tetradian: .@TerryBlevins A4: “if info doesn’t flow sometimes the consequences are fatal,” – v.important!
• Technodad: . @tetradian To me, problem is coordinating among multi providers, labs etc. My health info seems to move at glacial pace then.
• TerryBlevins: A4: @Technodad @tetradian I have heard that a patient moving on a gurney moves faster than the info in a hospital.
• Dana_Gardner: A4 Better info flow in #healthcare like web access has helped. Now needs to go further to be interactive, responsive, predictive.
• jim_hietala: A4: how about pricing info flow in healthcare, which is almost totally lacking
• Dana_Gardner: A4 #BigData, #cloud, machine learning can make 1st points of #healthcare contact a tech interface. Not sci-fi, but not here either.

Starting with the recognition that this is a very complicated issue, the conversation quickly produced a consensus view that better information flow is key, both to cost reduction and to quality improvement and increased patient satisfaction. Trust that information is accurate, available and used to support the provider-patient relationship emerged as a relevant issue. Then, naturally, privacy issues surfaced. Coordination of information flow and lack of interoperability were recognized as important barriers, and the conversation finally turned somewhat abstract and technical, with mentions of big data, the cloud and pricing information flows, without much in the way of specifying how to connect the dots.

Question 5: Do you think payers and providers are placing enough focus on using technology to positively impact patient satisfaction?
• Technodad: A5: I think there are positive signs but good architecture is lacking. Current course will end w/ provider information stovepipes.
• TerryBlevins: A5: @theopengroup Providers are doing more. I think much more is needed for payers – they actually may be worse.
• theopengroup: @TerryBlevins Interesting – where do you see opportunities for improvements with payers?
• TerryBlevins: A5: @theopengroup like was said below claims processing – an onerous job for providers and patients – mostly info issue.
• tetradian: A5: “enough focus on using tech”? – no, not yet – but probably won’t until tech folks properly face the non-tech issues…
• EricStephens: A5 No. I’m not sure patient satisfaction (customer experience/CX?) is even a factor sometimes. Patients not treated like customers
• dianedanamac: .@EricStephens SO TRUE! Patients not treated like customers
• Technodad: . @EricStephens Amen to that. Stovepipe data in provider systems is barrier to understanding my health & therefore satisfaction.
• dianedanamac: “@mclark497: @EricStephens issue is the customer is treat as only 1 dimension. There is also the family experience to consider too
• tetradian: .@EricStephens A5: “Patients not treated like customers” – who _is_ ‘the customer’? – that’s a really tricky question…
• efeatherston: @tetradian @EricStephens Trickiest question. to the provider is the patient or the payer the customer?
• tetradian: .@efeatherston “patient or payer” – yeah, though it gets _way_ more complex than that once we explore real stakeholder-relations
• efeatherston: @tetradian So true.
• jasonsleephd: .@tetradian @efeatherston Very true. There are so many diff stakeholders. But to align payers and pts would be huge
• efeatherston: @jasonsleephd @tetradian re: aligning payers and patients, agree, it would be huge and a good thing
• jasonsleephd: .@efeatherston @tetradian @EricStephens Ideally, there should be no dividing line between the payer and the patient!
• efeatherston: @jasonsleephd @tetradian @EricStephens Ideally I agree, and long for that ideal world.
• EricStephens: .@jasonsleephd @efeatherston @tetradian the payer s/b a financial proxy for the patient. and nothing more
• TerryBlevins: @EricStephens @jasonsleephd @efeatherston @tetradian … got a LOL out of me.
• Technodad: . @tetradian @EricStephens That’s a case of distorted marketplace. #Healthcare architecture must cut through to patient.
• tetradian: .@Technodad “That’s a case of distorted marketplace.” – yep. now add in the politics of consultants and their hierarchies, etc?
• TerryBlevins: A5: @efeatherston @tetradian @EricStephens in patient cetric world it is the patient and or their proxy.
• jasonsleephd: A5: Not enough emphasis on how proven technologies and architectural structures in other industries can benefit healthcare
• jim_hietala: A5: distinct tension in healthcare between patient-focus and meeting mandates (a US issue)
• tetradian: .@jim_hietala A5: “meeting mandates (a US issue)” – UK NHS (national-health-service) may be even worse than US – a mess of ‘targets’
• EricStephens: A5 @jim_hietala …and avoiding lawsuits
• tetradian: A5: most IT-type tech still not well-suited to the level of mass-uniqueness inherent in the healthcare context
• Dana_Gardner: A5 They are using tech, but patient “satisfaction” not yet a top driver. We have a long ways to go on that. But it can help a ton.
• theopengroup: @Dana_Gardner Agree, there’s a long way to go. What would you say is the starting point for providers to tie the two together?
• Dana_Gardner: @theopengroup An incentive other than to avoid lawsuits. A transparent care ratings capability. Outcomes focus based on total health
• Technodad: A5: I’d be satisfied just to not have to enter my patient info & history on a clipboard in every different provider I go to!
• dianedanamac: A5 @tetradian Better data sharing & Collab. less redundancy, lower cost, more focus on patient needs -all possible w/ technology
• Technodad: A5: The patient/payer discussion is a red herring. If the patient weren’t there, rest of the system would be unnecessary.
• jim_hietala: RT @Technodad: The patient/payer discussion is a red herring. If the patient weren’t there, rest of system unnecessary. AMEN

Very interesting conversation. Positive signs of progress were noted but so too were indications that healthcare will remain far behind the technology curve in the foreseeable future. Providers were given higher “grades” than payers. Yet, claims processing would seemingly be one of the easiest areas for technology-assisted improvement. One discussant noted that there will not be enough focus on technology in healthcare “until the tech folks properly face the non-tech issues”. This would seem to open a wide door for EA experts to enter the healthcare domain! The barriers (and opportunities) to this may be the topic of another tweet jam, or Open Group White Paper.
Interestingly, part way into the discussion the topic turned to the lack of a real customer/patient focus in healthcare. Not enough emphasis on patient satisfaction. Not enough attention to patient outcomes. There needs to be a better/closer alignment between what motivates payers and the needs of patients.

Question 6: As some have pointed out, many of the EHR systems are highly proprietary, how can standards deliver benefits in healthcare?
• jim_hietala: A6: Standards will help by lowering the barriers to capturing data, esp. for mhealth, and getting it to point of care
• tetradian: .@jim_hietala “esp. for mhealth” – focus on mhealth may be a way to break the proprietary logjam, ‘cos it ain’t proprietary yet
• TerryBlevins: A6: @theopengroup So now I deal with at least 3 different EHR systems. All requiring me to be the info steward! Hmmm
• TerryBlevins: A6 @theopengroup following up if they shared data through standards maybe they can synchronize.
• EricStephens: A6 – Standards lead to better interoperability, increased viscosity of information which will lead to lowers costs, better outcomes.
• efeatherston: @EricStephens and greater trust in the info (as was mentioned earlier, trust in the information key to success)
• jasonsleephd: A6: Standards development will not kill innovation but rather make proprietary systems interoperable
• Technodad: A6: Metcalfe’s law rules! HC’s many providers-many patients structure means interop systems will be > cost effective in long run.
• tetradian: A6: the politics of this are _huge_, likewise the complexities – if we don’t face those issues right up-front, this is going nowhere
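
The Metcalfe’s law point in the tweets above can be made concrete with a little illustrative arithmetic (the numbers are hypothetical). Without a shared standard, every pair of proprietary systems needs its own interface, so $p$ systems require up to

$$\binom{p}{2} = \frac{p(p-1)}{2}$$

point-to-point interfaces – 1,225 for $p = 50$ – whereas with a common standard each system implements one interface, so the cost grows linearly rather than quadratically as the network of providers and patients expands.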

In his April 24, 2014 post at www.weblog.tetradian.com, Tom Graves provided a clearly stated position on the role of The Open Group in delivering standards to help healthcare improve. He wrote:

“To me, this is where The Open Group has an obvious place and a much-needed role, because it’s more than just an IT-standards body. The Open Group membership are mostly IT-type organisations, yes, which tends to guide towards IT-standards, and that’s unquestionably of importance here. Yet perhaps the real role for The Open Group as an organisation is in its capabilities and experience in building consortia across whole industries: EMMM™ and FACE are two that come immediately to mind. Given the maze of stakeholders and the minefields of vested-interests across the health-context, those consortia-building skills and experience are perhaps what’s most needed here.”

The Open Group is the ideal organization to engage in this work. There are many ways to collaborate. You can join The Open Group Healthcare Forum, follow the Forum on Twitter @ogHealthcare and connect on The Open Group Healthcare Forum LinkedIn Group.

Jason Lee, Director of Healthcare and Security Forums at The Open Group, has conducted healthcare research, policy analysis and consulting for over 20 years. He is a nationally recognized expert in healthcare organization, finance and delivery and applies his expertise to a wide range of issues, including healthcare quality, value-based healthcare, and patient-centered outcomes research. Jason worked for the legislative branch of the U.S. Congress from 1990-2000 — first at GAO, then at CRS, then as Health Policy Counsel for the Chairman of the House Energy and Commerce Committee (in which role the National Journal named him a “Top Congressional Aide” and he was profiled in the Almanac of the Unelected). Subsequently, Jason held roles of increasing responsibility with non-profit organizations — including AcademyHealth, NORC, NIHCM, and NEHI. Jason has published quantitative and qualitative findings in Health Affairs and other journals and his work has been quoted in Newsweek, the Wall Street Journal and a host of trade publications. He is a Fellow of the Employee Benefit Research Institute, was an adjunct faculty member at the George Washington University, and has served on several boards. Jason earned a Ph.D. in social psychology from the University of Michigan and completed two postdoctoral programs (supported by the National Science Foundation and the National Institutes of Health). He is the proud father of twins and lives outside of Boston.

Leave a comment

Filed under Boundaryless Information Flow™, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, Professional Development, Standards

How the Open Trusted Technology Provider Standard (O-TTPS) and Accreditation Will Help Lower Cyber Risk

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

Changing business dynamics and enabling technologies

In 2008, IBM introduced the concept of a “Smarter Planet.” The Smarter Planet initiative focused, in part, on the evolution of globalization against the backdrop of changing business dynamics and enabling technologies. A key concept was the need for infrastructure to be tightly integrated, interconnected, and intelligent, thereby facilitating collaboration between people, government and businesses in order to meet the world’s growing appetite for data and automation. Since then, many industries and businesses have adopted this approach, including the ICT (information and communications technology) industries that support the global technology manufacturing supply chain.

Intelligent and interconnected critical systems

This transformation has infused technology into virtually all aspects of our lives, and involves, for example, government systems, the electric grid and healthcare. Most of these technological solutions are made up of hundreds or even thousands of components that are sourced from the growing global technology supply chain.

In the global technology economy, no one technology vendor or integrator is able to always provide a single-source solution. It is no longer cost-competitive to design all of the electronic components, printed circuit boards, card assemblies, or other sub-assemblies in-house. Adapting to the changing marketplace by balancing response time against cost efficiency drives more widespread use of OEM (original equipment manufacturer) products.

As a result, most technology providers procure from a myriad of global component suppliers, who very often require similarly complex supply chains to source their own components. Every enterprise has a supplier network, each of its suppliers has a supply chain network, and these sub-tier suppliers have their own supply chain networks in turn. The resultant technology supply chain manifests as a network of integrated suppliers.
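
To see how quickly such a network grows, consider an illustrative (and hypothetical) fan-out: if each supplier draws on $k$ sub-suppliers and the chain runs $d$ tiers deep, the number of suppliers behind a single product is

$$N = \sum_{i=1}^{d} k^{i} = \frac{k\,(k^{d}-1)}{k-1},$$

so even a modest $k = 10$ and $d = 3$ puts $10 + 100 + 1000 = 1110$ suppliers behind one product. This is why no single vendor can exhaustively inspect its own supply chain, and why shared, scalable best practices matter.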

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Cybersecurity by design: Addressing risk in a sustainable way across the ecosystem

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fitness for purpose manifests in two essential ways:

- Does the product meet essential functional requirements?
- Has the product or component been produced by a trustworthy provider?

Of course, the leaders or owners of these systems have to do their part to achieve security and safety: e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats. Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

Assuring the quality and integrity of mission-critical technology

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

The very process of manufacturing technology is not immune to cyber-attack. Supply chain attacks are typically motivated by monetary gain: their goals are to inflict massive economic damage in an effort to gain global economic advantage, or to seed targets with malware that provides attackers with unfettered access.

It is for this reason that the global technology manufacturing industry must establish practices that mitigate this risk by raising the cost of launching such attacks and increasing the likelihood that attackers are caught before the effects of an attack are irreversible. As these threats evolve, the global ICT industry must deploy enhanced security through advanced, automated cyber intelligence analysis. As critical infrastructure becomes more automated, integrated and essential to critical functions, the technology supply chain that surrounds it must be treated as a principal theme of the overall global security and risk mitigation strategy.

A global, agile, and scalable approach to supply chain security

The manner in which technologies are invented, produced, and sold demands a global, agile, and scalable approach to supply chain assurance; such an approach is essential to achieving the desired results. Any technology supply chain security standard that hopes to be widely adopted must be flexible and country-agnostic. The very nature of the global supply chain (massively segmented and diverse) requires an approach that provides practicable guidance while avoiding being overly prescriptive. Such an approach requires the aggregation of industry practices that have proven beneficial and effective at mitigating risk.

The OTTF (The Open Group Trusted Technology Forum) is an increasingly recognized and promising industry initiative to establish best practices that mitigate the risk of technology supply chain attacks. Facilitated by The Open Group, a recognized international standards and certification body, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Current membership includes many of the most well-known technology vendors, integrators, and technology assessment laboratories.

The benefits of O-TTPS for governments and enterprises

IBM is currently a member of the OTTF and has been honored to hold the Chair for the last three years.  Governments and enterprises alike will benefit from the work of the OTTF. Technology purchasers can use the Open Trusted Technology Provider™ Standard (O-TTPS) and Framework best-practice recommendations to guide their strategies.

A wide range of technology vendors can use O-TTPS approaches to build security and integrity into their end-to-end supply chains. The first version of the O-TTPS is focused on mitigating the risk of maliciously tainted and counterfeit technology components or products. Note that a maliciously tainted product is one that has been produced by the provider and acquired through reputable channels but which has been tampered with maliciously. A counterfeit product is one produced other than by or for the provider, or supplied through a non-reputable channel, and represented as legitimate. The OTTF is currently working on a program that will accredit technology providers who conform to the O-TTPS. IBM expects to complete pilot testing of the program by 2014.

IBM has actively supported the formation of the OTTF and the development of the O-TTPS for several reasons. These include but are not limited to the following:

- The Forum was established within a trusted and respected international standards body – The Open Group.
- The Forum was founded, in part, through active participation by governments, in a true public-private partnership.
- The OTTF membership includes some of the most mature and trusted commercial technology manufacturers and vendors. A primary objective of the OTTF is harmonization with other standards groups such as ISO (International Organization for Standardization) and Common Criteria.

The O-TTPS defines a framework of organizational guidelines and best practices that enhance the security and integrity of COTS ICT. The first version of the O-TTPS is focused on mitigating certain risks of maliciously tainted and counterfeit products within the technology development / engineering lifecycle. These best practices are equally applicable for systems integrators; however, the standard is intended to primarily address the point of view of the technology manufacturer.

O-TTPS requirements

The O-TTPS requirements are divided into three categories:

1. Development / Engineering Process and Method
2. Secure Engineering Practices
3. Supply Chain Security Practices

The O-TTPS is intended to establish a normalized set of criteria against which a technology provider, component supplier, or integrator can be assessed. The standard is divided into categories that define best practices for engineering development practices, secure engineering, and supply chain security and integrity intended to mitigate the risk of maliciously tainted and counterfeit components.
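To illustrate what assessment against a normalized set of criteria can mean in practice, here is a minimal sketch of an evidence-coverage check. The three categories follow the O-TTPS outline above, but the individual requirement strings and the code are illustrative assumptions, not drawn from the standard itself:

```python
# A minimal sketch of checking provider evidence against normalized
# criteria. Requirement names are invented for illustration only.
CATEGORIES = {
    "Development / Engineering Process and Method": {
        "documented configuration management",
        "traceable requirements",
    },
    "Secure Engineering Practices": {
        "threat analysis performed",
        "secure coding standards applied",
    },
    "Supply Chain Security Practices": {
        "supplier vetting in place",
        "tamper-evident component handling",
    },
}

def assess(evidence):
    """For each category, report whether the provider's evidence
    covers every requirement in that category."""
    return {
        category: required <= evidence.get(category, set())
        for category, required in CATEGORIES.items()
    }

# Example: a provider with a gap in its secure engineering evidence.
evidence = {
    "Development / Engineering Process and Method": {
        "documented configuration management",
        "traceable requirements",
    },
    "Secure Engineering Practices": {"threat analysis performed"},
    "Supply Chain Security Practices": {
        "supplier vetting in place",
        "tamper-evident component handling",
    },
}
print(assess(evidence))
# {'Development / Engineering Process and Method': True,
#  'Secure Engineering Practices': False,
#  'Supply Chain Security Practices': True}
```

The point of normalization is exactly this: every provider, component supplier, or integrator is measured against the same criteria, so results are comparable across assessments.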

The accreditation program

As part of the process for developing the accreditation criteria and policy, the OTTF established a pilot accreditation program. The purpose of the pilot was to take a handful of companies through the accreditation process and remediate any potential process or interpretation issues. IBM participated in the O-TTPS accreditation pilot to accredit a very significant segment of its software product portfolio: the Application Infrastructure Middleware Division (AIM), which includes the flagship WebSphere product line. The AIM pilot started in mid-2013, completed in the first week of 2014, and was formally recognized as accredited in the first week of February 2014.

IBM is currently leveraging the value of the O-TTPS and working to accredit additional development organizations. Some of the lessons learned during the IBM AIM initial O-TTPS accreditation include:

- Conduct a pre-assessment against the O-TTPS before formally entering accreditation. This allows for remediation of any gaps and reduces potential assessment costs and project schedule.
- Start with a segment of your development portfolio that has mature secure engineering practices and processes. This helps an organization address accreditation requirements and facilitates interactions with the third-party lab.
- Use your first successful O-TTPS accreditation to create templates that will help drive data gathering and validate practices, establishing a repeatable process as your organization undertakes additional accreditations.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM's industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM's Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). Andras holds undergraduate degrees in Biology and Computer Science and a master's degree in Computer Science from James Madison University. He has been a driving force behind IBM's adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.


Filed under Accreditations, Cybersecurity, government, O-TTF, O-TTPS, OTTF, RISK Management, Standards, supply chain, Supply chain risk

Q&A with Allen Brown, President and CEO of The Open Group

By The Open Group

Last month, The Open Group hosted its San Francisco 2014 conference themed “Toward Boundaryless Information Flow™.” Boundaryless Information Flow has been the pillar of The Open Group’s mission since 2002 when it was adopted as the organization’s vision for Enterprise Architecture. We sat down at the conference with The Open Group President and CEO Allen Brown to discuss the industry’s progress toward that goal and the industries that could most benefit from it now as well as The Open Group’s new Dependability through Assuredness™ Standard and what the organization’s Forums are working on in 2014.

The Open Group adopted Boundaryless Information Flow as its vision in 2002, and the theme of the San Francisco Conference has been “Towards Boundaryless Information Flow.” Where do you think the industry is at this point in progressing toward that goal?

Well, it's progressing reasonably well, but the challenge is, of course, that when we established that vision back in 2002, life was a little less complex, a little bit less fast moving, a little bit less fast-paced. Although organizations are improving the way that they act in a boundaryless manner – and of course that changes by industry – some industries still have big silos and stovepipes, they still have big boundaries. But generally speaking we are moving, and everyone understands the need for information to flow in a boundaryless manner, for people to be able to access and integrate information and to provide it to the teams that need it.

One of the keynotes on Day One focused on the opportunities within the healthcare industry, and The Open Group recently started a Healthcare Forum. Do you see the Healthcare industry as a test case for Boundaryless Information Flow, and why?

Healthcare is one of the verticals that we've focused on. And it is not so much a test case, but it is an area that absolutely seems to need information to flow in a boundaryless manner so that everyone involved – from the patient through the administrator through the medical teams – has access to the right information at the right time. We know that in many situations there are shifts of medical teams, and from one medical team to another they don't have access to the same information. Information isn't easily shared between medical doctors, hospitals and payers. What we're trying to do is to focus on the needs of the patient and improve the information flow so that you get better outcomes for the patient.

Are there other industries where this vision might be enabled sooner rather than later?

I think that we’re already making significant progress in what we call the Exploration, Mining and Minerals industry. Our EMMM™ Forum has produced an industry-wide model that is being adopted throughout that industry. We’re also looking at whether we can have an influence in the airline industry, automotive industry, manufacturing industry. There are many, many others, government and retail included.

The plenary on Day Two of the conference focused on The Open Group’s Dependability through Assuredness standard, which was released last August. Why is The Open Group looking at dependability and why is it important?

Dependability is ultimately what you need from any system. You need to be able to rely on that system to perform when needed. Systems are becoming more complex, they’re becoming bigger. We’re not just thinking about the things that arrive on the desktop, we’re thinking about systems like the barriers at subway stations or Tube stations, we’re looking at systems that operate any number of complex activities. And they bring an awful lot of things together that you have to rely upon.

Now in all of these systems, what we’re trying to do is to minimize the amount of downtime because downtime can result in financial loss or at worst human life, and we’re trying to focus on that. What is interesting about the Dependability through Assuredness Standard is that it brings together so many other aspects of what The Open Group is working on. Obviously the architecture is at the core, so it’s critical that there’s an architecture. It’s critical that we understand the requirements of that system. It’s also critical that we understand the risks, so that fits in with the work of the Security Forum, and the work that they’ve done on Risk Analysis, Dependency Modeling, and out of the dependency modeling we can get the use cases so that we can understand where the vulnerabilities are, what action has to be taken if we identify a vulnerability or what action needs to be taken in the event of a failure of the system. If we do that and assign accountability to people for who will do what by when, in the event of an anomaly being detected or a failure happening, we can actually minimize that downtime or remove it completely.

Now the other great thing about this is that it's not only a focus on the architecture for the actual system development – and as the system changes over time, requirements change, legislation that might affect it changes, external changes, all of that goes into the system – but there's also another circle within that system that deals with failure, analyzes it and makes sure it doesn't happen again. And there have been so many instances of failure recently. In banking, for example, a bank in the UK was recently unable to process debit cards or credit cards for customers for about three or four hours. That was probably caused by work done on a routine basis over a weekend. But if Dependability through Assuredness had been in place, that could have been averted; it could have saved an awful lot of difficulty for an awful lot of people.

How does the Dependability through Assuredness Standard also move the industry toward Boundaryless Information Flow?

It’s part of it. It’s critical that with big systems the information has to flow. But this is not so much the information but how a system is going to work in a dependable manner.

Business Architecture was another featured topic in the San Francisco plenary. What role can Business Architecture play in enterprise transformation vis-à-vis Enterprise Architecture as a whole?

A lot of people in the industry are talking about Business Architecture right now and trying to focus on that as a separate discipline. We see it as a fundamental part of Enterprise Architecture. And, in fact, there are three legs to Enterprise Architecture, there’s Business Architecture, there’s the need for business analysts, which are critical to supplying the information, and then there are the solutions, and other architects, data, applications architects and so on that are needed. So those three legs are needed.

We find that there are two or three different types of Business Architect. There are those that are using the analysis to understand what the business is doing in order that they can inform the solutions architects and other architects for the development of solutions. There are those that are more integrated with the business, that can understand what is going on and provide input into how that might be improved through technology. And there are those that can actually go another step and say: here are the advances in technology, and here are the opportunities for advancing our competitiveness as an organization.

What are some of the other key initiatives that The Open Group's Forums and Work Groups will be working on in 2014?

That kind of question is like when you've got an award and you've got to thank your friends, so apologies to anyone that I leave out. Let me start alphabetically with the Architecture Forum. The Architecture Forum obviously is working on the evolution of TOGAF®; they're also working on the harmonization of TOGAF with ArchiMate®, and they have a number of projects within that – Business Architecture, of course, is one of the projects going on in the Architecture space. The ArchiMate Forum are pushing ahead with ArchiMate – they've got two interesting activities going on at the moment. One is called ArchiMetals, which is going to be a sister publication to the ArchiSurance case study: where ArchiSurance provides an example of ArchiMate used in the insurance industry, ArchiMetals will show it used in a manufacturing context, so there will be a whitepaper on that and there will be examples and artifacts that we can use. They're also working on an ArchiMate standard for interoperability of modeling tools. There are four tools that are accredited and certified by The Open Group right now, and we're looking for that interoperability to help organizations that have multiple tools, as many of them do.

Going down the alphabet, there's DirecNet. Not many people know about DirecNet, but DirecNet™ is work that we do around the U.S. Navy; they're working on standards for long-range, high-bandwidth mobile networking. Then we can go to the FACE™ Consortium, the Future Airborne Capability Environment. The FACE Consortium are working on the next version of their standard, they're working toward accreditation and a certification program, and the uptake of that through procurement is absolutely amazing – we're thrilled about that.

Healthcare we've talked about. The Open Group Trusted Technology Forum is working on how we can trust the supply chain for developed systems. They've released the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which was launched this week, and we already have one accredited vendor and two certified assessment labs. That is really exciting, because now we've got a way of helping any organization that has large complex systems developed through a global supply chain to make sure they can trust that supply chain. And that is going to be invaluable to many industries, but also to the safety of citizens and the infrastructure of many countries. The other part is that we are planning to move the O-TTPS standard toward ISO standardization shortly.

The next one moving down the list would be Open Platform 3.0™. This is a really exciting part of Boundaryless Information Flow, it really is. This is talking about the convergence of SOA, Cloud, Social, Mobile, Internet of Things, Big Data – and bringing all of those activities together is really something critical right now that we need to focus on. In the different areas, some of our Cloud computing standards have already gone to ISO and have been adopted by ISO. We're working right now on the next products that are going to move through. We have a governance standard in process, and an ecosystem standard has recently been published. In the area of Big Data there's a whitepaper that's 25 percent complete, and there's also a lot of work on the definition of what Open Platform 3.0 is – this week the members have been working on trying to define Open Platform 3.0. One of the really interesting activities that's gone on: the members of the Open Platform 3.0 Forum have produced something like 22 different use cases, and they're really good. They're concise and they're precise and they cover a number of different industries, including healthcare and others. The next stage is to look at those and work on the ROI of those – the monetization, the value from those use cases – and that's really exciting; I'm looking forward to peeping at that from time to time.

The Real Time and Embedded Systems Forum (RTES) is next. Real-Time is where we incubated the Dependability through Assuredness Framework, and it is continuing to develop there, which is really good. The core focus of the RTES Forum is high assurance systems; they're doing some work with ISO on that and in a lot of other areas such as multicore, and, of course, they have a number of EC projects in which we're partnering with others in the EC around RTES.

The Security Forum, as I mentioned earlier, have done a lot of work on risk and dependability. They have not only their standards for Risk Taxonomy and Risk Analysis, but they've now also developed the Open FAIR Certification for People, which is based on those two standards. And we're already starting to see people being trained and certified under that Open FAIR Certification Program that the Security Forum developed.

A lot of other activities are going on. Like I said, I probably left a lot of things out, but I hope that gives you a flavor of what’s going on in The Open Group right now.

The Open Group will be hosting a summit in Amsterdam May 12-14, 2014. What can we look forward to at that conference?

In Amsterdam we have a summit – it's going to bring together a lot of things, and it's going to be a bigger conference than we had here. There's a lot of activity across all of our Forums, and we're going to bring together top-level speakers, so we're looking forward to some interesting work during that week.

Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Conference, Cybersecurity, EMMMv™, Enterprise Architecture, FACE™, Healthcare, O-TTF, RISK Management, Standards, TOGAF®

Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

By The Open Group 

At the recent San Francisco 2014 conference, The Open Group Trusted Technology Forum (OTTF) announced the launch of the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program.

The program is one of the first accreditation programs worldwide aimed at assuring the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

In the three short years since the OTTF launched, the forum has grown to include more than 25 member companies dedicated to safeguarding the global supply chain against increasingly sophisticated cybersecurity attacks through standards. Accreditation is yet another step in the process of protecting global technology supply chains from maliciously tainted and counterfeit products.

As part of the program, third-party assessor companies will be employed to assess organizations applying for accreditation, with The Open Group serving as the vendor-neutral Accreditation Authority that operates the program. Prior to the launch, the forum conducted a pilot program with a number of member companies. It was announced at the conference that IBM is the first company to become accredited, earning accreditation for its Application, Infrastructure and Middleware (AIM) software business division for its product integrity and supply chain practices.

We recently spoke with OTTF members Fiona Pattinson, director of strategy and business development at Atsec Information Security, and Erin Connor, director at EWA-Canada, at the San Francisco conference to learn more about the assessment process and the new program.

The O-TTPS focus is on securing the technology supply chain. What would you say are the biggest threats facing the supply chain today?

Fiona Pattinson (FP): I think in the three years since the forum began, certainly all the members have discussed the various threats quite a lot. It was one of the things we discussed as an important topic early on, and I don't know if it's the 'biggest threat,' but certainly the most important threats that we needed to address initially were those of counterfeit and maliciously tainted products. We came to that through both discussion with all the industry experts in the forum and also through research into some of the requirements from government, so that's exactly how we knew which threats [to start with].

Erin Connor (EC):  And the forum benefits from having both sides of the acquisition process, both acquirers, and the suppliers and vendors. So they get both perspectives.

How would you define maliciously tainted and counterfeit products?

FP:  They are very carefully defined in the standard—we needed to do that because people’s understanding of that can vary so much.

EC: And actually the concept of ‘maliciously’ tainted was incorporated close to the end of the development process for the standard at the request of members on the acquisition side of the process.

[Note: The standard precisely defines maliciously tainted and counterfeit products as follows:

"The two major threats that acquirers face today in their COTS ICT procurements, as addressed in this Standard, are defined as:

1. Maliciously tainted product – the product is produced by the provider and is acquired through a provider's authorized channel, but has been tampered with maliciously.

2. Counterfeit product – the product is produced other than by, or for, the provider, or is supplied to the provider by other than a provider's authorized channel and is presented as being legitimate even though it is not."]

The OTTF announced the Accreditation Program for the O-TTPS at the recent San Francisco conference. Tell us about the standard and how the accreditation program will help ensure conformance to it.

EC: The program is intended to provide organizations with a way to accredit their lifecycle processes for their product development so they can prevent counterfeit or maliciously tainted components from getting into the products they are selling to an end user or into somebody else's supply chain. It was determined that a third-party type of assessment program would be used. The organizations will know that we Assessors have gone through a qualification process with The Open Group and that we have in place all that's required on the management side to properly do an assessment. From the consumer side, they have confidence the assessment has been completed by an independent third party, so they know we aren't beholden to the organizations to give them a passing grade when perhaps they don't deserve it. And then of course The Open Group is in a position to oversee the whole process and award the final accreditation based on the recommendation we provide. The Open Group will also be the arbiter of the process between the assessors and organizations if necessary.

FP:  So The Open Group’s accreditation authority is validating the results of the assessors.

EC: It's a model that is employed in many, many other product or process assessment and evaluation programs, where the actual accreditation authority steps back and has third parties do the assessment.

FP: It is important that the assessor companies are working to the same standard so that there’s no advantage in taking one assessor over the other in terms of the quality of the assessments that are produced.

How does the accreditation program work?

FP: Well, it’s brand new so we don’t know if it is perfect yet, but having said that, we have worked over several months on defining the process, and we have drawn from The Open Group’s existing accreditation programs, as well as from the forum experts who have worked in the accreditation field for many years. We have been performing pilot accreditations in order to check out how the process works. So it is already tested.

How does it actually work? Well, first of all an organization will feel the need to become accredited and at that point will apply to The Open Group to get the accreditation underway. Once their scope of accreditation – which may be as small as one product or theoretically as large as a whole global company – is defined, and once the application is reviewed and approved by The Open Group, they engage an assessor.

There is a way of sampling a large scope to identify the process variations in a larger scope using something we term ‘selective representative products.’ It’s basically a way of logically sampling a big scope so that we capture the process variations within the scope and make sure that the assessment is kept to a reasonable size for the organization undergoing the assessment, but it also gives good assurance to the consumers that it is a representative sample. The assessment is performed by the Recognized Assessor company, and a final report is written and provided to The Open Group for their validation. If everything is in order, then the company will be accredited and their scope of conformance will be added to the accreditation register and trademarked.

EC: So the customers of that organization can go and check the registration for exactly what products are covered by the scope.

FP: Yes, the register is public and anybody can check. So if IBM says WebSphere is accredited, you can go and check that claim on The Open Group web site.
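[Note: the "selective representative products" idea described above can be pictured as grouping the products in scope by the process profile that produces them and assessing one representative per group. A minimal sketch, in which all product and process names are invented for illustration:]

```python
from collections import defaultdict

# Products in the accreditation scope, tagged with the process profile
# (build pipeline, sourcing model, ...) that produces them.
# All names here are invented for illustration.
products = [
    ("Product A", ("pipeline-1", "in-house components")),
    ("Product B", ("pipeline-1", "in-house components")),
    ("Product C", ("pipeline-2", "outsourced components")),
    ("Product D", ("pipeline-2", "outsourced components")),
    ("Product E", ("pipeline-3", "mixed sourcing")),
]

# Group products that share a process profile; one representative per
# group exercises every process variation in the scope while keeping
# the assessment to a reasonable size.
groups = defaultdict(list)
for name, profile in products:
    groups[profile].append(name)

representatives = [members[0] for members in groups.values()]
print(representatives)  # ['Product A', 'Product C', 'Product E']
```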

How long does the process take or does it vary?

EC: It will vary depending on how large the scope to be accredited is in terms of the size of the representative set and the documentation evidence. It really does depend on what the variations in the processes are among the product lines as to how long it takes the assessor to go through the evidence and then to produce the report. The other side of the coin is how long it takes the organization to produce the evidence. It may well be that they might not have it totally there at the outset and will have to create some of it.

FP: As Erin said, it varies by the complexity and the variation of the processes and hence the number of selected representative products. There are other factors that can influence the duration. There are three parties influencing that: The applicant Organization, The Open Group’s Accreditation Authority and the Recognized Assessor.

For example, we found that the initial work by the Organization and the Accreditation Authority in checking the scope and the initial documentation can take a few weeks for a complex scope, of course for the pilots we were all new at doing that. In this early part of the project it is vital to get the scope both clearly defined and approved since it is key to a successful accreditation.

It is important that an Organization assigns adequate resources to help keep this to the shortest time possible, both during the initial scope discussions, and during the assessment. If the Organization can provide all the documentation before they get started, then the assessors are not waiting for that and the duration of the assessment can be kept as short as possible.

Of course the resources assigned by the Recognized Assessor also influence how long an assessment takes. A variable for the assessors is how much documentation they have to read and review – it might be small or it might be a mountain.

The Open Group’s final review and oversight of the assessment takes some time and is influenced by resource availability within that organization. If they have any questions it may take a little while to resolve.

What kind of safeguards does the accreditation program put in place for enforcing the standard?

FP: It is a voluntary standard—there’s no requirement to comply. Currently some of the U.S. government organizations are recommending it. For example, NASA in their SEWP contract and some of the draft NIST documents on Supply Chain refer to it, too.

EC: In terms of actual oversight, we review what their processes are as assessors, and the report and our recommendations are based on that review. The accreditation expires after three years so before the three years is up, the organization should actually get the process underway to obtain a re-accreditation.  They would have to go through the process again but there will be a few more efficiencies because they’ve done it before. They may also wish to expand the scope to include the other product lines and portions of the company. There aren’t any periodic ‘spot checks’ after accreditation to make sure they’re still following the accredited processes, but part of what we look at during the assessment is that they have controls in place to ensure they continue doing the things they are supposed to be doing in terms of securing their supply chain.

FP: And then the key part is that the agreement the organization signs with The Open Group includes the fact that the organization warrants and represents that it remains in conformance with the standard throughout the accreditation period. So there is that assurance too, which builds on the more formal assessment checks.

What are the next steps for The Open Group Trusted Technology Forum?  What will you be working on this year now that the accreditation program has started?

FP: Reviewing the lessons we learned through the pilot!

EC: And reviewing comments from members on the standard now that it’s publicly available and working on version 1.1 to make any corrections or minor modifications. While that’s going on, we’re also looking ahead to version 2 to make more substantial changes, if necessary. The standard is definitely going to be evolving for a couple of years and then it will reach a steady state, which is the normal evolution for a standard.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Fiona Pattinson is responsible for developing new and existing atsec service offerings. Under the auspices of The Open Group's OTTF, alongside many expert industry colleagues, Fiona has helped develop The Open Group's O-TTPS, including developing the accreditation program for supply chain security. In the past, Fiona has led service developments which have included establishing atsec's US Common Criteria laboratory, the CMVP cryptographic module testing laboratory, the GSA FIPS 201 TP laboratory, TWIC reader compliance testing, NPIVP, SCAP, PCI, biometrics testing and penetration testing. Fiona has responsibility for understanding a broad range of information security topics and the application of security in a wide variety of technology areas from low-level design to the enterprise level.

Erin Connor is the Director at EWA-Canada responsible for EWA-Canada's Information Technology Security Evaluation & Testing Facility, which includes a Common Criteria Test Lab, a Cryptographic & Security Test Lab (FIPS 140 and SCAP), a Payment Assurance Test Lab (device testing for PCI PTS POI & HSM, Australian Payment Clearing Association and Visa mPOS) and an O-TTPS Assessor lab recognized by The Open Group. Erin participated with other expert members of The Open Group Trusted Technology Forum (OTTF) in the development of The Open Group Trusted Technology Provider Standard for supply chain security and its accompanying Accreditation Program. Erin joined EWA-Canada in 1994, and his initial activities in the IT Security and Infrastructure Assurance field included working on the team fielding a large-scale Public Key Infrastructure system, Year 2000 remediation and studies of wireless device vulnerabilities. Since 2000, Erin has been working on evaluations of a wide variety of products including hardware security modules, enterprise security management products, firewalls, mobile device and management products, as well as system and network vulnerability management products. He was also the only representative of an evaluation lab in the Biometric Evaluation Methodology Working Group, which developed a proposed methodology for the evaluation of biometric technologies under the Common Criteria.


Filed under Accreditations, Cybersecurity, OTTF, Professional Development, Standards, Supply chain risk

How to Build a Smarter City – Join The Open Group Tweet Jam on February 26

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On Wednesday, February 26, The Open Group will host a Tweet Jam examining smart cities and how Real-time and Embedded Systems can seamlessly integrate inputs from various agencies and locations. That collective data allows local governments to better adapt to change by implementing an analytics-based approach to measure:

  • Economic activity
  • Mobility patterns
  • Resource consumption
  • Waste management and sustainability measures
  • Inclement weather
  • And much more!

These metrics allow smart cities to do much more than just coordinate responses to traffic jams: they can forecast and coordinate safety measures in advance of physical disasters and inclement weather, calculate where offices and shops can be laid out most efficiently, and determine how all the parts of urban life should fit together, including energy, sustainability, and infrastructure repair, planning and development.

Smart cities are already very much a reality in the Middle East and in Korea, and those deployments have become a model for developers in China and for redevelopment in Europe. Market research firm IDC Government Insights projects that 2014 is the year cities around the world start getting smart, predicting that cities worldwide will spend $265 billion this year alone to implement new technology and integrate agency data. Part of what is spurring that spending is the fact that more than half the world's population currently lives in urban areas. With urbanization rates rapidly increasing, the Brookings Institution estimates that number could swell to 75 percent of the global populace by 2050.

While the awe-inspiring Rio de Janeiro is proving to be an interesting smart city model for cities across the world, are smart cities always the best option for informing city decisions? Could the beauty of a self-regulating open grid allow people to decide how best to use spaces in the city?

Please join us on Wednesday, February 26 at 9:00 am PT/12:00 pm ET/5:00 pm GMT for a tweet jam that will discuss the issues around smart cities. We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought-leaders, including David Lounsbury, CTO, and Chris Harding, Director of Interoperability, of The Open Group. To access the discussion, please follow the #ogchat hashtag during the allotted discussion time.

What Is a Tweet Jam?

A tweet jam is a one-hour “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

- Have your first #ogchat tweet be a self-introduction: name, affiliation, occupation.
- Start all other tweets with the question number you're responding to and add the #ogchat hashtag. (Sample: “A1: There are already a number of cities implementing tech to get smarter. #ogchat”)
- Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
- While this is a professional get-together, we don't have to be stiff! Informality will not be an issue.
- A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let's be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please contact Rob Checkal (@robcheckal or rob.checkal@hotwirepr.com). We anticipate a lively chat and hope you will be able to join!


Filed under real-time and embedded systems, Tweet Jam

Facing the Challenges of the Healthcare Industry – An Interview with Eric Stephens of The Open Group Healthcare Forum

By The Open Group

The Open Group launched its new Healthcare Forum at the Philadelphia conference in July 2013. The forum’s focus is on bringing Boundaryless Information Flow™ to the healthcare industry to enable data to flow more easily throughout the complete healthcare ecosystem through a standardized vocabulary and messaging. Leveraging the discipline and principles of Enterprise Architecture, including TOGAF®, the forum aims to develop standards that will result in higher quality outcomes, streamlined business practices and innovation within the industry.

At the recent San Francisco 2014 conference, Eric Stephens, Enterprise Architect at Oracle, delivered a keynote address entitled “Enabling the Opportunity to Achieve Boundaryless Information Flow” along with Larry Schmidt, HP Fellow at Hewlett-Packard. A veteran of the healthcare industry, Stephens was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield prior to joining Oracle, and he is an active member of the Healthcare Forum.

We sat down after the keynote to speak with Stephens about the challenges of healthcare, how standards can help realign the industry and the goals of the forum. The opinions expressed here are Stephens’ own, not of his employer.

What are some of the challenges currently facing the healthcare industry?

There are a number of challenges, and I think when we look at it as a U.S.-centric problem, there’s a disproportionate amount of spending that’s taking place in the U.S. For example, if you look at GDP or percentage of GDP expenditures, we’re looking at now probably 18 percent of GDP [in the U.S.], and other developed countries are spending a full 5 percent less than that of their GDP, and in some cases they’re getting better outcomes outside the U.S.

The mere existence of what we call “medical tourism” – where, if I need a hip replacement, I can get it done for a fraction of the cost in another country, with the same or better quality care, and have a vacation (a rehab vacation) at the same time and bring along a spouse or significant other – means there's a real wide range of disparity there.

There’s also a lack of transparency. Having worked at an insurance company, I can tell you that with the advent of high deductible plans, there’s a need for additional cost information. When I go on Amazon or go to a local furniture store, I know what the cost is going to be for what I’m about to purchase. In the healthcare system, we don’t get that. With high deductible plans, if I’m going to be responsible for a portion or a larger portion of the fee, I want to know what it is. And what happens is, the incentives to drive costs down force the patient to be a consumer. The consumer now asks the tough questions. If my daughter’s going in for a tonsillectomy, show me a bill of materials that shows me what’s going to be done – if you are charging me $20/pill for Tylenol, I’ll bring my own. Increased transparency is what will in turn drive down the overall costs.

I think there's one more thing, and this gets into the legal side of things. There is an exorbitant amount of legislation and regulation around what needs to be done. And because every time something goes sideways there's going to be a lawsuit, doctors will prescribe an extra test, an extra X-ray, for a patient whether they need it or not.

The healthcare system is designed around a vicious cycle of diagnose-treat-release. It's not incentivized to focus on prevention and management. Oregon is promoting coordinated care organizations (CCOs) that would act as an intermediary working with all medical professionals – whether physical, mental, dental, even social workers – to coordinate episodes of care for patients. This drives down inappropriate utilization – for example, using an ER as a primary care facility – and drives the medical system toward prevention and management of health.

Your keynote with Larry Schmidt of HP focused a lot on cultural changes that need to take place within the healthcare industry – what are some of the changes necessary for the healthcare industry to put standards into place?

I would say culturally, it goes back to those incentives, and it goes back to introducing this idea of patient-centricity. And for the medical community, to really start recognizing that these individuals are consumers and increased choice is being introduced, just like you see in other industries. There are disruptive business models. As a for instance, medical tourism is a disruptive business model for United States-based healthcare. The idea of pharmacies introducing clinical medicine for routine care, such as what you see at a CVS, Wal-Mart or Walgreens. I can get a flu shot, I can get a well-check visit, I can get a vaccine – routine stuff that doesn’t warrant a full-blown medical professional. It’s applying the right amount of medical care to a particular situation.

Why haven’t existing standards been adopted more broadly within the industry? What will help providers be more likely to adopt standards?

I think the standards adoption is about “what's in it for me,” the WIIFM idea. It's demonstrating to providers that utilizing standards is going to help them get out of the medical administration business and focus on their core business, the same way that any other business would want to standardize its information through integration, processes and components. It reduces your overall maintenance costs going forward, and arguably you don't need a team of billing folks sitting in a doctor's office because you have standardized exchanges of information.

Why haven’t they been adopted? It’s still a question in my mind. Why would a doctor not want to do that is perhaps a question we’re going to need to explore as part of the Healthcare Forum.

Is it doctors that need to adopt the standards or technologies, or a combination of different constituents within the ecosystem?

I think it’s a combination. We hear a lot about the Affordable Care Act (ACA) and the health exchanges. What we don’t hear about is the legislation to drive toward standardization to increase interoperability. So unfortunately it would seem the financial incentives or things we’ve tried before haven’t worked, and we may simply have to resort to legislation or at least legislative incentives to make it happen because part of the funding does cover information exchanges so you can move health information between providers and other actors in the healthcare system.

You’re advocating putting the individual at the center of the healthcare ecosystem. What changes need to take place within the industry in order to do this?

I think it’s education, a lot of education that has to take place. I think that individuals via the incentive model around high deductible plans will force some of that but it’s taking responsibility and understanding the individual role in healthcare. It’s also a cultural/societal phenomenon.

I'm kind of speculating here, and going way beyond what enterprise architecture or what IT would deliver, but this is a philosophical thing: if I have an ailment, chances are there's a pill to fix it. Look at the commercials – for every ailment, say hypertension, it's easy, you just dial in the medication correctly and you don't worry as much about diet and exercise. These sorts of things reflect our over-reliance on medication. I'm certainly not going to knock the medications that are needed for folks that absolutely need them – but I think we can become too dependent on pharmacological solutions for our health problems.

What responsibility will individuals then have for their healthcare? Will that also require a cultural and behavioral shift for the individual?

The individual has to start managing his or her own health. We manage our careers and families proactively. Now we need to focus on our health and not just float through the system. It may come down to financial incentives for certain “individual KPIs” such as blood pressure, sugar levels, or BMI. Advances in medical technology may facilitate more personal management of one's health.

One of the Healthcare Forum's goals is to help establish Boundaryless Information Flow within the Healthcare industry. You've said that understanding the healthcare ecosystem will be a key component of that. What does that ecosystem encompass, and why is it important to understand it first?

Very simply we’re talking about the member/patient/consumer, then we get into the payers, the providers, and we have to take into account government agencies and other non-medical agents, but they all have to work in concert and information needs to flow between those organizations in a very standardized way so that decisions can be made in a very timely fashion.

It can't be bottled up; it's got to be provided to the right provider at the right time. Otherwise, best case, it's going to cost more to manage all the actors in the system. Worst case, somebody dies, or there is a “never event” due to misinformation or lack of information during the course of care. The idea of Boundaryless Information Flow gives us the opportunity to have standardized, easily accessible – and, by the way, secured – information that can really aid in that decision-making process going forward. It's no different than Wal-Mart knowing what kind of merchandise sells well before and after a hurricane (i.e., beer and toaster pastries, BTW). It's the same kind of real-time information that's made available to a Google car so it can steer its way down the road. It's that kind of viscosity that's needed to make the right decisions at the right time.

Healthcare is a highly regulated industry. How can Boundaryless Information Flow and data collection on individuals be achieved while still protecting patient privacy?

We can talk about standards and the flow and the technical side. We need to focus on the security and privacy side.  And there’s going to be a legislative side because we’re going to touch on real fundamental data governance issue – who owns the patient record? Each actor in the system thinks they own the patient record. If we’re going to require more personal accountability for healthcare, then shouldn’t the consumer have more ownership? 

We also need to address privacy disclosure regulations to avoid catastrophic data leaks of protected health information (PHI). We need bright IT talent to pull off the integration we are talking about here. We also need folks who are well versed in the privacy laws and regulations. I’ve seen project teams of 200 have up to eight folks just focusing on the security and privacy considerations. We can argue about headcount later but my point is the same – one needs some focused resources around this topic.

What will standards bring to the healthcare industry that is missing now?

I think the standards, and more specifically the harmonization of the standards, is going to bring increased maintainability of solutions, increased interoperability, and increased opportunities too. We see mobile computing, or even DropBox, which has API hooks into all sorts of tools, and it's well integrated – I can move files between devices, I can move files between apps, because they have hooks, it's easy to work with. So it's building these communities of developers, apps and technical capabilities that makes it easy to move the personal health record, for example, back and forth between providers, so that it's not a cataclysmic event to integrate a new version of electronic health records (EHR) or to integrate the next version of an EHR. It's this idea of standardization, but with some flexibility that goes into it.
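[Note: as a toy illustration of why standardized “hooks” matter, here is a sketch of two systems exchanging a record through a shared, agreed structure. The schema is invented for this example; it is not HL7, FHIR, or any other published healthcare standard:]

```python
import json

# An invented, minimal record structure agreed on by both systems --
# not any published healthcare standard, purely illustrative.
record = {
    "patient_id": "12345",
    "allergies": ["penicillin"],
    "medications": [{"name": "lisinopril", "dose_mg": 10}],
}

wire_format = json.dumps(record)    # sending system serializes
received = json.loads(wire_format)  # receiving system parses

# Because the structure is shared, the receiver can interpret the data
# without custom, per-sender integration work.
assert received["allergies"] == ["penicillin"]
```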

Are you looking just at the U.S. or how do you make a standard that can go across borders and be international?

It is a concern; much of my thinking and much of what I've conveyed today is U.S.-centric, based on our problems, but many of these interoperability problems are international. We're going to need to address it; I couldn't tell you what the sequence is right now. There are other considerations, for example, single vs. multi-payer – that came up in the keynote. We tend to think that if we stay focused on the consumer/patient we're going to get it for all constituencies. It will take time to go international with a standard, but it wouldn't be the first time. We have a host of technical standards for the Internet (e.g., TCP/IP, HTTP). The industry has been able to instill these standards across geographies and vendors. Admittedly, the harmonization of healthcare-related standards will be more difficult. However, as our world shrinks with globalization, an international lens will need to be applied to this challenge.

Eric Stephens (@EricStephens) is a member of Oracle's executive advisory community where he focuses on advancing clients' business initiatives leveraging the practice of Business and Enterprise Architecture. Prior to joining Oracle he was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield, leading the organization with architecture design, innovation, and technology adoption capabilities within the healthcare industry.

Filed under Conference, Data management, Enterprise Architecture, Healthcare, Information security, Standards, TOGAF®

The Open Group San Francisco 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Day two, February 4th, of The Open Group San Francisco conference kicked off with a welcome and opening remarks from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects.

Nunn introduced Allen Brown, President and CEO of The Open Group, who provided highlights from The Open Group’s last quarter.  As of Q4 2013, The Open Group had 45,000 individual members in 134 countries hailing from 449 member companies in 38 countries worldwide. Ten new member companies have already joined The Open Group in 2014, and 24 members joined in the last quarter of 2013, with the first member company joining from Vietnam. In addition, 6,500 individuals attended events sponsored by The Open Group in Q4 2013 worldwide.

Updates on The Open Group’s ongoing work were provided including updates on the FACE™ Consortium, DirectNet® Waveform Standard, Architecture Forum, Archimate® Forum, Open Platform 3.0™ Forum and Security Forum.

Of note was the ongoing development of TOGAF® and introduction of a three-volume work including individual volumes outlining the TOGAF framework, guidance and tools and techniques for the standard, as well as collaborative work that allows the Archimate modeling language to be used for risk management in enterprise architectures.

In addition, the Open Platform 3.0 Forum has already put together 22 business use cases outlining ROI and business value for various uses related to technology convergence. The Cloud Work Group's Cloud Reference Architecture has also been submitted to ISO for international standardization, and the Security Forum has introduced its Open FAIR risk management certification program for individuals.

The morning plenary centered on The Open Group’s Dependability through Assuredness™ (O-DA) Framework, which was released last August.

Speaking first about the framework was Dr. Mario Tokoro, Founder and Executive Advisor for Sony Computer Science Laboratories. Dr. Tokoro gave an overview of the Dependable Embedded OS project (DEOS), a large national project in Japan originally intended to strengthen the country’s embedded systems. After considerable research, the project leaders discovered they needed to consider whether large, open systems could be dependable when it came to business continuity, accountability and ensuring consistency throughout the systems’ lifecycle. Because the boundaries of large open systems are ever-changing, the project leaders knew they must put together dependability requirements that could accommodate constant change, allow for continuous service and provide continuous accountability for the systems based on consensus. As a result, they put together a framework to address both the change accommodation cycle and failure response cycles for large systems – this framework was donated to The Open Group’s Real-Time Embedded Systems Forum and released as the O-DA standard.

Dr. Tokoro’s presentation was followed by a panel discussion on the O-DA standard. Moderated by Dave Lounsbury, VP and CTO of The Open Group, the panel included Dr. Tokoro; Jack Fujieda, Founder and CEO of ReGIS, Inc.; T.J. Virdi, Senior Enterprise IT Architect at Boeing; and Bill Brierly, Partner and Senior Consultant, Conexiam. The panel discussed the importance of openness for systems, reiterating the conference theme of boundaries, and the realities of having standards that can ensure openness and dependability at the same time. They also discussed how the O-DA standard provides end-to-end requirements for system architectures that account for accommodating change within a system and for accountability.

Lounsbury concluded the track by reiterating that assuring systems’ dependability is fundamental not only to The Open Group mission of Boundaryless Information Flow™ and interoperability but also to preventing large system failures.

Tuesday’s late morning sessions were split into two tracks, with one track continuing the Dependability through Assuredness theme, hosted by Joe Bergmann, Forum Chair of The Open Group’s Real-Time and Embedded Systems Forum. In this track, Fujieda and Brierly furthered the discussion of O-DA, outlining the philosophy and vision behind the standard and providing a roadmap for it.

In the morning Business Innovation & Transformation track, Alan Hakimi, Consulting Executive, Microsoft, presented “Zen and the Art of Enterprise Architecture: The Dynamics of Transformation in a Complex World.” Hakimi emphasized that transformation needs to focus on a holistic view of an organization’s ecosystem, motivations, economics, culture and existing systems to help foster real change. Drawing on Buddhist philosophy, he presented an eightfold path to transformation that allows enterprise architects to approach transformation and discuss it with other architects and business constituents in a way that is meaningful to them and accommodates complexity and balance.

This was followed by “Building the Knowledge-Based Enterprise,” a session given by Bob Weisman, Head Management Consultant for Build the Vision.

Tuesday’s afternoon sessions centered on a number of topics including Business Innovation and Transformation, Risk Management, ArchiMate®, TOGAF® tutorials and case studies, and Professional Development.

In the ArchiMate track, Vadim Polyakov of Inovalon, Inc., presented “Implementing an EA Practice in an Agile Enterprise,” a case study centered on how his company integrated its enterprise architecture practice with the principles of agile development and customized the ArchiMate framework as part of the process.

The Risk Management track featured William Estrem, President, Metaplexity Associates, and Jim May of Windsor Software discussing how the Open FAIR standard can be used in conjunction with TOGAF 9.1 to enhance risk management in organizations, in their session “Integrating Open FAIR Risk Analysis into the Enterprise Architecture Capability.” Jack Jones, President of CXOWARE, also discussed the best ways of “Communicating the Value Proposition” for cohesive enterprise architectures to business managers using risk management scenarios.
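For readers unfamiliar with Open FAIR, its central idea is to decompose risk into factors that can be estimated, with Loss Event Frequency and Loss Magnitude at the top of the taxonomy, and then simulate a range of outcomes rather than guess a single number. The sketch below is a simplified illustration of that idea, with invented scenario estimates and a plain triangular distribution standing in for the calibrated PERT-style distributions a real Open FAIR analysis would use.

    import random

    def sample(low, mode, high):
        # Three-point estimate (minimum / most likely / maximum); triangular is
        # a simple stand-in for the calibrated distributions used in practice.
        return random.triangular(low, high, mode)

    def annualized_loss_exposure(lef_est, lm_est, trials=10_000):
        # Open FAIR's top-level factoring: risk is driven by Loss Event
        # Frequency (events/year) and Loss Magnitude (loss per event).
        losses = sorted(sample(*lef_est) * sample(*lm_est) for _ in range(trials))
        return {"mean": sum(losses) / trials,
                "p10": losses[int(0.10 * trials)],
                "p90": losses[int(0.90 * trials)]}

    # Hypothetical scenario surfaced by an architecture review: 0.1-2 loss
    # events/year (most likely 0.5), $50k-$1M per event (most likely $200k).
    print(annualized_loss_exposure((0.1, 0.5, 2.0), (50_000, 200_000, 1_000_000)))

Attaching estimates like these to architecture artifacts, so that risk can be discussed per capability rather than per audit finding, is the kind of integration with TOGAF the session described.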

The plenary sessions and many of today’s track sessions can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day culminated with dinner and a Lion Dance performance in honor of Chinese New Year performed by Leung’s White Crane Lion & Dragon Dance School of San Francisco.

We would like to express our gratitude to the following sponsors for their support: BIZZDesign, Corso, Good e-Learning, I-Server and Metaplexity Associates.


O-DA standard panel discussion with Dave Lounsbury, Bill Brierly, Dr. Mario Tokoro, Jack Fujieda and T.J. Virdi


Filed under Conference, Enterprise Architecture, Enterprise Transformation, Standards, TOGAF®, Uncategorized

Open Platform 3.0™ to Help Rally Organizations in Innovation and Development

By Andy Mulholland, Former Global CTO, Capgemini

The Open Platform 3.0™ initiative, launched by The Open Group, provides a forum in which organizations, including standards bodies, users and product vendors, can coordinate their approach to new business models and new practices for the use of IT, define or identify common vendor-neutral standards, and foster the adoption of those standards in a cohesive and aligned manner to ensure an integrated, commercially viable set of solutions.

The goal is to enable effective business architectures that support a new generation of interoperable business solutions, quickly and at low cost, using new technologies and provisioning methods while integrating with existing IT environments.

Acting on behalf of its core constituency of CIOs, and in association with the US and European CIO associations, Open Platform 3.0 will serve as a rallying point for all involved in developing the technology solutions that new, innovative business models and practices require.

There is a distinct sea change in the way organizations are adopting and using a range of new technologies, mostly relating to a front-office revolution in how business is conducted with customers and suppliers, and even within markets. More than ever, The Open Group mission of Boundaryless Information Flow™ through the mutual development of technology standards and methods is relevant to this change.

The competitive benefits are driving rapid business adoption, but mostly through a series of pilots owned and driven by business management, usually with little long-term thought given to scale, compliance, security or even data integrity. The CIO is rightly concerned about these issues, but too often, in the absence of experience in this new environment and of the ability to offer constructive approaches, these concerns are viewed as unacceptable barriers.

This situation is further inflamed by the sheer variety of products, and by the attempts of different groups, both technological and business-oriented, to develop workable standards for particular elements. Currently there is little, if any, overall coordination and alignment of these individually valuable elements toward a true ‘system’ approach, with understandable methods to deliver a comprehensive enterprise approach that truly serves the full purposes of the business.

The business imperatives, reinforced by the teaching of business schools, focus on time as a key issue and advocate small, fast projects built on externally provisioned, pay-per-use cloud services. These are elements of the sea change that have to be accepted, and indeed they will grow as society overall expects to do business and obtain its own requirements in the same way.

Many of these changes are outside the knowledge, experience and often the power of current IT departments, yet those departments rightly understand that, if they are to continue in their essential role of maintaining core internal operations and commercial stability, this change must introduce a new generation of deployment, integration and management methods. The risk is continuing the polarization that has already started to develop between internal IT operations, based on client-server enterprise applications, and the external operations of sales and marketing, using browser- and cloud-based apps and services.

At best, this will result in an increasingly decentralized and difficult-to-manage business; at worst, audit and compliance management will report the business as being in breach of financial and commercial rules. Organizations are recognizing this by introducing a new type of role, supported by business schools and universities, termed the Business Architect. Their role in the application of new technology is to determine how to orchestrate complex business processes, through Big Data and Big Process, from the ‘services’ available to users. This is in many ways a direct equivalent, though with different skills, of the Enterprise Architect in conventional IT, who focuses on data integrity in designing applications and their integration.

The Open Group’s extensive experience in the development of TOGAF®, together with the standard’s widespread global acceptance, has led to a deep understanding of the problem, the issues, and how to develop a solution both for Business Architecture and for its integration with Enterprise Architecture.

The Open Group believes it is uniquely positioned to play this role because of its extensive experience in developing standards on behalf of user enterprises to enable Boundaryless Information Flow™, including its globally recognized Enterprise Architecture standard, TOGAF®. Moreover, feedback received from many directions suggests this move will be welcomed by many of those involved in the various aspects of this exciting period of change.

Andy Mulholland joined Capgemini in 1996, bringing with him thirteen years of experience from previous senior IT roles across all major industry sectors.

In his former role as Global Chief Technology Officer, Andy was a member of the Capgemini Group management board and advised on all aspects of technology-driven market changes, as well as serving on the technology advisory boards of several organizations and enterprises.

A popular speaker who has appeared at major events around the world and is frequently quoted by the press, Andy was voted one of the top 25 most influential CTOs in the world by InfoWorld USA in 2009, and in 2010 his CTOblog was voted best blog for business managers and CIOs for the third year running by Computing Weekly UK. Andy retired in June 2012, but he still maintains an active association with the Capgemini Group, and his activities across the industry earned him 29th place in the prestigious USA ExecRank ‘Top CTOs’ ratings for 2012.


Filed under Open Platform 3.0, TOGAF®

Evolving Business and Technology Toward an Open Platform 3.0™

By Dave Lounsbury, Chief Technical Officer, The Open Group

The role of IT within the business is one that constantly evolves and changes. If you’ve been in the technology industry long enough, you’ve likely had the privilege of seeing IT grow to become integral to how businesses and organizations function.

In his recent keynote, “Just Exactly What Is Going On in Business and Technology?”, at The Open Group London Conference in October, Andy Mulholland, former Global Chief Technology Officer at Capgemini, discussed how the role of IT has changed from being traditionally internally focused (inside the firewall, proprietary, a few massive applications, controlled by IT) to one that is increasingly externally focused (outside the firewall, open systems, lots of small applications, increasingly controlled by users). This is due to the rise of a number of disruptive forces currently affecting the industry, such as BYOD, Cloud, social media tools, Big Data, the Internet of Things and cognitive computing. As Mulholland pointed out, IT today is about how people are using technology in the front office. They are bringing their own devices, they are using apps to get outside of the firewall, and they are moving further and further away from traditional “back office” IT.

Due to the rise of the Internet, the client/server model of the 1980s and 1990s that kept everything within the enterprise is no more. That model has been subsumed by a model in which development is fast and iterative and information is constantly being pushed and pulled primarily from outside organizations. The current model is also increasingly mobile, allowing users to get the information they need anytime and anywhere from any device.

At the same time, there is a push from business and management for increasingly rapid turnaround times and smaller scale projects that are, more often than not, being sourced via Cloud services. The focus of these projects is on innovating business models and acting in areas where the competition does not act. These forces are causing polarization within IT departments between internal IT operations based on legacy systems and new external operations serving buyers in business functions that are sourcing their own services through Cloud-based apps.

Just as UNIX® provided a standard platform for applications on single computers and the combination of servers, PCs and the Internet provided a second platform for web apps and services, we now need a new platform to support the apps and services that use cloud, social, mobile, big data and the Internet of Things. Rather than merely aligning with business goals or enabling business, the next platform will be embedded within the business as an integral element bringing together users, activity and data. To work properly, this must be a standard platform so that these things can work together effectively and at low cost, providing vendors a worthwhile market for their products.

Industry pundits have already begun to talk about this layer of technology. Gartner calls it the “Nexus of Forces.” IDC calls it the “third platform.” At The Open Group, we refer to it as Open Platform 3.0™, and earlier this year we announced a new Forum to address how organizations can adopt and support these technologies. Open Platform 3.0 is meant to enable organizations (including standards bodies, users and vendors) to coordinate their approaches to the new business models and IT practices driving the new platform, in order to support a new generation of interoperable business solutions.

As is always the case with technologies, a point is reached where technical innovation must transition to business benefit. Open Platform 3.0 is, in essence, the next evolution of computing. To help the industry sort through these changes and create vendor-neutral standards that foster the cohesive adoption of new technologies, The Open Group must also evolve its focus and standards to respond to where the industry is headed.

The work of the Open Platform 3.0 Forum has already begun. Initial actions for the Forum have been identified and were shared during the London conference. Our recent survey on Convergent Technologies confirmed the need to address these issues. Of those surveyed, 95 percent of respondents felt that converged technologies were an opportunity for business, and 84 percent of solution providers are already dealing with two or more of these technologies in combination. Respondents also saw vendor lock-in as a potential hindrance to using these technologies, underscoring the need for an industry standard that will address interoperability. In addition to the survey, the Forum has also produced an initial Business Scenario to begin to address these industry needs and formulate requirements for this new platform.

If you have any questions about Open Platform 3.0 or if you would like to join the new Forum, please contact Chris Harding (c.harding@opengroup.org) for queries regarding the Forum or Chris Parnell (c.parnell@opengroup.org) for queries regarding membership.

 

Dave Lounsbury is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, Dave leads the delivery of The Open Group’s proven processes for collaboration and certification, both within the organization and in support of third-party consortia. Dave holds a degree in Electrical Engineering from Worcester Polytechnic Institute and is the holder of three U.S. patents.

 

 


Filed under Cloud, Data management, Future Technologies, Open Platform 3.0, Standards, Uncategorized, UNIX

The Open Group London – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

We eagerly jumped into the second day of our Business Transformation conference in London on Tuesday, October 22nd! The setting was the magnificent Central Hall Westminster.

Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA), started off the morning by introducing our plenary, based on Healthcare Transformation. Steve noted that healthcare spending is huge and that bringing Enterprise Architecture (EA) to healthcare will help with efficiencies.

The renowned Dr. Peter Sudbury, Healthcare Specialist with HP Enterprise Services, discussed the healthcare crisis (dollars, demand, demographics), the new healthcare paradigm, and barriers to change and innovation. Dr. Sudbury also commented on the real drivers of healthcare costs: healthcare inflation is intrinsically higher, innovation increases cost, and productivity improvements lag those of other industries.

Dr. Peter Sudbury

Dr. Sudbury, Larry Schmidt (Chief Technologist, HP) and Roar Engen (Head of Enterprise Architecture, Helse Sør-Øst RHF, Norway) participated in the Healthcare Transformation Panel, moderated by Steve Nunn. The group discussed opportunities for improvement by applying EA in healthcare. They noted that physicians, hospitals, drug manufacturers, nutritionists and others should all be working together and using Boundaryless Information Flow™ to ensure data is smoothly shared across all entities. It was also noted that TOGAF® is beneficial for efficiencies.

Following the panel, Dr. Mario Tokoro (Founder & Executive Advisor of Sony Computer Science Laboratories, Inc., and DEOS Project Leader at the Japan Science & Technology Agency) reviewed the Dependability through Assuredness™ standard, a standard of The Open Group.

The conference also offered many sessions in Finance/Commerce, Government and Tutorials/Workshops.

Margaret Ford of Consult Hyperion, UK, and Henk Jonkers of BIZZdesign, Netherlands, discussed “From Enterprise Architecture to Cyber Security Risk Assessment”. The key takeaways were that complex cyber security risks require systematic, model-based risk assessment, and that attack navigators can provide this by linking ArchiMate® to the Risk Taxonomy.
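The attack-navigator idea lends itself to a small sketch: treat the ArchiMate model as a dependency graph, trace paths from a compromised technology element up to the business processes it serves, and attach Risk Taxonomy factors along the way. The toy model below is invented for illustration and is greatly simplified compared with real ArchiMate tooling; only the traversal idea carries over.

    # Invented three-element model: a technology node serves an application,
    # which in turn realizes a business process.
    elements = {
        "Customer DB":     "Technology Node",
        "CRM application": "Application Component",
        "Claims process":  "Business Process",
    }
    serves = {  # dependency edges: element -> elements it supports
        "Customer DB":     ["CRM application"],
        "CRM application": ["Claims process"],
    }
    # Risk Taxonomy factor attached at the technology layer (illustrative value):
    vulnerability = {"Customer DB": 0.3}  # P(threat event becomes a loss event)

    def impact_paths(start, graph):
        # Walk the "serves" edges upward to find every business element an
        # attack on `start` could ultimately affect.
        stack, paths = [(start, [start])], []
        while stack:
            node, path = stack.pop()
            successors = graph.get(node, [])
            if not successors:
                paths.append(path)
            for nxt in successors:
                stack.append((nxt, path + [nxt]))
        return paths

    for path in impact_paths("Customer DB", serves):
        labeled = " -> ".join(f"{n} ({elements[n]})" for n in path)
        print(labeled, "| vulnerability:", vulnerability[path[0]])

In a real attack navigator the graph would come from the ArchiMate repository and the attached factors would feed a quantitative model such as Open FAIR; the sketch only shows why linking the two makes the risk assessment systematic rather than ad hoc.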

“Applying Service-Oriented Architecture within a Business Technology Environment in the Finance Sector” was presented by Gerard Peters, Managing Consultant, Capgemini, The Netherlands. This case study is part of a white paper on Service-Oriented Architecture for Business Technology (SOA4BT).

You can view all of the plenary and many of the track presentations at livestream.com.  And for those who attended, full conference proceedings will be available.

The night culminated with a spectacular experience on the London Eye, the largest Ferris wheel in Europe, located on the River Thames.


Filed under ArchiMate®, Cloud/SOA, Enterprise Architecture, Enterprise Transformation, Healthcare, Professional Development, Service Oriented Architecture, TOGAF®