The Open Group Summit Amsterdam 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The Open Group Summit Amsterdam, held at the historic Hotel Krasnapolsky, began on Monday, May 12 by highlighting how the industry is moving further towards Boundaryless Information Flow™. After the successful introduction of The Open Group Healthcare Forum in San Francisco, the Governing Board is now considering other vertical Forums such as the airline industry and utilities sector.

The morning plenary began with a welcome from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA). He mentioned that Amsterdam has a special place in his heart because it recalls the 2001 event, also held in Amsterdam just one month after the 9/11 attacks that shocked the world. Today, with almost 300 registrations and attendees from 29 countries, The Open Group continues to appeal to a wide range of nationalities.

Allen Brown, President and CEO of The Open Group, took the audience on a journey as he described the transformation that The Open Group has undergone over the thirty years since its inception in 1984. After a radical financial reorganization and the raising of new working capital, The Open Group is flourishing more than ever and is in good financial health.

Remarkably, 40 percent of the staff of 1984 still work for The Open Group. What is the secret? Having the right people in the boat, with shared values and commitment. “In 2014, The Open Group runs a business, but stays a not-for-profit organization, a consortium”, Brown emphasized. “Enterprise Architecture is not a commercial vehicle or a ‘trendy’ topic. The Open Group always has a positive attitude and will never criticize other organizations. Our certification programs are a differentiator compared to other organizations. We collaborate with other consortia and standards bodies like ISO and ITIL”, Brown said.

Now the world is much more complex. Technology risk is increasing. A common language based on common standards is needed more than ever. TOGAF®, an Open Group standard, was in its infancy in 1998 and is now the common standard for Enterprise Architects all over the world. In 1984, UNIX® was The Open Group's first platform. The Open Group Open Platform 3.0™, launched last year, focuses on new and emerging technology trends such as mobility, big data, cloud computing and the Internet of Things, which are converging with each other and leading to new business models and system designs. “The Open Group is all about building relationships and networking”, Brown concluded.

Leonardo Ramirez, CEO of ARCA SG and Chair of AEA Colombia, talked about the role of interoperability and Enterprise Architecture in Latin America. Colombia is now a safe country and has the strongest economy in the region. In 2011, Colombia promoted electronic government, and TOGAF was selected as the best choice for Enterprise Architecture. Ramirez is determined to stimulate socioeconomic development projects in Latin America with the help of Enterprise Architecture. A Colombian law (Regulation Law 1712, 2014) gives every citizen the right to access all public information without boundaries.

Dr. Jonas Ridderstråle, Chairman, Mgruppen and Visiting Professor, Ashridge (UK) and IE Business Schools (Spain), said in his keynote speech, “Womenomics rules; the big winners of the personal freedom movement will be women. Women are far more risk averse. What would have happened to Lehman Brothers if it had been managed by women? ‘Lehman Sisters’ would probably have survived. Now women can spend 80 percent of their time on things other than raising kids.” Ridderstråle went on to discuss life-changing and game-changing events throughout his presentation, noting that The Open Group Open Platform 3.0, for instance, is a good example of a successful reinvention.

“Towards a European Interoperability Architecture” was the title of one of the afternoon sessions, led by Mr. R. Abril Jimenez. Analysis during the first phase of the European Interoperability Strategy (EIS) found that, at the conceptual level, architecture guidelines were missing or inadequate. In particular, there are no architectural guidelines for cross-border interoperability of building blocks, and concrete, reusable interoperability guidelines and rules and principles on standards and architecture are also lacking. Based on the results achieved and the direction set in the previous phases of the action, the EIA project has moved into a more practical phase consisting of two main parts: Conceptual Reference Architecture and Cartography.

Other tracks featured Healthcare, Professional Development and Dependability through Assuredness™.

The evening concluded with a lively networking reception in the hotel’s Winter Garden ballroom.

For those of you who attended the summit, please give us your feedback!


Improving Patient Care and Reducing Costs in Healthcare

By Jason Lee, Director of Healthcare and Security Forums, The Open Group

Recently, The Open Group Healthcare Forum hosted a tweet jam to discuss IT and Enterprise Architecture (EA) issues as they relate to two of the most persistent problems in healthcare: reducing costs and improving patient care. Below I summarize the key points that followed from a rather unique discussion. Unique how? Unique in that rather than address these issues from the perspective of “must do” priorities (including EHR implementation, transitioning to ICD-10, and meeting enhanced HIPAA security requirements), we focused on “should do” opportunities.

We asked how stakeholders in the healthcare system can employ “Boundaryless Information Flow™” and standards development, applying EA approaches that have proven effective in other industries, to generate new insights and processes that reduce costs and improve quality.

Question 1: What barriers exist for collaboration among providers in healthcare, and what can be done to improve things?
• tetradian: Huge barriers of language, terminology, mindset, worldview, paradigm, hierarchy, role and much more
• jasonsleephd: Financial, organizational, structural, lack of enabling technology, cultural, educational, professional insulation
• jim_hietala: EHRs with proprietary interfaces represent a big barrier in healthcare
• Technodad: Isn’t question really what barriers exist for collaboration between providers and patients in healthcare?
• tetradian: Communication b/w patients and providers is only one (type) amongst very many
• Technodad: Agree. Debate needs to identify whose point of view the #healthcare problem is addressing.
• Dana_Gardner: Where to begin? A Tower of Babel exists on multiple levels among #healthcare ecosystems. Too complex to fix wholesale.
• EricStephens: Also, legal ramifications of sharing information may impede sharing
• efeatherston: Patient needs provider collaboration to see any true benefit (I don’t just go to one provider)
• Dana_Gardner: Improve first by identifying essential collaborative processes that have most impact, and then enable them as secure services.
• Technodad: In US at least, solutions will need to be patient-centric to span providers- Bring Your Own Wellness (BYOW™) for HC info.
• loseby: Lack of shared capabilities & interfaces between EHRs leads to providers w/o comprehensive view of patient
• EricStephens: Are incentives aligned sufficiently to encourage collaboration? + lack of technology integration.
• tetradian: Vast numbers of stakeholder-groups, many beyond medicine – e.g. pharma, university, politics, local care (esp. outside of US)
• jim_hietala: Gap in patient-centric information flow
• Technodad: I think patients will need to drive the collaboration – they have more incentive to manage info than providers.
• efeatherston: Agreed, stakeholder list could be huge
• EricStephens: High-deductible plans will drive patients (us) to own our health care experience
• Dana_Gardner: Take patient-centric approach to making #healthcare processes better: drives adoption, which drives productivity, more adoption
• jasonsleephd: Who thinks standards development and data sharing is an essential collaboration tool?
• tetradian: not always patient-centric – e.g. epidemiology /public-health is population centric – i.e. _everything_ is ‘the centre’
• jasonsleephd: How do we break through barriers to collaboration? For one thing, we need to create financial incentives to collaborate (e.g., ACOs)
• efeatherston: Agreed, the challenge is to get them to challenge (if that makes sense). Many do not question
• EricStephens: Some will deify those in a lab coat.
• efeatherston: Still do, especially older generations, cultural
• Technodad: Agree – also displaying, fusing data from different providers, labs, monitors etc.
• dianedanamac: Online collaboration, can be cost effective & promote better quality but must financially incented
• efeatherston: Good point, unless there is a benefit/incentive for provider, they may not be bothered to try
• tetradian: “must financially incented” – often other incentives work better – money can be a distraction – also who pays?

Participants identified barriers that are not atypical: financial disincentives, underpowered technology, failure to utilize existing capability, lack of motivation to collaborate. Yet all participants viewed more collaboration as key. Consensus developed around:
• The patient (and by one commenter, the population) as the main driver of collaboration, and
• The patient as the most important stakeholder at the center of information flow.

Question 2: Does implementing remote patient tele-monitoring and online collaboration drive better and more cost-effective patient care?
• EricStephens: “Hell yes” comes to mind. Why drag yourself into a dr. office when a device can send the information (w/ video)
• efeatherston: Will it? Will those with high deductible plans have ability/understanding/influence to push for it?
• EricStephens: Driving up participation could drive up efficacy
• jim_hietala: Big opportunities to improve patient care thru remote tele-monitoring
• jasonsleephd: Tele-ICUs can keep patients (and money) in remote settings while receiving quality care
• jasonsleephd: Remote monitoring of patients admitted with CHF can reduce rehospitalization w/i 6 months
• Dana_Gardner: Yes! Pacemakers now uplink to centralized analysis centers, communicate trends back to attending doctor. Just scratches surface
• efeatherston: Amen. Do that now, monthly uplink, annual check in with doctor to discuss any trends he sees.
• tetradian: Assumes tele-monitoring options even exist – very wide range of device-capabilities, from very high to not-much, and still not common.
• tetradian: (General request to remember that there’s more to the world, and medicine, than just the US and its somewhat idiosyncratic systems?)
• efeatherston: Yes, I do find myself looking through the lens of my own experiences, forgetting the way we do things may not translate
• jasonsleephd: Amen to point about our idiosyncrasies! Still, we have to live with them, and we can do so much better with good information flow!
• Dana_Gardner: Governments should remove barriers so more remote patient tele-monitoring occurs. Need to address the malpractice risks issue.
• TerryBlevins: Absolutely. Just want the information to go to the right place!
• Technodad: . Isn’t “right place” someplace you & all your providers can access? Need interoperability!
• TerryBlevins: It requires interoperability yes – the info must flow to those that must know.
• Technodad: Many areas where continuous monitoring can help. Improved IoT (internet of things) sensors e.g. cardio, blood chemistry coming.
• tetradian: Ethical/privacy concerns re how/with-whom that data is shared – e.g. with pharma, research, epidemiology etc
• efeatherston: Add employers to that etc. list of how/who/what is shared

Participants agreed that remote patient monitoring and tele-monitoring can improve collaboration, improve patient care, and put patients more in control of their own healthcare data. However, participants expressed concerns about the lack of widespread availability and the related issue of high cost. In addition, they raised important questions about who has access to these data, and they addressed nagging privacy and liability concerns.

Question 3: Can a mobile strategy improve patient experience, empowerment and satisfaction? If so, how?
• jim_hietala: mobile is a key area where patient health information can be developed/captured
• EricStephens: Example: link blood sugar monitor to iPhone to MyFitnessPal + gamification to drive adherence (and drive $$ down?)
• efeatherston: Mobile along with #InternetOfThings, wearables linked to mobile. Contact lens measuring blood sugar in recent article as ex.
• TerryBlevins: Sick people, or people getting sick are on the move. In a patient centric world we must match need.
• EricStephens: Mobile becomes a great data acquisition point. Something as simple as SMS can drive adherence with complication drug treatments
• jasonsleephd: mHealth is a very important area for innovation, better collaboration, $ reduction & quality improvement. Google recent “Webby Awards & handheld devices”
• tetradian: Mobile can help – e.g. use of SMS for medicine in Africa etc
• Technodad: Mobile isn’t option any more. Retail, prescription IoT, mobile network & computing make this a must-have.
• dianedanamac: Providers need to be able to receive the information mHealth
• Dana_Gardner: Healthcare should go location-independent. Patient is anywhere, therefore so is care, data, access. More than mobile, IMHO.
• Technodad: Technology and mobile demand will outrun regional provider systems, payers, regulation
• Dana_Gardner: As so why do they need to be regional? Cloud can enable supply-demand optimization regardless of location for much.
• TerryBlevins: And the caregivers are also on the move!
• Dana_Gardner: Also, more machine-driven care, i.e. IBM Watson, for managing the routing and prioritization. Helps mitigate overload.
• Technodad: Agree – more on that later!
• Technodad: Regional providers are the reality in the US. Would love to have more national/global coverage.
• Dana_Gardner: Yes, let the market work its magic by making it a larger market, when information is the key.
• tetradian: “let the market do its work” – ‘the market’ is probably the quickest way to destroy trust! – not a good idea…
• Technodad: To me, problem is coordinating among multi providers, labs etc. My health info seems to move at glacial pace then.
• tetradian: “Regional providers are the reality in the US.” – people move around: get info follow them is _hard_ (1st-hand exp. there…)
• tetradian: danger of hype/fear-driven apps – may need regulation, or at least regulatory monitoring
• jasonsleephd: Regulators, as in FDA or something similar?
• tetradian: “Regulators as in FDA” etc – at least oversight of that kind, yes (cf. vitamins, supplements, health-advice services)
• jim_hietala: mobile, consumer health device innovation moving much faster than IT ability to absorb
• tetradian: also beware of IT-centrism and culture – my 90yr-old mother has a cell-phone, but has almost no idea how to use it!
• Dana_Gardner: Information and rely of next steps (in prevention or acute care) are key, and can be mobile. Bring care to the patient ASAP.

Participants began in full agreement: mobile health is no longer merely an option but a given. They also recognized that providers' ability to receive mobile health information is lacking, and they viewed the cloud as a means of overcoming the problems of regionalized data storage. When the discussion turned to the further development of mHealth, there was some debate over what can be left to the market and whether some form of regulatory action is needed.

Question 4: Does better information flow and availability in healthcare reduce operation cost, and free up resources for more patient care?
• tetradian: A4: should do, but it’s _way_ more complex than most IT-folks seem to expect or understand (e.g. repeated health-IT fails in UK)
• jim_hietala: A4: removing barriers to health info flow may reduce costs, but for me it’s mostly about opportunity to improve patient care
• jasonsleephd: Absolutely. Consider claims processing alone. Admin costs in private health ins. are 20% or more. In Medicare less than 2%.
• loseby: Absolutely! ACO model is proving it. Better information flow and availability also significantly reduces hospital admissions
• dianedanamac: I love it when the MD can access my x-rays and lab results so we have more time.
• EricStephens: More info flow + availability -> less admin staff -> more med staff.
• EricStephens: Get the right info to the ER Dr. can save a life by avoiding contraindicated medicines
• jasonsleephd: EricStephens GO CPOE!!
• TerryBlevins: @theopengroup. believe so, but ask the providers. My doctor is more focused on patient by using simple tech to improve info flow
• tetradian: don’t forget link b/w information-flows and trust – if trust fails, so does the information-flow – worse than where we started!
• jasonsleephd: Yes! Trust is really key to this conversation!
• EricStephens: processing a claim, in most cases, should be no more difficult than an expense report or online order. Real-time adjudication
• TerryBlevins: Great point.
• efeatherston: Agreed should be, would love to see it happen. Trust in the data as mentioned earlier is key (and the process)
• tetradian: A4: sharing b/w patient and MD is core, yes, but who else needs to access that data – or _not_ see it? #privacy
• TerryBlevins: A4: @theopengroup can’t forget that if info doesn’t flow sometimes the consequences are fatal, so unblocked the flow.
• tetradian: .@TerryBlevins A4: “if info doesn’t flow sometimes the consequences are fatal,” – v.important!
• Technodad: . @tetradian To me, problem is coordinating among multi providers, labs etc. My health info seems to move at glacial pace then.
• TerryBlevins: A4: @Technodad @tetradian I have heard that a patient moving on a gurney moves faster than the info in a hospital.
• Dana_Gardner: A4 Better info flow in #healthcare like web access has helped. Now needs to go further to be interactive, responsive, predictive.
• jim_hietala: A4: how about pricing info flow in healthcare, which is almost totally lacking
• Dana_Gardner: A4 #BigData, #cloud, machine learning can make 1st points of #healthcare contact a tech interface. Not sci-fi, but not here either.

Starting from the recognition that this is a very complicated issue, the conversation quickly produced a consensus that better information flow is key to cost reduction, quality improvement and increased patient satisfaction. Trust emerged as a related issue: trust that information is accurate, available and well used underpins trust in the provider-patient relationship. Then, naturally, privacy concerns surfaced. Coordination of information flow and lack of interoperability were recognized as important barriers, and the conversation finally turned somewhat abstract and technical, with mentions of big data, the cloud and pricing information flows, without much in the way of specifying how to connect the dots.

Question 5: Do you think payers and providers are placing enough focus on using technology to positively impact patient satisfaction?
• Technodad: A5: I think there are positive signs but good architecture is lacking. Current course will end w/ provider information stovepipes.
• TerryBlevins: A5: @theopengroup Providers are doing more. I think much more is needed for payers – they actually may be worse.
• theopengroup: @TerryBlevins Interesting – where do you see opportunities for improvements with payers?
• TerryBlevins: A5: @theopengroup like was said below claims processing – an onerous job for providers and patients – mostly info issue.
• tetradian: A5: “enough focus on using tech”? – no, not yet – but probably won’t until tech folks properly face the non-tech issues…
• EricStephens: A5 No. I’m not sure patient satisfaction (customer experience/CX?) is even a factor sometimes. Patients not treated like customers
• dianedanamac: .@EricStephens SO TRUE! Patients not treated like customers
• Technodad: . @EricStephens Amen to that. Stovepipe data in provider systems is barrier to understanding my health & therefore satisfaction.
• dianedanamac: “@mclark497: @EricStephens issue is the customer is treat as only 1 dimension. There is also the family experience to consider too
• tetradian: .@EricStephens A5: “Patients not treated like customers” – who _is_ ‘the customer’? – that’s a really tricky question…
• efeatherston: @tetradian @EricStephens Trickiest question. to the provider is the patient or the payer the customer?
• tetradian: .@efeatherston “patient or payer” – yeah, though it gets _way_ more complex than that once we explore real stakeholder-relations
• efeatherston: @tetradian So true.
• jasonsleephd: .@tetradian @efeatherston Very true. There are so many diff stakeholders. But to align payers and pts would be huge
• efeatherston: @jasonsleephd @tetradian re: aligning payers and patients, agree, it would be huge and a good thing
• jasonsleephd: .@efeatherston @tetradian @EricStephens Ideally, there should be no dividing line between the payer and the patient!
• efeatherston: @jasonsleephd @tetradian @EricStephens Ideally I agree, and long for that ideal world.
• EricStephens: .@jasonsleephd @efeatherston @tetradian the payer s/b a financial proxy for the patient. and nothing more
• TerryBlevins: @EricStephens @jasonsleephd @efeatherston @tetradian … got a LOL out of me.
• Technodad: . @tetradian @EricStephens That’s a case of distorted marketplace. #Healthcare architecture must cut through to patient.
• tetradian: .@Technodad “That’s a case of distorted marketplace.” – yep. now add in the politics of consultants and their hierarchies, etc?
• TerryBlevins: A5: @efeatherston @tetradian @EricStephens in a patient centric world it is the patient and/or their proxy.
• jasonsleephd: A5: Not enough emphasis on how proven technologies and architectural structures in other industries can benefit healthcare
• jim_hietala: A5: distinct tension in healthcare between patient-focus and meeting mandates (a US issue)
• tetradian: .@jim_hietala A5: “meeting mandates (a US issue)” – UK NHS (national-health-service) may be even worse than US – a mess of ‘targets’
• EricStephens: A5 @jim_hietala …and avoiding lawsuits
• tetradian: A5: most IT-type tech still not well-suited to the level of mass-uniqueness inherent in the healthcare context
• Dana_Gardner: A5 They are using tech, but patient “satisfaction” not yet a top driver. We have a long ways to go on that. But it can help a ton.
• theopengroup: @Dana_Gardner Agree, there’s a long way to go. What would you say is the starting point for providers to tie the two together?
• Dana_Gardner: @theopengroup An incentive other than to avoid lawsuits. A transparent care ratings capability. Outcomes focus based on total health
• Technodad: A5: I’d be satisfied just to not have to enter my patient info & history on a clipboard in every different provider I go to!
• dianedanamac: A5 @tetradian Better data sharing & Collab. less redundancy, lower cost, more focus on patient needs -all possible w/ technology
• Technodad: A5: The patient/payer discussion is a red herring. If the patient weren’t there, rest of the system would be unnecessary.
• jim_hietala: RT @Technodad: The patient/payer discussion is a red herring. If the patient weren’t there, rest of system unnecessary. AMEN

Very interesting conversation. Positive signs of progress were noted but so too were indications that healthcare will remain far behind the technology curve in the foreseeable future. Providers were given higher “grades” than payers. Yet, claims processing would seemingly be one of the easiest areas for technology-assisted improvement. One discussant noted that there will not be enough focus on technology in healthcare “until the tech folks properly face the non-tech issues”. This would seem to open a wide door for EA experts to enter the healthcare domain! The barriers (and opportunities) to this may be the topic of another tweet jam, or Open Group White Paper.
Interestingly, part way into the discussion the topic turned to the lack of a real customer/patient focus in healthcare. Not enough emphasis on patient satisfaction. Not enough attention to patient outcomes. There needs to be a better/closer alignment between what motivates payers and the needs of patients.

Question 6: As some have pointed out, many of the EHR systems are highly proprietary, how can standards deliver benefits in healthcare?
• jim_hietala: A6: Standards will help by lowering the barriers to capturing data, esp. for mhealth, and getting it to point of care
• tetradian: .@jim_hietala “esp. for mhealth” – focus on mhealth may be a way to break the proprietary logjam, ‘cos it ain’t proprietary yet
• TerryBlevins: A6: @theopengroup So now I deal with at least 3 different EHR systems. All requiring me to be the info steward! Hmmm
• TerryBlevins: A6 @theopengroup following up if they shared data through standards maybe they can synchronize.
• EricStephens: A6 – Standards lead to better interoperability, increased viscosity of information which will lead to lowers costs, better outcomes.
• efeatherston: @EricStephens and greater trust in the info (as was mentioned earlier, trust in the information key to success)
• jasonsleephd: A6: Standards development will not kill innovation but rather make proprietary systems interoperable
• Technodad: A6: Metcalfe’s law rules! HC’s many providers-many patients structure means interop systems will be > cost effective in long run.
• tetradian: A6: the politics of this are _huge_, likewise the complexities – if we don’t face those issues right up-front, this is going nowhere

In his April 24, 2014 blog post, Tom Graves provided a clearly stated position on the role of The Open Group in delivering standards to help healthcare improve. He wrote:

“To me, this is where The Open Group has an obvious place and a much-needed role, because it’s more than just an IT-standards body. The Open Group membership are mostly IT-type organisations, yes, which tends to guide towards IT-standards, and that’s unquestionably of importance here. Yet perhaps the real role for The Open Group as an organisation is in its capabilities and experience in building consortia across whole industries: EMMM™ and FACE are two that come immediately to mind. Given the maze of stakeholders and the minefields of vested-interests across the health-context, those consortia-building skills and experience are perhaps what’s most needed here.”

The Open Group is the ideal organization to engage in this work. There are many ways to collaborate. You can join The Open Group Healthcare Forum, follow the Forum on Twitter @ogHealthcare and connect on The Open Group Healthcare Forum LinkedIn Group.

Jason Lee, Director of Healthcare and Security Forums at The Open Group, has conducted healthcare research, policy analysis and consulting for over 20 years. He is a nationally recognized expert in healthcare organization, finance and delivery and applies his expertise to a wide range of issues, including healthcare quality, value-based healthcare, and patient-centered outcomes research. Jason worked for the legislative branch of the U.S. Congress from 1990-2000 — first at GAO, then at CRS, then as Health Policy Counsel for the Chairman of the House Energy and Commerce Committee (in which role the National Journal named him a “Top Congressional Aide” and he was profiled in the Almanac of the Unelected). Subsequently, Jason held roles of increasing responsibility with non-profit organizations — including AcademyHealth, NORC, NIHCM, and NEHI. Jason has published quantitative and qualitative findings in Health Affairs and other journals and his work has been quoted in Newsweek, the Wall Street Journal and a host of trade publications. He is a Fellow of the Employee Benefit Research Institute, was an adjunct faculty member at the George Washington University, and has served on several boards. Jason earned a Ph.D. in social psychology from the University of Michigan and completed two postdoctoral programs (supported by the National Science Foundation and the National Institutes of Health). He is the proud father of twins and lives outside of Boston.


The Open Group Summit Amsterdam – ArchiMate® Day – May 14, 2014

By Andrew Josey, Director of Standards, The Open Group

The Open Group Summit 2014 Amsterdam features an all day track on the ArchiMate® modeling language, followed by an ArchiMate Users Group meeting in the evening. The meeting attendees include the core developers of the ArchiMate language, users and tool developers.

The sessions include tutorials, a panel session on the past, present and future of the language and case studies. The Users Group meeting follows in the evening. The evening session is free and open to all — whether attending the rest of the conference or not — and starts at 6pm with free beer and pizza!

The timetable for ArchiMate Day is as follows:

• Tutorials (09:00 – 10:30), Henry Franken, CEO, BiZZdesign, and Alan Burnett, COO & Consulting Head, Corso

Henry Franken will show how the TOGAF® and ArchiMate® standards can be used to provide an actionable EA capability. Alan Burnett will present on how the ArchiMate language can be extended to support roadmapping, which is a fundamental part of strategic planning and enterprise architecture.

• Panel Discussion (11:00 – 12:30), Moderator: Henry Franken, Chair of The Open Group ArchiMate Forum

The topic for the Panel Discussion is the ArchiMate Language — Past, Present and Future. The panel comprises key developers and users of the ArchiMate® language, including Marc Lankhorst and Henk Jonkers from the ArchiMate Core team, Jan van Gijsen from SNS REAAL, a Dutch financial institution, and Gerben Wierda, author of Mastering ArchiMate. The session will include brief updates on current status from the panel members (30 minutes) and a 60-minute panel discussion with questions from the moderator and audience.

• Case Studies (14:00 – 16:00), Geert Van Grootel, Senior Researcher, Department of Economy, Science & Innovation, Flemish Government; Patrick Derde, Consultant, Envizion; Pieter De Leenheer, Co-Founder and Research Director, Collibra; Walter Zondervan, Member, Architectural Board, ASL-BiSL Foundation; and Adina Aldea, BiZZdesign

There are three case studies:

Geert Van Grootel, Patrick Derde, and Pieter De Leenheer will present on how you can manage your business metadata using data model patterns and an Integrated Information Architecture approach supported by ArchiMate, a standard formal architecture language.

Walter Zondervan will present an ArchiMate reference architecture for governance, based on BiSL.

Adina Aldea will present on how high level strategic models can be used and modelled based on the Strategizer method.

• ArchiMate Users Group Meeting (18:00 – 21:00)

The evening session is free and open to all — whether attending the rest of the conference or not. It will start at 6pm with free beer and pizza. Invited speakers for the Users Group Meeting include Andrew Josey, Henk Jonkers, Marc Lankhorst and Gerben Wierda:

- Andrew Josey will present on the ArchiMate certification program and adoption of the language
- Henk Jonkers will present on modeling risk and security
- Marc Lankhorst will present on capability modeling in ArchiMate
- Gerben Wierda will present on relating ArchiMate and BPMN

Why should you attend?
• Spend time interacting directly with other ArchiMate users and tool providers in a relaxed, engaging environment
• Opportunity to listen and understand how ArchiMate can be used to develop solutions to common industry problems
• Learn about the future directions and meet with key users and developers of the language and tools
• Interact with peers to broaden your expertise and knowledge in the ArchiMate language

For detailed information, see the ArchiMate Day agenda or our YouTube event video.

How to register

Registration for the ArchiMate® Users Group meeting is independent of The Open Group Conference registration. There is no fee, but registration is required. Please register here: select the one-day pass for pass type, enter the promotion code (AMST14-AUG), tick the box for Wednesday, May 14th, and select ArchiMate Users Group from the conference session list. You will then be registered for the event and should not be charged. Please note that this promotion code should only be used by those attending only the evening meeting from 6:00 p.m.; anyone attending the conference or just the ArchiMate Day will have to pay the applicable registration fee. User Group members who want to attend The Open Group conference and who are not members of The Open Group can register using the affiliate code AMST14-AFFIL.

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.



Heartbleed: Tips and Lessons Learned

By Jim Hietala, VP, Security, The Open Group

During our upcoming event, The Open Group Summit 2014 Amsterdam, "Enabling Boundaryless Information Flow™" (May 12-14), one of the discussions will be around risk management and the development of open methodologies for managing risk.

Managing risk is an essential component of an information security program. Risk management is fundamental to effectively securing information, IT assets, and critical business processes. Risk management is also a challenge to get right. With numerous risk management frameworks and standards available, it can be difficult for practitioners to know where to start, and what methodologies to employ.

Recently, the Heartbleed bug has been wreaking havoc, not only for major websites and organizations, but also for the public's confidence in security generally. Even as patches are applied, systems will remain vulnerable for an extended period of time. Taking proactive steps and learning how to manage risk is imperative to securing your privacy.

With impacts on an estimated 60-70% of websites, Heartbleed is arguably the security vulnerability with the greatest potential impact to date. There is helpful guidance available on what end-users can do to insulate themselves from negative consequences.

Large organizations obviously need to determine where they have websites and network equipment that are vulnerable, in order to remediate rapidly. Scanning your IP address range (both internal addresses and IP addresses exposed to the Internet) should be done ASAP, to identify all sites, servers, and other equipment using OpenSSL that need immediate patching.
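Once an inventory of OpenSSL deployments exists, triage comes down to checking each reported version against the affected range. As a small illustrative sketch (not from the article; the function name is my own, and the version handling is deliberately simplified, ignoring suffixes such as "-fips"), the vulnerable range can be encoded directly:

```python
import re

# Heartbleed (CVE-2014-0160) affects OpenSSL 1.0.1 through 1.0.1f.
# 1.0.1g is fixed, and the 1.0.0 and 0.9.8 branches never shipped the
# vulnerable heartbeat code.
def is_heartbleed_vulnerable(version: str) -> bool:
    """Return True if an OpenSSL version string falls in the 1.0.1-1.0.1f range."""
    m = re.match(r"1\.0\.1([a-z]?)$", version.strip())
    if not m:
        return False  # not on the 1.0.1 branch (or an unparseable string)
    patch_letter = m.group(1)
    # bare "1.0.1" and patch letters a-f are vulnerable; "g" and later are patched
    return patch_letter == "" or patch_letter <= "f"
```

In practice a scanner would test the heartbeat behavior of each live endpoint rather than trust a version banner, but a check like this is useful for sweeping software inventories and package manifests.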

In the last few days, it has become clear that we are not just talking about websites and web servers. Numerous network equipment vendors have used OpenSSL in their networking products. Look closely at your routers, switches, and firewalls, and make sure you understand which of them also ship OpenSSL. The impact of OpenSSL and Heartbleed on these infrastructure components is likely to be a bigger problem for organizations, as the top router manufacturers all have products affected by this vulnerability.

Taking a step back from the immediate frenzy of finding OpenSSL, and patching websites and network infrastructure to mitigate this security risk, it is pretty clear that we have a lot of work to do as a security community on numerous fronts:

• Open source security components that gain widespread use need much more serious attention, in terms of finding and fixing software vulnerabilities.
• For IT hardware and software vendors, and for the organizations that consume their products, OpenSSL and Heartbleed will become the poster child for why we need more rigorous supply chain security mechanisms generally, and specifically for commonly used open source software.
• The widespread impacts from Heartbleed should also focus attention on the need for radically improved security for the emerging Internet of Things (IoT). As bad as Heartbleed is, try to imagine a similar situation when there are billions of IP devices connected to the internet. This is precisely where we are headed absent big changes in software assurance/supply chain security for IoT devices.

Finally, there is a deeper issue here: CIOs and IT people should realize that fundamental security barriers, such as SSL, are under constant attack, and these security walls won't hold forever. So it is important not simply to patch your SSL and reissue your certificates, but to rethink your strategies for security defense in depth, such as increased protection of critical data and multiple independent levels of security.

You also need to ensure that your suppliers are implementing security practices that are at least as good as yours – how many web sites got caught out by Heartbleed because of something their upstream supplier did?

Discussions during the Amsterdam Summit will outline important areas to be aware of when managing security risk, including how to be more effective against any copycat bugs. Be sure to sign up now for our summit.

For more information on The Open Group Security Forum, please visit

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.



Improving Patient Care and Reducing Costs in Healthcare – Join The Open Group Tweet Jam on Wednesday, April 23

By Jason Lee, Director of Healthcare and Security Forums, The Open Group

On Wednesday, April 23 at 9:00 am PT/12:00 pm ET/5:00 pm GMT, The Open Group Healthcare Forum will host a tweet jam to discuss the issues around healthcare and improving patient care while reducing costs. Many healthcare payer and provider organizations today are facing numerous “must do” priorities, including EHR implementation, transitioning to ICD-10, and meeting enhanced HIPAA security requirements.

This tweet jam will focus on opportunities that healthcare organizations have available to improve patient care and reduce costs associated with capturing, maintaining, and sharing patient information. It will also explore how using Enterprise Architectural approaches that have proven effective in other industries will apply to the healthcare sector and dramatically improve both costs and patient care.

In addition to the need for implementing integrated digital health records that can be shared across health organizations to maximize care, both for patients who don't want to repeat themselves and for the doctors providing their care, we'll explore what other solutions exist to enhance information flow. For example, did you know that a new social network for M.D.s has emerged to connect and communicate across teams, hospitals and entire health systems? The new network, called Doximity, boasts that 40 percent of U.S. doctors have signed on. Not only are doctors using social media, they're using software specifically designed for the iPad, which roughly 68 percent of doctors are carrying around. One hospital even calculated its return on investment of utilizing an iPad in just nine days!

We’ll be talking about how many healthcare thought leaders are looking at technology and its influence on online collaboration, patient telemonitoring and information flow.

We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought-leaders including Jim Hietala, Vice President of Security; David Lounsbury, CTO; and Dr. Chris Harding, Forum Director of Open Platform 3.0™ Forum. To access the discussion, please follow the hashtag #ogchat during the allotted discussion time.

Interested in joining The Open Group Healthcare Forum? Register your interest, here.

What Is a Tweet Jam?

The Open Group tweet jam, approximately 45 minutes in length, is a “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

Have your first #ogchat tweet be a self-introduction: name, affiliation, occupation.

Start all other tweets with the question number you’re responding to and add the #ogchat hashtag.

Sample: Q1 What barriers exist for collaboration among providers in healthcare, and what can be done to improve things? #ogchat

Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.

While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue.

A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please contact Rob Checkal (@robcheckal). We anticipate a lively chat and hope you will be able to join!

Jason Lee, Director of Healthcare and Security Forums at The Open Group, has conducted healthcare research, policy analysis and consulting for over 20 years. He is a nationally recognized expert in healthcare organization, finance and delivery and applies his expertise to a wide range of issues, including healthcare quality, value-based healthcare, and patient-centered outcomes research. Jason worked for the legislative branch of the U.S. Congress from 1990-2000 — first at GAO, then at CRS, then as Health Policy Counsel for the Chairman of the House Energy and Commerce Committee (in which role the National Journal named him a "Top Congressional Aide" and he was profiled in the Almanac of the Unelected). Subsequently, Jason held roles of increasing responsibility with non-profit organizations — including AcademyHealth, NORC, NIHCM, and NEHI. Jason has published quantitative and qualitative findings in Health Affairs and other journals and his work has been quoted in Newsweek, the Wall Street Journal and a host of trade publications. He is a Fellow of the Employee Benefit Research Institute, was an adjunct faculty member at the George Washington University, and has served on several boards. Jason earned a Ph.D. in social psychology from the University of Michigan and completed two postdoctoral programs (supported by the National Science Foundation and the National Institutes of Health). He is the proud father of twins and lives outside of Boston.




ArchiMate® Q&A with Phil Beauvoir

By The Open Group

The Open Group’s upcoming Amsterdam Summit in May will feature a full day on May 14 dedicated to ArchiMate®, an open and independent modeling language for Enterprise Architecture, supported by tools that allow Enterprise Architects to describe, analyze and visualize relationships among business domains in an unambiguous way.

One of the tools developed to support ArchiMate is Archi, a free, open-source tool created by Phil Beauvoir at the University of Bolton in the UK as part of a Jisc-funded Enterprise Architecture project that ran from 2009-2012. Since its development, Archi has grown from a relatively small, home-grown tool to become a widely used open-source resource that averages 3,000 downloads per month and whose community ranges from independent practitioners to Fortune 500 companies. Here we talk with Beauvoir about how Archi was developed, the problems inherent in sustaining an open source product, its latest features and whether it was named after the Archie comic strip.

Beauvoir will be a featured speaker during the ArchiMate Day in Amsterdam.

Tell us about the impetus for creating the Archi tool and how it was created…
My involvement with the ArchiMate language has mainly been through the development of the software tool, Archi. Archi has, I believe, acted as a driver and as a hub for activity around the ArchiMate language and Enterprise Architecture since it was first created.

I’ll tell you the story of how Archi came about. Let’s go back to the end of 2009. At that point, I think ArchiMate and Enterprise Architecture were probably being used quite extensively in the commercial sector, especially in The Netherlands. The ArchiMate language had been around for a while at that point but was a relatively new thing to many people, at least here in the UK. If you weren’t part of the EA scene, it would have been a new thing to you. In the UK, it was certainly new for many in higher education and universities, which is where I come in.

Jisc, the UK funding body, funded a number of programs in higher education exploring digital technologies and other initiatives. One of the programs being funded was to look at how to improve systems using Enterprise Architecture within the university sector. Some of the universities had already been led to ArchiMate and Enterprise Architecture and were trying it out for themselves – they were new to it and, of course, one of the first things they needed were tools. At that time, and I think it’s still true today, a lot of the tools were quite expensive. If you’re a big commercial organization, you might be able to afford the licensing costs for tools and support, but for a small university project it can be prohibitive, especially if you’re just dipping your toe into something like this. So some colleagues within Jisc and the university I worked at said, ‘well, what about creating a small, open source project tool which isn’t over-complicated but does enough to get people started in ArchiMate? And we can fund six months of money to do this as a proof of concept tool’.

That takes us into 2010, when I was working for the university that was approached to do this work. After six months, by June 2010, I had created the first 1.0 version of Archi and it was (and still is) free, open source and cross-platform. Some of the UK universities said ‘well, that’s great, because now the barrier to entry has been lowered, we can use this tool to start exploring the ArchiMate language and getting on board with Enterprise Architecture’. That’s really where it all started.

So some of the UK universities that were exploring ArchiMate and Enterprise Architecture had a look at this first version of Archi, version 1.0, and said ‘it’s good because it means that we can engage with it without committing at this stage to the bigger tooling solutions.’ You have to remember, of course, that universities were (and still are) a bit strapped for cash, so that’s a big issue for them. At the time, and even now, there really aren’t any other open-source or free tools doing this. That takes us to June 2010. At this point we got some more funding from the Jisc, and kept on developing the tool and adding more features to it. That takes us through 2011 and then up to the end of 2012, when my contract came to an end.

Since the official funding ended and my contract finished, I’ve continued to develop Archi and support the community that’s built up around it. I had to think about the sustainability of the software beyond the project, and sometimes this can be difficult, but I took it upon myself to continue to support and develop it and to engage with the Archi/ArchiMate community.

How did you get involved with The Open Group and bringing the tool to them?
I think it was inevitable really due to where Archi originated, and because the funding came from the Jisc, and they are involved with The Open Group. So, I guess The Open Group became aware of Archi through the Jisc program and then I became involved with the whole ArchiMate initiative and The Open Group. I think The Open Group is in favor of Archi, because it’s an open source tool that provides a neutral reference implementation of the ArchiMate language. When you have an open standard like ArchiMate, it’s good to have a neutral reference model implementation.

How is this tool different from other tools out there and what does it enable people to do?
Well, firstly Archi is a tool for modeling Enterprise Architecture using the ArchiMate language and notation, but what really makes it stand out from the other tools is its accessibility and the fact that it is free, open source and cross-platform. It can do a lot of, if not all of, the things that the bigger tools provide without any financial or other commitment. However, free is not much use if there’s no quality. One thing I’ve always strived for in developing Archi is to ensure that even if it only does a few things compared with the bigger tools, it does those things well. I think with a tool that is free and open-source, you have a lot of support and good-will from users who provide positive encouragement and feedback, and you end up with an interesting open development process.

I suppose you might regard Archi’s relationship to the bigger ArchiMate tools in the same way as you’d compare Notepad to Microsoft Word. Notepad provides the essential writing features, but if you want to go for the full McCoy then you go and buy Microsoft Word. The funny thing is, this is where Archi was originally targeted – at beginners, getting people to start to use the ArchiMate language. But then I started to get emails — even just a few months after its first release — from big companies, insurance companies and the like saying things like ‘hey, we’re using this tool and it’s great, and ‘thanks for this, when are we going to add this or that feature?’ or ‘how many more features are you going to add?’ This surprised me somewhat since I wondered why they hadn’t invested in one of the available commercial tools. Perhaps ArchiMate, and even Enterprise Architecture itself, was new to these organizations and they were using Archi as their first software tool before moving on to something else. Having said that, there are some large organizations out there that do use Archi exclusively.

Which leads to an interesting dilemma — if something is free, how do you continue developing and sustaining it? This is an issue that I’m contending with right now. There is a PayPal donation button on the front page of the website, but the software is open source and, in its present form, will remain open source; but how do you sustain something like this? I don’t have the complete answer right now.

Given that it’s a community product, it helps that the community contributes ideas and develops code, but at the same time you still need someone to give their time to coordinate all of the activity and support. I suppose the classic model is one of sponsorship, but we don’t have that right now, so at the moment I’m dealing with issues around sustainability.

How much has the community contributed to the tool thus far?
The community has contributed a lot in many different ways. Sometimes a user might find a bug and report it or they might offer a suggestion on how a feature can be improved. In fact, some of the better features have been suggested by users. Overall, community contributions seem to have really taken off more in the last few months than in the whole lifespan of Archi. I think this may be due to the new Archi website and a lot more renewed activity. Lately there have been more code contributions, corrections to the documentation and user engagement in the future of Archi. And then there are users who are happy to ask ‘when is Archi going to implement this big feature, and when is it going to have full support for repositories?’ and of course they want this for free. Sometimes that’s quite hard to accommodate, because you think ‘sure, but who’s going to do all this work and contribute the effort.’ That’s certainly an interesting issue for me.

How many downloads of the tool are you getting per month? Where is it being used?
At the moment we’re seeing around 3,000 downloads a month of the tool — I think that’s a lot actually. Also, I understand that some EA training organizations use Archi for their ArchiMate training, so there are quite a few users there, as well.

The number one country for downloading the app and visiting the website is the Netherlands, followed by the UK and the United States. In the past three months, the UK and The Netherlands have been about equal in numbers in their visits to the website and downloads, followed by the United States, France, Germany, Canada, then Australia, Belgium, and Norway. We have some interest from Russia too. Sometimes it depends on whether ArchiMate or Archi is in the news at any given time. I’ve noticed that when there’s a blog post about ArchiMate, for example, you’ll see a spike in the download figures and the number of people visiting the website.

How does the tool fit into the overall schema of the modeling language?
It supports all of the ArchiMate language concepts, and I think it offers the core functionality you'd want from an ArchiMate modeling tool — the ability to create diagrams, viewpoints, analysis of model objects, reporting, color schemes and so on. Of course, the bigger ArchiMate tools will let you manipulate the model in more sophisticated ways and create more detailed reports and outputs. This is an area that we are trying to improve, and the people who are now actively contributing to Archi are full-time Enterprise Architects who are able to contribute to these areas. For example, we have a user and contributor from France, and he and his team use Archi, and so they are able to see first-hand where Archi falls short and they are able to say 'well, OK, we would like it to do this, or that could be improved,' so now they're working towards strengthening any weak areas.

How did you come up with the name?
What happens is you have pet names for projects and I think it just came about that we started calling it “Archie,” like the guy’s name. When it was ready to be released I said, ‘OK, what should we really call the app?’ and by that point everyone had started to refer to it as “Archie.” Then somebody said ‘well, everybody’s calling it by that name so why don’t we just drop the “e” from the name and go with that?’ – so it became “Archi.” I suppose we could have spent more time coming up with a different name, but by then the name had stuck and everybody was calling it that. Funnily enough, there’s a comic strip called ‘Archie’ and an insurance company that was using the software at the time told me that they’d written a counterpart tool called ‘Veronica,’ named after a character in the comic strip.

What are you currently working on with the tool?
For the last few months, I’ve been adding new features – tweaks, improvements, tightening things up, engaging with the user community, listening to what’s needed and trying to implement these requests. I’ve also been adding new resources to the Archi website and participating on social media like Twitter, spreading the word. I think the use of social media is really important. Twitter, the User Forums and the Wikis are all points where people can provide feedback and engage with me and other Archi developers and users. On the development side of things, we host the code at GitHub, and again that’s an open resource that users and potential developers can go to. I think the key words are ‘open’ and ‘community driven.’ These social media tools, GitHub and the forums all contribute to that. In this way everyone, from developer to user, becomes a stakeholder – everyone can play their part in the development of Archi and its future. It’s a community product and my role is to try and manage it all.

What will you be speaking about in Amsterdam?
I think the angle I’m interested in is what can be achieved by a small number of people taking the open source approach to developing software and building and engaging with the community around it. For me, the interesting part of the Archi story is not so much about the software itself and what it does, but rather the strong community that’s grown around it, the extent of the uptake of the tool and the way in which it has enabled people to get on board with Enterprise Architecture and ArchiMate. It’s the accessibility and agility of this whole approach that I like and also the activity and buzz around the software and from the community – that for me is the interesting thing about this process.

For more information on ArchiMate, please visit:

For information on the Archi tool, please visit:

For information on joining the ArchiMate Forum, please visit:

Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University, and, later, the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivery of standards-compliant learning objects and meta-data, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.



How the Open Trusted Technology Provider Standard (O-TTPS) and Accreditation Will Help Lower Cyber Risk

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

Changing business dynamics and enabling technologies

In 2008, IBM introduced the concept of a “Smarter Planet.” The Smarter Planet initiative focused, in part, on the evolution of globalization against the backdrop of changing business dynamics and enabling technologies. A key concept was the need for infrastructure to be tightly integrated, interconnected, and intelligent, thereby facilitating collaboration between people, government and businesses in order to meet the world’s growing appetite for data and automation. Since then, many industries and businesses have adopted this approach, including the ICT (information and communications technology) industries that support the global technology manufacturing supply chain.

Intelligent and interconnected critical systems

This transformation has infused technology into virtually all aspects of our lives, and involves, for example, government systems, the electric grid and healthcare. Most of these technological solutions are made up of hundreds or even thousands of components that are sourced from the growing global technology supply chain.

In the global technology economy, no single technology vendor or integrator can always provide a single-source solution. It is no longer cost-competitive to design all of the electronic components, printed circuit boards, card assemblies, or other sub-assemblies in-house. Adapting to the changing marketplace and landscape by balancing response time and cost efficiency drives more widespread use of OEM (original equipment manufacturer) products.

As a result, most technology providers procure from a myriad of global component suppliers, who very often require similarly complex supply chains to source their components. Every enterprise has a supplier network, and each of their suppliers has a supply chain network, and these sub-tier suppliers have their own supply chain networks. The resultant technology supply chain is manifested into a network of integrated suppliers.

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Cybersecurity by design: Addressing risk in a sustainable way across the ecosystem

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fit for purpose is manifested in two essential ways:

- Does the product meet essential functional requirements?
- Has the product or component been produced by a trustworthy provider?

Of course, the leaders or owners of these systems have to do their part to achieve security and safety: e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats. Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

Assuring the quality and integrity of mission-critical technology

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

The very process of manufacturing technology is not immune to cyber-attack. Supply chain attacks are typically motivated by monetary gain; their goals range from inflicting massive economic damage in an effort to gain global economic advantage, to seeding targets with malware that provides attackers with unfettered access.

It is for this reason that the global technology manufacturing industry must establish practices that mitigate this risk, both by raising the cost barriers to launching such attacks and by increasing the likelihood of being caught before the effects of an attack are irreversible. As these threats evolve, the global ICT industry must deploy enhanced security through advanced automated cyber intelligence analysis. As critical infrastructure becomes more automated, integrated and essential to critical functions, the technology supply chain that surrounds it must be considered a principal theme of the overall global security and risk mitigation strategy.

A global, agile, and scalable approach to supply chain security

Certainly, the manner in which technologies are invented, produced, and sold demands a global, agile, and scalable approach to supply chain assurance. Any technology supply chain security standard that hopes to be widely adopted must be flexible and country-agnostic. The very nature of the global supply chain (massively segmented and diverse) requires an approach that provides practicable guidance while avoiding being overly prescriptive. Such an approach requires aggregating industry practices that have proven beneficial and effective at mitigating risk.

The OTTF (The Open Group Trusted Technology Forum) is an increasingly recognized and promising industry initiative to establish best practices that mitigate the risk of technology supply chain attack. Facilitated by The Open Group, a recognized international standards and certification body, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Current membership includes many of the best-known technology vendors, integrators, and technology assessment laboratories.

The benefits of O-TTPS for governments and enterprises

IBM is currently a member of the OTTF and has been honored to hold the Chair for the last three years. Governments and enterprises alike will benefit from the work of the OTTF. Technology purchasers can use the Open Trusted Technology Provider™ Standard (O-TTPS) and Framework best-practice recommendations to guide their strategies.

A wide range of technology vendors can use O-TTPS approaches to build security and integrity into their end-to-end supply chains. The first version of the O-TTPS focuses on mitigating the risk of maliciously tainted and counterfeit technology components or products. Note that a maliciously tainted product is one that was produced by the provider and acquired through reputable channels, but has been tampered with maliciously. A counterfeit product is one produced other than by or for the provider, or supplied through a non-reputable channel, and represented as legitimate. The OTTF is currently working on a program that will accredit technology providers who conform to the O-TTPS. IBM expects to complete pilot testing of the program by 2014.
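The distinction between tainted and counterfeit products suggests why cryptographic integrity checks are a common mitigation in practice. As an illustrative sketch only (this is not part of the O-TTPS itself, and the function name is hypothetical), a purchaser might verify a downloaded component against a digest published by the provider through a trusted channel:

```python
import hashlib

def verify_component(path: str, expected_sha256: str) -> bool:
    """Compare a component's SHA-256 digest to the provider-published value.

    A mismatch may indicate a maliciously tainted or counterfeit artifact
    (or simple corruption); either way, the component should not be trusted.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large components do not need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256.lower()
```

A digest check of this kind only detects tampering after the provider published the reference value; real supply chain assurance also depends on the provenance of the published digest itself, which is why standards such as the O-TTPS address process and sourcing practices rather than any single technical control.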

IBM has actively supported the formation of the OTTF and the development of the O-TTPS for several reasons. These include but are not limited to the following:

- The Forum was established within a trusted and respected international standards body, The Open Group.
- The Forum was founded, in part, through a true public-private partnership in which government members actively participate.
- The OTTF membership includes some of the most mature and trusted commercial technology manufacturers and vendors.
- A primary objective of the OTTF is harmonization with other standards groups such as ISO (International Organization for Standardization) and Common Criteria.

The O-TTPS defines a framework of organizational guidelines and best practices that enhance the security and integrity of COTS ICT. The first version of the O-TTPS focuses on mitigating certain risks of maliciously tainted and counterfeit products within the technology development and engineering lifecycle. These best practices are equally applicable to systems integrators; however, the standard is intended primarily to address the point of view of the technology manufacturer.

O-TTPS requirements

The O-TTPS requirements are divided into three categories:

1. Development / Engineering Process and Method
2. Secure Engineering Practices
3. Supply Chain Security Practices

The O-TTPS is intended to establish a normalized set of criteria against which a technology provider, component supplier, or integrator can be assessed. The standard is divided into categories that define best practices for engineering development practices, secure engineering, and supply chain security and integrity intended to mitigate the risk of maliciously tainted and counterfeit components.
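To make the idea of a normalized set of assessment criteria concrete, a minimal sketch of a self-assessment record is shown below. The three category names come from the standard itself; the class, attribute names, and scoring logic are illustrative assumptions, not anything defined by the O-TTPS:

```python
from dataclasses import dataclass, field

# The three O-TTPS requirement categories (from the standard).
CATEGORIES = (
    "Development / Engineering Process and Method",
    "Secure Engineering Practices",
    "Supply Chain Security Practices",
)

@dataclass
class SelfAssessment:
    # Maps each category to a list of (requirement, satisfied) findings.
    findings: dict = field(default_factory=lambda: {c: [] for c in CATEGORIES})

    def record(self, category: str, requirement: str, satisfied: bool) -> None:
        """Record one finding against a recognized category."""
        if category not in self.findings:
            raise ValueError(f"unknown category: {category}")
        self.findings[category].append((requirement, satisfied))

    def gaps(self) -> list:
        """Unsatisfied requirements: candidates for remediation before
        formally entering accreditation."""
        return [
            (category, requirement)
            for category, results in self.findings.items()
            for requirement, ok in results
            if not ok
        ]
```

Structuring findings this way mirrors the lesson (discussed later in this article) of pre-assessing and remediating gaps before a formal third-party assessment begins.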

The accreditation program

As part of the process for developing the accreditation criteria and policy, the OTTF established a pilot accreditation program. The purpose of the pilot was to take a handful of companies through the accreditation process and remediate any potential process or interpretation issues. IBM participated in the O-TTPS accreditation pilot to accredit a very significant segment of its software product portfolio: the Application Infrastructure Middleware Division (AIM), which includes the flagship WebSphere product line. The AIM pilot started in mid-2013, completed in the first week of 2014, and was formally recognized as accredited in the first week of February 2014.

IBM is currently leveraging the value of the O-TTPS and working to accredit additional development organizations. Some of the lessons learned during the IBM AIM initial O-TTPS accreditation include:

- Conduct a pre-assessment against the O-TTPS before formally entering accreditation. This allows gaps to be remediated in advance and reduces potential assessment costs and schedule impact.
- Start with a segment of your development portfolio that has mature secure engineering practices and processes. This helps an organization address the accreditation requirements and facilitates interactions with the third-party lab.
- Use your first successful O-TTPS accreditation to create templates that drive data gathering and validate practices, establishing a repeatable process as your organization undertakes additional accreditations.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM’s industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). Andras holds undergraduate degrees in Biology and Computer Science and a Master’s degree in Computer Science from James Madison University. He has been a driving force behind IBM’s adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.
