
Securing Business Operations and Critical Infrastructure: Trusted Technology, Procurement Paradigms, Cyber Insurance

Following is the transcript of an Open Group discussion on ways to address supply chain risk in the information technology sector marketplace.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Sponsor: The Open Group

Dana Gardner: Hello, and welcome to a special Thought Leadership Panel Discussion, coming to you in conjunction with The Open Group’s upcoming conference on July 20, 2015 in Baltimore.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator as we explore ways to address supply chain risk in the information technology sector market.

We’ll specifically examine how The Open Group Trusted Technology Forum (OTTF) standards and accreditation activities are enhancing the security of global supply chains and improving the integrity of openly available IT products and components.

We’ll also learn how the age-old practice of insurance is coming to bear on the problem of IT supply-chain risk, and how, by leveraging insurance models, the specter of supply-chain disruption and security breaches may be significantly reduced.

To update us on the work of the OTTF and explain the workings and benefits of supply-chain insurance, we’re joined by our panel of experts. Please join me in welcoming Sally Long, Director of The Open Group Trusted Technology Forum. Welcome, Sally.

Sally Long: Thank you.

Gardner: We’re also here with Andras Szakal, Vice President and Chief Technology Officer for IBM U.S. Federal and Chairman of The Open Group Trusted Technology Forum. Welcome back, Andras.

Andras Szakal: Thank you for having me.

Gardner: And Bob Dix joins us. He is Vice President of Global Government Affairs and Public Policy for Juniper Networks and is a member of The Open Group Trusted Technology Forum. Welcome, Bob.

Bob Dix: Thank you for the invitation. Glad to be here.

Gardner: Lastly, we are joined by Dan Reddy, Supply Chain Assurance Specialist, college instructor and Lead of The Open Group Trusted Technology Forum Global Outreach and Standards Harmonization Work Group. Thanks for being with us, Dan.

Dan Reddy: Glad to be here, Dana.

Gardner: Sally, let’s start with you. Why don’t we just get a quick update on The Open Group Trusted Technology Forum (OTTF) and the supply-chain accreditation process generally? What has been going on?

O-TTPS standard

Long: For those of you who might not have heard of the O-TTPS, the standard is called The Open Trusted Technology Provider™ Standard. The effort started with an initiative in 2009, a roundtable discussion with the U.S. government and several ICT vendors on how to identify trustworthy commercial off-the-shelf (COTS) information and communication technology (ICT), driven by the fact that governments were moving away from high-assurance customized solutions and relying more and more on COTS ICT.

That ad-hoc group then formed the OTTF under The Open Group and proceeded to deliver a standard and an accreditation program.

The standard provides a set of best practices to be used throughout the COTS ICT product life cycle, both during in-house development and with outsourced development and manufacturing, including the best practices to use for security in the supply chain, encompassing all phases from design to disposal.

Just to bring you up to speed on some of the milestones we’ve had: we released version 1.0 of the standard in 2013, launched our accreditation program to help assure conformance to the standard in February 2014, and then in July released version 1.1 of the standard. We have now submitted that version to ISO for approval as a publicly available specification (PAS), which is a fast-track route to ISO adoption.

The PAS is a process for adopting standards developed in other standards development organizations (SDOs), and the O-TTPS has passed the draft ISO ballot. Now, it’s coming up for final ballot.

That should bring folks up to speed, Dana, and let them know where we are today.

Gardner: Is there anything in particular at The Open Group Conference in Baltimore, coming up in July, that pertains to these activities? Is this something that’s going to be more than just discussed? Is there something of a milestone nature here too?

Long: Monday, July 20, is the Cyber Security Day of the Baltimore Conference. We’re going to be meeting in the plenary with many of the U.S. government officials from NIST, GSA, and the Department of Homeland Security. So there is going to be a big plenary discussion on cyber security and supply chain.

We’ll also be meeting separately as a member forum, but the whole open track on Monday will be devoted to cyber security and supply chain security.

The one milestone that might coincide is that we’re publishing the Chinese translation of version 1.1 of the standard, and we might be announcing that then. I think that’s about it, Dana.

OTTF background

Gardner: Andras, for the benefit of our listeners and readers who might be new to this concept, perhaps you could fill us in on the background on the types of problems that OTTF and the initiatives and standards are designed to solve. What’s the problem that we need to address here?

Szakal: That’s a great question. We realized over the last 5 to 10 years that traditional supply-chain management and supply-chain integrity practices, where we ensured the integrity of the delivery of a product to the end customer, ensured that it wasn’t tampered with, and effectively managed our suppliers so they provided us with quality components, had really expanded as a result of the adoption and pervasive growth of technology in all aspects of manufacturing, but especially as IT has expanded into the Internet of Things, critical infrastructure and mobile technologies, and now obviously cloud and big data.

And as we manufacture those IT products we have to recognize that now we’re in a global environment, and manufacturing and sourcing of components occurs worldwide. In some cases, some of these components are even open source or freely available. We’re concerned, obviously, about the lineage, but also the practices of how these products are manufactured from a secure engineering perspective, as well as the supply-chain integrity and supply-chain security practices.

What we’ve recognized here is that the traditional life cycle of supply-chain security and integrity has expanded, reaching all the way from the design aspects of the product, through sustainment and managing that product over a period of time, from cradle to grave, to disposal of the product, to ensure that those components, if they were hardware-based, don’t end up recycled in a way that poses a threat to our customers.

Gardner: So it’s as much a lifecycle as it is a procurement issue.

Szakal: Absolutely. When you talk about procurement, you’re talking about the lifecycle and about mitigating risks to both of those aspects, from sourcing and from manufacturing.

So from the customer’s perspective, they need to be considering how they actually apply techniques to ensure that they are sourcing from authorized channels, that they are also applying the same techniques that we use for secure engineering when they are doing the integration of their IT infrastructure.

But from a development perspective, it’s ensuring that we’re applying secure engineering techniques, that we have a well-defined baseline for our life cycle, and that we’re controlling our assets effectively. We understand who our partners are and we’re able to score them and ensure that we’re tracking their integrity and that we’re applying new techniques around secure engineering, like threat analysis and risk analysis to the supply chain.

We’re understanding the current risk landscape and applying techniques like vulnerability analysis and runtime protection techniques that would allow us to mitigate many of these risks as we build out our products and manufacture them.

It goes all the way through sustainment. You probably recognize, as most people now would, that a product is no longer a shrink-wrapped package that you get, install, and live with for a year or two before you update it. It’s constantly being updated. So ensuring that the integrity and delivery of each update is consistent with the principles we’re trying to espouse is also really important.

Collaborative effort

Gardner: And to that point, no product stands alone. It’s really the result of a collaborative effort, a very complex set of systems coming together. Not only are standards necessary, but cooperation among all the players in that ecosystem becomes necessary.

Dan Reddy, how have we done in terms of getting mutual assurance across a supply chain, with all the participants willing to take part? It seems to me that, if there is a weak link, everyone would benefit by shoring that up. So how do we go beyond the standards? How are we getting cooperation and getting all the parties interested in contributing and being part of this?

Reddy: First of all, it’s an evolutionary process, and we’re still in the early days of fully communicating what the best practices are, what the standards are, and getting people to understand how that relates to their place in the supply chain.

Certainly, the supplier community would benefit by following some common practices so they don’t wind up answering customized survey questions from all of their customers.

That’s what’s happening today. It’s pretty much a one-off situation, where each customer says, “I need to protect my supply chain. Let me go find out what all of my suppliers are doing.” The real benefit here is to have the common language of the requirements in our standard and a way to measure it.

So there should be an incentive for the suppliers to take a look at that and say, “I’m tired of answering these individual survey questions. Maybe if I just document my best practices, I can avoid some of the effort that goes along with that individual approach.”

Everyone needs to understand that value proposition across the supply chain. Part of what we’re trying to do with the Baltimore conference is to talk to some thought leaders and continue to get the word out about the value proposition here.

Gardner: Bob Dix, the government in the U.S., and of course governments across the globe, are major purchasers of technology and also have a great stake in security and low risk. What’s been driving some of the government activities? Of course, they’re also interested in using off-the-shelf technology and cutting costs. So what’s the role that governments can play in driving some of these activities around the OTTF?

Risk management

Dix: This issue of supply chain assurance and cyber security is all about risk management, and it’s a shared responsibility. For too long I think that the government has had a tendency to want to point a finger at the private sector as not sufficiently attending to this matter.

The fact is, Dana, that many in the private sector make substantial investments in their product integrity program, as Andras was talking about, from product conception, to delivery, to disposal. What’s really important is that when that investment is made and when companies apply the standard the OTTF has put forward, it’s incumbent upon the government to do their part in purchasing from authorized and trusted sources.

In today’s world, we still have a culture that’s pervasive across the government acquisition community, where decision-making on procurements is often driven by cost and schedule, and product authenticity, assurance, and security are not necessarily part of that equation. It’s driven in many cases by budgets and other considerations, but nonetheless, we must change that culture to include authenticity and assurance as part of the decision-making process.

The result of focusing on cost and schedule is often that those acquisitions are made from untrusted and unauthorized sources, which raises the risk of acquiring counterfeit, tainted, or even malicious equipment.

Part of the work of the OTTF is to present to all stakeholders, in industry and government alike, that there is a process that can be uniform, as has been stated by Sally and Dan as well, that can be applied in an environment to raise the bar of authenticity, security, and assurance to improve upon that risk management approach.

Gardner: Sally, we’ve talked about where you stand in terms of progress on these standards and activities. We’ve heard about the challenges and the need for improvement.

Before we talk about this really interesting concept of insurance that would come to bear on perhaps encouraging standardization and giving people more ways to reduce their risk and adhere to best practices, what do you expect to see in a few years? If things go well and if this is adopted widely and embraced in true good practices, what’s the result? What do we expect to see as an improvement?

What I’m trying to get at here is, if there’s a golden nugget to shoot for, a golden ring to grab, what is it that we can accomplish by doing this well?

Powerful impact

Long: The most important and significant aspect of the accreditation program is its holistic nature and how it could have a very powerful impact if it’s widely adopted.

The idea of an accreditation program is that a provider gets accredited for conforming to the best practices. A provider that gets accredited could be an integrator, an OEM, a component supplier of the hardware and software that goes into the OEM’s products, or a value-add reseller or distributor.

Every important constituent in that supply chain could be accredited. So from a business perspective it’s important not only for governments and commercial customers to look at the Accreditation Registry and see whether the integrators or OEMs they want to work with have been accredited, but also for OEMs to be able to look at that registry and say, “These component suppliers are accredited, so I’ll work with them as business partners.” It’s the same for value-add resellers and distributors.

It builds in these real business-market incentives to make the concept work, and in the end, of course, the ultimate goal of having a more secure supply chain and more products with integrity will be achieved.

To me, that is one of the most important aspects that we can reach for, especially if we reach out internationally. What we’re starting to see internationally is that localized requirements are cropping up in different countries. What that’s going to mean is that vendors need to meet those different requirements, increasing their costs, and sometimes even creating trade barriers.

Back to what Dan and Bob were saying, we need to look at this global standard and accreditation program that already exists. It’s not in development; we’ve been working on it for five years with consensus from many, many of the major players in industry and government. So urging global adoption of what already exists and what could work holistically is really an important objective for our next couple of years.

Gardner: It certainly sounds like a win-win-win if everyone can participate, have visibility, and get designated as having followed through on those principles. But as you know and as you mentioned, it’s a marketplace, and economics often drives business behavior. So in addition to a standards process and the definitions being available, what is it about this notion of insurance that might be a parallel market force to encourage better practices and ultimately move more companies in this direction?

Let’s start with Dan. Explain to me how cyber insurance, as it pertains to the supply chain, would work.

Early stages

Reddy: It’s an interesting question. The cyber insurance industry is still in its early stages, even though it goes back to the ’70s, when crime insurance started applying to outsiders gaining physical access to computer systems. You didn’t really see the advent of hacker insurance policies until the late ’90s. Then, starting in 2000, some of the first forms of cyber insurance covering first and third parties started to appear.

What we’re seeing today is primarily related to the breaches that we hear about in the papers every day, where some organization has been compromised and sensitive information, like credit card information, is exposed for thousands of customers. The remediation is geared toward the companies that have to pay the claims and sign people up for identity protection. It’s pretty cut and dried. That’s the wave that the insurance industry is riding right now.

What I see is that as attacks get to be more sophisticated and potentially include attacks on the supply chain, it’s going to represent a whole new area for cyber insurance. Having consistent ways to address supplier-related risk, as well as the other infrastructure related risks that go beyond simple data breach, is going to be where the marketplace has to make an adjustment. Standardization is critical there.

Gardner: Andras, how does this work in conjunction with OTTF? Would insurance companies begin their risk assessment by making sure that participants in the supply chain are already adhering to your standards and seeking accreditation? Then maybe they would set premiums that reflect the diligence that companies extend into their supply chains. Maybe you could explain, not just the insurance, but how it would work in conjunction with OTTF, to their mutual benefit.

Szakal: You made a really great point earlier about the economic element that would drive compliance. For us in IBM, the economic element is the ability to prove that we’re providing the right assurance that is being specified in the requests for proposals (RFPs), not only in the federal sector, but outside the federal sector in critical infrastructure and finance. We continue to win those opportunities, and that’s driven our compliance, as well as the government policy aspect worldwide.

But from an insurance point of view, insurance comes in two forms. I buy policy insurance in a case where there are risks that are out of my control, and I apply protective measures that are under my control. So in the case of the supply chain, the OTTF is a set of practices that help you gain control and lower the risk of threat in the manufacturing process.

The question is, do you buy a policy, and what’s the balance between a cyber threat that is in your control and those aspects of supply-chain security that are out of your control? This is with the understanding that there isn’t an infinite amount of resources or revenue that you can allocate to both of these aspects.

There’s going to have to be a balance, and it really is going to be case by case for customers and manufacturers as to whether they cover the potential loss of intellectual property (IP) with insurance versus applying controls. Resources are better applied where you actually have control, with policies protecting you against the things that are out of your control.

For example, you might buy a policy for the case where you provide code containing high-value IP to a third party to manufacture a component. You have to share that information with the third-party supplier to actually manufacture the component as part of the overarching product, but with the realization that if that third party is somehow hacked or intruded on and the IP is stolen, you have lost a significant amount of value. That’s an area where insurance would be applicable.

What’s working

Gardner: Bob Dix, if insurance comes to bear in conjunction with standards like what the OTTF is developing in supply chain assurance, it seems to me that the insurance providers themselves would be in a position of gathering information for their actuarial decisions and could be a clearing house for what’s working and what isn’t working.

It would be in their best interest to then share that back into the marketplace in order to reduce the risk. That’s a market-driven, data-driven approach that could benefit everyone. Do you see the advent of insurance as a benefit or accelerant to improvement here?

Dix: It’s a tool. This is a conversation that’s been going on in the community for quite some time. The lack of actuarial data for catastrophic losses produced by cyber events is impacting rate setting and premium setting by insurance companies, and that has continued to be a challenge.

But from an incentive standpoint, it’s just like in your home. If you have an alarm system, a fence, or other kinds of protective measures, your homeowner’s or liability insurance may give you a reduction in premium for those actions that you have taken.

As an incentive, the opportunity to have an insurance policy to either transfer or buy down risk can be driven by the type of controls that you have in your environment. The standard that the OTTF has put forward provides guidance about how best to accomplish that. So, there is an opportunity to leverage, as an incentive, the reduction in premiums for insurance to transfer or buy down risk.

Gardner: It’s interesting, Sally, that the insurance industry could benefit from OTTF, and by having more insurance available in the marketplace, it could encourage more participation and make the standard even more applicable and valuable. So it’s interesting to see over time how that plays out.

Any thoughts or comments on the relationship between what you are doing at OTTF and The Open Group and what the private insurance industry is moving toward?

Long: I agree with what everyone has said. It’s an up-and-coming field, and there is a lot more focus on it; at every conference I go to, I hear about more research on cyber-security insurance. There is a place for the O-TTPS in terms of buying down risk, as Bob was mentioning.

The other thing that’s interesting is the NIST Cybersecurity Framework. That whole paradigm started out with the idea that there would be incentives for those that followed the framework, but the incentive piece became very hard to pull together, and it still is. To my knowledge, there are no incentives yet associated with it. But insurance was one of the ideas they talked about for incentivizing adopters of the CSF.

The other thing, which I think will come out of one of the presentations that Dan and Larry Clinton will be giving at our Baltimore conference, is that insurers are looking for simplicity. They don’t want to go into a client’s environment and have them prove that they are doing all of the things required of them or fill out a long checklist.

That’s why, in terms of simplicity, asking for O-TTPS-accredited providers, or lowering rates based on that accreditation, would be a very simple approach, but again, it’s not here yet. As Bob said, it’s been talked about a lot for a long time, but I think it is coming to the fore.

Market of interest

Gardner: Dan Reddy, back to you. When there is generally a large addressable market of interest in a product or service, there often rises a commercial means to satisfy that. How can enterprises, the people who are consuming these products, encourage acceptance of these standards, perhaps push for a stronger insurance capability in the marketplace, or also get involved with some of these standards and practices that we have been talking about?

If you’re a publicly traded company, you would want to reduce your exposure and be able to claim accreditation and insurance as well. Let’s look at this from the perspective of the enterprise. What should and could they be doing to improve on this?

Reddy: I want to link back to what Sally said about the NIST Cyber Security Framework. What’s been very useful in publishing the Framework is that it gives enterprises a way to talk about their overall operational risk in a consistent fashion.

I was at one of the workshops sponsored by NIST where enterprises that had adopted it talked about what they were doing internally in their own enterprises in changing their practices, improving their security, and using the language of the framework to address that.

Yet, when they talked about one aspect of their risk, their supplier risk, they were trying to send the NIST Cybersecurity Framework risk questions to their suppliers, and those questions aren’t really sufficient. They’re interesting. You care about the enterprise of your supplier, but you really care about the products of your supplier.

So one of the things that the OTTF did is look at the requirements in our standard related to suppliers and link them specifically to the same operational areas that were included in the NIST Cybersecurity Framework.

This gives an enterprise looking at risk, and trying to do standard things, a way to use the language of the requirements in our standard and the accreditation program as a form of measurement to see how that aspect of supplier risk is being addressed.

But remember, cyber insurance is more than just the risk of suppliers. It’s the risk at the enterprise level. But the attacks are going to change over time, and we’ll go beyond the simple breaches. That’s where the added complexity will be needed.

Gardner: Andras, any suggestions for how enterprises, suppliers, vendors, systems integrators, and now, of course, the cloud services providers, should get involved? Where can they go for more information? What can they do to become part of the solution on this?

International forum

Szakal: Well, they can always become a member of the Trusted Technology Forum, which is an international forum.

Gardner: I thought you might say that.

Szakal: That’s an obvious one, right? But there are a couple of places where you can go to learn more about this challenge.

One is certainly our website. Download the framework, a compendium of best practices that we gathered through a lot of hard work, sharing in an open, penalty-free environment all of the practices that the major vendors employ to mitigate the risks of counterfeit and maliciously tainted products, as well as other supply-chain risks. I think understanding the standard is a good start.

Then, look at how your current practices measure up against the standard, using the accreditation criteria that we have established.

Other places would be NIST. I believe it’s Special Publication 800-161 that is the current pending standard for protecting supply-chain security. There are several really good reports that the Defense Science Board and other organizations have produced in the past within the federal government space. There are plenty of materials out there and a lot of discussion about the challenges.

But I think the only place where you really find solutions, or at least one of the only places that I have seen, is in the OTTF, where they are embedded in the standard as a set of practices that are very practical to implement.

Gardner: Sally, the same question to you. Where can people go to get involved? What should they perhaps do to get started?

Long: I’d reiterate what Andras said. I’d also point them toward the accreditation website. On that site you can see the policy, the standard, and the supporting documents. We publicize our assessment procedures, so you have a good idea of what the assessment process will entail.

The program is based on evidence of conformance as well as a warranty from the applicant. So the assessment procedures being public will allow any organizations thinking about getting accredited to know exactly what they need to do.

As always, we would appreciate any new members, because we’ll be evolving the standard and the accreditation program, and it is done by consensus. So if you want a say in that, whether our standard needs to be stronger, weaker, broader, etc., join the forum and help us evolve it.

Impact on business

Gardner: Dan Reddy, when we think about managing these issues, it often falls on the shoulders of IT and its security apparatus, the Chief Information Security Officer perhaps. But it seems that the impact on the business is growing. So should other people in the enterprise be thinking about this? I’m thinking about procurement or the governance, risk, and compliance folks. Who else, other than IT and its security apparatus, should be involved in mitigating the risks around IT supply-chain activity?

Reddy: You’re right that the old model of everything falls on IT is expanding, and now you see issues of enterprise risk and supply chain risk making it up to the boards of directors, who are asking tough questions. That’s one reason why boards look at cyber insurance as a way to mitigate some of the risk that they can’t control.

They’re asking tough questions all the way around, and I think acquisition people do need to understand what the right questions are to ask of technology providers.

To me, this comes back to scalability. This one-off approach of everyone asking questions of each of their vendors just isn’t going to make it. The advantage that we have here is that we have a consistent standard, built by consensus, freely available, and it’s measurable.

There are a lot of other good documents that talk about supply-chain risk and secure engineering, but they don’t offer a straightforward way to get a third-party assessment, and I think that’s going to be appealing over time.

Gardner: Bob Dix, last word to you. What do you see happening in the area of government affairs and public policy around these issues? What should we hope for or expect from different governments in creating an atmosphere that improves risk across supply chain?

Dix: A couple of things have to happen, Dana. First, we have to quit blaming victims when we have breaches and compromises and start looking at solutions. Governments in the United States and in other countries around the world have a tendency to look at legislating and passing regulatory measures that impose requirements on industry without a full understanding of what industry is already doing.

In this particular example, the government has had a tendency to take an approach that excludes vendors from being able to participate in federal procurement activities based on a risk level that they determine.

The really great thing about the work of the OTTF and the standard being produced is that it allows a different way to look at the problem: look at those that are accredited as having met the standard and as being able to provide a higher level of assurance of authenticity and security around the products and services they deliver. I think that’s a much more productive approach.

Working together

And from a public-policy standpoint, this example of the great work being done by industry and government working together globally to deliver the standard gives governments a basis for thinking about it a little differently.

Instead of just focusing on who they want to exclude, let’s look at who actually is delivering the value and meeting the requirements to be a trusted provider. That’s a different approach, one that we are very proud of in terms of the work of The Open Group, and we will continue to pursue it going forward.

Gardner: Excellent. I’m afraid we will have to leave it there. We’ve been exploring ways to address supply chain risk in the information technology sector marketplace, and we’ve seen how The Open Group Trusted Technology Forum standards and accreditation activities are enhancing the security of global supply chains and improving the integrity of openly available IT products and components. We’ve also learned how the age-old practice of insurance is coming to bear on the problem of IT supply-chain risk.

This special BriefingsDirect Thought Leadership Panel Discussion comes to you in conjunction with The Open Group’s upcoming conference on July 20, 2015 in Baltimore. It’s not too late to register on The Open Group’s website or to follow the proceedings online and via Twitter and other social media during the week of the presentation.

So a big thank you to our guests. We’ve been joined today by Sally Long, Director of The Open Group Trusted Technology Forum. Thanks so much, Sally.

Long: Thank you, Dana.

Gardner: And a big thank you to Andras Szakal, Vice President and Chief Technology Officer for IBM U.S. Federal and Chairman of The Open Group Trusted Technology Forum. Thank you, Andras.

Szakal: Thank you very much for having us and come join the TTF. We can use all the help we can get.

Gardner: Great. A big thank you too to Bob Dix, Vice President of Global Government Affairs & Public Policy for Juniper Networks and a member of The Open Group Trusted Technology Forum. Thanks, Bob.

Dix: Appreciate the invitation. I look forward to joining you again.

Gardner: And lastly, thank you to Dan Reddy, Supply Chain Assurance Specialist, college instructor and Lead of The Open Group Trusted Technology Forum Global Outreach and Standards Harmonization Work Group. I appreciate your input, Dan.

Reddy: Glad to be here.

Gardner: And lastly, a big thank you to our audience for joining us at the special Open Group sponsored Thought Leadership Panel Discussion.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for these Open Group discussions associated with the Baltimore conference. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Sponsor: The Open Group

Join the conversation @theopengroup #ogchat #ogBWI

Transcript of a Briefings Direct discussion on ways to address supply chain risk in the information technology sector marketplace. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2015. All rights reserved.


Filed under Cybersecurity, OTTF, Supply chain risk, The Open Group Baltimore 2015

Global Open Trusted Technology Provider™ Standard

By Sally Long, Forum Director, Open Trusted Technology Forum, The Open Group

A First Line of Defense in Protecting Critical Infrastructure – A Technical Solution that can Help Address a Geo-political Issue

The challenges associated with Cybersecurity and critical infrastructure, which include the security of global supply chains, are enormous. After working almost exclusively on supply chain security issues for the past 5 years, I am still amazed at the number of perspectives that need to be brought to bear on this issue.

Recently I had the opportunity to participate in a virtual panel sponsored by InfoSecurity Magazine entitled: “Protecting Critical Infrastructure: Developing a Framework to Mitigate Cybersecurity Risks and Build Resilience”. The session was recorded and the link can be found at the end of this blog.

The panelists were:

  • Jonathan Pollet, Founder, Executive Director Red Tiger Security
  • Ernie Hayden, Executive Consultant, Securicon LLC
  • Sean Paul McGurk, Global Managing Principal, Critical Infrastructure Protection Cybersecurity, Verizon
  • Sally Long, Director, The Open Group Trusted Technology Forum

One perspective I brought to the discussion was the importance of product integrity and the security of ICT global supply chains as a first line of defense to mitigate vulnerabilities that can lead to maliciously tainted and counterfeit products. This first line of defense must not be ignored when discussing how to prevent damage to critical infrastructure and the horrific consequences that can ensue.

The other perspective I highlighted was that securing global supply chains is both a technical and a global geo-political issue. And that addressing the technical perspective in a vendor-neutral and country-neutral manner can have a positive effect on diminishing the geo-political issues as well.

The technical perspective is driven by the simple fact that most everything has a global supply chain – virtually nothing is built by just one company or in just one country. In order for products to have integrity and their supply chains to be secure, all constituents in the chain must follow best practices for security – both in-house and in their supply chains.

The related but separate geo-political perspective, driven by a desire to protect against malicious attackers and a lack of trust of/from nation-states, is pushing many countries to consider approaches that are disconcerting, to put it mildly. This is not just a US issue; every country is concerned about securing its critical infrastructure and the underlying supply chains. Unfortunately, we are beginning to see attempts to address these global concerns through local solutions (i.e., country-specific and disparate requirements that raise the toll on suppliers and could set up barriers to trade).

The point is that an international technical solution (e.g. a standard and accreditation program for all constituents in global supply chains), which all countries can adopt, helps address the geo-political issues by having a common standard and common conformance requirements, raising all boats on the river toward becoming trusted suppliers.

To illustrate the point, I provided some insight into a technical solution from The Open Group Trusted Technology Forum. The Open Group announced the release of the Open Trusted Technology Provider™ Standard (O-TTPS) – Mitigating Maliciously Tainted and Counterfeit Products, a standard of best practices that addresses product integrity and supply-chain security throughout a product’s life cycle (from design through disposal). In February 2014, The Open Group announced the O-TTPS Accreditation Program, which enables a technology provider (e.g., an integrator, OEM, hardware or software component supplier, or reseller) that conforms to the standard to be accredited – positioning them on the public accreditation registry so they can be identified as an Open Trusted Technology Provider™.

Establishing a global standard and accreditation program like the O-TTPS – a program that helps mitigate the risk of maliciously tainted and counterfeit products being integrated into critical infrastructure, and one that is already available to any technology provider in any country, whether based in the US, China, Germany, India, Brazil, or anywhere else in the world – is most certainly a step in the right direction.

For a varied set of perspectives and opinions from critical infrastructure and supply chain subject matter experts, you can view the recording at the following link. Please note that you may need to log in to the InfoSecurity website for access:

To learn more about the Open Trusted Technology Provider Standard and Accreditation Program, please visit the OTTF site:

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer-supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of the OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.

Contact:; @sallyannlong


Filed under COTS, Cybersecurity, O-TTF, O-TTPS, OTTF, Security, Standards, supply chain, Supply chain risk, Uncategorized

The Open Group Panel: Internet of Things – Opportunities and Obstacles

Below is the transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data.

Listen to the podcast.

Dana Gardner: Hello, and welcome to a special BriefingsDirect thought leadership interview series coming to you in conjunction with the recent The Open Group Boston 2014 conference, held on July 21 in Boston.

I’m Dana Gardner, principal analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow.

We’re going to now specifically delve into the Internet of Things with a panel of experts. The conference has examined how Open Platform 3.0™ leverages the combined impacts of cloud, big data, mobile, and social. But to each of these now we can add a new cresting wave of complexity and scale as we consider the rapid explosion of new devices, sensors, and myriad endpoints that will be connected using internet protocols, standards and architectural frameworks.

This means more data, more cloud connectivity and management, and an additional tier of “things” that are going to be part of the mobile edge — and extending that mobile edge ever deeper into even our own bodies.

When we think about inputs to these social networks, that is going to increase as well. Not only will people be tweeting, your devices could very well tweet, too, using social networks to communicate. Perhaps your toaster will soon be sending you a tweet about your English muffins being ready each morning.

The Internet of Things is more than the “things”; it means a higher order of software platforms. For example, if we are going to operate data centers with new dexterity thanks to software-defined networking (SDN) and storage (SDS), indeed the entire data center being software-defined (SDDC), then why not a software-defined automobile, factory floor, or hospital operating room, or even a software-defined city block or neighborhood?

And so how does this all actually work? Does it easily spin out of control? Or does it remain under proper management and governance? Do we have unknown unknowns about what to expect with this new level of complexity, scale, and volume of input devices?

Will architectures arise that support the numbers involved, provide interoperability, and provide governance for the Internet of Things, rather than just letting each type of device do its own thing?

To help answer some of these questions, The Open Group assembled a distinguished panel to explore the practical implications and limits of the Internet of Things. So please join me in welcoming Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC, and a primary representative to the Industrial Internet Consortium; Penelope Gordon, Emerging Technology Strategist at 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM, and Dave Lounsbury, Chief Technical Officer at The Open Group.

Jean-Francois, we have heard about this notion of “cities as platforms,” and I think the public sector might offer us some opportunity to look at what is going to happen with the Internet of Things, and then extrapolate from that to understand what might happen in the private sector.

Hypothetically, the public sector has a lot to gain. It doesn’t have to go through the same confines of a commercial market development, profit motive, and that sort of thing. Tell us a little bit about what the opportunity is in the public sector for smart cities.

Jean-Francois Barsoum: It’s immense. The first thing I want to do is link to something that Marshall Van Alstyne (Professor at Boston University and Researcher at MIT) talked about, because I was thinking about his way of approaching platforms and thinking about how cities represent an example of that.

You don’t have customers; you have citizens. Cities are starting to see themselves as platforms, as ways to communicate with their customers, their citizens, to get information from them and to communicate back to them. But the complexity with cities is that, as good a platform as they could be, they’re relatively rigid. They’re legislated into existence, and what they’re responsible for is written into law. It’s not really a market.

Chris Harding (Forum Director of The Open Group Open Platform 3.0) earlier mentioned, for example, water and traffic management. Cities could benefit greatly by managing traffic a lot better.

Part of the issue is that you might have a state or provincial government that looks after highways. You might have the central part of the city that looks after arterial networks. You might have a borough that would look after residential streets, and these different platforms end up not talking to each other.

They gather their own data. They put in their own widgets to collect information that concerns them, but do not necessarily share with their neighbor. One of the conditions that Marshall said would favor the emergence of a platform had to do with how much overlap there would be in your constituents and your customers. In this case, there’s perfect overlap. It’s the same citizen, but they have to carry an Android and an iPhone, despite the fact it is not the best way of dealing with the situation.

The complexities are proportional to the amount of benefit you could get if you could solve them.

Gardner: So more interoperability issues?

Barsoum: Yes.

More hurdles

Gardner: More hurdles, and when you say commensurate, you’re saying that the opportunity is huge, but the hurdles are huge and we’re not quite sure how this is going to unfold.

Barsoum: That’s right.

Gardner: Let’s go to an area where the opportunity outstrips the challenge: manufacturing. Said, what is the opportunity for the software-defined factory floor to recognize huge efficiencies and apply algorithmic benefits to how management occurs across the domains of supply chain, distribution, and logistics? It seems to me that this is a no-brainer. It’s such an opportunity that the solution must be found.

Said Tabet: When it comes to manufacturing, the opportunities are probably much bigger. It’s where we can see a lot of progress that has already been made, and work is still going on. There are two ways to look at it.

One is the internal side of it, where you have improvements of business processes. For example, similar to what Jean-Francois said, in a lot of the larger companies that have factories all around the world, you’ll see such improvements at the individual factory level. You still have those silos at that level.

Now with this new technology, with this connectedness, those improvements are going to be made across factories, and there’s a learning aspect to it in terms of trying to manage that data. In fact, they do a better job. We still have to deal with interoperability, of course, and additional issues that could be jurisdictional, etc.

However, there is that learning that allows them to improve their processes across factories. Maintenance is one of them, as well as creating new products, and connecting better with their customers. We can see a lot of examples in the marketplace. I won’t mention names, but there are lots of them out there with the large manufacturers.

Gardner: We’ve had just-in-time manufacturing and lean processes for quite some time, trying to compress the supply chain and distribution networks, but these haven’t necessarily been done through public networks, the internet, or standardized approaches.

But if we’re to benefit, we’re going to need to be able to be platform companies, not just product companies. How do you go from being a proprietary set of manufacturing protocols and approaches to this wider, standardized interoperability architecture?

Tabet: That’s a very good question, because now we’re talking about that connection to the customer. With the airline and the jet engine manufacturer, for example, when the plane lands and there has been some monitoring of the activity during the whole flight, at that moment, they’ll get that data made available. There could be improvements and maybe solutions available as soon as the plane lands.


That requires interoperability. It requires Platform 3.0 for example. If you don’t have open platforms, then you’ll deal with the same hurdles in terms of proprietary technologies and integration in a silo-based manner.

Gardner: Penelope, you’ve been writing about the obstacles to decision-making that might become apparent as big data becomes more prolific and people try to capture all the data about all the processes and analyze it. That’s a little bit of a departure from the way we’ve made decisions in organizations, public and private, in the past.

Of course, one of the bigger tenets of Internet of Things is all this great data that will be available to us from so many different points. Is there a conundrum of some sort? Is there an unknown obstacle for how we, as organizations and individuals, can deal with that data? Is this going to be chaos, or is this going to be all the promises many organizations have led us to believe around big data in the Internet of Things?

Penelope Gordon: It’s something that has just been accelerated. This is not a new problem in terms of the decision-making styles not matching the inputs that are being provided into the decision-making process.

Former US President Bill Clinton was known for delaying making decisions. He’s a head-type decision-maker, and so he would always want more data and more data. That just gets into a never-ending loop, because as people collect data for him, there is always more data that you can collect, particularly on the quantitative side. Whereas, if it is distilled down and presented very succinctly and then balanced with the qualitative, that allows intuition to come to the fore, and you can make optimal decisions in that fashion.

Conversely, if you have someone who is a heart-type or gut-type decision-maker and you present them with a lot of data, their first response is to ignore the data. It’s just too much for them to take in. Then you end up going entirely with whatever you feel is correct, whatever your instinct says is the correct decision. If you’re talking about strategic decisions, where you’re making a decision that’s going to influence your direction five years down the road, that could be a very wrong decision to make, a very expensive decision, and as you said, it could be chaos.

It just brings to mind Dr. Seuss’s The Cat in the Hat, with Thing One and Thing Two. So, as we talk about the Internet of Things, we need to keep in mind that we need some sort of structure that we are tying this back to, and an understanding of what we are trying to do with these things.

Gardner: Openness is important, and governance is essential. Then, we can start moving toward higher-order business platform benefits. But, so far, our panel has been a little bit cynical. We’ve heard that the opportunity and the challenges are commensurate in the public sector and that in manufacturing we’re moving into a whole new area of interoperability, when we think about reaching out to customers and having a boundary that is managed between internal processes and external communications.

And we’ve heard that an overload of data could become a very serious problem and that we might not get benefits from big data through the Internet of Things, but perhaps even stumble and have less quality of decisions.

So Dave Lounsbury of The Open Group, will the same level of standardization work? Do we need a new type of standards approach, a different type of framework, or is this a natural path, a continuation of what we have done in the past?

Different level

Dave Lounsbury: We need to look at the problem at a different level than how we institutionally think about an interoperability problem. The Internet of Things is riding two very powerful waves. One is Moore’s Law: these sensors, actuators, and networks get smaller and smaller, and now we can put Ethernet in a light switch, a tag, or something like that.

The other is Metcalfe’s Law, which says that the value of all this connectivity goes up with the square of the number of connected points, and that applies both to the connection of the things and, more importantly, to the connection of the data.
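To make that scaling concrete, here is a minimal illustrative sketch; the quadratic value model and the device counts are assumptions chosen only to show the shape of Metcalfe’s Law, not figures from this discussion:

```python
# Illustrative only: Metcalfe's Law models the value of a network as
# proportional to the square of the number of connected points
# (devices, sensors, or data sources).
def metcalfe_value(connected_points: int, value_per_link: float = 1.0) -> float:
    """Return a notional network value that grows with n^2."""
    return value_per_link * connected_points ** 2

# Hypothetical device counts, chosen only to show the quadratic growth.
for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} connected points -> notional value {metcalfe_value(n):,.0f}")
```

Doubling the number of connected points roughly quadruples the notional value, which is one way to see why the volume of data, and the platforms needed to manage it, grow so quickly.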

The trouble is, as we have said, that there’s so much data here. The question is how you manage it and how you keep control over it so that you actually get business value from it. That’s going to require this new concept of a platform, not only to connect the data, but to aggregate it, correlate it as you said, and present it in ways that people can make decisions however they want.

Also, because of the raw volume, we have to start thinking about machine agency. We have to think about the system actually making the routine decisions or giving advice to the humans who are actually doing it. Those are important parts of the solution beyond just a simple “How do we connect all the stuff together?”

Gardner: We might need a higher order of intelligence, now that we have reached this border of what we can do with our conventional approaches to data, information, and process.

Thinking about where this works best first in order to then understand where it might end up later, I was intrigued again this morning by Professor Van Alstyne. He mentioned that in healthcare, we should expect major battles, that there is a turf element to this, that the organization, entity or even commercial corporation that controls and manages certain types of information and access to that information might have some very serious platform benefits.

The openness element is now something to look at, and I’ll come back to the public sector. Is there a degree of openness that we could legislate or regulate, requiring enough control to prevent the next generation of lock-in, which might not be to a platform but to access to data, information, and endpoints? Where in the public sector might we look for a leadership position to establish needed openness, and not just interoperability?

Barsoum: I’m not even sure where to start answering that question. To take healthcare as an example, I certainly didn’t write the bible on healthcare IT systems and if someone did write that, I think they really need to publish it quickly.

We have a single-payer system in Canada, and you would think that would be relatively easy to manage. There is one entity that manages paying the doctors, and everybody gets covered the same way. Therefore, the data should be easily shared among all the players and it should be easy for you to go from your doctor, to your oncologist, to whomever, and maybe to your pharmacy, so that everybody has access to this same information.

We don’t have that and we’re nowhere near having that. If I look to other areas in the public sector, areas where we’re beginning to solve the problem are ones where we face a crisis, and so we need to address that crisis rapidly.

Possibility of improvement

In the transportation infrastructure, we’re getting to that point where the infrastructure we have just doesn’t meet the needs. There’s a constraint in terms of money, and we can’t put much more money into the structure. Then, there are new technologies that are coming in. Chris had talked about driverless cars earlier. They’re essentially throwing a wrench into the works or may be offering the possibility of improvement.

On any given piece of infrastructure, you could fit twice as many driverless cars as cars with human drivers in them. Given that set of circumstances, the governments are going to find they have no choice but to share data in order to be able to manage those. Are there cases where we could go ahead of a crisis in order to manage it? I certainly hope so.

Gardner: How about allowing some of the natural forces of marketplaces, behavior, groups, maybe even chaos theory, where if sufficient openness is maintained there will be some kind of a pattern that will emerge? We need to let this go through its paces, but if we have artificial barriers, that might be thwarted or power could go to places that we would regret later.

Barsoum: I agree. People often focus on structure: the governance doesn’t work, so we should find some way to change the governance of transportation. London has done a very good job of that. They’ve created something called Transport for London that manages everything related to transportation. It doesn’t matter if it’s taxis, bicycles, pedestrians, boats, cargo trains, or whatever; they manage it.

You could do that, but it requires a lot of political effort. The other way to go about doing it is saying, “I’m not going to mess with the structures. I’m just going to require you to open and share all your data.” So, you’re creating a new environment where the governance, the structures, don’t really matter so much anymore. Everybody shares the same data.

Gardner: Said, to the private sector example of manufacturing, you still want to have a global fabric of manufacturing capabilities. This is requiring many partners to work in concert, but with a vast new amount of data and new potential for efficiency.

How do you expect that openness will emerge in the manufacturing sector? How will interoperability play when you don’t have to wait for legislation, but you do need to have cooperation and openness nonetheless?

Tabet: It comes back to the question you asked Dave about standards. I’ll just give you an example. In the automotive industry, there have been some activities in Europe around specific standards for communication.

The Europeans came to the US and started to have discussions, and the Japanese have interest, as well as the Chinese. That shows that, because there is a common interest in creating these new models from a business standpoint, these challenges have to be dealt with together.

Managing complexity

When we talk about the amounts of data, what we now call big data, and what we are going to see in about five years or so, you can’t even imagine it. How do we manage that complexity, which is multidimensional? We talked about this sort of platform and then, further, the capability and the data that will be there. From that point of view, openness is the only way to go.

There’s no way that we can stay away from it and still be able to work in silos in that new environment. There are lots of things that we take for granted today. I invite some of you to go back and read articles from 10 years ago that try to predict the future in technology in the 21st century. Look at your smart phones. Adoption is there, because the business models are there, and we can see that progress moving forward.

Collaboration is a must, because the problem is multidimensional. It’s not just manufacturing, like jet engines, car manufacturers, or agriculture, where you have very specific areas. They really have to work with their customers and the customers of their customers.


Gardner: Dave, I have a question for both you and Penelope. I’ve seen some instances where there has been a cooperative endeavor for accessing data, but then making it available as a service, whether it’s an API, a data set, access to a data library, or even a set of analytics applications. The Ocean Observatories Initiative is one example: it has created a sensor network across the oceans and makes the resulting data available.

Do you think we should expect to see an intermediary organizational level that gets between the sensors and the consumers, or even the controllers of the processes? Is there a model inherent in that that we might look to — something like that cooperative data structure that in some ways creates structure and governance, but also allows for freedom? It’s the sort of entity that we don’t have yet in many organizations or many ecosystems and that needs to evolve.

Lounsbury: We’re already seeing that in the marketplace. If you look at the commercial and social Internet of Things area, we’re starting to see intermediaries or brokers cropping up that will connect the silo of my Android ecosystem to the ecosystem of package tracking, or something like that. There are dozens and dozens of these cropping up.

In fact, you now see APIs even into a silo of what you might consider a proprietary system, and what people are doing is building a layer on top of those APIs that intermediates the data.

This is happening on a point-to-point basis now, but you can easily see the path forward. That’s going to expand to large amounts of data that people will share through a third party. I can see this being a whole new emerging market much as what Google did for search. You could see that happening for the Internet of Things.
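
A minimal, purely illustrative sketch of the kind of intermediary Dave describes, assuming two hypothetical silo APIs; the endpoint URLs, field names, and function names below are invented for illustration and are not drawn from the discussion or from any real service:

    import json
    from urllib.request import urlopen  # standard library only

    # Hypothetical silo endpoints; placeholders, not real services.
    SILO_ENDPOINTS = {
        "home_sensors": "https://example.com/home/api/temperature",
        "fleet_tracking": "https://example.com/fleet/api/vehicle-temps",
    }

    def normalize(source, payload):
        """Map each silo's own schema onto one shared shape."""
        if source == "home_sensors":
            return {"source": source, "celsius": payload["tempC"], "ts": payload["time"]}
        if source == "fleet_tracking":
            return {"source": source, "celsius": (payload["tempF"] - 32) / 1.8, "ts": payload["timestamp"]}
        raise ValueError("unknown source: " + source)

    def broker_feed():
        """Pull from every silo and expose one merged, schema-consistent feed."""
        merged = []
        for source, url in SILO_ENDPOINTS.items():
            with urlopen(url) as resp:  # each silo keeps its own API
                merged.append(normalize(source, json.load(resp)))  # the broker intermediates the data
        return merged

The point of the sketch is only that each silo keeps its own interface; the value the broker adds is a single, consistent view over all of them.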

Gardner: Penelope, do you have any thoughts about how that would work? Is there a mutually assured benefit that would allow people to want to participate and cooperate with that third entity? Should they have governance and rules about good practices, best practices for that intermediary organization? Any thoughts about how data can be managed in this sort of hierarchical model?

Nothing new

Gordon: First, I’ll contradict it a little bit. To me, a lot of this is nothing new, particularly coming from a marketing strategy perspective, with business intelligence (BI). Having various types of intermediaries, who are not only collecting the data, but then doing what we call data hygiene, synthesis, and even correlation of the data has been around for a long time.

It was interesting, when I looked at a recent listing of the big-data companies, that some notable companies were excluded from that list — companies like Nielsen. Nielsen has been collecting data for a long time. Harte-Hanks is another one that collects a tremendous amount of information and sells that to companies.

That leads into another part of it. We’re seeing an increasing amount of opportunity that involves taking public sources of data and then providing synthesis on it. What remains to be seen is how much of the output of that is going to be provided for “free”, as opposed to for a “fee”. We’re going to see a lot more companies figuring out creative ways of extracting more value out of data and then charging directly for that, rather than using it as an indirect way of generating traffic.

Gardner: We’ve seen examples of how this has been in place. Does it scale, and does the governance, or lack of governance, that might be in the market now sustain us through the transition into Platform 3.0 and the Internet of Things?

Gordon: That aspect follows on from “you get what you pay for.” If you’re using a free source of data, you don’t have any guarantee that it comes from authoritative sources. Often, what we’re getting now is something somebody put in a blog post, which then gets referenced elsewhere, but there is nothing to go back to. It’s a shaky supply chain for data.

You need to think about the data supply, and that is where the governance comes in. Having standards is going to become increasingly important, unless we really address a lot of the data illiteracy that we have. A lot of people do not understand how to analyze data.

One aspect of that is that a lot of people expect that we have to do full population surveys, as opposed to representative sampling, which gives much more accurate and much more cost-effective collection of data. That’s just one example, and we do need a lot more in governance and standards.
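
As a purely illustrative aside on that point, here is a short sketch of the cost difference between a full population survey and a representative sample; the figures are made up for illustration:

    import random

    random.seed(42)

    # A full "population survey": one reading per device in a large fleet.
    population = [random.gauss(21.0, 2.0) for _ in range(100_000)]

    # A representative sample: a small random subset of the same fleet.
    sample = random.sample(population, 500)

    population_mean = sum(population) / len(population)
    sample_mean = sum(sample) / len(sample)

    print("population mean: %.2f  (cost: 100,000 readings)" % population_mean)
    print("sample mean:     %.2f  (cost: 500 readings)" % sample_mean)

With a well-drawn sample, the two estimates land very close together at a small fraction of the collection cost, which is the point about sampling versus exhaustive collection.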

Gardner: What would you like to see changed most in order for the benefits and rewards of the Internet of Things to develop and overcome the drawbacks, the risks, the downside? What, in your opinion, would you like to see happen to make this a positive, rapid outcome? Let’s start with you Jean-Francois.

Barsoum: There are things that I have seen cities start to do now. There are a couple of examples: Philadelphia is one, and Barcelona does this too. Rather than do the typical request for proposal (RFP), where they say, “This is the kind of solution we’re looking for, and here are our parameters. Can you tell us how much it is going to cost to build?” they come to you with the problem and they say, “Here is the problem I want to fix. Here are my priorities, and you’re at liberty to decide how best to fix the problem, but tell us how much that would cost.”

If you do that and you combine it with access to the public data that is available — if public sector opens up its data — you end up with a very powerful combination that liberates a lot of creativity. You can create a lot of new business models. We need to see much more of that. That’s where I would start.

More education

Tabet: I agree with Jean-Francois on that. What I’d like to add is that I think we need to push the relation a little further. We need more education, to your point earlier, around the data and the capabilities.

We need these platforms that we can leverage a little bit further with the analytics, with machine learning, and with all of these capabilities that are out there. We have to also remember, when we talk about the Internet of Things, it is things talking to each other.

So it is not just human-to-machine communication. Machine-to-machine automation will go further than that, and we need more innovation and more work in this area, particularly more activity from the governments. We’ve seen some of that, but it is a little bit frail from that point of view right now.

Gardner: Dave Lounsbury, thoughts about what needs to happen in order to keep this on track?

Lounsbury: We’ve touched on a lot of them already. Thank you for mentioning the machine-to-machine part, because there are plenty of projections that show that it’s going to be the dominant form of Internet communication, probably within the next four years.

So we need to start thinking of that and moving beyond our traditional models of humans talking through interfaces to a set of services. We need to identify the building blocks of capability that you need to manage, not only the information flow and the skilled person who is going to produce it, but also how you manage the machine-to-machine interactions.

Gordon: I’d like to see not so much focus on data management, but focus on what the data is managing and helping us to do. Focusing on the machine-to-machine and the devices is great, but the focus should not be on the devices or on the machines; it should be on what they can accomplish by communicating, what you can accomplish with the devices, and then reverse engineer from that.

Gardner: Let’s go to some questions from the audience. The first one asks about the higher order of intelligence we mentioned earlier. It could be artificial intelligence, perhaps, but they ask whether that’s really the issue. Is the nature of the data substantially different, or are we just creating more of the same, so that it is a storage, plumbing, and processing problem? What, if anything, are we lacking in our current analytics capabilities that is holding us back from exploiting the Internet of Things?

Gordon: I’ve definitely seen that. That has a lot to do with not setting your decision objectives and your decision criteria ahead of time, so you end up collecting a whole bunch of data, and the important data gets lost in the mix. There is a term for it: “data smog.”

Most important

The solution is to figure out, before you go collecting data, what data is most important to you. If you can’t collect certain kinds of data that are important to you directly, then think about how to indirectly collect that data and how to get proxies. But don’t try to go and collect all the data for that. Narrow in on what is going to be most important and most representative of what you’re trying to accomplish.

Gardner: Does anyone want to add to this idea of understanding what current analytics capabilities are lacking, if we have to adopt and absorb the Internet of Things?

Barsoum: There is one element around projection into the future. We’ve been very good at analyzing historical information to understand what’s been happening in the past. We need to become better at projecting into the future, and obviously we’ve been doing that for some time already.

But so many variables are changing. Just to take the driverless car as an example. We’ve been collecting data from loop detectors, radar detectors, and even Bluetooth antennas to understand how traffic moves in the city. But we need to think harder about what that means and how we understand the city of tomorrow is going to work. That requires more thinking about the data, a little bit like what Penelope mentioned, how we interpret that, and how we push that out into the future.

Lounsbury: I have to agree with both. It’s not about statistics. We can use historical data. It helps with a lot of things, but one of the major issues we still deal with today is the question of semantics, the meaning of the data. This goes back to your point, Penelope, around the relevance and the context of that information – how you get what you need when you need it, so you can make the right decisions.

Gardner: Our last question from the audience goes back to Jean-Francois’s comments about the Canadian healthcare system. I imagine it applies to almost any healthcare system around the world. But it asks why interoperability is so difficult to achieve, when we have the power of the purse, that is the market. We also supposedly have the power of the legislation and regulation. You would think between one or the other or both that interoperability, because the stakes are so high, would happen. What’s holding it up?

Barsoum: There are a couple of reasons. One, in the particular case of healthcare, is privacy, but that is one that you could see going elsewhere. As soon as you talk about interoperability in the health sector, people start wondering where their data is going to go, how accessible it is going to be, and to whom.

You need to put a certain number of controls over top of that. What is happening in parallel is that you have people who own some data, who believe they have some power from owning that data, and that they will lose that power if they share it. That can come from doctors, hospitals, anywhere.

So there’s a certain amount of change management you have to get beyond. Everybody has to focus on the welfare of the patient. They have to understand that that has to be the priority, but you also have to understand the welfare of the different stakeholders in the system and make sure that you do not forget about them, because if you forget about them, they will find some way to slow you down.

Use of an ecosystem

Lounsbury: To me, that’s a perfect example of what Marshall Van Alstyne talked about this morning. It’s the change from focus on product to a focus on an ecosystem. Healthcare traditionally has been very focused on a doctor providing product to patient, or a caregiver providing a product to a patient. Now, we’re actually starting to see that the only way we’re able to do this is through use of an ecosystem.

That’s a hard transition. It’s a business-model transition. I will put in a plug here for The Open Group Healthcare vertical, which is looking at that from an architecture perspective. I see that our Forum Director Jason Lee is over here. So if you want to explore that more, please see him.

Gardner: I’m afraid we will have to leave it there. We’ve been discussing the practical implications of the Internet of Things and how it is now set to add a new dimension to Open Platform 3.0 and Boundaryless Information Flow.

We’ve heard how new thinking about interoperability will be needed to extract the value and orchestrate order out of the chaos, with such vast new scales of inputs and whole new categories of information.

So with that, a big thank you to our guests: Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC; Penelope Gordon, Emerging Technology Strategist at 1Plug Corp.; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM, and Dave Lounsbury, Chief Technology Officer at The Open Group.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow at The Open Group Conference, recently held in Boston. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript.

Transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2014. All rights reserved.


Comments Off on The Open Group Panel: Internet of Things – Opportunities and Obstacles

Filed under Boundaryless Information Flow™, Business Architecture, Cloud, Cloud/SOA, Data management, digital technologies, Enterprise Architecture, Future Technologies, Information security, Internet of Things, Interoperability, Open Platform 3.0, Service Oriented Architecture, Standards, Strategy, Supply chain risk, Uncategorized

How the Open Trusted Technology Provider Standard (O-TTPS) and Accreditation Will Help Lower Cyber Risk

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

Changing business dynamics and enabling technologies

In 2008, IBM introduced the concept of a “Smarter Planet.” The Smarter Planet initiative focused, in part, on the evolution of globalization against the backdrop of changing business dynamics and enabling technologies. A key concept was the need for infrastructure to be tightly integrated, interconnected, and intelligent, thereby facilitating collaboration between people, government and businesses in order to meet the world’s growing appetite for data and automation. Since then, many industries and businesses have adopted this approach, including the ICT (information and communications technology) industries that support the global technology manufacturing supply chain.

Intelligent and interconnected critical systems

This transformation has infused technology into virtually all aspects of our lives, and involves, for example, government systems, the electric grid and healthcare. Most of these technological solutions are made up of hundreds or even thousands of components that are sourced from the growing global technology supply chain.

In the global technology economy, no one technology vendor or integrator is able to always provide a single-source solution. It is no longer cost-competitive to design all of the electronic components, printed circuit boards, card assemblies, or other sub-assemblies in-house. Adapting to the changing marketplace and landscape by balancing response time and cost efficiency, in an expedient manner, drives a more widespread use of OEM (original equipment manufacturer) products.

As a result, most technology providers procure from a myriad of global component suppliers, who very often require similarly complex supply chains to source their components. Every enterprise has a supplier network, each of those suppliers has a supply chain network, and these sub-tier suppliers have their own supply chain networks. The resulting technology supply chain is a network of integrated suppliers.

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Cybersecurity by design: Addressing risk in a sustainable way across the ecosystem

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fit for purpose is manifested in two essential ways:

– Does the product meet essential functional requirements?
– Has the product or component been produced by a trustworthy provider?

Of course, the leaders or owners of these systems have to do their part to achieve security and safety: e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats. Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

Assuring the quality and integrity of mission-critical technology

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

The very process of manufacturing technology is not immune to cyber-attack. Attacks on the supply chain are typically motivated by monetary gain, and their primary goals are to inflict massive economic damage in an effort to gain global economic advantage, or to seed targets with malware that provides unfettered access for attackers.

It is for this reason that the global technology manufacturing industry must establish practices that mitigate this risk by increasing the cost barriers of launching such attacks and increasing the likelihood of being caught before the effects of such an attack are irreversible. As these threats evolve, the global ICT industry must deploy enhanced security through advanced automated cyber intelligence analysis. As critical infrastructure becomes more automated, integrated and essential to critical functions, the technology supply chain that surrounds it must be considered a principal theme of the overall global security and risk mitigation strategy.

A global, agile, and scalable approach to supply chain security

Certainly, the manner in which technologies are invented, produced, and sold requires a global, agile, and scalable approach to supply chain assurance, which is essential to achieving the desired results. Any technology supply chain security standard that hopes to be widely adopted must be flexible and country-agnostic. The very nature of the global supply chain (massively segmented and diverse) requires an approach that provides practicable guidance but avoids being overly prescriptive. Such an approach would require the aggregation of industry practices that have been proven beneficial and effective at mitigating risk.

The OTTF (The Open Group Trusted Technology Forum) is an increasingly recognized and promising industry initiative to establish best practices to mitigate the risk of technology supply chain attack. Facilitated by The Open Group, a recognized international standards and certification body, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Current membership includes many of the most well-known technology vendors, integrators, and technology assessment laboratories.

The benefits of O-TTPS for governments and enterprises

IBM is currently a member of the OTTF and has been honored to hold the Chair for the last three years.  Governments and enterprises alike will benefit from the work of the OTTF. Technology purchasers can use the Open Trusted Technology Provider™ Standard (O-TTPS) and Framework best-practice recommendations to guide their strategies.

A wide range of technology vendors can use O-TTPS approaches to build security and integrity into their end-to-end supply chains. The first version of the O-TTPS is focused on mitigating the risk of maliciously tainted and counterfeit technology components or products. Note that a maliciously tainted product is one that has been produced by the provider and acquired through reputable channels but which has been tampered with maliciously. A counterfeit product is produced other than by or for the provider, or is supplied by a non-reputable channel, and is represented as legitimate. The OTTF is currently working on a program that will accredit technology providers who conform to the O-TTPS. IBM expects to complete pilot testing of the program by 2014.

IBM has actively supported the formation of the OTTF and the development of the O-TTPS for several reasons. These include but are not limited to the following:

– The Forum was established within a trusted and respected international standards body – The Open Group.
– The Forum was founded, in part, through active participation by governments in a true public-private partnership, and government members continue to participate actively.
– The OTTF membership includes some of the most mature and trusted commercial technology manufacturers and vendors, and a primary objective of the OTTF is harmonization with other standards groups such as ISO (International Organization for Standardization) and Common Criteria.

The O-TTPS defines a framework of organizational guidelines and best practices that enhance the security and integrity of COTS ICT. The first version of the O-TTPS is focused on mitigating certain risks of maliciously tainted and counterfeit products within the technology development / engineering lifecycle. These best practices are equally applicable for systems integrators; however, the standard is intended to primarily address the point of view of the technology manufacturer.

O-TTPS requirements

The O-TTPS requirements are divided into three categories:

1. Development / Engineering Process and Method
2. Secure Engineering Practices
3. Supply Chain Security Practices

The O-TTPS is intended to establish a normalized set of criteria against which a technology provider, component supplier, or integrator can be assessed. The standard is divided into categories that define best practices for engineering development practices, secure engineering, and supply chain security and integrity intended to mitigate the risk of maliciously tainted and counterfeit components.

The accreditation program

As part of the process for developing the accreditation criteria and policy, the OTTF established a pilot accreditation program. The purpose of the pilot was to take a handful of companies through the accreditation process and remediate any potential process or interpretation issues. IBM participated in the O-TTPS accreditation pilot to accredit a very significant segment of the software product portfolio: the Application Infrastructure Middleware Division (AIM), which includes the flagship WebSphere product line. The AIM pilot started in mid-2013, completed in the first week of 2014, and was formally recognized as accredited in the first week of February 2014.

IBM is currently leveraging the value of the O-TTPS and working to accredit additional development organizations. Some of the lessons learned during the IBM AIM initial O-TTPS accreditation include:

– Conduct a pre-assessment against the O-TTPS before formally entering accreditation. This allows for remediation of any gaps and reduces potential assessment costs and project schedule.
– Start with a segment of your development portfolio that has mature secure engineering practices and processes. This helps an organization address accreditation requirements and facilitates interactions with the third-party lab.
– Use your first successful O-TTPS accreditation to create templates that will help drive data gathering and validate practices, establishing a repeatable process as your organization undertakes additional accreditations.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM’s industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). Andras holds undergraduate degrees in Biology and Computer Science and a Masters Degree in Computer Science from James Madison University. He has been a driving force behind IBM’s adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.

1 Comment

Filed under Accreditations, Cybersecurity, government, O-TTF, O-TTPS, OTTF, RISK Management, Standards, supply chain, Supply chain risk

Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

By The Open Group 

At the recent San Francisco 2014 conference, The Open Group Trusted Technology Forum (OTTF) announced the launch of the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program.

The program is one of the first accreditation programs worldwide aimed at assuring the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

In three short years since OTTF launched, the forum has grown to include more than 25 member companies dedicated to safeguarding the global supply chain against the increasing sophistication of cybersecurity attacks through standards. Accreditation is yet another step in the process of protecting global technology supply chains from maliciously tainted and counterfeit products.

As part of the program, third-party assessor companies will be employed to assess organizations applying for accreditation, with The Open Group serving as the vendor-neutral Accreditation Authority that operates the program. Prior to the launch, the forum conducted a pilot program with a number of member companies. It was announced at the conference that IBM is the first company to become accredited, earning accreditation for its Application, Infrastructure and Middleware (AIM) software business division for its product integrity and supply chain practices.

We recently spoke with OTTF members Fiona Pattinson, director of strategy and business development at Atsec Information Security, and Erin Connor, director at EWA-Canada, at the San Francisco conference to learn more about the assessment process and the new program.

The O-TTPS focus is on securing the technology supply chain. What would you say are the biggest threats facing the supply chain today?

Fiona Pattinson (FP): I think in the three years since the forum began, certainly all the members have discussed the various threats quite a lot. It was one of the things we discussed as an important topic early on, and I don’t know if it’s the ‘biggest threat,’ but certainly the most important threats that we needed to address initially were those of counterfeit and maliciously tainted products. We came to that through both discussion with all the industry experts in the forum and also through research into some of the requirements from government, so that’s exactly how we knew which threats [to start with].

Erin Connor (EC):  And the forum benefits from having both sides of the acquisition process, both acquirers, and the suppliers and vendors. So they get both perspectives.

How would you define maliciously tainted and counterfeit products?

FP:  They are very carefully defined in the standard—we needed to do that because people’s understanding of that can vary so much.

EC: And actually the concept of ‘maliciously’ tainted was incorporated close to the end of the development process for the standard at the request of members on the acquisition side of the process.

[Note: The standard precisely defines maliciously tainted and counterfeit products as follows:

“The two major threats that acquirers face today in their COTS ICT procurements, as addressed in this Standard, are defined as:

1. Maliciously tainted product – the product is produced by the provider and is acquired through a provider’s authorized channel, but has been tampered with maliciously.

2. Counterfeit product – the product is produced other than by, or for, the provider, or is supplied to the provider by other than a provider’s authorized channel and is presented as being legitimate even though it is not.”]

The OTTF announced the Accreditation Program for the OTTP Standard at the recent San Francisco conference. Tell us about the standard and how the accreditation program will help ensure conformance to it?

EC: The program is intended to provide organizations with a way to accredit their lifecycle processes for their product development so they can prevent counterfeit or maliciously tainted components from getting into the products they are selling to an end user or into somebody else’s supply chain. It was determined that a third-party type of assessment program would be used. For the organizations, they will know that we Assessors have gone through a qualification process with The Open Group and that we have in place all that’s required on the management side to properly do an assessment. From the consumer side, they have confidence the assessment has been completed by an independent third-party, so they know we aren’t beholden to the organizations to give them a passing grade when perhaps they don’t deserve it. And then of course The Open Group is in position to oversee the whole process and award the final accreditation based on the recommendation we provide.  The Open Group will also be the arbiter of the process between the assessors and organizations if necessary. 

FP:  So The Open Group’s accreditation authority is validating the results of the assessors.

EC: It’s a model that is employed in many, many other product or process assessment and evaluation programs, where the actual accreditation authority steps back and has third parties do the assessment.

FP: It is important that the assessor companies are working to the same standard so that there’s no advantage in taking one assessor over the other in terms of the quality of the assessments that are produced.

How does the accreditation program work?

FP: Well, it’s brand new so we don’t know if it is perfect yet, but having said that, we have worked over several months on defining the process, and we have drawn from The Open Group’s existing accreditation programs, as well as from the forum experts who have worked in the accreditation field for many years. We have been performing pilot accreditations in order to check out how the process works. So it is already tested.

How does it actually work? Well, first of all, an organization will feel the need to become accredited and at that point will apply to The Open Group to get the accreditation underway. Once their scope of accreditation – which may be as small as one product or theoretically as large as a whole global company – has been defined, and once the application is reviewed and approved by The Open Group, then they engage an assessor.

There is a way of sampling a large scope to identify the process variations in a larger scope using something we term ‘selective representative products.’ It’s basically a way of logically sampling a big scope so that we capture the process variations within the scope and make sure that the assessment is kept to a reasonable size for the organization undergoing the assessment, but it also gives good assurance to the consumers that it is a representative sample. The assessment is performed by the Recognized Assessor company, and a final report is written and provided to The Open Group for their validation. If everything is in order, then the company will be accredited and their scope of conformance will be added to the accreditation register and trademarked.

EC: So the customers of that organization can go and check the registration for exactly what products are covered by the scope.

FP: Yes, the register is public and anybody can check. So if IBM says WebSphere is accredited, you can go and check that claim on The Open Group web site.

How long does the process take or does it vary?

EC: It will vary depending on how large the scope to be accredited is in terms of the size of the representative set and the documentation evidence. It really does depend on what the variations in the processes are among the product lines as to how long it takes the assessor to go through the evidence and then to produce the report. The other side of the coin is how long it takes the organization to produce the evidence. It may well be that they might not have it totally there at the outset and will have to create some of it.

FP: As Erin said, it varies by the complexity and the variation of the processes and hence the number of selected representative products. There are other factors that can influence the duration. There are three parties influencing that: The applicant Organization, The Open Group’s Accreditation Authority and the Recognized Assessor.

For example, we found that the initial work by the Organization and the Accreditation Authority in checking the scope and the initial documentation can take a few weeks for a complex scope, of course for the pilots we were all new at doing that. In this early part of the project it is vital to get the scope both clearly defined and approved since it is key to a successful accreditation.

It is important that an Organization assigns adequate resources to help keep this to the shortest time possible, both during the initial scope discussions, and during the assessment. If the Organization can provide all the documentation before they get started, then the assessors are not waiting for that and the duration of the assessment can be kept as short as possible.

Of course, the resources assigned by the Recognized Assessor also influence how long an assessment takes. A variable for the assessors is how much documentation they have to read and review. It might be small or it might be a mountain.

The Open Group’s final review and oversight of the assessment takes some time and is influenced by resource availability within that organization. If they have any questions it may take a little while to resolve.

What kind of safeguards does the accreditation program put in place for enforcing the standard?

FP: It is a voluntary standard—there’s no requirement to comply. Currently some of the U.S. government organizations are recommending it. For example, NASA in their SEWP contract and some of the draft NIST documents on Supply Chain refer to it, too.

EC: In terms of actual oversight, we review what their processes are as assessors, and the report and our recommendations are based on that review. The accreditation expires after three years so before the three years is up, the organization should actually get the process underway to obtain a re-accreditation.  They would have to go through the process again but there will be a few more efficiencies because they’ve done it before. They may also wish to expand the scope to include the other product lines and portions of the company. There aren’t any periodic ‘spot checks’ after accreditation to make sure they’re still following the accredited processes, but part of what we look at during the assessment is that they have controls in place to ensure they continue doing the things they are supposed to be doing in terms of securing their supply chain.

FP: And then the key part is that the agreement the organization signs with The Open Group includes the fact that the organization warrants and represents that it remains in conformance with the standard throughout the accreditation period. So there is that assurance too, which builds on the more formal assessment checks.

What are the next steps for The Open Group Trusted Technology Forum?  What will you be working on this year now that the accreditation program has started?

FP: Reviewing the lessons we learned through the pilot!

EC: And reviewing comments from members on the standard now that it’s publicly available and working on version 1.1 to make any corrections or minor modifications. While that’s going on, we’re also looking ahead to version 2 to make more substantial changes, if necessary. The standard is definitely going to be evolving for a couple of years and then it will reach a steady state, which is the normal evolution for a standard.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey –

Fiona Pattinson is responsible for developing new and existing atsec service offerings.  Under the auspices of The Open Group’s OTTF, alongside many expert industry colleagues, Fiona has helped develop The Open Group’s O-TTPS, including developing the accreditation program for supply chain security.  In the past, Fiona has led service developments which have included establishing atsec’s US Common Criteria laboratory, the CMVP cryptographic module testing laboratory, the GSA FIPS 201 TP laboratory, TWIC reader compliance testing, NPIVP, SCAP, PCI, biometrics testing and penetration testing. Fiona has responsibility for understanding a broad range of information security topics and the application of security in a wide variety of technology areas from low-level design to the enterprise level.

Erin Connor is the Director at EWA-Canada responsible for EWA-Canada’s Information Technology Security Evaluation & Testing Facility, which includes a Common Criteria Test Lab, a Cryptographic & Security Test Lab (FIPS 140 and SCAP), a Payment Assurance Test Lab (device testing for PCI PTS POI & HSM, Australian Payment Clearing Association and Visa mPOS) and an O-TTPS Assessor lab Recognized by The Open Group.  Erin participated with other expert members of The Open Group Trusted Technology Forum (OTTF) in the development of The Open Group Trusted Technology Provider Standard for supply chain security and its accompanying Accreditation Program.  Erin joined EWA-Canada in 1994 and his initial activities in the IT Security and Infrastructure Assurance field included working on the team fielding a large-scale Public Key Infrastructure system, Year 2000 remediation and studies of wireless device vulnerabilities.  Since 2000, Erin has been working on evaluations of a wide variety of products including hardware security modules, enterprise security management products, firewalls, mobile device and management products, as well as system and network vulnerability management products.  He was also the only representative of an evaluation lab in the Biometric Evaluation Methodology Working Group, which developed a proposed methodology for the evaluation of biometric technologies under the Common Criteria.

Comments Off on Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

Filed under Accreditations, Cybersecurity, OTTF, Professional Development, Standards, Supply chain risk

New Accreditation Program – Raises the Bar for Securing Global Supply Chains

By Sally Long, Director of The Open Group Trusted Technology Forum (OTTF)™

In April 2013, The Open Group announced the release of the Open Trusted Technology Provider™ Standard (O-TTPS) 1.0 – Mitigating Maliciously Tainted and Counterfeit Products. Now we are announcing the O-TTPS Accreditation Program, launched on February 3, 2014, which enables organizations that conform to the standard to be accredited as Open Trusted Technology Providers™.

The O-TTPS, a standard of The Open Group, provides a set of guidelines, recommendations and requirements that help assure against maliciously tainted and counterfeit products throughout commercial off-the-shelf (COTS) information and communication technology (ICT) product lifecycles. The standard includes best practices throughout all phases of a product’s life cycle: design, sourcing, build, fulfillment, distribution, sustainment, and disposal, thus enhancing the integrity of COTS ICT products and the security of their global supply chains.

This accreditation program is one of the first of its kind in providing accreditation for conforming to standards for product integrity coupled with supply chain security.

The standard and the accreditation program are the result of a collaboration between government, third party evaluators and some of industry’s most mature and respected providers who came together and, over a period of four years, shared their practices for integrity and security, including those used in-house and those used with their own supply chains.

Applying for O-TTPS Accreditation

When the OTTF started this initiative, one of its many mantras was “raise all boats.” The  objective was to raise the security bar across the full spectrum of the supply chain, from small component suppliers to the providers who include those components in their products and to the integrators who incorporate those providers’ products into customers’ systems.

The O-TTPS Accreditation Program is open to all component suppliers, providers and integrators. The holistic aspect of this program’s potential should not be underestimated—but it will take a concerted effort to reach and encourage all constituents in the supply chain to become involved.

The importance of mitigating the risk of maliciously tainted and counterfeit products

The focus on mitigating the risks of tainted and counterfeit products by increasing the security of the supply chain is critical in today’s global economy. Virtually nothing is made from one source.

COTS ICT supply chains are complex. A single product can be composed of hundreds of components from multiple component suppliers in numerous different areas around the world—and providers can change their component suppliers frequently, depending on the going rate for a particular component.  If, along the supply chain, bad things happen, such as inserting counterfeit components in place of authentic ones, inserting maliciously tainted code, or the double-hammer—maliciously tainted counterfeit parts—then terrible things can happen when that product is installed at a customer site.

With the threat of tainted and counterfeit technology products posing a major risk to global organizations, it is increasingly important for those organizations to take what steps they can to mitigate these risks. The O-TTPS Accreditation Program is one of those steps. Can an accreditation program completely eliminate the risk of tainted and counterfeit components? No!  Does it reduce the risk? Absolutely!

How the Accreditation Program works

The Open Group, with over 25 years’ experience managing vendor- and technology-neutral certification programs, will assume the role of the Accreditation Authority over the entire program. Additionally the program will utilize third-party assessors to assess conformance to the O-TTPS requirements.

Companies seeking accreditation will declare their Scope of Accreditation, which means they can choose to be accredited for conforming to the O-TTPS standard and adhering to the best practice requirements across their entire enterprise, within a specific product line or business unit or within an individual product.  Organizations applying for accreditation are then required to provide evidence of conformance for each of the O-TTPS requirements, demonstrating they have the processes in place to secure in-house development and their supply chains across the entire COTS ICT product lifecycle. O-TTPS accredited organizations will then be able to identify themselves as Open Trusted Technology Providers™ and will become part of a public registry of trusted providers.

The Open Group has also instituted the O-TTPS Recognized Assessor Program, which assures that Recognized Assessor (companies) meet certain criteria as assessor organizations and that their assessors (individuals) meet an additional set of criteria and have passed the O-TTPS Assessor exam, before they can be assigned to an O-TTPS Assessment. The Open Group will operate this program, grant O-TTPS Recognized Assessor certificates and list those qualifying organizations on a public registry of recognized assessor companies.

Efforts to increase awareness of the program

The Open Group understands that to achieve global uptake we need to reach out to other countries across the globe for market adoption, as well as to other standards groups for harmonization. The forum has a very active outreach and harmonization work group and the OTTF is increasingly being recognized for its efforts. A number of prominent U.S. government agencies, including the General Accounting Office and NASA, have recognized the standard as an important supply chain security effort. Dave Lounsbury, the CTO of The Open Group, has testified before Congress on the value of this initiative from the industry-government partnership perspective. The Open Group has also met with President Obama’s Cybersecurity Coordinators (past and present) to apprise them of our work. We continue to work closely with NIST from the perspective of the Cybersecurity Framework, which recognizes the supply chain as a critical area for the next version, and the OTTF work is acknowledged in NIST’s Special Publication 800-161. We have liaisons with ISO and are working internally at mapping our standards and accreditation to Common Criteria. The O-TTPS has also been discussed with government agencies in China, India, Japan and the UK.

The initial version of the standard and the accreditation program are just the beginning. OTTF members will continue to evolve both the standard and the accreditation program to provide additional versions that refine existing requirements, introduce additional requirements, and cover additional threats. And the outreach and harmonization efforts will continue to strengthen so that we can reach that holistic potential of Open Trusted Technology Providers™ throughout all global supply chains.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey –

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer-supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of the OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.

Comments Off on New Accreditation Program – Raises the Bar for Securing Global Supply Chains

Filed under Cybersecurity, OTTF, Supply chain risk

Summer in the Capitol – Looking Back at The Open Group Conference in Washington, D.C.

By Jim Hietala, The Open Group

This past week in Washington D.C., The Open Group held our Q3 conference. The theme for the event was “Cybersecurity – Defend Critical Assets and Secure the Global Supply Chain,” and the conference featured a number of thought-provoking speakers and presentations.

Cybersecurity is at a critical juncture, and conference speakers highlighted the threat and attack reality and described industry efforts to move forward in important areas. The conference also featured a new capability, as several of the events were Livestreamed to the Internet.

For those who did not make the event, here’s a summary of a few of the key presentations, as well as what The Open Group is doing in these areas.

Joel Brenner, attorney with Cooley, was our first keynote. Joel’s presentation was titled “Turning Us Inside-Out: Crime and Economic Espionage on our Networks.” The talk mirrored his recent book, “America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime, and Warfare,” and Joel talked about current threats to critical infrastructure, attack trends and challenges in securing information. Joel’s presentation was a wakeup call to the very real issues of IP theft and identity theft. Beyond describing the threat and attack landscape, Joel discussed some of the management challenges related to ownership of the problem, namely that the different stakeholders in addressing cybersecurity in companies, including legal, technical, management and HR, all tend to think that this is someone else’s problem. Joel stated the need for policy spanning the entire organization to fully address the problem.

Kristin Baldwin, principal deputy, systems engineering, Office of the Assistant Secretary of Defense, Research and Engineering, described the U.S. Department of Defense (DoD) trusted defense systems strategy and challenges, including requirements to secure their multi-tiered supply chain. She also talked about how the acquisition landscape has changed over the past few years. In addition, for all programs the DoD now requires the creation of a program protection plan, which is the single focal point for security activities on the program. Kristin’s takeaways included needing a holistic approach to security, focusing attention on the threat, and avoiding risk exposure from gaps and seams. DoD’s Trusted Defense Systems Strategy provides an overarching framework for trusted systems. Stakeholder integration with acquisition, intelligence, engineering, industry and research communities is key to success. Systems engineering brings these stakeholders, risk trades, policy and design decisions together. Kristin also stressed the importance of informing leadership early and providing programs with risk-based options.

Dr. Ron Ross of NIST described a perfect storm: the proliferation of information systems and networks and the increasing sophistication of threats are resulting in an increasing number of penetrations of information systems in the public and private sectors, potentially affecting security and privacy. He proposed the need for an integrated project team approach to information security. Dr. Ross also provided an overview of the changes coming in NIST SP 800-53, version 4, which is presently available in draft form. He also advocated a dual protection strategy involving traditional controls at network perimeters, which assume attackers are outside organizational networks, as well as agile defenses, which assume attackers are already inside the perimeter. The objective of agile defenses is to enable operation while under attack and to minimize response times to ongoing attacks. This new approach mirrors thinking from the Jericho Forum and others on de-perimeterization and security and is very welcome.

The Open Group Trusted Technology Forum provided a panel discussion on supply chain security issues and the approach that the forum is taking towards addressing issues relating to taint and counterfeit in products. The panel included Andras Szakal of IBM, Edna Conway of Cisco and Dan Reddy of EMC, as well as Dave Lounsbury, CTO of The Open Group. OTTF continues to make great progress in the area of supply chain security, having published a snapshot of the Open Trusted Technology Provider Framework, working to create a conformance program, and in working to harmonize with other standards activities.

Dave Hornford, partner at Conexiam and chair of The Open Group Architecture Forum, provided a thought provoking presentation titled, “Secure Business Architecture, or just Security Architecture?” Dave’s talk described the problems in approaches that are purely focused on securing against threats and brought forth the idea that focusing on secure business architecture was a better methodology for ensuring that stakeholders had visibility into risks and benefits.

Geoff Besko, CEO of Seccuris and co-leader of the security integration project for the next version of TOGAF®, delivered a presentation that looked at risk from both a positive and a negative view. He recognized that senior management frequently view risk as something to embrace, taking risk with an eye on business gains in revenue, market share or profitability, while security practitioners tend to focus on risk as something to be mitigated. Finding common ground is key here.

Katie Lewin, who is responsible for the GSA FedRAMP program, provided an overview of the program, and how it is helping raise the bar for federal agency use of secure Cloud Computing.

The conference also featured a workshop on security automation, which featured presentations on a number of standards efforts in this area, including on SCAP, O-ACEML from The Open Group, MILE, NEA, AVOS and SACM. One conclusion from the workshop was that there’s presently a gap and a need for a higher level security automation architecture encompassing the many lower level protocols and standards that exist in the security automation area.

In addition to the public conference, a number of forums of The Open Group met in working sessions to advance their work in the Capitol. These included:

All in all, the conference clarified the magnitude of the cybersecurity threat, and the importance of initiatives from The Open Group and elsewhere to make progress on real solutions.

Join us at our next conference in Barcelona on October 22-25!

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Comments Off on Summer in the Capitol – Looking Back at The Open Group Conference in Washington, D.C.

Filed under Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Security Architecture, Supply chain risk, TOGAF®