Viewpoint: Technology Supply Chain Security – Becoming a Trustworthy Provider

By Andras Szakal, IBM

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fitness for purpose is manifested in two essential ways: first, does the product meet essential functional requirements; and second, has the product or component been produced by a trustworthy provider. Of course, the leaders or owners of these systems have to do their part to achieve security and safety (e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats). Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

For example:  Most of us are able to enjoy a meal of stir-fried shrimp and not give a second thought as to whether the shellfish is safe to eat.

Why is that? Because we are the beneficiaries of a system whose workings greatly increase the likelihood — in many parts of the world — that the shellfish served to end consumers is safe and uncontaminated. While tainted technology is not quite the same as tainted food, it's a useful analogy.

Of course, a very high percentage of the seafood industry is extremely motivated to provide safe and delicious shellfish to the end consumer. So we start with the practical perspective that, much more likely than not in today’s hyper-informed and communicative world, the food supply system will provide reasonably safe and tasty products. Invisible though it may be to most of us, however, this generalized confidence rests on a worldwide system that is built on globally recognized standards and strong public-private collaboration.

This system is necessary because mistakes happen, expectations evolve and — worse — the occasional participant in the food supply chain may take a shortcut in their processing practices. Therefore, some kind of independent oversight and certification has proven useful to assure consumers that what they pay for — their desired size and quality grade and, always, safety — is what they will get. In many countries, close cooperation between industry and government results in industry-led development and implementation of food safety standards.[1]

Government’s role is limited but important. Clearly, government cannot look at and certify every piece of shellfish people buy. So its actions are focused on areas in which it can best contribute: to take action in the event of a reported issue; to help convene industry participants to create and update safety practices; to educate consumers on how to choose and prepare shellfish safely; and to recognize top performers.[2]

Is the system perfect? Of course not. But it works, and supports the most practical and affordable methods of conducting safe and global commerce.

Let’s apply this learning to another sphere: information technology. To wit:

  • We need to start with the realization that the overwhelming majority of technology suppliers are motivated to provide securely engineered products and services, and that competitive dynamics reward those who consistently perform well.
  • However, we also need to recognize that there is a gap in time between the corrective effect of the market’s Invisible Hand and the damage that can be done in any given incident. Mistakes will inevitably happen, and there are some bad actors. So some kind of oversight and governmental participation are important, to set the right incentives and expectations.
  • We need to acknowledge that third-party inspection and certification of every significant technology product at the “end of pipe” is not only impractical but also insufficient. It will not achieve trust across a wide variety of infrastructures and industries.  A much more effective approach is to gather the world’s experts and coalesce industry practices around the processes that the experts agree are best suited to produce desired end results.
  • Any proposed oversight or government involvement must not stymie innovation or endanger a provider’s intellectual capital by requiring exposure to third-party assessments or overly burdensome escrow of source code.
  • Given the global and rapid manner in which technologies are invented, produced and sold, a global and agile approach to technology assurance is required to achieve scalable results. The approach should be based on understood and transparently formulated standards that are, to the maximum extent possible, industry-led and global in their applicability. Conforming to such standards once would then be recognized across multiple industries and geopolitical regions. Propagation of country-specific or industry-specific standards will result in economic fragmentation and slow the adoption of industry best practices.

The Open Group Trusted Technology Forum (OTTF)[3] is a promising and complementary effort in this regard. Facilitated by The Open Group, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Membership continues to grow and includes representation from manufacturers world-wide.

Governments and enterprises alike will benefit from OTTF’s work. Technology purchasers can use the Open Trusted Technology Provider (OTTP) Standard and OTTP Framework best practice recommendations to guide their strategies. And a wide range of technology vendors can use OTTF approaches to build security and integrity into their end-to-end supply chains. The first version of the OTTP Standard is focused on mitigating the risk of tainted and counterfeit technology components or products. The OTTF is currently working on a program that will accredit technology providers to the OTTP Standard. We expect to begin pilot testing of the program by the end of 2012.

Don’t misunderstand us: Market leaders like IBM have every incentive to engineer security and quality into our products and services. We continually encourage and support others to do the same.

But we realize that trusted technology — like food safety — can only be achieved if we collaborate with others in industry and in government.  That’s why IBM is pleased to be an active member of the Trusted Technology Forum, and looks forward to contributing to its continued success.

A version of this blog post was originally posted by the IBM Institute for Advanced Security.

Andras Szakal is the Chief Architect and a Senior Certified Software IT Architect for IBM’s Federal Software Sales business unit. His responsibilities include developing e-Government software architectures using IBM middleware and managing the IBM federal government software IT architect team. Szakal is a proponent of service-oriented and web services-based enterprise architectures and participates in open standards and open source product development initiatives within IBM.

 


“Making Standards Work®”

By Andrew Josey, The Open Group

Next month, as part of the ongoing process of “Making Standards Work®,” we will be setting standards and policy with those attending the member meetings at The Open Group Conference, London (May 9-12, Central Hall Westminster). The standards development activities cover a wide range of subject areas, from Cloud Computing, Tools and People certification, best practices for Trusted Technology, SOA and Quantum Lifecycle Management, to maintenance of existing standards such as TOGAF® and ArchiMate®. The common link among all these activities is that they are open standards developed by members of The Open Group.

Why do our members invest their time and efforts in development of open standards? The key reasons as I see them are as follows:

  1. Open standards are a core part of today’s infrastructure
  2. Open standards allow vendors to differentiate their offerings through a level of openness (portable interfaces and interoperability)
  3. Open standards establish a baseline from which competitors can innovate
  4. Open standards backed with certification enable customers to buy with increased confidence

This is all very well, you say — but what differentiates The Open Group from other standards organizations? Well, when The Open Group develops a new standard, we take an end-to-end view of the ecosystem, all the way from customer requirements, through development of consensus standards, to certification and procurement. We aim to deliver standards that meet a need in the marketplace and then back those up with certification that delivers assurance about products or, in the case of people certification, about an individual’s knowledge, skills and experience. We then take regular feedback on our standards, maintain them and evolve them according to marketplace needs. We also have a deterministic, timely process for developing our standards that helps to avoid the stalemate that can occur in some standards development efforts.

Let’s look briefly at two of the best-known Open Group standards: UNIX® and TOGAF®. Both are examples of a full ecosystem having been developed around the standard.

The UNIX® standard for operating systems has been around since 1995 and is now in its fourth major iteration. High reliability, availability and scalability are all attributes associated with certified UNIX® systems. As well as the multi-billion-dollar annual market in server systems from HP, Oracle, IBM and Fujitsu, there is an installed base of 50 million users* using The Open Group certified UNIX® systems on the desktop.

TOGAF® is the standard enterprise architecture method and framework. It encourages use with other frameworks and adoption of best practices for enterprise architecture. Now in its ninth iteration, it is freely available for internal use by any organization globally and is widely adopted, including by over 60% of the Fortune 50 and more than 80% of the Global Forbes 50. The TOGAF® certification program now has more than 15,000 certified individuals, including over 6,000 for TOGAF® 9.

If you are able to join us in London in May, I hope you will also join us at the member meetings to continue making standards work. If you are not yet a member, I hope you will attend the conference itself and network with the members to find out more and consider joining us in Making Standards Work®!

For more information on The Open Group Standards Process visit http://www.opengroup.org/standardsprocess/

(*) Number estimated by Apple, from a briefing in October 2010. Mac OS X is certified to the UNIX 03 standard.

Standards development will be part of member meetings taking place at The Open Group Conference, London, May 9-13. Join us for best practices and case studies on Enterprise Architecture, Cloud, Security and more, presented by preeminent thought leaders in the industry.

Andrew Josey is Director of Standards within The Open Group, responsible for the Standards Process across the organization. Andrew leads the standards development activities within The Open Group Architecture Forum, including the development and maintenance of TOGAF® 9 and the TOGAF® 9 People certification program. He also chairs the Austin Group, the working group responsible for development and maintenance of the POSIX 1003.1 standard that forms the core volumes of the Single UNIX® Specification. He is the ISO project editor for ISO/IEC 9945 (POSIX). He is a member of the IEEE Computer Society’s Golden Core and is the IEEE P1003.1 chair and the IEEE PASC Functional chair of Interpretations. Andrew is based in the UK.
