How the Open Trusted Technology Provider Standard (O-TTPS) and Accreditation Will Help Lower Cyber Risk

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

Changing business dynamics and enabling technologies

In 2008, IBM introduced the concept of a “Smarter Planet.” The Smarter Planet initiative focused, in part, on the evolution of globalization against the backdrop of changing business dynamics and enabling technologies. A key concept was the need for infrastructure to be tightly integrated, interconnected, and intelligent, thereby facilitating collaboration between people, government and businesses in order to meet the world’s growing appetite for data and automation. Since then, many industries and businesses have adopted this approach, including the ICT (information and communications technology) industries that support the global technology manufacturing supply chain.

Intelligent and interconnected critical systems

This transformation has infused technology into virtually all aspects of our lives, and involves, for example, government systems, the electric grid and healthcare. Most of these technological solutions are made up of hundreds or even thousands of components that are sourced from the growing global technology supply chain.

In the global technology economy, no single technology vendor or integrator can provide a complete solution on its own. It is no longer cost competitive to design all of the electronic components, printed circuit boards, card assemblies, or other sub-assemblies in-house. Adapting to the changing marketplace by balancing response time against cost efficiency drives more widespread use of OEM (original equipment manufacturer) products.

As a result, most technology providers procure from a myriad of global component suppliers, who very often rely on similarly complex supply chains to source their own components. Every enterprise has a supplier network, each of those suppliers has its own supply chain network, and those sub-tier suppliers have supply chain networks of their own. The resulting technology supply chain manifests as a network of integrated suppliers.

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Cybersecurity by design: Addressing risk in a sustainable way across the ecosystem

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fit for purpose is manifested in two essential ways:

- Does the product meet essential functional requirements?
- Has the product or component been produced by a trustworthy provider?

Of course, the leaders or owners of these systems have to do their part to achieve security and safety: e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats. Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

Assuring the quality and integrity of mission-critical technology

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

The very process of manufacturing technology is not immune to cyber-attack. Supply chain attacks are typically motivated by monetary gain: their goals range from inflicting massive economic damage in an effort to gain global economic advantage to seeding targets with malware that provides unfettered access for attackers.

It is for this reason that the global technology manufacturing industry must establish practices that mitigate this risk by increasing the cost barriers of launching such attacks and increasing the likelihood of being caught before the effects of such an attack are irreversible. As these threats evolve, the global ICT industry must deploy enhanced security through advanced automated cyber intelligence analysis. As critical infrastructure becomes more automated, integrated and essential to critical functions, the technology supply chain that surrounds it must be considered a principal theme of the overall global security and risk mitigation strategy.

A global, agile, and scalable approach to supply chain security

Certainly, the manner in which technologies are invented, produced, and sold requires a global, agile, and scalable approach to supply chain assurance; such an approach is essential to achieving the desired results. Any technology supply chain security standard that hopes to be widely adopted must be flexible and country-agnostic. The very nature of the global supply chain (massively segmented and diverse) requires an approach that provides practicable guidance but avoids being overly prescriptive. Such an approach requires the aggregation of industry practices that have proven beneficial and effective at mitigating risk.

The OTTF (The Open Group Trusted Technology Forum) is an increasingly recognized and promising industry initiative to establish best practices to mitigate the risk of technology supply chain attack. Facilitated by The Open Group, a recognized international standards and certification body, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Current membership includes many of the most well-known technology vendors, integrators, and technology assessment laboratories.

The benefits of O-TTPS for governments and enterprises

IBM is currently a member of the OTTF and has been honored to hold the Chair for the last three years.  Governments and enterprises alike will benefit from the work of the OTTF. Technology purchasers can use the Open Trusted Technology Provider™ Standard (O-TTPS) and Framework best-practice recommendations to guide their strategies.

A wide range of technology vendors can use O-TTPS approaches to build security and integrity into their end-to-end supply chains. The first version of the O-TTPS is focused on mitigating the risk of maliciously tainted and counterfeit technology components or products. Note that a maliciously tainted product is one that has been produced by the provider and acquired through reputable channels but has been tampered with maliciously. A counterfeit product is produced other than by or for the provider, or is supplied by a non-reputable channel, and is represented as legitimate. The OTTF is currently working on a program that will accredit technology providers who conform to the O-TTPS. IBM expects to complete pilot testing of the program by 2014.

IBM has actively supported the formation of the OTTF and the development of the O-TTPS for several reasons. These include but are not limited to the following:

- The Forum was established within a trusted and respected international standards body – The Open Group.
- The Forum was founded, in part, as a true public-private partnership in which government members actively participate.
- The OTTF membership includes some of the most mature and trusted commercial technology manufacturers and vendors, and a primary objective of the OTTF has been harmonization with other standards groups such as ISO (International Organization for Standardization) and Common Criteria.

The O-TTPS defines a framework of organizational guidelines and best practices that enhance the security and integrity of COTS ICT. The first version of the O-TTPS is focused on mitigating certain risks of maliciously tainted and counterfeit products within the technology development / engineering lifecycle. These best practices are equally applicable for systems integrators; however, the standard is intended primarily to address the point of view of the technology manufacturer.

O-TTPS requirements

The O-TTPS requirements are divided into three categories:

1. Development / Engineering Process and Method
2. Secure Engineering Practices
3. Supply Chain Security Practices

The O-TTPS is intended to establish a normalized set of criteria against which a technology provider, component supplier, or integrator can be assessed. The standard is divided into categories that define best practices for development and engineering processes, secure engineering, and supply chain security and integrity, all intended to mitigate the risk of maliciously tainted and counterfeit components.
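To make the idea of assessing against a normalized set of criteria more concrete, here is a minimal illustrative sketch in Python of how an organization might track evidence against requirements grouped under the three categories. It is not drawn from the standard itself: the category names are real, but the requirement identifiers, descriptions, and evidence entries are invented for the example.

```python
from dataclasses import dataclass, field

CATEGORIES = (
    "Development / Engineering Process and Method",
    "Secure Engineering Practices",
    "Supply Chain Security Practices",
)

@dataclass
class Requirement:
    category: str
    identifier: str    # hypothetical label, not taken from the standard
    description: str
    evidence: list = field(default_factory=list)  # e.g., process docs, audit records

    def satisfied(self) -> bool:
        # Treat a requirement as covered once at least one piece of
        # evidence has been collected for an assessor to review.
        return bool(self.evidence)

# Invented example entries, one per category.
checklist = [
    Requirement(CATEGORIES[0], "DEV-1", "Documented, repeatable build process"),
    Requirement(CATEGORIES[1], "SEC-1", "Threat analysis performed during design"),
    Requirement(CATEGORIES[2], "SCS-1", "Provenance tracking for sourced components"),
]

checklist[2].evidence.append("supplier audit report")

for req in checklist:
    status = "evidence on file" if req.satisfied() else "evidence needed"
    print(f"[{req.category}] {req.identifier}: {status}")
```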

The accreditation program

As part of the process for developing the accreditation criteria and policy, the OTTF established a pilot accreditation program. The purpose of the pilot was to take a handful of companies through the accreditation process and remediate any potential process or interpretation issues. IBM participated in the O-TTPS accreditation pilot to accredit a very significant segment of its software product portfolio: the Application Infrastructure Middleware Division (AIM), which includes the flagship WebSphere product line. The AIM pilot started in mid-2013, completed in the first week of 2014, and was formally recognized as accredited in the first week of February 2014.

IBM is currently leveraging the value of the O-TTPS and working to accredit additional development organizations. Some of the lessons learned during the IBM AIM initial O-TTPS accreditation include:

- Conducting a pre-assessment against the O-TTPS before formally entering accreditation. This allows for remediation of any gaps and reduces potential assessment costs and project schedule.
- Starting with a segment of your development portfolio that has mature secure engineering practices and processes. This helps an organization address accreditation requirements and facilitates interactions with the 3rd party lab.
- Using your first successful O-TTPS accreditation to create templates that will help drive data gathering and validate practices to establish a repeatable process as your organization undertakes additional accreditations.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM’s industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). Andras holds undergraduate degrees in Biology and Computer Science and a Master’s degree in Computer Science from James Madison University. He has been a driving force behind IBM’s adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.


Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

By The Open Group 

At the recent San Francisco 2014 conference, The Open Group Trusted Technology Forum (OTTF) announced the launch of the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program.

The program is one of the first accreditation programs worldwide aimed at assuring the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

In the three short years since the OTTF launched, the forum has grown to include more than 25 member companies dedicated to safeguarding the global supply chain against the increasing sophistication of cybersecurity attacks through standards. Accreditation is yet another step in the process of protecting global technology supply chains from maliciously tainted and counterfeit products.

As part of the program, third-party assessor companies will be employed to assess organizations applying for accreditation, with The Open Group serving as the vendor-neutral Accreditation Authority that operates the program. Prior to the launch, the forum conducted a pilot program with a number of member companies. It was announced at the conference that IBM is the first company to become accredited, earning accreditation for its Application, Infrastructure and Middleware (AIM) software business division for its product integrity and supply chain practices.

We recently spoke with OTTF members Fiona Pattinson, director of strategy and business development at Atsec Information Security, and Erin Connor, director at EWA-Canada, at the San Francisco conference to learn more about the assessment process and the new program.

The O-TTPS focus is on securing the technology supply chain. What would you say are the biggest threats facing the supply chain today?

Fiona Pattinson (FP): I think in the three years since the forum began certainly all the members have discussed the various threats quite a lot. It was one of the things we discussed as an important topic early on, and I don’t know if it’s the ‘biggest threat,’ but certainly the most important threats that we needed to address initially were those of counterfeit and maliciously tainted products. We came to that through both discussion with all the industry experts in the forum and also through research into some of the requirements from government, so that’s exactly how we knew which threats [to start with].

Erin Connor (EC): And the forum benefits from having both sides of the acquisition process, both acquirers and the suppliers and vendors. So they get both perspectives.

How would you define maliciously tainted and counterfeit products?

FP:  They are very carefully defined in the standard—we needed to do that because people’s understanding of that can vary so much.

EC: And actually the concept of ‘maliciously’ tainted was incorporated close to the end of the development process for the standard at the request of members on the acquisition side of the process.

[Note: The standard precisely defines maliciously tainted and counterfeit products as follows:

"The two major threats that acquirers face today in their COTS ICT procurements, as addressed in this Standard, are defined as:

1. Maliciously tainted product – the product is produced by the provider and is acquired through a provider’s authorized channel, but has been tampered with maliciously.

2. Counterfeit product – the product is produced other than by, or for, the provider, or is supplied to the provider by other than a provider’s authorized channel and is presented as being legitimate even though it is not."]

The OTTF announced the Accreditation Program for the O-TTPS at the recent San Francisco conference. Tell us about the standard and how the accreditation program will help ensure conformance to it?

EC: The program is intended to provide organizations with a way to accredit their lifecycle processes for their product development so they can prevent counterfeit or maliciously tainted components from getting into the products they are selling to an end user or into somebody else’s supply chain. It was determined that a third-party type of assessment program would be used. For the organizations, they will know that we Assessors have gone through a qualification process with The Open Group and that we have in place all that’s required on the management side to properly do an assessment. From the consumer side, they have confidence the assessment has been completed by an independent third-party, so they know we aren’t beholden to the organizations to give them a passing grade when perhaps they don’t deserve it. And then of course The Open Group is in a position to oversee the whole process and award the final accreditation based on the recommendation we provide. The Open Group will also be the arbiter of the process between the assessors and organizations if necessary.

FP:  So The Open Group’s accreditation authority is validating the results of the assessors.

EC: It’s a model that is employed in many, many other product or process assessment and evaluation programs where the actual accreditation authority steps back and has third parties do the assessment.

FP: It is important that the assessor companies are working to the same standard so that there’s no advantage in taking one assessor over the other in terms of the quality of the assessments that are produced.

How does the accreditation program work?

FP: Well, it’s brand new so we don’t know if it is perfect yet, but having said that, we have worked over several months on defining the process, and we have drawn from The Open Group’s existing accreditation programs, as well as from the forum experts who have worked in the accreditation field for many years. We have been performing pilot accreditations in order to check out how the process works. So it is already tested.

How does it actually work? Well, first of all an organization will feel the need to become accredited and at that point will apply to The Open Group to get the accreditation underway. Once their scope of accreditation – which may be as small as one product or theoretically as large as a whole global company – is defined, and once the application is reviewed and approved by The Open Group, then they engage an assessor.

There is a way of sampling a large scope to identify the process variations in a larger scope using something we term ‘selective representative products.’ It’s basically a way of logically sampling a big scope so that we capture the process variations within the scope and make sure that the assessment is kept to a reasonable size for the organization undergoing the assessment, but it also gives good assurance to the consumers that it is a representative sample. The assessment is performed by the Recognized Assessor company, and a final report is written and provided to The Open Group for their validation. If everything is in order, then the company will be accredited and their scope of conformance will be added to the accreditation register and trademarked.

EC: So the customers of that organization can go and check the registration for exactly what products are covered by the scope.

FP: Yes, the register is public and anybody can check. So if IBM says WebSphere is accredited, you can go and check that claim on The Open Group web site.

How long does the process take or does it vary?

EC: It will vary depending on how large the scope to be accredited is in terms of the size of the representative set and the documentation evidence. It really does depend on what the variations in the processes are among the product lines as to how long it takes the assessor to go through the evidence and then to produce the report. The other side of the coin is how long it takes the organization to produce the evidence. It may well be that they might not have it totally there at the outset and will have to create some of it.

FP: As Erin said, it varies by the complexity and the variation of the processes and hence the number of selected representative products. There are other factors that can influence the duration. There are three parties influencing it: the applicant Organization, The Open Group’s Accreditation Authority and the Recognized Assessor.

For example, we found that the initial work by the Organization and the Accreditation Authority in checking the scope and the initial documentation can take a few weeks for a complex scope; of course, for the pilots we were all new at doing that. In this early part of the project it is vital to get the scope both clearly defined and approved since it is key to a successful accreditation.

It is important that an Organization assigns adequate resources to help keep this to the shortest time possible, both during the initial scope discussions, and during the assessment. If the Organization can provide all the documentation before they get started, then the assessors are not waiting for that and the duration of the assessment can be kept as short as possible.

Of course the resources assigned by the Recognized Assessor also influence how long an assessment takes. A variable for the assessors is how much documentation they have to read and review. It might be small or it might be a mountain.

The Open Group’s final review and oversight of the assessment takes some time and is influenced by resource availability within that organization. If they have any questions it may take a little while to resolve.

What kind of safeguards does the accreditation program put in place for enforcing the standard?

FP: It is a voluntary standard—there’s no requirement to comply. Currently some of the U.S. government organizations are recommending it. For example, NASA in their SEWP contract and some of the draft NIST documents on Supply Chain refer to it, too.

EC: In terms of actual oversight, we review what their processes are as assessors, and the report and our recommendations are based on that review. The accreditation expires after three years so before the three years is up, the organization should actually get the process underway to obtain a re-accreditation.  They would have to go through the process again but there will be a few more efficiencies because they’ve done it before. They may also wish to expand the scope to include the other product lines and portions of the company. There aren’t any periodic ‘spot checks’ after accreditation to make sure they’re still following the accredited processes, but part of what we look at during the assessment is that they have controls in place to ensure they continue doing the things they are supposed to be doing in terms of securing their supply chain.

FP: And then the key part is that the agreement the organization signs with The Open Group includes the fact that the organization warrants and represents that it remains in conformance with the standard throughout the accreditation period. So there is that assurance too, which builds on the more formal assessment checks.

What are the next steps for The Open Group Trusted Technology Forum?  What will you be working on this year now that the accreditation program has started?

FP: Reviewing the lessons we learned through the pilot!

EC: And reviewing comments from members on the standard now that it’s publicly available and working on version 1.1 to make any corrections or minor modifications. While that’s going on, we’re also looking ahead to version 2 to make more substantial changes, if necessary. The standard is definitely going to be evolving for a couple of years and then it will reach a steady state, which is the normal evolution for a standard.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Fiona Pattinson is responsible for developing new and existing atsec service offerings. Under the auspices of The Open Group’s OTTF, alongside many expert industry colleagues, Fiona has helped develop The Open Group’s O-TTPS, including developing the accreditation program for supply chain security. In the past, Fiona has led service developments which have included establishing atsec’s US Common Criteria laboratory, the CMVP cryptographic module testing laboratory, the GSA FIPS 201 TP laboratory, TWIC reader compliance testing, NPIVP, SCAP, PCI, biometrics testing and penetration testing. Fiona has responsibility for understanding a broad range of information security topics and the application of security in a wide variety of technology areas from low-level design to the enterprise level.

Erin Connor is the Director at EWA-Canada responsible for EWA-Canada’s Information Technology Security Evaluation & Testing Facility, which includes a Common Criteria Test Lab, a Cryptographic & Security Test Lab (FIPS 140 and SCAP), a Payment Assurance Test Lab (device testing for PCI PTS POI & HSM, Australian Payment Clearing Association and Visa mPOS) and an O-TTPS Assessor lab recognized by The Open Group. Erin participated with other expert members of The Open Group Trusted Technology Forum (OTTF) in the development of The Open Group Trusted Technology Provider Standard for supply chain security and its accompanying Accreditation Program. Erin joined EWA-Canada in 1994 and his initial activities in the IT Security and Infrastructure Assurance field included working on the team fielding a large scale Public Key Infrastructure system, Year 2000 remediation and studies of wireless device vulnerabilities. Since 2000, Erin has been working on evaluations of a wide variety of products including hardware security modules, enterprise security management products, firewalls, mobile device and management products, as well as system and network vulnerability management products. He was also the only representative of an evaluation lab in the Biometric Evaluation Methodology Working Group, which developed a proposed methodology for the evaluation of biometric technologies under the Common Criteria.


New Accreditation Program Raises the Bar for Securing Global Supply Chains

By Sally Long, Director of The Open Group Trusted Technology Forum (OTTF)™

In April 2013, The Open Group announced the release of the Open Trusted Technology Provider™ Standard (O-TTPS) 1.0 – Mitigating Maliciously Tainted and Counterfeit Products. Now we are announcing the O-TTPS Accreditation Program, launched on February 3, 2014, which enables organizations that conform to the standard to be accredited as Open Trusted Technology Providers™.

The O-TTPS, a standard of The Open Group, provides a set of guidelines, recommendations and requirements that help assure against maliciously tainted and counterfeit products throughout commercial off-the-shelf (COTS) information and communication technology (ICT) product lifecycles. The standard includes best practices throughout all phases of a product’s life cycle: design, sourcing, build, fulfillment, distribution, sustainment, and disposal, thus enhancing the integrity of COTS ICT products and the security of their global supply chains.

This program is one of the first of its kind, providing accreditation for conformance to standards for product integrity coupled with supply chain security.

The standard and the accreditation program are the result of a collaboration between government, third party evaluators and some of industry’s most mature and respected providers who came together and, over a period of four years, shared their practices for integrity and security, including those used in-house and those used with their own supply chains.

Applying for O-TTPS Accreditation

When the OTTF started this initiative, one of its many mantras was “raise all boats.” The  objective was to raise the security bar across the full spectrum of the supply chain, from small component suppliers to the providers who include those components in their products and to the integrators who incorporate those providers’ products into customers’ systems.

The O-TTPS Accreditation Program is open to all component suppliers, providers and integrators. The holistic aspect of this program’s potential should not be underestimated—but it will take a concerted effort to reach and encourage all constituents in the supply chain to become involved.

The importance of mitigating the risk of maliciously tainted and counterfeit products

The focus on mitigating the risks of tainted and counterfeit products by increasing the security of the supply chain is critical in today’s global economy. Virtually nothing is made from one source.

COTS ICT supply chains are complex. A single product can be composed of hundreds of components from multiple component suppliers from numerous different areas around the world—and providers can change their component suppliers frequently depending on the going rate for a particular component. If, along the supply chain, bad things happen, such as inserting counterfeit components in place of authentic ones or inserting maliciously tainted code or the double-hammer—maliciously tainted counterfeit parts—then terrible things can happen when that product is installed at a customer site.

With the threat of tainted and counterfeit technology products posing a major risk to global organizations, it is increasingly important for those organizations to take what steps they can to mitigate these risks. The O-TTPS Accreditation Program is one of those steps. Can an accreditation program completely eliminate the risk of tainted and counterfeit components? No!  Does it reduce the risk? Absolutely!

How the Accreditation Program works

The Open Group, with over 25 years’ experience managing vendor- and technology-neutral certification programs, will assume the role of the Accreditation Authority over the entire program. Additionally, the program will utilize third-party assessors to assess conformance to the O-TTPS requirements.

Companies seeking accreditation will declare their Scope of Accreditation, which means they can choose to be accredited for conforming to the O-TTPS standard and adhering to the best practice requirements across their entire enterprise, within a specific product line or business unit or within an individual product.  Organizations applying for accreditation are then required to provide evidence of conformance for each of the O-TTPS requirements, demonstrating they have the processes in place to secure in-house development and their supply chains across the entire COTS ICT product lifecycle. O-TTPS accredited organizations will then be able to identify themselves as Open Trusted Technology Providers™ and will become part of a public registry of trusted providers.

The Open Group has also instituted the O-TTPS Recognized Assessor Program, which assures that Recognized Assessors (companies) meet certain criteria as assessor organizations and that their assessors (individuals) meet an additional set of criteria and have passed the O-TTPS Assessor exam, before they can be assigned to an O-TTPS Assessment. The Open Group will operate this program, grant O-TTPS Recognized Assessor certificates and list those qualifying organizations on a public registry of recognized assessor companies.

Efforts to increase awareness of the program

The Open Group understands that to achieve global uptake we need to reach out to other countries across the globe for market adoption, as well as to other standards groups for harmonization. The forum has a very active outreach and harmonization work group and the OTTF is increasingly being recognized for its efforts. A number of prominent U.S. government agencies, including the Government Accountability Office and NASA, have recognized the standard as an important supply chain security effort. Dave Lounsbury, the CTO of The Open Group, has testified before Congress on the value of this initiative from the industry-government partnership perspective. The Open Group has also met with President Obama’s Cybersecurity Coordinators (past and present) to apprise them of our work. We continue to work closely with NIST from the perspective of the Cybersecurity Framework, which recognizes the supply chain as a critical area for the next version, and the OTTF work is acknowledged in NIST’s Special Publication 800-161. We have liaisons with ISO and are working internally on mapping our standards and accreditation to Common Criteria. The O-TTPS has also been discussed with government agencies in China, India, Japan and the UK.

The initial version of the standard and the accreditation program are just the beginning. OTTF members will continue to evolve both the standard and the accreditation program to provide additional versions that refine existing requirements, introduce additional requirements, and cover additional threats. And the outreach and harmonization efforts will continue to strengthen so that we can reach that holistic potential of Open Trusted Technology Providers™ throughout all global supply chains.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of the OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.


Developing standards to secure our global supply chain

By Sally Long, Director of The Open Group Trusted Technology Forum (OTTF)™

In a world where tainted and counterfeit products pose significant risks to organizations, we see an increasing need for a standard that protects both organizations and consumers. Altered or non-genuine products introduce the possibility of untracked malicious behavior or poor performance. These risks can damage both customers and suppliers resulting in the potential for failed or inferior products, revenue and brand equity loss and disclosure of intellectual property.

On top of this, cyber-attacks are growing more sophisticated, forcing technology suppliers and governments to take a more comprehensive approach to risk management as it applies to product integrity and supply chain security. Customers are now seeking assurances that their providers are following standards to mitigate the risks of tainted and counterfeit components, while providers of Commercial Off-the-Shelf (COTS) Information and Communication Technology (ICT) products are focusing on protecting the integrity of their products and services as they move through the global supply chain.

In this climate we need a standard more than ever, which is why today we’re proud to announce the publication of the Open Trusted Technology Provider Standard (O-TTPS)™. The O-TTPS is the first complete standard published by The Open Group Trusted Technology Forum (OTTF)™, and it will benefit global providers and acquirers of COTS ICT products.

The first of its kind, the open standard has been developed to help organizations achieve Trusted Technology Provider status, assuring the integrity of COTS ICT products worldwide and safeguarding the global supply chain against the increased sophistication of cyber security attacks.

Specifically intended to prevent maliciously tainted and counterfeit products from entering the supply chain, the standard codifies best practices across the entire COTS ICT product lifecycle, including the design, sourcing, build, fulfillment, distribution, sustainment, and disposal phases. Our intention is that it will help raise the bar globally by helping the technology industry and its customers to “Build with Integrity, Buy with Confidence”™.

What’s next?

The OTTF is now working to develop an accreditation program to help provide assurance that Trusted Technology Providers conform to the O-TTPS Standard. The planned accreditation program is intended to mitigate maliciously tainted and counterfeit products by raising the assurance bar for component suppliers, technology providers, and integrators who are part of and depend on the global supply chain. Using the guidelines and best practices documented in the Standard as a basis, the OTTF will also release updated versions of the O-TTPS Standard based on changes to the threat landscape.

Interested in seeing the Standard for yourself? You can download it directly from The Open Group Bookstore, here. For more information on The Open Group Trusted Technology Forum, please click here, or keep checking back on the blog for updates.

 


Quick Hit Thoughts from RSA Conference 2013

By Joshua Brickman, CA Technologies

I have a great job at CA Technologies; I can’t deny it. Working in CA Technologies’ Federal Certification Program Office, I have the responsibility of knowing what certifications, accreditations, mandates, etc. are relevant and then helping get them implemented.

One of the responsibilities (and benefits) of my job is getting to go to great conferences like the RSA Security Conference, which just wrapped last week. This year I was honored to be selected by the Program Committee to speak twice at the event. Both talks fit well into the Policy and Government track at the show.

First I was on a panel with a distinguished group of senior leaders from both industry and government. The title of the session was, Certification of Products or Accreditation of Organizations: Which to Do? The idea was to discuss the advantages and disadvantages of individual product certifications vs. looking at an entire company or business unit. Since I’ve led CA through many product certifications (certs) and have been involved in accreditation programs as well, my role was to bring a real-world industry perspective to the panel. The point I tried to make was that product certs (like Common Criteria – CC) add value, but only for the specific purpose that they are designed for (security functions). We’ve seen CC expanding beyond just security-enforcing products, and that’s concerning. Product certs are expensive, time consuming and take away from time that could be spent on innovation. We want to do CC when it will be long-lasting and add value.

On the idea of accreditation of organizations, I first talked about CMMI and my views on its challenges. I then shifted to the Open Trusted Technology Forum (OTTF), a forum of The Open Group, as I’ve written about before, and said that the accreditation program that group is building is more focused than CMMI. OTTF is building something that – when adopted by industry and THEIR suppliers – will provide assurance that technology is being built the right way (best practices) and will give acquirers confidence that products bought from vendors that have the OTTF mark can be trusted. The overall conclusion of the panel was that accreditation of organizations and certification of products both have a place, and that it is important that the value is understood by buyers and vendors.

A couple of days later, I presented with Mary Ann Davidson, CSO of Oracle. The main point of the talk was to try and give the industry perspective on mandates, legislation and regulations – which all seemed to be focused on technology providers – to solve the cyber security issues which we see every day. We agreed that sometimes regulations make sense but having a clear problem definition, language and limited scope was the path to success and acceptance. We also encouraged government to get involved with industry via public/private partnerships, like The Open Group Trusted Technology Forum.

Collaboration is the key to fighting the cyber security battle. If you are interested in hearing more about ways to get involved in building a safer and more productive computing environment, feel free to contact me or leave a comment on this blog. Cybersecurity is a complicated issue and there were well over 20,000 security professionals discussing it at RSA Conference. We’d love to hear your views as well.

 This blog post was originally published on the CA Technologies blog.


Joshua Brickman, PMP (Project Management Professional), runs CA Technologies’ Federal Certifications Program. He has led CA through the successful evaluation of sixteen products through the Common Criteria over the last six years (in both the U.S. and Canada). He is also a Steering Committee member on The Open Group consortium focused on Supply Chain Integrity and Security, The Open Group Trusted Technology Forum (OTTF). He also runs CA Technologies’ Accessibility Program.


#ogChat Summary – 2013 Security Priorities

By Patty Donovan, The Open Group

Totaling 446 tweets, yesterday’s 2013 Security Priorities Tweet Jam (#ogChat) saw a lively discussion on the future of security in 2013 and became our most successful tweet jam to date. In case you missed the conversation, here’s a recap of yesterday’s #ogChat!

The event was moderated by former CNET security reporter Elinor Mills, and there was a total of 28 participants.

Here is a high-level snapshot of yesterday’s #ogChat:

Q1 What’s the biggest lesson learned by the security industry in 2012? #ogChat

The consensus among participants was that 2012 was a year of going back to the basics. There are many basic vulnerabilities within organizations that still need to be addressed, and it affects every aspect of an organization.

  • @Dana_Gardner Q1 … Security is not a product. It’s a way of conducting your organization, a mentality, affects all. Repeat. #ogChat #security #privacy
  • @Technodad Q1: Biggest #security lesson of 2102: everyone is in two security camps: those who know they’ve been penetrated & those who don’t. #ogChat
  • @jim_hietala Q1. Assume you’ve been penetrated, and put some focus on detective security controls, reaction/incident response #ogChat
  • @c7five Lesson of 2012 is how many basics we’re still not covering (eg. all the password dumps that showed weak controls and pw choice). #ogChat

Q2 How will organizations tackle #BYOD security in 2013? Are standards needed to secure employee-owned devices? #ogChat

Participants debated over the necessity of standards. Most agreed that standards and policies are key in securing BYOD.

  • @arj Q2: No “standards” needed for BYOD. My advice: collect as little information as possible; use MDM; create an explicit policy #ogChat
  • @Technodad @arj Standards are needed for #byod – but operational security practices more important than technical standards. #ogChat
  • @AWildCSO Organizations need to develop a strong asset management program as part of any BYOD effort. Identification and Classification #ogChat
  • @Dana_Gardner Q2 #BYOD forces more apps & data back on servers, more secure; leaves devices as zero client. Then take that to PCs too. #ogChat #security
  • @taosecurity Orgs need a BYOD policy for encryption & remote wipe of company data; expect remote compromise assessment apps too @elinormills #ogChat

Q3 In #BYOD era, will organizations be more focused on securing the network, the device, or the data? #ogChat

There was disagreement here. Some emphasized focusing on protecting data, while others argued that it is the devices and networks that need protecting.

  • @taosecurity Everyone claims to protect data, but the main ways to do so remain protecting devices & networks. Ignores code sec too. @elinormills #ogChat
  • @arj Q3: in the BYOD era, the focus must be on the data. Access is gated by employee’s entitlements + device capabilities. #ogChat
  • @Technodad @arj Well said. Data sec is the big challenge now – important for #byod, #cloud, many apps. #ogChat
  • @c7five Organization will focus more on device management while forgetting about the network and data controls in 2013. #ogChat #BYOD

Q4 What impact will using 3rd party #BigData have on corporate security practices? #ogChat

Participants agreed that using third parties will force organizations to rely on security provided by those parties. They also acknowledged that data must be secure in transit.

  • @daviottenheimer Q4 Big Data will redefine perimeter. have to isolate sensitive data in transit, store AND process #ogChat
  • @jim_hietala Q4. 3rd party Big Data puts into focus 3rd party risk management, and transparency of security controls and control state #ogChat
  • @c7five Organizations will jump into 3rd party Big Data without understanding of their responsibilities to secure the data they transfer. #ogChat
  • @Dana_Gardner Q4 You have to trust your 3rd party #BigData provider is better at #security than you are, eh? #ogChat  #security #SLA
  • @jadedsecurity @Technodad @Dana_Gardner has nothing to do with trust. Data that isn’t public must be secured in transit #ogChat
  • @AWildCSO Q4: with or without bigdata, third party risk management programs will continue to grow in 2013. #ogChat

Q5 What will global supply chain security look like in 2013? How involved should governments be? #ogChat

Supply chains are an emerging security issue, and governments need to get involved. But consumers will also start to understand what they are responsible for securing themselves.

  • @jim_hietala Q5. supply chain emerging as big security issue, .gov’s need to be involved, and Open Group’s OTTF doing good work here #ogChat
  • @Technodad Q5: Governments are going to act- issue is getting too important. Challenge is for industry to lead & minimize regulatory patchwork. #ogChat
  • @kjhiggins Q5: Customers truly understanding what they’re responsible for securing vs. what cloud provider is. #ogChat

Q6 What are the biggest unsolved issues in Cloud Computing security? #ogChat

Cloud security is a big issue. Most agreed that Cloud security is mysterious, and it needs to become more transparent. When Cloud providers claim they are secure, consumers and organizations put blind trust in them, making the problem worse.

  • @jadedsecurity @elinormills Q6 all of them. Corps assume cloud will provide CIA and in most cases even fails at availability. #ogChat
  • @jim_hietala Q6. Transparency of security controls/control state, cloud risk management, protection of unstructured data in cloud services #ogChat
  • @c7five Some PaaS cloud providers advertise security as something users don’t need to worry about. That makes the problem worse. #ogChat

Q7 What should be the top security priorities for organizations in 2013? #ogChat

Top security priorities varied. Priorities highlighted in the discussion included:  focusing on creating a culture that promotes secure activity; prioritizing security spending based on risk; focusing on where the data resides; and third-party risk management coming to the forefront.

  • @jim_hietala Q7. prioritizing security spend based on risks, protecting data, detective controls #ogChat
  • @Dana_Gardner Q7 Culture trumps technology and business. So make #security policy adherence a culture that is defined and rewarded. #ogChat #security
  • @kjhiggins Q7 Getting a handle on where all of your data resides, including in the mobile realm. #ogChat
  • @taosecurity Also for 2013: 1) count and classify your incidents & 2) measure time from detection to containment. Apply Lean principles to both. #ogChat
  • @AWildCSO Q7: Asset management, third party risk management, and risk based controls for 2013. #ogChat

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Operational Resilience through Managing External Dependencies

By Ian Dobson & Jim Hietala, The Open Group

These days, organizations are rarely self-contained. Businesses collaborate through partnerships and close links with suppliers and customers. Outsourcing services and business processes, including into Cloud Computing, means that key operations that an organization depends on are often fulfilled outside their control.

The challenge here is how to manage the dependencies your operations have on factors that are outside your control. The goal is to perform your risk management so that it optimizes your operational success by making your operations resilient in the face of these external dependencies.

The Open Group’s Dependency Modeling (O-DM) standard specifies how to construct a dependency model to manage risk and build trust over organizational dependencies between enterprises – and between operational divisions within a large organization. The standard involves constructing a model of the operations necessary for an organization’s success, including the dependencies that can affect each operation. Then, applying quantitative risk sensitivities to each dependency reveals those operations that have highest exposure to risk of not being successful, informing business decision-makers where investment in reducing their organization’s exposure to external risks will result in best return.
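To illustrate the general idea of that last step, here is a minimal sketch in Python. It is not the O-DM computation itself; it assumes a toy model in which each dependency carries a failure probability and a sensitivity weight, an operation’s exposure is the chance that at least one weighted dependency failure undermines it, and all names and numbers are invented.

```python
from dataclasses import dataclass

@dataclass
class Dependency:
    name: str
    failure_probability: float  # chance the external factor fails (0..1)
    sensitivity: float          # how strongly the operation depends on it (0..1)

@dataclass
class Operation:
    name: str
    dependencies: list

    def exposure(self) -> float:
        # Probability that at least one weighted dependency failure
        # undermines the operation, assuming independent dependencies.
        ok = 1.0
        for d in self.dependencies:
            ok *= 1.0 - d.failure_probability * d.sensitivity
        return 1.0 - ok

operations = [
    Operation("order fulfillment", [
        Dependency("cloud hosting provider", 0.05, 0.9),
        Dependency("payment processor", 0.02, 1.0),
    ]),
    Operation("component sourcing", [
        Dependency("sole-source supplier", 0.10, 1.0),
    ]),
]

# Rank operations by exposure so investment targets the riskiest first.
for op in sorted(operations, key=lambda o: o.exposure(), reverse=True):
    print(f"{op.name}: exposure {op.exposure():.3f}")
```

In a real O-DM model the sensitivities and their aggregation would follow the standard’s method; the point here is only that quantifying each dependency lets you rank operations by exposure.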

O-DM helps you to plan for success through operational resilience, assured business continuity, and effective new controls and contingencies, enabling you to:

  • Cut costs without losing capability
  • Make the most of tight budgets
  • Build a resilient supply chain
  • Lead programs and projects to success
  • Measure, understand and manage risk from outsourcing relationships and supply chains
  • Deliver complex event analysis

The O-DM analytical process facilitates organizational agility by allowing you to easily adjust and evolve your organization’s operations model, and produces rapid results to illustrate how reducing the sensitivity of your dependencies improves your operational resilience. O-DM also allows you to drill as deep as you need to go to reveal your organization’s operational dependencies.

Training on developing operational dependency models that conform to the O-DM standard is available, as are software computation tools that automate speedy delivery of actionable results in graphic formats to facilitate informed business decision-making.

The O-DM standard represents a significant addition to our existing Open Group Risk Management publications.

The O-DM standard may be accessed here.

Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world.  In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


The Open Group Trusted Technology Forum is Leading the Way to Securing Global IT Supply Chains

By Dana Gardner, Interarbor Solutions

This BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference in Washington, D.C., beginning July 16. The conference will focus on Enterprise Architecture (EA), enterprise transformation, and securing global supply chains.

We’re joined in advance by some of the main speakers at the conference to examine the latest efforts to make global supply chains for technology providers more secure, verified, and therefore trusted. We’ll examine the advancement of The Open Group Trusted Technology Forum (OTTF) to gain an update on the effort’s achievements, and to learn more about how technology suppliers and buyers can expect to benefit.

The expert panel consists of Dave Lounsbury, Chief Technical Officer at The Open Group; Dan Reddy, Senior Consultant Product Manager in the Product Security Office at EMC Corp.; Andras Szakal, Vice President and Chief Technology Officer at IBM’s U.S. Federal Group and also the Chair of the OTTF; and Edna Conway, Chief Security Strategist for Global Supply Chain at Cisco. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Why is this an important issue, and why is there a sense of urgency in the markets?

Lounsbury: The Open Group has a vision of boundaryless information flow, and that necessarily involves interoperability. But interoperability doesn’t have the effect that you want, unless you can also trust the information that you’re getting, as it flows through the system.

Therefore, it’s necessary that you be able to trust all of the links in the chain that you use to deliver your information. One thing that everybody who watches the news would acknowledge is that the threat landscape has changed. As systems become more and more interoperable, we get more and more attacks on the system.

As the value that flows through the system increases, there’s a lot more interest in cyber crime. Unfortunately, in our world, there’s now the issue of state-sponsored incursions in cyberspace, whether officially state-sponsored or not, but politically motivated ones certainly.

So there is an increasing awareness on the part of government and industry that we must protect the supply chain, both through increasing technical security measures, which are handled in lots of places, and in making sure that the vendors and consumers of components in the supply chain are using proper methodologies to make sure that there are no vulnerabilities in their components.

I’ll note that the demand we’re hearing is increasingly for work on standards in security. That’s top of everybody’s mind these days.

Reddy: One of the things that we’re addressing is the supply chain item that was part of the Comprehensive National Cybersecurity Initiative (CNCI), which spans the work of two presidents. Initiative 11 was to develop a multi-pronged approach to global supply chain risk management. That really started the conversation, especially in the federal government, as to how private industry and government should work together to address the risks there.

In the OTTF, we’ve tried to create a clear, measurable way to address supply-chain risk. It’s been really hard to even talk about supply chain risk, because you have to start with getting a common agreement about what the supply chain is, and then talk about how to deal with risk by following best practices.

Szakal: One of the observations that I’ve made over the last couple of years is that this group of individuals, who are now part of this standards forum, have grown in their ability to collaborate, define, and rise to the challenges, and work together to solve the problem.

Standards process

Technology supply chain security and integrity had not been taken on as a set of requirements or an initiative by standards committees or standards groups up to this point. The people who are participating in this aren’t your traditional IT standards gurus. They had to learn the standards process. They had to understand how to approach the standardization of best practices, which is how we approach solving this problem.

It’s sharing information. It’s opening up across the industry to share best practices on how to secure the supply chain and how to ensure its overall integrity. Our goal has been to develop a framework of best practices and then ultimately take those codified best practices and instantiate them into a standard, which we can then assess providers against. It’s a big effort, but I think we’re making tremendous progress.

Gardner: Because The Open Group Conference is taking place in Washington, D.C., what’s the current perception in the U.S. Government about this in terms of its role?

Szakal: The government has always taken a prominent role, at least to help focus the attention of the industry.

Now that they’ve corralled the industry and they’ve got us moving in the right direction, in many ways, we’ve fought through many of the intricate, complex technology supply chain issues, and we’re ahead of some of the thinking of folks outside of this group, because the industry lives these challenges and understands the state of the art. Some of the best minds in the industry are focused on this, and we’ve applied significant internal resources across our membership to work on this challenge.

So the government is very interested in it. We’ve had collaborations all the way from the White House across the Department of Defense (DoD) and within the Department of Homeland Security (DHS), and we have members from the government space in NASA and DoD.

It’s very much a collaborative effort, and I’m hoping that it can continue to be so and be utilized as a standard that the government can point to, instead of coming up with their own policies and practices that may actually not work as well as those defined by the industry.

Conway: Our colleagues on the public side of the public-private partnership that is addressing supply-chain integrity have recognized that we need to do it together.

More importantly, you need only listen to a statement, which I know has often been quoted but is worth noting again, from EU Commissioner Algirdas Semeta. He recently said that in a globalized world, no country can secure the supply chain in isolation. He recognized that, again quoting, national supply chains are ineffective and too costly unless they’re supported by enhanced international cooperation.

Mindful focus

The one thing that we bring to bear here is a mindful focus on the fact that we need a public-private partnership to address supply chain integrity comprehensively and internationally in our information and communications technology industry. That has been very important in our focus. We want to be a one-stop shop of best practices that the world can look at, so that we continue to benefit from commercial technology which sells globally and frequently builds once or on a limited basis.

Combining that international focus and the public-private partnership is something that’s really coming home to roost in everyone’s minds right now, as we see security value migrating away from an end point and looking comprehensively at the product lifecycle or the global supply chain.

Lounsbury: I had the honor of testifying before the U.S. House Energy and Commerce Subcommittee on Oversight and Investigations on the view from within the U.S. Government on IT security.

It was very gratifying to see that the government does recognize this problem. We had witnesses in from the DoD and Department of Energy (DoE). I was there because I was one of the two voices of industry that the government wants to tap into to get the industry’s best practices into the government.

It was even more gratifying to see that the concerns that were raised in the hearings were exactly the ones that the OTTF is pursuing. How do you validate a long and complex global supply chain in the face of a very wide threat environment, recognizing that no single country can do it alone? Also, it really does need to be not a process that you apply at a single point, but a standard that raises the security bar for all the participants in your supply chain.

So it was really good to know that we were on track and that governments, certainly the U.S. Government and, as we’ve heard from Edna, the European governments, and I suspect governments worldwide, are looking at exactly how to tap into this industry activity.

Gardner: Where are we in the progression of the OTTF?

Lounsbury: In the last 18 months, there has been a tremendous amount of progress. The thing that I’ll highlight is that early in 2012, the OTTF published a snapshot of the standard. A snapshot is what The Open Group uses to give a preview of what we expect the standard will contain. It fleshes out two areas, one on tainted products and one on counterfeit products, covering the standards and best practices needed to secure a supply chain against those two vulnerabilities.

So that’s out there. People can take a look at that document. Of course, we would welcome their feedback on it. We think other people have good answers too. Also, if they want to start using that as guidance for how they should shape their own practices, then that would be available to them.

Normative guidance

That’s the top development topic inside the OTTF itself. Of course, in parallel with that, we’re continuing to engage in an outreach process and talking to government agencies that have a stake in securing the supply chain, whether through government policy or other means of steering the government toward making the right decisions. In terms of exactly where we are, I’ll defer to Edna and Andras on the top priority in the group.

Gardner: Edna, what’s been going on at OTTF and where do things stand?

Conway: We decided that this was, in fact, a comprehensive effort that was going to grow over time and change as the challenges change. We began by looking at two primary areas, which were counterfeit and taint in that communications technology arena. In doing so, we first identified a set of best practices, which you referenced briefly inside of that snapshot.

Where we are today is adding the diligence, and extracting the knowledge and experience from the broad spectrum of participants in the OTTF, to establish a set of rigorous conformance criteria. Those criteria balance flexibility in how one goes about showing compliance to those best practices with assurance to the end customer that there is rigor sufficient to ensure that certain requirements are met meticulously, but most importantly comprehensively.

We have a practice right now where we’re going through each and every requirement or best practice and thinking through the broad spectrum of the development stage of the lifecycle, as well as the end-to-end nodes of the supply chain itself.

This is to ensure that there are conformance requirements that can be pointed to, both by those who would seek accreditation to this international standard and by those who would rely on that accreditation as the imprimatur of a higher degree of trustworthiness in the products and solutions afforded to them when they select an OTTF accredited provider.

Gardner: Andras, I’m curious where in an organization like IBM these issues are most enforceable. Where within the private sector does the knowledge and the expertise reside?

Szakal: Speaking for IBM, we recently celebrated our 100th anniversary in 2011. We’ve had a little more time than some folks to come up with a robust engineering and development process, which harkens back to the IBM 701 and the beginning of the modern computing era.

Integrated process

We have what we call the integrated product development process (IPD), which all products follow, and that includes hardware and software. And we have a very robust quality assurance team, the QSE team, which ensures that folks are following the practices that are called out. Within each line of business, there exist specific requirements that apply more directly to the architecture of a particular product offering.

For example, the hardware group obviously has additional standards that it has to follow during the course of development that are specific to hardware development and the associated supply chain, and that is true of the software team as well.

The product development teams are integrated with the supply chain folks, and we have what we call the Secure Engineering Framework, of which I was an author, and the Secure Engineering Initiative, which we have continued to evolve for quite some time now, to ensure that we are effectively engineering and sourcing components and that we’re following these Open Trusted Technology Provider Standard (O-TTPS) best practices.

In fact, the work that we’ve done here in the OTTF has helped to ensure that we’re focused in all of the same areas that Edna’s team is with Cisco, because we’ve shared our best practices across all of the members here in the OTTF, and it gives us a great view into what others are doing, and helps us ensure that we’re following the most effective industry best practices.

Gardner: Dan, at EMC, is the Product Security Office something similar to what Andras explained for how IBM operates? Perhaps you could just give us a sense of how it’s done there?

Reddy: At EMC, our Product Security Office houses the enabling expertise to define how to build our products securely. We’re interested in building that in as early as possible throughout the entire lifecycle. We work with all of our product teams to measure where they are and to help them define their path forward as they look at each of the releases of their products. And we’ve done a lot of work in sharing our practices within the industry.

One of the things this standard does for us, especially in the area of dealing with the supply chain, is give us a way to communicate what our practices are to our customers. Customers are looking for that kind of assurance, and rather than having a one-by-one conversation with each customer about what our practices are for a particular organization, this would allow us to demonstrate measurement and conformance against a standard to our own customers.

Also, as we flip it around and take a look at our own suppliers, we want to be able to encourage suppliers, which may be small suppliers, to conform to a standard, as we go and select who will be our authorized suppliers.

Gardner: Dave, what would you suggest for those various suppliers around the globe to begin the process?

Publications catalog

Lounsbury: Obviously, the thing I would recommend right off is to go to The Open Group website, go to the publications catalog, and download the snapshot of the OTTF standard. That gives a good overview of the two areas of best practices for protection from tainted and counterfeit products we’ve mentioned on the call here.

That’s the starting point, but of course, the reason it’s very important for the commercial world to lead this is that commercial vendors face the commercial market pressures and have to respond to threats quickly. So the other part of this is how to stay involved and how to stay up to date.

And of course, The Open Group offers two ways to do that. The first is to come to our quarterly conferences, where we do regular presentations on this topic. In fact, the Washington meeting is themed on supply chain security.

Of course, the best way to do it is to actually be in the room as these standards are evolved to meet the current and changing threat environment. So joining The Open Group and joining the OTTF is absolutely the best way to be on the cutting edge of what’s happening, and to take advantage of the great information you get from the companies represented on this call, who have invested years and years, as Andras said, in developing their own best practices and learning from them.

Gardner: Edna, what’s on the short list of next OTTF priorities?

Conway: You’ve heard us talk about CNCI, and the fact that cybersecurity is on everyone’s minds today. So while taint embodies that to some degree, we probably need to think about partnering in a more comprehensive way under the resiliency and risk umbrella that you heard Dan talk about and really think about embedding security into a resilient supply chain or a resilient enterprise approach.

In fact, to give that some forethought, we have invited to the upcoming conference a colleague I’ve worked with for a number of years, a leading expert in enterprise resiliency and supply chain resiliency, to join us and share his thoughts.

He is a professor at MIT, and his name is Yossi Sheffi. Dr. Sheffi will be with us. It’s from that kind of information sharing, as we think in a more comprehensive way, that we begin to gather the expertise that today resides globally in different pockets, whether in academia, government, or private enterprise, and to think about what the next generation is going to look like.

Resiliency, as it was known five years ago, is nothing like supply chain resiliency today, and where we want to take it into the future. You need only look at the U.S. National Strategy for Global Supply Chain Security to understand that. When it was announced in January of this year at Davos by Secretary Napolitano of the DHS, she made it quite clear that we’re now putting security at the forefront, and resiliency is a part of that security endeavor.

So that mindset is a change, given the ubiquitous reliance on communications, for everything, everywhere, at all times — not only critical infrastructure, but private enterprise, as well as all of us on a daily basis today. Our communications infrastructure is essential to us.

Thinking about resiliency

Given that security has taken top ranking, we’re probably at the beginning of this stage of thinking about resiliency. It’s not just about continuity of supply, and not just about prevention of the kinds of cyber incidents that we’re worried about, but also about being cognizant of the nation-state or personal concerns that arise from parties engaging in malicious activity, whether for political, religious, or other reasons.

Or, as you know, some of them are just interested in seeing whether or not they can challenge the system, and that causes loss of productivity and a loss of time. In some cases, there are devastating negative impacts to infrastructure.

Szakal: There’s another area, too, that I am highly focused on but have kind of set aside, and that’s the continued development and formalization of the framework itself: continuing to collect best practices from the industry and providing methods by which vendors can submit and externalize those best practices. So those are a couple of areas that I think would keep me busy for the next 12 months easily.

Gardner: What do IT vendor companies gain if they do this properly?

Secure by Design

Szakal: Especially now, in this day and age, any time that you approach security as part of the lifecycle (what we at IBM call Secure by Design), you’re going to be ahead of the market in some ways. You’re going to be in a better place. All of these best practices that we’ve defined are additive in effect. However, the very nature of technology as it exists today is that it will probably be another 50 or so years before we see a perfect security paradigm in the way that we all think about it.

So the researchers are going to be ahead of all of the providers in many ways in identifying security flaws and helping us to remediate them. That’s part of what we’re doing here, trying to make sure that we continue to keep these practices up to date and relevant to the entire lifecycle of commercial off-the-shelf technology (COTS) development.

So that’s important, but you also have to be realistic about the best practices as they exist today. The bar is going to move as we address future challenges.

************

For more information on The Open Group’s upcoming conference in Washington, D.C., please visit: http://www.opengroup.org/dc2012

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.

Comments Off

Filed under Cybersecurity, Information security, OTTF, Supply chain risk

The Increasing Importance of Cybersecurity: The Open Group Conference in Washington, D.C.

By Jim Hietala, The Open Group

As we move through summer here in the U.S., cybersecurity continues to be top of mind, not only for security professionals, but for IT management as well as for senior managers in large organizations.

The IT security world tends to fixate on the latest breach reported or the latest vulnerability disclosed. Clearly the recent news around Stuxnet and Flame has caused a stir in the community, as professionals debate what it means to have cyberwar attacks being carried out by nations. However, there have also been other significant developments in cybersecurity that have heightened the need for better understanding of risk and security posture in large organizations.

In the U.S., the SEC recently issued guidance to public companies on disclosing the risks of cybersecurity incidents in financial reports, as well as disclosing actual breaches if there is material effect. This is a significant new development, as there’s little that directs the attention of CEOs and boards like new financial disclosure requirements. In publicly traded organizations that have struggled to find funding to perform adequate risk management and IT security initiatives, IT folks will have a new impetus and mandate, likely with support from the highest levels.

The upcoming Open Group conference in Washington, D.C. on July 16-20 will explore cybersecurity, with a focus on defending critical assets and securing the global supply chain. To highlight a few of the notable presentations:

  • Joel Brenner, author of America the Vulnerable, attorney, and former senior counsel at the NSA, will keynote on Monday, July 16 and will speak on “America the Vulnerable: Inside the New Threat Matrix.”
  • Kristen Baldwin, principal deputy, DASD, Systems Engineering, and acting director, Systems Analysis, will speak on “Meeting the Challenge of Cybersecurity Threats through Industry-Government Partnerships.”
  • Dr. Ron Ross, project leader, NIST, will speak on “Integrating Cyber Security Requirements into Mainstream Organizational Mission and Business Processes.”
  • Andras Szakal, VP & CTO, IBM Federal will moderate a panel that will include Daniel Reddy, EMC; Edna Conway, Cisco; and Hart Rossman, SAIC on “Mitigating Tainted & Counterfeit Products.”
  • Dazza (Daniel) J. Greenwood, JD, MIT and CIVICS.com Consultancy Services, and Thomas Hardjono, executive director of MIT Kerberos Consortium, will discuss “Meeting the Challenge of Identity and Security.”

Apart from our quarterly conferences and member meetings, The Open Group undertakes a broad set of programs aimed at addressing challenges in information security.

Our Security Forum focuses on developing standards and best practices in the areas of information security management and secure architecture. The Real Time and Embedded Systems Forum addresses high assurance systems and dependability through work focused on MILS, software assurance, and dependability engineering for open systems. Our Trusted Technology Forum addresses supply chain issues of taint and counterfeit products through the development of the Trusted Technology Provider Framework, which is a draft standard aimed at enabling commercial off the shelf ICT products to be built with integrity, and bought with confidence. Finally, The Open Group Jericho Forum continues to provide thought leadership in the area of information security, most notably in the areas of de-perimeterization, secure cloud computing and identity management.

I hope to see you at the conference. More information about the conference, including the full program can be found here: http://www.opengroup.org/dc2012

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Comments Off

Filed under Conference, Cybersecurity, Information security, OTTF, Security Architecture

Cybersecurity Threats Key Theme at Washington, D.C. Conference – July 16-20, 2012

By The Open Group Conference Team

Identifying risks and eliminating vulnerabilities that could undermine integrity and supply chain security is a significant global challenge and a top priority for governments, vendors, component suppliers, integrators and commercial enterprises around the world.

The Open Group Conference in Washington, D.C. will bring together leading minds in technology and government policy to discuss issues around cybersecurity and how enterprises can establish and maintain the necessary levels of integrity in a global supply chain. In addition to tutorial sessions on TOGAF and ArchiMate, the conference offers approximately 60 sessions on a variety of topics, including:

  • Cybersecurity threats and key approaches to defending critical assets and securing the global supply chain
  • Information security and Cloud security for global, open network environments within and across enterprises
  • Enterprise transformation, including Enterprise Architecture, TOGAF and SOA
  • Cloud Computing for business, collaborative Cloud frameworks and Cloud architectures
  • Transforming DoD avionics software through the use of open standards

Keynote sessions and speakers include:

  • America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime and Warfare - Keynote Speaker: Joel Brenner, author and attorney at Cooley LLP
  • Meeting the Challenge of Cybersecurity Threats through Industry-Government Partnerships - Keynote Speaker: Kristen Baldwin, principal deputy, deputy assistant secretary of defense for Systems Engineering
  • Implementation of the Federal Information Security Management Act (FISMA) - Keynote Speaker: Dr. Ron Ross, project leader at NIST (TBC)
  • Supply Chain: Mitigating Tainted and Counterfeit Products - Keynote Panel: Andras Szakal, VP and CTO at IBM Federal; Daniel Reddy, consulting product manager in the Product Security Office at EMC Corporation; John Boyens, senior advisor in the Computer Security Division at NIST; Edna Conway, chief security strategist of supply chain at Cisco; and Hart Rossman, VP and CTO of Cyber Security Services at SAIC
  • The New Role of Open Standards – Keynote Speaker: Allen Brown, CEO of The Open Group
  • Case Study: Ontario Healthcare - Keynote Speaker: Jason Uppal, chief enterprise architect at QRS
  • Future Airborne Capability Environment (FACE): Transforming the DoD Avionics Software Industry Through the Use of Open Standards - Keynote Speaker: Judy Cerenzia, program director at The Open Group; Kirk Avery of Lockheed Martin; and Robert Sweeney of Naval Air Systems Command (NAVAIR)

The full program can be found here: http://www3.opengroup.org/events/timetable/967

For more information on the conference tracks or to register, please visit our conference registration page. Please stay tuned throughout the next month as we continue to release blog posts and information leading up to The Open Group Conference in Washington, D.C. and be sure to follow the conference hashtag on Twitter – #ogDCA!

1 Comment

Filed under ArchiMate®, Cloud, Cloud/SOA, Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Standards, Supply chain risk

Corporate Data, Supply Chains Remain Vulnerable to Cyber Crime Attacks, Says Open Group Conference Speaker

By Dana Gardner, Interarbor Solutions 

This BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference in Washington, D.C., beginning July 16. The conference will focus on how security impacts the Enterprise Architecture, enterprise transformation, and global supply chain activities in organizations, both large and small.

We’re now joined on the security front by one of the main speakers at the conference, Joel Brenner, the author of “America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime, and Warfare.”

Joel is a former Senior Counsel at the National Security Agency (NSA), where he advised on legal and policy issues relating to network security. Mr. Brenner currently practices law in Washington at Cooley LLP, specializing in cybersecurity. Registration remains open for The Open Group Conference in Washington, D.C. beginning July 16.

Previously, he served as the National Counterintelligence Executive in the Office of the Director of National Intelligence, and as the NSA’s Inspector General. He is a graduate of the University of Wisconsin–Madison, the London School of Economics, and Harvard Law School. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. The full podcast can be found here.

Here are some excerpts:

Gardner: Your book came out last September and it affirmed this notion that the United States, or at least open Western cultures and societies, are particularly vulnerable to being infiltrated, if you will, by cybercrime, espionage, and dirty corporate tricks.

Why are we particularly vulnerable, when we should be most adept at using cyber activities to our advantage?

Brenner: Let’s make a distinction here between the political-military espionage that’s gone on since pre-biblical times and the economic espionage that’s going on now and, in many cases, has nothing at all to do with military, defense, or political issues.

The other stuff has been going on forever, but what we’ve seen in the last 15 or so years is a relentless espionage attack on private companies for reasons having nothing to do with political-military affairs or defense.

So the countries that are adept at cyber, but whose economies are relatively undeveloped compared to ours, are at a big advantage, because they’re not very lucrative targets for this kind of thing, and we are. Russia, for example, is paradoxical. While it has one of the most educated populations in the world and is deeply cultured, it has never been able to produce a commercially viable computer chip.

Not entrepreneurial

We’re not going to Russia to steal advanced technology. We’re not going to China to steal advanced technology. They’re good at engineering and they’re good at production, but so far, they have not been good at making themselves into an entrepreneurial culture.

That’s one just very cynical reason why we don’t do economic espionage against the people who are mainly attacking us, which are China, Russia, and Iran. I say attack in the espionage sense.

The other reason is that you’re stealing intellectual property when you’re doing economic espionage. It’s a bedrock proposition of American economics and political strategy around the world to defend the legal regime that protects intellectual property. So we don’t do that kind of espionage. Political-military stuff we’re real good at.

Gardner: Wouldn’t our defense rise to the occasion? Why hasn’t it?

Brenner: The answer has a lot to do with the nature of the Internet and its history. The Internet, as some of your listeners will know, was developed starting in the late ’60s by the predecessor of the Defense Advanced Research Projects Agency (DARPA), a brilliant operation which produced a lot of cool science over the years.

It was developed for a very limited purpose, to allow the collaboration of geographically dispersed scientists who worked under contract in various universities with the Defense Department’s own scientists. It was bringing dispersed brainpower to bear.

It was a brilliant idea, and the people who invented this, if you talk to them today, lament the fact that they didn’t build a security layer into it. They thought about it. But it wasn’t going to be used for anything else but this limited purpose in a trusted environment, so why go to the expense and aggravation of building a lot of security into it?

Until 1992, it was against the law to use the Internet for commercial purposes. Dana, this is just amazing to realize. That’s 20 years ago, a twinkling of an eye in the history of a country’s commerce. That means that 20 years ago, nobody was doing anything commercial on the Internet. Ten years ago, what were you doing on the Internet, Dana? Buying a book for the first time or something like that? That’s what I was doing, and a newspaper.

In the intervening decade, we’ve turned this sort of Swiss cheese, cool network, which has brought us dramatic productivity and pleasure, into the backbone of virtually everything we do.

International finance, personal finance, command and control of military, manufacturing controls, the controls in our critical infrastructure, all of our communications, virtually all of our activities are either on the Internet or exposed to the Internet. And it’s the same Internet that was Swiss cheese 20 years ago and it’s Swiss cheese now. It’s easy to spoof identities on it.

So this gives a natural and profound advantage to attack on this network over defense. That’s why we’re in the predicament we’re in.

Both directions

Gardner: Let’s also look at this notion of supply chain, because corporations aren’t just islands unto themselves. A business is really a compendium of other businesses, products, services, best practices, methodologies, and intellectual property that come together to create a value add of some kind. It’s not just attacking the end point, where that value is extended into the market. It’s perhaps attacking anywhere along that value chain.

What are the implications for this notion of the ecosystem vulnerability versus the enterprise vulnerability?

Brenner: Well, the supply chain problem really is rather daunting for many businesses, because supply chains are global now, and it means that finished products have a tremendous number of elements. For example, this software: where was it written? Maybe it was written in Russia — or maybe somewhere in Ohio or in Nevada, but by whom? We don’t know.

There are two fundamentally different issues for the supply chain, depending on the company. One is counterfeiting. That’s a bad problem. Somebody is trying to substitute shoddy goods under your name or the name of somebody you thought you could trust. That degrades performance and presents serious liability problems as a result.

The other problem is the intentional hooking, or compromising, of software or chips to do things that they’re not meant to do, such as allow backdoors and so on in systems, so that they can be attacked later. That’s a big problem for military and for the intelligence services all around the world.

The reason we have the problem is that nobody knows how to vet a computer chip or software to see that it won’t do these squirrelly things. We can test that stuff to make sure it will do what it’s supposed to do, but nobody knows how to test a computer chip or two million lines of software reliably to be sure that it won’t also do certain things we don’t want it to do.

You can put it in a sandbox or a virtual environment and you can test it for a lot of things, but you can’t test it for everything. It’s just impossible. In hardware and software, it is the strategic supply chain problem now. That’s why we have it.
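To make the point concrete, here is a toy sketch in Python (ours, not Brenner’s) of why black-box testing cannot prove the absence of hidden behavior. The function, the trigger value, and the checksum logic are all invented for illustration; the routine passes any ordinary test suite yet misbehaves on one input a tester would never think to try.

    # Toy illustration: a routine that behaves normally under testing but
    # hides a trigger. SECRET_TRIGGER and the checksum logic are invented.
    SECRET_TRIGGER = 0xDEADBEEF  # known only to whoever planted it

    def checksum(value: int) -> int:
        """Looks like an ordinary checksum; hides one squirrelly case."""
        if value == SECRET_TRIGGER:
            return 0  # hidden behavior: quietly accept anything
        return value % 251  # the normal, testable behavior

    # Exhaustively testing even a 32-bit input space means ~4.3 billion
    # cases; for two million lines of code or a full chip design, the
    # state space is astronomically larger, which is the point above.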

If you have a worldwide supply chain, you have to have a worldwide supply chain management system. This is hard, and it means getting very specific. It includes not only managing a production process, but also the shipment process. A lot of squirrelly things happen on loading docks, and you have to have a way not to bring perfect security to that — that’s impossible — but to make it much harder to attack your supply chain.

Notion of cost

Gardner: So many organizations today, given the economy and the lagging growth, have looked to lowest cost procedures, processes, suppliers, materials, and aren’t factoring in the risk and the associated cost around these security issues. Do people need to reevaluate cost in the supply chain by factoring in what the true risks are that we’re discussing?

Brenner: Yes, but of course, when the CEO and the CFO get together and start to figure this stuff out, they look at the return on investment (ROI) of additional security. It’s very hard to be quantitatively persuasive about that. That’s one reason why you may see some kinds of production coming back into the United States. How one evaluates that risk depends on the business you’re in and how much risk you can tolerate.

This is a problem not just for really sensitive hardware and software, special kinds of operations, or sensitive activities, but also for garden-variety things.

Gardner: We’ve seen other aspects of commerce in which we can’t lock down the process. We can’t know all the information, but what we can do is offer deterrence, perhaps in the form of legal recourse, if something goes wrong, if in fact, decisions were made that countered the contracts or were against certain laws or trade practices.

Brenner: For a couple of years now, I’ve struggled with the question why it is that liability hasn’t played a bigger role in bringing more cyber security to our environment, and there are a number of reasons.

We’ve created liability for the loss of personal information, so you can quantify that risk. You have a statute that says there’s a minimum damage of $500 or $1,000 per person whose identifiable information you lose. You add up the number of files in the breach and how much the lawyers and the forensic guys cost and you come up with a calculation of what these things cost.
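As a rough sketch of the arithmetic Brenner describes (our illustration; the $500 per-record floor is the figure cited above, while the legal and forensics figures are hypothetical), the quantifiable side of a breach might be estimated like this:

    def breach_cost(records_lost: int,
                    per_record: float = 500.0,      # statutory minimum cited above
                    legal_fees: float = 250_000.0,  # hypothetical figure
                    forensics: float = 150_000.0) -> float:  # hypothetical figure
        """Estimate the quantifiable liability from a personal-data breach."""
        return records_lost * per_record + legal_fees + forensics

    # Example: losing 100,000 records at $500 each plus response costs
    print(breach_cost(100_000))  # 50400000.0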

But when it comes to just business risk, not legal risk, the picture changes: for a company that depends on its intellectual property, losing that intellectual property is a business risk. You don’t have much of a legal risk at this point.

You may have a shareholder suit issue, but there hasn’t been an awful lot of that kind of litigation so far. So I don’t know. I’m not sure that’s quite the question you were asking me, Dana.

Gardner: My follow on to that was going to be where would you go to sue across borders anyway? Is there an über-regulatory or legal structure across borders to target things like supply chain, counterfeit, cyber espionage, or mistreatment of business practice?

Depends on the borders

Brenner: It depends on the borders you’re talking about. The Europeans have a highly developed legal and liability system. You can bring actions in European courts. So it depends what borders you mean.

If you’re talking about the border of Russia, you have very different legal issues. China has different legal issues, different from Russia, as well from Iran. There are an increasing number of cases where actions are being brought in China successfully for breaches of intellectual property rights. But you wouldn’t say that was the case in Nigeria. You wouldn’t say that was the case in a number of other countries where we’ve had a lot of cybercrime originating from.

So there’s no one solution here. You have to think in terms of all kinds of layered defenses. There are legal actions you can take sometimes, but the fundamental problem we’re dealing with is this inherently porous Swiss-cheesy system. In the long run, we’re going to have to begin thinking about the gradual reengineering of the way the Internet works, or else this basic dynamic, in which lawbreakers have advantage over law-abiding people, is not going to go away.

Think about what’s happened in cyber defenses over the last 10 years and how little they’ve evolved — even 20 years for that matter. They almost all require us to know the attack mode or the sequence of code in order to catch it. And we get better at that, but that’s a leapfrog business. That’s fundamentally the way we do it.

Whether we do it at the perimeter, inside, or even outside before the attack gets to the perimeter, that’s what we’re looking for — stuff we’ve already seen. That’s a very poor strategy for doing security, but that’s where we are. It hasn’t changed much in quite a long time and it’s probably not going to.
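In code terms, the defensive model Brenner is criticizing amounts to pattern matching against a catalogue of known attacks. A minimal sketch, with the signatures invented for illustration:

    # Minimal sketch of signature-based detection: only previously seen
    # patterns are caught; a novel attack passes untouched.
    KNOWN_SIGNATURES = [b"DROP TABLE users", b"\x90\x90\x90\x90\xcc"]

    def inspect(payload: bytes) -> str:
        for sig in KNOWN_SIGNATURES:
            if sig in payload:
                return "blocked"
        return "allowed"  # never-before-seen traffic sails through

    print(inspect(b"...DROP TABLE users;--"))     # blocked
    print(inspect(b"novel zero-day payload"))     # allowed: the leapfrog problem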

Gardner: Why is that the case? Is this not a perfect opportunity for a business-government partnership to come together and re-architect the Internet at least for certain types of business activities, permit a two-tier approach, and add different levels of security into that? Why hasn’t it gone anywhere?

Brenner: What I think you’re saying is different tiers or segments. We’re talking about the Balkanization of the Internet. I think that’s going to happen as more companies demand a higher level of protection, but this again is a cost-benefit analysis. You’re going to see even more Balkanization of the Internet as you see countries like Russia and China, with some success, imposing more controls over what can be said and done on the Internet. That’s not going to be acceptable to us.

Gardner: We’ve seen a lot with Cloud Computing and more businesses starting to go to third-party Cloud providers for their applications, services, data storage, even integration to other business services and so forth.

More secure

If there’s a limited number, or at least a finite number, of Cloud providers and they can institute the proper security and take advantage of certain networks within networks, then wouldn’t that hypothetically make a Cloud approach more secure and more managed than every-man-for-himself, which is what we have now in enterprises and small to medium-sized businesses (SMBs)?

Brenner: I think the short answer is, yes. The SMBs will achieve greater security by basically contracting it out to what are called Cloud providers. That’s because managing the patching of vulnerabilities, encryption, and other aspects is beyond what most small businesses and many medium-sized businesses can do, are willing to do, or can do cost-effectively.

For big businesses in the Cloud, it just depends on how good the big businesses’ own management of IT is as to whether it’s an improvement or not. But there are some problems with the Cloud.

People talk about security, but there are different aspects of it. You and I have been talking just now about security meaning the ability to prevent somebody from stealing or corrupting your information. But availability is another aspect of security. By definition, putting everything in one remote place reduces robustness, because if you lose that connection, you lose everything.

Consequently, it seems to me that backup issues are really critical for people who are going to the Cloud. Are you going to rely on your Cloud provider to provide the backup? Are you going to rely on the Cloud provider to provide all of your backup? Are you going to go to a second Cloud provider? Are you going to keep some information copied in-house?

What would happen if your information is good, but you can’t get to it? That means you can’t get to anything anymore. So that’s another aspect of security people need to think through.
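One way to think through those questions is as a layered replication policy. The sketch below is our construction, not Brenner’s prescription; the target names and the store_backup helper are hypothetical placeholders. The shape of the idea: no single remote copy, and a local copy for when the connection is gone.

    BACKUP_TARGETS = [
        "primary-cloud-provider",    # day-to-day storage
        "secondary-cloud-provider",  # independent provider, separate failure domain
        "in-house-archive",          # reachable even if connectivity is lost
    ]

    def store_backup(target: str, snapshot: bytes) -> None:
        """Hypothetical placeholder for a provider-specific upload call."""
        print(f"writing {len(snapshot)} bytes to {target}")

    def replicate(snapshot: bytes) -> None:
        # Availability is part of security: losing the connection to one
        # target must not mean losing everything.
        for target in BACKUP_TARGETS:
            store_backup(target, snapshot)

    replicate(b"quarterly-financials-2012")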

Gardner: How do you know you’re doing the right thing? How do you know that you’re protecting? How do you know that you’ve gone far enough to ameliorate the risk?

Brenner: This is really hard. If somebody steals your car tonight, Dana, you go out to the curb or the garage in the morning, and you know it’s not there. You know it’s been stolen.

When somebody steals your algorithms, your formulas, or your secret processes, you’ve still got them. You don’t know they’re gone, until three or four years later, when somebody in Central China or Siberia is opening a factory and selling stuff into your market that you thought you were going to be selling — and that’s your stuff. Then maybe you go back and realize, “Oh, that incident three or four years ago, maybe that’s when that happened, maybe that’s when I lost it.”

What’s going out

So you don’t even know necessarily when things have been stolen. Most companies don’t do a good job. They’re so busy trying to find out what’s coming into their network, they’re not looking at what’s going out.
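Even a crude egress check illustrates what “looking at what’s going out” means in practice. A toy example, with the threshold and flow data invented for illustration:

    from collections import defaultdict

    EGRESS_CEILING = 500_000_000  # assumed per-host daily byte ceiling

    def flag_exfiltration(flows):
        """flows: iterable of (source_host, bytes_sent) tuples for one day."""
        totals = defaultdict(int)
        for host, sent in flows:
            totals[host] += sent
        # Hosts sending far more than normal deserve a closer look.
        return [h for h, total in totals.items() if total > EGRESS_CEILING]

    sample = [("build-server", 900_000_000), ("laptop-17", 20_000_000)]
    print(flag_exfiltration(sample))  # ['build-server']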

That’s one reason the stuff is hard to measure. Another is that ROI is very tough. On the other hand, there are lots of things where business people have to make important judgments in the face of risks and opportunities they can’t quantify, but we do it.

We’re right to want data whenever we can get it, because data generally means we can make better decisions. But we make decisions about investment in R&D all the time without knowing what the ROI is going to be and we certainly don’t know what the return on a particular R&D expenditure is going to be. But we make that, because people are convinced that if they don’t make it, they’ll fall behind and they’ll be selling yesterday’s products tomorrow.

Why is it that we have a bias toward that kind of risk, when it comes to opportunity, but not when it comes to defense? I think we need to be candid about our own biases in that regard, but I don’t have a satisfactory answer to your question, and nobody else does either. This is one where we can’t quantify that answer.

Gardner: It sounds as if people need to have a healthy dose of paranoia to tide them over across these areas. Is that a fair assessment?

Brenner: Well, let’s say skepticism. People need to understand, without actually being paranoid, that life is not always what it seems. There are people who are trying to steal things from us all the time, and we need to protect ourselves.

In many companies, you don’t see a willingness to do that, but that varies a great deal from company to company. Things are not always what they seem. That is not how we Americans approach life. We are trusting folks, which is why this is a great country to do business in and live in. But we’re having our pockets picked and it’s time we understood that.

Gardner: And, as we pointed out earlier, this picking of pockets is not just on our block, but could be any of our suppliers, partners, or other players in our ecosystem. If their pockets get picked, it ends up being our problem too.

Brenner: Yeah, I described this risk in my book, “America the Vulnerable,” at great length, and in my practice here at Cooley, I deal with this every day. I find myself, Dana, giving briefings to businesspeople that 5, 10, or 20 years ago, you wouldn’t have given to anybody who wasn’t a diplomat or a military person going outside the country. Now this kind of cyber pilferage is an aspect of daily commercial life, I’m sorry to say.

************

For more information on The Open Group’s upcoming conference in Washington, D.C., please visit: http://www.opengroup.org/dc2012

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.

2 Comments

Filed under Cloud, Cybersecurity, Supply chain risk

Enterprise Transformation Takes the French Riviera

By The Open Group Conference Team

The Open Group Conference in Cannes, France is just around the corner. Taking place April 23-27, the conference will bring together leading minds in technology to discuss the process of Enterprise Transformation, and the role of Enterprise Architecture (EA) and IT in Enterprise Transformation.

The French Riviera is a true playground for the rich and famous. As the location of the next Open Group Conference (not to mention the next Open Cannes Awards), it seems only fitting that we not only have an incredible venue for the event, the JW Marriott Cannes, but also our own star-studded lineup of speakers, sessions and activities that are sure to make the conference an unforgettable experience.

In addition to tutorial sessions on TOGAF and ArchiMate, the conference offers roughly 60 sessions on a variety of topics, including:

  • Enterprise Transformation, including Enterprise Architecture and SOA
  • Cybersecurity, Cloud Security and Trusted Technology for the Supply Chain
  • Cloud Computing for Business, Collaborative Cloud Frameworks and Cloud Architectures

The conference theme “Enterprise Transformation” will highlight how Enterprise Architecture can be used to truly change how companies do business and create models and architectures that help them make those changes. Keynote speakers include:

  • Dr. Alexander Osterwalder, Best-selling Author and Entrepreneur

Dr. Osterwalder is a renowned thought leader on business model design and innovation. Many executives, entrepreneurs, and world-leading organizations have applied Dr. Osterwalder’s approach to strengthen their business model and achieve a competitive advantage through business model innovation. His keynote session at the conference, titled “Business Models, IT, and Enterprise Transformation,” will discuss how to use the Business Model Canvas approach to better align IT and business strategy, empower multi-disciplinary teams and contribute to Enterprise Transformation.

  • Herve Gouezel, Advisor to the CEO at BNP Paribas & Eric Boulay, Founder and CEO of Arismore

Keynote: “EA and Transformation: An Enterprise Issue, a New Role for the CIO?” will examine governance within the Enterprise and what steps need to take place to create a collaborative Enterprise.

  • Peter Haviland, Chief Architect and Head of Business Architecture Advisory Services at Ernst & Young, US

Keynote: “World Class EA 2012: Putting Your Architecture Team in the Middle of Enterprise Transformation,” will identify and discuss key activities leading practice architecture teams are performing to create and sustain value, to remain at the forefront of enterprise transformation.

  • Kirk Avery, Software Architect at Lockheed Martin & Robert Sweeney, MSMA Lead Systems Engineer at Naval Air Systems Command

Keynote: “FACE: Transforming the DoD Avionics Software Industry Through the Use of Open Standards,” will address the DoD avionics industry’s need to provide complex mission capability in less time and in an environment of shrinking government budgets.

The Common Criteria Workshop and the European Commission

We are also pleased to be hosting the first Common Criteria Workshop during the Cannes Conference. This two-day event – taking place April 25 to 26 – offers a rich opportunity to hear from distinguished speakers from the Common Criteria security community, explore viewpoints through panel discussions and work with like-minded people towards common goals.

One of the keynote speakers during the workshop is Andrea Servida, the Deputy Head of the Internet, Network and Information Security unit with the European Commission in Brussels, Belgium. With extensive experience defining and implementing strategies and policies on network and information security and critical information infrastructure protection, Mr. Servida is an ideal speaker as we kick-off the first workshop.

The Open Cannes Awards

What trip to Cannes would be complete without an awards ceremony? Presented by The Open Group, The Open Cannes Awards is an opportunity for our members to recognize each other’s accomplishments within The Open Group with a little fun during the gala ceremony on the night of Tuesday, April 24. The goal is to acknowledge the success stories, the hard work and dedication that members, either as individuals or as organizations, have devoted to The Open Group’s ideals and vision over the past decade.

We hope to see you in Cannes! For more information on the conference tracks or to register, please visit our conference registration page, and please stay tuned throughout the next month as we continue to release blog posts and information leading up to The Open Group Conference in Cannes, France!

Comments Off

Filed under Cloud, Cloud/SOA, Conference, Cybersecurity, Enterprise Architecture, Enterprise Transformation, FACE™, Semantic Interoperability, Service Oriented Architecture

The future – ecosystems and standards

By Mark Skilton, Capgemini

This article is a continuation of a series on standards by Mark Skilton. Read his previous posts on “Why standards in information technology are critical” and “Innovation in the Cloud needs open standards.”

The evolution of standards has become a big domain issue. The world has moved from the individual languages of resources and transactions into architectural standards that seek to describe how different sets of resources, interfaces and interactions can be designed to work together. But this concept has now gone further in networked societies.

In this new “universe” of online and physical services, new channels, portals, devices and services are emerging that create new integrations and compositions of services. New business models are emerging as a result, which are impacting existing markets and incumbents as well as creating new rules and standards. Old standards and policies, such as those for digital privacy and cross-border intellectual property, are being challenged by these new realities. Ignoring these is not an option, as companies and whole countries are realizing the need to keep up-to-date and aware of these developments that impact their own locations and economies.

This means the barriers and accelerators to individual markets and new markets are evolving and in constant dynamic change. Standards and interoperability are at the center of these issues and affect the very levers of change in markets.

Cloud Computing is one such phenomenon rewriting the rules on information exchange and business models for provisioning and delivery of products and services. The impact of Cloud Computing on competitive advantage is significant in the way it has lowered barriers to access of markets and collaboration. It has increased speed of provisioning and potential for market growth and expansion through the distributed power of the Internet. The connectivity and extensions of business models brought about by these trends is changing previously held beliefs and competitive advantages of ownership and relationships.

The following diagram was presented at The Open Group Conference, Amsterdam, in the fall of 2010.

The Internet of Things (IoT) is an example of this trend, seen in the area of Radio Frequency Identification (RFID) tagging of materials and products for automatic tracking. But this is just one example of interoperability emerging across industries. Large-scale telecommunications networks now have the ability to reach and integrate large areas of the marketplace through fixed and now wireless mobile communications networks.

This vision can create new possibilities beyond just tagging and integration of supply chains; it hints towards a possibility of social networks, business networks and value chains being able to create new experiences and services through interconnectedness.

Mark Skilton, Director, Capgemini, is the Co-Chair of The Open Group Cloud Computing Work Group. He has been involved in advising clients and developing strategic portfolio services in Cloud Computing and business transformation. His recent contributions include widely syndicated Return on Investment models for Cloud Computing that achieved 50,000 hits on CIO.com and appeared in the British Computer Society 2010 Annual Review. His current activities include development of new Cloud Computing Model standards and best practices on the impact of Cloud Computing on outsourcing and off-shoring models, and he contributed to the second edition of the Handbook of Global Outsourcing and Off-shoring, published through his involvement with the Warwick Business School UK Specialist Masters Degree Program in Information Systems Management.

1 Comment

Filed under Cloud, Standards

PODCAST: Industry moves to fill gap for building trusted supply chain technology accreditation

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-IT Industry Looks to Open Trusted Technology Forum to Help Secure Supply Chains That Support Technology Products

The following is the transcript of a sponsored podcast panel discussion on how the OTTF is developing an accreditation process for trusted technology, in conjunction with The Open Group Conference, Austin 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect. Today, we present a sponsored podcast discussion in conjunction with The Open Group Conference in Austin, Texas, the week of July 18, 2011.

We’ve assembled a distinguished panel to update us on The Open Group Trusted Technology Forum, also known as the OTTF, and an accreditation process to help technology acquirers and buyers safely conduct global procurement and supply chain commerce. We’ll examine how the security risk for many companies and organizations has only grown, even as these companies form essential partnerships and integral supplier relationships. So, how can all the players in a technology ecosystem gain assurances that the other participants are adhering to best practices and taking the proper precautions?

Here to help us better understand how established standard best practices and an associated accreditation approach can help make supply chains stronger and safer is our panel. We’re here with Dave Lounsbury, the Chief Technical Officer at The Open Group. Welcome back, Dave.

Dave Lounsbury: Hello Dana. How are you?

Gardner: Great. We are also here with Steve Lipner, Senior Director of Security Engineering Strategy in Trustworthy Computing Security at Microsoft. Welcome back, Steve.

Steve Lipner: Hi, Dana. Glad to be here.

Gardner: We’re here also with Joshua Brickman, Director of the Federal Certification Program Office at CA Technologies. Welcome, Joshua.

Joshua Brickman: Thanks for having me.

Gardner: And, we’re here too with Andras Szakal. He’s the Vice President and CTO of IBM’s Federal Software Group. Welcome back, Andras.

Andras Szakal: Thank you very much, Dana. I appreciate it.

Gardner: Dave, let’s start with you. We’ve heard so much lately about “hacktivism,” break-ins, and people being compromised. These are some very prominent big companies, both public and private. How important is it that we start to engage more with things like the OTTF?

No backup plan

Lounsbury: Dana, a great quote coming out of this week’s conference was that we have moved the entire world’s economy to being dependent on the Internet, without a backup plan. Anyone who looks at the world economy will see that not only are we dependent on it for exchange of value in many cases, but even for information about how our daily lives are run: traffic, health information, and things like that. It’s becoming increasingly vital that we understand all the aspects of what it means to have trust in the chain of components that deliver that connectivity to us, not just as technologists, but as people who live in the world.

Gardner: Steve Lipner, your thoughts on how this problem seems to be only getting worse?

Lipner: Well, the attackers are becoming more determined and more visible across the Internet ecosystem. Vendors have stepped up to improve the security of their product offerings, but customers are concerned. A lot of what we’re doing in The Open Group and in the OTTF is about trying to give them additional confidence of what vendors are doing, as well as inform vendors what they should be doing.

Gardner: Joshua Brickman, this is obviously a big topic and a very large and complex area. From your perspective, what is it that the OTTF is good at? What is it focused on? What should we be looking to it for in terms of benefit in this overall security issue?

Brickman: One of the things that I really like about this group is that you have all of the leaders, everybody who is important in this space, working together with one common goal. Today, we had a discussion where one of the things we were considering is whether there’s a 100 percent fail-safe solution to cyber attacks. And there really isn’t. There is just a bar that you can set, and the question is how much do you want to make the attackers spend before they can get over that bar. What we’re going to try to do is establish that level, and working together, I feel very encouraged that we are getting there so far.

Gardner: Andras, we are not just trying to set the bar, but we’re also trying to enforce, or at least have clarity into, what other players in an ecosystem are doing. So that accreditation process seems to be essential.

Szakal: We’re going to develop a standard, or rather we’re in the process of developing a specification and ultimately an accreditation program, that will validate suppliers and providers against that standard. It’s focused on building trust into a technology provider organization through this accreditation program, facilitated through one of several different delivery mechanisms that we are working on. We’re looking for this to become a global program, with global partners, as we move forward.

Gardner: It seems as if almost anyone is a potential target, and when someone decides to target you, you do seem to suffer. We’ve seen things with Booz Allen, RSA, and consumer organizations like Sony. Is this something that almost everyone needs to be more focused on? Are we at the point now where there is no such thing as turning back, Dave Lounsbury?

Global effort

Lounsbury: I think there is, and we have talked about this before. Any electronic or information system now is really built on components and software that are delivered from all around the globe. We have software that’s developed in one continent, hardware that’s developed in another, integrated in a third, and used globally. So, we really do need to have the kinds of global standards and engagement that Andras has referred to, so that there is that one bar for all to clear in order to be considered as a provider of trusted components.

Gardner: As we’ve seen, there is a weak link in any chain, and the hackers or the cyber criminals or the state sponsored organizations will look for those weak links. That’s really where we need to focus.

Lounsbury: I would agree with that. In fact, some of the other outcomes of this week’s conference have been the change in these attacks, from just nuisance attacks, to ones that are focused on monetization of cyber crimes and exfiltration of data. So the spectrum of threats is increasing a lot. More sophisticated attackers are looking for narrower and narrower attack vectors each time. So we really do need to look across the spectrum of how this IT technology gets produced in order to address it.

Gardner: Steve Lipner, it certainly seems that the technology supply chain is essential. If there is weakness there, then it’s difficult for the people who deploy those technologies to cover their bases. It seems that focusing on the technology providers, the ecosystems that support them, is a really necessary first step to taking this to a larger, either public or private, buyer side value.

Lipner: The tagline we have used for The Open Group TTF is “Build with Integrity, Buy with Confidence.” We certainly understand that customers want to have confidence in the hardware and software of the IT products that they buy. We believe that it’s up to the suppliers, working together with other members of the IT community, to identify best practices and then articulate them, so that organizations up and down the supply chain will know what they ought to be doing to ensure that customer confidence.

Gardner: Let’s take a step back and get a little bit of a sense of where this process that you are all involved with stands. I know you’re all on working groups and in other ways involved in moving this forward, but it’s been about six months now since the OTTF was initially formed, and there was a white paper to explain it. Perhaps one of you will volunteer to give us a state of affairs on where things are. Then, we’d also like to hear an update about what’s been going on here in Austin. Anyone?

Szakal: Well, as the chair, I have the responsibility of keeping track of our milestones, so I’ll take that one. We completed the white paper earlier this year, in the first quarter. The white paper was visionary in nature, and it was obviously designed to help our constituents understand the goals of the OTTF. However, in order to actually make this a normative specification and design a program, around which you would have conformance and be able to measure suppliers’ conformity to that specification, we have to develop a specification with normative language.

First draft

We’re finishing that up as we speak and we are going to have a first draft here within the next month. We’re looking to have that entire specification go through company review in the fourth quarter of this year.

Simultaneously, we’ll be working on the accreditation policy and conformance criteria and evidence requirements necessary to actually have an accreditation program, while continuing to liaise with other evaluation schemes that are interested in partnering with us. In a global international environment, that’s very important, because more than one of these regimes exists, and we will have to coexist and partner with them. Over the next year, we’ll have completed the accreditation program and have begun testing of the process, probably having to make some adjustments along the way. We’re looking at sometime within the first half of 2012 for having a completed program to begin ramping up.

Gardner: Is there an update on the public sector’s, or in the U.S., the federal government’s, role in this? Are they active? Are they leading? How would you characterize the public role or where you would like to see that go?

Szakal: The Forum itself continues to liaise with the government and all of our constituents. As you know, we have several government members that are part of the TTF, and they are just as important as any of the other members. We continue to provide updates to many of the governments that we are working with globally, to ensure they understand the goals of the OTTF and how they can provide value synergistically with what we are doing, as we would to them.

Gardner: I’ll throw this back out to the panel? How about the activities this week at the conference? What have been the progress or insights that you can point to from that?

Brickman: We’ve been meeting for the first couple of days and we have made tremendous progress on wrapping up our framework and getting it ready for the first review. We’ve also been meeting with several government officials. I can’t say who they are, but what’s been good about it is that they’re very positive about the work we’re doing; they support it and want to continue this discussion. It’s very much a partnership, and it’s not just an industry-led project, because we have participation from folks who could very much be the consumers of this initiative.

Gardner: Clearly, there are a lot of stakeholders around the world, across both the public and private domains. Dave Lounsbury, what’s possible? What would we gain if this is done correctly? How would we tangibly recognize improvements? I know that’s hard with security. It’s hard to point out what doesn’t happen, which is usually the result of proper planning. But how would you characterize the value of doing this all correctly, say, a year or two from now?

Awareness of security

Lounsbury: One of the trends we’ll see is that people are increasingly going to be making decisions about what technology to produce and who to partner with, based on more awareness of security.

A very clear possible outcome is that there will be a set of simple guidelines and ones that can be implemented by a broad spectrum of vendors, where a consumer can look and say, “These folks have followed good practices. They have baked secure engineering, secure design, and secure supply chain processes into their thing, and therefore I am more comfortable in dealing with them as a partner.”

Of course, what that means is that, not only do you end up with more confidence in your supply chain and the components for getting to that supply chain, but it also takes a little bit of work off your plate. You don’t have to invest as much in evaluating your vendors, because you can use commonly available and widely understood best practices.

From the vendor perspective, it’s helpful because we’re already seeing places where a company, like a financial services company, will go to a vendor and say, “We need to evaluate you. Here’s our checklist.” Of course, the vendor would have to deal with many different checklists in order to close the business, and this will give them some common starting point.

Of course, everybody is going to customize and build on top of what that minimum bar is, depending on what kind of business they’re in. But at least it gives everybody a common starting point, a common reference point, some common vocabulary for how they are going to talk about how they do those assessments and make those purchasing decisions.

Gardner: Steve Lipner, do you think that this is going to find its way into a lot of RFPs, beginning a sales process, looking to have a major checkbox around these issues? Is that sort of how you see this unfolding?

Lipner: If we achieve the sort of success that we are aiming for and anticipating, you’ll see requirements for the OTTF, not only in RFPs, but also potentially in government policy documents around the world, basically aiming to increase the trust of broad collections of products that countries and companies use.

Gardner: Joshua Brickman, I have to imagine that this is a living type of activity that you never really finish. There’s always something new to be done, a type of threat that’s evolving that needs to be reacted to. Would the TTF over time take on a larger role? Do you see it expanding into a larger set of requirements, even as it adjusts to the contemporary landscape?

Brickman: That’s possible. I think that we are going to try to get something achievable out there in a timeframe that’s useful and see what sticks. One of the things that will happen is that as companies start to go out and test this, as with any other standard, the 1.0 standard will evolve to something that will become more germane, and as Steve said, will hopefully be adopted worldwide.

Agile and useful

It’s absolutely possible. It could grow. I don’t think anybody wants it to become a behemoth. We want it to be agile, useful, and certainly something readable and achievable not only for multinational billion-dollar companies, but also for companies that are just out there trying to sell their piece of the pie into the space. That’s ultimately the goal of all of us, to make sure that this is a reasonable achievement.

Lounsbury: Dana, I’d like to expand on what Joshua just said. This is another thing that has come out of our meetings this week. We’ve heard a number of times that governments, of course, feel the need to protect their infrastructure and their economies, but also realize that, because of the rapid evolution of technology and the rapid evolution of security threats, it’s hard for them to keep up. Regulation is not really the right vehicle.

There really is a strong preference. The U.S. strategy on this is to let industry take the lead. One of the reasons for that is the fact that industry can evolve, in fact must evolve, at the pace of the commercial marketplace. Otherwise, they wouldn’t be in business.

So, we really do want to get that first stake in the ground and get this working, as Joshua said. But there is some expectation that, over time, the industry will drive the evolution of security practices and security policies, like the ones OTTF is developing, at the pace of the commercial market, so that governments won’t have to impose the kind of regulation that may not keep up.

Gardner: Andras, any thoughts from your perspective on this ability to keep up in terms of market forces? How do you see the dynamic nature of this being able to be proactive instead of reactive?

Szakal: One of our goals is to ensure the viability of the specification itself by updating the best practices periodically. We’re talking about potentially yearly, and including new techniques and the application of potentially new technologies, to ensure that providers are implementing the best practices for development engineering, secure engineering, and supply chain integrity. It’s going to be very important for us to continue to evolve these best practices over a period of time and not allow them to fall into a state of static disrepair.

I’m very enthusiastic, because many of the members are very much in agreement that this is something that needs to be happening in order to actually raise the bar on the industry, as we move forward, and help the entire industry adopt the practices and then move forward in our journey to secure our critical infrastructure.

Gardner: Given that this has the potential of being a fairly rapidly evolving standard that may start really appearing in RFPs and be impactful for real world business success, how should enterprises get involved from the buy side? How should suppliers get involved from the sell side, given that this is seemingly a market driven, private enterprise driven activity?

I’ll throw this out to the crowd. What’s the responsibility from the buyers and the sellers to keep this active and to keep themselves up-to-date?

Lounsbury: Let me take the first stab at this. The reason we’ve been able to make the progress we have is that we’ve got the expertise in security from all of these major corporations and government agencies participating in the TTF. The best way to maintain that currency and maintain that drive is for people who have a problem on the buy side, or expertise on either side, to come in and participate.

Hands-on awareness

You have got the hands-on awareness of the market, and bringing that in and adding that knowledge of what is needed to the specification and helping move its evolution along is absolutely the best thing to do.

That’s our steady state, and of course the way to get started on that is to go and look at the materials. The white paper is out there. I expect we will be doing snapshots of early versions of this that would be available, so people can take a look at those. Or, come to an Open Group Conference and learn about what we are doing.

Gardner: Anyone else have a reaction to that? I’m curious. Given that we are looking to the private sector and market forces to be the drivers of this, will they also be the drivers in terms of enforcement? Is this voluntary? One would hope that market forces reward those who seek accreditation and demonstrate adherence to the standard, and that those who don’t would suffer. Or is there a potential for more teeth and more enforcement? Again, I’ll throw this out to the panel at large.

Szakal: As vendors, we would like to see minimal regulation, and that’s simply the nature of the beast. In order for us to conduct our business and lower the cost of market entry, I think that’s important.

I think it’s important that we provide leadership within the industry to ensure that we’re following the best practices to ensure the integrity of the products that we provide. It’s through that industry leadership that we will avoid potential damaging regulations across different regional environments.

We certainly wouldn’t want to see different regulations pop up in different places globally. It makes technology insertion very messy for us. We’re hoping that by actually getting engaged and providing some self-regulation, we won’t see additional government or international regulation.

Lipner: One of the things that my experience has taught me is that customers are very aware these days of security, product integrity, and the importance of suppliers paying attention to those issues. Having a robust program like the TTF and the certifications that it envisions will give customers confidence, and they will pay attention to that. That will change their behavior in the market even without formal regulations.

Gardner: Joshua Brickman, any thoughts on the self-regulation benefits? If that doesn’t work, is it self-correcting? Is there a natural approach whereby, if this doesn’t work at first, a couple of highly publicized incidents at corporations that suffer for not regulating themselves properly would right that ship, so to speak?

Brickman: First of all, industry setting the standard is an idea that has been thrown around a while, and I think that it’s great to see us finally doing it in this area, because we know our stuff the best.

But as far as an incident indicating that it’s not working, I don’t think so. We’re going to try to set up a standard, whereby we’re providing public information about what our products do and what we do as far as best practices. At the end of the day the acquiring agency, or whatever, is going to have to make decisions, and they’re going to make intelligent decisions, based upon looking at folks that choose to go through this and folks that choose not to go through it.

It will continue

The bad news that has been coming out is going to continue to happen. The only thing that acquirers will be able to do is look to the companies that are the experts in this to help them with that, and they are going to get some of that from the companies that go through these evaluations. There’s no question about it.

At the end of the day, this accreditation program is going to shake out the products and companies that really do follow best practices for secure engineering and supply chain best practices.

Gardner: What should we expect next? As we heard, there has been a lot of activity here in Austin at the conference. We’ve got that white paper. We’re working towards more mature definitions and approaching certification and accreditation types of activities. What’s next? What milestone should we look to? Andras, this is for you.

Szakal: Around November, we’re going to be going through company review of the specification and we’ll be publishing that in the fourth quarter.

We’ll also be liaising with our government and international partners during that time and we’ll also be looking forward to several upcoming conferences within The Open Group where we conduct those activities. We’re going to solicit some of our partners to be speaking during those events on our behalf.

As we move into 2012, we’ll be working on the accreditation program, specifically the conformance criteria and the accreditation policy, and liaising again with some of our international partners on this particular issue. Hopefully we will, if all things go well and according to plan, come out of 2012 with a viable program.

Gardner: Dave Lounsbury, any further thoughts about next steps, what people should be looking for, or even where they should go for more information?

Lounsbury: Andras has covered it well. Of course, you can always learn more by going to www.opengroup.org and looking on our website for information about the OTTF. You can find drafts of all the documents that have been made public so far, and there will be our white paper and, of course, more information about how to become involved.

Gardner: Very good. We’ve been getting an update about The Open Group Trusted Technology Forum, OTTF, and seeing how this can have a major impact from a private sector perspective and perhaps head off issues of lack of trust and lack of clarity in a complex, evolving technology ecosystem.

I’d like to thank our guests. We’ve been joined by Dave Lounsbury, Chief Technical Officer at The Open Group. Thank you, sir.

Lounsbury: Thank you, Dana.

Gardner: Steve Lipner, the Senior Director of Security Engineering Strategy in the Trustworthy Computing Security Group at Microsoft. Thank you, Steve.

Lipner: Thanks, Dana.

Gardner: Joshua Brickman, who is the Director of the Federal Certification Program Office in CA Technologies, has also joined us. Thank you.

Brickman: I enjoyed it very much.

Gardner: And Andras Szakal, Vice President and CTO of IBM’s Federal Software Group. Thank you, sir.

Szakal: It’s my pleasure. Thank you very much, Dana.

Gardner: This discussion has come to you as a sponsored podcast in conjunction with The Open Group Conference in Austin, Texas. We are here the week of July 18, 2011. I want to thank our listeners as well. This is Dana Gardner, Principal Analyst at Interarbor Solutions. Don’t forget to come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com.

Copyright The Open Group 2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect™ blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.

Filed under Cybersecurity, Supply chain risk

Government Outreach for Global Supply Chain Integrity (OTTF)

By Sally Long, The Open Group

On May 10th in London, a select group of technology, government and Cybersecurity leaders and supply chain strategists met for a lunchtime briefing and discussion during The Open Group Conference. The message that came across loud and clear from all who participated was that fostering honest and open dialogue between government and industry is critical to securing the global supply chain, and that the only way we will do this effectively is by working together to assure coordination and adoption among current and emerging approaches.

This industry/government roundtable event was the fourth in a series of planned events for government outreach. In December and January, members of The Open Group Trusted Technology Forum (OTTF) met with Howard Schmidt, US Cybersecurity Coordinator for the Obama Administration, and with US House and Senate Committees and the Department of Commerce. In March, there were some inroads made into the Japanese government, and in April we held a session with government officials in India. Coming up are more briefings and discussions planned for Europe, Canada, China and Brazil.

The event in London brought together representatives from Atsec, Boeing, CA Technologies, Capgemini, CESG, Chatham House, Cisco, Fraunhofer SIT, Fujitsu, Hewlett-Packard, IBM, IDA, Kingdee Software, Microsoft, MITRE, NASA, Oracle, Real IRM, SAIC, SAP, and the UK Government. These participants discussed global supply-chain challenges and a potential solution through The Open Group Trusted Technology Provider Framework (O-TTPF). Other existing approaches were highlighted by CESG as effective in some areas, though those areas were not directly focused on supply-chain best practices.

The beauty of the O-TTPF, a set of best practices for engineering and secure development methods and supply chain integrity, is that the Framework and guidelines are being developed by industry — architects, developers, manufacturers and supply chain experts, with input from government(s) — for industry. The fact that these best practices will be open, international, publicly available and translated where appropriate will allow all providers to understand what they need to do to “Build with Integrity” – so that customers can “Buy with Confidence”.

This is critically important because, as we all know, a chain is only as strong as its weakest link. Even though a large system vendor may follow the O-TTPF best practices, those vendors often rely on sub-component suppliers of software and hardware from around the world, and in order to maintain the integrity of their supply chain, their sub-suppliers need to understand what it means to be trustworthy as well.

One of the OTTF’s objectives is to develop an accreditation program, which will help customers, in government and industry, identify secure technology providers and products in the global supply chain. Governments and large enterprises that base their purchasing decisions on trusted technology providers who have developed their products using the best practices identified by the O-TTPF will be able to rely on a more comprehensive approach to risk management and product assurance when selecting COTS technology products.

One of the major messages at the Roundtable event was that the OTTF is not just about major industry providers. It’s about opening the doors to all providers and all customers, and it’s about reaching out to all governments to assure the O-TTPF best practice requirements are aligned with their acquisition requirements — so that there is true global recognition and demand for Trusted Technology Providers who conform to the O-TTPF Best Practices.

The OTTF members believe it is critical to reach out to governments around the world, to foster industry-government dialogue about government acquisition requirements for trusted technology and trusted technology providers, so they can enable the global recognition required for a truly secure global supply chain. Any government or government agency representative interested in working together to provide a trusted global supply chain can contact the OTTF global outreach and acquisition team through ottf-interest@opengroup.org.

The Forum operates under The Open Group, an international vendor- and technology-neutral consortium well known for providing an open and collaborative environment for such work. We are seeking additional participants from global government and commercial entities. If you are interested in learning more about the Forum please feel free to contact me, Sally Long, OTTF Forum Director, at s.long@opengroup.org.

Sally Long, Director of Consortia Services at The Open Group, has been managing customer-vendor forums and collaborative development projects for the past nineteen years. She was the Release Engineering Section Manager for all collaborative, multi-vendor, development projects (OSF/1, DME, DCE, and Motif) at The Open Software Foundation (OSF), in Cambridge Massachusetts.  Following the merger of OSF and X/Open under The Open Group, Sally served as the Program Director for multiple Forums within The Open Group including: The Distributed Computing Environment (DCE) Forum, The Enterprise Management Forum, The Quality of Service (QoS) Task Force, The Real-time and Embedded Systems Forum and most recently the Open Group Trusted Technology Forum. Sally has also been instrumental in business development and program definition for certification programs developed and operated by The Open Group for the North American State and Provincial Lotteries Association (NASPL) and for the Near Field Communication (NFC) Forum. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts, and a Bachelor of Science degree in Occupational Therapy from The Ohio State University.

Filed under Cybersecurity, Supply chain risk

Thoughts on reorienting Enterprise Architecture

By Raghuraman Krishnamurthy, Cognizant Technology Solutions

My last post generated an interesting comment from Tom, who found resonance in one of the thoughts (“EA should be a wider discipline”) that I had mentioned: the need to look at the world and learn from the leaner supply chain efficiencies realized by the manufacturing segment. In an interesting post on securing global technology supply chains, Andras Szakal of IBM talks about taking cues from industry associations. An effort like this that attempts to learn from the established efficiencies of other supply chains will be a great step forward.

In the flat world, innovations can occur anywhere and enterprises must look at how to take best advantage of that. The emergence of the Internet and richer channels of communication have created new fundamental forces that any enterprise needs to contend with:

  • Getting closer to your customer
  • Innovating in service/product offerings
  • Achieving high operational efficiency
  • Building brand perception: regulatory/social/environmental consciousness

Taking the case of the pharmaceutical industry, one can see how all the above forces have resulted in dramatic transformations. For pharmaceuticals, where the earlier customer segments were the prescribing physician and the payers, a new segment has emerged: the patients. Technology advances have enabled the formation of virtual communities where the efficacies of drugs are debated by patients, and this has created a powerful brand perception that pharmaceuticals cannot afford to ignore. Pharmaceuticals and providers are coming together to create innovative service offerings and are beginning to experiment with outcome-based payment plans. Regulators and the public demand more transparency in the clinical trial processes and sharing of safety data. The need for higher operational efficiency has resulted in more and more outsourcing of business processes. This has left the enterprise boundary in flux, as many competencies now lie outside the traditional realm of the enterprise.

Successful EA needs to demonstrate a deep understanding of these fundamental forces. Whatever modeling, tools or methodologies one may adopt for EA, there should be a connection to these fundamental forces that shape the thinking of the enterprise. I think it is very important that we be aware of the forces that influence our enterprises and position EA to help manage those forces. Of course, it is a tall order to have practical demonstrations of this, but an effort in this direction shows EA staying relevant and contributing in a powerful way. EA is truly at the cusp of transformation.

Enterprise Architecture will be a topic of discussion at The Open Group India Conference next week in Chennai (March 7), Hyderabad (March 9) and Pune (March 11). Join us for best practices and case studies in the areas of Enterprise Architecture, Security, Cloud and Certification, presented by preeminent thought leaders in the industry.

Raghuraman Krishnamurthy works as a Principal Architect at Cognizant Technology Solutions and is based in India. He can be reached at Raghuraman.krishnamurthy2@cognizant.com.

Filed under Enterprise Architecture

Cloud security and risk management

By Varad G. Varadarajan, Cognizant Technology Solutions

Are you ready to move to the Cloud?

Risk management and cost control are two key issues facing CIOs and CTOs today. Both these issues come into play in Cloud Computing, and present an interesting dilemma for IT leaders at large corporations.

The elastic nature of the Cloud, the conversion of Capex to Opex and the managed security infrastructure provided by the Cloud service provider make it very attractive for hosting applications. However, there are a number of security and privacy issues that companies need to grapple with before moving to the Cloud.

For example, multi-tenancy and virtualization are great technologies for lowering the cost of hosting applications, and service providers would like to use them. However, these technologies also pose grave security risks, because companies operate on a shared infrastructure that offers very little isolation. They greatly increase the target attack surface, which is a hacker’s dream come true.

Using multiple service providers on the Cloud is great for providing redundancy, connecting providers in a supply chain or handling spikes in services via Cloud bursting. However, managing identities across multiple providers is a challenge. Making sure data does not accidentally cross trust boundaries is another difficult problem.
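
To make the trust-boundary problem concrete, here is a minimal sketch of a pre-transfer policy check. It is purely illustrative: the classification levels, the provider registry, and names such as `Provider`, `DataAsset` and `transfer_allowed` are invented for this example and do not correspond to any real provider's API.

```python
# Illustrative sketch only: labels, providers, and policy are hypothetical.
from dataclasses import dataclass

# Ordered sensitivity levels: a higher index means more sensitive data.
LEVELS = ["public", "internal", "confidential", "regulated"]

@dataclass
class Provider:
    name: str
    max_level: str   # most sensitive data this provider is trusted to hold
    regions: set     # regions where the provider stores data

@dataclass
class DataAsset:
    name: str
    level: str            # sensitivity classification of the data
    allowed_regions: set  # e.g., residency rules may forbid offshore storage

def transfer_allowed(asset: DataAsset, provider: Provider) -> bool:
    """Permit a transfer only if the provider is trusted for this
    sensitivity level and keeps the data inside allowed regions."""
    trusted = LEVELS.index(asset.level) <= LEVELS.index(provider.max_level)
    resident = provider.regions <= asset.allowed_regions  # subset check
    return trusted and resident

# Example: regulated customer data that must stay in the EU.
crm_backup = DataAsset("crm_backup", "regulated", {"eu-west"})
cheap_cloud = Provider("cheap-cloud", "internal", {"us-east"})
eu_cloud = Provider("eu-cloud", "regulated", {"eu-west"})

assert not transfer_allowed(crm_backup, cheap_cloud)  # blocked: level and region
assert transfer_allowed(crm_backup, eu_cloud)         # permitted
```

The design point is simply that the check happens before the transfer, at a policy layer the enterprise controls, rather than relying solely on the provider's isolation guarantees.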

Likewise, there are many challenges in the areas of:

  • Choosing the right service / delivery model (and its security implications)
  • Key management and distribution
  • Governance and Compliance of the service provider
  • Vendor lock-in
  • Data privacy (e.g. regulations governing the offshore-ability of data)
  • Residual risks

In my presentation at The Open Group India Conference next week, I will discuss these and many other interesting challenges facing CIOs regarding Cloud adoption. I will present a five step approach that enterprises can use to select assets, assess risks, map them to service providers and manage the risks through contract negotiation, SLAs and regular monitoring.

Cloud Computing will be a topic of discussion at The Open Group India Conference in Chennai (March 7), Hyderabad (March 9) and Pune (March 11). Join us for best practices and case studies in the areas of Enterprise Architecture, Security, Cloud and Certification, presented by preeminent thought leaders in the industry.

Varad is a senior IT professional with 22 years of experience in Technology Management, Practice Development, Business Consulting, Architecture, Software Development and Entrepreneurship. He has led consulting assignments in IT Transformation, Architecture, and IT Strategy/Blueprinting at global companies across a broad range of industries and domains. He holds an MBA (Stern School of Business, New York), M.S Computer Science (G.W.U/Stanford California) and B.Tech (IIT India).

Filed under Cloud/SOA

PODCAST: Cloud Computing panel forecasts transition phase for Enterprise Architecture

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-Open Group Cloud Panel Forecasts Transition Phase for Enterprise IT

The following is the transcript of a sponsored podcast panel discussion on newly emerging Cloud models and their impact on business and government, from The Open Group Conference, San Diego 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

We now present a sponsored podcast discussion coming to you live from The Open Group 2011 Conference in San Diego. We’re here the week of February 7, and we have assembled a distinguished panel to examine the expectation of new types of cloud models and perhaps cloud specialization requirements emerging quite soon.

By now, we’re all familiar with the taxonomy around public cloud, private cloud, software as a service (SaaS), platform as a service (PaaS), and my favorite, infrastructure as a service (IaaS), but we thought we would do you all an additional service and examine, firstly, where these general types of cloud models are actually gaining use and allegiance, and we’ll look at vertical industries and types of companies that are leaping ahead with cloud, as we now define it. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Then, second, we’re going to look at why one-size-fits-all cloud services may not fit so well in a highly fragmented, customized, heterogeneous, and specialized IT world.

How many of the cloud services that come with a true price benefit, usually at scale and at low cost, will be able to replace what is actually on the ground in many complex and unique enterprise IT organizations?

What’s more, we’ll look at the need for cloud specialization, based on geographic and regional requirements, as well as based on the size of these user organizations, which of course can vary from 5 to 50,000 seats. Can a few types of cloud work for all of them?

Please join me now in welcoming our panel. Here to help us better understand the quest for “fit for purpose” cloud balance and to predict, at least for some time, the considerable mismatch between enterprise cloud wants and cloud provider offerings we’re here with Penelope Gordon, the cofounder of 1Plug Corporation, based in San Francisco. Welcome, Penelope.

Penelope Gordon: Thank you.

Gardner: We’re also here with Mark Skilton. He is the Director of Portfolio and Solutions in the Global Infrastructure Services with Capgemini in London. Thank you for coming, Mark.

Mark Skilton: Thank you.

Gardner: Ed Harrington joins us. He is the Principal Consultant in Virginia for the UK-based Architecting the Enterprise organization. Thank you, Ed.

Ed Harrington: Thank you.

Gardner: Tom Plunkett is joining us. He is a Senior Solution Consultant with Oracle in Huntsville, Alabama.

Tom Plunkett: Thank you, Dana.

Gardner: And lastly, we’re here with TJ Virdi. He is a Computing Architect in the CAS IT System Architecture Group at Boeing, based in Seattle. Welcome.

TJ Virdi: Thank you.

Gardner: Let me go first to you, Mark Skilton. One size fits all has rarely worked in IT. If it has, it has been limited in its scope and, most often, leads to an additional level of engagement to make it work with what’s already there. Why should cloud be any different?

Three areas

Skilton: Well, Dana, from personal experience, there are probably three areas of adaptation of cloud into businesses. For sure, there are horizontal common services to which what you might call a homogeneous cloud solution could be applied, common to a number of business units or operations across a market.

But, we’re starting to increasingly see the need for customization to meet the vertical competitive needs of a company or the decisions within that large company. So, differentiation and business models are still there; they are still in the platform cloud as they were in the pre-cloud era.

But, the key thing is that we’re seeing a different kind of potential in what a business can do now with cloud — a more elastic, explosive expansion and contraction of a business model. We’re fundamentally seeing the operating model of the business changing, and the industry can change using cloud technology.

So, there are two things going on: the business and the technologies are changing because of the cloud.

Gardner: Well, for us to understand where it fits best, and perhaps not so good, is to look at where it’s already working. Ed, you talked about the federal government. They seem to be going like gangbusters in the cloud. Why so?

Harrington: Perceived cost savings, primarily. The (US) federal government has done some analysis. In particular, the General Services Administration (GSA) has done considerable analysis on what they think they can save by going to, in their case, a public cloud model for email and collaboration services. They’ve issued a $6.7 million contract to Unisys as the systems integrator, with Google being the cloud services supplier.

So, the debate over the benefits of cloud, versus the risks associated with cloud, is still going on quite heatedly.

Gardner: How about some other verticals? Where is this working? We’ve seen in some pharma, health-care, and research environments, which have a lot of elasticity, it makes sense, given that they have very variable loads. Any other suggestions on where this works, Tom?

Plunkett: You mentioned variable workloads. Another place where we are seeing a lot of customers approach cloud is when they are starting a new project. Because then, they don’t have to migrate from the existing infrastructure. Instead everything is brand new. That’s the other place where we see a lot of customers looking at cloud, your greenfields.

Gardner: TJ, any verticals that you are aware of? What are you seeing that’s working now?

Virdi: It’s probably not related to any vertical market, but I think what we are really looking for is speed to put new products into the market or evolve the products that we already have, how to optimize business operations, and how to reduce cost. These apply across vertical industries, where all these things are probably going to be working as a cloud solution.

Gardner: We’ve heard the application of “core and context” to applications, but maybe there is an application of core and context to cloud computing, whereby there’s not so much core and lot more context. Is that what you’re saying so far?

Unstructured data

Virdi: In a sense, you would have to measure not only the structured documents or structured data, but unstructured data as well. How to measure that and create a new product or solution is one of the really cool things you would be looking for in the cloud. And, it has proved pretty easy to put a new solution into the market. So, speed is also the big thing there.

Gardner: Penelope, use cases or verticals where this is working so far?

Gordon: One example in talking about core and context is when you look in retail. You can have two retailers like a Walmart or a Costco, where they’re competing in the same general space, but are differentiating in different areas.

Walmart is really differentiating on the supply chain, so it’s not a good candidate for public cloud computing solutions. We did discuss that it might possibly be a candidate for private cloud computing.

But that’s really where they’re going to invest in the differentiating, as opposed to a Costco, where it makes more sense for them to invest in their relationship with their customers and their relationship with their employees. They’re going to put more emphasis on those business processes, and they might be more inclined to outsource some of the aspects of their supply chain.

A specific example within retail is pricing optimization. A lot of grocery stores need to do pricing optimization checks once a quarter, or perhaps once a year in some of their areas. It doesn’t make sense for smaller grocery store chains to have that kind of IT capability in house. So, that’s a really great candidate, when you are looking at a particular vertical business process to outsource to a cloud provider who has specific industry domain expertise.

Gardner: So for small and medium businesses (SMBs) that would be more core for them than others.

Gordon: Right. That’s an example, though, where you’re talking about a particular vertical business process. Then, you’re talking about a monetization strategy on the part of the provider, where they are looking more at a niche strategy, rather than a commodity strategy of providing a horizontal infrastructure platform.

Gardner: Ed, you had a thought?

Harrington: Yeah, and it’s along the SMB dimension. We’re seeing a lot of cloud uptake in the small businesses. I work for a 50-person company. We have one “sort of” IT person and we do virtually everything in the cloud. We’ve got people in Australia and Canada, here in the States, headquartered in the UK, and we use cloud services for virtually everything across that. I’m associated with a number of other small companies and we are seeing big uptake of cloud services.

Gardner: Allow me to be a little bit of a skeptic, because I’m seeing these reports from analyst firms on the tens of billions of dollars in potential cloud market share and double-digit growth rates for the next several years. Is this going to come from just peripheral application context activities, mostly SMBs? What about the core in the enterprises? Does anybody have an example of where cloud is being used in either of those?

Skilton: In the telecom sector, which is very IT intensive, I’m seeing the emergence of their core business of delivering service to a large end user or multiple end user channels, using what I call cloud brokering.

Front-end cloud

So, if that’s where you’re going with your question: certainly, in the telecom sector we’re seeing the emergence of front-end cloud, customer relationship management (CRM) type systems, and also back-end content delivery engines using cloud.

The fundamental shift away from the service-oriented architecture (SOA) era is that we’re seeing more business-driven self-service, more deployment of services as a business model, which is a big difference in the shift to cloud. Particularly in telco, we’re seeing almost an explosion in that particular sector.

Gordon: A lot of companies don’t even necessarily realize that they’re using cloud services, particularly when you talk about SaaS. There are a number of SaaS solutions that are becoming more and more ubiquitous. If you look at large enterprise company recruiting sites, often you will see Taleo down at the bottom. Taleo is a SaaS. So, that’s a cloud solution, but it’s just not thought necessarily of in that context.

Gardner: Right. Tom?

Plunkett: Another place we’re seeing a lot of growth with regards to private clouds is actually on the defense side. The Defense Department is looking at private clouds, but they also have to deal with this core and context issue. We’re in San Diego today. The requirements for a shipboard system are very different from the land-based systems.

Ships have to deal with narrow bandwidth and going disconnected. They also have to deal with coalition partners or perhaps they are providing humanitarian assistance and they are dealing even with organizations we wouldn’t normally consider military. So, they have to deal with lots of information, assurance issues, and have completely different governance concerns that we normally think about for public clouds.

Gardner: However, in the last year or two, the assumption has been that this is something that’s going to impact every enterprise, and everybody should get ready. Yet, I’m hearing mostly this creeping in through packaged applications on an on-demand basis, SMBs, greenfield organizations, perhaps where high elasticity is a requirement.

What would be necessary for these cloud providers to be able to bring more of the core applications the large enterprises are looking for? What’s the new set of requirements? As I pointed out, we have had a general category of SaaS and development, elasticity, a handful of infrastructure services. What’s the next set of requirements that’s going to make it palatable for these core activities and these large enterprises to start doing this? Let me start with you, Penelope.

Gordon: It’s an interesting question, and it was something that we were discussing in a session yesterday afternoon. There was a gentleman from a large telecommunications company, and from his perspective, trust was a big issue. To him, part of it was just an immaturity of the market, specifically talking about what the new style of cloud is and that branding. Some of the aspects of cloud have been around for quite some time.

Look at Linux adoption as an analogy. A lot of companies started adopting Linux, but it was for peripheral applications and peripheral services, some web services that weren’t business critical. It didn’t really get into the core enterprise until much later.

We’re seeing some of that with cloud. It’s just a much bigger issue with cloud, especially as you start looking at providers wanting to move up the food chain and provide greater value. This means that they have to have more industry knowledge and more specialization. It becomes more difficult for large enterprises to trust a vendor to have that kind of knowledge.

No governance

Another aspect of what came up in the afternoon is that, at this point, while we talk about public cloud specifically, it’s not the same as saying it’s a public utility. We talk about “public utility,” but there is no governance, at this point, to say, “Here is certification that these companies have been tested to meet certain delivery standards.” Until that exists, it’s going to be difficult for some enterprises to get over that trust issue.

Gardner: Assuming that the trust and security issues are worked out over time, that experience leads to action, it leads to trust, it leads to adoption, and we have already seen that with SaaS applications. We’ve certainly seen it with the federal government, as Ed pointed out earlier.

Let’s just put that aside as one of the requirements that’s already on the drawing board and that we probably can put a checkmark next to at some point. What’s next? What about customization? What about heterogeneity? What about some of these other issues that are typical in IT, Mark Skilton?

Skilton: One of the under-played areas is PaaS. We hear about lock-in of technology caused by the use of the cloud, either by putting too much data in or by doing customization of parameters, so that you lose the elastic features of that cloud.

As to your question about what vendors or providers need to do to help the customer use the cloud, the two things we’re seeing are: one, more of an appliance strategy, where clients can buy modular capabilities, so the licensing and solutioning issues are more contained. The client can look at it more in a modular appliance sort of way. Think of it as cloud in a box.

The second thing we need to be seeing is much more offering of transition services, transformation services, to accelerate the use of the cloud in a safe way, and I think that’s something that we need to really push hard to do. There’s a great quote from a client: “It’s not the destination, it’s the journey to the cloud that I need to see.”

Gardner: You mentioned PaaS. We haven’t seen too much yet with a full mature offering of the full continuum of PaaS to IaaS. That’s one where new application development activities and new integration activities would be built of, for, and by the cloud and coordinated between the dev and the ops, with the ops being any number of cloud models — on-premises, off-premises, co-lo, multi-tenancy, and so forth.

So what about that? Is that another requirement, that there be continuity between the PaaS and the infrastructure and deployment, Tom?

Plunkett: We’re getting there. PaaS is going to be a real requirement going forward, simply because that’s going to provide us the flexibility to reach some of those core applications that we were talking about before. The further you get away from the context, the more you’re focusing on what the business is really focused in on, and that’s going to be the core, which is going to require effective PaaS.

Gardner: TJ.

More regulatory

Virdi: I want to second that, but at the same time, we’re looking at regulatory and other kinds of licensing and configuration issues as well. Those also make it a little better to use the cloud. You don’t really have to buy; you can go on demand. You need to make your licenses a little bit better, in such a way that you can just put the product or business solution into the market, test the water, and then go further from there.

Gardner: Penelope, where do you see any benefit of having a coordinated or integrated platform and development test and deploy functions? Is that going to bring this to a more core usage in large enterprises?

Gordon: It depends. I see a lot more of the buying of cloud moving out to the non-IT line-of-business executives. If that accelerates, there is going to be less and less focus on the platform level. Companies are really separating now what is differentiating and core to the business from the rest of it.

There’s going to be less emphasis on, “Let’s do our scale development on a platform level” and more, “Let’s really seek out those vendors that are going to enable us to effectively integrate, so we don’t have to do double entry of data between different solutions. Let’s look out for the solutions that allow us to apply the governance and that effectively let us tailor our experience with these solutions in a way that doesn’t impinge upon the provider’s ability to deliver in a cost effective fashion.”

That’s going to become much more important. So, a lot of the development onus is going to be on the providers, rather than on the actual buyers.

Gardner: Now, this is interesting. On one hand, we have non-IT people, business people, specifying, acquiring, and using cloud services. On the other hand, we’re perhaps going to see more PaaS, the new application development, be it custom or more of a SaaS type of offering that’s brought in with a certain level of adjustment and integration. But these are going on without necessarily any coordination. At some point, they are going to come together. It’s inevitable, another integration mess, perhaps.

Mark Skilton, is that what you see, that we have not just one cloud approach but multiple approaches and then some need to rationalize?

Skilton: There are two key points. There’s a missing architecture practice that needs to be there, which is a workload analysis, so that you design applications to fit specific infrastructure containers, and you’ve got a bridge between the application service and the infrastructure service. There needs to be a piece of work by enterprise architects that starts to bring that together as a deliberate design for applications to be able to operate in the cloud, and the PaaS platform is a perfect environment.

The second thing is that there’s a lack of policy management in terms of technical governance, and because of the lack of understanding, there needs to be more of a matching exercise going on. The key thing is that that needs to evolve.

Part of the work we’re doing in The Open Group with the Cloud Computing Work Group is to develop new standards and methodologies that bridge those gaps between infrastructure, PaaS, platform development, and SaaS.

Gardner: We already have the Trusted Technology Forum. Maybe soon we’ll see an open trusted cloud technology forum.

Skilton: I hope so.

Gardner: Ed Harrington, you mentioned earlier that the role of the enterprise architect is going to benefit from cloud. Do you see what we just described in terms of dual tracks, multiple inception points, heterogeneity, perhaps overlap and redundancy? Is that where the enterprise architect flourishes?

Shadow IT

Harrington: I think we talked about line management IT getting involved in acquiring cloud services. If you think we’ve got this thing called “shadow IT” today, wait a few years. We’re going to have a huge problem with shadow IT.

From the architect’s perspective, there’s lot to be involved with and a lot to play with, as I said in my talk. There’s an awful lot of analysis to be done — what is the value that the cloud solution being proposed is going to be supplying to the organization in business terms, versus the risk associated with it? Enterprise architects deal with change, and that’s what we’re talking about. We’re talking about change, and change will inherently involve risk.

Gardner: TJ.

Virdi: All these business decisions are going to be coming upstream, and business executives need to be more aware of how cloud could be utilized as a delivery model. The enterprise architects and others with a technical background need to educate them or drive them to make the right decisions and choose the proper solutions.

It has an impact on how you want to use the cloud, as well as how you get out of it, in case you want to move to a different cloud vendor or provider. All those things come into play upstream rather than downstream.

Gardner: We all seem to be resigned to this world of, “Well, here we go again. We’re going to sit back and wait for all these different cloud things to happen. Then, we’ll come in, like the sheriff on the white horse, and try to rationalize.” Why not try to rationalize now before we get to that point? What could be done from an architecture standpoint to head off mass confusion around cloud? Let me start at one end and go down the other. Tom?

Plunkett: One word: governance. We talked about the importance of governance increasing as the IT industry went into SOA. Well, cloud is going to make it even more important. Governance throughout the lifecycle, not just at the end, not just at deployment, but from the very beginning.

Gardner: TJ.

Virdi: In addition to governance, you probably have to figure out how you plan to adopt the cloud. You don’t want to start with a big bang. You want to start in incremental steps, small steps, and test out what you really want to do. If that works, then go do the other things after that.

Gardner: Penelope, how about following the money? Doesn’t where the money flows in and out of organizations tend to have a powerful impact on motivating people or getting them moving towards governance or not?

Gordon: I agree, and toward that end, enterprise architects need to break out of focusing only on the boundary between IT and the business, and talk to the business in business terms.

One way of doing that that I have seen be effective is to look at it from the standpoint of portfolio management. Where you were once familiar with financial portfolio management, you are now looking at a service portfolio, as well as looking at your overall business and all of your business processes as a portfolio. How can you optimize at a macro level across the portfolio of all the investment decisions you’re making and the ways the various processes and services are enabled? Then, it comes down to, as you said, a money issue.
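
One very rough way to ground that portfolio analogy is to score each service investment on value against weighted risk and rank the portfolio. The figures and the weight in this sketch are invented purely for illustration.

```python
# Hypothetical portfolio view: rank service investments by a
# value-minus-weighted-risk score. All numbers are invented.

investments = [
    {"name": "payroll SaaS",      "value": 6, "risk": 2},
    {"name": "custom claims app", "value": 9, "risk": 7},
    {"name": "email migration",   "value": 4, "risk": 1},
]

RISK_WEIGHT = 0.8  # how heavily risk discounts value; a judgment call

def score(inv):
    """Simple macro-level score: business value minus weighted risk."""
    return inv["value"] - RISK_WEIGHT * inv["risk"]

for inv in sorted(investments, key=score, reverse=True):
    print(f"{inv['name']}: score {score(inv):.1f}")
# payroll SaaS: score 4.4
# custom claims app: score 3.4
# email migration: score 3.2
```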

Gardner: Perhaps one way to head off what we seem to think is an inevitable cloud chaos situation is to invoke more shared services, get people to consume services and think about how to pay for them along the way, regardless of where they come from and regardless of who specified them. So back to SOA, back to ITIL, back to the blocking and tackling that’s just good enterprise architecture. Anything to add to that, Mark?

Not more of the same

Skilton: I think it’s a mistake to just describe this as more of the same. ITIL, in my view, needs to change to take into account self-service dynamics. ITIL is a provider-side service management process; it’s something that you do to people. Cloud reverses that direction, and that’s something that needs to be addressed.

Also, fundamentally, the data center and network strategies need to be in place to adopt cloud. From my experience, data center transformation or refurbishment strategies and next-generation network strategies tend to be done as a separate exercise from the applications area. So a very strong recommendation from me would be to drive a clear cloud route map to your data center.

Gardner: So, perhaps a regulating effect on the self-selection of cloud services would be that the network isn’t designed for it and it’s not going to help.

Skilton: Exactly.

Gardner: That’s one way to govern your cloud. Ed Harrington, any other further thoughts on working towards a cloud future without the pitfalls?

Harrington: Again, governance and certification of some sort. I’m not in favor of regulation, but I am in favor of some sort of third-party certification of services that consumers can rely upon safely. But I’ll go back to what I said earlier: it’s a combination of governance, treating cloud services as services per se, and enterprise architecture.

Gardner: What about the notion that was brought up earlier about private clouds being an important on-ramp to this? If I were a public cloud provider, I would do my market research on what’s going on in the private clouds, because I think they are going to be incubators to what might then become hybrid and ultimately a full-fledged third-party public cloud providing assets and services.

What can we learn from looking at what’s going on with private cloud now, seemingly a lot of trying to reduce cost and energy consumption, but what does that tell us about what we should expect in the next few years? Again, let’s start with you, Tom.

Plunkett: What we’re seeing with private cloud is that it’s actually impacting governance, because one of the things that you look at with private cloud is chargeback between different internal customers. This is forcing these organizations to deal with complex money and business issues that they don’t really like to deal with.

Nowadays, it’s mostly vertical applications, where you’ve got one owner who is paying for everything. Now, we’re actually going back to, as we were talking about earlier, dealing with some of the tricky issues of SOA.
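
A minimal sketch of the internal chargeback Plunkett describes might look like the following; the unit rates and metering fields are invented for the example.

```python
# Hypothetical internal chargeback: allocate shared private-cloud costs
# to departments based on metered usage.

RATES = {"cpu_hours": 0.05, "storage_gb_month": 0.10}  # invented unit rates

usage = {
    "finance":   {"cpu_hours": 12_000, "storage_gb_month": 500},
    "logistics": {"cpu_hours": 3_000,  "storage_gb_month": 2_000},
}

def chargeback(usage_by_dept, rates):
    """Compute each department's bill from metered usage and unit rates."""
    return {
        dept: round(sum(metrics[m] * rates[m] for m in metrics), 2)
        for dept, metrics in usage_by_dept.items()
    }

print(chargeback(usage, RATES))
# {'finance': 650.0, 'logistics': 350.0}
```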

Gardner: TJ, private cloud as an incubator. What we should expect?

Securing your data

Virdi: The key is configuration and change management — how we adapt to them in the private cloud and support different customer segments. That experience could be utilized in the public cloud too, as could the way you secure your information, data, and business knowledge. How you want to secure that is key, and that’s why the private cloud is there. If we can mimic the same kind of controls in the public cloud, maybe we’ll see more adoption of the public cloud too.

Gardner: Penelope, any thoughts on that, the private to public transition?

Gordon: I also look at it in a little different way. For example, in the U.S., you have the National Security Agency (NSA). For a lot of what you would think of as their non-differentiating processes, for example payroll, they can’t use ADP. They can’t use that SaaS for payroll, because they can’t allow the identities of their employees to become publicly known.

Anything that involves their employee data and all the rest of the information within the agency has to be kept within a private cloud. But, they’re actively looking at private cloud solutions for some of the other benefits of cloud.

In one sense, I look at it and say that private cloud adoption to me tells a provider that this is an area that’s not a candidate for a public-cloud solution. But, private clouds could also be another channel for public cloud providers to be able to better monetize what they’re doing, rather than just focusing on public cloud solutions.

Gardner: So, then, you’re saying this is a two-way street. Just as we could foresee someone architecting a good private cloud and then looking to take that out to someone else’s infrastructure, you’re saying there are a lot of public services that, for regulatory or other reasons, might need to come back in and be privatized or kept within the walls. Interesting.

Mark Skilton, any thoughts on this public-private tension and/or benefit?

Skilton: I asked an IT service director what it was like running a cloud service for the account. This is a guy who had previously been running hosting and management and who has many years’ experience.

The surprising thing was that he was quite shocked that the disciplines that he previously had for escalating errors and doing planned maintenance, monitoring, billing and charging back to the customer fundamentally were changing, because it had to be done more in real time. You have to fix before it fails. You can’t just wait for it to fail. You have to have a much more disciplined approach to running a private cloud.

The lesson that we’re learning in running private clouds for our clients is the need to have much more of a running-IT-as-a-business ethos and approach. We find that customers who try to do it themselves may find it difficult, because they’re used to buying that as a service, or they have to change their enterprise architecture and support service disciplines to operate the cloud.

Gardner: Perhaps yet another way to offset the potential for cloud chaos in the future is to develop the core competencies within the private-cloud environment, and to do it sooner rather than later? This is where you can cut your teeth or get your chops (any number of metaphors come to mind), but it sounds like a priority. Would you agree with that, Ed? Is coming up with a private-cloud capability important?

Harrington: It’s important, and it’s probably going to dominate for the foreseeable future, especially in areas that organizations view as core. They view them as core because they believe they provide some sort of competitive advantage or, as Penelope was saying, for security reasons. ADP is a good example. ADP could go into the NSA and set up a private cloud for the NSA, and I think that would be a really good thing.

Trust a big issue

But, I also think that trust is still a big issue and it’s going to come down to trust. It’s going to take a lot of work to have anything that is perceived by a major organization as core and providing differentiation to move to other than a private cloud.

Gardner: TJ.

Virdi: Private clouds actually allow you to make your business more modular. Your capabilities are going to be a little bit more modular, and interoperability testing can happen in the private cloud. Then you can use those same kinds of modular functions in the public cloud and work with commercial off-the-shelf (COTS) vendors that package them as new, holistic solutions.

Gardner: Does anyone consider the impact of mergers and acquisitions on this? We’re seeing the economy pick up, at least in some markets, and we’re certainly seeing globalization, a very powerful trend with us still. We can probably assume, if you’re a big company, that you’re going to get bigger through some sort of merger and acquisition activity. Does a cloud strategy ameliorate the pain and suffering of integration in these business mergers, Tom?

Plunkett: Well, not to speak on behalf of Oracle, but we’ve gone through a few mergers and acquisitions recently, and I do believe that having a cloud environment internally helps quite a bit. Specifically, TJ made the earlier point about modularity. When we’re looking at modules, they’re easier to integrate. It’s easier to recompose services, and you get all the benefits of SOA, really.

Gardner: TJ, mergers and acquisitions in cloud.

Virdi: It really helps. At the same time, we were talking about legal and regulatory compliance issues. The EU and Japan require you to keep personally identifiable information (PII) within their geographical areas. Cloud could provide a way to manage those requirements without having to host everything where your own business is located.
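
As a hypothetical sketch of that kind of data-residency handling, a service might route records containing PII to region-bound datastores. The region map and field names below are assumptions made for illustration.

```python
# Hypothetical data-residency routing: keep PII records in the region
# their regulations require (e.g., EU or Japan), regardless of where
# the business itself is hosted.

RESIDENCY_RULES = {"EU": "eu-datastore", "JP": "jp-datastore"}
DEFAULT_STORE = "global-datastore"

def store_for(record):
    """Pick a datastore based on the record's jurisdiction."""
    if record.get("contains_pii"):
        return RESIDENCY_RULES.get(record["jurisdiction"], DEFAULT_STORE)
    return DEFAULT_STORE

print(store_for({"contains_pii": True, "jurisdiction": "EU"}))   # eu-datastore
print(store_for({"contains_pii": False, "jurisdiction": "EU"}))  # global-datastore
```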

Gardner: Penelope, any thoughts, or maybe even on a slightly different subject, of being able to grow rapidly vis-à-vis cloud experience and expertise and having architects that understand it?

Gordon: Some of this comes back to the discussions we were having about the extra discipline that comes into play if you are going to effectively consume and provide cloud services: you become much more rigorous about your change management and your configuration management, and you then apply that out at a larger process level.

So, if you define certain capabilities within the business in a much more modular fashion, then, when you go through that growth and add on people, you have documented procedures and processes. It’s much easier to bring someone in and say, “You’re going to be a product manager, and that job role is fungible across the business.”

That kind of thinking, the cloud constructs applied up at a business architecture level, enables a kind of business expansion that we are looking at.

Gardner: Mark Skilton, thoughts about being able to manage growth, mergers and acquisitions, even general business agility vis-à-vis more cloud capabilities.

Skilton: Right now, I’m involved in merging in a cloud company that we bought in May of last year, and I would say yes and no. The no side is that I’m trying to bundle the service that we acquired into each of our products, so that we can add competitive advantage to the services we’re offering, and I’ve had a problem trying to bundle it into our existing portfolio. I’ve got to work out how it will fit and deploy in our own cloud. So, that’s still a complexity problem.

Faster launch

But the upside is that we can take the service we acquired, because we wanted that additional capability, rewrite it with design techniques for cloud computing, and then launch that bundled new service into the market faster.

It’s kind of a mixed blessing with cloud. With our own cloud services, we acquire these new companies, but we still have the same IT integration problem to then exploit that capability we’ve acquired.

Gardner: That might be a perfect example of where cloud is or isn’t. When you run into the issue of complexity and integration, it doesn’t compute, so to speak.

Skilton: It’s not plug and play yet, unfortunately.

Gardner: Ed, what do you think about this growth opportunity, mergers and acquisitions, a good thing or bad thing?

Harrington: It’s a challenge. As Mark presented it, it’s got two sides. It depends a lot on how close the organizations are, how close their service portfolios are, to what degree each of the organizations has adopted the cloud, and whether that is going to cause conflict as well. So I think there is potential.

Skilton: Each organization in the commercial sector can have different standards, and then you still have that interoperability problem to translate across, the post-merger integration issue.

Gardner: We’ve been discussing the practical requirements of various cloud computing models, looking at core and context issues where cloud models would work, where they wouldn’t. And, we have been thinking about how we might want to head off the potential mixed bag of cloud models in our organizations and what we can do now to make the path better, but perhaps also make our organizations more agile, service oriented, and able to absorb things like rapid growth and mergers.

I’d like to thank you all for joining and certainly want to thank our guests. This is a sponsored podcast discussion coming to you from The Open Group’s 2011 Conference in San Diego. We’re here the week of February 7, 2011. A big thank you now to Penelope Gordon, cofounder of 1Plug Corporation. Thanks.

Gordon: Thank you.

Gardner: Mark Skilton, Director of Portfolio and Solutions in the Global Infrastructure Services with Capgemini. Thank you, Mark.

Skilton: Thank you very much.

Gardner: Ed Harrington, Principal Consultant in Virginia for the UK-based Architecting the Enterprise.

Harrington: Thank you, Dana.

Gardner: Tom Plunkett, Senior Solution Consultant with Oracle. Thank you.

Plunkett: Thank you, Dana.

Gardner: TJ Virdi, the Computing Architect in the CAS IT System Architecture group at Boeing.

Virdi: Thank you.

Gardner: I’m Dana Gardner, Principal Analyst at Interarbor Solutions. You’ve been listening to a sponsored BriefingsDirect podcast. Thanks for joining, and come back next time.

Copyright The Open Group and Interarbor Solutions, LLC, 2005-2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.

Filed under Cloud/SOA, Enterprise Architecture

PODCAST: Exploring the role and impact of the Open Trusted Technology Forum (OTTF)

by Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-Discover the Open Trusted Technology Provider Framework

The following is the transcript of a sponsored podcast panel discussion from The Open Group Conference, San Diego 2011 on The Open Trusted Technology Forum and its impact on business and government.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion in conjunction with The Open Group Conference held in San Diego, the week of February 7, 2011. We’ve assembled a panel to examine The Open Group’s new Open Trusted Technology Forum (OTTF), which was established in December.

The forum is tasked with finding ways to better conduct global procurement and supply-chain commerce among and between technology acquirers and buyers and across the ecosystem of technology providers. By providing transparency, collaboration, innovation, and more trust among the partners and market participants in the IT environment, the OTTF should lead to reduced business risk for global supply activities in the IT field.

We’ll examine how the OTTF will function, what its new framework will be charged with providing, and ways that participants in the global IT commerce ecosystem can become involved with, and perhaps use, the OTTF’s work to their advantage.

Here with us to delve into the mandate and impact of the Trusted Technology Forum is Dave Lounsbury, Chief Technology Officer for The Open Group. Welcome, Dave.

Dave Lounsbury: Hi, Dana. How are you?

Gardner: I’m great. We’re also here with Steve Lipner, the Senior Director of Security Engineering Strategy in Microsoft’s Trustworthy Computing Group. Welcome, Steve.

Steve Lipner: Hi, Dana. Glad to be here.

Gardner: And, we’re also here with Andras Szakal, the Chief Architect in IBM’s Federal Software Group and an IBM distinguished engineer. Welcome.

Andras Szakal: Welcome. Thanks for having me.

Gardner: We’re also here with Carrie Gates, Vice President and Research Staff Member at CA Labs. Welcome, Carrie.

Gates: Thank you.

Gardner: Let’s start with you, Dave. Tell us in a nutshell what the OTTF is and why it came about?

Lounsbury: The OTTF is a group that came together under the umbrella of The Open Group to identify and develop standards and best practices for a trusted supply chain. It’s about how each consumer in a supply chain can trust its partners, and how those partners can indicate their use of best practices in the market, so that people who are buying from the supply chain, or buying from a specific vendor, will know that they can procure with a high level of confidence.

Gardner: Clearly, people have been buying these sorts of products for some time. What’s new? What’s changed that makes this necessary?

Concerns by DoD

Lounsbury: There are a couple of dimensions on it, and I will start this off because the other folks in the room are far more expert in this than I am.

This actually started a while ago at The Open Group by a question from the U.S. Department of Defense (DoD), which faced the challenge of buying commercial off-the-shelf product. Obviously, they wanted to take advantage of the economies of scale and the pace of technology in the commercial supply chain, but realized that means they’re not going to get purpose-built equipment, that they are going to buy things from a global supply chain.

They asked, “What would we look for in these things that we are buying to know that people have used good engineering practices and good supply chain management practices? Do they have a good software development methodology? What would be those indicators?”

Now, that was a question from the DoD, but everybody is on somebody’s supply chain. People buy components. The big vendors buy components from smaller vendors. Integrators bring multiple systems together.

So, this is a really broad question in the industry. Because of that, we felt the best way to address it was to bring together a broad spectrum of the industry to identify the practices that they have been using — their real, practical experience — and bring that together within a framework to create a standard for how we would do that.

Gardner: And this is designed with that word “open” being important, being inclusive. This is about a level playing field, not any sort of exclusionary affair.

Lounsbury: Absolutely. Not only is the objective of all The Open Group activities to produce open standards and conformance programs that are available to everyone, but in this case, because we are dealing with a global supply chain, we know that we are going to have not only vendors at all scales, but also vendors from all around the world.

If you pick up any piece of technology, it will be designed in the USA, assembled in Mexico, and built in China. So we need that international and global dimension in production of this set of standards as well.

Gardner: Andras, you’ve been involved with this quite a bit. For the edification of our listeners, is this mostly software we’re talking about? Is it certain components? Can we really put a bead on what will be the majority of technologies that would probably be affected?

Szakal: That’s a great question, Dana. I’d like to provide a little background. In today’s environment, we’re seeing a bit of a paradigm shift. We’re seeing technology move out of the traditional enterprise infrastructure. We’re seeing these very complex value chains be created. We’re seeing cloud computing.

Smarter infrastructures

We’re actually working to create smarter infrastructures that are becoming more intelligent, automated, and instrumented, and they are very much becoming open-loop systems. Traditionally, they were closed loop systems, in other words, closed environments, for example, the energy and utility (E&U) industry, the transportation industry, and the health-care industry.

As technology becomes more pervasive and gets integrated into these environments, into the critical infrastructure, we have to consider whether they are vulnerable and how the components that have gone into these solutions are trustworthy.

Governments worldwide are asking that question. They’re worried about critical infrastructure and the risk of using commercial, off-the-shelf technology — software and hardware — in a myriad of ways, as it gets integrated into these more complex solutions.

That’s part of the worry internationally from a government and policy perspective, and part of our focus here is to help our constituents, government customers and critical infrastructure customers, understand how the commercial technology manufacturers, the software development manufacturers, go about engineering and managing their supply chain integrity.

Gardner: I got the impression somehow, listening to some of the presentations here at the Conference, that this was mostly about software. Maybe at the start, would that be the case?

Szakal: No, it’s about all types of technology. Software obviously is a particularly important focus, because it’s at the center of most technology anyway. Even if you’re developing a chip, a chip has some sort of firmware, which is ultimately software. So that perception is valid to a certain extent, but no, not just software, hardware as well.

Gardner: Steve, I heard also the concept of “build with integrity,” as applied to the OTTF. What does that mean, build with integrity?

Lipner: Build with integrity really means that the developer who is building a technology product, whether it be hardware or software, applies best practices and understood techniques to prevent the inclusion of security problems, holes, bugs, in the product — whether those problems arise from some malicious act in the supply chain or whether they arise from inadvertent errors. With the complexity of modern software, it’s likely that security vulnerabilities can creep in.

So, what build with integrity really means is that the developer applies best practices to reduce the likelihood of security problems arising, as much as commercially feasible.

And not only that, but any given supplier has processes for convincing himself that upstream suppliers, component suppliers, and people or organizations that he relies on, do the same, so that ultimately he delivers as secure a product as possible.
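
Lipner isn’t prescribing a toolchain here, but the idea of gating a release on both first-party practices and upstream attestations could be sketched roughly as follows. Every check and field name in this sketch is hypothetical.

```python
# Hypothetical release gate: refuse to ship unless the product's own
# engineering evidence and its upstream components' evidence are present.

def release_ready(product):
    """Check both first-party practices and supplier attestations."""
    own_practices_ok = all([
        product["static_analysis_clean"],
        product["code_reviewed"],
        product["threats_modeled"],
    ])
    suppliers_ok = all(
        comp["integrity_attested"] for comp in product["upstream_components"]
    )
    return own_practices_ok and suppliers_ok

product = {
    "static_analysis_clean": True,
    "code_reviewed": True,
    "threats_modeled": True,
    "upstream_components": [
        {"name": "crypto-lib", "integrity_attested": True},
        {"name": "parser", "integrity_attested": False},  # blocks release
    ],
}
print(release_ready(product))  # False
```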

Gardner: Carrie, one of the precepts of good commerce is a lack of friction across borders, where more markets can become involved and effects like the highest quality at the lowest cost can take place. This notion of trust, when applied to IT resources and assets, seems important to keeping this a global market and to allowing the efficiencies inherent in an open market. How do you see this as a borderless technology ecosystem? How does this help?

International trust

Gates: This helps tremendously in improving trust internationally. We’re looking at developing a framework that can be applied regardless of which country you’re coming from. So, it is not a US-centric framework that we’ll be using and adhering to.

We’re looking for a framework so that each country, regardless of its government and regardless of the consumers within that country, has confidence in what it is that we’re building: that we’re building with integrity, and that we are concerned, as Steve mentioned, about both malicious acts and inadvertent errors.

And each country has its own bad guys, so by adhering to an international standard we can say that we’re looking out for the bad guys of every country and ensuring that what we provide is the best possible software.

Gardner: Let’s look a little bit at how this is going to shape up as a process. Dave, let’s explain the idea of The Open Group being involved as a steward. What is The Open Group’s role in this?

Lounsbury: The Open Group provides the framework under which both buyers and suppliers at any scale could come together to solve a common problem — in this case, the question of providing trusted technology best practices and standards. We operate a set of proven processes that ensure that everyone has a voice and that all these standards go forward in an orderly manner.

We provide the infrastructure for doing that, the meetings and things like that. The third leg is that The Open Group operates industry-based conformance programs, the certification programs, that allow someone who is not a member to come in, indicate their conformance to the standard, and give evidence that they’re using the best practices there.

Gardner: That’s important. I think there’s a milestone set that you were involved with. You’ve created the forum. You’ve done some gathering of information. Now, you’ve come out right here at this conference with the first step toward a framework that could be accepted across the community. There’s also a white paper that explains how that’s all going to work. But, eventually, you’re going to get to an accreditation capability. What does that mean? Is that a stamp of approval?

Lounsbury: Let me back up just a little bit. The white paper actually lays out the framework. The work of the forum is to turn that framework into an Open Group standard and populate it. That will provide the standards and best practice foundation for this conformance program.

We’re just getting started on the vision for a conformance program. One of the challenges here is that first, not only do we have to come up with the standard and then come up with the criteria by which people would submit evidence, but you also have to deal with the problem of scale.

If we really want to address this problem of global supply chains, we’re talking about a very large number of companies around the world. It’s a part of the challenge that the forum faces.

Accrediting vendors

Part of the work that they’ve embarked on is, in fact, to figure out how we wouldn’t necessarily do that kind of conformance one-on-one, but how we would accredit either vendors themselves, who have their own quality processes as a big vendor would, or third parties who can do assessments and then help provide the evidence for that conformance.

We’re getting ahead of ourselves here, but there would be a certification authority that would verify that all the evidence is correct and grant some certificate that says that they have met some or all of the standards.

Szakal: Our vision is that we want to leverage some of the capability that’s already out there. Most of us go through Common Criteria evaluations, and that is actually listed as a best practice for validating security functions and products.

Where we are focused, from an accreditation point of view, affects more than just security products. That’s important to know. However, we definitely believe that the community of assessment labs that already exists out there and conducts security evaluations, whether country-specific or Common Criteria, needs to be leveraged. We’ll endeavor to do that and to integrate those labs into both the membership and the thinking of the accreditation process.

Gardner: Thank you, Andras. Now, for a company that is facing some hurdles — and we heard some questions in our sessions earlier, like “What do I have to do? Is this going to be hard for an SMB?” — the upside could be pretty significant. If you’re a company and you do get that accreditation, you’re going to have some business value. Steve Lipner, what from your perspective is the business rationale for these players to go about this accreditation and get this sort of certification?

Lipner: To the extent that the process is successful, customers will really value the certification, and that will open markets or create preferences in markets for organizations that have sought and achieved it.

Obviously, there will be effort involved in achieving the certification, but that will be related to real value, more trust, more security, and the ability of customers to buy with confidence.

The challenge that we’ll face as a forum going forward is to make the processes deterministic and cost-effective, so that a supplier can say: I can understand what I have to do; I can understand what it will cost me; I won’t get surprised in the certification process; and I can understand the value equation, what I’m going to have to do and which markets, customer sets, and supply chains it will open up to me.

Gardner: So, we understand that there is this effort afoot that the idea is to create more trust and a set of practices in place, so that everyone understands that certain criteria have been met and vulnerabilities have been reduced. And, we understand that this is going to be community effort and you’re going to try to be inclusive.

What I’m now curious about is what this actually consists of — a list of best practices? Technology suggestions? Are there certain tests and requirements already in place that one would have to tick off? Let me take that to you, Carrie, and we’ll go around the panel. How do you actually assure that this is safe stuff?

Different metrics

Gates: If you refer to our white paper, we start to address that there. We’re looking at a number of different metrics across the board. For example, what do you have for documentation practices? Do you do code reviews? There are a number of different best practices already in the field that people are using. Anyone who wants to be certified can go and look at this document and say, “Yes, we are following these best practices,” or “No, we are missing this. Is it something that we really need to add? What kind of benefit will it provide to us beyond the certification?”
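
A toy self-assessment along those lines might look like this; the practice names are paraphrased from the discussion, not drawn from the framework itself.

```python
# Hypothetical self-assessment against a handful of best practices.

practices = {
    "documentation maintained": True,
    "code reviews performed": True,
    "threat modeling done": False,
    "supply chain provenance tracked": True,
}

gaps = [name for name, met in practices.items() if not met]
print(f"{len(practices) - len(gaps)}/{len(practices)} practices met")
if gaps:
    print("Gaps to evaluate:", ", ".join(gaps))
```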

Gardner: Dave, anything to add as to how a company would go about this? What are some of the main building blocks to a low-vulnerability technology creation and distribution process?

Lounsbury: Again, I refer everybody to the white paper, which is available on The Open Group website. You’ll see there that we’ve divided these kinds of best practices into four broad categories: product engineering and development methods, secure engineering and development methods, supply chain integrity methods, and product evaluation methods.

Under those categories, we’ll be looking at the attributes necessary to each, and then identifying the underlying standards or bits of evidence that people can submit to indicate their conformance.

I want to underscore the point about the cost to a vendor. Steve said it very well. The objective here is to raise best practices across the industry and make the best practice commonplace. One of the great things about an industry-based conformance program is that it gives you the opportunity to take the standards and the categories we’ve talked about, as they are developed by the OTTF, and incorporate them into your engineering and development processes.

So you’re baking in the quality as you go along, not trying to bolt on an expensive exercise at the end.

Gardner: Andras, IBM is perhaps one of the largest providers to governments and defense agencies when it comes to IT and certainly, at the center of a large ecosystem around the world, you probably have some insights into best practices that satisfy governments and military and defense organizations.

Can you offer a few major building blocks that perhaps folks that have been in a completely commercial environment would need to start thinking more about as they try to think about reaching accreditation?

Szakal: We have three broad categories here and we’ve broken each of the categories into a set of principles, what we call best practice attributes. One of those is secure engineering. Within secure engineering, for example, one of the attributes is threat assessment and threat modeling.

Another would be to focus on lineage of open-source. So, these are some of the attributes that go into these large-grained categories.

Unpublished best practices

You’re absolutely right; we have thought about this before. Steve and I have talked a lot about this. We’ve worked on his secure engineering initiative, the SDLC initiative within Microsoft, and I worked on and was co-author of the IBM Secure Engineering Framework. So, these are living, published examples of some of the best practices out there, though each is proprietary to its vendor. There are others, and in many cases companies have addressed this internally, as part of their practices, without having to publish them.

Part of the challenge that we are seeing, and part of the reason that Microsoft and IBM went to the length of publishing theirs, is that government customers and critical infrastructure customers were asking what the industry practices and best practices were.

What we’ve done here is take the best practices in the industry and bring them together in a way that’s non-vendor-specific. You’re not looking to IBM, and you’re not having to look at the other vendors’ methods of implementing these practices; it gives you a non-specific way of addressing them based on outcome.

These have all been realized in the field. We’ve observed these practices in the wild, and we believe this is actually going to help vendors mature in these specific areas. Governments recognize that, to a certain degree, the industry is not drunk and disorderly, that we do actually have a view on what it means to develop product in a secure engineering manner, and that we have supply chain integrity initiatives out there. So, those are very important.

Gardner: Somebody mentioned earlier that technology is ubiquitous across so many products and services, with software in particular growing more important in how it affects all sorts of businesses around the world. It seems to me that this is an inevitable step that you’re taking here, and that it might even be overdue.

If we can take the step of certification and agreement about technology best practices, does this move beyond just technology companies in the ecosystem to a wider set of products and services? Any thoughts about whether this is a framework for technology that could become more of a framework for general commerce, Dave?

Lounsbury: Well, Dana, you asked me a question I’m not sure I have an answer for. We’ve got quite a task in front of us doing some of these technology standards. I guess there might be cases where vertical industries that are heavy technology employers, or that have similar kinds of security problems, might look to this, or there might be some overlap. The one that comes to my mind immediately is health care, but we will be quite happy if we get the technology industry standards and best practices in place in the near future.

Gardner: I didn’t mean to give you more work to do necessarily. I just wanted to emphasize that this is an important and inevitable step, and that standardization around best practices, trust, and credibility, for freedom from malware and other risks that come with technology, is probably going to become more prevalent across the economy and the globe. Would you agree with that, Andras?

Szakal: This is, by the way, a best-practices approach to solving the problem. It’s an approach that’s been taken before by other industries from a supply chain perspective. There are several frameworks out there that abstract community practice into best practices and use that as a way to help global manufacturing and development practices, in general, ensure integrity.

Our approach is not all that unique, but it’s certainly the first time the technology industry has come together to make sure that we have an answer to some of these most important questions.

Gardner: Any thoughts, Steve?

Lipner: I think Andras was right in terms of the industry coming together to articulate best practices. You asked a few minutes ago about existing certifications in the trust and assurance space. Beyond Common Criteria for security features and security products, there’s really not much in the way of formal evaluation processes today.

Creating a discipline

One of the things we think that the forum can contribute is a discipline that governments and potentially other customers can use to say, “What is my supplier actually doing? What assurance do I have? What confidence do I have?”

Gardner: Dave?

Lounsbury: I want to expand on that point a little bit. The white paper’s name, “The Open Trusted Technology Provider Framework,” was quite deliberately chosen. There are a lot of practices out there that talk about how you would establish specific security criteria or specific security practices for products. The Open Trusted Technology Forum wants to take a step up and look not at the products but at the practices that the providers employ to produce them. So it’s bringing together those best practices.

Now, good technology providers will use good practices when they’re looking at their products, but we want to make sure that they’re applying all of the necessary standards and best practices across the spectrum, not just saying, “Oh, I did this in this product.”

Szakal: I have to agree 100 percent. We’re not simply focused on a bunch of security controls here. This is about industry practices for supply chain integrity, as well as our internal manufacturing practices around the actual process of engineering and software development.

That’s a very important point. This is not a traditional security standard, in the sense of a hundred security controls that you should always go out and implement. You’re going to have certain practices that make sense in certain situations, depending on the context of the product you’re manufacturing.

Gardner: Carrie, any suggestions for how people could get started at least from an educational perspective? What resources they might look to or what maybe in terms of a mindset they should start to develop as they move towards wanting to be a trusted part of a larger supply chain?

Gates: I would say an open mindset. In terms of getting started, the white paper is an excellent resource for understanding how the OTTF is thinking about the problem: how we are structuring things, what high-level attributes we are looking at, and then, digging down further, how we are actually addressing the problem.

We had mentioned threat modeling, which, if you’re not security-focused, might be a new thing to think about in terms of your supply chain. What are the threats to your supply chain? If you’re looking at malicious attack, who might be interested in inserting something into your code? Who are your customers, and who might be interested in potentially compromising them? How might you go about protecting them?
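
For readers new to threat modeling, an entry can be as simple as a structured record of exactly those questions. This sketch is purely illustrative and not part of the O-TTPF.

```python
# Hypothetical supply-chain threat-model entries: who might tamper,
# where, and what mitigation addresses it.

threats = [
    {
        "asset": "build pipeline",
        "actor": "malicious insider",
        "vector": "inserts backdoor into source before compilation",
        "mitigation": "mandatory two-person code review; signed commits",
    },
    {
        "asset": "third-party component",
        "actor": "compromised upstream supplier",
        "vector": "ships tampered binary",
        "mitigation": "verify checksums and provenance before integration",
    },
]

for t in threats:
    print(f"{t['asset']}: {t['actor']} -> {t['vector']}")
    print(f"  mitigate: {t['mitigation']}")
```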

I am going to contradict Andras a little bit, because there is a security aspect to this, and there is a security mindset that is required. The security mindset is a little bit different, in that you tend to be thinking about who is it that would be interested in doing harm and how do you prevent that?

It’s not a normal way of thinking about problems. Usually, people have a problem, they want to solve it, and security is an add-on afterwards. We’re asking that they make that thinking part of their process from the start.

Szakal: But, you have to agree with me that this isn’t your hopelessly lost techie 150-paragraph list of security controls you have to do in all cases, right?

Gates: Absolutely. There is no checklist of “Yes, I have a firewall. Yes, I have an IDS.”

Gardner: Okay. It strikes me that this is really a unique form of insurance — insurance for the buyer, insurance for the seller, in that they can demonstrate they’ve taken proper steps — and insurance for the participants in a vast and complex supply chain of contractors and suppliers around the world. Does the word “insurance” make sense, or is it “assurance”? How would you describe it, Steve?

Lipner: We talk about security assurance, and assurance is really what the OTTF is about: providing developers and suppliers with ways to achieve that assurance, and providing their customers with ways to know that they have done so. Andras referred to installing the firewall, and so on. This is really not about adding some security band-aid onto a technology or a product. It’s about the fundamental attributes and assurance of the product or technology being produced.

Gardner: Very good. I think we’ll need to leave it there. We have been discussing The Open Group’s new Open Trusted Technology Forum, the associated Open Trusted Technology Provider Framework, and the movement toward an accreditation process for the global supply chains around technology products.

I want to thank our panel. We’ve been joined by Dave Lounsbury, the Chief Technology Officer of The Open Group. Thank you.

Lounsbury: Thank you, Dana.

Gardner: Also, Steve Lipner, the Senior Director of Security Engineering Strategy in Microsoft’s Trustworthy Computing Group. Thank you, Steve.

Lipner: Thank you, Dana.

Gardner: And also, Andras Szakal, Chief Architect in the IBM Federal Software Group and an IBM Distinguished Engineer. Thank you.

Szakal: Thank you so much.

Gardner: And, also Carrie Gates, Vice President and Research Staff Member at CA Labs. Thank you.

Gates: Thank you.

Gardner: You’ve been listening to a sponsored podcast discussion in conjunction with The Open Group Conference here in San Diego, the week of February 7, 2011. I’m Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks for joining and come back next time.

Copyright The Open Group and Interarbor Solutions, LLC, 2005-2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.

Filed under Cybersecurity, Supply chain risk

A First Step in Securing the Global Technology Supply Chain: Introducing The Open Group Trusted Technology Provider Framework Whitepaper

By Andras Szakal, IBM

Nearly two months ago, we announced the formation of The Open Group Trusted Technology Forum (OTTF), a global standards initiative among technology companies, customers, government and supplier organizations to create and promote guidelines for manufacturing, sourcing, and integrating trusted, secure technologies. The OTTF’s purpose is to shape global procurement strategies and best practices to help reduce threats and vulnerabilities in the global supply chain. I’m proud to say that we have just completed our first deliverable towards achieving our goal: The Open Trusted Technology Provider Framework (O-TTPF) whitepaper.

The framework outlines industry best practices that contribute to the secure and trusted development, manufacture, delivery and ongoing operation of commercial software and hardware products. Even though the OTTF has only recently been announced to the public, the framework and the work that led to this whitepaper have been in development for more than a year: first as a project of the Acquisition Cybersecurity Initiative, a collaborative effort facilitated by The Open Group between government and industry verticals under the sponsorship of the U.S. Department of Defense (OUSD (AT&L)/DDR&E). The framework is intended to benefit technology buyers and providers across all industries and across the globe concerned with secure development practices and supply chain management.

More than 15 member organizations joined efforts to form the OTTF as a proactive response to the changing cybersecurity threat landscape, which has forced governments and larger enterprises to take a more comprehensive view of risk management and product assurance. Current members of the OTTF include Atsec, Boeing, Carnegie Mellon SEI, CA Technologies, Cisco Systems, EMC, Hewlett-Packard, IBM, IDA, Kingdee, Microsoft, MITRE, NASA, Oracle, and the U.S. Department of Defense (OUSD(AT&L)/DDR&E), with the Forum operating under the stewardship and guidance of The Open Group.

Over the past year, OTTF member organizations have been hard at work collaborating, sharing and identifying secure engineering and supply chain integrity best practices that currently exist.  These best practices have been compiled from a number of sources throughout the industry including cues taken from industry associations, coalitions, traditional standards bodies and through existing vendor practices. OTTF member representatives have also shared best practices from within their own organizations.

From there, the OTTF distilled a common set of best practices into categories and eventually compiled them into the O-TTPF whitepaper. All of this was done with the goal of ensuring that the practices are practical and outcome-based, aren’t unnecessarily prescriptive, and don’t favor any particular vendor.

The Framework

The diagram below outlines the structure of the framework: a set of categories arranged in a hierarchy that shows how the OTTF arrived at the best practices it created.

Trusted Technology Provider Categories

Best practices were grouped by category because the types of technology development, manufacturing, or integration activities conducted by a supplier are usually tailored to suit the type of product being produced, whether it is hardware-, firmware-, or software-based. Categories may also be aligned by manufacturing or development phase so that, for example, a supplier can implement a Secure Engineering/Development Method where necessary. (A minimal sketch of the category hierarchy follows the list below.)

Provider categories outlined in the framework include:

  • Product Engineering/Development Method
  • Secure Engineering/Development Method
  • Supply Chain Integrity Method
  • Product Evaluation Method
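
As a minimal data-structure sketch of that hierarchy: the attribute names below are examples drawn from the podcast discussion above, not the framework’s full list.

```python
# Hypothetical sketch of the O-TTPF hierarchy: categories holding
# best-practice attributes. Attribute names are illustrative examples.

framework = {
    "Product Engineering/Development Method": [
        "well-defined development process",
    ],
    "Secure Engineering/Development Method": [
        "threat assessment and modeling",
        "lineage of open-source components",
    ],
    "Supply Chain Integrity Method": [
        "supplier integrity attestation",
    ],
    "Product Evaluation Method": [
        "security evaluation (e.g., Common Criteria)",
    ],
}

for category, attributes in framework.items():
    print(category)
    for attr in attributes:
        print("  -", attr)
```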

Establishing Conformance and Determining Accreditation

In order for the best practices set forth in the O-TTPF to have a long-lasting effect on securing product development and the supply chain, the OTTF will define an accreditation process. Without an accreditation process, there can be no assurance that a practitioner has implemented practices according to the approved framework.

After the framework is formally adopted as a specification, The Open Group will establish conformance criteria and design an accreditation program for the O-TTPF. The Open Group currently manages multiple industry certification and accreditation programs, operating some independently and some in conjunction with third party validation labs. The Open Group is uniquely positioned to provide the foundation for creating standards and accreditation programs. Since trusted technology providers could be either software or hardware vendors, conformance will be applicable to each technology supplier based on the appropriate product architecture.

At this point, the OTTF envisions a multi-tiered accreditation scheme, which would allow for many levels of accreditation, including enterprise-wide accreditation or accreditation of a specific division. An accreditation program of this nature could provide alternative routes to claiming conformity to the O-TTPF.
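
As a loose sketch of what a multi-tiered scheme might mean in practice: all names here are invented, and the actual program is still being designed.

```python
# Hypothetical model of a multi-tiered accreditation scope, as the
# whitepaper envisions (enterprise-wide or per-division).

from dataclasses import dataclass
from typing import Optional

@dataclass
class Accreditation:
    organization: str
    scope: str                      # "enterprise" or "division"
    division: Optional[str] = None  # set when scope == "division"

    def covers(self, unit: str) -> bool:
        """Enterprise-wide accreditation covers every division;
        division-level accreditation covers only its own division."""
        return self.scope == "enterprise" or self.division == unit

acme = Accreditation("Acme Corp", "enterprise")
widget = Accreditation("Widget Inc", "division", "Storage Products")

print(acme.covers("Networking"))          # True
print(widget.covers("Networking"))        # False
print(widget.covers("Storage Products"))  # True
```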

Over the long-term, the OTTF is expected to evolve the framework to make sure its industry best practices continue to ensure the integrity of the global supply chain. Since the O-TTPF is a framework, the authors fully expect that it will evolve to help augment existing manufacturing processes rather than replace existing organizational practices or policies.

There is much left to do, but we’re already well on the way to ensuring the technology supply chain stays safe and secure. If you’re interested in shaping the Trusted Technology Provider Framework best practices and accreditation program, please join us in the OTTF.

Download the O-TTPF, or read the O-TTPF in full here.

Andras Szakal is an IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team. Andras is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). His responsibilities include developing e-Government software architectures using IBM middleware and leading the IBM U.S. Federal Software IT Architect Team. His team is responsible for designing solutions to enable smarter government by applying innovative approaches to secure service based computing and mission critical systems. He holds undergraduate degrees in Biology and Computer Science and a Masters Degree in Computer Science from James Madison University. Andras has been a driving force behind IBM’s adoption of federal government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently he was appointed chair of The Open Trusted Technology Forum.

Filed under Cybersecurity, Supply chain risk