
Strategic Planning – Ideas to Delivery

By Martin Owen, CEO, Corso

Most organizations operate at a fast pace of change. Businesses are constantly evaluating market demands and enacting change to drive growth and develop a competitive edge.

These market demands come from a broad number of sources, and include economic changes, market trends, regulations, technology improvements and resource management. Knowing where the demands originated, whether they are important and if they are worth acting on can be difficult.

We look at how innovation, Enterprise Architecture and successful project delivery need to be intertwined and traceable.

In the past, managing ideation through to the delivery of innovation has either not been done at all, or has been attempted in organizational silos, leading to disconnections. This in turn results in change being implemented poorly, or in a focus on the wrong type of change.

How Does an Organization Successfully Embrace Change?

Many companies start with campaigns and ideation. They run challenges and solicit ideas from within and outside of their walls. Ideas are then prioritized and evaluated. Sometimes prototypes are built and tested, but what happens next?

Many organizations turn to the blueprints or roadmaps generated by their enterprise architectures, IT architectures and/or business process architectures for answers. They evaluate how a new idea and its supporting technology, such as service-oriented architecture (SOA) or enterprise resource planning (ERP), fits into the broader architecture. They manage their technology portfolio by looking at their IT infrastructure needs.

Organizations often form program management boards to evaluate ideas, initiatives and their costs. In reality, these evaluations are based on lightweight business cases without the broader context. Organizations don’t have a comprehensive understanding of what systems, processes and resources they have, what they are being used for, how much they cost, and how regulations affect them. Projects are delivered and viewed on a project-by-project basis without regard to the bigger picture. Enterprise, technology and process-related decisions are made within the flux of change and without access to the real knowledge contained within the organization or in the marketplace. IT is often in the hot seat of this type of decision-making.

Challenges of IT Planning

IT planning takes place in reaction to and anticipation of these market demands and initiatives. There may be a need for a new CRM or accounting system, or a new application for manufacturing or product development. While IT planning should be part of a broader enterprise architecture or market analysis, IT involvement in technology investments often comes close to the end of the strategic planning process and without proper access to enterprise or market data.

The following questions illustrate the competing demands found within the typical IT environment:

How can we manage the prioritization of business-, architectural- and project-driven initiatives?

Stakeholders place a large number of both tactical and strategic requirements on IT. IT is required to offer different technology investment options, but is often constrained by a competition for resources.

How do we balance enterprise architecture’s role with IT portfolio management?

An enterprise architecture provides a high-level view of the risks and benefits of a project and its alignment to future goals. It can illustrate the project complexities and the impact of change. Future state architectures and transition plans can be used to define investment portfolio content. At the same time, portfolio management provides a detailed perspective of development and implementation. Balancing these often-competing viewpoints can be tricky.

How well are application lifecycles being managed?

Application management requires a product/service/asset view over time. Well-managed application lifecycles demand a process of continuous releases, especially when time to market is key. The higher level view required by portfolio management provides a broader perspective of how all assets work together. Balancing application lifecycle demands against a broader portfolio framework can present an inherent conflict about priorities and a struggle for resources.

How do we manage the numerous and often conflicting governance requirements across the delivery process?

As many organizations move to small-team agile development, coordinating the various application development projects becomes more difficult. Managing the development process with agile methods can shorten schedules, but it can also increase the chance of errors and of a disconnect with broader portfolio and enterprise goals.

How do we address different lifecycles and tribes in the organization?

Lifecycles such as innovation management, enterprise architecture, business process management and solution delivery are all necessary, but they are not harmonized across the enterprise. The connection among these lifecycles is important to the effective delivery of initiatives and to understanding the impact of change.

The enterprise view, down through innovation management, portfolio management, application lifecycle management and agile development, represents competing IT viewpoints that can be brought together using an ideas-to-delivery framework.

Agile Development and DevOps

A key component of the drive from ideas to delivery is how strategic planning and the delivery of software are related, or, more directly, the relevance of Agile Enterprise Architecture to DevOps.

DevOps is a term that has been around since the end of the last decade. Originating from the Agile development movement, it is a fusion of “development” and “operations”. In practical terms, it integrates developers and operations teams in order to improve collaboration and productivity by automating infrastructure and workflows and by continuously measuring application performance.

The drivers behind the approach are the competing needs to incorporate new products into production while maintaining 99.9% uptime for customers, all in an agile manner.

To understand the increase in complexity further, we need to look at how new features and functions are applied to the delivery of software. The world of mobile apps, middleware and cloud deployment has reduced release cycles to weeks rather than months, with an emphasis on delivering incremental change. Previously, a business release would ship every few months as a series of modules, hopefully still relevant to the business goals by the time it arrived.

The shorter continuous delivery lifecycle will help organizations:

  • Achieve shorter releases through incremental delivery, enabling faster innovation.
  • Be more responsive to business needs through improved collaboration, better quality and more frequent releases.
  • Manage the number of applications impacted by a business release, by allowing local variants for a global business and continuous delivery within releases.

The DevOps approach achieves this by providing an environment that:

  • Will minimize software delivery batch sizes to increase flexibility and enable continuous feedback as every team delivers features to production as they are completed.
  • Has the notion of projects replaced by release trains which minimizes batch waiting time to reduce lead times and waste.
  • Has a shift from central planning to decentralized execution with a pull philosophy thus minimizing batch transaction cost to improve efficiency.
  • Makes DevOps economically feasible through test virtualization, build automation and automated release management. Batches are prioritized and sequenced to maximize business value: select the right batches, sequence them in the right order, guide the implementation, track execution and make planning adjustments along the way.
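The "prioritize and sequence batches" step in the last point is often implemented with a weighted-shortest-job-first heuristic: order batches by value delivered per unit of duration. The sketch below is illustrative only; the batch names, values and durations are invented, not taken from the text.

```python
# Hypothetical sketch: sequencing delivery batches by value density
# (business value divided by duration), a common heuristic for
# "maximize business value" ordering. All data here is invented.

def wsjf_sequence(batches):
    """Return batches ordered by descending value density."""
    return sorted(batches, key=lambda b: b["value"] / b["duration"], reverse=True)

batches = [
    {"name": "payments-api", "value": 8, "duration": 4},   # density 2.0
    {"name": "mobile-login", "value": 9, "duration": 3},   # density 3.0
    {"name": "report-export", "value": 3, "duration": 3},  # density 1.0
]

for batch in wsjf_sequence(batches):
    print(batch["name"])
```

A release train would then pull work from the front of this sequence, re-scoring the remaining batches as plans change.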

Figure 1: DevOps lifecycle

Thus far we have only looked at the delivery aspects, so how does this approach integrate with an enterprise architecture view?

To understand this, we need to look more closely at the strategic planning lifecycle. Figure 2 shows how the strategic planning lifecycle supports an ‘ideas to delivery’ framework.


Figure 2: The strategic planning lifecycle

You can see here the high-level relationship between the strategy and goals of an organization and the projects that deliver the change to meet those goals. The enterprise architecture provides the model to govern the delivery of projects in line with these goals.

However, we must ensure that any model that is built provides just enough EA to support the right level of analysis; the use of Kanban to drive change has been discussed in previous sections of this book. The Agile EA model is then one that can provide enough analysis to plan which projects should be undertaken, and can then ensure full architectural governance over their delivery. The last part is achieved by connecting to the tools used in the Agile space.


Figure 3: Detailed view of the strategic planning lifecycle

There are a number of tools that can be used within DevOps. One example is the IBM toolset, which uses open standards to link to other products within the overall lifecycle. This approach integrates the Agile enterprise architecture process with the Agile development process, connects project delivery with effective governance of the project lifecycle, and ensures that even when the software delivery process is agile, the link to goals and associated business needs is maintained.

To achieve this goal, a number of internal processes must interoperate. This is a significant challenge, but one that can be met by building an internal center of excellence and by starting small and growing a working environment.

The Strategic Planning Lifecycle Summary

The organization begins by revisiting its corporate vision and strategy. What things will differentiate the organization from its competitors in five years? What value propositions will it offer customers to create that differentiation? The organization can create a series of campaigns or challenges to solicit new ideas and requirements for its vision and strategy.

The ideas and requirements are rationalized into a value proposition that can be examined in more detail.

The company can look at what resources it needs to have on both the business side and the IT side to deliver the capabilities needed to realize the value propositions. For example, a superior customer experience might demand better internet interactions and new applications, processes, and infrastructure on which to run. Once the needs are understood, they are compared to what the organization already has. The transition planning determines how the gaps will be addressed.

An enterprise architecture is a living thing with a lifecycle of its own. Figure 3 shows the ongoing EA processes. With the strategy and transition plan in place, EA execution begins. The transition plan provides input to project prioritization and planning, since projects aligned with the transition plan are typically prioritized over those that are not. This determines which projects are funded and enter, or continue in, the DevOps stage. As solutions are developed, enterprise architecture assets such as models, building blocks, rules, patterns, constraints and guidelines are used and followed. Where the standard assets aren’t suitable for a project, exceptions are requested from the governance board. These exceptions are tracked carefully. Where assets are frequently the subject of exception requests, they must be examined to see if they really are suitable for the organization.
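The exception-tracking idea can be sketched very simply: count governance exception requests per architecture asset and flag frequent offenders for review. The asset names and the review threshold below are invented for illustration, not part of any standard.

```python
from collections import Counter

# Hypothetical sketch: flag EA assets that attract repeated governance
# exceptions, so the architecture board can re-examine their suitability.
REVIEW_THRESHOLD = 3  # illustrative cut-off, not from the text

# Each entry is one exception request naming the asset it bypassed.
exception_requests = [
    "legacy-auth-pattern", "standard-esb-building-block",
    "legacy-auth-pattern", "legacy-auth-pattern",
]

counts = Counter(exception_requests)
needs_review = sorted(asset for asset, n in counts.items() if n >= REVIEW_THRESHOLD)
print(needs_review)
```

Assets that surface in this list are candidates for the "are our target architectures still correct?" question discussed here.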

If we’re not doing things the way we said we wanted them done, then we must ask if our target architectures are still correct. This helps keep the EA current and useful.

Periodic updates to the organization’s vision and strategy require a reassessment of the to-be state of the enterprise architecture. This typically results in another look at how the organization will differentiate itself in five years, what value propositions it will offer, the capabilities and resources needed, and so on. Then the transition plan is examined to see if it is still moving us in the right direction. If not, it is updated.

Figure 3 separates the organization’s strategy and vision, the enterprise architecture lifecycle components, and solution development and delivery. Some argue that the strategy and vision are part of the EA; others argue against this. Both views are valid, since they simply depend on how you look at the process. If the CEO’s office is responsible for the vision and strategy and the reporting chain is responsible for its execution, then separating it from the EA makes sense. In practice, the top of the reporting chain participates in the vision and strategy exercise and is encouraged to “own” it, at least from an execution perspective. In that case, it might be fair to consider it part of the EA. Or you can say it drives the EA. The categorization isn’t as important as understanding how the vision and strategy interact with the EA, or the rest of the EA, however you see it.

Note that the overall goal here is to have traceability from our ideas and initiatives, all the way through to strategic delivery. This comes with clear feedback from delivery assets to the ideas and requirements that they were initiated from.

Martin Owen, CEO, Corso, has held executive and senior management and technical positions at IBM, Telelogic and Popkin. He has been instrumental in driving forward the product management of enterprise architecture, portfolio management and asset management tooling.

Martin is also active with industry standards bodies and was the driver behind the first Business Process Modeling Notation (BPMN) standard.

Martin has led the ArchiMate® and UML mapping initiatives at The Open Group and is part of the capability based planning standards team.

Martin is responsible for strategy, products and direction at Corso.


Filed under Uncategorized

Mac OS X El Capitan Achieves UNIX® Certification

By The Open Group

The Open Group, an international vendor- and technology-neutral consortium, has announced that Apple, Inc. has achieved UNIX® certification for its latest operating system – Mac OS X version 10.11 known as “El Capitan.”

El Capitan was announced on September 29, 2015, following its registration as conforming to The Open Group UNIX® 03 standard on September 7, 2015.

The UNIX® trademark is owned and managed by The Open Group, with the trademark licensed exclusively to identify operating systems that have passed the tests identifying that they conform to The Single UNIX Specification, a standard of The Open Group. UNIX certified operating systems are trusted for mission critical applications because they are powerful and robust, they have a small footprint and are inherently more secure and more stable than the alternatives.

Mac OS X is the most widely used UNIX desktop operating system. Apple’s installed base is now over 80 million users. Its commitment to the UNIX standard as a platform enables wide portability of applications between compliant and compatible operating systems.


Filed under Uncategorized

The Open Group ArchiMate® Model Exchange File Format and Archi 3.3

By Phil Beauvoir

Some of you might have noticed that Archi 3.3 has been released. This latest version of Archi includes a new plug-in which supports The Open Group ArchiMate Model Exchange File Format standard. This represents the fruits of years of labour! I’ve been collaborating with The Open Group, and with representatives from associated parties and tool vendors, for some time now to produce a file format that can be used to exchange single ArchiMate models between conforming toolsets. Finally, version 1.0 of the standard has been released!

The file format uses XML, which is backed by a validating XSD Schema. Why is this? Wouldn’t XMI be better? Well, yes it would if we had a MOF representation of the ArchiMate standard. Currently, one doesn’t exist. Also, it’s very hard to agree exactly what should be formally represented in a persistence format, as against what can be usefully represented and exchanged using a persistence format. For example, ArchiMate symbols use colour to denote the different layers, and custom colour schemes can be employed to convey meaning. Clearly, this is not something that can be enforced in a specification. Probably the only things that can be enforced are the ArchiMate concepts and relations themselves. Views, viewpoints, and visual arrangements of those concepts and relations are, arguably, optional. A valid ArchiMate model could simply consist of a set of concepts and relations. However, this is probably not very useful in the real world, and so the exchange format seeks to provide a file format for describing and exchanging the most used aspects of ArchiMate models, optional aspects as well as mandatory aspects.

So, simply put, the aim of The Open Group ArchiMate Model Exchange File Format is to provide a pragmatic and useful mechanism for exchanging ArchiMate models and visual representations between compliant toolsets. It does not seek to create a definitive representation of an ArchiMate model. For that to happen, I believe many things would have to be formally declared in the ArchiMate specification. For this reason, many of the components in the exchange format are optional. For example, the ArchiMate 2.1 specification describes the use of attributes as a means to extend the language and provide additional properties to the concepts and relations. The specification does not rigidly mandate their use. However, many toolsets do support and encourage the use of attributes to create model profiles, for example. To support this, the exchange format provides a properties mechanism, consisting of typed key/value pairs. This allows implementers to (optionally) represent additional information for all of the concepts, relations and views.
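The typed key/value properties mechanism can be pictured with a small fragment. To be clear, the tag and attribute names below are invented to illustrate the idea; they are not the actual exchange-format schema, which should be consulted directly.

```python
import xml.etree.ElementTree as ET

# Illustrative only: a concept carrying typed key/value properties.
# Tag and attribute names are invented, NOT the real exchange-format schema.
doc = """
<element identifier="id-1">
  <name>Customer Services</name>
  <properties>
    <property key="owner" type="string" value="CRM team"/>
    <property key="criticality" type="number" value="2"/>
  </properties>
</element>
"""

elem = ET.fromstring(doc)
# Collect the optional properties into a plain dictionary.
props = {p.get("key"): p.get("value") for p in elem.iter("property")}
print(props["owner"])
```

Because the properties are optional typed pairs, a toolset that doesn't understand a given key can simply preserve or ignore it, which is what makes the mechanism useful for model profiles.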

Even though I have emphasised that the main use for the exchange format is exchange (the name is a bit of a giveaway here ;-)), another advantage of using XML/XSD for the file format is that it is possible to use XSLT to transform the XML ArchiMate model instances into HTML documents, reports, as input for a database, and so on. I would say that the potential for exploiting ArchiMate data in this way is huge.
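To make the report-generation idea concrete: a real pipeline would apply an XSLT stylesheet to an exchange-format file, but the same transformation can be sketched in a few lines of Python. The model fragment here uses invented tags, not the real schema.

```python
import xml.etree.ElementTree as ET

# Sketch of turning model data into an HTML report. The <model>/<element>
# structure is invented for illustration; a real exchange-format file would
# be transformed with an XSLT stylesheet instead.
model = ET.fromstring(
    "<model><element name='Customer'/><element name='Claims Process'/></model>"
)

items = "".join(f"<li>{e.get('name')}</li>" for e in model.iter("element"))
html = f"<html><body><h1>Model elements</h1><ul>{items}</ul></body></html>"
print(html)
```

The same traversal could just as easily feed a database loader or a documentation generator, which is the "huge potential" being described.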

The exchange format could also help with learning the ArchiMate language and Enterprise Architecture – imagine a repository of ArchiMate models (tagged with Dublin Core metadata to facilitate search and description) that could be used as a resource pool of model patterns and examples for those new to the language. One thing that I personally would like to see is an extensive pool of example models and model snippets as examples of good modelling practice. And using the exchange format, these models and snippets can be loaded into any supporting toolset.

Here are my five “winning features” for the ArchiMate exchange file format:

  • Transparent
  • Simple
  • Well understood format
  • Pragmatic
  • Open

I’m sure that The Open Group ArchiMate Model Exchange File Format will contribute to, and encourage the use of the ArchiMate modelling language, and perhaps reassure users that their valuable data is not locked into any one vendor’s proprietary tool format. I personally think that this is a great initiative and that we have achieved a great result. Of course, nothing is perfect and the exchange format is still at version 1.0, so user feedback is welcome. With greater uptake the format can be improved, and we may see it being exploited in ways that we have not yet thought of!

(For more information about the exchange format, see here.)

About The Open Group ArchiMate® Model Exchange File Format:

The Open Group ArchiMate® Model Exchange File Format Standard defines a file format that can be used to exchange data between systems that wish to import and export ArchiMate models. ArchiMate exchange files enable exporting content from one ArchiMate modelling tool or repository and importing it into another, while retaining information describing the model and how it is structured, such as a list of model elements and relationships. The standard focuses on the packaging and transport of ArchiMate models.

The standard is available for free download from:

An online resource site is available at

Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University, and, later, the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivery of standards-compliant learning objects and meta-data, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.


Filed under ArchiMate®, Standards, The Open Group

Congratulations to The Open Group Open Certified Architect (Open CA) on its 10th Anniversary!

By Cristina Woodbridge, Architect Profession Leader, IBM, retired

In New York City on July 18, 2005, The Open Group announced the IT Architect Certification (ITAC) Program in recognition of the need to formalize the definition of the role of IT Architect, a critical new role in the IT industry. The certification program defines a common industry-wide set of skills, knowledge and experience as requirements for IT Architects and a consistent repeatable standard for a peer-based evaluation.

Why was this important? The practice of architecture in the IT industry has the objective of defining how various contributing business and IT elements should come together to produce an effective solution to a business problem. The IT Architect is responsible for defining the structures on which the solution will be developed. When we think of how IT solutions underlie core business throughout the world in every industry and business sector, we can understand the impact of architecture and of the role of the IT Architect on the effectiveness and integrity of these systems. In 2015, this understanding may seem obvious, but it was not so in 2005.

How did the standard come about? Based on the request of industry, The Open Group Architecture Forum and the membership at large, The Open Group Governing Board approved the creation of a working group in 2004 to develop the IT Architect certification program. As part of this new working group, I remember when we first came together to start our discussions. Representing different organizations, we were all a little reluctant initially to share our secret definition of the IT Architect role. However as we discussed the skills and experience requirements, we quickly discovered that our definitions were not so secret but commonly shared by all of us. We all agreed IT Architects must have architectural breadth of experience in a wide range of technologies, techniques and tools. They must have a disciplined method-based approach to solution development, strong leadership and communication skills. This conformity in our definition was a clear indication that an industry standard could be articulated and that it was needed. There were areas of differences in our discussion, but the core set of skills, knowledge and experience requirements, which are part of the certification program, were easy to agree upon. We also saw the need to define the professional responsibilities of IT Architects to foster their profession and mentor others. The outcome was the development of the ITAC certification conformance requirements and the certification process.

We unanimously agreed that the candidate’s certification needed to be reviewed by peers, as is the case in many other professions. Only certified IT Architects would be able to assess the documented experience. I have participated in hundreds of board reviews and consensus meetings as part of the Open CA direct certification boards, the IBM certification process and by invitation to audit other organization certification boards. In all of these I have consistently heard the same probing questions looking for the architectural thinking and decision-making process that characterizes IT Architects. In the cases in which I was auditing certifications, I could often anticipate the issues (e.g., lack of architectural experience, was an architectural method applied, etc.) that would be discussed in the consensus reviews and which would impact the decision of the board. This independent review by peer certified IT Architects provides a repeatable consistent method of validating that a candidate meets the certification criteria.

Since 2005, the ITAC program has expanded to provide three levels of certification, defining a clear professional development path from entry to senior level. The program was renamed The Open Group Certified Architect (Open CA) in 2011 to expand beyond IT Architecture.[1] Over 4,000 professionals from 180 companies in more than 60 countries worldwide have been certified in the program. The British Computer Society agrees that The Open Group Certified Architect (Open CA) certification meets criteria accepted towards Chartered IT Professional (CITP) status.[2] Foote Partners [3] lists The Open Group Certified Architect certification as driving premium pay by employers in the US and Canada. Having a consistent industry standard defining the role of an Architect is valuable to individuals in the profession. It helps them grow professionally within the industry and gain personal recognition. It is valuable to organizations as it provides an assurance of the capabilities of their Architects. It also establishes a common language and a common approach to defining solutions across the industry.

Congratulations to The Open Group on the 10th anniversary of the Open CA certification program and for maturing the Architect profession to what it is today! Congratulations to the many Open Certified Architects who support the profession through mentoring and participating in the certification process! Congratulations to the Architects who have certified through this program!

The current Open Group Governing Board Work Group for Open CA consists of: Andras Szakal (IBM), Andrew Macaulay (Capgemini), Chris Greenslade (CLARS Ltd.), Cristina Woodbridge (independent), James de Raeve (The Open Group), Janet Mostow (Oracle), Paul Williams (Capgemini), Peter Beijer (Hewlett-Packard) and Roberto Rivera (Hewlett-Packard).

[1] The Open CA program presently includes certification of Enterprise Architects, Business Architects, and IT Architects.

[2] British Computer Society CITP Agreement on Open CA

[3] Foote Partners, LLC is an independent IT benchmark research and advisory firm targeting the ‘people’ side of managing technology

Cristina Woodbridge was the IBM Worldwide Architect Profession Leader from 2004 to 2015. She was responsible for the effective oversight and quality of the Architect profession deployed globally in IBM. Cristina is an Open Group Distinguished Certified Architect. She is an active member of the Open CA Working Group and also participates as a board member for The Open Group Direct Certification boards.


Filed under Certifications, Open CA, The Open Group

The Open Group to Host Event in Edinburgh in October

By The Open Group

The Open Group, the vendor-neutral IT consortium, is hosting its latest event in Edinburgh, October 19-22, 2015. The Open Group Edinburgh 2015 will focus on the challenge of architecting a Boundaryless Organization in an age of new technology and changing trends. The event will look at this issue in more depth during individual sessions on October 19 and 20 covering Cybersecurity, Risk, the Internet of Things and Enterprise Architecture.

Key speakers at the event include:

  • Allen Brown, President & CEO, The Open Group
  • Steve Cole, CIO, BAE Systems Submarine Solutions
  • John Wilcock, Head of Operations Transformation, BAE Systems Submarine Solutions
  • Mark Orsborn, Director, IoT EMEA, Salesforce
  • Heather Kreger, CTO, International Standards, IBM
  • Kevin Coyle, Solution Architect, the Student Loans Company
  • Rob Akershoek, Solution Architect, Shell International

Full details on the range of track speakers at the event can be found here.

The event, which is being held at Edinburgh’s International Conference Centre, provides attendees with the opportunity to participate in Forums and Work Groups to help develop the next generation of certifications and standards. Attendees will be able to learn from the experiences of their peers and gain valuable knowledge from other relevant industry experts, such as a detailed case study of how BAE Submarines introduced an Enterprise Architecture approach to transform their Business Operations.

Attendees will also be able to hear about the upcoming launch of the new IT4IT™ Reference Architecture v2.0 Standard and hear from Mary Jarrett, IT4IT Manager at Shell, on “Why Shell is Adopting an Open Standard for Managing IT”.

Additional key topics of discussion at the Edinburgh event include:

  • An architected approach to Business Transformation in the manufacturing sector
  • The Boundaryless Organization and Boundaryless Information Flow™, and their relevance to information technology executives today
  • The role of Enterprise Architecture in Business Transformation, especially transformations driven by emerging and disruptive technologies
  • Risk, Dependability & Trusted Technology: the Cybersecurity connection and securing the global supply chain.
  • Open Platform 3.0™ – Social, Mobile, Big Data & Analytics, Cloud and SOA – how organizations can achieve business objectives by adopting new technologies and processes as part of Business Transformation management principles
  • IT4IT™ – A new operating model to manage the business of IT, providing prescriptive guidance on how to design, procure and implement the functionality required to run IT
  • Enabling healthcare interoperability
  • Developing better interoperability and communication across organizational boundaries and pursuing global standards for Enterprise Architecture that are highly relevant to all industries

The Open Group will be hosting a networking event at Edinburgh Castle on Tuesday October 20, with an evening of traditional Scottish food and entertainment.

Registration for The Open Group Edinburgh is open now and available to members and non-members and can be found here.


Filed under Uncategorized

The Open Trusted Technology Provider™ Standard (O-TTPS) Approved as ISO/IEC International Standard

The Open Trusted Technology Provider™ Standard (O-TTPS), a Standard from The Open Group for Product Integrity and Supply Chain Security, Approved as ISO/IEC International Standard

Doing More to Secure IT Products and their Global Supply Chains

By Sally Long, The Open Group Trusted Technology Forum Director

As the Director of The Open Group Trusted Technology Forum, I am thrilled to share the news that The Open Trusted Technology Provider™ Standard – Mitigating Maliciously Tainted and Counterfeit Products (O-TTPS) v1.1 has been approved as an ISO/IEC International Standard (ISO/IEC 20243:2015).

It is one of the first standards aimed at assuring both the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

The standard defines a set of best practices for COTS ICT providers to use to mitigate the risk of maliciously tainted and counterfeit components from being incorporated into each phase of a product’s lifecycle. This encompasses design, sourcing, build, fulfilment, distribution, sustainment, and disposal. The best practices apply to in-house development, outsourced development and manufacturing, and to global supply chains.

The ISO/IEC standard will be published in the coming weeks. In advance of the ISO/IEC 20243 publication, The Open Group edition of the standard, technically identical to the ISO/IEC approved edition, is freely available here.

The standardization effort is the result of a collaboration in The Open Group Trusted Technology Provider Forum (OTTF) between government, third-party evaluators and some of the industry’s most mature and respected providers, who came together as members and, over a period of five years, shared and built on their practices for integrity and security, including those used in-house and those used with their own supply chains. From these, they created a set of best practices that were standardized through The Open Group consensus review process as the O-TTPS. The standard was then submitted to the ISO/IEC JTC1 process for Publicly Available Specifications (PAS), where it was recently approved.

The Open Group has also developed an O-TTPS Accreditation Program to recognize Open Trusted Technology Providers who conform to the standard and adhere to best practices across their entire enterprise, within a specific product line or business unit, or within an individual product. Accreditation is applicable to all ICT providers in the chain: OEMs, integrators, hardware and software component suppliers, value-add distributors, and resellers.

While The Open Group assumes the role of the Accreditation Authority over the entire program, it also uses third-party assessors to assess conformance to the O-TTPS requirements. The Accreditation Program and the Assessment Procedures are publicly available here. The Open Group is also considering submitting the O-TTPS Assessment Procedures to the ISO/IEC JTC1 PAS process.

This international approval comes none too soon, given that the global threat landscape continues to change dramatically and cyber attacks – which have long targeted governments and big business – are growing in sophistication and prominence. We saw this most clearly with the Sony hack late last year. Despite continued success with longstanding hacking methods, maliciously intentioned cyber criminals are looking for new ways to cause damage and increasingly view the technology supply chain as a potentially profitable avenue. In such a transitional environment, it is worth reviewing again why IT products and their supply chains are so vulnerable and what can be done to secure them in the face of numerous challenges.

Risk lies in complexity

Information Technology supply chains depend upon complex and interrelated networks of component suppliers across a wide range of global partners. Suppliers deliver parts to OEMs or component integrators, who build products from them and in turn offer those products to customers directly or to system integrators, who integrate them with products from multiple providers at a customer site. This complexity leaves ample opportunity for malicious components to enter the supply chain and leave vulnerabilities that can potentially be exploited.

As a result, organizations now need assurances that they are buying from trusted technology providers who follow best practices every step of the way. This means that they not only follow secure development and engineering practices in-house while developing their own software and hardware pieces, but also that they are following best practices to secure their supply chains. Modern cyber criminals go through strenuous efforts to identify any sort of vulnerability that can be exploited for malicious gain and the supply chain is no different.

Untracked malicious behavior and counterfeit components

Tainted products introduced into the supply chain pose significant risk to organizations because altered products introduce the possibility of untracked malicious behavior. A compromised electrical component or piece of software that lies dormant and undetected within an organization could cause tremendous damage if activated externally. Customers, including governments, are moving away from building their own high-assurance, customized systems and toward the use of COTS ICT, typically because it is better, cheaper and more reliable. But a maliciously tainted COTS ICT product, once connected or incorporated, poses a significant security threat. For example, it could allow unauthorized access to sensitive corporate data, including intellectual property, or allow hackers to take control of the organization’s network. Perhaps the most concerning element of the whole scenario is the amount of damage that such destructive hardware or software could inflict on safety- or mission-critical systems.

Like maliciously tainted components, counterfeit products can also cause significant damage to customers and providers, resulting in failed or inferior products, revenue and brand equity loss, and disclosure of intellectual property. Although fakes have plagued manufacturers and suppliers for many years, globalization has greatly increased the number of outsourced components and the number of links in every supply chain, and with that comes increased risk of tainted or counterfeit parts making it into operational environments. Consider the consequences if a faulty component were to fail in a government, financial or safety-critical system – or if it were also maliciously tainted for the sole purpose of causing widespread catastrophic damage.

Global solution for a global problem – the relevance of international standards

One of the emerging challenges is the rise of local demands on IT providers related to cybersecurity and IT supply chains. Despite technology supply chains being global in nature, more and more local solutions are cropping up to address some of the issues mentioned earlier, resulting in multiple countries with different policies that include disparate and variable requirements related to cybersecurity and supply chains. Some are competing local standards, but many are local solutions generated by governmental policies that dictate which countries to buy from and which not to. The supply chain has become a nationally charged issue that requires the creation of a level playing field regardless of where a company is based. Competition should be based on the quality, integrity and security of products and processes, not on where the products were developed, manufactured, or assembled.

Having transparent criteria through global international standards like the recently approved O-TTPS standard (ISO/IEC 20243), and objective assessments like the O-TTPS Accreditation Program that help assure conformance to those standards, is critical both to raise the bar on global suppliers and to provide equal opportunity (vendor-neutral and country-neutral) for all constituents in the chain to reach that bar – regardless of locale.

The approval by ISO/IEC of this universal product integrity and supply chain security standard is an important next step in the continued battle to secure ICT products and protect the environments in which they operate. Suppliers should explore what they need to do to conform to the standard and buyers should consider encouraging conformance by requesting conformance to it in their RFPs. By adhering to relevant international standards and demonstrating conformance we will have a powerful tool for technology providers and component suppliers around the world to utilize in combating current and future cyber attacks on our critical infrastructure, our governments, our business enterprises and even on the COTS ICT that we have in our homes. This is truly a universal problem that we can begin to solve through adoption and adherence to international standards.

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer-supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of the OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.

Contact: @sallyannlong


The Open Group Baltimore 2015 Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The Open Group Baltimore 2015, Enabling Boundaryless Information Flow™, July 20-23, was held at the beautiful Hyatt Regency Inner Harbor. Over 300 attendees from 16 countries, including China, Japan, Netherlands and Brazil, attended this agenda-packed event.

The event kicked off on July 20th with a warm Open Group welcome by Allen Brown, President and CEO of The Open Group. The first plenary speaker was Bruce McConnell, Senior VP, East West Institute, whose presentation “Global Cooperation in Cyberspace”, gave a behind-the-scenes look at global cybersecurity issues. Bruce focused on US – China cyber cooperation, major threats and what the US is doing about them.

Allen then welcomed Christopher Davis, Professor of Information Systems, University of South Florida, to The Open Group Governing Board as an Elected Customer Member Representative. Chris also serves as Chair of The Open Group IT4IT™ Forum.

The plenary continued with a joint presentation, “Can Cyber Insurance Be Linked to Assurance”, by Larry Clinton, President & CEO, Internet Security Alliance, and Dan Reddy, Adjunct Faculty, Quinsigamond Community College MA. The speakers emphasized that cybersecurity is not simply an IT issue. They stated that there are currently 15 billion mobile devices and there will be 50 billion within 5 years. Organizations and governments need to prepare for new vulnerabilities and the explosion of the Internet of Things (IoT).

The plenary culminated with a panel “US Government Initiatives for Securing the Global Supply Chain”. Panelists were Donald Davidson, Chief, Lifecycle Risk Management, DoD CIO for Cybersecurity, Angela Smith, Senior Technical Advisor, General Services Administration (GSA) and Matthew Scholl, Deputy Division Chief, NIST. The panel was moderated by Dave Lounsbury, CTO and VP, Services, The Open Group. They discussed the importance and benefits of ensuring product integrity of hardware, software and services being incorporated into government enterprise capabilities and critical infrastructure. Government and industry must look at supply chain, processes, best practices, standards and people.

All sessions concluded with Q&A moderated by Allen Brown and Jim Hietala, VP, Business Development and Security, The Open Group.

Afternoon tracks (11 presentations) consisted of various topics including Information & Data Architecture and EA & Business Transformation. The Risk, Dependability and Trusted Technology theme also continued. Jack Daniel, Strategist, Tenable Network Security shared “The Evolution of Vulnerability Management”. Michele Goetz, Principal Analyst at Forrester Research, presented “Harness the Composable Data Layer to Survive the Digital Tsunami”. This session was aimed at helping data professionals understand how Composable Data Layers set digital and the Internet of Things up for success.

The evening featured a Partner Pavilion and Networking Reception. The Open Group Forums and Partners hosted short presentations and demonstrations while guests enjoyed the reception. Areas of focus were Enterprise Architecture, Healthcare, Security, Future Airborne Capability Environment (FACE™), IT4IT™ and Open Platform 3.0™.

Exhibitors in attendance were Esteral Technologies, Wind River, RTI and SimVentions.

Partner Pavilion – The Open Group Open Platform 3.0™

On July 21, Allen Brown began the plenary with the great news that Huawei has become a Platinum Member of The Open Group. Huawei joins our other Platinum Members Capgemini, HP, IBM, Philips and Oracle.

Allen Brown, Trevor Cheung, Chris Forde

Trevor Cheung, VP Strategy & Architecture Practice, Huawei Global Services, will be joining The Open Group Governing Board. Trevor posed the question, “What can we do to combine The Open Group and IT aspects to create a customer experience transformation?” His presentation, entitled “The Value of Industry Standardization in Promoting ICT Innovation”, addressed the “ROADS Experience”. ROADS is an acronym for Real Time, On-Demand, All Online, DIY and Social, which need to be defined across all industries. Trevor also discussed bridging the gap and the importance of combining Customer Experience (customer needs, strategy, business needs) with Enterprise Architecture (business outcomes, strategies, systems, process innovation). EA plays a key role in digital transformation.

Allen then presented The Open Group Forum updates. He shared roadmaps which include schedules of snapshots, reviews, standards, and publications/white papers.

Allen also provided a sneak peek of results from our recent survey on TOGAF®, an Open Group standard. TOGAF® 9 is currently available in 15 different languages.

The next speaker was Jason Uppal, Chief Architect and CEO, iCareQuality, on “Enterprise Architecture Practice Beyond Models”. Jason emphasized that the goal is “Zero Patient Harm” and stressed the importance of Open CA Certification. He also noted that Enterprise Architects play many roles, and that those roles are always changing.

Joanne MacGregor, IT Trainer and Psychologist, Real IRM Solutions, gave a very interesting presentation entitled “You can Lead a Horse to Water… Managing the Human Aspects of Change in EA Implementations”. Joanne discussed managing, implementing, maintaining change and shared an in-depth analysis of the psychology of change.

“Outcome Driven Government and the Movement Towards Agility in Architecture” was presented by David Chesebrough, President, Association for Enterprise Information (AFEI). “IT Transformation reshapes business models, lean startups, web business challenges and even traditional organizations”, stated David.

Questions from attendees were addressed after each session.

In parallel with the plenary was the Healthcare Interoperability Day. Speakers from a wide range of Healthcare industry organizations, such as ONC, AMIA and Healthway shared their views and vision on how IT can improve the quality and efficiency of the Healthcare enterprise.

Before the plenary ended, Allen made another announcement: he is stepping down in April 2016 as President and CEO after more than 20 years with The Open Group, including the last 17 as CEO. After conducting a process to choose his successor, The Open Group Governing Board has selected Steve Nunn as his replacement, who will assume the role in November of this year. Steve is the current COO of The Open Group and CEO of the Association of Enterprise Architects. Please see press release here.

Steve Nunn, Allen Brown

Afternoon track topics were comprised of EA Practice & Professional Development and Open Platform 3.0™.

After a very informative and productive day of sessions, workshops and presentations, event guests were treated to a dinner aboard the USS Constellation, just a few minutes’ walk from the hotel. The USS Constellation, constructed in 1854, is a sloop-of-war, the second US Navy ship to carry the name, and is designated a National Historic Landmark.

USS Constellation

On Wednesday, July 22, tracks continued: TOGAF® 9 Case Studies and Standard, EA & Capability Training, Knowledge Architecture and IT4IT™ – Managing the Business of IT.

Thursday consisted of members-only meetings which are closed sessions.

A special “thank you” goes to our sponsors and exhibitors: Avolution, SNA Technologies, BiZZdesign, Van Haren Publishing, AFEI and AEA.

Check out all the Twitter conversation about the event – @theopengroup #ogBWI

Event proceedings for all members and event attendees can be found here.

Hope to see you at The Open Group Edinburgh 2015 October 19-22! Please register here.

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog, media relations and social media. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.
