Tag Archives: The Open Group

Mac OS X El Capitan Achieves UNIX® Certification

By The Open Group

The Open Group, an international vendor- and technology-neutral consortium, has announced that Apple, Inc. has achieved UNIX® certification for its latest operating system – Mac OS X version 10.11, known as “El Capitan.”

El Capitan was announced on September 29, 2015, following its registration as conforming to The Open Group UNIX® 03 standard on September 7, 2015.

The UNIX® trademark is owned and managed by The Open Group, with the trademark licensed exclusively to identify operating systems that have passed the tests confirming they conform to the Single UNIX Specification, a standard of The Open Group. UNIX certified operating systems are trusted for mission-critical applications because they are powerful and robust, have a small footprint, and are inherently more secure and more stable than the alternatives.

Mac OS X is the most widely used UNIX desktop operating system, and Apple’s installed base is now over 80 million users. Its commitment to the UNIX standard as a platform enables wide portability of applications between compliant and compatible operating systems.

Comments Off on Mac OS X El Capitan Achieves UNIX® Certification

Filed under Uncategorized

The Open Group ArchiMate® Model Exchange File Format and Archi 3.3

By Phil Beauvoir

Some of you might have noticed that Archi 3.3 has been released. This latest version of Archi includes a new plug-in which supports The Open Group ArchiMate Model Exchange File Format standard. This represents the fruits of months, even years, of labour! I’ve been collaborating with The Open Group, and representatives from associated parties and tool vendors, for some time now to produce a file format that can be used to exchange single ArchiMate models between conforming toolsets. Finally, version 1.0 of the standard has been released!

The file format uses XML, which is backed by a validating XSD Schema. Why is this? Wouldn’t XMI be better? Well, yes it would if we had a MOF representation of the ArchiMate standard. Currently, one doesn’t exist. Also, it’s very hard to agree exactly what should be formally represented in a persistence format, as against what can be usefully represented and exchanged using a persistence format. For example, ArchiMate symbols use colour to denote the different layers, and custom colour schemes can be employed to convey meaning. Clearly, this is not something that can be enforced in a specification. Probably the only things that can be enforced are the ArchiMate concepts and relations themselves. Views, viewpoints, and visual arrangements of those concepts and relations are, arguably, optional. A valid ArchiMate model could simply consist of a set of concepts and relations. However, this is probably not very useful in the real world, and so the exchange format seeks to provide a file format for describing and exchanging the most used aspects of ArchiMate models, optional aspects as well as mandatory aspects.

So, simply put, the aim of The Open Group ArchiMate Model Exchange File Format is to provide a pragmatic and useful mechanism for exchanging ArchiMate models and visual representations between compliant toolsets. It does not seek to create a definitive representation of an ArchiMate model. For that to happen, I believe many things would have to be formally declared in the ArchiMate specification. For this reason, many of the components in the exchange format are optional. For example, the ArchiMate 2.1 specification describes the use of attributes as a means to extend the language and provide additional properties to the concepts and relations. The specification does not rigidly mandate their use. However, many toolsets do support and encourage the use of attributes to create model profiles, for example. To support this, the exchange format provides a properties mechanism, consisting of typed key/value pairs. This allows implementers to (optionally) represent additional information for all of the concepts, relations and views.
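To make this concrete, here is a minimal sketch of what an exchange file using the properties mechanism might look like. It is illustrative only: the element and attribute names are my own shorthand for the general shape of the format, and the published XSD schema, not this snippet, is the authoritative reference.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Illustrative sketch only; see the published XSD schema for the definitive element names -->
    <model xmlns="http://www.opengroup.org/xsd/archimate"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           identifier="id-model-1">
      <name xml:lang="en">Example model</name>
      <!-- A property definition: a typed key that concepts, relations and views may reference -->
      <propertydefs>
        <propertydef identifier="propid-1" name="Owner" type="string"/>
      </propertydefs>
      <elements>
        <element identifier="id-actor-1" xsi:type="BusinessActor">
          <label xml:lang="en">Customer</label>
          <!-- An optional typed key/value pair attached to this concept -->
          <properties>
            <property identifierref="propid-1">
              <value xml:lang="en">Sales department</value>
            </property>
          </properties>
        </element>
        <element identifier="id-role-1" xsi:type="BusinessRole">
          <label xml:lang="en">Buyer</label>
        </element>
      </elements>
      <relationships>
        <relationship identifier="id-rel-1" xsi:type="AssignmentRelationship"
                      source="id-actor-1" target="id-role-1"/>
      </relationships>
    </model>

Because the format is backed by a validating schema, a toolset (or any off-the-shelf XML validator) can check a file like this against the XSD before attempting an import.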

Even though I have emphasised that the main use for the exchange format is exchange (the name is a bit of a giveaway here ;-)), another advantage of using XML/XSD for the file format is that it is possible to use XSLT to transform XML ArchiMate model instances into HTML documents, reports, input for a database, and so on. I would say that the potential for exploiting ArchiMate data in this way is huge.
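As a minimal sketch of that idea, the following XSLT stylesheet renders the elements of a model as a simple HTML list. It assumes the same illustrative element names and namespace as the snippet above, so treat it as a starting point rather than a ready-made report generator.

    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:a="http://www.opengroup.org/xsd/archimate"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        exclude-result-prefixes="a xsi">
      <xsl:output method="html" indent="yes"/>
      <!-- Turn a model instance into a simple HTML page listing its elements -->
      <xsl:template match="/">
        <html>
          <body>
            <h1><xsl:value-of select="a:model/a:name"/></h1>
            <ul>
              <xsl:for-each select="a:model/a:elements/a:element">
                <li>
                  <xsl:value-of select="a:label"/>
                  (<xsl:value-of select="@xsi:type"/>)
                </li>
              </xsl:for-each>
            </ul>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>

The same approach would work for generating reports or flat files for database loading: one stylesheet per target format, with the exchange file as the single common input.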

The exchange format could also help with learning the ArchiMate language and Enterprise Architecture – imagine a repository of ArchiMate models (tagged with Dublin Core metadata to facilitate search and description) that could be used as a resource pool of model patterns and examples for those new to the language. One thing that I personally would like to see is an extensive pool of example models and model snippets as examples of good modelling practice. And using the exchange format, these models and snippets can be loaded into any supporting toolset.

Here are my five “winning features” for the ArchiMate exchange file format:

  • Transparent
  • Simple
  • Well understood format
  • Pragmatic
  • Open

I’m sure that The Open Group ArchiMate Model Exchange File Format will contribute to, and encourage, the use of the ArchiMate modelling language, and perhaps reassure users that their valuable data is not locked into any one vendor’s proprietary tool format. I personally think that this is a great initiative and that we have achieved a great result. Of course, nothing is perfect and the exchange format is still at version 1.0, so user feedback is welcome. With greater uptake the format can be improved, and we may see it being exploited in ways that we have not yet thought of!

(For more information about the exchange format, see here.)

About The Open Group ArchiMate® Model Exchange File Format:

The Open Group ArchiMate® Model Exchange File Format Standard defines a file format that can be used to exchange data between systems that wish to import and export ArchiMate models. ArchiMate Exchange Files enable content to be exported from one ArchiMate modelling tool or repository and imported into another, while retaining information describing the model and how it is structured, such as a list of model elements and relationships. The standard focuses on the packaging and transport of ArchiMate models.

The standard is available for free download from:

http://www.opengroup.org/bookstore/catalog/C154.htm.

An online resource site is available at http://www.opengroup.org/xsd/archimate.

Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University, and, later, the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivery of standards-compliant learning objects and meta-data, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.

Comments Off on The Open Group ArchiMate® Model Exchange File Format and Archi 3.3

Filed under ArchiMate®, Standards, The Open Group

Congratulations to The Open Group Open Certified Architect (Open CA) on its 10th Anniversary!

By Cristina Woodbridge, Architect Profession Leader, IBM, retired

In New York City on July 18, 2005, The Open Group announced the IT Architect Certification (ITAC) Program in recognition of the need to formalize the definition of IT Architect, a critical new role in the IT industry. The certification program defines a common industry-wide set of skills, knowledge and experience as requirements for IT Architects, and a consistent, repeatable standard for a peer-based evaluation.

Why was this important? The practice of architecture in the IT industry has the objective of defining how various contributing business and IT elements should come together to produce an effective solution to a business problem. The IT Architect is responsible for defining the structures on which the solution will be developed. When we think of how IT solutions underlie core business throughout the world in every industry and business sector, we can understand the impact of architecture and the role of the IT Architect on the effectiveness and integrity of these systems. In 2015, this understanding may seem obvious, but it was not so in 2005.

How did the standard come about? At the request of industry, The Open Group Architecture Forum and the membership at large, The Open Group Governing Board approved the creation of a working group in 2004 to develop the IT Architect certification program. I remember when this new working group first came together to start our discussions. Representing different organizations, we were all a little reluctant initially to share our secret definition of the IT Architect role. However, as we discussed the skills and experience requirements, we quickly discovered that our definitions were not so secret but commonly shared by all of us. We all agreed IT Architects must have architectural breadth of experience in a wide range of technologies, techniques and tools. They must have a disciplined, method-based approach to solution development, and strong leadership and communication skills. This convergence in our definitions was a clear indication that an industry standard could be articulated and that it was needed. There were areas of difference in our discussion, but the core set of skills, knowledge and experience requirements, which are part of the certification program, was easy to agree upon. We also saw the need to define the professional responsibilities of IT Architects to foster their profession and mentor others. The outcome was the development of the ITAC certification conformance requirements and the certification process.

We unanimously agreed that the candidate’s certification needed to be reviewed by peers, as is the case in many other professions. Only certified IT Architects would be able to assess the documented experience. I have participated in hundreds of board reviews and consensus meetings as part of the Open CA direct certification boards, the IBM certification process and, by invitation, audits of other organizations’ certification boards. In all of these I have consistently heard the same probing questions looking for the architectural thinking and decision-making process that characterizes IT Architects. In the cases in which I was auditing certifications, I could often anticipate the issues (e.g., lack of architectural experience, whether an architectural method was applied, etc.) that would be discussed in the consensus reviews and that would impact the decision of the board. This independent review by peer certified IT Architects provides a repeatable, consistent method of validating that a candidate meets the certification criteria.

Since 2005, the ITAC program has expanded to provide three levels of certification, defining a clear professional development path from entry to senior level. The program was renamed The Open Group Certified Architect (Open CA) in 2011 to expand beyond IT Architecture.[1] Over 4,000 professionals from 180 companies in more than 60 countries worldwide have been certified under the program. The British Computer Society agrees that The Open Group Certified Architect (Open CA) certification meets criteria accepted towards Chartered IT Professional (CITP) status.[2] Foote Partners [3] lists The Open Group Certified Architect certification as driving premium pay by employers in the US and Canada. Having a consistent industry standard defining the role of an Architect is valuable to individuals in the profession. It helps them grow professionally within the industry and gain personal recognition. It is valuable to organizations as it provides an assurance of the capabilities of their Architects. It also establishes a common language and common approach to defining solutions across the industry.

Congratulations to The Open Group on the 10th anniversary of the Open CA certification program and for maturing the Architect profession to what it is today! Congratulations to the many Open Certified Architects who support the profession through mentoring and participating in the certification process! Congratulations to the Architects who have certified through this program!

The current Open Group Governing Board Work Group for Open CA consists of: Andras Szakal (IBM), Andrew Macaulay (Capgemini), Chris Greenslade (CLARS Ltd.), Cristina Woodbridge (independent), James de Raeve (The Open Group), Janet Mostow (Oracle), Paul Williams (Capgemini), Peter Beijer (Hewlett-Packard) and Roberto Rivera (Hewlett-Packard).

[1] The Open CA program presently includes certification of Enterprise Architects, Business Architects, and IT Architects.

[2] British Computer Society CITP Agreement on Open CA

[3] Foote Partners, LLC is an independent IT benchmark research and advisory firm targeting the ‘people’ side of managing technology

Cristina Woodbridge was the IBM Worldwide Architect Profession Leader from 2004 to 2015. She was responsible for the effective oversight and quality of the Architect profession deployed globally in IBM. Cristina is an Open Group Distinguished Certified Architect. She is an active member of the Open CA Working Group and also participates as a board member for The Open Group Direct Certification boards.

1 Comment

Filed under Certifications, Open CA, The Open Group

The Open Trusted Technology Provider™ Standard (O-TTPS) Approved as ISO/IEC International Standard

The Open Trusted Technology Provider™ Standard (O-TTPS), a Standard from The Open Group for Product Integrity and Supply Chain Security, Approved as ISO/IEC International Standard

Doing More to Secure IT Products and their Global Supply Chains

By Sally Long, The Open Group Trusted Technology Forum Director

As the Director of The Open Group Trusted Technology Forum, I am thrilled to share the news that The Open Trusted Technology Provider™ Standard – Mitigating Maliciously Tainted and Counterfeit Products (O-TTPS) v1.1 has been approved as an ISO/IEC International Standard (ISO/IEC 20243:2015).

It is one of the first standards aimed at assuring both the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

The standard defines a set of best practices for COTS ICT providers to use to mitigate the risk of maliciously tainted and counterfeit components being incorporated at each phase of a product’s lifecycle. This encompasses design, sourcing, build, fulfilment, distribution, sustainment, and disposal. The best practices apply to in-house development, outsourced development and manufacturing, and to global supply chains.

The ISO/IEC standard will be published in the coming weeks. In advance of the ISO/IEC 20243 publication, The Open Group edition of the standard, technically identical to the ISO/IEC approved edition, is freely available here.

The standardization effort is the result of a collaboration in The Open Group Trusted Technology Forum (OTTF) between government, third-party evaluators and some of industry’s most mature and respected providers, who came together as members and, over a period of five years, shared and built on their practices for integrity and security, including those used in-house and those used with their own supply chains. From these, they created a set of best practices that were standardized through The Open Group consensus review process as the O-TTPS. That was then submitted to the ISO/IEC JTC1 process for Publicly Available Specifications (PAS), where it was recently approved.

The Open Group has also developed an O-TTPS Accreditation Program to recognize Open Trusted Technology Providers who conform to the standard and adhere to best practices across their entire enterprise, within a specific product line or business unit, or within an individual product. Accreditation is applicable to all ICT providers in the chain: OEMs, integrators, hardware and software component suppliers, value-add distributors, and resellers.

While The Open Group assumes the role of the Accreditation Authority over the entire program, it also uses third-party assessors to assess conformance to the O-TTPS requirements. The Accreditation Program and the Assessment Procedures are publicly available here. The Open Group is also considering submitting the O-TTPS Assessment Procedures to the ISO/IEC JTC1 PAS process.

This international approval comes none too soon, given that the global threat landscape continues to change dramatically and cyber attacks – which have long targeted governments and big business – are growing in sophistication and prominence. We saw this most clearly with the Sony hack late last year. Despite successes using more longstanding hacking methods, maliciously intentioned cyber criminals are looking at new ways to cause damage and are increasingly looking at the technology supply chain as a potentially profitable avenue. In such a transitional environment, it is worth reviewing again why IT products and their supply chains are so vulnerable and what can be done to secure them in the face of numerous challenges.

Risk lies in complexity

Information Technology supply chains depend upon complex and interrelated networks of component suppliers across a wide range of global partners. Suppliers deliver parts to OEMs or component integrators, who build products from them and in turn offer products to customers directly or to system integrators, who integrate them with products from multiple providers at a customer site. This complexity leaves ample opportunity for malicious components to enter the supply chain and leave vulnerabilities that can potentially be exploited.

As a result, organizations now need assurances that they are buying from trusted technology providers who follow best practices every step of the way. This means that they not only follow secure development and engineering practices in-house while developing their own software and hardware pieces, but also that they are following best practices to secure their supply chains. Modern cyber criminals go through strenuous efforts to identify any sort of vulnerability that can be exploited for malicious gain and the supply chain is no different.

Untracked malicious behavior and counterfeit components

Tainted products introduced into the supply chain pose significant risk to organizations because altered products introduce the possibility of untracked malicious behavior. A compromised electrical component or piece of software that lies dormant and undetected within an organization could cause tremendous damage if activated externally. Customers, including governments, are moving away from building their own high-assurance, customized systems and toward the use of commercial off-the-shelf (COTS) information and communication technology (ICT), typically because COTS products are better, cheaper and more reliable. But a maliciously tainted COTS ICT product, once connected or incorporated, poses a significant security threat. For example, it could allow unauthorized access to sensitive corporate data including intellectual property, or allow hackers to take control of the organization’s network. Perhaps the most concerning element of the whole scenario is the amount of damage that such destructive hardware or software could inflict on safety- or mission-critical systems.

Like maliciously tainted components, counterfeit products can also cause significant damage to customers and providers, resulting in failed or inferior products, revenue and brand equity loss, and disclosure of intellectual property. Although fakes have plagued manufacturers and suppliers for many years, globalization has greatly increased the number of outsourced components and the number of links in every supply chain, and with that comes increased risk of tainted or counterfeit parts making it into operational environments. Consider the consequences if a faulty component were to fail in a government, financial or safety-critical system, or if it were also maliciously tainted for the sole purpose of causing widespread catastrophic damage.

Global solution for a global problem – the relevance of international standards

One of the emerging challenges is the rise of local demands on IT providers related to cybersecurity and IT supply chains. Despite technology supply chains being global in nature, more and more local solutions are cropping up to address some of the issues mentioned earlier, resulting in multiple countries with different policies that include disparate and variable requirements related to cybersecurity and their supply chains. Some are competing local standards, but many are local solutions generated by governmental policies that dictate which country to buy from and which not to. The supply chain has become a nationally charged issue that requires the creation of a level playing field regardless of where your company is based. Competition should be based on the quality, integrity and security of your products and processes, not on where the products were developed, manufactured, or assembled.

Having transparent criteria through global international standards like our recently approved O-TTPS standard (ISO/IEC 20243), and objective assessments like the O-TTPS Accreditation Program that help assure conformance to those standards, is critical both to raise the bar on global suppliers and to provide equal opportunity (vendor-neutral and country-neutral) for all constituents in the chain to reach that bar – regardless of locale.

The approval by ISO/IEC of this universal product integrity and supply chain security standard is an important next step in the continued battle to secure ICT products and protect the environments in which they operate. Suppliers should explore what they need to do to conform to the standard and buyers should consider encouraging conformance by requesting conformance to it in their RFPs. By adhering to relevant international standards and demonstrating conformance we will have a powerful tool for technology providers and component suppliers around the world to utilize in combating current and future cyber attacks on our critical infrastructure, our governments, our business enterprises and even on the COTS ICT that we have in our homes. This is truly a universal problem that we can begin to solve through adoption and adherence to international standards.

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer-supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of the OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.

Contact:  s.long@opengroup.org; @sallyannlong

Comments Off on The Open Trusted Technology Provider™ Standard (O-TTPS) Approved as ISO/IEC International Standard

Filed under Uncategorized

The Open Group Baltimore 2015 Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The Open Group Baltimore 2015, Enabling Boundaryless Information Flow™, July 20-23, was held at the beautiful Hyatt Regency Inner Harbor. Over 300 attendees from 16 countries, including China, Japan, the Netherlands and Brazil, attended this agenda-packed event.

The event kicked off on July 20th with a warm Open Group welcome by Allen Brown, President and CEO of The Open Group. The first plenary speaker was Bruce McConnell, Senior VP, East West Institute, whose presentation “Global Cooperation in Cyberspace”, gave a behind-the-scenes look at global cybersecurity issues. Bruce focused on US – China cyber cooperation, major threats and what the US is doing about them.

Allen then welcomed Christopher Davis, Professor of Information Systems, University of South Florida, to The Open Group Governing Board as an Elected Customer Member Representative. Chris also serves as Chair of The Open Group IT4IT™ Forum.

The plenary continued with a joint presentation, “Can Cyber Insurance Be Linked to Assurance?”, by Larry Clinton, President & CEO, Internet Security Alliance, and Dan Reddy, Adjunct Faculty, Quinsigamond Community College MA. The speakers emphasized that cybersecurity is not simply an IT issue. They stated there are currently 15 billion mobile devices and there will be 50 billion within 5 years. Organizations and governments need to prepare for new vulnerabilities and the explosion of the Internet of Things (IoT).

The plenary culminated with a panel “US Government Initiatives for Securing the Global Supply Chain”. Panelists were Donald Davidson, Chief, Lifecycle Risk Management, DoD CIO for Cybersecurity, Angela Smith, Senior Technical Advisor, General Services Administration (GSA) and Matthew Scholl, Deputy Division Chief, NIST. The panel was moderated by Dave Lounsbury, CTO and VP, Services, The Open Group. They discussed the importance and benefits of ensuring product integrity of hardware, software and services being incorporated into government enterprise capabilities and critical infrastructure. Government and industry must look at supply chain, processes, best practices, standards and people.

All sessions concluded with Q&A moderated by Allen Brown and Jim Hietala, VP, Business Development and Security, The Open Group.

Afternoon tracks (11 presentations) covered various topics, including Information & Data Architecture and EA & Business Transformation. The Risk, Dependability and Trusted Technology theme also continued. Jack Daniel, Strategist, Tenable Network Security, shared “The Evolution of Vulnerability Management”. Michele Goetz, Principal Analyst at Forrester Research, presented “Harness the Composable Data Layer to Survive the Digital Tsunami”, a session aimed at helping data professionals understand how composable data layers set digital and the Internet of Things up for success.

The evening featured a Partner Pavilion and Networking Reception. The Open Group Forums and Partners hosted short presentations and demonstrations while guests enjoyed the reception. Areas of focus were Enterprise Architecture, Healthcare, Security, Future Airborne Capability Environment (FACE™), IT4IT™ and Open Platform 3.0™.

Exhibitors in attendance were Esterel Technologies, Wind River, RTI and SimVentions.

Partner Pavilion – The Open Group Open Platform 3.0™

On July 21, Allen Brown began the plenary with the great news that Huawei has become a Platinum Member of The Open Group. Huawei joins our other Platinum Members Capgemini, HP, IBM, Philips and Oracle.

Allen Brown, Trevor Cheung, Chris Forde

Trevor Cheung, VP Strategy & Architecture Practice, Huawei Global Services, will be joining The Open Group Governing Board. Trevor posed the question, “What can we do to combine The Open Group and IT aspects to make a customer experience transformation?” His presentation, entitled “The Value of Industry Standardization in Promoting ICT Innovation”, addressed the “ROADS Experience”. ROADS is an acronym for Real Time, On-Demand, All Online, DIY, Social, which need to be defined across all industries. Trevor also discussed bridging the gap: the importance of combining Customer Experience (customer needs, strategy, business needs) and Enterprise Architecture (business outcome, strategies, systems, processes innovation). EA plays a key role in the digital transformation.

Allen then presented The Open Group Forum updates. He shared roadmaps which include schedules of snapshots, reviews, standards, and publications/white papers.

Allen also provided a sneak peek of results from our recent survey on TOGAF®, an Open Group standard. TOGAF® 9 is currently available in 15 different languages.

The next speaker was Jason Uppal, Chief Architect and CEO, iCareQuality, on “Enterprise Architecture Practice Beyond Models”. Jason emphasized that the goal is “Zero Patient Harm” and stressed the importance of Open CA Certification. He also noted that Enterprise Architects play many roles, and that those roles are always changing.

Joanne MacGregor, IT Trainer and Psychologist, Real IRM Solutions, gave a very interesting presentation entitled “You can Lead a Horse to Water… Managing the Human Aspects of Change in EA Implementations”. Joanne discussed managing, implementing, maintaining change and shared an in-depth analysis of the psychology of change.

“Outcome Driven Government and the Movement Towards Agility in Architecture” was presented by David Chesebrough, President, Association for Enterprise Information (AFEI). “IT Transformation reshapes business models, lean startups, web business challenges and even traditional organizations”, stated David.

Questions from attendees were addressed after each session.

In parallel with the plenary was the Healthcare Interoperability Day. Speakers from a wide range of Healthcare industry organizations, such as ONC, AMIA and Healthway shared their views and vision on how IT can improve the quality and efficiency of the Healthcare enterprise.

Before the plenary ended, Allen made another announcement. Allen is stepping down in April 2016 as President and CEO after more than 20 years with The Open Group, including the last 17 as CEO. After conducting a process to choose his successor, The Open Group Governing Board has selected Steve Nunn as his replacement, who will assume the role with effect from November of this year. Steve is the current COO of The Open Group and CEO of the Association of Enterprise Architects. Please see the press release here.

Steve Nunn, Allen Brown

Afternoon track topics were comprised of EA Practice & Professional Development and Open Platform 3.0™.

After a very informative and productive day of sessions, workshops and presentations, event guests were treated to a dinner aboard the USS Constellation, just a few minutes’ walk from the hotel. The USS Constellation, constructed in 1854, is a sloop-of-war, the second US Navy ship to carry the name, and is designated a National Historic Landmark.

USS Constellation

On Wednesday, July 22, tracks continued: TOGAF® 9 Case Studies and Standard, EA & Capability Training, Knowledge Architecture and IT4IT™ – Managing the Business of IT.

Thursday consisted of members-only meetings which are closed sessions.

A special “thank you” goes to our sponsors and exhibitors: Avolution, SNA Technologies, BiZZdesign, Van Haren Publishing, AFEI and AEA.

Check out all the Twitter conversation about the event – @theopengroup #ogBWI

Event proceedings for all members and event attendees can be found here.

Hope to see you at The Open Group Edinburgh 2015 October 19-22! Please register here.

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog, media relations and social media. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

Comments Off on The Open Group Baltimore 2015 Highlights

Filed under Accreditations, Boundaryless Information Flow™, Cybersecurity, Enterprise Architecture, Enterprise Transformation, Healthcare, Internet of Things, Interoperability, Open CA, Open Platform 3.0, Security, Security Architecture, The Open Group Baltimore 2015, TOGAF®

A Tale of Two IT Departments, or How Governance is Essential in the Hybrid Cloud and Bimodal IT Era

Transcript of an Open Group discussion/podcast on the role of Cloud Governance and Enterprise Architecture and how they work together in the era of increasingly fragmented IT.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Sponsor: The Open Group

Dana Gardner: Hello, and welcome to a special Thought Leadership Panel Discussion, coming to you in conjunction with The Open Group’s upcoming conference on July 20, 2015 in Baltimore.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator as we examine the role that Cloud Governance and Enterprise Architecture play in an era of increasingly fragmented IT.

Not only are IT organizations dealing with so-called shadow IT and myriad proof-of-concept affairs, there is now a strong rationale for fostering what Gartner calls Bimodal IT. There’s a strong case to be made for exploiting the strengths of several different flavors of IT, except that — at the same time — businesses are asking IT in total to be faster, better, and cheaper.

The topic before us today is how to allow for the benefits of Bimodal IT or even Multimodal IT, but without IT fragmentation leading to a fractured and even broken business.

Here to update us on the work of The Open Group Cloud Governance initiatives and working groups, and to further explore the ways that companies can better manage and thrive with hybrid IT, are our guests. We’re here today with Dr. Chris Harding, Director for Interoperability and Director of the Cloud Computing Forum at The Open Group. Welcome, Chris.

Dr. Chris Harding: Thank you, Dana. It’s great to be here.

Gardner: We’re also here with David Janson, Executive IT Architect and Business Solutions Professional with the IBM Industry Solutions Team for Central and Eastern Europe and a leading contributor to The Open Group Cloud Governance Project. Welcome, David.

David Janson: Thank you. Glad to be here.

Gardner: Lastly, we’re here with Nadhan, HP Distinguished Technologist and Cloud Advisor and Co-Chairman of The Open Group Cloud Governance Project. Welcome, Nadhan.

Nadhan: Thank you, Dana. It’s a pleasure to be here.

IT trends

Gardner: Before we get into an update on The Open Group Cloud Governance Initiatives, in many ways over the past decades IT has always been somewhat fragmented. Very few companies have been able to keep all their IT oars rowing in the same direction, if you will. But today things seem to be changing so rapidly that we seem to acknowledge that some degree of disparate IT methods are necessary. We might even think of old IT and new IT, and this may even be desirable.

But what are the trends that are driving this need for a Multimodal IT? What’s accelerating the need for different types of IT, and how can we think about retaining a common governance, and even a frameworks-driven enterprise architecture umbrella, over these IT elements?

Nadhan: Basically, the change that we’re going through is really driven by the business. Business today has much more rapid access to the services that IT has traditionally provided. Business has a need to react to its own customers in a much more agile manner than it was traditionally used to.

We now have to react to demands where we’re talking days and weeks instead of months and years. Businesses today have a choice. Business units are no longer dependent on the traditional IT to avail themselves of the services provided. Instead, they can go out and use the services that are available external to the enterprise.

To a great extent, the advent of social media has also resulted in direct customer feedback on the sentiment from the external customer that businesses need to react to. That is actually changing the timelines. It is requiring IT to be delivered at the pace of business. And the very definition of IT is undergoing a change, where we need to have the right paradigm, the right technology, and the right solution for the right business function and therefore the right application.

Since the choices have increased with the new style of IT, the manner in which you pair them up, the solutions with the problems, has also significantly changed. With more choices come more such pairings of which solution is right for which problem. That’s really what has caused the change that we’re going through.

A change of this magnitude requires governance that builds on the traditional governance that was always in play, and requires elements like cloud to have governance that is more specific to solutions in the cloud, across the whole lifecycle of cloud solution deployment.

Gardner: David, do you agree that this seems to be a natural evolution, based on business requirements, that we basically spin out different types of IT within the same organization to address some of these issues around agility? Or is this perhaps a bad thing, something that’s unnatural and should be avoided?

Janson: In many ways, this follows a repeating pattern we’ve seen with other kinds of transformations in business and IT. Not to diminish the specifics about what we’re looking at today, but I think there are some repeating patterns here.

There are new disruptive events that compete with the status quo. Those things that have been optimized, proven, and settled into sort of a consistent groove can compete with each other. Excitement about the new value that can be produced by new approaches generates momentum, and so far this actually sounds like a healthy state of vitality.

Good governance

However, one of the challenges is that the excitement potentially can lead to overlooking other important factors, and that’s where I think good governance practices can help.

For example, governance helps remind people about important durable principles that should be guiding their decisions, important considerations that we don’t want to forget or under-appreciate as we roll through stages of change and transformation.

At the same time, governance practices need to evolve so that they can adapt to new things that fit into the governance framework. What are those things, and how do we govern them? So governance needs to evolve at the same time.

There is a pattern here with some specific things that are new today, but there is a repeating pattern as well, something we can learn from.

Gardner: Chris Harding, is there a built-in capability with cloud governance that anticipates some of these issues around different styles or flavors or even velocity of IT innovation that can then allow for that innovation and experimentation, but then keep it all under the same umbrella with a common management and visibility?

Harding: There are a number of forces at play here, and there are three separate trends that we’ve seen, or at least that I have observed, in discussions with members within The Open Group that relate to this.

The first is one that Nadhan mentioned, the possibility of outsourcing IT. I remember a member’s meeting a few years ago, when one of our members who worked for a company that was starting a cloud brokerage activity happened to mention that two major clients were going to do away with their IT departments completely and just go for cloud brokerage. You could see the jaws drop around the table, particularly with the representatives who were from company corporate IT departments.

Of course, cloud brokers haven’t taken over from corporate IT, but there has been that trend towards things moving out of the enterprise to bring in IT services from elsewhere.

That’s all very well to do that, but from a governance perspective, you may have an easy life if you outsource all of your IT to a broker somewhere, but if you fail to comply with regulations, the broker won’t go to jail; you will go to jail.

So you need to make sure that you retain control at the governance level over what is happening from the point of view of compliance. You probably also want to make sure that your architecture principles are followed and retain governance control to enable that to happen. That’s the first trend and the governance implication of it.

In response to that, a second trend that we see is that IT departments have reacted often by becoming quite like brokers themselves — providing services, maybe providing hybrid cloud services or private cloud services within the enterprise, or maybe sourcing cloud services from outside. So that’s a way that IT has moved in the past and maybe still is moving.

Third trend

The third trend that we’re seeing in some cases is that multi-discipline teams within line of business divisions, including both business people and technical people, address the business problems. This is the way that some companies are addressing the need to be on top of the technology in order to innovate at a business level. That is an interesting and, I think, a very healthy development.

So maybe, yes, we are seeing a bimodal splitting in IT between the traditional IT and the more flexible and agile IT, but maybe you could say that that second part belongs really in the line of business departments, rather than in the IT departments. That’s at least how I see it.

Nadhan: I’d like to build on a point that David made earlier about repeating patterns. I can relate to that very well within The Open Group, speaking about the Cloud Governance Project. Truth be told, as we continue to evolve the content in cloud governance, some of the seeding content actually came from the SOA Governance Project that The Open Group worked on a few years back. So the point David made about the repeating patterns resonates very well with that particular case in mind.

Gardner: So we’ve been through this before. When there is change and disruption, sometimes it’s required for a new version of methodologies and best practices to emerge, perhaps even associated with specific technologies. Then, over time, we see that folded back in to IT in general, or maybe it’s pushed back out into the business, as Chris alluded to.

My question, though, is how we make sure that these don’t become disruptive and negative influences over time. Maybe governance and enterprise architecture principles can prevent that. So is there something about the cloud governance, which I think really anticipates a hybrid model, particularly a cloud hybrid model, that would be germane and appropriate for a hybrid IT environment?

David Janson, is there a cloud governance benefit in managing hybrid IT?

Janson: There most definitely is. I tend to think that hybrid IT is probably where we’re headed. I don’t think this is avoidable. My editorial comment upon that is that’s an unavoidable direction we’re going in. Part of the reason I say that is I think there’s a repeating pattern here of new approaches, new ways of doing things, coming into the picture.

And then a balancing act goes on, where people look at more traditional ways versus the new approaches people are talking about, and eventually they look at the strengths and weaknesses of both.

There’s going to be some disruption, but that’s not necessarily bad. That’s how we drive change and transformation. What we’re really talking about is making sure the amount of disruption is not so counterproductive that it actually moves things backward instead of forward.

I don’t mind a little bit of disruption. The governance processes that we’re talking about, good governance practices, have an overall life cycle that things move through. Applying governance as you work through that life cycle means that, at each point, you’re looking at the particular decision points and actions that are going to happen and making sure that those decisions and actions are well-informed.

We sometimes say that governance helps us do the right things right. So governance helps people know what the right things are, and then the right way to do those things.

Bimodal IT

Also, we can measure how well people are actually adapting to those “right things” to do. What’s “right” can vary over time, because we have disruptive change; what we’re talking about with Bimodal IT is one example.

Within a narrower time frame in the process lifecycle, there are points that have particular decisions and actions attached to them. Governance makes sure that people rolling through them stay well informed about important things they shouldn’t forget. It’s very easy to forget key things and optimize for only one factor, and governance helps people remember that.

Also, governance checks whether we’re getting the benefits that people expected: coming back around and looking afterward to see whether we accomplished what we thought we would, or got off in the wrong direction. So it’s a bit like a steering or feedback mechanism, in that it helps keep the car on the road rather than drifting onto the soft shoulder. Did we overlook something important? Governance is key to making this all successful.

Gardner: Let’s return to The Open Group’s upcoming conference on July 20 in Baltimore and also learn a bit more about what the Cloud Governance Project has been up to. I think that will help us better understand how cloud governance relates to these hybrid IT issues that we’ve been discussing.

Nadhan, you are the co-chairman of the Cloud Governance Project. Tell us about what to expect in Baltimore with the concepts of Boundaryless Information Flow™, and then also perhaps an update on what the Cloud Governance Project has been up to.

Nadhan: Absolutely, Dana. When the Cloud Governance Project started, the first question we challenged ourselves with was, what is it and why do we need it, especially given that SOA governance, architecture governance, IT governance, and enterprise governance in general are all out there with frameworks? We actually detailed out the landscape with different standards and then identified the niche or the domain that cloud governance addresses.

After that, we went through and identified the top five principles that matter for cloud governance to be done right. Some of the obvious ones being that cloud is a business decision, and the governance exercise should keep in mind whether it is the right business decision to go to the cloud rather than just jumping on the bandwagon. Those are just some examples of the foundational principles that drive how cloud governance must be established and exercised.

Subsequent to that, we have a lifecycle for cloud governance defined and then we have gone through the process of detailing it out by identifying and decoupling the governance process and the process that is actually governed.

So there is this concept of process pairs that we have going, where we’ve identified key processes and key process pairs, whether it be planning, architecture, reusing a cloud service, subscribing to it, unsubscribing, retiring, and so on. These are some of the defining milestones in the lifecycle.

We’ve actually put together a template for identifying and detailing these process pairs, and the template has an outline of the process that is being governed, the key phases that the governance goes through, the desirable business outcomes that we would expect because of the cloud governance, as well as the associated metrics and the key roles.

Real-life solution

The Cloud Governance Framework is actually detailing each one. Where we are right now is looking at a real-life solution. The hypothetical could be an actual business scenario, but the idea is to help the reader digest the concepts outlined in the context of a scenario where such governance is exercised. That’s where we are on the Cloud Governance Project.

Let me take the opportunity to invite everyone to be part of the project to continue it by subscribing to the right mailing list for cloud governance within The Open Group.

Gardner: Thank you. Chris Harding, just for the benefit of our readers and listeners who might not be that familiar with The Open Group, perhaps you could give us a very quick overview of The Open Group — its mission, its charter, what we could expect at the Baltimore conference, and why people should get involved, either directly by attending, or following it on social media or the other avenues that The Open Group provides on its website?

Harding: Thank you, Dana. The Open Group is a vendor-neutral consortium whose vision is Boundaryless Information Flow. That is to say the idea that information should be available to people within an enterprise, or indeed within an ecosystem of enterprises, as and when needed, not locked away into silos.

We hold main conferences, quarterly conferences, four times a year and also regional conferences in various parts of the world in between those, and we discuss a variety of topics.

In fact, the main topics for the conference that we will be holding in July in Baltimore are enterprise architecture and risk and security. Architecture and security are two of the key things for which The Open Group is known; Enterprise Architecture, particularly with its TOGAF® Framework, is perhaps what The Open Group is best known for.

We’ve been active in a number of other areas, and risk and security is one. We also have started a new vertical activity on healthcare, and there will be a track on that at the Baltimore conference.

There will be tracks on other topics too, including four sessions on Open Platform 3.0™. Open Platform 3.0 is The Open Group initiative to address how enterprises can gain value from new technologies, including cloud computing, social computing, mobile computing, big data analysis, and the Internet of Things.

We’ll have a number of presentations related to that. These will include, in fact, a perspective on cloud governance, although that will not necessarily reflect what is happening in the Cloud Governance Project. Until an Open Group standard is published, there is no official Open Group position on the topic, and members will present their views at conferences. So we’re including a presentation on that.

Lifecycle governance

There is also a presentation on another interesting governance topic, which is on Information Lifecycle Governance. We have a panel session on the business context for Open Platform 3.0 and a number of other presentations on particular topics, for example, relating to the new technologies that Open Platform 3.0 will help enterprises to use.

There’s always a lot going on at Open Group conferences, and that’s a brief flavor of what will happen at this one.

Gardner: Thank you. And I’d just add that there is more available at The Open Group website, opengroup.org.

Going to one thing you mentioned about a standard and publishing that standard — and I’ll throw this out to any of our guests today — is there a roadmap that we could look to in order to anticipate the next steps or milestones in the Cloud Governance Project? When would such a standard emerge and when might we expect it?

Nadhan: As I said earlier, the next step is to identify the business scenario and apply it. I’m expecting, with the right level of participation, that it will take another quarter, after which it would go through the internal review with The Open Group and the company reviews for the publication of the standard. Assuming we have that in another quarter, Chris, could you please weigh in on what it usually takes, on average, for those reviews before it gets published?

Harding: You could add on another quarter. It shouldn’t actually take that long, but we do have a thorough review process. All members of The Open Group are invited to participate. The document is posted for comment for, I would think, four weeks, after which we review the comments and decide what action actually needs to be taken.

Certainly, it could take only two months to complete the overall publication of the standard from the draft being completed, but it’s safer to say about a quarter.

Gardner: So a real important working document could be available in the second half of 2015. Let’s now go back to why a cloud governance document and approach is important when we consider the implications of Bimodal or Multimodal IT.

One of the things that Gartner says is that Bimodal IT projects require new project management styles. They didn’t say project management products. They didn’t say downloads or services from a cloud provider. We’re talking about styles.

So it seems to me that, in order to prevent the good aspects of Bimodal IT from being overridden by the negative impacts of chaos and lack of coordination, we’re talking not about a product or a download, but about something that a working group and a standards approach like the Cloud Governance Project can accommodate.

David, why is it that you can’t buy this in a box or download it as a product? What is it that we need to look at in terms of governance across Bimodal IT and why is that appropriate for a style? Maybe the IT people need to think differently about accomplishing this through technology alone?

First question

Janson: When I think of anything like a tool or a piece of software, the first question I tend to have is what is that helping me do, because the tool itself generally is not the be-all and end-all of this. What process is this going to help me carry out?

So, before I would think about tools, I want to step back and think about what are the changes to project-related processes that new approaches require. Then secondly, think about how can tools help me speed up, automate, or make those a little bit more reliable?

It’s easy to think of a tool that may have some process-related aspects embedded in it as some kind of magic wand that’s going to automatically make everything work well, but it’s the processes that the tool enables that are really the important decision. The tools simply help to carry those processes out more effectively, more reliably, and more consistently.

We’ve always seen an evolution in the processes we use in developing solutions, as well as in the tools; technology requires tools to adapt. As the processes we use get more agile, we want to be more incremental and see rapid turnarounds in how we’re developing things. Tools need to evolve with that.

But I’d really start out from a governance standpoint, thinking about challenging the idea that if we’re going to make a change, how do we know that it’s really an appropriate one and asking some questions about how we differentiate this change from just reinventing the wheel. Is this an innovation that really makes a difference and isn’t just change for the sake of change?

Governance helps people challenge their thinking and make sure that it’s actually a worthwhile step to take to make those adaptations in project-related processes.

Once you’ve settled on some decisions about evolving those processes, then we’ll start looking for tools that help you automate, accelerate, and make consistent and more reliable what those processes are.

I tend to start with the process and think of the technology second, rather than the other way around. That’s where governance can help: reminding people of principles we want to think about. Are you putting the cart before the horse? It helps people challenge their thinking a little bit to be sure they’re really going in the right direction.

Gardner: Of course, a lot of what you just mentioned pertains to enterprise architecture generally as well.

Nadhan, when we think about Bimodal or Multimodal IT, this to me is going to be very variable from company to company, given their legacy, given their existing style, the rate of adoption of cloud or other software as a service (SaaS), agile, or DevOps types of methods. So this isn’t something that’s going to be a cookie-cutter. It really needs to be looked at company by company and timeline by timeline.

Is this a vehicle for professional services, for management consulting, more than for IT and product? What is the relationship between cloud governance, Bimodal IT, and professional services?

Delineating systems

Nadhan: It’s a great question, Dana. Let me characterize Bimodal IT slightly differently before answering it. Another way to look at Bimodal IT, where we are today, is by delineating systems of record and systems of engagement.

Traditional IT typically deals with the systems of record, while the systems of engagement, with social media and so on, are the live interactions. Those are the continuously evolving, growing-by-the-second systems of engagement, and they result in the need for big data, security, and definitely the cloud and so on.

The coexistence of both of these paradigms requires the right move to the cloud for the right reason. So even though they are systems of record, some, if not most, do need to be transformed to the cloud; but that doesn’t mean all systems of engagement eventually get transformed to the cloud either.

There are good reasons why you may actually want to leave certain systems of engagement the way they are. The art really is in combining the historical data that the systems of record have with the continual influx of data that we get through the live channels of social media, and then, using the right level of predictive analytics to get information.

I said a lot there just to characterize Bimodal IT slightly differently, making the point that what is really at play, Dana, is a new style of thinking. It’s a new style of addressing problems that have been around for a while.

It’s a new way to address the same problems: new solutions, and a new way of coming up with the solution models that will address the business problems at hand. That requires an external perspective. It requires service providers and consulting professionals who have worked with multiple customers, perhaps other customers in the same industry as well as in other industries, and who bring a healthy dose of innovation.

That’s where there is a new opportunity for professional services to work with the CxOs, the enterprise architects, and the CIOs to make the right business decisions with the right level of governance.

Because of the challenges with the coexistence of both systems of record and systems of engagement and harvesting the right information to make the right business decision, there is a significant opportunity for consulting services to be provided to enterprises today.

Drilling down

Gardner: Before we close off I wanted to just drill down on one thing, Nadhan, that you brought up, which is that ability to measure and know and then analyze and compare.

One of the things that we’ve seen with IT developing over the past several years as well is that the big data capabilities have been applied to all the information coming out of IT systems so that we can develop a steady state and understand those systems of record, how they are performing, and compare and contrast in ways that we couldn’t have before.

So, for our last topic today, David Janson, how important is that measuring capability in a governance context for organizations that want to pursue Bimodal IT but keep it governed and keep it from spinning out of control? What should they be thinking about putting in place in terms of big data, analytics, measurement, and visibility apparatus and capabilities?

Janson: That’s a really good question. One aspect of this is that, when I talk with people about the ideas around governance, it’s not unusual that their first idea of governance is the compliance or policing role it can play. That sounds like interference, sand in the gears, but it really should be the other way around.

A governance framework should actually make it very clear how people should be doing things, what’s expected as the result at the end, and how things are checked and measured over time, at early stages and later stages, so that people are very clear about how things are carried out and what they’re expected to do. So, if someone does use a governance-compliance process to see whether things are working right, there are no surprises and no slowdowns. People actually know how to move through that quickly.

Good governance has communicated that well enough, so that people should actually move faster rather than slower. In other words, there should be no surprises.

Measuring things is very important, because if you haven’t established the objectives you’re after and some metrics to help you determine whether you’re meeting them, then governance is kind of an empty suit, so to speak. You express some ideas you want to achieve, but you have no way of answering the question of how we know whether this is doing what we want it to do. Metrics are very important around this.

We capture metrics within processes. Then, for the end result, is it actually producing the effects people want? That’s pretty important.

One of the things that we have built into the Cloud Governance Framework is some idea about what are the outcomes and the metrics that each of these process pairs should have in mind. It helps to answer the question, how do we know? How do we know if something is doing what we expect? That’s very, very essential.
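
To make that concrete, here is a minimal sketch, in Python, of how such process-pair outcomes and metrics might be recorded and checked. The pair, outcome, and metric names are hypothetical; they are not taken from the Cloud Governance Framework itself.

```python
# A minimal, hypothetical sketch of recording outcomes and metrics for
# governance "process pairs." The pair, outcome, and metric names are
# invented; they are not from the Cloud Governance Framework.
from dataclasses import dataclass, field


@dataclass
class ProcessPair:
    name: str                     # e.g. a governing/governed process pair
    expected_outcome: str         # what "working" looks like
    metrics: dict[str, float] = field(default_factory=dict)  # metric -> target

    def evaluate(self, observed: dict[str, float]) -> dict[str, bool]:
        """Compare observed measurements against each metric's target."""
        return {metric: observed.get(metric, 0.0) >= target
                for metric, target in self.metrics.items()}


pair = ProcessPair(
    name="approve-change / implement-change",
    expected_outcome="changes reach production without rework",
    metrics={"first-pass approval rate": 0.90,
             "changes with metrics defined": 0.95},
)
print(pair.evaluate({"first-pass approval rate": 0.93,
                     "changes with metrics defined": 0.88}))
# {'first-pass approval rate': True, 'changes with metrics defined': False}
```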

Gardner: I’m afraid we’ll have to leave it there. We’ve been examining the role of cloud governance and enterprise architecture and how they work together in the era of increasingly fragmented IT. And we’ve seen how The Open Group Cloud Governance initiatives and working groups can help deliver the benefits of Bimodal IT without IT fragmentation leading to fractured or broken business processes around technology and innovation.

This special Thought Leadership Panel Discussion comes to you in conjunction with The Open Group’s upcoming conference on July 20, 2015 in Baltimore. And it’s not too late to register on The Open Group’s website or to follow the proceedings online and via social media such as Twitter, LinkedIn and Facebook.

So, thank you to our guests today. We’ve been joined by Dr. Chris Harding, Director for Interoperability and Cloud Computing Forum Director at The Open Group; David Janson, Executive IT Architect and Business Solutions Professional with the IBM Industry Solutions Team for Central and Eastern Europe and a leading contributor to The Open Group Cloud Governance Project; and Nadhan, HP Distinguished Technologist and Cloud Advisor and Co-Chairman of The Open Group Cloud Governance Project.

And a big thank you, too, to our audience for joining this special Open Group-sponsored discussion. This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this thought leadership panel discussion series. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android.

Sponsor: The Open Group

Transcript of an Open Group discussion/podcast on the role of Cloud Governance and Enterprise Architecture and how they work together in the era of increasingly fragmented IT. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2015. All rights reserved.

Join the conversation! @theopengroup #ogchat #ogBWI


Comments Off on A Tale of Two IT Departments, or How Governance is Essential in the Hybrid Cloud and Bimodal IT Era

Filed under Accreditations, Boundaryless Information Flow™, Cloud, Cloud Governance, Interoperability, IoT, The Open Group Baltimore 2015

Securing Business Operations and Critical Infrastructure: Trusted Technology, Procurement Paradigms, Cyber Insurance

Following is the transcript of an Open Group discussion on ways to address supply chain risk in the information technology sector marketplace.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Sponsor: The Open Group

Dana Gardner: Hello, and welcome to a special Thought Leadership Panel Discussion, coming to you in conjunction with The Open Group’s upcoming conference on July 20, 2015 in Baltimore.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator as we explore ways to address supply chain risk in the information technology sector market.

We’ll specifically examine how The Open Group Trusted Technology Forum (OTTF) standards and accreditation activities are enhancing the security of global supply chains and improving the integrity of openly available IT products and components.

We’ll also learn how the age-old practice of insurance is coming to bear on the problem of IT supply-chain risk, and how, by leveraging insurance models, the specter of supply-chain disruption and security breaches may be significantly reduced.

To update us on the work of the OTTF and explain the workings and benefits of supply-chain insurance, we’re joined by our panel of experts. Please join me in welcoming Sally Long, Director of The Open Group Trusted Technology Forum. Welcome, Sally.

Sally Long: Thank you.

Gardner: We’re also here with Andras Szakal, Vice President and Chief Technology Officer for IBM U.S. Federal and Chairman of The Open Group Trusted Technology Forum. Welcome back, Andras.

Andras Szakal: Thank you for having me.

Gardner: And Bob Dix joins us. He is Vice President of Global Government Affairs and Public Policy for Juniper Networks and is a member of The Open Group Trusted Technology Forum. Welcome, Bob.

Bob Dix: Thank you for the invitation. Glad to be here.

Gardner: Lastly, we are joined by Dan Reddy, Supply Chain Assurance Specialist, college instructor and Lead of The Open Group Trusted Technology Forum Global Outreach and Standards Harmonization Work Group. Thanks for being with us, Dan.

Dan Reddy: Glad to be here, Dana.

Gardner: Sally, let’s start with you. Why don’t we just get a quick update on The Open Group Trusted Technology Forum (OTTF) and the supply-chain accreditation process generally? What has been going on?

O-TTPS standard

Long: For some of you who might not have heard of the O-TTPS, which is the standard, it’s called The Open Trusted Technology Provider™ Standard. The effort started with an initiative in 2009, a roundtable discussion between the U.S. government and several information and communication technology (ICT) vendors, on how to identify trustworthy commercial off-the-shelf (COTS) ICT, basically driven by the fact that governments were moving away from high-assurance customized solutions and more and more using COTS ICT.

That ad-hoc group formed the OTTF under The Open Group and proceeded to deliver a standard and an accreditation program.

The standard provides a set of best practices to be used throughout the COTS ICT product life cycle, during in-house development as well as outsourced development and manufacturing. That includes the best practices to use for security in the supply chain, encompassing all phases from design to disposal.

Just to bring you up to speed on some of our milestones: we released version 1.0 of the standard in 2013, launched our accreditation program to help assure conformance to the standard in February 2014, and then in July released version 1.1 of the standard. We have now submitted that version to ISO for approval as a publicly available specification (PAS), which is a fast track for ISO.

The PAS is a process for adopting standards developed in other standards development organizations (SDOs), and the O-TTPS has passed the draft ISO ballot. Now, it’s coming up for final ballot.

That should bring folks up to speed, Dana, and let them know where we are today.

Gardner: Is there anything in particular at The Open Group Conference in Baltimore, coming up in July, that pertains to these activities? Is this something that’s going to be more than just discussed? Is there something of a milestone nature here too?

Long: Monday, July 20, is the Cyber Security Day of the Baltimore Conference. We’re going to be meeting in the plenary with many of the U.S. government officials from NIST, GSA, and the Department of Homeland Security. So there is going to be a big plenary discussion on cyber security and supply chain.

We’ll also be meeting separately as a member forum, but the whole open track on Monday will be devoted to cyber security and supply chain security.

The one milestone that might coincide is that we’re publishing the Chinese translation of version 1.1 of the standard, and we might be announcing that then. I think that’s about it, Dana.

OTTF background

Gardner: Andras, for the benefit of our listeners and readers who might be new to this concept, perhaps you could fill us in on the background on the types of problems that OTTF and the initiatives and standards are designed to solve. What’s the problem that we need to address here?

Szakal: That’s a great question. Over the last 5 to 10 years, we realized that traditional supply-chain management and supply-chain integrity practices, where we were ensuring the integrity of the delivery of a product to the end customer, ensuring that it wasn’t tampered with, and effectively managing our suppliers to ensure they provided us with quality components, had really expanded. That expansion is a result of the adoption and pervasive growth of technology in all aspects of manufacturing, especially as IT has expanded into the Internet of Things, critical infrastructure and mobile technologies, and now, obviously, cloud and big data.

And as we manufacture those IT products, we have to recognize that we’re now in a global environment, where the manufacturing and sourcing of components occurs worldwide. In some cases, some of these components are even open source or freely available. We’re concerned, obviously, about the lineage of those components, but also about the practices by which these products are manufactured from a secure-engineering perspective, as well as about supply-chain integrity and supply-chain security practices.

What we’ve recognized here is that the traditional life cycle of supply-chain security and integrity has expanded to include everything from the design aspects of the product, through sustainment and managing that product over a period of time, from cradle to grave, to the disposal of the product, to ensure that hardware-based components don’t end up recycled in a way that poses a threat to our customers.

Gardner: So it’s as much a lifecycle as it is a procurement issue.

Szakal: Absolutely. When you talk about procurement, you’re talking about lifecycle and about mitigating risks to those two different aspects from sourcing and from manufacturing.

So from the customers’ perspective, they need to consider how they apply techniques to ensure that they’re sourcing from authorized channels, and that they’re applying the same secure-engineering techniques we use when they do the integration of their IT infrastructure.

But from a development perspective, it’s ensuring that we’re applying secure engineering techniques, that we have a well-defined baseline for our life cycle, and that we’re controlling our assets effectively. We understand who our partners are and we’re able to score them and ensure that we’re tracking their integrity and that we’re applying new techniques around secure engineering, like threat analysis and risk analysis to the supply chain.

We’re understanding the current risk landscape and applying techniques like vulnerability analysis and runtime protection techniques that would allow us to mitigate many of these risks as we build out our products and manufacture them.

It goes all the way through sustainment. As most people now recognize, your product is no longer a shrink-wrapped product that you get, install, and live with for a year or two before you update it. It’s constantly being updated. So ensuring that the integrity and delivery of those updates is consistent with the principles we’re trying to espouse is also really important.
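
As a rough illustration of the partner scoring Szakal mentions, the sketch below tracks a few weighted integrity signals per supplier. The signal names and weights are invented for illustration; the OTTF does not define this particular scheme.

```python
# Hypothetical sketch of supplier scoring: a few weighted integrity
# signals per partner. Signal names and weights are invented for
# illustration only, not an OTTF-defined scheme.

SIGNAL_WEIGHTS = {
    "sources_from_authorized_channels": 0.4,
    "secure_engineering_baseline":      0.3,
    "vulnerability_response_history":   0.3,
}

def supplier_score(signals: dict[str, float]) -> float:
    """Weighted score in [0, 1]; each signal is rated 0.0-1.0."""
    return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0)
               for name in SIGNAL_WEIGHTS)

partner = {"sources_from_authorized_channels": 1.0,
           "secure_engineering_baseline": 0.8,
           "vulnerability_response_history": 0.5}
print(round(supplier_score(partner), 2))  # 0.4 + 0.24 + 0.15 = 0.79
```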

Collaborative effort

Gardner: And to that point, no product stands alone. It’s really the result of a collaborative effort, a very complex set of systems coming together. Not only are standards necessary, but cooperation among all the players in that ecosystem becomes necessary.

Dan Reddy, how have we done in terms of getting mutual assurance across a supply chain in which all the participants are willing to take part? It seems to me that, if there is a weak link, everyone would benefit by shoring it up. So how do we go beyond the standards? How are we getting cooperation and getting all the parties interested in contributing and being part of this?

Reddy: First of all, it’s an evolutionary process, and we’re still in the early days of fully communicating what the best practices are, what the standards are, and getting people to understand how that relates to their place in the supply chain.

Certainly, the supplier community would benefit by following some common practices so they don’t wind up answering customized survey questions from all of their customers.

That’s what’s happening today. It’s pretty much a one-off situation, where each customer says, “I need to protect my supply chain. Let me go find out what all of my suppliers are doing.” The real benefit here is to have the common language of the requirements in our standard and a way to measure it.

So there should be an incentive for the suppliers to take a look at that and say, “I’m tired of answering these individual survey questions. Maybe if I just document my best practices, I can avoid some of the effort that goes along with that individual approach.”

Everyone needs to understand that value proposition across the supply chain. Part of what we’re trying to do with the Baltimore conference is to talk to some thought leaders and continue to get the word out about the value proposition here.

Gardner: Bob Dix, the government in the U.S., and of course governments across the globe, are major purchasers of technology and also have a great stake in security and low risk. What’s been driving some of the government activities? Of course, they’re also interested in using off-the-shelf technology and cutting costs. So what’s the role that governments can play in driving some of these activities around the OTTF?

Risk management

Dix: This issue of supply chain assurance and cyber security is all about risk management, and it’s a shared responsibility. For too long I think that the government has had a tendency to want to point a finger at the private sector as not sufficiently attending to this matter.

The fact is, Dana, that many in the private sector make substantial investments in their product integrity program, as Andras was talking about, from product conception, to delivery, to disposal. What’s really important is that when that investment is made and when companies apply the standard the OTTF has put forward, it’s incumbent upon the government to do their part in purchasing from authorized and trusted sources.

In today’s world, we still have a culture that’s pervasive across the government acquisition community, where decision-making on procurements is often driven by cost and schedule, and product authenticity, assurance, and security are not necessarily part of that equation. It’s driven in many cases by budgets and other considerations, but nonetheless, we must change that culture to include authenticity and assurance as part of the decision-making process.

The result of focusing on cost and schedule is that those acquisitions are often made from untrusted and unauthorized sources, which raises the risk of acquiring counterfeit, tainted, or even malicious equipment.

Part of the work of the OTTF is to show all stakeholders, in industry and government alike, that there is a uniform process, as Sally and Dan have said, that can be applied in an environment to raise the bar of authenticity, security, and assurance and to improve on that risk-management approach.

Gardner: Sally, we’ve talked about where you’re standing in terms of some progress in your development around these standards and activities. We’ve heard about the challenges and the need for improvement.

Before we talk about this really interesting concept of insurance that would come to bear on perhaps encouraging standardization and giving people more ways to reduce their risk and adhere to best practices, what do you expect to see in a few years? If things go well and if this is adopted widely and embraced in true good practices, what’s the result? What do we expect to see as an improvement?

What I’m trying to get at here is that if there’s a really interesting golden nugget to shoot for, a golden ring to grab, what is it that we can accomplish by doing this well?

Powerful impact

Long: The most important and significant aspect of the accreditation program is its holistic nature, and how powerful an impact it could have if it’s widely adopted.

The idea of an accreditation program is that a provider gets accredited for conforming to the best practices. A provider that gets accredited could be an integrator, an OEM, a component supplier of the hardware and software that go into the OEM’s products, or a value-add reseller or distributor.

Every important constituent in that supply chain could be accredited. So not only is it important from a business perspective for governments and commercial customers to look at the Accreditation Registry and see who has been accredited among the integrators or OEMs they want to work with, but it’s also important and beneficial for OEMs to be able to look at that register and say, “These component suppliers are accredited, so I’ll work with them as business partners.” It’s the same for value-add resellers and distributors.

It builds in these real business-market incentives to make the concept work, and in the end, of course, the ultimate goal of having a more secure supply chain and more products with integrity will be achieved.
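
As a simple illustration of that registry-driven screening, here is a hypothetical sketch of how a procurement team might check candidates against a published register. The register format and company names are invented; this is not the actual O-TTPS registry schema.

```python
# Hypothetical sketch: screening suppliers against a published
# accreditation register. The register layout and the organization
# names are illustrative, not the actual O-TTPS registry schema.

accreditation_register = {
    # organization -> (role, accreditation status)
    "Acme Components": ("component supplier", "accredited"),
    "Globex Integration": ("integrator", "accredited"),
    "Initech Resale": ("value-add reseller", "expired"),
}

def screen_supplier(name: str) -> bool:
    """Return True if the supplier holds a current accreditation."""
    entry = accreditation_register.get(name)
    return entry is not None and entry[1] == "accredited"

for candidate in ["Acme Components", "Initech Resale", "Unknown Corp"]:
    status = "OK to shortlist" if screen_supplier(candidate) else "needs review"
    print(f"{candidate}: {status}")
```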

To me, that is one of the most important aspects we can reach for, especially if we reach out internationally. What we’re starting to see internationally is that localized requirements are cropping up in different countries. That’s going to mean vendors have to meet those differing requirements, increasing their costs, and sometimes those requirements will even end up becoming trade barriers.

Back to what Dan and Bob were saying, we need to look at this global standard and accreditation program that already exists. It’s not in development; we’ve been working on it for five years with consensus from many, many of the major players in the industry and government. So urging global adoption of what already exists and what could work holistically is really an important objective for our next couple of years.

Gardner: It certainly sounds like a win-win-win if everyone can participate, have visibility, and be designated as having followed through on those principles. But as you know and as you mentioned, it’s the marketplace: economics often drives business behavior. So in addition to a standards process and the definitions being available, what is it about this notion of insurance that might be a parallel market force that would help encourage better practices and ultimately move more companies in this direction?

Let’s start with Dan. Explain to me how cyber insurance, as it pertains to the supply chain, would work?

Early stages

Reddy: It’s an interesting question. The cyber insurance industry is still in its early stages, even though it goes back to the ’70s, when crime insurance started applying to outsiders gaining physical access to computer systems. You didn’t really see the advent of hacker insurance policies until the late ’90s. Then, starting in 2000, some of the first forms of cyber insurance covering first and third parties started to appear.

What we’re seeing today is primarily related to the breaches we hear about in the paper every day, where some organization has been compromised and sensitive information, like credit card data, is exposed for thousands of customers. The remediation is geared toward the companies that have to pay the claim and sign people up for identity protection. It’s pretty cut and dried. That’s the wave the insurance industry is riding right now.

What I see is that as attacks get to be more sophisticated and potentially include attacks on the supply chain, it’s going to represent a whole new area for cyber insurance. Having consistent ways to address supplier-related risk, as well as the other infrastructure related risks that go beyond simple data breach, is going to be where the marketplace has to make an adjustment. Standardization is critical there.

Gardner: Andras, how does this work in conjunction with the OTTF? Would insurance companies begin their risk assessment by making sure that participants in the supply chain are already adhering to your standards and seeking accreditation? Then, maybe they would set premiums that reflect the diligence companies extend into their supply chains. Maybe you could explain, not just the insurance, but how it would work in conjunction with the OTTF, to their mutual benefit.

Szakal: You made a really great point earlier about the economic element that would drive compliance. For us in IBM, the economic element is the ability to prove that we’re providing the right assurance that is being specified in the requests for proposals (RFPs), not only in the federal sector, but outside the federal sector in critical infrastructure and finance. We continue to win those opportunities, and that’s driven our compliance, as well as the government policy aspect worldwide.

But from an insurance point of view, insurance comes in two forms. I buy policy insurance in a case where there are risks that are out of my control, and I apply protective measures that are under my control. So in the case of the supply chain, the OTTF is a set of practices that help you gain control and lower the risk of threat in the manufacturing process.

The question is, do you buy a policy, and what’s the balance between a cyber threat that is in your control and those aspects of supply-chain security that are out of your control? This is with the understanding that you do not have an infinite amount of resources or revenue to allocate to both of these aspects.

There’s going to have to be a balance, and it really is going to be case by case, for customers and manufacturers, between covering the potential loss of intellectual property (IP) with insurance and applying controls. Resources are better applied where you actually have control, versus policies that protect you against things that are out of your control.

For example, you might buy a policy when providing code that contains high-value IP to a third party to manufacture a component. You have to share that information with the third-party supplier so it can actually manufacture the component as part of the overarching product, with the realization that if the third party is somehow hacked or intruded upon and that IP is stolen, you have lost a significant amount of value. That would be an area where insurance is applicable.

What’s working

Gardner: Bob Dix, if insurance comes to bear in conjunction with standards like what the OTTF is developing in supply chain assurance, it seems to me that the insurance providers themselves would be in a position of gathering information for their actuarial decisions and could be a clearing house for what’s working and what isn’t working.

It would be in their best interest to then share that back into the marketplace in order to reduce the risk. That’s a market-driven, data-driven approach that could benefit everyone. Do you see the advent of insurance as a benefit or accelerant to improvement here?

Dix: It’s a tool. This is a conversation that’s been going on in the community for quite some time: the lack of actuarial data for catastrophic losses produced by cyber events is impacting rate setting and premium setting by insurance companies, and that continues to be a challenge.

But from an incentive standpoint, it’s just like your home. If you have an alarm system, if you have a fence, if you take other kinds of protective measures, your homeowner’s or liability insurance may get a reduction in premium for the actions you have taken.

As an incentive, the opportunity to have an insurance policy to either transfer or buy down risk can be driven by the type of controls that you have in your environment. The standard that the OTTF has put forward provides guidance about how best to accomplish that. So, there is an opportunity to leverage, as an incentive, the reduction in premiums for insurance to transfer or buy down risk.
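
A minimal sketch of the premium buy-down Dix describes appears below, with entirely made-up numbers; no insurer’s actual rating model is implied, and the control names are hypothetical.

```python
# Hypothetical sketch of a premium buy-down: an insurer discounts the
# base cyber premium for each verified control the insured has in place.
# Discount values and control names are made up for illustration.

BASE_PREMIUM = 100_000.00  # annual premium in dollars, hypothetical

CONTROL_DISCOUNTS = {
    "o_ttps_accredited_suppliers": 0.10,  # sourcing from accredited providers
    "secure_engineering_baseline": 0.05,
    "incident_response_plan": 0.05,
}

def adjusted_premium(controls_in_place: set[str]) -> float:
    """Apply each applicable discount multiplicatively to the base premium."""
    premium = BASE_PREMIUM
    for control, discount in CONTROL_DISCOUNTS.items():
        if control in controls_in_place:
            premium *= (1 - discount)
    return round(premium, 2)

print(adjusted_premium({"o_ttps_accredited_suppliers", "incident_response_plan"}))
# 100,000 * 0.90 * 0.95 = 85,500.00
```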

Gardner: It’s interesting, Sally, that the insurance industry could benefit from OTTF, and by having more insurance available in the marketplace, it could encourage more participation and make the standard even more applicable and valuable. So it’s interesting to see over time how that plays out.

Any thoughts or comments on the relationship between what you are doing at OTTF and The Open Group and what the private insurance industry is moving toward?

Long: I agree with what everyone has said. It’s an up-and-coming field, and there is a lot more focus on it. I hear at every conference I go to, there is a lot more research on cyber security insurance. There is a place for the O-TTPS in terms of buying down risk, as Bob was mentioning.

The other thing that’s interesting is the NIST Cybersecurity Framework. That whole paradigm started out with the idea that there would be incentives for those who followed the NIST Cybersecurity Framework. That incentive piece became very hard to pull together, and it still is; to my knowledge, there are no incentives yet associated with it. But insurance was one of the ideas they talked about for incentivizing adopters of the CSF.

The other thing, which I think will come out of one of the presentations that Dan and Larry Clinton will be giving at our Baltimore conference, is that insurers are looking for simplicity. They don’t want to go into a client’s environment and have the client prove that they’re doing everything required of them or fill out a long checklist.

That’s why, in terms of simplicity, asking for O-TTPS-accredited providers, or lowering rates on that basis, would be a very simple approach, though again it’s not here yet. As Bob said, it’s been talked about a lot for a long time, but I think it is coming to the fore.

Market of interest

Gardner: Dan Reddy, back to you. When there is a large addressable market of interest in a product or service, a commercial means to satisfy it often arises. How can enterprises, the people consuming these products, encourage acceptance of these standards, perhaps push for a stronger insurance capability in the marketplace, or get involved with some of the standards and practices we have been talking about?

If you’re a publicly traded company, you would want to reduce your exposure and be able to claim accreditation and insurance as well. Let’s look at this from the perspective of the enterprise. What should and could they be doing to improve on this?

Reddy: I want to link back to what Sally said about the NIST Cyber Security Framework. What’s been very useful in publishing the Framework is that it gives enterprises a way to talk about their overall operational risk in a consistent fashion.

I was at one of the workshops sponsored by NIST where enterprises that had adopted it talked about what they were doing internally in their own enterprises in changing their practices, improving their security, and using the language of the framework to address that.

Yet, when they talked about one aspect of their risk, their supplier risk, they were trying to send the NIST Cybersecurity Framework risk questions to their suppliers, and those questions aren’t really sufficient. They’re interesting. You care about the enterprise of your supplier, but you really care about the products of your supplier.

So one of the things that the OTTF did is look at the requirements in our standard related to suppliers and link them specifically to the same operational areas that were included in the NIST Cybersecurity Framework.

This gives an enterprise that is looking at risk and trying to do standard things a way to use the language of the requirements in our standard, and the accreditation program as a form of measurement, to see how that aspect of supplier risk would be addressed.
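
As a toy sketch of the kind of linkage Reddy describes, the snippet below maps supplier-facing practice areas to the five NIST Cybersecurity Framework core functions. The practice names and the mapping itself are invented for illustration; they are not the actual O-TTPS-to-CSF mapping.

```python
# Hypothetical sketch of linking supplier-facing practice areas to NIST
# Cybersecurity Framework core functions. Practice names and the mapping
# itself are invented for illustration only.

CSF_FUNCTIONS = {"Identify", "Protect", "Detect", "Respond", "Recover"}

PRACTICE_TO_CSF = {
    "secure engineering baseline":  {"Identify", "Protect"},
    "counterfeit mitigation":       {"Protect", "Detect"},
    "malware detection in builds":  {"Detect"},
    "vulnerability remediation":    {"Respond", "Recover"},
}

def csf_coverage(practices_met: list[str]) -> set[str]:
    """Collect the CSF functions touched by the supplier practices met."""
    covered: set[str] = set()
    for practice in practices_met:
        covered |= PRACTICE_TO_CSF.get(practice, set())
    return covered

met = ["counterfeit mitigation", "vulnerability remediation"]
print(sorted(csf_coverage(met)))                  # ['Detect', 'Protect', 'Recover', 'Respond']
print(sorted(CSF_FUNCTIONS - csf_coverage(met)))  # remaining gap: ['Identify']
```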

But remember, cyber insurance is more than just the risk of suppliers. It’s the risk at the enterprise level. But the attacks are going to change over time, and we’ll go beyond the simple breaches. That’s where the added complexity will be needed.

Gardner: Andras, any suggestions for how enterprises, suppliers, vendors, systems integrators, and now, of course, the cloud services providers, should get involved? Where can they go for more information? What can they do to become part of the solution on this?

International forum

Szakal: Well, they can always become a member of the Trusted Technology Forum, where we have an international forum.

Gardner: I thought you might say that.

Szakal: That’s an obvious one, right? But there are a couple of places where you can go to learn more about this challenge.

One is certainly our website. Download the framework, which is a compendium of best practices that we gathered through a lot of hard work, sharing in an open, penalty-free environment all of the best practices the major vendors employ to mitigate the risks of counterfeit and maliciously tainted products, as well as other supply-chain risks. Understanding the standard is a good start.

Then, look at how you might measure your current practices against the standard, using the accreditation criteria we have established.

Another place would be NIST. I believe it’s Special Publication 800-161 that is the current pending standard for protecting supply-chain security. There are also several really good reports that the Defense Science Board and other organizations have produced within the federal government space. There are plenty of materials out there and a lot of discussion about the challenges.

But I think the only place where you really find solutions, or at least one of the only places I have seen, is in the OTTF, where they are embedded in the standard as a set of practices that are very practical to implement.

Gardner: Sally, the same question to you. Where can people go to get involved? What should they perhaps do to get started?

Long: I’d reiterate what Andras said. I’d also point them toward the accreditation website, which is www.opengroup.org/accreditation/o-ttps. On that site you can see the policy, the standard, and the supporting documents. We publicize our assessment procedures, so you get a good idea of what the assessment process will entail.

The program is based on evidence of conformance as well as a warranty from the applicant. So the assessment procedures being public will allow any organizations thinking about getting accredited to know exactly what they need to do.

As always, we would appreciate any new members, because we’ll be evolving the standard and the accreditation program, and it is done by consensus. So if you want a say in that, whether our standard needs to be stronger, weaker, broader, etc., join the forum and help us evolve it.

Impact on business

Gardner: Dan Reddy, when we think about managing these issues, the burden often falls on the shoulders of IT and its security apparatus, perhaps the Chief Information Security Officer. But it seems that the impact on business is growing. So should other people in the enterprise be thinking about this? I’m thinking about procurement, or the governance, risk, and compliance folks. Who else, beyond IT and its security apparatus, should be involved in mitigating the risks around IT supply-chain activity?

Reddy: You’re right that the old model of everything falls on IT is expanding, and now you see issues of enterprise risk and supply chain risk making it up to the boards of directors, who are asking tough questions. That’s one reason why boards look at cyber insurance as a way to mitigate some of the risk that they can’t control.

They’re asking tough questions all the way around, and I think acquisition people do need to understand what are the right questions to ask of technology providers.

To me, this comes back to scalability. This one-off approach of everyone asking questions of each of their vendors just isn’t going to make it. The advantage that we have here is that we have a consistent standard, built by consensus, freely available, and it’s measurable.

There are a lot of other good documents that talk about supply chain risk and secure engineering, but you can’t get a third-party assessment in a straightforward method, and I think that’s going to be appealing over time.

Gardner: Bob Dix, last word to you. What do you see happening in the area of government affairs and public policy around these issues? What should we hope for or expect from different governments in creating an atmosphere that improves risk across supply chain?

Dix: A couple things have to happen, Dana. First, we have got to quit blaming victims when we have breaches and compromises and start looking at solutions. The government has a tendency in the United States and in other countries around the world, to look at legislating and trying to pass regulatory measures that impose requirements on industry without a full understanding of what industry is already doing.

In this particular example, the government has had a tendency to take an approach that excludes vendors from being able to participate in federal procurement activities based on a risk level that they determine.

The really great thing about the work of the OTTF and the standard being produced is that it allows a different way to look at the problem: look instead at those that are accredited as having met the standard and as being able to provide a higher level of assurance of authenticity and security around the products and services they deliver. I think that’s a much more productive approach.

Working together

And from a public-policy standpoint, this example of the great work being done by industry and government, working together globally to deliver the standard, gives governments a basis on which to think about it a little differently.

Instead of just focusing on whom they want to exclude, let’s look at who is actually delivering the value and meeting the requirements to be a trusted provider. That’s a different approach, one that we’re very proud of in terms of the work of The Open Group, and we will continue to work on it going forward.

Gardner: Excellent. I’m afraid we’ll have to leave it there. We’ve been exploring ways to address supply-chain risk in the information technology sector marketplace, and we’ve seen how The Open Group Trusted Technology Forum standards and accreditation activities are enhancing the security of global supply chains and improving the integrity of openly available IT products and components. And we’ve also learned how the age-old practice of insurance is coming to bear on the problem of IT supply-chain risk.

This special BriefingsDirect Thought Leadership Panel Discussion comes to you in conjunction with The Open Group’s upcoming conference on July 20, 2015 in Baltimore. It’s not too late to register on The Open Group’s website or to follow the proceedings online and via Twitter and other social media during the week of the presentation.

So a big thank you to our guests. We’ve been joined today by Sally Long, Director of The Open Group Trusted Technology Forum. Thanks so much, Sally.

Long: Thank you, Dana.

Gardner: And a big thank you to Andras Szakal, Vice President and Chief Technology Officer for IBM U.S. Federal and Chairman of The Open Group Trusted Technology Forum. Thank you, Andras.

Szakal: Thank you very much for having us and come join the TTF. We can use all the help we can get.

Gardner: Great. A big thank you too to Bob Dix, Vice President of Global Government Affairs & Public Policy for Juniper Networks and a member of The Open Group Trusted Technology Forum. Thanks, Bob.

Dix: Appreciate the invitation. I look forward to joining you again.

Gardner: And lastly, thank you to Dan Reddy, Supply Chain Assurance Specialist, college instructor and Lead of The Open Group Trusted Technology Forum Global Outreach and Standards Harmonization Work Group. I appreciate your input, Dan.

Reddy: Glad to be here.

Gardner: And lastly, a big thank you to our audience for joining us at the special Open Group sponsored Thought Leadership Panel Discussion.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for these Open Group discussions associated with the Baltimore conference. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Sponsor: The Open Group

Join the conversation @theopengroup #ogchat #ogBWI

Transcript of a Briefings Direct discussion on ways to address supply chain risk in the information technology sector marketplace. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2015. All rights reserved.


1 Comment

Filed under Cybersecurity, OTTF, Supply chain risk, The Open Group Baltimore 2015