Tag Archives: standards process

The Open Group San Francisco Day One Highlights

By The Open Group

The Open Group kicked off its first event of 2017 on a sunny Monday morning, January 30, in the City by the Bay, with over 200 attendees from 20 countries including Australia, Finland, Germany and Singapore.

The Open Group CEO and President Steve Nunn began the day’s proceedings with a warm welcome and the announcement of the latest version of the Open Trusted Technology Provider™ Standard (O-TTPS), a standard that specifies best practices for providers to help them mitigate the risk of tainted or counterfeit products or parts getting into the IT supply chain. A new certification program for the standard was also announced, as well as the news that the standard has recently been ratified by ISO. Nunn also announced the availability of the next version of The Open Group IT4IT™ standard, version 2.1.

Monday’s plenary focused on IT4IT and Managing the Business of IT. Bernard Golden, CEO of Navica, spoke on the topic “Cloud Computing and Business Expectations: How the Cloud Changes Everything.” Golden, who was named one of the 10 most influential people in cloud computing by Wired magazine, began with a brief overview of the state of the computing industry today, which is largely characterized by the enormous growth of cloud computing. Golden believes that the public cloud will be the future of IT. With the speed the cloud enables today, IT and app development have become both the bottleneck and the differentiator for IT departments. To address these bottlenecks, IT must take a multi-pronged, continuous approach that uses a combination of cloud, Agile and DevOps to address business drivers. The challenge for IT shops today, Golden says, is also to decide where to focus and which cloud services they need to build applications. To help determine what works, IT must ask whether services are above or below what he calls “the value line,” which delineates whether the available services, which are often open source, will ultimately advance the company’s goals despite being low cost. IT must also be aware that the value line can present a lock-in challenge, creating tension between the availability of affordable—but potentially buggy—open-source tools and services and the ongoing value the business needs. Ultimately, Golden says, the cloud has changed everything—and IT must be willing to change with it and weigh the trade-offs between openness and potential lock-in.

Forrester Research analysts David Wheable, Vice President and Principal Consultant, and David Cannon, Vice President and Group Director, took the stage following Golden’s session to discuss “The Changing Role of IT: Strategy in the Age of the Customer.” Wheable spoke first, noting that technology has enabled a new “age of the customer,” an era where customers now have the majority of the power in the business/customer relationship.  As such, companies must now adapt to how their customers want to interact with their businesses and how customers use a company’s business applications (particularly via mobile devices) in order to survive and prevent customers from constantly changing their loyalties. Because IT strategists will not be able to predict how customers will use their applications, they must be able to put themselves in a position where they can quickly adapt to what is happening.

Cannon discussed what IT departments need to consider when it comes to strategy. To develop a viable IT strategy today, companies must consider what is valuable to the customer and how they will choose the technologies and applications that provide customers what they need. In the current IT landscape, features and quality no longer matter—instead, IT must take into account customers’ emotions, desires and immediate needs. Continuous exploitation of digital assets to deliver customer outcomes will be critical for both digital and business strategies—which Cannon argues are now essentially the same thing—moving forward. To survive in this new era, IT departments must also be able to enable customer outcomes, measure the customer experience, manage a portfolio of services, showcase business—not just technical—expertise and continue to enable service architectures that will deliver what customers need and want.

After the morning coffee break, Author and Researcher Gene Kim followed to discuss his recent book, The DevOps Handbook. His session, entitled “The Rise of Architecture: Top Lessons Learned while Researching and Writing The DevOps Handbook,” explored the example of high performers in the tech sector and how the emergence of DevOps has influenced them. According to Kim, most IT departments are subject to a downward spiral over time due to the exponential growth of technical assets and debt, which ultimately weighs them down and affects performance. In contrast, according to Kim’s research, high-performing organizations have been able to avoid this spiral by using DevOps. Organizations utilizing DevOps are nearly three times more agile than their peers, are more reliable and are twice as likely to exceed profitability, market share and productivity goals in the marketplace. The ability to deploy small changes more frequently has been a game changer for these high-performing organizations, not only allowing them to move faster but also creating more humane working conditions and happier, more productive workers. Kim also found that fear of doing deployments is the most accurate predictor of success in organizations—those that fear deployments have less success than those that don’t.

Gene Kim

The final session of the morning plenary was presented by Charles Betz, IT Strategist, Advisor and Author from Armstrong Process Group. Betz provided an overview of how the IT4IT framework can be used within organizations to streamline IT processes, particularly by automating tasks that no longer need to be done by hand. Standardizing IT processes also provides a way to deliver more consistent results across the entire IT value chain for better business results. Taking an iterative, team-oriented approach is also essential for managing the body of knowledge necessary for changing IT processes and creating digital transformation.

During the lunch hour, conference partners Hewlett Packard Enterprise and Simplilearn each gave separate presentations for attendees, discussing the use of IT4IT for digital transformation and skills acquisition in the digital economy, respectively.

Monday afternoon, The Open Group hosted its fourth User Group meeting for TOGAF®, an Open Group standard, in addition to the afternoon speaking tracks. The User Group meeting consisted of an Oxford-style debate on the pros and cons of “Create versus Reuse Architecture,” featuring Jason Uppal, Open CA Level 3 Certified Architect, QRS, and Peter Haviland, Managing Director, Head of Engineering & Architecture, Moody’s Corporation. In addition to the debate, User Group attendees had the opportunity to share use cases and stories with each other and to discuss improvements to TOGAF that would benefit them in their work.

The afternoon sessions consisted of five separate tracks:

  • IT4IT in Practice – Rob Akershoek from Logicalis/Shell Information Technology International moderated a panel of experts from the morning plenary as well as sessions related to presenting IT4IT to executives, the role of EA in the IT value chain and using IT4IT with TOGAF®.
  • Digital Business & the Customer Experience – Featuring sessions on architecting digital businesses and staying ahead of disruption hosted by Ron Schuldt of Femto-data.
  • Open Platform 3.0™/Cloud – Including talks on big data analytics in hybrid cloud environments and using standards and open source for cloud customer reference architectures hosted by Heather Kreger, Distinguished Engineer and CTO International Standards, IBM.
  • Open Trusted Technology – Trusted Technology Forum Director Sally Long introduced sessions on the new O-TTPS self-assessed certification and addressing product integrity and supply chain risk.
  • Open Business Architecture – Featuring an introduction to the new preliminary Business Architecture (O-BA) standard presented by Patrice Duboe, Innovation VP, Global Architects Leader from the CTO Office at Capgemini, and Venkat Nambiyur, Director – Business Transformation, Enterprise & Cloud Architecture, SMBs at Oracle.

Monday’s proceedings concluded with an evening networking reception featuring the day’s speakers, IT professionals, industry experts and exhibitors. Thanks also go to the San Francisco event sponsors, including Premium Sponsors Good eLearning, Hewlett Packard Enterprise, Orbus Software and Simplilearn, as well as sponsors Van Haren Publishing, the Association of Enterprise Architects and San Jose State University.

@theopengroup #ogSFO


Filed under Enterprise Architecture (EA), Forrester, Gene Kim, IT4IT, Open Platform 3.0, OTTF, Steve Nunn, The Open Group, The Open Group San Francisco 2017, TOGAF®, Uncategorized

Viewpoint: Technology Supply Chain Security – Becoming a Trust-Worthy Provider

By Andras Szakal, IBM

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fit for purpose is manifested in two essential ways: first, does the product meet essential functional requirements; and second, has the product or component been produced by a trustworthy provider. Of course, the leaders or owners of these systems have to do their part to achieve security and safety (e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats). Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

For example:  Most of us are able to enjoy a meal of stir-fried shrimp and not give a second thought as to whether the shellfish is safe to eat.

Why is that? Because we are the beneficiaries of a system whose workings greatly increase the likelihood — in many parts of the world — that the shellfish served to end consumers is safe and uncontaminated. While tainted technology is not quite the same as tainted food, it’s a useful analogy.

Of course, a very high percentage of the seafood industry is extremely motivated to provide safe and delicious shellfish to the end consumer. So we start with the practical perspective that, much more likely than not in today’s hyper-informed and communicative world, the food supply system will provide reasonably safe and tasty products. Invisible though it may be to most of us, however, this generalized confidence rests on a worldwide system that is built on globally recognized standards and strong public-private collaboration.

This system is necessary because mistakes happen, expectations evolve and — worse — the occasional participant in the food supply chain may take a shortcut in their processing practices. Therefore, some kind of independent oversight and certification has proven useful to assure consumers that what they pay for — their desired size and quality grade and, always, safety — is what they will get. In many countries, close cooperation between industry and government results in industry-led development and implementation of food safety standards.[1]

Government’s role is limited but important. Clearly, government cannot look at and certify every piece of shellfish people buy. So its actions are focused on areas in which it can best contribute: to take action in the event of a reported issue; to help convene industry participants to create and update safety practices; to educate consumers on how to choose and prepare shellfish safely; and to recognize top performers.[2]

Is the system perfect? Of course not. But it works, and supports the most practical and affordable methods of conducting safe and global commerce.

Let’s apply this learning to another sphere: information technology. To wit:

  • We need to start with the realization that the overwhelming majority of technology suppliers are motivated to provide securely engineered products and services, and that competitive dynamics reward those who consistently perform well.
  • However, we also need to recognize that there is a gap in time between the corrective effect of the market’s Invisible Hand and the damage that can be done in any given incident. Mistakes will inevitably happen, and there are some bad actors. So some kind of oversight and governmental participation are important, to set the right incentives and expectations.
  • We need to acknowledge that third-party inspection and certification of every significant technology product at the “end of pipe” is not only impractical but also insufficient. It will not achieve trust across a wide variety of infrastructures and industries.  A much more effective approach is to gather the world’s experts and coalesce industry practices around the processes that the experts agree are best suited to produce desired end results.
  • Any proposed oversight or government involvement must not stymie innovation or endanger a provider’s intellectual capital by requiring exposure to third-party assessments or overly burdensome escrow of source code.
  • Given the global and rapid manner in which technologies are invented, produced and sold, a global and agile approach to technology assurance is required to achieve scalable results. The approach should be based on understood and transparently formulated standards that are, to the maximum extent possible, industry-led and global in their applicability. Conforming to such standards once would then be recognized across multiple industries and geopolitical regions. Propagation of country- or industry-specific standards will result in economic fragmentation and slow the adoption of industry best practices.

The Open Group Trusted Technology Forum (OTTF)[3] is a promising and complementary effort in this regard. Facilitated by The Open Group, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Membership continues to grow and includes representation from manufacturers world-wide.

Governments and enterprises alike will benefit from OTTF’s work. Technology purchasers can use the Open Trusted Technology Provider (OTTP) Standard and OTTP Framework best practice recommendations to guide their strategies. And a wide range of technology vendors can use OTTF approaches to build security and integrity into their end-to-end supply chains. The first version of the OTTPS is focused on mitigating the risk of tainted and counterfeit technology components or products. The OTTF is currently working on a program that will accredit technology providers to the OTTP Standard. We expect to begin pilot testing of the program by the end of 2012.

Don’t misunderstand us: Market leaders like IBM have every incentive to engineer security and quality into our products and services. We continually encourage and support others to do the same.

But we realize that trusted technology — like food safety — can only be achieved if we collaborate with others in industry and in government.  That’s why IBM is pleased to be an active member of the Trusted Technology Forum, and looks forward to contributing to its continued success.

A version of this blog post was originally posted by the IBM Institute for Advanced Security.

Andras Szakal is the Chief Architect and a Senior Certified Software IT Architect for IBM’s Federal Software Sales business unit. His responsibilities include developing e-Government software architectures using IBM middleware and managing the IBM federal government software IT architect team. Szakal is a proponent of service-oriented and web services-based enterprise architectures and participates in open standards and open source product development initiatives within IBM.

 


Filed under OTTF

“Making Standards Work®”

By Andrew Josey, The Open Group

Next month, as part of the ongoing process of “Making Standards Work®,” we will be setting standards and policy with those attending the member meetings at The Open Group Conference, London (May 9-12, Central Hall Westminster). The standards development activities cover a wide range of subject areas, from Cloud Computing, Tools and People certification, and best practices for Trusted Technology to SOA and Quantum Lifecycle Management, as well as maintenance of existing standards such as TOGAF® and ArchiMate®. The common link among all these activities is that they are open standards developed by members of The Open Group.

Why do our members invest their time and efforts in development of open standards? The key reasons as I see them are as follows:

  1. Open standards are a core part of today’s infrastructure
  2. Open standards allow vendors to differentiate their offerings through a level of openness (portable interfaces and interoperability)
  3. Open standards establish a baseline from which competitors can innovate
  4. Open standards backed with certification enable customers to buy with increased confidence

This is all very well, you say — but what differentiates The Open Group from other standards organizations? Well, when The Open Group develops a new standard, we take an end-to-end view of the ecosystem, from customer requirements and the development of consensus standards through to certification and procurement. We aim to deliver standards that meet a need in the marketplace and then back them up with certification that delivers assurance about the products or, in the case of people certification, about knowledge, skills and experience. We then take regular feedback on our standards, maintain them and evolve them according to marketplace needs. We also have a deterministic, timely process for developing our standards that helps to avoid the stalemate that can occur in some standards development.

Let’s look briefly at two of the most well-known Open Group standards: UNIX® and TOGAF®. Both are examples of standards around which a full ecosystem has been developed.

The UNIX® standard for operating systems has been around since 1995 and is now in its fourth major iteration. High reliability, availability and scalability are all attributes associated with certified UNIX® systems. As well as the multi-billion-dollar annual market in server systems from HP, Oracle, IBM and Fujitsu, there is an installed base of 50 million users* using The Open Group certified UNIX® systems on the desktop.

TOGAF® is the standard enterprise architecture method and framework. It encourages use with other frameworks and adoption of best practices for enterprise architecture. Now in its ninth iteration, it is freely available for internal use by any organization globally and is widely adopted, including by over 60% of the Fortune 50 and more than 80% of the Global Forbes 50. The TOGAF® certification program now has more than 15,000 certified individuals, including over 6,000 for TOGAF® 9.

If you are able to join us in London in May, I hope you will also take part in the member meetings to continue making standards work. If you are not yet a member, I hope you will attend the conference itself and network with the members to find out more and consider joining us in Making Standards Work®!

For more information on The Open Group Standards Process visit http://www.opengroup.org/standardsprocess/

(*) Apple-estimated number from a briefing, October 2010. Mac OS X is certified to the UNIX 03 standard.

Standards development will be part of member meetings taking place at The Open Group Conference, London, May 9-13. Join us for best practices and case studies on Enterprise Architecture, Cloud, Security and more, presented by preeminent thought leaders in the industry.

Andrew Josey is Director of Standards within The Open Group, responsible for the Standards Process across the organization. Andrew leads the standards development activities within The Open Group Architecture Forum, including the development and maintenance of TOGAF® 9 and the TOGAF® 9 People certification program. He also chairs the Austin Group, the working group responsible for the development and maintenance of the POSIX 1003.1 standard that forms the core volumes of the Single UNIX® Specification. He is the ISO project editor for ISO/IEC 9945 (POSIX). He is a member of the IEEE Computer Society’s Golden Core and is the IEEE P1003.1 chair and the IEEE PASC Functional chair of Interpretations. Andrew is based in the UK.


Filed under Standards, TOGAF, UNIX