Category Archives: The Open Group

The Open Group San Francisco Day Two Highlights

By The Open Group

Day two of The Open Group San Francisco event was held Tuesday, January 31 on another sunny winter day in San Francisco. Tuesday’s welcome address featured Steve Nunn, President & CEO, and Jim Hietala, VP Business Development and Security, both of The Open Group, greeting attendees for a morning of sessions centered around the theme of Making Standards Work®. Nunn kicked off the morning by reporting that the first day of the conference had been very well received, with copious positive feedback on Monday’s speakers.

It was also announced that the first certification courses for ArchiMate® 3.0, an Open Group standard, kicked off at the conference. In addition, the San Francisco event marked the launch of The Open Group Open Process Automation™ Forum, a Forum of The Open Group, which will address standards development for open, secure, interoperable process control architectures. The Forum will include end users, suppliers, systems integrators, integrated DCS vendors, standards organizations and academics from a variety of industries, including food and beverage, oil and gas, pulp and paper, petrochemical, pharmaceuticals, metals and mining, and utilities. Hietala joined Nunn on stage to discuss the launch of the Forum, which came out of a vision from ExxonMobil. The Forum has already grown rapidly, with almost 100 members. Forum Members are also attending and holding events at the annual ARC Advisory Group Industry Forum in Orlando.

The morning plenary began with Dennis Stevens from Lockheed Martin discussing “The Influence of Open Architecture Standards on the Emergence of Advanced Process Control Systems.” Stevens, who is involved in The Open Group FACE™ Consortium, will also be leading the Open Process Automation Forum. Stevens opened by saying that this is a particularly exciting time in industrial automation due to the intersection of standards, technology and automation. According to Stevens, the work that has been done in the FACE Forum over the past few years has paved the way for what also needs to be done in process automation.

Stevens noted that many of the industrial systems in use today will be facing obsolescence in the next few years for a variety of reasons, including a proliferation of proprietary and closed systems, a lack of sophisticated development tools and the high cost of technology refreshes. Tech trends such as the Internet of Things, cybersecurity, open source and virtualization are also forcing a need for industrial manufacturers to change. In addition, the growth of complexity in software systems and the changeover from hardware-dominant to software-dominant systems are also compelling factors for automation change. However, Stevens says, by reusing existing standards and creating new ones, there are many opportunities for cost savings and reducing complexity.

According to Stevens, the goal is to standardize the interfaces that companies can use so there is interoperability across systems built atop a common framework. By standardizing the interface only, organizations can still differentiate themselves by bringing their own business processes and designs to those systems via hardware or software components. In addition, by bringing elements from the FACE standardization model to Open Process Automation, the new forum can also take advantage of proven processes that already take into account regulations around co-opetition and anti-trust. Stevens believes that Open Process Automation will ultimately enable new markets and suppliers for process automation as well as lower the cost of doing business in industrial automation.

Following the morning break, Chair of the Department of Economics at San Jose State University Dr. Lydia Ortega took the stage for the second morning session, entitled “Innovative Communities.” Ortega took a refreshing look at what The Open Group does and how it works by applying economic theory to illustrate how the organization is an “innovative community.” Ortega began by providing what she called an “economist’s definition” of open standards, which she defined as a collection of dispersed knowledge that is a building block for innovation and is continually evolving. She also described open standards as a “public good,” because they are knowledge-based, non-rivalrous, non-excludable and, once produced, available to others at marginal cost. Teamwork, consensus, and community are also characterizing features of what makes the organization work. Ortega plans to continue her research into what makes The Open Group work by examining competing standards bodies and the organization’s origins, among other things.

Prior to introducing the next session, Steve Nunn presented an award to Steve Whitlock, a long-time Open Group member who recently retired from Boeing, for more than 20 years of leadership, contributions and service to The Open Group. Colleagues provided additional praise for Whitlock and his willingness to lead activities on behalf of The Open Group and its members, particularly in the area of security.

The morning’s third session featured Mike Jerbic, Principal Consultant for Trusted System Consulting Group, highlighting how the “Norwegian Regional Healthcare Project & Open FAIR” have been used to analyze the cost benefits of a home treatment program for dialysis patients in Norway. Currently, due to health and privacy regulations and security requirements, patients who receive home dialysis must physically transport data regarding their treatments to hospitals, which affects the quality of patients’ lives but protects the state from security issues related to transporting data online. Jerbic and a group of economics students at San Jose State University in California did an economic analysis to examine the costs vs. benefits of the program. Using The Open Group Open FAIR™ body of knowledge to analyze the potential threats to both patient privacy and information security, the group found it would make sense to pose the program risks as an engineering problem to be solved. However, they must do additional research to weigh the benefits of potential cost savings to the state vs. the benefits of quality of life for patients.
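
The Open FAIR body of knowledge quantifies risk as the product of loss event frequency and loss magnitude. As a rough illustration of the kind of cost-vs.-benefit arithmetic described above, here is a minimal Monte Carlo sketch in Python; the uniform ranges and all the numbers are hypothetical placeholders for illustration only, not figures from the Norwegian study (calibrated distributions such as PERT would normally be used).

```python
import random

def simulate_annual_loss(lef_min, lef_max, lm_min, lm_max, trials=100_000):
    """Monte Carlo estimate of annualized loss exposure.

    lef_*: range for Loss Event Frequency (events per year)
    lm_*:  range for Loss Magnitude per event (currency units)
    Uniform ranges are a deliberate simplification.
    """
    total = 0.0
    for _ in range(trials):
        lef = random.uniform(lef_min, lef_max)  # how often a loss event occurs
        lm = random.uniform(lm_min, lm_max)     # how much each event costs
        total += lef * lm
    return total / trials

# Hypothetical inputs: 0.1-0.5 breaches/year, 50k-250k cost per breach
exposure = simulate_annual_loss(0.1, 0.5, 50_000, 250_000)
```

An estimated annual exposure like this could then be compared against the projected cost savings of moving dialysis data transport online, which is essentially the trade-off the student group was weighing.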

Concluding Tuesday’s plenary sessions was a panel entitled “Open FAIR in Practice,” which extended the conversation regarding the Norwegian healthcare project by taking questions from the audience about the program. Jerbic moderated the panel, which included Ortega; Eva Kuiper, ESS GRC Security Consultant, HPE; John Linford, Lecturer, Department of Economics, San Jose State University; and Sushmitha Kasturi, Undergraduate Researcher, San Jose State University.

Jerbic also announced that a number of students from San Jose State, many of whom were in attendance, have recently either completed or begun their certification in Open FAIR. He also talked about an Academic Program within The Open Group that is working with students on projects that are mutually beneficial, allowing The Open Group to get help with the work needed to create standards, while providing important practical work experience for students.


San Jose State University Students

Following the plenary, Tuesday’s lunchtime partner presentation featured Sean Cleary, Senior Consultant, Orbus Software, presenting on “Architecture Roadmap Visualization with ArchiMate® 3.0.”

Afternoon sessions were split into two tracks, Cognitive Computing and EA in Practice.

  • EA in Practice – Hosted by Len Fehskens of the Association of Enterprise Architects, two sessions looked at maxims and folktales for architects, presented by Fehskens, and how to enable governance and management with continuous audits, presented by Robert Weisman, CEO/COO of Build the Vision.
  • Cognitive Computing – Chris Harding from The Open Group served as host for four sessions in the track:
    • Ali Arsanjani, CTO for Analytics and Emerging Technologies, IBM – Arsanjani provided an overview of different ways that data can be structured for cognitive computing applications. According to Arsanjani, cognitive systems are meant to augment, not replace, human systems and to be of service to us. By combining human interaction and curation with automated data analysis and machine learning, companies will be able to gain greater business advantages. However, we must also always be aware of the implications of using artificial systems and the potential consequences of doing so, he said.
    • Jitendra Maan, Enterprise Architect and Center of Excellence Lead, Tata Consultancy Services – Maan says cognitive computing signals a shift in how machines interact with humans, other machines and the environment, with potential for new categories of business outcomes and disruption. The design of automated systems is critical to how cognitive systems are expected to evolve but unlike traditional computing, cognitive will rely on a combination of natural language processing, machine learning and data. Potential business applications already in progress include service support centers, contract management, risk assessment, intelligent chat bots and conversation work flows. Maan predicts bots will actually replace many service functions in the next few years.
    • Swaminathan Chandrsekaran, Industry Apps & Solutions, IBM Watson – Chandrsekaran’s talk took a deeper dive into cognitive computing and the make-up of cognitive systems. Understanding, reasoning, learning and interaction are key to teaching cognitive systems how to work, he said. Cognitive systems are also broadly categorized around language, speech, vision and data & insights, much like the human brain. Patterns can generally be created from cognitive conversations, discovery and application extensions. Chandrsekaran also shared how to model a reference architecture for a cognitive conversation pattern.
    • The Cognitive Computing panel, moderated by Harding, included afternoon speakers Arsanjani, Maan and Chandrsekaran. The panel discussed how businesses can gain advantage from cognitive computing, how systems learn personalization and contextualization through training, the shrinking time needed to train a system (now days or weeks vs. months or years), how to make systems more intelligent over time, and the importance of aggregating and curating domain-relevant data from the very beginning of a project.

The day concluded with a social event and dinner for attendees held at the Autodesk Gallery, a San Francisco destination that marries creativity, design and engineering in more than 20 exhibits sponsored by companies such as Lego and Mercedes-Benz.

Networking at the Autodesk Gallery

The following day, the event offered track sessions in areas including Internet of Things (IoT) and Architecture. The Open Group San Francisco drew to a close with Members Only Meetings on February 2.

@theopengroup #ogSFO

We are looking forward to seeing you at The Open Group Berlin April 24-27, 2017! #ogBER

 

Leave a comment

Filed under ArchiMate®, Digital Transformation, Enterprise Architecture, Enterprise Architecture (EA), FACE™, Internet of Things, IoT, O-BA Standard, Open Business Architecture (O-BA), Open FAIR, Open Process Automation, standards, Steve Nunn, The Open Group, The Open Group San Francisco 2017, TOGAF®, Uncategorized

The Open Group San Francisco Day One Highlights

By The Open Group

The Open Group kicked off its first event of 2017 on a sunny Monday morning, January 30, in the City by the Bay, with over 200 attendees from 20 countries including Australia, Finland, Germany and Singapore.

The Open Group CEO and President Steve Nunn began the day’s proceedings with a warm welcome and the announcement of the latest version of the Open Trusted Technology Provider™ Standard (O-TTPS), a standard that specifies best practices for providers to help them mitigate the risk of tainted or counterfeit products or parts getting into the IT supply chain. A new certification program for the standard was also announced, as well as the news that the standard has recently been ratified by ISO. Nunn also announced the availability of the next version of The Open Group IT4IT™ standard, version 2.1.

Monday’s plenary focused on IT4IT and Managing the Business of IT. Bernard Golden, CEO of Navica, spoke on the topic,“Cloud Computing and Business Expectations: How the Cloud Changes Everything.” Golden, who was named as one of the 10 most influential people in cloud computing by Wired magazine, began with a brief overview of the state of the computing industry today, which is largely characterized by the enormous growth of cloud computing. Golden believes that the public cloud will be the future of IT moving forward. With the speed that the cloud enables today, IT and app development have become both the bottleneck and differentiator for IT departments. To address these bottlenecks, IT must take a multi-pronged, continuous approach that uses a combination of cloud, Agile and DevOps to address business drivers. The challenge for IT shops today, Golden says, is also to decide where to focus and what cloud services they need to build applications. To help determine what works, IT must ask whether services are above or below what he calls “the value line,” which delineates whether the services available, which are often open-source, will ultimately advance the company’s goals or not, despite being low cost. IT must also be aware of the fact that the value line can present a lock-in challenge, creating tension between the availability of affordable—but potentially buggy—open-source tools and services and the ongoing value the business needs. Ultimately, Golden says, the cloud has changed everything—and IT must be willing to change with it and weigh the trade-offs between openness and potential lock-in.

Forrester Research analysts David Wheable, Vice President and Principal Consultant, and David Cannon, Vice President and Group Director, took the stage following Golden’s session to discuss “The Changing Role of IT: Strategy in the Age of the Customer.” Wheable spoke first, noting that technology has enabled a new “age of the customer,” an era where customers now have the majority of the power in the business/customer relationship.  As such, companies must now adapt to how their customers want to interact with their businesses and how customers use a company’s business applications (particularly via mobile devices) in order to survive and prevent customers from constantly changing their loyalties. Because IT strategists will not be able to predict how customers will use their applications, they must be able to put themselves in a position where they can quickly adapt to what is happening.

Cannon discussed what IT departments need to consider when it comes to strategy. To develop a viable IT strategy today, companies must consider what is valuable to the customer and how they will choose the technologies and applications that provide customers what they need. In the current IT landscape, features and quality no longer matter—instead, IT must take into account customers’ emotions, desires and immediate needs. Continuous exploitation of digital assets to deliver customer outcomes will be critical for both digital and business strategies—which Cannon argues are now essentially the same thing—moving forward. To survive in this new era, IT departments must also be able to enable customer outcomes, measure the customer experience, manage a portfolio of services, showcase business—not just technical—expertise and continue to enable service architectures that will deliver what customers need and want.

After the morning coffee break, Author and Researcher Gene Kim followed to discuss his recent book, The DevOps Handbook. His session, entitled, “The Rise of Architecture: Top Lessons Learned while Researching and Writing The DevOps Handbook,” explored the example of high performers in the tech sector and how the emergence of DevOps has influenced them. According to Kim, most IT departments are subject to a downward spiral over time due to the exponential growth of technical assets and debt during that time, which ultimately weigh them down and affect performance. In contrast, according to Kim’s research, high-performing organizations have been able to avoid this spiral by using DevOps. Organizations utilizing DevOps are nearly three times more agile than their peers, are more reliable and two times more likely to exceed profitability, market share and productivity goals in the marketplace. The ability to deploy small changes more frequently has been a game changer for these high-performing organizations not only allowing them to move faster but to create more humane working conditions and happier, more productive workers. Kim also found that fear of doing deployments is the most accurate predictor of success in organizations—those that fear deployments have less success than those that don’t.

Gene Kim

The final session of the morning plenary was presented by Charles Betz, IT Strategist, Advisor and Author from Armstrong Process Group. Betz provided an overview of how the IT4IT framework can be used within organizations to streamline IT processes, particularly by automating systems that no longer need to be done by hand. Standardizing IT processes also provides a way to deliver more consistent results across the entire IT value chain for better business results. Taking an iterative and team-oriented approach are also essential elements for managing the body of knowledge necessary for changing IT processes and creating digital transformation.

During the lunch hour, conference partners Hewlett Packard Enterprise and Simplilearn each gave separate presentations for attendees discussing the use of IT4IT for digital transformation and skills acquisition in the digital economy, respectively.

Monday afternoon, The Open Group hosted its fourth TOGAF®, an Open Group standard, User Group meeting in addition to the afternoon speaking tracks. The User Group meeting consisted of an Oxford style debate on the pros and cons of “Create versus Reuse Architecture,” featuring Jason Uppal, Open CA Level 3 Certified Architect, QRS, and Peter Haviland, Managing Director, Head of Engineering & Architecture, Moody’s Corporation. In addition to the debate, User Group attendees had the opportunity to share use cases and stories with each other and discuss improvements for TOGAF that would be beneficial to them in their work.

The afternoon sessions consisted of five separate tracks:

  • IT4IT in Practice – Rob Akershoek from Logicalis/Shell Information Technology International moderated a panel of experts from the morning plenary as well as sessions related to presenting IT4IT to executives, the role of EA in the IT value chain and using IT4IT with TOGAF®.
  • Digital Business & the Customer Experience – Featuring sessions on architecting digital businesses and staying ahead of disruption hosted by Ron Schuldt of Femto-data.
  • Open Platform 3.0™/Cloud – Including talks on big data analytics in hybrid cloud environments and using standards and open source for cloud customer reference architectures hosted by Heather Kreger, Distinguished Engineer and CTO International Standards, IBM.
  • Open Trusted Technology – Trusted Technology Forum Director Sally Long introduced sessions on the new O-TTPS self-assessed certification and addressing product integrity and supply chain risk.
  • Open Business Architecture – Featuring an introduction to the new preliminary Business Architecture (O-BA) standard presented by Patrice Duboe, Innovation VP, Global Architects Leader from the CTO Office at Capgemini, and Venkat Nambiyur, Director – Business Transformation, Enterprise & Cloud Architecture, SMBs at Oracle.

Monday’s proceedings concluded with an evening networking reception featuring the day’s speakers, IT professionals, industry experts and exhibitors. Thanks also go to the San Francisco event sponsors, including Premium Sponsors Good eLearning, Hewlett Packard Enterprise, Orbus Software and Simplilearn, as well as sponsors Van Haren Publishing, the Association of Enterprise Architects and San Jose State University.

@theopengroup #ogSFO

Leave a comment

Filed under Enterprise Architecture (EA), Forrester, Gene Kim, IT4IT, Open Platform 3.0, OTTF, Steve Nunn, The Open Group, The Open Group San Francisco 2017, TOGAF®, Uncategorized

The Open Trusted Technology Provider™ Standard (O-TTPS) – Approved as ISO/IEC 20243:2015 and the O-TTPS Certification Program

By The Open Group

The increase of cybersecurity threats, along with the global nature of Information and Communication Technology (ICT), results in a threat landscape ripe for the introduction of tainted (e.g., malware-enabled or malware-capable) and counterfeit components into ICT products. This poses significant risk to customers in the operation of their business enterprises and our critical infrastructures.

A compromised electronic component or piece of malware-enabled software that lies dormant and undetected within an organization could cause tremendous damage if activated remotely. Counterfeit products can also cause significant damage to customers and providers, resulting in rogue functionality, failed or inferior products, loss of revenue and brand equity, and critical damage.

As a result, customers now need assurances that they are buying from trusted technology providers who follow best practices in their own in-house secure development and engineering processes and in securing their outsourced components and supply chains.

Summary

The O-TTPS, an Open Group Standard, specifies a set of best practice requirements and recommendations that ICT providers should follow throughout the full life cycle of their products from design through disposal – including their supply chains – in order to mitigate the risk of tainted and counterfeit components. The Standard is the first with a Certification Program that specifies measurable conformance criteria for both product integrity and supply chain security in ICT.

The Standard provides requirements for the full product life cycle, categorizing them further into best practice requirements for Technology Development (product development and secure engineering methods) and Supply Chain Security.

The Open Group O-TTPS Certification Program offers certificates for conformance to both the O-TTPS and ISO/IEC 20243:2015, as the two standards are equivalent. The Program identifies the successful applicant on a public registry so customers and business partners can readily identify an Open Trusted Technology Provider™ who conforms to the Standard.

The Certification Program is available to all providers in the ICT product’s supply chain, including: Original Equipment Manufacturers (OEMs), hardware and software component suppliers, integrators, Value-Added Resellers (VARs), and distributors. Thus, it offers a holistic program that not only allows customers to identify trusted business partners like integrators or OEMs who are listed on the registry, but also allows OEMs and integrators to identify trusted business partners like hardware and software component suppliers, VARs, and distributors from the public registry.

Target Audience

As the O-TTPS Certification Program is open to all constituents involved in a product’s life cycle – from design through disposal – including those in the product’s supply chain, the Standard and the Certification Program should be of interest to all ICT providers as well as ICT customers.

The newly published guide: O-TTPS for ICT Product Integrity and Supply Chain Security – A Management Guide, available from The Open Group Bookstore at www.opengroup.org/bookstore/catalog/g169.htm, offers guidance to managers – business managers, procurement managers, or program managers – who are considering adopting the best practices or becoming certified as an Open Trusted Technology Provider™. It provides valuable information on:

  • The best practices in the Standard, with an Appendix that includes all of the requirements
  • The business rationale for why a company should consider implementing the Standard and becoming certified
  • What an organization should understand about the Certification Program and how they can best prepare for the process
  • The differences between the options (self-assessed or third-party assessed) that are currently available for the Certification Program
  • The process steps and the terms and conditions of the certification, with pointers to the relevant supporting documents, which are freely available

The Management Guide offers a practical introduction to executives, managers, those involved directly in implementing the best practices defined in the Standard, and those who would be involved in the assessments, whether self-assessment or third-party assessment.

Further Information

The Open Trusted Technology Provider™ Standard (O-TTPS), Version 1.1 is available free-of-charge from www.opengroup.org/bookstore/catalog/c147.htm.

The technically equivalent standard – ISO/IEC 20243:2015 – is available for a fee from iso.org.

For more information on the Open Trusted Technology Provider™ Standard (O-TTPS) and the O-TTPS Certification Program, visit www.opengroup.org/ottps.

@theopengroup #ogSFO

1 Comment

Filed under Accreditations, Certifications, COTS, Cybersecurity, O-TTF, O-TTPS, OTTF, standards, Supply chain risk, The Open Group, The Open Group San Francisco 2017, Uncategorized

To Colonize Mars, Look to Standards Development

By The Open Group

In advance of The Open Group San Francisco 2017, we spoke with Keegan Kirkpatrick, one of the co-founders of RedWorks, a “NewSpace” start-up focused on building 3D printable habitats for use on Earth and in space. Kirkpatrick will be speaking during the Open Platform 3.0™/Internet of Things (IoT) session on February 1.

Keegan Kirkpatrick believes that if we are to someday realize the dream of colonizing Mars, Enterprise Architects will play a critical role in getting us there.

Kirkpatrick defines the contemporary NewSpace industry as a group of companies that are looking to create near-term solutions that can be used on Earth, derived from solutions created for long-term use in space. With more private companies getting into the space game than ever before, Kirkpatrick believes the means to create habitable environments on the moon or on other planets isn’t nearly as far away as we might think.

“The space economy has always been 20 years away from where you’re standing now,” he says.

But with new entrepreneurs and space ventures following the lead of Elon Musk’s SpaceX, the space industry is starting to heat up, branching out beyond traditional aerospace and defense players like NASA, Boeing or Lockheed Martin.

“Now it’s more like five to ten years away,” Kirkpatrick says.

Kirkpatrick, who has a background in aerospace engineering, says RedWorks was born out of NASA’s 3D Printed Habitat Challenge, a “Centennial Challenge” where people from all kinds of backgrounds competed to create 3D printing/construction solutions for building and surviving on Mars.

“I was looking to get involved in the challenge. The idea of 3D printing habitats for Mars was fascinating to me. How do we solve the mass problem? How do we allow people to be self-sufficient on Mars once they get there?” he says.

Kirkpatrick says the company came together when he found a small 3D printing company in Lancaster, Calif., close to where he lives, and went to visit them. “About 20 minutes later, RedWorks was born,” he says. The company currently consists of Kirkpatrick, a 3D printing expert, and a geologist, along with student volunteers and a small team of engineers and technicians.

Like other NewSpace companies, RedWorks is focusing on terrestrial solutions first, both to create immediate value for what they’re doing and to help raise capital. As such, the company is looking to design and build homes by 3D printing low-cost materials that can be used in places that have a need for low-cost housing. The company is talking with real estate developers and urban planners and looking to areas where affordable housing might be built entirely on site using their Mars-derived solutions.

“Terrestrial first is where the industry is going,” Kirkpatrick says. “You’ll see more players showing up in the next few years trying to capitalize on Earth-based challenges with space-based solutions.”

RedWorks plans to use parametric architecture models and parametric planning (design processes based on algorithmic thinking in which the relationship between elements is used to inform the design of complex structures) to create software for planning the printable communities and buildings. In the short-term, Kirkpatrick believes 3D printing can be used to create smart-city living solutions. The goal is to be able to combine 3D printing and embedded software so that people can design solutions specific to the environments where they’ll be used. (Hence the need for a geologist on their team.) Then they can build everything they need on site.

“For Mars, to make it a place that you can colonize, not just explore, you need to create the tools that people with not much of an engineering or space architecture background can use to set up a colony wherever they happen to land,” Kirkpatrick says. “The idea is if you have X number of people and you need to make a colony Y big, then the habitat design will scale everything with necessary utilities and living spaces entirely on-site. Then you can make use of the tools that you bring with you to print out a complete structure.”
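
The parametric planning idea Kirkpatrick describes — X people, a colony Y big, with everything scaled from those inputs — can be sketched as a toy scaling model in Python. Every parameter below (area per person, utility overhead, printed-material ratio) is an illustrative assumption for the sake of the sketch, not a RedWorks figure.

```python
def habitat_plan(crew_size, area_per_person=25.0, utility_overhead=0.2,
                 material_ratio=0.15):
    """Toy parametric habitat model: derive floor area and printed-material
    volume from crew size. All ratios are hypothetical placeholders.

    area_per_person:  living + work area (m^2) per crew member (assumed)
    utility_overhead: fractional extra area for utilities/corridors (assumed)
    material_ratio:   m^3 of printed material per m^2 of floor area (assumed)
    """
    living_area = crew_size * area_per_person
    total_area = living_area * (1 + utility_overhead)
    print_volume = total_area * material_ratio
    return {
        "crew": crew_size,
        "total_area_m2": round(total_area, 1),
        "printed_material_m3": round(print_volume, 1),
    }

# A four-person outpost under the assumed ratios
plan = habitat_plan(4)
```

In a real parametric design tool, the relationships between elements would be far richer (structural loads, local regolith properties, utility routing), but the principle is the same: change the inputs and the whole design rescales on site.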

Kirkpatrick says the objective is to be able to use materials native to each environment in order to create and print the structures. Because dirt and sand on Earth are fundamentally similar to the type of silicate materials found on the Moon and Mars, RedWorks is looking to develop a general-purpose silica printer that can be used to build 3D structures. That’s why they’re looking first to develop structures in desert climates, such as Southern California, North Africa and the Middle East.

A role for architecture and standards

As the private NewSpace industry begins to take off, Kirkpatrick believes there will be a strong need for standards to guide the nascent industry—and for Enterprise Architects to help navigate the complexities that will come with designing the technology that will enable the industry.

“Standards are necessary for collaborating and managing how fast this will take off,” he says.

Kirkpatrick also believes that developing open standards for the new space industry will help NewSpace companies figure out how they can work together. Although he says many NewSpace start-ups already have an interest in collaborating, with much of their work in the very early stages, they do not necessarily have much incentive to work together as of yet. However, he says, “everyone realizes that collaboration will be critical for the long-term development of the industry.” Beginning to work toward standards development with an organization such as The Open Group now will help incentivize the NewSpace community to work together—and thus push the industry along even faster, Kirkpatrick says.

“Everyone’s trying to help each other as much as they can right now, but there’s not a lot of mechanisms in place to do so,” he says.

According to Kirkpatrick, it’s important to begin to think about standards for space-related technology solutions before the industry reaches an inflection point and begins to take off quickly. Kirkpatrick expects that inflection point will occur once a launcher like SpaceX is able to do full return landings of its rockets that are then ready for reuse. He expects that launch costs will begin to fall rapidly over the next five to ten years once launch providers can offer reliable reusable launch services, spurring the industry forward.

“Once you see launch costs fall by a factor of 10 or 100, the business side of the industry is going to grow like a weed. We need the infrastructure in place for everyone to work together and enable this incredible opportunity we have in space. There’s a very bright horizon ahead of us that’s just a little hard for everyone to see right now. But it’s coming faster than anyone realizes.”

@theopengroup #ogSFO

Keegan Kirkpatrick is the Team Lead and founder of RedWorks, a NewSpace startup in Lancaster, California. He has an undergraduate degree in Aerospace Engineering from Embry-Riddle Aeronautical University, and before turning entrepreneur worked as an engineer at Masten Space Systems at the Mojave Air and Space Port.

In 2015, Keegan founded RedWorks with Paul Petros, Susan Jennings, and Lino Stavole to compete in and make it to the finals of the NASA Centennial 3D Printed Habitat Challenge. Keegan’s team is creating ways to 3D-print habitats from on-site materials, laying the groundwork for human settlement of the solar system.

Leave a comment

Filed under digital technologies, Enterprise Architecture (EA), Future Technologies, Internet of Things, IoT, Open Platform 3.0, Standards, The Open Group, The Open Group San Francisco 2017, Uncategorized

Gaining Executive Buy-In for IT4IT™: A Conversation with Mark Bodman

By The Open Group

With many organizations undergoing digital transformation, IT departments everywhere are taking serious hits. And although technology is at the heart of many business transformations, IT has traditionally had a reputation as a cost center rather than an innovation center.

As such, executives are often skeptical when presented with yet another new IT plan or architecture for their organizations that will be better than the last. Due to the role Enterprise Architects play in bridging the gap between the business and IT, it’s often incumbent on them to make the case for big changes when needed.

Mark Bodman, Senior Product Manager at ServiceNow and formerly at HPE, has been working with and presenting the IT4IT standard, an Open Group standard, to executives for a number of years. At The Open Group San Francisco 2017 event on January 30, Bodman will offer advice on how to present IT4IT in order to gain executive buy-in. We spoke with him in advance of the conference to get a sneak peek before his session.

What are Enterprise Architects up against these days when dealing with executives and trying to promote IT-related initiatives?

The one big change that I’ve seen is the commoditization of IT. With the cloud-based economy and the ability to rent cheap compute, storage and networking, the ability to effectively leverage commodity IT is a key differentiator that will make or break an organization. At the end of the day, the people who can exploit cheaper technology to do unique things faster are those companies who will come out ahead long-term. Companies based on legacy technologies that don’t evolve will stall out and die.

Uber and Netflix are great case studies for this trend. It’s happening every day around us, and it’s reaching a tipping point. Enterprise Architects are faced with communicating these scenarios within their own organizations: use cases like going digital, streamlining for costs, and sourcing more in the cloud are all strategies required to move the needle. Enterprise Architects are the most senior technical people within IT. They bridge the gap between business and technology at the highest level, and have to figure out, ‘How do I communicate and plan for these disruptions here so that we can survive in the digital era?’

It’s a Herculean task, not an easy thing to do. I’ve found there are varying degrees of success for Enterprise Architects. Sometimes, through no fault of their own, they can’t move the right agenda forward because they are dealing with politics. Or the EA may be dealing with a Board that just wants to see financial results the next quarter, and doesn’t care about long-term transformations. These are the massive challenges that Enterprise Architects deal with every day.

Why is it important to properly present a framework like IT4IT to executives right now?

It’s as important as the changes to accounting rules were for organizations. Those rules and regulations changed in response to Enron and the other big financial failures in recent memory, and the impact was profound. When an IT shop is implementing services and running the IT organization as a whole, what operating model does it use? Why is one IT shop so different from another when we’re all facing similar challenges, using similar resources? I think it’s critically important to have a vetted industry standard to answer these questions.

Throughout my career, I’ve seen many different models for running IT from many different sources. From technology companies like HPE and IBM, to consulting companies like Deloitte, Accenture and Bain; each has their own way of doing things. I refer to this as the ‘IT flavor of the month.’ One framework is chosen over another depending on what leadership decides for their playbook—they get tired of one model, or a new leader imposes the model they are familiar with, so they adopt a new model and change the entire IT operating model, which is quite disruptive.

The IT4IT standard takes that whole answer to ‘how to run IT as a business’ out of the hands of any one source. That’s why a diverse set of contributors, like PWC and Accenture, is important; they both have consulting practices for running IT shops. Seeing them contribute to an open standard that aggregates this know-how allows IT to evolve faster. When large IT vendors like ServiceNow, IBM, Microsoft and HPE are all participating and agreeing upon the model, we can start creating solutions that are compatible with one another. The reason we have Wi-Fi in every corner of the planet, or cellular service that you can use from any phone, is that we standardized. We need to take a similar approach to running IT shops: renting commoditized services, plugging them in, and managing them with standard software. You can’t do that unless you agree on the fundamentals; the IT4IT standard provides much of this guidance.

When Enterprise Architects are thinking about presenting a framework like IT4IT, what considerations should they make as they’re preparing to present it to executives?

I like to use the word ‘contextualize,’ and the way I view the challenge is that if I contextualize our current operating model against IT4IT, how are we the same or different? What you’ll mostly find is that IT shops are somewhat aligned. A lot of the work that I’ve done with the standard over the past three years is to create material that shows IT4IT in multiple contexts. The one that I prefer to start with for an executive audience is showing how the de-facto plan-build-run IT organizational model, which is how most IT shops are structured, maps to the IT4IT structure. Once you make that correlation, it’s a lot easier to understand how IT4IT then fits across your particular organization filling some glaring gaps in plan-build-run.
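The plan-build-run correlation described here can be sketched as data. This is a minimal illustration only: the four value stream names come from the IT4IT standard, but the exact mapping and the `gaps` helper are a simplification for discussion, not the official alignment.

```python
# Illustrative sketch: correlating the de-facto plan-build-run IT model
# with the four IT4IT value streams. The mapping is an assumption made
# for discussion, not the official correlation from the standard.

IT4IT_VALUE_STREAMS = [
    "Strategy to Portfolio",   # S2P: plan the service portfolio
    "Requirement to Deploy",   # R2D: build and release services
    "Request to Fulfill",      # R2F: offer and fulfill service requests
    "Detect to Correct",       # D2C: operate, detect and resolve issues
]

# One plausible plan-build-run correlation (hypothetical):
PLAN_BUILD_RUN_MAP = {
    "plan":  ["Strategy to Portfolio"],
    "build": ["Requirement to Deploy"],
    "run":   ["Request to Fulfill", "Detect to Correct"],
}

def gaps(org_functions):
    """Return the value streams not covered by an org's current functions."""
    covered = {vs for f in org_functions for vs in PLAN_BUILD_RUN_MAP.get(f, [])}
    return [vs for vs in IT4IT_VALUE_STREAMS if vs not in covered]

# An IT shop organized only around build and run has a planning gap:
print(gaps(["build", "run"]))  # -> ['Strategy to Portfolio']
```

Laying the mapping out this way makes the "glaring gaps in plan-build-run" concrete: whatever value stream the current organization leaves uncovered is exactly where the conversation with executives should start.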

Recently I’ve created a video blog series on YouTube called IT4IT Insights to share these contextual views. I’ve posted two videos so far, and plan to post a new video per month. I have posted one video on how Gartner’s Bi-Modal concept maps to IT4IT concepts, and another on the disruptive value that the Request to Fulfill value stream provides IT shops.

Why have executives been dismissive of frameworks like this in the past and how can that be combatted with a new approach such as IT4IT?

IT4IT is different from anything I have seen before. I think it’s the first time we have seen a comprehensive business-oriented framework created for IT as an open standard. There are some IT frameworks specific to vertical industries out there, but IT4IT is truly generic and addresses everything that any CIO would worry about on a daily basis. Of course they don’t teach CIOs IT4IT in school yet; it’s brand new. Many IT execs come from consulting firms where they have grown very familiar with a particular IT operating model, or they were promoted through the years, establishing their own unique playbook along the way. When a new standard framework like IT4IT comes along and an Enterprise Architect shows them how different it might be from what they currently know, it’s very disruptive. IT executives got to that position through growth and experience using what works, so adopting something new like IT4IT is a tough pill to swallow.

To overcome this problem it’s important to contextualize the IT4IT concepts. I’m finding many of the large consulting organizations are just now starting to learn IT4IT; some are ahead of others. The danger is that IT4IT takes some of that unique IP away, which is a little risky for them, but I think it’s an advantage if they get on the bandwagon first and can contextually map what they do now against IT4IT. One other thing that’s important is that since IT4IT is an open standard, organizations may contribute intellectual property to the standard and be recognized as the key contributor for that content. You see some of this already with Accenture’s and PWC’s contributions. At the same time, each consulting organization will hold some of their IP back to differentiate themselves where applicable. That’s why I think it’s important for people presenting IT4IT to contextualize it to their particular organization and practice. If they don’t, it’s just going to be a much harder discussion.

As with any new concept, eventually you find the first few who get it; they latch on to it and become the ‘IT4IT champion.’ It’s very important to have at least one IT4IT champion to really evangelize the IT4IT standard and drive adoption. That champion might not be in an executive position able to change things in their organization, but educating and evangelizing a better way of managing IT is an important job.

What lessons have you learned in presenting IT4IT to executives? Can you offer some tips and tricks for gaining mindshare?

I have many that I’ll talk about in January, but one thing that seems to work well is to take a few IT4IT books into an executive briefing, usually the printed standard and the pocket guide. I’ll pass them around the room while I present the IT4IT standard. (I’m usually presenting the IT4IT standard as part of a broader executive briefing agenda.) I usually find that the books get stuck with someone in the room who has cracked open a book and recognized something of value. They will usually want to keep the book after that, and at that point I know who my champion is. I then gauge how passionate they are by making them twist my arm to keep the book. This usually works well to generate discussion of what they found valuable, in the context of their own IT organization and in front of the other executives in the room. I recently used this trick when presenting to the CIO of a major insurance company. I passed the books around during my presentation and found them back in front of me. I was thinking that was it: no takers. But the CIO asked for them back once I concluded the IT4IT presentation. The CIO was my new champion, and everyone in the room knew it.

What about measurement and results? Is there enough evidence out there yet on the standard and the difference it’s making in IT departments to bring measurement into your argument to get buy in from executives?

I will present some use cases that have crystal-clear results, though I can’t share financials. The more tangible measurements are around the use cases where we leveraged the IT4IT standard to rationalize the current IT organization and tools to identify redundancies. One of the things I learned 10 years ago, well before the IT4IT standard was around, was how to rationalize applications for an entire organization after they have gotten out of hand from a rash of M&A activity. Think about the redundancies created when two businesses merge. You’re usually merging because of a product or market that you are after; there’s some business need driving that acquisition. But all the common functions, like HR and finance, are redundant. This includes the IT technologies and applications used to manage IT, too. You don’t need two HR systems, or two IT helpdesk systems; you’ve got to consolidate to a reasonable number of applications to do the work. I have tackled IT rationalization by using the IT4IT standard, going through an evaluation process to identify redundancies per functional component. In some cases we have found more than 300 tools that perform the same IT function, like monitoring. You shouldn’t need 300 different monitoring tools; that’s ridiculous. This is just one clear use case where we’ve applied IT4IT to identify similar tools and processes that exist within IT specifically: a very compelling business case to eliminate massive redundancy.
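The rationalization exercise described here boils down to grouping a tool inventory by the functional component each tool maps to, then flagging components with redundant tooling. A minimal sketch, with all tool and component names invented for illustration:

```python
# Hypothetical sketch of an IT4IT-style tool rationalization pass:
# map each tool in the inventory to a functional component, then flag
# components served by more tools than the target count. Names invented.
from collections import defaultdict

inventory = [
    ("NetMon Pro",  "Monitoring"),
    ("WatchTower",  "Monitoring"),
    ("EyeOnIT",     "Monitoring"),
    ("HelpDeskX",   "Incident"),
    ("TicketFlow",  "Incident"),
    ("PlanBoard",   "Portfolio Demand"),
]

def redundancies(tools, threshold=1):
    """Return {component: [tools]} for components with redundant tooling."""
    by_component = defaultdict(list)
    for name, component in tools:
        by_component[component].append(name)
    # Anything above the threshold is a consolidation candidate.
    return {c: names for c, names in by_component.items() if len(names) > threshold}

print(redundancies(inventory))  # Monitoring (3 tools) and Incident (2) flagged
```

In practice the component column would come from mapping each tool against the IT4IT functional components rather than a hand-written list, but the consolidation logic is the same.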

Does the role of standards also help in being able to make a case for IT4IT with executives? Does that lend credence to what you’re proposing and do standards matter to them?

They do, in a way. Non-standard accounting rules today might land your executives in jail. A non-standard IT shop won’t land anyone in jail, but it will increase the cost of everything you do and increase risk, because you’re going against the grain on something that should be a commodity. At the executive level, you need to contextualize the problem of being non-standard and show them how adopting the IT4IT standard may be similar to the standardization of accounting rules.

Another benefit I point to is that the standard is open: it is the result of vetting good ideas from many different organizations, rather than making it up as you go. The man-years of experience that went into the standard, and the elegance of the result, become a compelling argument for adoption that shouldn’t be overlooked.

What else should EAs take into consideration when presenting something like IT4IT to executives?

I think the primary thing to remember is to contextualize your conversation to your executives and organization. Some executives in IT may have zero technology background; some may have come up through the ranks and still know how to program, so you’ve got to tell the story based on the audience and tailor it. I recently presented to 50 CIOs in Washington, D.C., so I had to contextualize the standard to show how IT4IT relates to the major changes happening in the federal market, such as the Federal Information Technology Acquisition Reform Act (FITARA), and how it supports the Federal Enterprise Architecture framework. These unique requirements had to be contextualized against the IT4IT standard so the audience understood exactly how IT4IT relates to the big challenges unique to their market.

Any last comments?

The next phase of the IT4IT standard is just taking off.  The initial group of people who were certified are now using IT4IT for training and to certify the next wave of adopters. We’re at a point now where the growth is going to take off exponentially. It takes a little time to get comfortable with something new and I’m seeing this happen more quickly in every new engagement. Enterprise Architects need to know that there’s a wealth of material out there, and folks who have been working with the IT4IT standard for a long time. There’s something new being published almost every day now.

It can take a while to get from first contact to critical-mass adoption, but it’s happening. In my short three weeks at ServiceNow I have already had two customer conversations on IT4IT; it’s clearly relevant here too, and I have been able to show its relevance to every other IT shop and vendor over the last three years. This new IT4IT paradigm does need to soak in a bit, so don’t get frustrated about the pace of adoption and understanding. One day you might come across a need and pull out the IT4IT standard to help in some way that’s not apparent right now. It’s exciting to see people who worked on the initial phases of the standard’s development now working on their next gig. It’s encouraging to see folks in their second and even their third job leveraging the IT4IT standard. This is a great indicator that the IT4IT standard is being accepted and starting to become mainstream.

@theopengroup #ogSFO

Mark Bodman is an experienced, results-oriented IT4IT™ strategist with an Enterprise Architecture background, an executive adviser, thought leader and mentor. He previously worked on cross-portfolio strategies to shape HPE’s products and services, including multi-source service brokering and IT4IT adoption. Mark has recently joined ServiceNow as the outbound Application Portfolio Management Product Manager.

Hands-on experience from years of interaction with multiple organizations has given Mark a unique foundation of experience and IT domain knowledge. Mark is well versed in industry standards such as TOGAF®, an Open Group standard, COBIT, and ITIL, has implemented portfolio management and EA practices, chaired governance boards within Dell, managed products at Troux, and helped HPE customers adopt strategic transformation planning practices using reference architectures and rationalization techniques.

 

 

1 Comment

Filed under Digital Transformation, Enterprise Architecture, Enterprise Transformation, IT, IT4IT, Standards, The Open Group, Uncategorized

What is Open FAIR™?

By Jim Hietala, VP, Business Development and Security, The Open Group

Risk Practitioners should be informed about the Open FAIR body of knowledge, and the role that The Open Group has played in creating a set of open and vendor-neutral standards and best practices in the area of Risk Analysis. For those not familiar with The Open Group, our Security Forum has created standards and best practices in the area of Security and Risk for 20+ years. The Open Group is a consensus-based and member-driven organization. Our interest in Risk Analysis dates back many years, as our membership saw a need to provide better methods to help organizations understand the level of risk present in their IT environments. The Open Group membership includes over 550 member organizations from both the buy-side and supply-side of the IT industry. The Security Forum currently has 80+ active member organizations contributing to our work.

A History of Open FAIR and The Open Group

In 2007, Security Forum Chairman Mike Jerbic brought Factor Analysis of Information Risk (FAIR) to our attention, and suggested that it might be an interesting Risk Analysis taxonomy and method to consider as a possible open standard in this area. FAIR was originally created by Jack Jones and his then company, Risk Management Insights (RMI); Jack and his partner Alex Hutton agreed to join The Open Group as members and to contribute the FAIR IP as the basis for a possible open risk taxonomy standard.

Over a period of time, the Security Forum membership worked to create a standard comprising the relevant aspects of FAIR (initially, the FAIR Risk Taxonomy). The result of this work was the publication of the first version of the Risk Taxonomy Standard (O-RT) in January 2009. In 2012, the Security Forum decided to create a certification program for practitioners of the FAIR methodology, and undertook two related efforts: updating the Risk Taxonomy Standard, and creating a companion standard, the Risk Analysis Standard (O-RA). O-RA provides guidance on the process aspects of Risk Analysis that are lacking in O-RT, including risk measurement and calibration, the Risk Analysis process, and control considerations relating to Risk Analysis. The updated O-RT standard and the O-RA standard were published in late 2013, and the standards are available here:

C13G Risk Analysis (O-RA)

C13K Risk Taxonomy (O-RT), Version 2.0

We collectively refer to these two standards as the Open FAIR body of knowledge.  In late 2013, we also commenced operation of the Open FAIR Certification Program for Risk Analysts. In early 2014, we started development of an accreditation program for Open FAIR accredited training courses. The current list of accredited Open FAIR courses is found here. If you are with a training organization and want to explore accreditation, please feel free to contact us, and we can provide details. We have also created licensable Open FAIR courseware that can enable you to get started quickly with training on Open FAIR. Future articles will dive deeper into the Open FAIR certification program and the accredited training opportunity. It is worth noting at this point that we have also produced some hard copy Open FAIR guides that are helpful to candidates seeking to certify to Open FAIR. These are accessible via the links below, and are available at a nominal cost from our publishing partner Van Haren.

B140   Open FAIR Foundation Study Guide

G144  A Pocket Guide to the Open FAIR Body of Knowledge
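The core relationships defined in the Risk Taxonomy Standard can be sketched in a few lines of code: Loss Event Frequency is derived from Threat Event Frequency and Vulnerability, and risk is the combination of Loss Event Frequency and Loss Magnitude. This is a deterministic simplification for illustration only — an actual Open FAIR analysis treats these factors as calibrated ranges and distributions rather than point values — and the scenario numbers below are invented:

```python
# Deterministic sketch of the core Open FAIR taxonomy relationships:
#   Loss Event Frequency (LEF) is derived from Threat Event Frequency
#   (TEF) and Vulnerability; risk combines LEF with Loss Magnitude.
# Real Open FAIR analyses use calibrated ranges/distributions, not
# point values; the numbers here are invented for illustration.

def loss_event_frequency(tef_per_year, vulnerability):
    """LEF: how often threat events are expected to become loss events.
    vulnerability is the probability (0..1) a threat event succeeds."""
    return tef_per_year * vulnerability

def annualized_exposure(lef, loss_magnitude):
    """Probable annualized loss exposure for the scenario."""
    return lef * loss_magnitude

# Example scenario: 20 threat events/year, 25% succeed,
# $50,000 of loss per loss event.
lef = loss_event_frequency(20, 0.25)          # 5 loss events/year
exposure = annualized_exposure(lef, 50_000)   # $250,000/year
print(lef, exposure)
```

Even this toy version shows why the taxonomy matters: it forces an analyst to separate how often losses occur from how much each loss costs, rather than arguing about an undifferentiated "risk score."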

Beyond the standards and certification program work, The Open Group has produced a number of other helpful publications relating to Risk, Security, and the use of Open FAIR. These include the following, all of which are available as free downloads:

W148  An Introduction to the Open FAIR Body of Knowledge

C103  FAIR – ISO/IEC 27005 Cookbook

G167  The Open FAIR™ – NIST Cybersecurity Framework Cookbook

G152  Integrating Risk and Security within a TOGAF® Enterprise Architecture

G081  Requirements for Risk Assessment Methodologies

W150  Modeling Enterprise Risk Management and Security with the ArchiMate® Language

Other Active Open FAIR Workgroups in the Security Forum

In addition to the standards and best practices described above, The Open Group has active workgroups developing the following related items.  Stay tuned for more details of these activities.   If any of the following projects are of interest to your organization, please feel free to reach out to learn more.

1) Open FAIR to STIX Mapping Whitepaper – This group is writing a whitepaper that maps the Open FAIR Risk Taxonomy Standard (O-RT) to STIX, a standard that originated at MITRE and is being developed by OASIS.

2) Open FAIR Process Guide project – This group is writing a process guide for performing Open FAIR-based Risk Analysis. This guide fills a gap in our standards and best practices by providing a “how-to” process guide.

3) Open Source Open FAIR Risk Analysis tool – A basic Open FAIR Risk Analysis tool is being developed for students and industry.

4) Academic Program – A program is being established at The Open Group to support active student intern participation in risk activities within the Security Forum. The mission is to promote the development of the next generation of security practitioners and to build experience within a standards body.

5) Integration of Security and Risk into TOGAF®, an Open Group standard – This project is working to ensure that future versions of the TOGAF standard will comprehensively address security and risk.

How We Do What We Do

The Open Group Security Forum is a member-led group that aims to help members meet their business objectives through the development of standards and best practices. For the past several years, the focus of our work has been in the areas of Risk Management, Security Architecture, and Information Security Management standards and best practices. ‘Member-led’ means that members drive the work program, proposing projects that help them meet their objectives as CISOs, Security Architects, Risk Managers, or operational information security staff. All of our standards and best practices guidance is developed using our open, consensus-based standards process.

The standards development process at The Open Group allows members to collaborate effectively to develop standards and best practices that address real business issues. In the area of Risk Management, most of the publications noted above were created because members saw a need to determine how to apply Open FAIR in the context of other standards or frameworks, and then leveraged the entire Security Forum membership to produce useful guidance.

It is also worth noting that we do a lot of collaborating with other parts of The Open Group, including with the Architecture Forum on the integration of Risk and Security with TOGAF®, with the ArchiMate® Forum on the use of ArchiMate, an Open Group standard, to model Risk and Security, with the Open Platform 3.0™ Forum, and with other Forums. We also have a number of external organizations that we work with, including SIRA, ISACA, and of course the FAIR Institute in the Risk Management area.

The Path Forward for Open FAIR

Our future work in the area of Risk Analysis will likely include other cookbook guides, showing how to use Open FAIR with other standards and frameworks. We are committed to meeting the needs of the industry, and all of our work comes from members describing a need in a given area. So in the area of Risk Management, we’d love to hear from you as to what your needs are, and even more, to have you contributing to the development of new materials.

For more information, please feel free to contact me directly via email or LinkedIn.

 

@theopengroup

Jim Hietala, Open FAIR, CISSP, GSEC, is Vice President, Business Development and Security for The Open Group, where he manages the business team, as well as Security and Risk Management programs and standards activities. He has participated in the development of several industry standards including O-ISM3, O-ESA, O-RT (Risk Taxonomy Standard), O-RA (Risk Analysis Standard), and O-ACEML. He also led the development of compliance and audit guidance for the Cloud Security Alliance v2 publication.

Jim is a frequent speaker at industry conferences. He has participated in the SANS Analyst/Expert program, having written several research white papers and participated in several webcasts for SANS. He has also published numerous articles on information security, risk management, and compliance topics in publications including CSO, The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

An IT security industry veteran, he has held leadership roles at several IT security vendors.

Jim holds a B.S. in Marketing from Southern Illinois University.

Leave a comment

Filed under Accreditations, ArchiMate®, Certifications, Cybersecurity, Open FAIR, Open FAIR Certification, RISK Management, Security, Standards, The Open Group, TOGAF®, Uncategorized

Digital Transformation and Business Architecture (Part 3 of 3) – Presented by Dr. Giovanni Traverso, Huawei

At The Open Group Shanghai 2016 summit, we invited Dr. Giovanni Traverso, Chief Business Architect of HUAWEI Service Strategy and Architecture Practice, to give a keynote speech “New Open Business Architecture (O-BA) to Support the Construction of Digital Business and Smart Government”.

Huawei was a Diamond Sponsor of this summit, is a Platinum Member of The Open Group and is participating in the creation of the O-BA standard, whose first part was launched in July 2016 as a Preliminary Standard.

Giovanni, who is leading this effort within Huawei, presented Huawei’s perspectives on Business Architecture coming from best practices.

This is part three in a three-part series.

Part #3 – Business Architecture Answers the Business Questions

When undertaking our transformation efforts we need to answer, in a structured way, the inherent business questions, such as:

How do we ensure a common understanding of the transformation within the organization? How do we align an organization and its constituents toward the goal? Do we have the necessary skills, and what changes should we drive, even in our organization’s culture? How do we unleash technology-driven innovation to lead business model innovation, and vice versa? How are processes and the organization going to be impacted? How do we identify priorities? What about dependencies and risks?

More importantly: how do we focus investments and ensure that the desired business outcomes will be achieved?

How are our partners and channels going to be impacted? How is our customers’ and users’ experience going to be impacted?

The O-BA standard reflects industry best practices addressing the business view of a transformation.

There are several techniques to uncover the capabilities that a business requires. Any of them will try to combine the business stakeholders’ views in a holistic picture.

For the sake of a digital business, as discussed in my first post, we want to start from the intended customer experience and make it our baseline.


The relevant customer journeys – tailored to customer personas – represent the main value stream, which is the backbone of the whole story.

Along with the customer’s perspective, we will align the relevant value streams of the other stakeholders, including their pursued goals.

Then we will analyze them and uncover the capabilities that enable each stage of the value streams and their respective relationships (e.g. information exchanges).

Capabilities will be mapped to organization and then we will identify respective enablers such as technology, application, process, information, skills.
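The chain described above — journey stages to enabling capabilities to their enablers (technology, application, process, information, skills) — can be sketched as a small data structure. This is an illustrative sketch only; all stage, capability and enabler names below are hypothetical:

```python
# Illustrative sketch of value-stream-to-capability traceability.
# Journey stages map to capabilities; capabilities map to enablers.
# All names are hypothetical examples, not from the O-BA standard.

journey = {
    "Discover offer": {"capabilities": ["Product Catalog Management"]},
    "Order online":   {"capabilities": ["Digital Order Capture",
                                        "Payment Processing"]},
}

capability_enablers = {
    "Product Catalog Management": {"application": "Catalog app",
                                   "skill": "Product data stewardship"},
    "Digital Order Capture":      {"application": "Web storefront",
                                   "process": "Order validation"},
    "Payment Processing":         {"application": "Payment gateway",
                                   "process": "Compliance checks"},
}

def enablers_for_stage(stage):
    """Trace one journey stage down to the enablers of its capabilities."""
    return {c: capability_enablers[c] for c in journey[stage]["capabilities"]}

print(sorted(enablers_for_stage("Order online")))
# -> ['Digital Order Capture', 'Payment Processing']
```

Keeping the links explicit like this is what makes the later traceability questions answerable: change a capability and you can immediately list the journey stages and enablers affected.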


Business Architecture builds the overall picture and connects the dots in a logical and traceable framework, generating blueprints that represent the current state, future state and possible intermediate steps of a transformation. In this way we can find answers to our business questions.

O-BA describes a typical transformation lifecycle providing a framework for two dimensions of traceability: vertical (from strategy to competitive assessment to investment to implementation and outcomes) and horizontal (across different domains of the organization/ecosystem).


While O-BA is meant to be the overarching framework for a transformation, technology, data and application architectures complete the view according to TOGAF®, an Open Group standard.


Diagrams like the one below show the overall transformation and the traceability across business objectives (with the customer’s on top), the capabilities and interconnections that achieve those objectives, the metrics that audit the capabilities, and the applications and investments that enable the desired capabilities. In a closed loop, Business Architecture provides a framework to control the return on investment from a complex transformation.

[Diagram: traceability from business objectives through capabilities and metrics to the applications and investments that enable them]

More details on this example are given in the linked whitepaper produced by Huawei and published by The Open Group: https://www2.opengroup.org/ogsys/catalog/W166

The O-BA is an initiative in the Architecture Forum of The Open Group, driven by six Platinum Members (Capgemini, Hewlett Packard Enterprise, Huawei, IBM, Oracle, Philips). It intends to standardize a common understanding of Business Architecture, reflecting the best practices in the industry (most notably with a contribution by the Business Architecture Guild).

Members of The Open Group can download this presentation at http://www.opengroup.org/public/member/proceedings/Shanghai-2016-08/Presentations/Giovanni%20Traverso-Keynote4.pdf

The Open Group Shanghai 2016 event proceedings are available for members here.

@theopengroup


Giovanni Traverso
• 28 years in telecom business, Product Management, R&D Management, Business Unit GM and Transformation Management
• Now leading the Enterprise Architecture team at Huawei Global Services, Standard and Industry Development Dept.
• Certified Business Architect (CBA)
• Contributor to The Open Group Open Business Architecture (O-BA) Standard and the Business Architecture Body of Knowledge (BizBOK)

Comments Off on Digital Transformation and Business Architecture (Part 3 of 3) – Presented by Dr. Giovanni Traverso, Huawei

Filed under Business Architecture, Digital Transformation, Open Business Architecture (O-BA), Standards, The Open Group, TOGAF®, Uncategorized