
Do One Thing and Do It Well

By The Open Group

One notable example of “fake news” in 2016 was the announcement that Dennis Ritchie, one of the original authors of the UNIX® Operating System, had passed away. In fact, he died in 2011, a week after the death of Steve Jobs. 2016 was in fact the fifth anniversary of his passing – but one in which the extent of his contribution to the world was not overshadowed by others, and could be properly acknowledged.

Much of the central UNIX philosophy that he engineered alongside Bell Labs colleagues Ken Thompson and Brian Kernighan lives on to this day: building systems from a range of modular and reusable software components, so that while many UNIX programs do quite trivial things in isolation, they combine with other programs to become general and useful tools. The envisioned ability to design and build systems quickly, and to reuse tried and trusted software components, remains a cultural norm in environments that employ Agile and DevOps techniques some 45 years later.
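To make that philosophy concrete, here is a minimal sketch in Python (the word-frequency task and the input file name are our own illustrative assumptions, not anything from the original UNIX sources): each standard tool does one small job, and chaining them produces a genuinely useful report.

    import subprocess

    # Equivalent shell pipeline:
    #   tr -cs 'A-Za-z' '\n' < input.txt | sort | uniq -c | sort -rn
    # Each stage does one thing: split into words, sort, count, rank.
    def word_frequencies(path):
        tr = subprocess.Popen(["tr", "-cs", "A-Za-z", "\n"],
                              stdin=open(path), stdout=subprocess.PIPE)
        srt = subprocess.Popen(["sort"], stdin=tr.stdout, stdout=subprocess.PIPE)
        cnt = subprocess.Popen(["uniq", "-c"], stdin=srt.stdout, stdout=subprocess.PIPE)
        rank = subprocess.Popen(["sort", "-rn"], stdin=cnt.stdout, stdout=subprocess.PIPE)
        out, _ = rank.communicate()
        return out.decode()

    if __name__ == "__main__":
        print(word_frequencies("input.txt"))   # "input.txt" is a hypothetical file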


Their foresight was such that the same tools and user interface norms were replicated by the GNU project atop the Linux kernel. With the advent of the Internet, and interconnect standards agreed by the IETF and latterly the W3C consortium, the same philosophy extended to very low-cost, industry-standard servers. The subsequent substitution of vendor-specific buses with ever faster Ethernet and IP-based connections allowed processors, storage and software components to be distributed in a scale-out fashion. The very nature of these industry standards meant that the geography over which these system components could be distributed extended well beyond a single datacentre – in some cases, and cognizant of latency and reliability concerns, to work worldwide. The end result is that while traditional UNIX systems embody reliability and n+1 scaling, there is another approach, based on the same core components, that can scale out. With that, an operation as simple as a search on Google can involve the participation of over 1,000 geographically dispersed CPUs, and return results to the end user typically in under 200 milliseconds. However, building such systems – which are architected assuming individual device and communication path failures – tends to follow a very different set of design patterns.
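The scatter-gather pattern behind that kind of scale-out search can be sketched roughly as follows. This is not any particular vendor’s design; the shard names, latency budget and failure rates below are invented purely to illustrate how such systems are architected to tolerate individual device and path failures.

    import concurrent.futures
    import random
    import time

    NODES = [f"shard-{i}" for i in range(16)]   # stand-ins for dispersed servers

    def query_shard(node, term):
        """Pretend to search one shard; some calls are slow or fail outright."""
        time.sleep(random.uniform(0.01, 0.3))
        if random.random() < 0.05:
            raise RuntimeError(f"{node} unreachable")
        return [(f"{node}/doc{i}", random.random()) for i in range(3)]

    def search(term, budget_s=0.2):
        """Fan out, keep only answers that arrive inside the latency budget, merge."""
        results = []
        with concurrent.futures.ThreadPoolExecutor(max_workers=len(NODES)) as pool:
            futures = [pool.submit(query_shard, n, term) for n in NODES]
            done, _late = concurrent.futures.wait(futures, timeout=budget_s)
            for f in done:
                try:
                    results.extend(f.result())
                except RuntimeError:
                    pass   # a failed shard degrades the answer; it does not break it
        return sorted(results, key=lambda r: r[1], reverse=True)[:10]

    print(search("unix philosophy"))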

The economics of using cloud-based Linux infrastructure is often perceived as attractive, though we’re just past the “war” stage where each cloud vendor’s stack is inherently proprietary. There are some laudable efforts to abstract code so that it can run on multiple cloud providers; one is FOG in the Ruby ecosystem. Another is CloudFoundry, which is executing particularly well in Enterprises with large investments in Java code. Emergent Serverless platforms (event-driven, auto-scalable function-as-a-service, where the whole supporting infrastructure is abstracted away) are probably the most extreme examples of chaotic evolution – and very vendor-specific – at the time of writing.
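The abstraction idea behind efforts such as FOG can be sketched in rough terms (in Python rather than Ruby, and with invented class and method names; the real libraries’ APIs differ): the application codes against a neutral interface, and each cloud provider’s specifics live behind an adapter that can be swapped without touching the business logic.

    from abc import ABC, abstractmethod

    class ObjectStore(ABC):
        """Neutral interface the application codes against."""
        @abstractmethod
        def put(self, bucket: str, key: str, data: bytes) -> None: ...
        @abstractmethod
        def get(self, bucket: str, key: str) -> bytes: ...

    class InMemoryStore(ObjectStore):
        """Local stand-in; a real deployment would plug in one adapter per cloud vendor."""
        def __init__(self):
            self._blobs = {}
        def put(self, bucket, key, data):
            self._blobs[(bucket, key)] = data
        def get(self, bucket, key):
            return self._blobs[(bucket, key)]

    def archive_report(store: ObjectStore, report: bytes) -> None:
        # Application logic stays the same whichever provider backs the store.
        store.put("reports", "2017-01.txt", report)

    store = InMemoryStore()
    archive_report(store, b"quarterly numbers")
    print(store.get("reports", "2017-01.txt"))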

The antithesis of open platforms is the effort to make full use of unique features in each cloud vendor’s offerings – a traditional lock-in strategy intended to stop those services becoming a price-led commodity. It is the sort of thing the UNIX community solved together many years ago by agreeing effective, vendor-independent standards, where certification engendered an assurance of compatibility and trust, allowing the industry to focus on higher-end services to delight end customers without fear of unnecessary lock-in.

Given the use of software designed to functionally mirror that of UNIX systems, one very valid question is: “What would it take for Linux vendors to have their distributions certified against recognized, compelling industry standards – such as UNIX 03?” That way, customers could ascribe the same level of vendor-independent assurance and trust achieved by the largest Enterprise UNIX system vendors to the “scale-out” sibling.

Given the licensing conditions on the Linux kernel and associated open source components, it is no mean feat that both Huawei and Inspur have achieved certification of their Red Hat Linux-derived EulerOS 2.0 and Inspur K-UX 3.0 operating systems respectively – an indication that their customers have the most Enterprise-ready Linux OS available on Intel architecture server platforms today.

This is a level of certification that we don’t think will go unnoticed in large emerging markets of the world. That said, we’d welcome any other Linux vendor to prove their compliance to the same standard. In the interim, well done Huawei, and well done Inspur – proving it can be done.

References:

Version 3 of the Single UNIX® Specification: the UNIX 03 Product Standard:

https://www.opengroup.org/openbrand/register/xym0.htm

Huawei Technology achieves UNIX® 03 Conformance of Huawei EulerOS 2.0 Operating System:

https://www.opengroup.org/openbrand/register/brand3622.htm

Inspur achieves UNIX® 03 Conformance of Inspur K-UX 3.0:

https://www.opengroup.org/openbrand/register/brand3617.htm

UNIX® Philosophy: https://en.wikipedia.org/wiki/Unix_philosophy

http://www.opengroup.org/unix

@theopengroup

 



The Open Group San Francisco Day Two Highlights

By The Open Group

Day two of The Open Group San Francisco event was held Tuesday, January 31 on another sunny winter day in San Francisco. Tuesday’s welcome address featured Steve Nunn, President & CEO, and Jim Hietala, VP Business Development and Security, both of The Open Group, greeting attendees for a morning of sessions centered around the theme of Making Standards Work®. Nunn kicked off the morning by reporting that the first day of the conference had been very well received, with copious positive feedback on Monday’s speakers.

It was also announced that the first certification courses for ArchiMate® 3.0, an Open Group standard, kicked off at the conference. In addition, the San Francisco event marked the launch of The Open Group Open Process Automation™ Forum, a Forum of The Open Group, which will address standards development for open, secure, interoperable process control architectures. The Forum will include end users, suppliers, systems integrators, integrated DCS vendors, standards organizations and academics from a variety of industries, including food and beverage, oil and gas, pulp and paper, petrochemical, pharmaceuticals, metals and mining, and utilities. Hietala joined Nunn on stage to discuss the launch of the Forum, which came out of a vision from ExxonMobil. The Forum has already grown rapidly, with almost 100 members. Forum Members are also attending and holding events at the annual ARC Advisory Group Industry Forum in Orlando.

The morning plenary began with Dennis Stevens from Lockheed Martin discussing “The Influence of Open Architecture Standards on the Emergence of Advance Process Control Systems.” Stevens, who is involved in The Open Group FACE™ Consortium, will also be leading the Open Process Automation Forum. Stevens opened by saying that this is a particularly exciting time in industrial automation due to the intersection of standards, technology and automation. According to Stevens, the work that has been done in the FACE Forum over the past few years has paved the way for what also needs to be done in process automation.

Stevens noted that many of the industrial systems in use today will be facing obsolescence in the next few years due to a variety of reasons, including a proliferation of proprietary and closed systems, a lack of sophisticated development tools and the high cost of technology refreshes. Tech trends such as the Internet of Things, cybersecurity, open source and virtualization are also forcing a need for industrial manufacturers to change. In addition, the growth of complexity in software systems and the changeover from hardware-dominant to software-dominant systems are also compelling factors for automation change. However, Stevens says, by reusing existing standards and creating new ones, there are many opportunities for cost savings and reducing complexity.

According to Stevens, the goal is to standardize the interfaces that companies can use so there is interoperability across systems built atop a common framework. By standardizing the interface only, organizations can still differentiate themselves by bringing their own business processes and designs to those systems via hardware or software components. In addition, by bringing elements from the FACE standardization model to Open Process Automation, the new forum can also take advantage of proven processes that already take into account regulations around co-opetition and anti-trust. Stevens believes that Open Process Automation will ultimately enable new markets and suppliers for process automation as well as lower the cost of doing business in industrial automation.
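Stevens’ point about standardizing the interface only can be illustrated with a small sketch (Python, with invented names; the actual Open Process Automation interfaces were still to be defined at the time of writing): two vendors conform to the same published contract while differentiating themselves in how their components behave behind it.

    from typing import Protocol

    class ControlComponent(Protocol):
        """The standardized interface: what every conforming component must expose."""
        def read_sensor(self) -> float: ...
        def actuate(self, setpoint: float) -> None: ...

    class VendorAController:
        """One vendor's implementation; the proprietary value is in *how* it controls."""
        def __init__(self):
            self._value = 20.0
        def read_sensor(self) -> float:
            return self._value
        def actuate(self, setpoint: float) -> None:
            self._value += 0.5 * (setpoint - self._value)   # gentle proportional step

    class VendorBController:
        def __init__(self):
            self._value = 20.0
        def read_sensor(self) -> float:
            return self._value
        def actuate(self, setpoint: float) -> None:
            self._value = setpoint                          # aggressive direct move

    def run_loop(component: ControlComponent, setpoint: float, steps: int = 3) -> None:
        # The plant-level application only ever sees the standard interface.
        for _ in range(steps):
            component.actuate(setpoint)
            print(round(component.read_sensor(), 2))

    for controller in (VendorAController(), VendorBController()):
        run_loop(controller, setpoint=25.0)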

Following the morning break, Chair of the Department of Economics at San Jose State University Dr. Lydia Ortega took the stage for the second morning session, entitled “Innovative Communities.” Ortega took a refreshing look at what The Open Group does and how it works by applying economic theory to illustrate how the organization is an “innovative community.” Ortega began by providing what she called an “economist’s definition” of what open standards are, which she defined as a collection of dispersed knowledge that is a building block for innovation and is continually evolving. She also described open standards as a “public good,” since they are knowledge-based, non-rivalrous, non-excludable, and produced once and available to others at marginal cost. Teamwork, consensus and community are also characterizing features of what makes the organization work. Ortega plans to continue her research into what makes The Open Group work by examining competing standards bodies and the organization’s origins, among other things.

Prior to introducing the next session, Steve Nunn presented an award to Steve Whitlock, a long-time Open Group member who recently retired from Boeing, for more than 20 years of leadership, contributions and service to The Open Group. Colleagues provided additional praise for Whitlock and his willingness to lead activities on behalf of The Open Group and its members, particularly in the area of security.

The morning’s third session featured Mike Jerbic, Principal Consultant for Trusted System Consulting Group, highlighting how the “Norwegian Regional Healthcare Project & Open FAIR” have been used to analyze the cost benefits of a home treatment program for dialysis patients in Norway. Currently, due to health and privacy regulations and security requirements, patients who receive home dialysis must physically transport data regarding their treatments to hospitals, which affects the quality of patients’ lives but protects the state from security issues related to transporting data online. Jerbic and a group of economics students at San Jose State University in California did an economic analysis to examine the costs vs. benefits of the program. Using The Open Group Open FAIR™ body of knowledge to analyze the potential threats to both patient privacy and information security, the group found it would make sense to pose the program risks as an engineering problem to be solved. However, additional research is needed to weigh the potential cost savings to the state against the quality-of-life benefits for patients.
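Open FAIR frames risk quantitatively: an annualized loss is a loss event frequency multiplied by a loss magnitude. A toy calculation in that spirit (every figure below is an invented placeholder, not a number from the Norwegian study) shows how the home-dialysis question can be posed as an engineering trade-off rather than a matter of opinion.

    # Toy Open FAIR-style comparison (all numbers are invented placeholders).
    def annualized_loss(loss_event_frequency: float, loss_magnitude: float) -> float:
        """Expected annual loss = how often a loss event occurs x what it costs."""
        return loss_event_frequency * loss_magnitude

    # Scenario A: patients physically transport their treatment data.
    transport_risk = annualized_loss(loss_event_frequency=0.02, loss_magnitude=50_000)
    transport_cost = 2_000_000     # travel, staff time, quality-of-life proxy

    # Scenario B: data is transmitted online with added security controls.
    online_risk = annualized_loss(loss_event_frequency=0.10, loss_magnitude=100_000)
    online_cost = 400_000          # connectivity, monitoring, audits

    for name, risk, cost in [("transport", transport_risk, transport_cost),
                             ("online", online_risk, online_cost)]:
        print(f"{name}: expected annual loss {risk:,.0f} "
              f"+ operating cost {cost:,.0f} = {risk + cost:,.0f}")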

Concluding Tuesday’s plenary sessions was a panel entitled “Open FAIR in Practice,” which extended the conversation regarding the Norwegian healthcare project by taking questions from the audience about the program. Jerbic moderated the panel, which included Ortega; Eva Kuiper, ESS GRC Security Consultant, HPE; John Linford, Lecturer, Department of Economics, San Jose State University; and Sushmitha Kasturi, Undergraduate Researcher, San Jose State University.

Jerbic also announced that a number of students from San Jose State, many of whom were in attendance, have recently either completed or begun their certification in Open FAIR.  He also talked about an Academic Program within The Open Group that is working with students on projects that are mutually beneficial, allowing The Open Group to get help with the work needed to create standards, while providing important practical work experience for students.



San Jose State University Students

Following the plenary, Tuesday’s lunchtime partner presentation featured Sean Cleary, Senior Consultant, Orbus Software, presenting on “Architecture Roadmap Visualization with ArchiMate® 3.0.”

Afternoon sessions were split into two tracks, Cognitive Computing and EA in Practice.

  • EA in Practice – Hosted by Len Fehskens of the Association of Enterprise Architects, two sessions looked at maxims and folktales for architects, presented by Fehskens, and how to enable government and management with continuous audits with Robert Weisman, CEO/COO of Build the Vision.
  • Cognitive Computing – Chris Harding from The Open Group served as host for four sessions in the track:
    • Ali Arsanjani, CTO for Analytics and Emerging Technologies, IBM – Arsanjani provided an overview of different ways that data can be structured for cognitive computing applications. According to Arsanjani, cognitive systems are meant to augment, not replace, human systems and to be of service to us. By combining human interaction and curation with automated data analysis and machine learning, companies will be able to gain greater business advantages. However, we must also always be aware of the implications of using artificial systems and the potential consequences of doing so, he said.
    • Jitendra Maan, Enterprise Architect and Center of Excellence Lead, Tata Consultancy Services – Maan says cognitive computing signals a shift in how machines interact with humans, other machines and the environment, with potential for new categories of business outcomes and disruption. The design of automated systems is critical to how cognitive systems are expected to evolve but unlike traditional computing, cognitive will rely on a combination of natural language processing, machine learning and data. Potential business applications already in progress include service support centers, contract management, risk assessment, intelligent chat bots and conversation work flows. Maan predicts bots will actually replace many service functions in the next few years.
    • Swaminathan Chandrsekaran, Industry Apps & Solutions, IBM Watson – Chandrsekaran’s talk took a deeper dive into cognitive computing and the make-up of cognitive systems. Understanding, reason, learning and interaction are key to teaching cognitive systems how to work, he said. Cognitive systems are also broadly categorized around language, speech, vision and data & insights, much like the human brain. Patterns can generally be created from cognitive conversations, discovery and application extensions. Chandrsekaran also shared how to model a reference architecture for a cognitive conversation pattern (a minimal sketch of such a pattern follows this list).
    • The Cognitive Computing panel, moderated by Harding, included afternoon speakers Arsanjani, Maan and Chandrsekaran. The panel discussed how businesses can gain advantage from cognitive computing, personalization and contextualization learned via systems training, the time it takes to train a system (now days or weeks rather than months or years), making systems more intelligent over time, and the need to aggregate and curate domain-relevant data from the very beginning of a project.
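As a very reduced sketch of the conversation pattern mentioned above (Python; the intents, confidence threshold and canned responses are invented, and real platforms such as IBM Watson expose far richer interfaces), a cognitive conversation essentially classifies the user’s intent, checks its confidence, and either responds or asks for clarification.

    def classify_intent(utterance: str):
        """Stand-in for an NLU service: return (intent, confidence)."""
        keywords = {"balance": "check_balance", "transfer": "make_transfer", "help": "get_help"}
        for word, intent in keywords.items():
            if word in utterance.lower():
                return intent, 0.9
        return "unknown", 0.3

    RESPONSES = {
        "check_balance": "Your balance is available in the account overview.",
        "make_transfer": "Sure - which account would you like to transfer from?",
        "get_help": "Connecting you with a human agent.",
    }

    def converse(utterance: str, threshold: float = 0.7) -> str:
        intent, confidence = classify_intent(utterance)
        if confidence < threshold:
            # Low confidence: ask the user to rephrase instead of guessing.
            return "Sorry, I didn't catch that. Could you rephrase?"
        return RESPONSES[intent]

    print(converse("What's my balance?"))
    print(converse("Sing me a song"))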

The day concluded with a social event and dinner for attendees held at the Autodesk Gallery, a San Francisco destination that marries creativity, design and engineering in more than 20 exhibits sponsored by companies such as Lego and Mercedes Benz.


Networking at the Autodesk Gallery

The following day, the event offered track sessions in areas including  Internet of Things (IoT) and Architecture.  The Open Group San Francisco drew to a close with Members Only Meetings on February 2.

@theopengroup #ogSFO

We are looking forward to seeing you at The Open Group Berlin April 24-27, 2017! #ogBER

 



The Open Group San Francisco Day One Highlights

By The Open Group

The Open Group kicked off its first event of 2017 on a sunny Monday morning, January 30, in the City by the Bay, with over 200 attendees from 20 countries including Australia, Finland, Germany and Singapore.

The Open Group CEO and President Steve Nunn began the day’s proceedings with a warm welcome and the announcement of the latest version of the Open Trusted Technology Provider™ Standard (O-TTPS), a standard that specifies best practices for providers to help them mitigate the risk of tainted or counterfeit products or parts getting into the IT supply chain. A new certification program for the standard was also announced, as well as the news that the standard has recently been ratified by ISO. Nunn also announced the availability of the next version of The Open Group IT4IT™ standard, version 2.1.

Monday’s plenary focused on IT4IT and Managing the Business of IT. Bernard Golden, CEO of Navica, spoke on the topic, “Cloud Computing and Business Expectations: How the Cloud Changes Everything.” Golden, who was named as one of the 10 most influential people in cloud computing by Wired magazine, began with a brief overview of the state of the computing industry today, which is largely characterized by the enormous growth of cloud computing. Golden believes that the public cloud will be the future of IT moving forward. With the speed that the cloud enables today, IT and app development have become both the bottleneck and differentiator for IT departments. To address these bottlenecks, IT must take a multi-pronged, continuous approach that uses a combination of cloud, Agile and DevOps to address business drivers. The challenge for IT shops today, Golden says, is also to decide where to focus and what cloud services they need to build applications. To help determine what works, IT must ask whether services are above or below what he calls “the value line,” which delineates whether the services available, which are often open-source, will ultimately advance the company’s goals or not, despite being low cost. IT must also be aware of the fact that the value line can present a lock-in challenge, creating tension between the availability of affordable—but potentially buggy—open-source tools and services and the ongoing value the business needs. Ultimately, Golden says, the cloud has changed everything—and IT must be willing to change with it and weigh the trade-offs between openness and potential lock-in.

Forrester Research analysts David Wheable, Vice President and Principal Consultant, and David Cannon, Vice President and Group Director, took the stage following Golden’s session to discuss “The Changing Role of IT: Strategy in the Age of the Customer.” Wheable spoke first, noting that technology has enabled a new “age of the customer,” an era where customers now have the majority of the power in the business/customer relationship.  As such, companies must now adapt to how their customers want to interact with their businesses and how customers use a company’s business applications (particularly via mobile devices) in order to survive and prevent customers from constantly changing their loyalties. Because IT strategists will not be able to predict how customers will use their applications, they must be able to put themselves in a position where they can quickly adapt to what is happening.

Cannon discussed what IT departments need to consider when it comes to strategy. To develop a viable IT strategy today, companies must consider what is valuable to the customer and how they will choose the technologies and applications that provide customers what they need. In the current IT landscape, features and quality no longer matter—instead, IT must take into account customers’ emotions, desires and immediate needs. Continuous exploitation of digital assets to deliver customer outcomes will be critical for both digital and business strategies—which Cannon argues are now essentially the same thing—moving forward. To survive in this new era, IT departments must also be able to enable customer outcomes, measure the customer experience, manage a portfolio of services, showcase business—not just technical—expertise and continue to enable service architectures that will deliver what customers need and want.

After the morning coffee break, Author and Researcher Gene Kim followed to discuss his recent book, The DevOps Handbook. His session, entitled “The Rise of Architecture: Top Lessons Learned while Researching and Writing The DevOps Handbook,” explored the example of high performers in the tech sector and how the emergence of DevOps has influenced them. According to Kim, most IT departments are subject to a downward spiral over time due to the exponential growth of technical assets and debt during that time, which ultimately weigh them down and affect performance. In contrast, according to Kim’s research, high-performing organizations have been able to avoid this spiral by using DevOps. Organizations utilizing DevOps are nearly three times more agile than their peers, are more reliable and are two times more likely to exceed profitability, market share and productivity goals in the marketplace. The ability to deploy small changes more frequently has been a game changer for these high-performing organizations, not only allowing them to move faster but also creating more humane working conditions and happier, more productive workers. Kim also found that fear of doing deployments is the most accurate predictor of success in organizations—those that fear deployments have less success than those that don’t.


Gene Kim

The final session of the morning plenary was presented by Charles Betz, IT Strategist, Advisor and Author from Armstrong Process Group. Betz provided an overview of how the IT4IT framework can be used within organizations to streamline IT processes, particularly by automating systems that no longer need to be done by hand. Standardizing IT processes also provides a way to deliver more consistent results across the entire IT value chain for better business results. Taking an iterative and team-oriented approach are also essential elements for managing the body of knowledge necessary for changing IT processes and creating digital transformation.

During the lunch hour, conference partners Hewlett Packard Enterprise and Simplilearn each gave separate presentations for attendees, discussing the use of IT4IT for digital transformation and skills acquisition in the digital economy, respectively.

Monday afternoon, The Open Group hosted its fourth TOGAF®, an Open Group standard, User Group meeting in addition to the afternoon speaking tracks. The User Group meeting consisted of an Oxford style debate on the pros and cons of “Create versus Reuse Architecture,” featuring Jason Uppal, Open CA Level 3 Certified Architect, QRS, and Peter Haviland, Managing Director, Head of Engineering & Architecture, Moody’s Corporation. In addition to the debate, User Group attendees had the opportunity to share use cases and stories with each other and discuss improvements for TOGAF that would be beneficial to them in their work.

The afternoon sessions consisted of five separate tracks:

  • IT4IT in Practice – Rob Akershoek from Logicalis/Shell Information Technology International moderated a panel of experts from the morning plenary as well as sessions related to presenting IT4IT to executives, the role of EA in the IT value chain and using IT4IT with TOGAF®.
  • Digital Business & the Customer Experience – Featuring sessions on architecting digital businesses and staying ahead of disruption hosted by Ron Schuldt of Femto-data.
  • Open Platform 3.0™/Cloud – Including talks on big data analytics in hybrid cloud environments and using standards and open source for cloud customer reference architectures hosted by Heather Kreger, Distinguished Engineer and CTO International Standards, IBM.
  • Open Trusted Technology – Trusted Technology Forum Director Sally Long introduced sessions on the new O-TTPS self-assessed certification and addressing product integrity and supply chain risk.
  • Open Business Architecture – Featuring an introduction to the new preliminary Business Architecture (O-BA) standard presented by Patrice Duboe, Innovation VP, Global Architects Leader from the CTO Office at Capgemini, and Venkat Nambiyur, Director – Business Transformation, Enterprise & Cloud Architecture, SMBs at Oracle.

Monday’s proceedings concluded with an evening networking reception featuring the day’s speakers, IT professionals, industry experts and exhibitors. Thanks also go to the San Francisco event sponsors, which include Premium Sponsors Good eLearning, Hewlett Packard Enterprise, Orbus Software and Simplilearn, as well as sponsors Van Haren Publishing, the Association of Enterprise Architects and San Jose State University.

@theopengroup #ogSFO



The Open Trusted Technology Provider™ Standard (O-TTPS) – Approved as ISO/IEC 20243:2015 and the O-TTPS Certification Program

By The Open Group

The increase of cybersecurity threats, along with the global nature of Information and Communication Technology (ICT), results in a threat landscape ripe for the introduction of tainted (e.g., malware-enabled or malware-capable) and counterfeit components into ICT products. This poses significant risk to customers in the operation of their business enterprises and our critical infrastructures.

A compromised electronic component or piece of malware-enabled software that lies dormant and undetected within an organization could cause tremendous damage if activated remotely. Counterfeit products can also cause significant damage to customers and providers, resulting in rogue functionality, failed or inferior products, loss of revenue and brand equity, and critical damage.

As a result, customers now need assurances they are buying from trusted technology providers who follow best practices with their own in-house secure development and engineering practices and also in securing their out-sourced components and their supply chains.

Summary

The O-TTPS, an Open Group Standard, specifies a set of best practice requirements and recommendations that ICT providers should follow throughout the full life cycle of their products from design through disposal – including their supply chains – in order to mitigate the risk of tainted and counterfeit components. The Standard is the first with a Certification Program that specifies measurable conformance criteria for both product integrity and supply chain security in ICT.

The Standard provides requirements for the full product life cycle, categorizing them further into best practice requirements for Technology Development (product development and secure engineering methods) and Supply Chain Security.

by-the-open-group

The Open Group O-TTPS Certification Program offers certificates for conformance to both the O-TTPS and ISO/IEC 20243:2015, as the two standards are equivalent. The Program identifies the successful applicant on a public registry so customers and business partners can readily identify an Open Trusted Technology Provider™ who conforms to the Standard.

The Certification Program is available to all providers in the ICT product’s supply chain, including: Original Equipment Manufacturers (OEMs), hardware and software component suppliers, integrators, Value-Added Resellers (VARs), and distributors. Thus, it offers a holistic program that not only allows customers to identify trusted business partners like integrators or OEMs who are listed on the registry, but also allows OEMs and integrators to identify trusted business partners like hardware and software component suppliers, VARs, and distributors from the public registry.

by-the-open-group

Target Audience

As the O-TTPS Certification Program is open to all constituents involved in a product’s life cycle – from design through disposal – including those in the product’s supply chain, the Standard and the Certification Program should be of interest to all ICT providers as well as ICT customers.

The newly published guide: O-TTPS for ICT Product Integrity and Supply Chain Security – A Management Guide, available from The Open Group Bookstore at www.opengroup.org/bookstore/catalog/g169.htm, offers guidance to managers – business managers, procurement managers, or program managers – who are considering adopting the best practices or becoming certified as an Open Trusted Technology Provider™. It provides valuable information on:

  • The best practices in the Standard, with an Appendix that includes all of the requirements
  • The business rationale for why a company should consider implementing the Standard and becoming certified
  • What an organization should understand about the Certification Program and how they can best prepare for the process
  • The differences between the options (self-assessed or third-party assessed) that are currently available for the Certification Program
  • The process steps and the terms and conditions of the certification, with pointers to the relevant supporting documents, which are freely available

The Management Guide offers a practical introduction to executives, managers, those involved directly in implementing the best practices defined in the Standard, and those who would be involved in the assessments, whether self-assessment or third-party assessment.

Further Information

The Open Trusted Technology Provider™ Standard (O-TTPS), Version 1.1 is available free-of-charge from www.opengroup.org/bookstore/catalog/c147.htm.

The technically equivalent standard – ISO/IEC 20243:2015 – is available for a fee from iso.org.

For more information on the Open Trusted Technology Provider™ Standard (O-TTPS) and the O-TTPS Certification Program, visit www.opengroup.org/ottps.

@theopengroup #ogSFO



Understanding the Customer Experience: A Conversation with Forrester Analysts David Cannon and David Wheable

By The Open Group

With more technology in the hands of consumers than ever before, customers have become increasingly demanding in terms of not only the service they receive from companies but also the experience they have with a company or brand. Today, companies must be aware of and respond to what customers are looking for in terms of what they get from a company and how they interact—or they risk losing those customers.

This is leaving many companies in a very vulnerable position, particularly when it comes to digital customer experiences. In advance of The Open Group San Francisco 2017, we spoke with David Cannon, Vice President and Group Director, and David Wheable, Vice President and Principal Consultant, both of Forrester Research, about what customer expectations look like today and what companies need to be aware of so that they can survive in an ever-changing digital landscape. Both will be keynote speakers at The Open Group event on January 30.

The customer experience is something that’s been talked about for many years. What’s different now about customers that makes their experiences with companies an even more urgent matter than in the past?

David Cannon (DC): The single most important thing that’s changed is that customers have more choice and the ability to change suppliers within literally seconds. And this is not limited to individual consumers. Enterprises can switch key systems with minimal disruption. The key to retaining customers today is to make sure their experience with you is good—if not, there’s no reason for them to stay.

David Wheable (DW): Building on that is the way we talk about digital business; many of those interactions occur digitally now. The role of technology in that experience now is key. If you don’t deliver a good digital customer experience, as Dave Cannon said, the next one in the line will get the business. I actually did that the other day—one site would not let me log in, so they lost my business and the next one got my business instantly.

DC: David’s right, with digitization, we’re not actually dealing with individuals and human beings, we’re dealing with simple, digital interfaces. This reduces any potential sense of loyalty—we just want what we want, when we want it and that’s it.

That takes away a huge part of how businesses have traditionally run—it’s that relationship they have with the customer that has often set businesses apart. Are there ways that companies can better personalize experience and counteract that loss of human interaction or do they need to also make sure they are continuing to work person-to-person?

DW: That’s an interesting question because particularly when I talk to technical people, they really don’t actually understand what the customer experience is. Forrester defines it in terms of three Es—ease, effectiveness and emotion. Technical people have generally dealt with the ease and effectiveness for many years, so that’s no problem, but what they’re really bad at thinking about is designing for emotion. So if you are trying to have a digital customer experience, digital touch points, and you still have to include the emotion side in it, that’s where the loyalty comes from. Where we see that driven is when organizations look at how the positive, painless, frictionless kinds of experiences drive that kind of loyalty. What we see now is that those companies that are thinking about this are moving away from thinking about products and services and moving toward thinking about the customer in terms of experiences, desires and outcomes, and they might only be a small part of an ecosystem that generates that experience or outcome.

DC: I’ll add to that. One of the secrets to understanding how you’re impacting that emotion is to be able to gather more information about what the customer is doing, how they’re doing it, when they’re doing it and why they’re doing it.  We have tools that can do this better than we’ve ever done it before—without even interviewing or surveying our customers.  We have to be able to infer from whatever they’re doing digitally whether that equates to a good emotion or a negative emotion. The whole area of analytics becomes more important than ever—but it’s also different than before.

To give an example, sites like Yelp or TripAdvisor give you a history of people’s experiences with a restaurant or service provider. But they don’t provide real-time information on whether the thing that upset a customer two years ago is still there. Unless the customer provides constructive feedback that’s visible to all, they don’t help the service provider understand what they can do to make the customer’s experience better. Customer satisfaction ratings are also limited, because they are just a snapshot of a customer at a moment. They don’t always tell us why the customer was (dis)satisfied, or whether they would give the same rating for that service today.

We’re getting better at looking at real-time analytics that tell us, in real-time, what is the context, where are customers using this, why are they using this and how does that impact their experience at that time? Is there a way that we can detect a negative experience and determine exactly what’s causing it and how to change it immediately?

One technique we use is Touchpoint Analysis, which breaks down what a customer does in individual interactions and individual contexts and then figures out how to measure their experience with each touchpoint.  To identify each touchpoint and then instrument it for real time experience was a huge ask, but technology is making it possible.
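As a rough illustration of what instrumenting touchpoints might look like (the touchpoint names and the 1-to-5 scores for Forrester’s three Es are invented for this example), interactions can be logged per touchpoint and then aggregated to reveal where ease, effectiveness or emotion breaks down.

    from collections import defaultdict
    from statistics import mean

    # Each record: (touchpoint, ease, effectiveness, emotion), scored 1-5.
    events = [
        ("login",    5, 5, 4),
        ("login",    1, 2, 1),   # a failed login is exactly the signal to catch
        ("search",   4, 4, 4),
        ("checkout", 3, 5, 2),
        ("checkout", 2, 4, 1),
    ]

    by_touchpoint = defaultdict(list)
    for touchpoint, ease, effectiveness, emotion in events:
        by_touchpoint[touchpoint].append((ease, effectiveness, emotion))

    for touchpoint, scores in by_touchpoint.items():
        avg = [round(mean(dim), 1) for dim in zip(*scores)]
        flag = "  <-- investigate" if min(avg) < 3 else ""
        print(f"{touchpoint:9} ease={avg[0]} effectiveness={avg[1]} emotion={avg[2]}{flag}")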

Personalization and customization have been talked about for at least 20 years now. At this point are there still concerns about privacy and knowing too much about customers? And on the flip side, if companies are relying on data to determine customer interactions rather than personal contact or relationships—and granted large companies can’t rely on personal interactions with thousands of people—does that reliance on data continue the problem of taking away from the human interaction?

DC: It’s kind of a paradox. On the one hand, you’re inventing technology and you’re putting that technology in the hands of users and that distances them from you. At the same time, you’re making them more capable of engaging with you. The very technology that allows you to be more remote (work from home, etc.) is being used to create online communities, friends, go shopping, run a political campaign, etc.  So technology is not only changing patterns of customer behavior, it’s changing how society works.  This is neither good news nor bad (or perhaps it’s a bit of both)—it’s just what’s happening.

On the other hand, by participating in this online society, you are sacrificing privacy. Many people demand better customer experience, fully understanding that that means that companies know more about them.  We’re starting to see some awareness of how ‘creepy’ this can be (being stalked by advertisers in one app because you searched for something in a different app).  But at this stage the search for better customer experience is still more powerful than the need for privacy. Will the pendulum swing the other way?  Definitely, but it will take some time and a more serious revelation of how privacy has been abused than those that have already emerged.

DW: I also think that one of the drivers of loyalty that customers are looking for from a brand is trust in that brand to look after their data appropriately and use it appropriately. What we see again is that it is a business imperative to respect privacy and to use and obscure data appropriately; if the customers of an organization feel that is happening, they will be more loyal to that organization or company than to one whose approach to data they don’t trust.

DC: I totally agree with that. I’d say though that in some cases, the realization that a company has not dealt with my data appropriately comes too late. We’re starting to see a shift to companies being more proactive in communicating how they’re safeguarding your privacy so it becomes more of a selling point for the services they provide. Not only are they going to give you a better experience, they’re going to give you a safer experience as well. Up until now that need for customers to know that up front has not really been as urgent. I think based on what David just said, that’s changing.

With all the high profile security breaches over the past few years, that’s important. On the other hand, if companies have poor service and do things that anger people, it’s as simple as if you’re waiting too long at the airport for your flight and you start tweeting about it, then you’re helping to damage the reputation of the airline.

DC: And what we’ve seen is that some of these companies are monitoring that kind of traffic and recording who those users are that make those statements. Using social media to communicate your experience with a company can also act against your relationship with that company. Some customers have reported negative experiences after they tweet bad things and positive experiences after they tweet good things.

I think the only thing that we can deduce from this is that every type of human interaction that existed before all this technology is now happening using the technology. Just as you were careful in the real world, you have to be careful in the online world. You have to be careful about what you say, about whom and to whom—and that goes for whether you’re a consumer or a company.

Technical people still have to catch up with this a bit. Some think as long as there’s anti-virus or intrusion control on our major systems, we’re OK. What they’re not looking at is the business risk associated with, for example, a privacy breach — we’re not talking about a technical threat here, we’re talking about your business being able to survive or not.

We’re really exploring very new ethical and legislative ground here, and the whole customer experience is really going to test that in the coming years. Just how much information is too much? Just what constitutes private information? Different countries have different views of what constitutes private information, and my ability as a company to place my base of operations in one of the less strict countries means that I can do more, but it also makes me less accountable to my customers—how is that going to impact my business? These questions are still being tested.

When David and I talk in San Francisco, we won’t just be talking about how you get friendlier with your customers and give better service; what we’re really talking about is how you survive as a business in a changing world where the rules are changing every day. That’s a much bigger conversation than how technical people give better customer service—which is what the discussion was before.

You mention that there’s been a gap among companies between those that “look” digital and those that are actually “being” digital. What does that gap look like and how can companies bridge that gap?

DW: Effectively, the way that I try to describe it to people is that a lot of the work on digital up to now has been really about automation. It’s been taking the same approach to business and just using technology to make that more efficient. Whether that’s faster or cheaper, that’s the fundamental role that technology has played in those organizations. But now the technology has hit the point where it’s fundamentally changing the business, so those organizations that are merely looking digital are the ones that are putting a thin veneer over their existing business structure. Quite often, if you dig behind the scenes, what you’ll find is there are still bits of paper going around, there are still people looking at a form that was entered on a website and doing something with it.

Those companies that are truly digital are actually using those digital capabilities to change the way that they do the business. If you look at some of the examples that we use—like John Deere or Burberry—all of them have really gone back to their roots, looked at what their business actually is and then figured out how they can use digital technology to change their interactions with customers, change their outcome and restructure their business completely. You see that with companies like GE standing up and saying ‘we may have been a manufacturing company but now we’re a software and analytics company.’ That whole understanding of what the change means is significant. Those that are looking digital are the ones that are saying ‘we have an e-commerce site, therefore we’re digital.’ That’s not the story.

Why has it traditionally been so difficult for IT departments to execute on technology strategies?

DW: Dave and I spend a lot of time talking to these organizations. The majority of organizations feel stuck in a very operational frame of mind. Very few of them really have a strong ability to understand the context of technology strategy within the business. They tend to think of technology as this abstract and separate item rather than something that’s used to deliver most business results.

That sounds like a case for Enterprise Architecture and for architects to be that bridge between IT and the business.

DW: The challenge is that it shouldn’t be a bridge; the idea is that it should be a fundamental part of the business strategy, not a joining up, not something that you have to interpret. How does that technology deliver the business? It’s not how to back up the business. That’s where we see the real challenge of being digital—having business people who actually understand the digital part and can execute and come up with a digital strategy, rather than Enterprise Architects (EAs) who try to interpret that and come up with the technology.

DC: This is correct only where architects are ‘enterprise’ architects rather than solution or technology architects. We find that many organizations limit their architects to simply translating from the enterprise strategy to the technical solutions. As long as this remains the case, architects will continue to be focused on operational issues, reacting to business demands instead of working with the business to jointly architect the strategy. Enterprise architecture has started to change into something being called “Business Architecture,” where an EA looks at both sides of the fence at the same time (and in fact doesn’t see it as two sides) and asks what we all have to do together to make the organization successful—whether it’s operational or strategic.

To put it slightly more bluntly, the traditional IT model is when the business says ‘we need this,’ and IT builds and delivers it. That mindset has to change. IT is part of the business, and it has to be embedded in those frontline customer-facing parts of the business, not just be a technical service provider that just does whatever it’s told. To be honest, we’re in a situation now where the new technology that’s emerging is not really understood. If IT is buried in the basement somewhere, it’s going to be more difficult to make that technology work for the company. They really need to be on the frontline. What that means is that IT people have to become more business-like and more strategic.

How can technologists, customers and business work together to help solve their mutual problems?

DW: This is an interesting question, and it’s something we get asked all the time. We deal a lot with those companies being challenged with that. A lot of it comes down to culture—it comes down to understanding the difference between how a business will look at prod ops and how IT still looks at projects, for example. This is why Dave says that DevOps is a start but it needs to go further. We’re constantly talking about how to start applying the techniques that people use for product development to IT, technology and digital solutions as well. Design thinking, doing ethnographic work up front, getting actual feedback from customers, A/B testing—you create those strong testing and feedback mechanisms, what works, what doesn’t work, and you don’t just assume that everything’s understood and you can just write a system that does everything. What we see now is those techniques—DevOps, Agile, customer experience mapping, personas—all starting to come together and really creating that overall structure of how you understand the customer, how you understand employees and how you start delivering those solutions that actually give the right outcome and right experience to achieve what they want.

Is there a role for standards in all of this and what would that be?

DW: Very much so. One of the points we want to make is that now, when you effectively have a digitally connected ecosystem and businesses form parts of that ecosystem, all the services that are consumed are not under your control. In the old days of IT, you’d buy the hardware, you’d buy the software licenses, you’d build it and put it in a building and that would be your interaction, even in the old web days, with your customers. Now your customers link together with services or other businesses electronically. So in terms of the levels of connection, trust and understanding, that has now become very important in terms of the technical communications standards, but equally the skills and how you approach that from a business standpoint. Looking at what IT4IT does, for example, is important because you need ways to talk about how the organizations should be constructed, what competencies you need and how they’re put together. Without some form of structure, you just get chaos. The idea of standards from my point of view is to take that chaos and give some sense of order to what’s going on.

DC: I agree with David. I would say also that we’re still going to see the importance of best practices as well as standards. To put it bluntly:  Standards are established and agreed ways of doing something.  But much of the technology emerging today is testing the relevance of standards.  Best practices (not the best name, they should be called Tested Practices or Good Practices) are those emerging practices that have been shown to work somewhere in the industry. What may be an appropriate standard for what you did five years ago may not be appropriate for what’s going to emerge next year. There’s always going to be this tension between the established standard, what we know to be true, and the emerging standard or best practice—the things that are working that aren’t necessarily in the standard or are beyond where it is today.

I think the industry has to become a little better at understanding the differences between standards and best practices and using them appropriately. I think what we’ve also seen is a lack of investment in best practices. We’re seeing a lot of people in the industry coming up with suggested best practices and frameworks. But it’s been a while since we’ve seen a truly independent best practice. IT4IT is a really good starting point for some new best practices to emerge. But just like any proposed practice, it will have its limitations. Instead of following it blindly, we should keep monitoring it to figure out what those limitations are and how to overcome them.

Standards will continue to be really important to keep the Wild West at bay, but at the same time you’ve got to be pushing things forward and best practices (sponsored by independent organizations) are a good way to do that.

@theopengroup #ogSFO

David Wheable, Vice President and Principal Consultant, Forrester Research Inc.
David provides research-based consulting services to BT Professionals, helping them leverage Forrester’s proprietary research and expertise to meet the ever-changing needs and expectations of their stakeholders.
David provides research-based consulting services to BT Professionals, helping them leverage Forrester’s proprietary research and expertise to meet the ever-changing needs and expectations of their stakeholders.

David specializes in helping clients create effective and efficient strategies for their IT Service Management challenges including integrating cloud services, bring your own device (BYOD), and mobility.

Prior to joining Forrester, David worked at HP, where he served as the professional services innovation lead for the software and professional services organization, as worldwide solution lead, and as a consulting manager.

David Cannon, Vice President and Group Director, Forrester Research Inc.
David serves Infrastructure & Operations Professionals. He is a leader in the fields of IT and service strategy and has led consulting practices for BMC Software and Hewlett-Packard. He is the coauthor of the ITIL 2007 service operation book and author of the ITIL 2011 service strategy book. He is also a founder and past chairman of both itSMF South Africa and itSMF International and a past president of itSMF USA.
David serves Infrastructure & Operations Professionals. He is a leader in the fields of IT and service strategy and has led consulting practices for BMC Software and Hewlett-Packard. He is the coauthor of the ITIL 2007 service operation book and author of the ITIL 2011 service strategy book. He is also a founder and past chairman of both itSMF South Africa and itSMF International and a past president of itSMF USA.

Prior to joining Forrester, David led the IT service management (ITSM) practice of BMC Software Global Services and led the ITSM consulting practice at Hewlett-Packard. He has educated and consulted within a broad range of organizations in the private and public sectors over the past 20 years. He has consulted in virtually every area of IT management, but he specializes in the integration of business and technology management.

David has degrees in industrial sociology and psychology from the University of South Africa and holds the ITIL Expert certificate. He is also a fellow of service management and double recipient of the itSMF Lifetime Achievement Award.

 



To Colonize Mars, Look to Standards Development

By The Open Group

In advance of The Open Group San Francisco 2017, we spoke with Keegan Kirkpatrick, one of the co-founders of RedWorks, a “NewSpace” start-up focused on building 3D-printable habitats for use on Earth and in space. Kirkpatrick will be speaking during the Open Platform 3.0™/Internet of Things (IoT) session on February 1.

Keegan Kirkpatrick believes that if we are to someday realize the dream of colonizing Mars, Enterprise Architects will play a critical role in getting us there.

Kirkpatrick defines the contemporary NewSpace industry as a group of companies that are looking to create near-term solutions that can be used on Earth, derived from solutions created for long-term use in space. With more private companies getting into the space game than ever before, Kirkpatrick believes the means to create habitable environments on the moon or on other planets isn’t nearly as far away as we might think.

“The space economy has always been 20 years away from where you’re standing now,” he says.

But with new entrepreneurs and space ventures following the lead of Elon Musk’s SpaceX, the space industry is starting to heat up, branching out beyond traditional aerospace and defense players like NASA, Boeing or Lockheed Martin.

“Now it’s more like five to ten years away,” Kirkpatrick says.

Kirkpatrick, who has a background in aerospace engineering, says RedWorks was born out of NASA’s 3D Printed Habitat Challenge, a “Centennial Challenge” where people from all kinds of backgrounds competed to create 3D printing/construction solutions for building and surviving on Mars.

“I was looking to get involved in the challenge. The idea of 3D printing habitats for Mars was fascinating to me. How do we solve the mass problem? How do we allow people to be self-sufficient on Mars once they get there?” he says.

Kirkpatrick says the company came together when he found a small 3D printing company in Lancaster, Calif., close to where he lives, and went to visit them. “About 20 minutes later, RedWorks was born,” he says. The company currently consists of Kirkpatrick, a 3D printing expert, and a geologist, along with student volunteers and a small team of engineers and technicians.

Like other NewSpace companies, RedWorks is focusing on terrestrial solutions first, both to create immediate value for what they’re doing and to help raise capital. As such, the company is looking to design and build homes by 3D printing low-cost materials that can be used in places that have a need for low-cost housing. The company is talking with real estate developers and urban planners and looking to areas where affordable housing might be built entirely on site using their Mars-derived solutions.

“Terrestrial first is where the industry is going,” Kirkpatrick says. “You’ll see more players showing up in the next few years trying to capitalize on Earth-based challenges with space-based solutions.”

RedWorks plans to use parametric architecture models and parametric planning (design processes based on algorithmic thinking in which the relationship between elements is used to inform the design of complex structures) to create software for planning the printable communities and buildings. In the short-term, Kirkpatrick believes 3D printing can be used to create smart-city living solutions. The goal is to be able to combine 3D printing and embedded software so that people can design solutions specific to the environments where they’ll be used. (Hence the need for a geologist on their team.) Then they can build everything they need on site.

“For Mars, to make it a place that you can colonize, not just explore, you need to create the tools that people with not much of an engineering or space architecture background can use to set up a colony wherever they happen to land,” Kirkpatrick says. “The idea is if you have X number of people and you need to make a colony Y big, then the habitat design will scale everything with necessary utilities and living spaces entirely on-site. Then you can make use of the tools that you bring with you to print out a complete structure.”
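A parametric plan of that kind can be sketched very simply (every ratio below is an invented placeholder, not a RedWorks figure): one driving input, the crew size, and the relationships between elements derive the rest of the layout.

    def habitat_plan(crew: int):
        """Scale a colony layout from one driving parameter: the number of people."""
        living_m2_per_person = 25        # invented placeholder ratios
        greenhouse_m2_per_person = 40
        water_l_per_person_day = 30
        printer_m2_per_day = 6           # assumed output of one silica printer

        living = crew * living_m2_per_person
        greenhouse = crew * greenhouse_m2_per_person
        total_area = (living + greenhouse) * 1.15   # 15% extra for corridors/utilities
        return {
            "living_area_m2": living,
            "greenhouse_m2": greenhouse,
            "total_printed_m2": round(total_area),
            "water_storage_l": crew * water_l_per_person_day * 30,   # 30-day buffer
            "print_days": round(total_area / printer_m2_per_day),
        }

    print(habitat_plan(crew=8))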

Kirkpatrick says the objective is to be able to use materials native to each environment in order to create and print the structures. Because dirt and sand on Earth are fundamentally similar to the type of silicate materials found on the Moon and Mars, RedWorks is looking to develop a general-purpose silica printer that can be used to build 3D structures. That’s why they’re looking first to develop structures in desert climates, such as southern California, North Africa and the Middle East.

A role for architecture and standards

As the private NewSpace industry begins to take off, Kirkpatrick believes there will be a strong need for standards to guide the nascent industry, and for Enterprise Architects to help navigate the complexities that will come with designing the technology that will enable it.

“Standards are necessary for collaborating and managing how fast this will take off,” he says.

Kirkpatrick also believes that developing open standards for the new space industry will better help NewSpace companies figure out how they can work together. Although he says many NewSpace start-ups already have an interest in collaborating, with much of their work in the very early stages, they do not necessarily have much incentive to work together as of yet. However, he says, “everyone realizes that collaboration will be critical for the long-term development of the industry.” Beginning to work toward standards development with an organization such as The Open Group now will help incentivize the NewSpace community to work together, and thus push the industry along even faster, Kirkpatrick says.

“Everyone’s trying to help each other as much as they can right now, but there’s not a lot of mechanisms in place to do so,” he says.

According to Kirkpatrick, it’s important to begin thinking about standards for space-related technology solutions before the industry reaches an inflection point and begins to take off quickly. Kirkpatrick expects that inflection point will occur once a launch provider like SpaceX is able to do full return landings of its rockets so that they are ready for reuse. He expects that launch costs will begin to fall rapidly over the next five to ten years once launch providers can offer reliable reusable launch services, spurring the industry forward.

“Once you see launch costs fall by a factor of 10 or 100, the business side of the industry is going to grow like a weed. We need the infrastructure in place for everyone to work together and enable this incredible opportunity we have in space. There’s a very bright horizon ahead of us that’s just a little hard for everyone to see right now. But it’s coming faster than anyone realizes.”

@theopengroup #ogSFO

Keegan Kirkpatrick is the Team Lead and founder of RedWorks, a NewSpace startup in Lancaster, California. He has an undergraduate degree in Aerospace Engineering from Embry-Riddle Aeronautical University, and before turning entrepreneur worked as an engineer at Masten Space Systems at the Mojave Air and Space Port.

In 2015, Keegan founded RedWorks with Paul Petros, Susan Jennings, and Lino Stavole to compete in and make it to the finals of the NASA Centennial 3D Printed Habitat Challenge. Keegan’s team is creating ways to 3D-print habitats from on-site materials, laying the groundwork for human settlement of the solar system.


Filed under digital technologies, Enterprise Architecture (EA), Future Technologies, Internet of Things, IoT, Open Platform 3.0, Standards, The Open Group, The Open Group San Francisco 2017, Uncategorized

Gaining Executive Buy-In for IT4IT™: A Conversation with Mark Bodman

By The Open Group

With many organizations undergoing digital transformation, IT departments everywhere are taking serious hits. And although technology is at the heart of many business transformations, IT has traditionally had a reputation as a cost center rather than an innovation center.

As such, executives are often skeptical when presented with yet another new IT plan or architecture for their organizations that will be better than the last. Due to the role Enterprise Architects play in bridging the gap between the business and IT, it’s often incumbent on them to make the case for big changes when needed.

Mark Bodman, Senior Product Manager at ServiceNow and formerly at HPE, has been working with and presenting the IT4IT standard, an Open Group standard, to executives for a number of years. At The Open Group San Francisco 2017 event on January 30, Bodman will offer advice on how to present IT4IT in order to gain executive buy-in. We spoke with him in advance of the conference to get a sneak peek before his session.

What are Enterprise Architects up against these days when dealing with executives and trying to promote IT-related initiatives?

The one big change that I’ve seen is the commoditization of IT. With the cloud-based economy and the ability to rent cheap compute, storage and networking, being able to effectively leverage commodity IT is a key differentiator that will make or break an organization. At the end of the day, the people who can exploit cheaper technology to do unique things faster are the companies that will come out ahead long-term. Companies based on legacy technologies that don’t evolve will stall out and die.

Uber and Netflix are great case studies for this trend. It’s happening every day around us, and it’s reaching a tipping point. Enterprise Architects are faced with communicating these scenarios within their own organizations: use cases like going digital, streamlining for costs, sourcing more in the cloud, all strategies required to move the needle. Enterprise Architects are the most senior technical people within IT. They bridge the gap between business and technology at the highest level, and have to figure out ‘How do I communicate and plan for these disruptions here so that we can survive in the digital era?’

It’s a Herculean task, not an easy thing to do. I’ve found there are varying degrees of success for Enterprise Architects. Sometimes, through no fault of their own, because they are dealing with politics, they can’t move the right agenda forward. Or the EA may be dealing with a Board that just wants to see financial results the next quarter, and doesn’t care about the long-term transformations. These are the massive challenges that Enterprise Architects deal with every day.

Why is it important to properly present a framework like IT4IT to executives right now?

It’s as important as the changes to accounting rules that organizations had to absorb after Enron and the other big financial failures within recent memory; those new rules and regulations were quite impactful. When an IT shop is implementing services and running the IT organization as a whole, what is the operating model it uses? Why is one IT shop so different from another when we’re all facing similar challenges, using similar resources? I think it’s critically important to have a vetted industry standard to answer these questions.

Throughout my career, I’ve seen many different models for running IT from many different sources. From technology companies like HPE and IBM, to consulting companies like Deloitte, Accenture and Bain; each has its own way of doing things. I refer to this as the ‘IT flavor of the month.’ One framework is chosen over another depending on what leadership decides for their playbook: they get tired of one model, or a new leader imposes the model they are familiar with, so they adopt a new model and change the entire IT operating model, which is quite disruptive.

The IT4IT standard takes that whole answer to ‘how to run IT as a business’ out of the hands of any one source. That’s why a diverse set of contributors is important, like PwC and Accenture; they both have consulting practices for running IT shops. Seeing them contribute to an open standard that aggregates this know-how allows IT to evolve faster. When large IT vendors like ServiceNow, IBM, Microsoft and HPE are all participating and agreeing upon the model, we can start creating solutions that are compatible with one another. The reason we have Wi-Fi in every single corner of the planet, or cellular service that you can use from any phone, is because we standardized. We need to take a similar approach to running IT shops: renting commoditized services, plugging them in, and managing them with standard software. You can’t do that unless you agree on the fundamentals, and the IT4IT standard provides much of this guidance.

When Enterprise Architects are thinking about presenting a framework like IT4IT, what considerations should they make as they’re preparing to present it to executives?

I like to use the word ‘contextualize,’ and the way I view the challenge is that if I contextualize our current operating model against IT4IT, how are we the same or different? What you’ll mostly find is that IT shops are somewhat aligned. A lot of the work that I’ve done with the standard over the past three years is to create material that shows IT4IT in multiple contexts. The one that I prefer to start with for an executive audience is showing how the de-facto plan-build-run IT organizational model, which is how most IT shops are structured, maps to the IT4IT structure. Once you make that correlation, it’s a lot easier to understand how IT4IT then fits across your particular organization, filling some glaring gaps in plan-build-run.
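
As a rough illustration of the correlation Bodman describes, the sketch below maps the de-facto plan-build-run functions onto the four IT4IT value streams (Strategy to Portfolio, Requirement to Deploy, Request to Fulfill, Detect to Correct). The specific assignments are an assumption made here for illustration, not a normative mapping taken from the standard.

```python
# Illustrative only: one possible correlation of plan-build-run to the four
# IT4IT value streams. The assignments are an assumption, not taken from the standard.
PLAN_BUILD_RUN_TO_IT4IT = {
    "plan":  ["Strategy to Portfolio (S2P)"],
    "build": ["Requirement to Deploy (R2D)"],
    "run":   ["Request to Fulfill (R2F)", "Detect to Correct (D2C)"],
}


def value_streams_for(function: str) -> list:
    """Look up which IT4IT value streams a legacy plan-build-run function maps to."""
    return PLAN_BUILD_RUN_TO_IT4IT.get(function.lower(), [])


for f in ("plan", "build", "run"):
    print(f, "->", ", ".join(value_streams_for(f)))
```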

Recently I’ve created a video blog series on YouTube called IT4IT Insights to share these contextual views. I’ve posted two videos so far, and plan to post a new video per month. I have posted one video on how Gartner’s Bi-Modal concept maps to IT4IT concepts, and another on the disruptive value that the Request to Fulfill value stream provides IT shops.

Why have executives been dismissive of frameworks like this in the past and how can that be combatted with a new approach such as IT4IT?

IT4IT is different from anything I have seen before. I think it’s the first time we have seen a comprehensive business-oriented framework created for IT as an open standard. There are some IT frameworks specific to vertical industries out there, but IT4IT is really generic and addresses everything that any CIO would worry about on a daily basis. Of course they don’t teach CIOs IT4IT in school yet; it’s brand new. Many IT execs come from consulting firms where they have grown very familiar with a particular IT operating model, or they were promoted through the years, establishing their own unique playbook along the way. When a new standard framework like IT4IT comes along and an Enterprise Architect shows them how different it might be from what the executive currently knows, it’s very disruptive. IT executives got to that position through growth and experience using what works; adopting something new like IT4IT is a tough pill to swallow.

To overcome this problem it’s important to contextualize the IT4IT concepts. I’m finding many of the large consulting organizations are just now starting to learn IT4IT; some are ahead of others. The danger is that IT4IT takes some of that unique IP away, and that’s a little risky for them, but I think it’s an advantage if they get on the bandwagon first and can contextually map what they do now against IT4IT. One other thing that’s important is that since IT4IT is an open standard, organizations may contribute intellectual property to the standard and be recognized as the key contributor for that content. You see some of this already with Accenture’s and PwC’s contributions. At the same time, each consulting organization will hold some of its IP back to differentiate itself where applicable. That’s why I think it’s important for people presenting IT4IT to contextualize it to their particular organization and practice. If they don’t, it’s just going to be a much harder discussion.

Like with any new concept—eventually you find the first few who will get it, then latch on to it to become the ‘IT4IT champion.’ It’s very important to have at least one IT4IT champion to really evangelize the IT4IT standard and drive adoption.  That champion might not be in an executive position able to change things in their organization, but it’s an important job to educate and evangelize a better way of managing IT.

What lessons have you learned in presenting IT4IT to executives? Can you offer some tips and tricks for gaining mindshare?

I have many that I’ll talk about in January, but one thing that seems to work well is that I take a few IT4IT books into an executive briefing, usually the printed standard and the pocket guide. I’ll pass them around the room while I present the IT4IT standard. (I’m usually presenting the IT4IT standard as part of a broader executive briefing agenda.) I usually find that the books get stuck with someone in the room who has cracked open the book and recognized something of value. They will usually want to keep the book after that, and at that point I know who my champion is. I then gauge how passionate they are by making them twist my arm to keep the book. This usually works well to generate discussion of what they found valuable, in the context of their own IT organization and in front of the other executives in the room. I recently used this trick when presenting to the CIO of a major insurance company. I passed the books around during my presentation and found them back in front of me. I was thinking that was it, no takers. But the CIO decided to ask for them back once I concluded the IT4IT presentation. The CIO was my new champion and everyone in the room knew it.

What about measurement and results? Is there enough evidence out there yet on the standard and the difference it’s making in IT departments to bring measurement into your argument to get buy-in from executives?

I will present some use cases that have some crystal-clear results, though I can’t communicate financials. The more tangible measurements are around the use cases where we leveraged the IT4IT standard to rationalize the current IT organization and tools to identify redundancies. One of the things I learned 10 years ago, well before the IT4IT standard was around, was how to rationalize applications for an entire organization that have gotten out of hand from a rash of M&A activity. Think about the redundancies created when two businesses merge. You’re usually merging because of a product or market that you are after; there’s some business need driving that acquisition. But all the common functions, like HR and finance, are redundant. This includes the IT technologies and applications used to manage IT, too. You don’t need two HR systems, or two IT helpdesk systems; you’ve got to consolidate down to a reasonable number of applications to do the work. I have tackled IT rationalization by using the IT4IT standard, going through an evaluation process to identify redundancies per functional component. In some cases we have found more than 300 tools that perform the same IT function, like monitoring. You shouldn’t need 300 different monitoring tools: that’s ridiculous. This is just one clear use case where we’ve applied IT4IT to identify similar tools and processes that exist within IT specifically, a very compelling business case to eliminate massive redundancy.
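
A minimal sketch of that rationalization exercise is below: group each tool in an inventory by the IT functional component it serves, then flag components with redundant tooling. The inventory and threshold here are invented assumptions for illustration only.

```python
# Sketch of the rationalization exercise Bodman describes: map every tool to an
# IT functional component and flag components with redundant tooling.
# The inventory below is invented for illustration.
from collections import defaultdict

tool_inventory = [
    ("Nagios", "Monitoring"),
    ("Zabbix", "Monitoring"),
    ("Prometheus", "Monitoring"),
    ("ServiceNow ITSM", "Incident"),
    ("Jira Service Management", "Incident"),
]


def find_redundancies(inventory, threshold=1):
    """Group tools by functional component and report components with more
    tools than the threshold; these are the candidates for consolidation."""
    by_component = defaultdict(list)
    for tool, component in inventory:
        by_component[component].append(tool)
    return {c: tools for c, tools in by_component.items() if len(tools) > threshold}


print(find_redundancies(tool_inventory))
# e.g. {'Monitoring': ['Nagios', 'Zabbix', 'Prometheus'], 'Incident': [...]}
```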

Does the role of standards also help in being able to make a case for IT4IT with executives? Does that lend credence to what you’re proposing and do standards matter to them?

They do, in a way. It’s like accounting rules: using non-standard accounting rules today might land your executives in jail. A non-standard IT shop won’t land you in jail, but being non-standard will increase the cost of everything you do and increase risk, because you’re going against the grain on something that should be a commodity. At the executive level, you need to contextualize the problem of being non-standard and show them how adopting the IT4IT standard may be similar to the accounting rule standardization.

Another benefit of standards I point to is that the standard is open, the result of vetting good ideas from many different organizations rather than making it up as you go. The man-years of experience that went into the standard, and the elegance of the result, become a compelling argument for adoption that shouldn’t be overlooked.

What else should EAs take into consideration when presenting something like IT4IT to executives?

I think the primary thing to remember is to contextualize your conversation to your executives and organization. Some executives in IT may have zero technology background; some may have come up through the ranks and still know how to program, so you’ve got to tell the story based on the audience and tailor it. I presented recently to 50 CIOs in Washington D.C., so I had to contextualize the standard to show how IT4IT relates to the major changes happening in the federal market, such as the Federal Information Technology Acquisition Reform Act (FITARA), and how it supports the Federal Enterprise Architecture framework. These unique requirement changes had to be contextualized against the IT4IT standard so the audience understood exactly how IT4IT relates to the big challenges, unique to their market, that they are dealing with.

Any last comments?

The next phase of the IT4IT standard is just taking off.  The initial group of people who were certified are now using IT4IT for training and to certify the next wave of adopters. We’re at a point now where the growth is going to take off exponentially. It takes a little time to get comfortable with something new and I’m seeing this happen more quickly in every new engagement. Enterprise Architects need to know that there’s a wealth of material out there, and folks who have been working with the IT4IT standard for a long time. There’s something new being published almost every day now.

It can sometimes take a while from first contact to reaching critical-mass adoption, but it’s happening. In my short three weeks at ServiceNow so far I have already had two customer conversations on IT4IT; it’s clearly relevant here too, and I have been able to show relevance to every other IT shop and vendor in the last three years. This new IT4IT paradigm does need to soak in a bit, so don’t get frustrated about the pace of adoption and understanding. One day you might come across a need and pull out the IT4IT standard to help in some way that’s not apparent right now. It’s exciting to see people who worked on the initial phases of the standard’s development now working at their next gig. It’s encouraging to see folks in their second and even their third job leveraging the IT4IT standard. This is a great indicator that the IT4IT standard is being accepted and starting to become mainstream.

@theopengroup #ogSFO

Mark Bodman is an experienced, results-oriented IT4IT™ strategist with an Enterprise Architecture background, an executive adviser, thought leader and mentor. He previously worked on cross-portfolio strategies to shape HPE’s products and services, including multi-source service brokering and IT4IT adoption. Mark has recently joined ServiceNow as the outbound Application Portfolio Management Product Manager.

Hands-on experience from years of interaction with multiple organizations has given Mark a unique foundation of experience and IT domain knowledge. Mark is well versed in industry standards such as TOGAF®, an Open Group standard, COBIT, and ITIL, has implemented portfolio management and EA practices, chaired governance boards within Dell, managed products at Troux, and helped HPE customers adopt strategic transformation planning practices using reference architectures and rationalization techniques.



Filed under Digital Transformation, Enterprise Architecture, Enterprise Transformation, IT, IT4IT, Standards, The Open Group, Uncategorized