Category Archives: Enterprise Architecture

Do One Thing and Do It Well

By The Open Group

One significant example of “fake news” in 2016 was the announcement that Dennis Ritchie, one of the original authors of the UNIX® Operating System, had passed away. In fact, he died in 2011, a week after the death of Steve Jobs. 2016 was actually the fifth anniversary of his passing, but one in which the extent of his contribution to the world was not overshadowed by others, and could be properly acknowledged.

Much of the central UNIX philosophy that he engineered alongside Bell Labs colleagues Ken Thompson and Brian Kernighan lives on to this day: building systems from a range of modular and reusable software components, so that while many UNIX programs do quite trivial things in isolation, they combine with other programs to become general and useful tools. The envisioned ability to design and build systems quickly, and to reuse tried and trusted software components, remains a cultural norm in environments that employ Agile and DevOps techniques some 45 years later.
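To make that compositional idea concrete, here is a minimal sketch in Python (the classic expression would be a shell pipeline such as `cat *.log | grep ERROR | wc -l`): each function does one small thing well, and the value comes from chaining them. The file names and search pattern are invented for illustration.

```python
import sys

# Each "tool" does one thing well: consume an iterable of lines, emit results.

def cat(paths):
    """Emit the lines of each file in turn (like cat)."""
    for path in paths:
        with open(path) as f:
            yield from f

def grep(pattern, lines):
    """Pass through only the lines containing the pattern (like grep -F)."""
    return (line for line in lines if pattern in line)

def count(lines):
    """Count the lines that reach this stage (like wc -l)."""
    return sum(1 for _ in lines)

if __name__ == "__main__":
    # Compose the stages like a pipeline: cat FILES | grep ERROR | wc -l
    print(count(grep("ERROR", cat(sys.argv[1:]))))
```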


Their foresight was such that the same tools and user-interface norms were replicated by the GNU project atop the Linux kernel. With the advent of the Internet, and with interconnect standards agreed by the IETF and more recently the W3C consortium, the same philosophy extended to very low-cost industry-standard servers. The replacement of vendor-specific buses with ever faster Ethernet and IP-based connections then allowed processors, storage and software components to be distributed in a scale-out fashion. The very nature of these industry standards means that the geography over which these system components can be distributed extends well beyond a single datacentre; in some cases, with due attention to latency and reliability concerns, they can work worldwide. The end result is that while traditional UNIX systems embody reliability and n+1 scaling, there is another approach based on the same core components that can scale out. With that, an operation as simple as a Google search can involve the participation of over 1,000 geographically dispersed CPUs and typically return results to the end user in under 200 milliseconds. However, such systems – architected on the assumption that individual devices and communication paths will fail – tend to follow a very different set of design patterns.
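The following is a hedged sketch of that scatter-gather pattern using Python’s asyncio. The shards are simulated stand-ins, and the latency, failure-rate and deadline figures are invented; what matters is the shape of the design, in which failed shards and stragglers are tolerated and partial results are merged rather than failing the whole request.

```python
import asyncio
import random

async def query_shard(shard_id: int, query: str) -> list[str]:
    """Stand-in for a network call to one index shard (hypothetical)."""
    await asyncio.sleep(random.uniform(0.01, 0.3))  # simulated latency
    if random.random() < 0.05:                      # simulated shard failure
        raise ConnectionError(f"shard {shard_id} unreachable")
    return [f"shard{shard_id}:hit{i}" for i in range(2)]

async def search(query: str, shards: int = 20, deadline: float = 0.2) -> list[str]:
    """Scatter the query to every shard, then gather whatever arrives in time."""
    tasks = [asyncio.create_task(query_shard(i, query)) for i in range(shards)]
    done, pending = await asyncio.wait(tasks, timeout=deadline)
    for task in pending:
        task.cancel()  # drop stragglers: latency matters more than completeness
    results: list[str] = []
    for task in done:
        if task.exception() is None:  # skip failed shards instead of aborting
            results.extend(task.result())
    return results

if __name__ == "__main__":
    hits = asyncio.run(search("unix philosophy"))
    print(f"merged {len(hits)} partial results")
```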

The economics of using cloud-based Linux infrastructure are often perceived as attractive, though we’re just past the “war” stage where each cloud vendor’s stack is inherently proprietary. There are some laudable efforts to abstract code so it can run on multiple cloud providers; one is FOG in the Ruby ecosystem. Another is Cloud Foundry, which is executing particularly well in enterprises with large investments in Java code. Emergent serverless platforms (event-driven, auto-scalable function-as-a-service, where the whole supporting infrastructure is abstracted away) are probably the most extreme examples of chaotic evolution – and very vendor-specific – at the time of writing.
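One widely used mitigation, sketched below on our own assumptions rather than drawn from FOG, Cloud Foundry or any particular serverless platform, is to keep the business logic provider-neutral and confine each vendor’s event and handler conventions to thin shims. The handler signatures shown are representative shapes only, not exact vendor APIs.

```python
import json

def compute_discount(order: dict) -> dict:
    """Provider-neutral business logic: kept free of any cloud SDK."""
    total = sum(item["price"] for item in order["items"])
    return {"order_id": order["id"], "discount": round(total * 0.05, 2)}

# Thin, provider-specific shims. Only these need rewriting when switching
# vendors; the signatures below are illustrative, not exact vendor APIs.

def vendor_a_handler(event, context):
    """Shape loosely modeled on a Lambda-style entry point (assumption)."""
    return {"statusCode": 200,
            "body": json.dumps(compute_discount(json.loads(event["body"])))}

def vendor_b_handler(request):
    """Shape loosely modeled on an HTTP-triggered function (assumption)."""
    return compute_discount(request.get_json())

if __name__ == "__main__":
    print(compute_discount({"id": 42, "items": [{"price": 100.0}]}))
```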

The antithesis of open platforms is this effort to make full use of unique features in each cloud vendor’s offerings – a traditional lock-in strategy intended to keep their services from becoming a price-led commodity. It is the sort of thing the UNIX community solved together many years ago by agreeing effective, vendor-independent standards, where certification engendered an assurance of compatibility and trust, allowing the industry to focus on higher-end services to delight end customers without fear of unnecessary lock-in.

Given the use of software designed to functionally mirror UNIX systems, one very valid question is: “What would it take for Linux vendors to have their distributions certified against recognized, compelling industry standards – such as UNIX 03?” This would let customers ascribe to the “scale out” sibling the same level of vendor-independent assurance and trust as is achieved by the largest Enterprise UNIX system vendors.

Given the licensing conditions on the Linux kernel and associated open source components, both Huawei and Inspur have achieved certification of their Red Hat Linux-derived operating systems, EulerOS 2.0 and Inspur K-UX 3.0. This is no mean feat, and an indication that their customers have the most Enterprise-ready Linux OS available on Intel-architecture server platforms today.

This is a level of certification that we don’t think will go unnoticed in large emerging markets of the world. That said, we’d welcome any other Linux vendor to prove their compliance to the same standard. In the interim, well done Huawei, and well done Inspur – proving it can be done.

References:

Version 3 of the Single UNIX® Specification: the UNIX 03 Product Standard:

https://www.opengroup.org/openbrand/register/xym0.htm

Huawei Technology achieves UNIX® 03 Conformance for the Huawei EulerOS 2.0 Operating System:

https://www.opengroup.org/openbrand/register/brand3622.htm

Inspur achieves UNIX® 03 Conformance for Inspur K-UX 3.0:

https://www.opengroup.org/openbrand/register/brand3617.htm

UNIX® Philosophy: https://en.wikipedia.org/wiki/Unix_philosophy

http://www.opengroup.org/unix

@theopengroup

 


The Open Group San Francisco Day Two Highlights

By The Open Group

Day two of The Open Group San Francisco event was held Tuesday, January 31, on another sunny winter day in San Francisco. Tuesday’s welcome address featured Steve Nunn, President & CEO, and Jim Hietala, VP Business Development and Security, both of The Open Group, greeting attendees for a morning of sessions centered on the theme of Making Standards Work®. Nunn kicked off the morning by reporting that the first day of the conference had been very well received, with copious positive feedback on Monday’s speakers.

It was also announced that the first certification courses for ArchiMate® 3.0, an Open Group standard, kicked off at the conference. In addition, the San Francisco event marked the launch of The Open Group Open Process Automation™ Forum, a Forum of The Open Group, which will address standards development for open, secure, interoperable process control architectures. The Forum will include end users, suppliers, systems integrators, integrated DCS vendors, standards organizations and academics from a variety of industries, including food and beverage, oil and gas, pulp and paper, petrochemical, pharmaceuticals, metals and mining, and utilities. Hietala joined Nunn on stage to discuss the launch of the Forum, which came out of a vision from ExxonMobil. The Forum has already grown rapidly, with almost 100 members. Forum Members are also attending and holding events at the annual ARC Advisory Group Industry Forum in Orlando.

The morning plenary began with Dennis Stevens from Lockheed Martin discussing “The Influence of Open Architecture Standards on the Emergence of Advanced Process Control Systems.” Stevens, who is involved in The Open Group FACE™ Consortium, will also be leading the Open Process Automation Forum. Stevens opened by saying that this is a particularly exciting time in industrial automation due to the intersection of standards, technology and automation. According to Stevens, the work that has been done in the FACE Forum over the past few years has paved the way for what also needs to be done in process automation.

Stevens noted that many of the industrial systems in use today will be facing obsolescence in the next few years for a variety of reasons, including a proliferation of proprietary and closed systems, a lack of sophisticated development tools and the high cost of technology refreshes. Tech trends such as the Internet of Things, cybersecurity, open source and virtualization are also forcing industrial manufacturers to change. In addition, the growth of complexity in software systems and the changeover from hardware-dominant to software-dominant systems are also compelling factors for automation change. However, Stevens says, by reusing existing standards and creating new ones, there are many opportunities for cost savings and reduced complexity.

According to Stevens, the goal is to standardize the interfaces that companies can use so there is interoperability across systems built atop a common framework. By standardizing the interface only, organizations can still differentiate themselves by bringing their own business processes and designs to those systems via hardware or software components. In addition, by bringing elements from the FACE standardization model to Open Process Automation, the new forum can also take advantage of proven processes that already take into account regulations around co-opetition and anti-trust. Stevens believes that Open Process Automation will ultimately enable new markets and suppliers for process automation as well as lower the cost of doing business in industrial automation.
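The “standardize the interface, differentiate the implementation” idea is easy to picture in code. The sketch below is purely illustrative: the interface and class names are invented, not drawn from the FACE or Open Process Automation specifications.

```python
from abc import ABC, abstractmethod

class TemperatureSensor(ABC):
    """A hypothetical standardized interface: every conforming supplier
    implements the same calls, however its hardware works internally."""

    @abstractmethod
    def read_celsius(self) -> float: ...

    @abstractmethod
    def self_test(self) -> bool: ...

class VendorASensor(TemperatureSensor):
    """Vendor A differentiates on accuracy and diagnostics inside the
    implementation while keeping the interoperable surface identical."""

    def read_celsius(self) -> float:
        return 21.5  # a real driver would talk to proprietary hardware here

    def self_test(self) -> bool:
        return True

def log_temperatures(sensors: list[TemperatureSensor]) -> None:
    # Control logic is written once against the standard interface, so
    # sensors from different suppliers are interchangeable.
    for sensor in sensors:
        if sensor.self_test():
            print(sensor.read_celsius())

if __name__ == "__main__":
    log_temperatures([VendorASensor()])
```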

Following the morning break, Chair of the Department of Economics at San Jose State University Dr. Lydia Ortega took the stage for the second morning session, entitled “Innovative Communities.” Ortega took a refreshing look at what The Open Group does and how it works by applying economic theory to illustrate how the organization is an “innovative community.” Ortega began by providing what she called an “economist’s definition” of open standards, which she defined as a collection of dispersed knowledge that is a building block for innovation and is continually evolving. She also described open standards as a “public good,” because they are knowledge-based, non-rivalrous, non-excludable and produced once and available to others at marginal cost. Teamwork, consensus and community also characterize what makes the organization work. Ortega plans to continue her research into what makes The Open Group work by examining competing standards bodies and the organization’s origins, among other things.

Prior to introducing the next session, Steve Nunn presented an award to Steve Whitlock, a long-time Open Group member who recently retired from Boeing, for more than 20 years of leadership, contributions and service to The Open Group. Colleagues provided additional praise for Whitlock and his willingness to lead activities on behalf of The Open Group and its members, particularly in the area of security.

The morning’s third session featured Mike Jerbic, Principal Consultant for Trusted System Consulting Group, highlighting how the “Norwegian Regional Healthcare Project & Open FAIR” have been used to analyze the cost benefits of a home treatment program for dialysis patients in Norway. Currently, due to health and privacy regulations and security requirements, patients who receive home dialysis must physically transport data regarding their treatments to hospitals, which affects the quality of patients’ lives but protects the state from security issues related to transporting data online. Jerbic and a group of economics students at San Jose State University in California performed an economic analysis to examine the costs vs. benefits of the program. Using The Open Group Open FAIR™ body of knowledge to analyze the potential threats to both patient privacy and information security, the group found it would make sense to pose the program risks as an engineering problem to be solved. However, additional research is needed to weigh the potential cost savings to the state against the benefits to patients’ quality of life.

Concluding Tuesday’s plenary sessions was a panel entitled “Open FAIR in Practice,” which extended the conversation regarding the Norwegian healthcare project by taking questions from the audience about the program. Jerbic moderated the panel, which included Ortega; Eva Kuiper, ESS GRC Security Consultant, HPE; John Linford, Lecturer, Department of Economics, San Jose State University; and Sushmitha Kasturi, Undergraduate Researcher, San Jose State University.

Jerbic also announced that a number of students from San Jose State, many of whom were in attendance, have recently either completed or begun their certification in Open FAIR.  He also talked about an Academic Program within The Open Group that is working with students on projects that are mutually beneficial, allowing The Open Group to get help with the work needed to create standards, while providing important practical work experience for students.


San Jose State University Students

Following the plenary, Tuesday’s lunchtime partner presentation featured Sean Cleary, Senior Consultant, Orbus Software, presenting on “Architecture Roadmap Visualization with ArchiMate® 3.0.”

Afternoon sessions were split into two tracks, Cognitive Computing and EA in Practice.

  • EA in Practice – Hosted by Len Fehskens of the Association of Enterprise Architects, two sessions looked at maxims and folktales for architects, presented by Fehskens, and how to enable governance and management with continuous audits, with Robert Weisman, CEO/COO of Build the Vision.
  • Cognitive Computing – Chris Harding from The Open Group served as host for four sessions in the track:
    • Ali Arsanjani, CTO for Analytics and Emerging Technologies, IBM – Arsanjani provided an overview of different ways that data can be structured for cognitive computing applications. According to Arsanjani, cognitive systems are meant to augment, not replace, human systems and to be of service to us. By combining human interaction and curation with automated data analysis and machine learning, companies will be able to gain greater business advantages. However, we must also always be aware of the implications of using artificial systems and the potential consequences of doing so, he said.
    • Jitendra Maan, Enterprise Architect and Center of Excellence Lead, Tata Consultancy Services – Maan says cognitive computing signals a shift in how machines interact with humans, other machines and the environment, with potential for new categories of business outcomes and disruption. The design of automated systems is critical to how cognitive systems are expected to evolve but unlike traditional computing, cognitive will rely on a combination of natural language processing, machine learning and data. Potential business applications already in progress include service support centers, contract management, risk assessment, intelligent chat bots and conversation work flows. Maan predicts bots will actually replace many service functions in the next few years.
    • Swaminathan Chandrasekaran, Industry Apps & Solutions, IBM Watson – Chandrasekaran’s talk took a deeper dive into cognitive computing and the make-up of cognitive systems. Understanding, reasoning, learning and interaction are key to teaching cognitive systems how to work, he said. Cognitive systems are also broadly categorized around language, speech, vision and data & insights, much like the human brain. Patterns can generally be created from cognitive conversations, discovery and application extensions. Chandrasekaran also shared how to model a reference architecture for a cognitive conversation pattern.
    • The Cognitive Computing panel, moderated by Harding, included afternoon speakers Arsanjani, Maan and Chandrasekaran. The panel discussed how businesses can gain advantage from cognitive computing, learned personalization and contextualization via systems training, the time it takes to train a system (now days or weeks vs. months or years), making systems more intelligent over time, and the need to aggregate and curate data from the beginning of a project, introduce domain-relevant data, and maintain good data curation throughout.

The day concluded with a social event and dinner for attendees held at the Autodesk Gallery, a San Francisco destination that marries creativity, design and engineering in more than 20 exhibits sponsored by companies such as Lego and Mercedes Benz.


Networking at the Autodesk Gallery

The following day, the event offered track sessions in areas including Internet of Things (IoT) and Architecture. The Open Group San Francisco drew to a close with Members Only Meetings on February 2.

@theopengroup #ogSFO

We are looking forward to seeing you at The Open Group Berlin April 24-27, 2017! #ogBER

 


The Open Group San Francisco Day One Highlights

By The Open Group

The Open Group kicked off its first event of 2017 on a sunny Monday morning, January 30, in the City by the Bay, with over 200 attendees from 20 countries including Australia, Finland, Germany and Singapore.

The Open Group CEO and President Steve Nunn began the day’s proceedings with a warm welcome and the announcement of the latest version of the Open Trusted Technology Provider™ Standard (O-TTPS), a standard that specifies best practices for providers to help them mitigate the risk of tainted or counterfeit products or parts getting into the IT supply chain. A new certification program for the standard was also announced, as well as the news that the standard has recently been ratified by ISO. Nunn also announced the availability of the next version of The Open Group IT4IT™ standard, version 2.1.

Monday’s plenary focused on IT4IT and Managing the Business of IT. Bernard Golden, CEO of Navica, spoke on the topic, “Cloud Computing and Business Expectations: How the Cloud Changes Everything.” Golden, who was named as one of the 10 most influential people in cloud computing by Wired magazine, began with a brief overview of the state of the computing industry today, which is largely characterized by the enormous growth of cloud computing. Golden believes that the public cloud will be the future of IT moving forward. With the speed that the cloud enables today, IT and app development have become both the bottleneck and differentiator for IT departments. To address these bottlenecks, IT must take a multi-pronged, continuous approach that uses a combination of cloud, Agile and DevOps to address business drivers. The challenge for IT shops today, Golden says, is also to decide where to focus and what cloud services they need to build applications. To help determine what works, IT must ask whether services are above or below what he calls “the value line,” which delineates whether the services available, which are often open-source, will ultimately advance the company’s goals or not, despite being low cost. IT must also be aware of the fact that the value line can present a lock-in challenge, creating tension between the availability of affordable—but potentially buggy—open-source tools and services and the ongoing value the business needs. Ultimately, Golden says, the cloud has changed everything—and IT must be willing to change with it and weigh the trade-offs between openness and potential lock-in.

Forrester Research analysts David Wheable, Vice President and Principal Consultant, and David Cannon, Vice President and Group Director, took the stage following Golden’s session to discuss “The Changing Role of IT: Strategy in the Age of the Customer.” Wheable spoke first, noting that technology has enabled a new “age of the customer,” an era where customers now have the majority of the power in the business/customer relationship.  As such, companies must now adapt to how their customers want to interact with their businesses and how customers use a company’s business applications (particularly via mobile devices) in order to survive and prevent customers from constantly changing their loyalties. Because IT strategists will not be able to predict how customers will use their applications, they must be able to put themselves in a position where they can quickly adapt to what is happening.

Cannon discussed what IT departments need to consider when it comes to strategy. To develop a viable IT strategy today, companies must consider what is valuable to the customer and how they will choose the technologies and applications that provide customers what they need. In the current IT landscape, features and quality no longer matter—instead, IT must take into account customers’ emotions, desires and immediate needs. Continuous exploitation of digital assets to deliver customer outcomes will be critical for both digital and business strategies—which Cannon argues are now essentially the same thing—moving forward. To survive in this new era, IT departments must also be able to enable customer outcomes, measure the customer experience, manage a portfolio of services, showcase business—not just technical—expertise and continue to enable service architectures that will deliver what customers need and want.

After the morning coffee break, Author and Researcher Gene Kim followed to discuss his recent book, The DevOps Handbook. His session, entitled “The Rise of Architecture: Top Lessons Learned while Researching and Writing The DevOps Handbook,” explored the example of high performers in the tech sector and how the emergence of DevOps has influenced them. According to Kim, most IT departments are subject to a downward spiral over time due to the exponential growth of technical assets and debt, which ultimately weigh them down and affect performance. In contrast, according to Kim’s research, high-performing organizations have been able to avoid this spiral by using DevOps. Organizations utilizing DevOps are nearly three times more agile than their peers, are more reliable and two times more likely to exceed profitability, market share and productivity goals in the marketplace. The ability to deploy small changes more frequently has been a game changer for these high-performing organizations, allowing them not only to move faster but also to create more humane working conditions and happier, more productive workers. Kim also found that fear of doing deployments is the most accurate predictor of success in organizations—those that fear deployments have less success than those that don’t.


Gene Kim

The final session of the morning plenary was presented by Charles Betz, IT Strategist, Advisor and Author from Armstrong Process Group. Betz provided an overview of how the IT4IT framework can be used within organizations to streamline IT processes, particularly by automating tasks that no longer need to be done by hand. Standardizing IT processes also provides a way to deliver more consistent results across the entire IT value chain for better business results. Taking an iterative and team-oriented approach is also essential for managing the body of knowledge necessary for changing IT processes and creating digital transformation.

During the lunch hour, conference partners Hewlett Packard Enterprise and Simplilearn gave separate presentations for attendees, discussing the use of IT4IT for digital transformation and skills acquisition in the digital economy, respectively.

Monday afternoon, The Open Group hosted its fourth TOGAF®, an Open Group standard, User Group meeting in addition to the afternoon speaking tracks. The User Group meeting consisted of an Oxford style debate on the pros and cons of “Create versus Reuse Architecture,” featuring Jason Uppal, Open CA Level 3 Certified Architect, QRS, and Peter Haviland, Managing Director, Head of Engineering & Architecture, Moody’s Corporation. In addition to the debate, User Group attendees had the opportunity to share use cases and stories with each other and discuss improvements for TOGAF that would be beneficial to them in their work.

The afternoon sessions consisted of five separate tracks:

  • IT4IT in Practice – Rob Akershoek from Logicalis/Shell Information Technology International moderated a panel of experts from the morning plenary as well as sessions related to presenting IT4IT to executives, the role of EA in the IT value chain and using IT4IT with TOGAF®.
  • Digital Business & the Customer Experience – Featuring sessions on architecting digital businesses and staying ahead of disruption hosted by Ron Schuldt of Femto-data.
  • Open Platform 3.0™/Cloud – Including talks on big data analytics in hybrid cloud environments and using standards and open source for cloud customer reference architectures hosted by Heather Kreger, Distinguished Engineer and CTO International Standards, IBM.
  • Open Trusted Technology – Trusted Technology Forum Director Sally Long introduced sessions on the new O-TTPS self-assessed certification and addressing product integrity and supply chain risk.
  • Open Business Architecture – Featuring an introduction to the new preliminary Business Architecture (O-BA) standard presented by Patrice Duboe, Innovation VP, Global Architects Leader from the CTO Office at Capgemini, and Venkat Nambiyur, Director – Business Transformation, Enterprise & Cloud Architecture, SMBs at Oracle.

Monday’s proceedings concluded with an evening networking reception featuring the day’s speakers, IT professionals, industry experts and exhibitors. Thanks also go to the event’s sponsors: Premium Sponsors Good eLearning, Hewlett Packard Enterprise, Orbus Software and Simplilearn, as well as sponsors Van Haren Publishing, the Association of Enterprise Architects and San Jose State University.

@theopengroup #ogSFO


Gaining Executive Buy-In for IT4IT™: A Conversation with Mark Bodman

By The Open Group

With many organizations undergoing digital transformation, IT departments everywhere are taking serious hits. And although technology is at the heart of many business transformations, IT has traditionally had a reputation as a cost center rather than an innovation center.

As such, executives are often skeptical when presented with yet another new IT plan or architecture for their organizations that will be better than the last. Due to the role Enterprise Architects play in bridging the gap between the business and IT, it’s often incumbent on them to make the case for big changes when needed.

Mark Bodman, Senior Product Manager at ServiceNow and formerly at HPE, has been working with and presenting the IT4IT standard, an Open Group standard, to executives for a number of years. At The Open Group San Francisco 2017 event on January 30, Bodman will offer advice on how to present IT4IT in order to gain executive buy-in. We spoke with him in advance of the conference to get a sneak peek before his session.

What are Enterprise Architects up against these days when dealing with executives and trying to promote IT-related initiatives?

The one big change that I’ve seen is the commoditization of IT. With the cloud-based economy and the ability to rent cheap compute, storage and networking, being able to effectively leverage commodity IT is a key differentiator that will make or break an organization. At the end of the day, the companies that can exploit cheaper technology to do unique things faster are the ones that will come out ahead long-term. Companies based on legacy technologies that don’t evolve will stall out and die.

Uber and Netflix are great case studies for this trend. It’s happening every day around us—and it’s reaching a tipping point. Enterprise Architects are faced with communicating these scenarios within their own organizations—use cases like going digital, streamlining for costs, sourcing more in the cloud—all strategies required to move the needle. Enterprise Architects are the most senior technical people within IT. They bridge the gap between business and technology at the highest level—and have to figure out ‘How do I communicate and plan for these disruptions here so that we can survive in the digital era?’

It’s a Herculean task, not an easy thing to do. I’ve found there are varying degrees of success for Enterprise Architects. Sometimes, through no fault of their own, because they are dealing with politics, they can’t move the right agenda forward. Or the EA may be dealing with a Board that just wants to see financial results the next quarter and doesn’t care about long-term transformations. These are the massive challenges that Enterprise Architects deal with every day.

Why is it important to properly present a framework like IT4IT to executives right now?

It’s as important as the changes in accounting rules were to organizations. How those rules and regulations changed in response to Enron and the other big financial failures within recent memory was quite impactful. When an IT shop is implementing services and running the IT organization as a whole, what is the operating model it uses? Why is one IT shop so different from another when we’re all facing similar challenges, using similar resources? I think it’s critically important to have a vetted industry standard to answer these questions.

Throughout my career, I’ve seen many different models for running IT from many different sources. From technology companies like HPE and IBM, to consulting companies like Deloitte, Accenture and Bain; each has its own way of doing things. I refer to this as the ‘IT flavor of the month.’ One framework is chosen over another depending on what leadership decides for their playbook—they get tired of one model, or a new leader imposes the model they are familiar with, so they adopt a new model and change the entire IT operating model, which is quite disruptive.

The IT4IT standard takes that whole answer to ‘how to run IT as a business’ out of the hands of any one source. That’s why a diverse set of contributors is important, like PWC and Accenture–they both have consulting practices for running IT shops. Seeing them contribute to an open standard that aggregates this know-how allows IT to evolve faster. When large IT vendors like ServiceNow, IBM, Microsoft and HPE are all participating and agreeing upon the model, we can start creating solutions that are compatible with one another. The reason we have Wi-Fi in every single corner of the planet, or cellular service that you can use from any phone, is because we standardized. We need to take a similar approach to running IT shops—renting commoditized services, plugging them in, and managing them with standard software. You can’t do that unless you agree on the fundamentals; the IT4IT standard provides much of this guidance.

When Enterprise Architects are thinking about presenting a framework like IT4IT, what considerations should they make as they’re preparing to present it to executives?

I like to use the word ‘contextualize,’ and the way I view the challenge is that if I contextualize our current operating model against IT4IT, how are we the same or different? What you’ll mostly find is that IT shops are somewhat aligned. A lot of the work that I’ve done with the standard over the past three years is to create material that shows IT4IT in multiple contexts. The one that I prefer to start with for an executive audience is showing how the de-facto plan-build-run IT organizational model, which is how most IT shops are structured, maps to the IT4IT structure. Once you make that correlation, it’s a lot easier to understand how IT4IT fits across your particular organization, filling some glaring gaps in plan-build-run.

Recently I’ve created a video blog series on YouTube called IT4IT Insights to share these contextual views. I’ve posted two videos so far, and plan to post a new video per month. I have posted one video on how Gartner’s Bi-Modal concept maps to IT4IT concepts, and another on the disruptive value that the Request to Fulfill value stream provides IT shops.

Why have executives been dismissive of frameworks like this in the past and how can that be combatted with a new approach such as IT4IT?

IT4IT is different from anything I have seen before. I think it’s the first time we have seen a comprehensive business-oriented framework created for IT as an open standard. There are some IT frameworks specific to vertical industries out there, but IT4IT is really generic and addresses everything that any CIO would worry about on a daily basis. Of course they don’t teach CIOs IT4IT in school yet—it’s brand new. Many IT execs come from consulting firms where they have grown very familiar with a particular IT operating model, or they were promoted through the years, establishing their own unique playbook along the way. When a new standard framework like IT4IT comes along and an Enterprise Architect shows them how different it might be from what the executive currently knows, it’s very disruptive. IT executives got to that position through growth and experience using what works; it’s a tough pill to swallow to adopt something new like IT4IT.

To overcome this problem it’s important to contextualize the IT4IT concepts. I’m finding many of the large consulting organizations are just now starting to learn IT4IT—some are ahead of others. The danger is that IT4IT takes some of that unique IP away, and that’s a little risky to them, but I think it’s an advantage if they get on the bandwagon first and can contextually map what they do now against IT4IT. One other thing that’s important is that since IT4IT is an open standard, organizations may contribute intellectual property to the standard and be recognized as the key contributor for that content. You see some of this already with Accenture’s and PWC’s contributions. At the same time, each consulting organization will hold some of its IP back to differentiate itself where applicable. That’s why I think it’s important for people presenting IT4IT to contextualize it to their particular organization and practice. If they don’t, it’s just going to be a much harder discussion.

As with any new concept, eventually you find the first few who get it and latch on to become the ‘IT4IT champion.’ It’s very important to have at least one IT4IT champion to really evangelize the IT4IT standard and drive adoption. That champion might not be in an executive position able to change things in their organization, but it’s an important job to educate and evangelize a better way of managing IT.

What lessons have you learned in presenting IT4IT to executives? Can you offer some tips and tricks for gaining mindshare?

I have many that I’ll talk about in January, but one thing that seems to work well is that I take a few IT4IT books into an executive briefing, usually the printed standard and the pocket guide. I’ll pass them around the room while I present the IT4IT standard. (I’m usually presenting the IT4IT standard as part of a broader executive briefing agenda.) I usually find that the books get stuck with someone in the room who has cracked one open and recognized something of value. They will usually want to keep the book after that, and at that point I know who my champion is. I then gauge how passionate they are by making them twist my arm to keep the book. This usually works well to generate discussion of what they found valuable, in the context of their own IT organization and in front of the other executives in the room. I recently used this approach when presenting to the CIO of a major insurance company. I passed the books around during my presentation and found them back in front of me. I was thinking that was it, no takers. But the CIO decided to ask for them back once I concluded the IT4IT presentation. The CIO was my new champion, and everyone in the room knew it.

What about measurement and results? Is there enough evidence out there yet on the standard and the difference it’s making in IT departments to bring measurement into your argument to get buy-in from executives?

I will present some use cases that have crystal-clear results, though I can’t communicate financials. The more tangible measurements are around the use cases where we leveraged the IT4IT standard to rationalize the current IT organization and tools to identify redundancies. One of the things I learned 10 years ago, well before the IT4IT standard was around, was how to rationalize applications for an entire organization when they have gotten out of hand from a rash of M&A activity. Think about the redundancies created when two businesses merge. You’re usually merging because of a product or market that you are after; there’s some business need driving that acquisition. But all the common functions, like HR and finance, are redundant. This includes IT technologies and applications to manage IT, too. You don’t need two HR systems, or two IT helpdesk systems; you’ve got to consolidate this to a reasonable number of applications to do the work. I have tackled IT rationalization by using the IT4IT standard, going through an evaluation process to identify redundancies per functional component. In some cases we have found more than 300 tools that perform the same IT function, like monitoring. You shouldn’t need 300 different monitoring tools—that’s ridiculous. This is just one clear use case where we’ve applied IT4IT to identify similar tools and processes that exist within IT specifically, a very compelling business case to eliminate massive redundancy.
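At its simplest, the rationalization exercise Bodman describes amounts to grouping a tool inventory by the IT4IT functional component each tool serves and flagging components with excessive overlap. A minimal sketch in Python, with a made-up inventory and invented tool names:

```python
from collections import defaultdict

# Hypothetical inventory: (tool name, IT4IT functional component it maps to).
INVENTORY = [
    ("NagiosX", "Monitoring"), ("WatchTower", "Monitoring"),
    ("OpsEye", "Monitoring"), ("DeskPro", "Service Desk"),
    ("HelpNow", "Service Desk"), ("DeployBot", "Release Composition"),
]

def find_redundancies(tools, threshold=2):
    """Group tools by functional component and flag any component whose
    tool count reaches the threshold as a rationalization candidate."""
    by_component = defaultdict(list)
    for name, component in tools:
        by_component[component].append(name)
    return {component: names for component, names in by_component.items()
            if len(names) >= threshold}

for component, names in find_redundancies(INVENTORY).items():
    print(f"{component}: {len(names)} overlapping tools -> {names}")
```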

Does the role of standards also help in being able to make a case for IT4IT with executives? Does that lend credence to what you’re proposing and do standards matter to them?

They do, in a way. With accounting, having non-standard rules today might land your executives in jail. A non-standard IT shop won’t land you in jail, but it will increase the cost of everything you do and increase risks, because you’re going against the grain for something that should be a commodity. At the executive level, you need to contextualize the problem of being non-standard and show them how adopting the IT4IT standard may be similar to the accounting rule standardization.

Another benefit of standards I point to is that the standard is open, the result of vetting good ideas from many different organizations vs. trying to make it up as you go. The man-years of experience that went into the standard, and the elegance of the result, become a compelling argument for adoption that shouldn’t be overlooked.

What else should EAs take into consideration when presenting something like IT4IT to executives?

I think the primary thing to remember is to contextualize your conversation to your executives and organization. Some executives in IT may have zero technology background; some may have come up through the ranks and still know how to program. You’ve got to tell the story based on the audience and tailor it. I presented recently to 50 CIOs in Washington D.C., so I had to contextualize the standard to show how IT4IT relates to the major changes happening in the federal market, such as the Federal Information Technology Acquisition Reform Act (FITARA), and how it supports the Federal Enterprise Architecture framework. These unique requirement changes had to be contextualized against the IT4IT standard so the audience understood exactly how IT4IT relates to the big challenges, unique to their market, that they are dealing with.

Any last comments?

The next phase of the IT4IT standard is just taking off.  The initial group of people who were certified are now using IT4IT for training and to certify the next wave of adopters. We’re at a point now where the growth is going to take off exponentially. It takes a little time to get comfortable with something new and I’m seeing this happen more quickly in every new engagement. Enterprise Architects need to know that there’s a wealth of material out there, and folks who have been working with the IT4IT standard for a long time. There’s something new being published almost every day now.

It can take a while sometimes from first contact to reaching critical mass adoption, but it’s happening. In my short three weeks at ServiceNow so far, I have already had two customer conversations on IT4IT; it’s clearly relevant here too—and I have been able to show relevance to every other IT shop and vendor in the last three years. This new IT4IT paradigm does need to soak in a bit, so don’t get frustrated about the pace of adoption and understanding. One day you might come across a need and pull out the IT4IT standard to help in some way that’s not apparent right now. It’s exciting to see people who worked on the initial phases of the standard’s development now working on their next gig. It’s encouraging to see folks in their second and even their third job leveraging the IT4IT standard. This is a great indicator that the IT4IT standard is being accepted and starting to become mainstream.

@theopengroup #ogSFO

Mark Bodman is an experienced, results-oriented IT4IT™ strategist with an Enterprise Architecture background, an executive adviser, thought leader and mentor. He previously worked on cross-portfolio strategies to shape HPE’s products and services, including multi-source service brokering and IT4IT adoption. Mark has recently joined ServiceNow as the outbound Application Portfolio Management Product Manager.

Hands-on experience from years of interaction with multiple organizations has given Mark a unique foundation of experience and IT domain knowledge. Mark is well versed in industry standards such as TOGAF®, an Open Group standard, COBIT, and ITIL, has implemented portfolio management and EA practices, chaired governance boards within Dell, managed products at Troux, and helped HPE customers adopt strategic transformation planning practices using reference architectures and rationalization techniques.

 

 


What is Open FAIR™?

By Jim Hietala, VP, Business Development and Security, The Open Group

Risk Practitioners should be informed about the Open FAIR body of knowledge, and the role that The Open Group has played in creating a set of open and vendor-neutral standards and best practices in the area of Risk Analysis. For those not familiar with The Open Group, our Security Forum has created standards and best practices in the area of Security and Risk for 20+ years. The Open Group is a consensus-based and member-driven organization. Our interest in Risk Analysis dates back many years, as our membership saw a need to provide better methods to help organizations understand the level of risk present in their IT environments. The Open Group membership includes over 550 member organizations from both the buy-side and supply-side of the IT industry. The Security Forum currently has 80+ active member organizations contributing to our work.

A History of Open FAIR and The Open Group

In 2007, Security Forum Chairman Mike Jerbic brought the Factor Analysis of Information Risk (FAIR) to our attention, and suggested that it might be an interesting Risk Analysis taxonomy and method to consider as a possible open standard in this area. Originally created by Jack Jones and his then company Risk Management Insights (RMI), Jack and his partner Alex Hutton agreed to join The Open Group as members, and to contribute the FAIR IP as the basis for a possible open risk taxonomy standard.

Over a period of time, the Security Forum membership worked to create a standard comprising relevant aspects of FAIR (this initially meant the FAIR Risk Taxonomy). The result of this work was the publication of the first version of the Risk Taxonomy Standard (O-RT) in January 2009. In 2012, the Security Forum decided to create a certification program for practitioners of the FAIR methodology, and undertook a couple of related efforts: to update the Risk Taxonomy Standard, and to create a companion standard, the Risk Analysis Standard (O-RA). O-RA provides guidance on the process aspects of Risk Analysis that are lacking in O-RT, including things like risk measurement and calibration, the Risk Analysis process, and control considerations relating to Risk Analysis. The updated O-RT standard and the O-RA standard were published in late 2013, and the standards are available here:

C13G Risk Analysis (O-RA)

C13K Risk Taxonomy (O-RT), Version 2.0
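O-RT factors risk into Loss Event Frequency and Loss Magnitude, and O-RA adds guidance on measurement and calibration. One common way practitioners operationalize the taxonomy (a simplification assumed here for illustration, not something the standards mandate) is a Monte Carlo simulation over calibrated low/most-likely/high estimates; all of the ranges below are invented.

```python
import random

def calibrated(low, likely, high):
    """Draw from a triangular distribution over a calibrated estimate:
    the low / most-likely / high inputs typical of Open FAIR analyses."""
    return random.triangular(low, high, likely)

def annualized_loss(trials=100_000):
    """Simulate annual loss as Loss Event Frequency x Loss Magnitude."""
    losses = []
    for _ in range(trials):
        lef = calibrated(0.1, 0.5, 2.0)            # loss events per year (invented)
        magnitude = calibrated(10e3, 50e3, 400e3)  # dollars per event (invented)
        losses.append(lef * magnitude)
    losses.sort()
    return {"mean": sum(losses) / trials,
            "p10": losses[int(0.10 * trials)],     # optimistic tail
            "p90": losses[int(0.90 * trials)]}     # pessimistic tail

print(annualized_loss())
```

An analyst would typically compare such distributions with and without a proposed control to judge whether the reduction in expected loss justifies the control’s cost.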

We collectively refer to these two standards as the Open FAIR body of knowledge.  In late 2013, we also commenced operation of the Open FAIR Certification Program for Risk Analysts. In early 2014, we started development of an accreditation program for Open FAIR accredited training courses. The current list of accredited Open FAIR courses is found here. If you are with a training organization and want to explore accreditation, please feel free to contact us, and we can provide details. We have also created licensable Open FAIR courseware that can enable you to get started quickly with training on Open FAIR. Future articles will dive deeper into the Open FAIR certification program and the accredited training opportunity. It is worth noting at this point that we have also produced some hard copy Open FAIR guides that are helpful to candidates seeking to certify to Open FAIR. These are accessible via the links below, and are available at a nominal cost from our publishing partner Van Haren.

B140   Open FAIR Foundation Study Guide

G144  A Pocket Guide to the Open FAIR Body of Knowledge

Beyond the standards and certification program work, The Open Group has produced a number of other helpful publications relating to Risk, Security, and the use of Open FAIR. These include the following, all of which are available as free downloads:

W148  An Introduction to the Open FAIR Body of Knowledge

C103  FAIR – ISO/IEC 27005 Cookbook

G167  The Open FAIR™ – NIST Cybersecurity Framework Cookbook

G152  Integrating Risk and Security within a TOGAF® Enterprise Architecture

G081  Requirements for Risk Assessment Methodologies

W150  Modeling Enterprise Risk Management and Security with the ArchiMate® Language

Other Active Open FAIR Workgroups in the Security Forum

In addition to the standards and best practices described above, The Open Group has active workgroups developing the following related items. Stay tuned for more details of these activities. If any of the following projects are of interest to your organization, please feel free to reach out to learn more.

1) Open FAIR to STIX Mapping Whitepaper. This group is writing a whitepaper that maps the Open FAIR Risk Taxonomy Standard (O-RT) to STIX, a standard which originated at MITRE, and is being developed by OASIS.

2) Open FAIR Process Guide project – This group is writing a process guide for performing Open FAIR-based Risk Analysis. This guide fills a gap in our standards & best practices by providing a “how-to” process guide.

3) Open Source Open FAIR Risk Analysis tool – A basic Open FAIR Risk Analysis tool is being developed for students and industry.

4) Academic Program – A program is being established at The Open Group to support active student intern participation in risk activities within the Security Forum. The mission is to promote the development of the next generation of security practitioners and experience within a standards body.

5) Integration of Security and Risk into TOGAF®, an Open Group standard. This project is working to ensure that future versions of the TOGAF standard will comprehensively address security and risk.

How We Do What We Do

The Open Group Security Forum is a member-led group that aims to help members meet their business objectives through the development of standards and best practices. For the past several years, the focus of our work has been in the areas of Risk Management, Security Architecture, and Information Security Management standards and best practices. ‘Member-led’ means that members drive the work program, proposing projects that help them to meet their objectives as CISOs, Security Architects, Risk Managers, or operational information security staff. All of our standards and best practices guidance is developed using our open, consensus-based standards process.

The standards development process at The Open Group allows members to collaborate effectively to develop standards and best practices that address real business issues. In the area of Risk Management, most of the publications noted above were created because members saw a need to determine how to apply Open FAIR in the context of other standards or frameworks, and then leveraged the entire Security Forum membership to produce useful guidance.

It is also worth noting that we do a lot of collaborating with other parts of The Open Group, including with the Architecture Forum on the integration of Risk and Security with TOGAF®, with the ArchiMate™ Forum on the use of ArchiMate, an Open Group standard, to model Risk and Security, with the Open Platform 3.0™ Forum, and with other Forums. We also have a number of external organizations that we work with, including SIRA, ISACA, and of course the FAIR Institute in the Risk Management area.

The Path Forward for Open FAIR

Our future work in the area of Risk Analysis will likely include other cookbook guides, showing how to use Open FAIR with other standards and frameworks. We are committed to meeting the needs of the industry, and all of our work comes from members describing a need in a given area. So in the area of Risk Management, we’d love to hear from you as to what your needs are, and even more, to have you contributing to the development of new materials.

For more information, please feel free to contact me directly via email or LinkedIn.

 

@theopengroup

Jim Hietala, Open FAIR, CISSP, GSEC, is Vice President, Business Development and Security for The Open Group, where he manages the business team, as well as Security and Risk Management programs and standards activities. He has participated in the development of several industry standards including O-ISM3, O-ESA, O-RT (Risk Taxonomy Standard), O-RA (Risk Analysis Standard), and O-ACEML. He also led the development of compliance and audit guidance for the Cloud Security Alliance v2 publication.

Jim is a frequent speaker at industry conferences. He has participated in the SANS Analyst/Expert program, having written several research white papers and participated in several webcasts for SANS. He has also published numerous articles on information security, risk management, and compliance topics in publications including CSO, The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

An IT security industry veteran, he has held leadership roles at several IT security vendors.

Jim holds a B.S. in Marketing from Southern Illinois University.


Digital Transformation and Disruption – A Conversation with Sriram Sabesan

By The Open Group

The term “disruption” has been the de rigueur description for what’s been going on in the technology industry for a number of years now. But with the pressures of a digital economy facing all industries now, each is being disrupted in numerous ways.

Although disruption is leading to new and better ways of doing things, it can also be devastating for businesses and industries if they choose to ignore the advances that digitalization is bringing. Companies that don’t want to be left behind have to adapt more quickly than ever—or learn to disrupt themselves.

Sriram Sabesan, a partner with Conexiam, believes that a certain amount of disruption, or mitigation of disruption, can indeed be architected by an enterprise—if it has the foresight to do so. We spoke with him in advance of his session at The Open Group San Francisco 2017 event to learn more about how enterprises can architect their own disruptions.

We’ve been hearing a lot about disruption over the past few years. How do you define disruption and what is the disruption curve?

Disruption normally happens when you don’t anticipate something and the change suddenly catches you unprepared. In fact, the changes have been happening, but no one has taken the time to connect the dots. To give an example, let us consider an individual holding a mutual fund which has significant stakes in property and casualty (P&C) insurance businesses. The impact of a shared economy (Uber, Lyft, Airbnb) is that the number of ‘owners’ is likely to stay flat or see a marginal increase. This cascades into a smaller number of insured people, hence diminished revenue for the insurance provider. This impacts the stock valuation of the P&C companies, finally impacting the individual owning the mutual fund with an interest in the P&C sector. And that’s a foresight people might not have. This is not about crying ‘wolf,’ but about mitigating potential risk to an asset—at every step of the chain.

Let us take another example. Most manufacturing businesses hold a reasonable stock of spare parts for their machinery. Even at home, we hold metallic clips, nails, etc. With 3D printing, one may only need to keep the raw materials—sheet metal or plastic or whatever the main product is made of—and print spare parts as required. At home, we don’t have to stock clips, pins or nails—just raw material. 3D printing impacts the businesses that produce these products today, some positively (for example, e-Nable – http://enablingthefuture.org/) and some in unknown ways.

It is about walking the chain. The company adopting a new technology or approach may not be the one getting impacted, and it may not be about the industry vertical that is adopting a new model. It is most likely those taking part further along the chain who are impacted by the cascading effect. It’s a system-of-systems game.

The Disruption Curve is based on the product maturity ‘S-curve.’ Familiarity breeds contempt and raises expectations. As people get used to doing something in a certain way, some start to notice the little annoyances, and others want to do things differently or better. For businesses, it is the necessity of creating a new revenue model. The next S-curve is born when the old S-curve approaches its top end. The best definition is given by Prof. Clayton Christensen of Harvard Business School, but the simplest interpretation could be ‘an unexpected change to the way one does things, or when someone is unseated.’ For this topic, I think everyone is trying to improve their personal productivity, whether that means better disposable income, a dose of vacation or a personal moment for themselves. Any and all of these will cause a disruption.

In your opinion, what have been the biggest industry disruptions to occur in the past 10 years?

Most of the changes happened in isolation in the past. There was no significant combinatorial effect that could transcend multiple industry verticals the way there is today.

Google disrupted search; Amazon disrupted in-store purchase models; Netflix disrupted the DVD rental market. They all leveraged the internet. Google was able to capture and connect the contents of websites and scanned copies of books, triggering the birth of ‘big data.’ Amazon, on the other side, found when they started having too many products that no existing ecosystem could support the enterprise across the globe, so they came up with AWS. What they made internally, they also made a commercial, external-facing product. Skype changed telephony; PayPal changed money exchange.

Growth in metallurgy and medical sciences evolved from the foundations laid in the latter half of the last century: growing human body parts in the lab, implantable devices, and so on. The last decade made remote, continuous monitoring of human behavior and health possible.

But the biggest change is that all these companies discovered that they actually depend on each other. Netflix on AWS, AWS on fiber optic cable owners, both of them on the last-mile internet service providers, and so on. That realization, and the resultant collaboration via open standards, is the biggest of all.

All of them changed some of the fundamentals of human-to-human and human-to-machine interaction. The new model enables any individual to provide a solution rather than waiting for large enterprises to initiate change.

Who have been the most influential disruptors? Is it just the usual suspects like Uber or Airbnb, or are there other disruptors having a greater influence on industries that people are less aware of?

It depends on the vertical. As I said before, in the past decade most disruptions stayed within a single vertical.

If you think about tax filing, Intuit has been the most influential in that area with TurboTax. They made a lot of things easier. Now you can take a picture of your W-2 and 80% of your filing work is completed. With another product, Mint.com, they became a personal finance advisor in a non-intrusive way, working with your banks, investment accounts and credit card accounts. PayPal and Square are disruptors in the ecommerce and money movement sectors.

Each vertical had its own set of disruptors; not everyone came together. But now more and more players are coming together because the services are so interdependent. Apple with iTunes changed the whole music industry; Amazon's Kindle did the same for books. IBM with Watson is changing multiple verticals.

Medical devices are also undergoing a lot of change in terms of things that can be implanted in human beings and monitored wirelessly, giving real-time information to doctors. The most common human behavior is to visit doctors only when we are not healthy, so doctors have no data points on the transition from a healthy state to an unhealthy state, what happened and why it happened. Now they can monitor a person and their behavior continuously. I recently read about an emergency room that used data from a Fitbit to figure out what had happened to a patient and treat the patient very quickly. Because the patient wasn't conscious, the team relied on the transition and the data points stored in the device to make the diagnosis.

So, I guess, there are more unusual suspects and players. To name a few: Khan Academy and OpenCourseWare in education, e-NABLE for exoskeletal structures, derivatives of the military's 'ready-to-eat' meals. There are also new products like 'OK Google,' 'Alexa' and 'x.ai,' which combine several aspects.

Your talk at The Open Group San Francisco advocates for an “architected approach” to disruption. Can disruption be architected or is there a certain amount of happenstance involved in having that kind of impact on an industry?

There is some element of happenstance. However, most disruptions are architected.

An enterprise invariably architects for disruption, or reacts rapidly to mitigate disruptive threats in order to sustain the business. There are some good examples that go unnoticed or are written off as the natural evolution of an industry.

I believe Qantas was the first airline to realize that replacing seat-mounted inflight entertainment (IFE) units with iPads saved at least 15 pounds per seat. Even after adding 40% more seats, eliminating these devices reduced the overall weight of a Boeing 777 by 7%. Simply by observing inflight human behavior and running test flights without IFEs, airlines architected this change. The moment the savings were realized, almost every airline followed. This is an example of architected change. As regulators started accepting the use of Wi-Fi devices at any altitude, compliance work done at the gate by the pilot and maintenance crew also switched to hand-held devices, meaning less paper and faster turnaround times. The savings in weight resulted in a lower overall operating cost per flight, contributing to either lower prices or more cargo revenue for the airline.
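As a rough illustration of the arithmetic at work, here is a back-of-envelope sketch in Python. All of the figures apart from the 15-pound saving quoted above (cabin size, fuel burn factor, sector length) are assumed placeholders, not Qantas's or Boeing's actual numbers.

```python
# Back-of-envelope weight and fuel arithmetic for removing seat-back IFE units.
# Every figure below is an assumption for illustration only.
seats           = 300     # assumed cabin size
saving_per_seat = 15.0    # pounds saved per seat, per the observation above
burn_factor     = 0.035   # assumed lb of fuel burned per lb carried, per flight hour
sector_hours    = 10      # assumed average sector length

weight_removed = seats * saving_per_seat
fuel_saved     = weight_removed * burn_factor * sector_hours

print(f"weight removed:              {weight_removed:,.0f} lb")
print(f"approx. fuel saved / sector: {fuel_saved:,.0f} lb")
```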

Every enterprise can anticipate changes in human behavior, or nudge a new behavior, and build a new business model around it. Apple's introduction of touch devices and natural interfaces is another example of well-architected and well-executed change.

There are parts of a business that need significant effort to change due to cascading impacts, say an ERP, CRM or SCM system. Even shifting them from on-premise to cloud would appear daunting. However, the industry has started to chip away at the periphery of these solutions, moving what can be moved to the cloud. The issue is not technical feasibility or the availability of new solutions; it is more about recognizing what to change and when to change it. Balancing the economics of the current way of doing things against the cost of change and of post-change operations will simplify decision making. The architect has to look outside the enterprise for inspiration, identify the points of friction within the enterprise, and perform a techno-economic analysis to architect a solution.
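A minimal sketch of that techno-economic balance, assuming a simple steady-state model; the function and every figure are hypothetical, for illustration only:

```python
# Weigh the current run cost against the one-off cost of change plus the
# post-change run cost. All figures are hypothetical placeholders.
def breakeven_months(current_monthly, change_cost, new_monthly):
    """Months until cumulative savings repay the one-off cost of change;
    None if the change never pays back."""
    saving = current_monthly - new_monthly
    if saving <= 0:
        return None
    return change_cost / saving

# e.g. moving a peripheral workload from an on-premise estate to cloud
months = breakeven_months(current_monthly=120_000,
                          change_cost=900_000,
                          new_monthly=80_000)
print(f"break-even after ~{months:.1f} months" if months else "change never pays back")
```

If the break-even horizon falls inside the planning window, that piece of the periphery is worth chipping away at; if not, the friction point stays on the watch list.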

Sometimes a group of architects, or a group of industries, realizes the need for a change and collectively guides it. For example, consider The Open Group's new Open Process Automation Forum. What would normally appear to be disconnected verticals – oil and gas, food processing, pharmaceuticals, fabric and cable manufacturing – have come together to solve process management problems. The current equipment suppliers to such companies are also part of the forum. The way the forum works will lead to incremental changes; the results will appear to be the natural evolution of the industry, but the fact that these players have come together can itself be called a disruption to an otherwise normal way of operating. With this comes the possibility of collaboration and mutual learning between operations technology and information technology.

I know of car companies, insurance companies and highway management companies that have quietly started collaborating to explore solar panels embedded in roads and live charging of automobiles. An extended 'what if' scenario is using GPS to identify the availability of solar-panel-embedded roads, matching that with the driving behavior of the car's owner, and deciding whether the charge in the car's battery can be used as a source of power to reduce the burden on the electric grid. Last month I read an article reporting that the first solar panel road is a reality. For the metering and charging of power consumption, this may not be much of a disruption. But adjoining areas like regulations, parking privileges and toll charges will be impacted. It is a question of whether the players react soon enough to make the transition gradual, or suddenly wake up and call it a disruption.

Is it possible for established enterprises to be the arbiters of disruption or is that really something that has to come out of a start-up environment? Can an architected approach to disruption help established companies keep their edge?

Yes and no. The way most companies have grown is to protect what they have already established. A good number of organizations operate under the philosophy that failure is not an option, which implies that risk-taking has to be reduced, which in turn stifles innovation. They will innovate only within set boundaries and allowances for failure. Start-ups have the mindset that failure is an option because they have nothing to lose; they are looking for the right fit.

To be an arbiter, start-up or established enterprise, take a page from Stanford University's research on Design Thinking and Service Blueprinting. It provides a framework for innovation, and possibly disruption, by any organization, not just start-ups. Progressive's telemetry device is just the beginning: once customers understand the limits of privacy management, all insurance companies will change the way they rate premiums. Just look at the rapid changes the TSA made to full-body scanners, where scanned images quickly went from near-real body shapes to a template outline. Customer outrage forced that change.

Some big enterprises are actually working with start-ups to figure out what changes the start-ups want to make and what kinds of pain points they are offsetting. There are companies that work with an agenda of changing the operating model of a whole industry.

In the U.S., one can look at Capital One, Amazon (the retail business, not AWS), Megabus, and Old Navy for creating new business models, if not complete disruptions. Expedia's founders went on to create Glassdoor and Zillow; Expedia itself was founded on making search, the comparison of competitive offers, and decision-making simple. The bottom line is whether the philosophy with which an enterprise was created has become its DNA, resulting in new verticals and value creation in the eyes of investors.

It is possible to have an architected disruption approach moving forward, but it comes from a place where the company defines the level of risk and change it is willing to take on. At the end of the day, public companies are under constant pressure for quarterly results, so big changes may not be possible; but they may be doing small incremental things that morph into something else we cannot yet easily see.

Is architected disruption a potential new direction that Enterprise Architects can take as either a career path or as a way to show their continued relevance within the enterprise?

Yes. Let me qualify that. As things stand today, there are three kinds of architects.

The first kind of architects guide and oversee implementation; they are the people who make sure that what has been planned is executed according to plan. These architects are not chartered to create or mitigate disruptions. It is the nature of the task given to them that distances them from effecting big changes.

The second kind of architects focus on integrating things across businesses or departments and execute alongside the strategy leaders of the company. These architects are on the periphery of enabling disruption, or of mitigating the impacts of a disruption through an architected approach; they most often react to disruptions or changes.

The third set of architects are trying to provide the strategy for the company's success: creating roadmaps, operating at the edges of the corporate charter or philosophy, and thinking about every moving part within and outside the enterprise. They are on the lookout for what is happening in human behavior, machine behavior and automation, and they try to modify the portfolio quarter by quarter, if not sooner. It is tricky for these architects to keep track of everything happening around them, so it is normal to get lost in the noise.

With the right attitude and opportunity, an architect can create a career path to move from the first kind to the third kind.  Having said that, let me be clear, all three kinds of architects are relevant and required for an enterprise to function.

Is there a role for standards in architected disruption?  

Yes. Standards provide a buffer zone that limits the impact of disruption. They also provide a transition path for adopting a new way of doing things.

Standards help in a couple of ways. The Open Group sets standards for Boundaryless Information Flow™, and at the end of the day, no business is an island. So when a payment or financial e-commerce transaction moves from a bank to a PayPal account to a mobile wallet or a phone number, you need certain communication protocols and exchange standards to be defined, along with the kind of failure mitigation that needs to be in place. That is one way.

The second is supporting management decision-makers: CEOs and COOs. We have to provide them with information that says, 'if you do this within these confines, the possibility of failure goes down.' It is about making it easier for them to decide on and take on a change effort.

Standards thus provide a framework for adopting change, for helping management decisions mitigate risk, and for making an ecosystem work well together.

Are there any other ways that disruption can be planned for?

One way is to look at business patterns and the economic indicators that come along with those patterns.

Would Uber have survived in the mid-to-late 1990s? Probably not, because the economy was growing and more affluent. The economic pressure of the late 2000s diminished total disposable income, so people were open to certain changes in their habits. Not only were they open in their thinking about socializing, they were open to penny-pinching as well.

There are parts of businesses that are hard to change, like the logistics management and ERP systems of an airline, the clearing house operations of banking systems, or cross-border, high-value sales. There are parts of the business that can change with minimal impact. Gartner calls this concept Pace-Layering. We have to look for such layered patterns and make the problem easier to solve; the growth part will be complemented by what is going on outside the enterprise. The sketch below illustrates the idea.
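As a sketch of how such layering might be recorded, here is a small Python example tagging each system with a pace layer and a review cadence. The layer names follow Gartner's published terminology; the systems and cadences are illustrative assumptions.

```python
# Pace-layered inventory: match review cadence to each layer's rate of change.
# Systems and cadences below are assumptions for illustration.
pace_layers = {
    "systems of record":          (["ERP", "clearing house ops"],    "review yearly"),
    "systems of differentiation": (["pricing", "logistics routing"], "review quarterly"),
    "systems of innovation":      (["customer-facing apps"],         "review monthly"),
}

for layer, (examples, cadence) in pace_layers.items():
    print(f"{layer:28} {cadence:18} e.g. {', '.join(examples)}")
```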

There are many examples of products that were too far ahead of their time for users to imagine or accept the change, and that failed as a result. Uber and Ford, despite following different approaches to delivering their products to the market, both focused on the problem of mobility and on the economic and social climate, and were willing to innovate and iterate. OXO products, for example, though they cannot technically be classified as disruptive, changed the way we look at kitchen tools. OXO focused on user research and product fit.

So the winning formula is to focus on market and customer needs. Start by accepting failure, test like there is no tomorrow, and at the first hint of a tipping point, scale.

@theopengroup #ogSFO
Sriram Sabesan leads the Digital Transformation practice at Conexiam. He is responsible for developing best practices and standards in the areas of Social, Mobile, Analytics, Cloud and IoT (SMACIT), Customer Experience Management and governance.

Over the past 20 years, Sriram has led teams specializing in systems engineering, process engineering and architecture development across the federal, technology, manufacturing, telecommunications and financial services verticals. Managing and leading large, geographically distributed teams, Sriram has enabled clients to develop and execute strategies in response to shifts in technology or economic conditions.

Sriram has been an active member of The Open Group since 2010 and is on The Open Group Governing Board.  He has contributed to the development of Open Group standards, snapshots and white papers. He is an Open Group Certified Distinguished Architect and is certified in TOGAF® v8, Scrum Practice and Project Management.

Sriram holds a Bachelor of Science degree in Mechanical Engineering and a Master of Science (Tech) in Power and Energy. He also received diplomas in Financial and Operations Management in 1998.


Looking Forward to a New Year

By Steve Nunn, President & CEO, The Open Group

As another new year begins, I would like to wish our members and The Open Group community a happy, healthy and prosperous 2017! It’s been nearly 15 months since I transitioned into my new role as the CEO of The Open Group, and I can’t believe how quickly that time has gone.

As I look back, it was at The Open Group Edinburgh event in October 2015 that we launched the IT4IT™ Reference Architecture, Version 2.0. In just the short time since then, I’m pleased to report that IT4IT has garnered attention worldwide. The IT4IT Certification for People program that we launched last January—one of the first things I had the pleasure of doing as CEO—has also gained momentum quickly. Wherever I have traveled over the past year, IT4IT has been a topic of great interest, particularly in countries like India and Brazil. There is a lot of potential for the standard globally, and we can look forward to various new IT4IT guides and whitepapers as well as an update to the technical standard in the first few months of this year.

Looking back at 2016 more broadly, a number of events stood out throughout the course of the year. We were excited to welcome back Fujitsu as a Platinum member in April. The Open Group's global reach, and our continued work creating standards relevant to how technology is impacting the worldwide business climate, were key factors in Fujitsu's decision to rejoin, and it's great to have them back.

In addition to Fujitsu, we welcomed 86 new members in 2016. Our membership has been increasing steadily over the past several years; we now have more than 511 members in 42 countries. Our own footprint continues to expand, with staff and local partners now in 12 countries. We have reached a point where not a month goes by without The Open Group hosting an event somewhere in the world. In fact, more than 66,000 people attended an Open Group event either online or in person last year. That is a big number, and it is a reflection of the interest in the work going on inside The Open Group.

I believe this tremendous growth in membership and participation in our activities is due to a number of factors, including our focus on Enterprise Architecture and the continued uptake of TOGAF® and ArchiMate® – Open Group standards – and the ecosystems around them. In 2016, we successfully held the first TOGAF User Group meetings worldwide, and we also released the first part of the Open Business Architecture standard. Members can look forward to additions to that standard this year, as well as updates to the ArchiMate certifications to reflect the latest version of the standard, ArchiMate® 3.0.

In addition, our work with The Open Group FACE™ Consortium has had a significant impact on growth: the consortium added 13 members last year, and it is literally setting the standard for how government customers buy from suppliers in the avionics market. Indeed, such has been the success of the FACE Consortium that it will spin out its own new consortium later this year: SOSA, the Sensor Open Systems Architecture. The FACE Consortium was also nominated for the 2017 Aviation Week Awards in Innovation for ensuring that software conforming to the FACE Technical Standard is open, portable and reusable. Watch this space for more information in the coming months.

2017 will bring new work from our Security and Open Platform 3.0™ Forums as well. The Security and Architecture Forums are working together to integrate security architectures into TOGAF, and we can expect updates to the O-ISM3 security standard and the Open FAIR Risk Analysis and Taxonomy standards later in the year. The Open Platform 3.0 Forum has been hard at work developing materials to contribute to the vast topic of convergence, including the areas of Cloud Governance, Data Lakes, and Digital Business Strategy and Customer Experience. Look for new developments in those areas throughout the course of this year.

As the ever-growing need for businesses to transform for the digital world continues to disrupt industries and governments worldwide, we expect The Open Group's influence to reach far and wide. Standards can help enterprises navigate these rapid changes. I believe The Open Group vision of Boundaryless Information Flow™ is coming to fruition through the work our Forums and Working Groups are doing. Look for us to take Boundaryless Information Flow one step further in January when we announce our latest Forum, the Open Process Automation™ Forum, at our upcoming San Francisco event. This promises to be a real cross-industry activity, bringing together industries as disparate as oil and gas, mining and metals, food and beverage, pulp and paper, pharmaceutical, petrochemical, utilities, and others. Stay tuned at the end of January to learn more about what some prominent companies in these industries have in common, in addition to being members of The Open Group!

With all of these activities to look forward to in 2017—and undoubtedly many more we have yet to see—all signs point to an active, productive and fulfilling year. I look forward to working with all of you throughout the next 12 months.

Happy New Year!

Steve Nunn is President and CEO of The Open Group – a global consortium that enables the achievement of business objectives through IT standards. He is also President of the Association of Enterprise Architects (AEA).

Steve joined The Open Group in 1993, spending the majority of his time as Chief Operating Officer and General Counsel.   He was also CEO of the AEA from 2010 until 2015.

Steve is a lawyer by training, holds an LL.B. (Hons) in Law with French, and retains a current legal practicing certificate. Having spent most of his life in the UK, Steve has lived in the San Francisco Bay Area since 2007. He enjoys spending time with his family, walking, playing golf and listening to 80s music, and is a lifelong West Ham United fan.

@theopengroup @stevenunn