
Do One Thing and Do It Well

By The Open Group

One significant example of “fake news” in 2016 was the announcement that Dennis Ritchie, one of the original authors of the UNIX® Operating System, had passed away. He had in fact died in 2011, a week after the death of Steve Jobs. 2016 instead marked the fifth anniversary of his passing, but one where the extent of his contribution to the world was not overshadowed by others, and could be properly acknowledged.

Much of the central UNIX philosophy that he engineered alongside Bell Labs colleagues Ken Thompson and Brian Kernighan lives on to this day: building systems from a range of modular and reusable software components, so that while many UNIX programs do quite trivial things in isolation, they combine with other programs to become general and useful tools. The envisioned ability to design and build systems quickly, and to reuse tried and trusted software components, remains a cultural norm in environments that employ Agile and DevOps techniques some 45 years later.
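That compositional philosophy is easy to demonstrate with a classic shell pipeline, in which several trivial single-purpose tools combine into a genuinely useful one. A minimal sketch, assuming an `input.txt` in the current directory, that reports the five most frequent words in a file using only standard POSIX utilities:

```shell
# Report the five most frequent words in input.txt.
# Each stage does one small job; the pipe composes them into a tool
# that none of the individual programs provides on its own.
tr -cs '[:alpha:]' '\n' < input.txt |  # split text into one word per line
  tr '[:upper:]' '[:lower:]' |         # normalize case so "The" == "the"
  sort |                               # group identical words together
  uniq -c |                            # count each group
  sort -rn |                           # order by count, highest first
  head -5                              # keep the top five
```

Any stage can be swapped or extended (for example, piping through `grep -v` to drop stop words) without touching the others, which is precisely the reuse the philosophy envisioned.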


Their foresight was such that the same tools and user-interface norms were replicated by the GNU project atop the Linux kernel. With the advent of the Internet, and with interconnect standards agreed by the IETF and more recently the W3C consortium, the same philosophy extended to very low-cost industry-standard servers. This, followed by the substitution of vendor-specific buses with ever faster Ethernet and IP-based connections, allowed processors, storage and software components to be distributed in a scale-out fashion. The very nature of these industry standards meant that the geography over which these system components could be distributed extended well beyond a single datacentre; in some cases, cognizant of latency and reliability concerns, they could work worldwide. The end result is that while traditional UNIX systems embody reliability and n+1 scaling, there is another approach based on the same core components that can scale out. With that, an operation as simple as a search on Google can involve the participation of over 1,000 geographically dispersed CPUs, and typically returns results to the end user in under 200 milliseconds. However, such systems – architected assuming individual device and communication-path failures – tend to follow a very different set of design patterns.
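The scale-out pattern can be sketched in miniature with the same standard tools. A hedged illustration (the `work.txt` file and the echo "worker" are hypothetical stand-ins for real distributed tasks, and `-P` is a widespread extension to POSIX `xargs`) that fans items out across parallel, independent workers:

```shell
# Generate some stand-in work items, one per line.
seq 1 8 > work.txt

# Fan the items out across 4 parallel worker processes.
# Each invocation is independent: if one fails, the others still run,
# loosely mirroring the failure-tolerant scale-out design pattern.
xargs -n 1 -P 4 sh -c 'echo "worker $$ handled item $0"' < work.txt
```

The same composition idea applies: the dispatcher (`xargs`) knows nothing about the worker, and either can be replaced independently.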

The economics of using cloud-based Linux infrastructure is often perceived as attractive, though we’re just past the “war” stage where each cloud vendor’s stack is inherently proprietary. There are some laudable efforts to abstract code so that it can run on multiple cloud providers; one is FOG in the Ruby ecosystem. Another is Cloud Foundry, which is executing particularly well in Enterprises with large investments in Java code. Emergent Serverless platforms (event-driven, auto-scalable function-as-a-service, where the whole supporting infrastructure is abstracted away) are probably the most extreme examples of chaotic evolution – and very vendor-specific – at the time of writing.

The antithesis of open platforms is the effort to make full use of unique features in each cloud vendor’s offerings – a traditional lock-in strategy intended to stop their services becoming a price-led commodity. This is the sort of thing that the UNIX community solved together many years ago by agreeing effective, vendor-independent standards, where certification engendered an assurance of compatibility and trust, allowing the industry to focus on higher-end services to delight their end customers without fear of unnecessary lock-in.

Given the use of software designed to functionally mirror that of UNIX systems, one very valid question is: “What would it take for Linux vendors to have their distributions certified against recognized, compelling industry standards – such as UNIX 03?” This would allow customers to ascribe the same level of vendor-independent assurance and trust to the “scale-out” sibling as is achieved by the largest Enterprise UNIX system vendors.

Notwithstanding the licensing conditions on the Linux kernel and associated open source components, both Huawei and Inspur have achieved UNIX 03 certification of their Red Hat-derived EulerOS 2.0 and K-UX 3.0 operating systems. This is no mean feat, and an indication that their customers have the most Enterprise-ready Linux OS available on Intel-architecture server platforms today.

This is a level of certification that we don’t think will go unnoticed in large emerging markets of the world. That said, we’d welcome any other Linux vendor to prove their compliance to the same standard. In the interim, well done Huawei, and well done Inspur – proving it can be done.

References:

Version 3 of the Single UNIX® Specification: the UNIX 03 Product Standard:

https://www.opengroup.org/openbrand/register/xym0.htm

Huawei Technology achieves UNIX® 03 Conformance of Huawei EulerOS 2.0 Operating System:

https://www.opengroup.org/openbrand/register/brand3622.htm

Inspur achieve UNIX® 03 Conformance of Inspur K-UX 3.0:

https://www.opengroup.org/openbrand/register/brand3617.htm

UNIX® Philosophy: https://en.wikipedia.org/wiki/Unix_philosophy

http://www.opengroup.org/unix

@theopengroup

 



Understanding the Customer Experience: A Conversation with Forrester Analysts David Cannon and David Wheable

By The Open Group

With more technology in the hands of consumers than ever before, customers have become increasingly demanding in terms of not only the service they receive from companies but also the experience they have with a company or brand. Today, companies must be aware of and respond to what customers are looking for in terms of what they get from a company and how they interact—or they risk losing those customers.

This is leaving many companies in a very vulnerable position, particularly when it comes to digital customer experiences. In advance of The Open Group San Francisco 2017, we spoke with David Cannon, Vice President and Group Director, and David Wheable, Vice President and Principal Consultant, both of Forrester Research, about what customer expectations look like today and what companies need to be aware of so that they can survive in an ever-changing digital landscape. Both will be keynote speakers at The Open Group event on January 30.

The customer experience is something that’s been talked about for many years. What’s different now about customers that makes their experiences with companies an even more urgent matter than in the past?

David Cannon (DC): The single most important thing that’s changed is that customers have more choice and the ability to change suppliers within literally seconds. And this is not limited to individual consumers. Enterprises can switch key systems with minimal disruption. The key to retaining customers today is to make sure their experience with you is good—if it’s not, there’s no reason to stay.

David Wheable (DW): Building on that is the way we talk about digital business; many of those interactions occur digitally now. The role of technology in that experience is now key. If you don’t deliver a good digital customer experience, as Dave Cannon said, the next one in line will get the business. I actually did that the other day—one site would not let me log in, so they lost my business and the next one got it instantly.

DC: David’s right, with digitization, we’re not actually dealing with individuals and human beings, we’re dealing with simple, digital interfaces. This reduces any potential sense of loyalty—we just want what we want, when we want it and that’s it.

That takes away a huge part of how businesses have traditionally run—it’s that relationship they have with the customer that has often set businesses apart. Are there ways that companies can better personalize experience and counteract that loss of human interaction or do they need to also make sure they are continuing to work person-to-person?

DW: That’s an interesting question because particularly when I talk to technical people, they really don’t actually understand what the customer experience is. Forrester defines it in terms of three Es—ease, effectiveness and emotion. Technical people have generally dealt with the ease and effectiveness for many years, so that’s no problem, but what they’re really bad at thinking about is designing for emotion. So if you are trying to have a digital customer experience, digital touch points, and you still have to include the emotion side in it, that’s where the loyalty comes from. Where we see that driven is when organizations look at how the positive, painless, frictionless kinds of experiences drive that kind of loyalty. What we see now is that those companies that are thinking about this are moving away from thinking about products and services and moving toward thinking about the customer in terms of experiences, desires and outcomes, and they might only be a small part of an ecosystem that generates that experience or outcome.

DC: I’ll add to that. One of the secrets to understanding how you’re impacting that emotion is to be able to gather more information about what the customer is doing, how they’re doing it, when they’re doing it and why they’re doing it.  We have tools that can do this better than we’ve ever done it before—without even interviewing or surveying our customers.  We have to be able to infer from whatever they’re doing digitally whether that equates to a good emotion or a negative emotion. The whole area of analytics becomes more important than ever—but it’s also different than before.

To give an example, sites like Yelp or TripAdvisor give you a history of people’s experiences with a restaurant or service provider. But they don’t provide real-time information on whether the thing that upset a customer two years ago is still there. Unless the customer provides constructive feedback that’s visible to all, they don’t help the service provider understand what they can do to make the customer’s experience better. Customer satisfaction ratings are also limited, because they are just a snapshot of a customer at a moment. They don’t always tell us why the customer was (dis)satisfied, or whether they would give the same rating to that service today.

We’re getting better at looking at real-time analytics that tell us, in real-time, what is the context, where are customers using this, why are they using this and how does that impact their experience at that time? Is there a way that we can detect a negative experience and determine exactly what’s causing it and how to change it immediately?

One technique we use is Touchpoint Analysis, which breaks down what a customer does in individual interactions and individual contexts and then figures out how to measure their experience with each touchpoint.  To identify each touchpoint and then instrument it for real time experience was a huge ask, but technology is making it possible.

Personalization and customization have been talked about for at least 20 years now. At this point are there still concerns about privacy and knowing too much about customers? And on the flip side, if companies are relying on data to determine customer interactions rather than personal contact or relationships—and granted large companies can’t rely on personal interactions with thousands of people—does that reliance on data continue the problem of taking away from the human interaction?

DC: It’s kind of a paradox. On the one hand, you’re inventing technology and you’re putting that technology in the hands of users and that distances them from you. At the same time, you’re making them more capable of engaging with you. The very technology that allows you to be more remote (work from home, etc.) is being used to create online communities, friends, go shopping, run a political campaign, etc.  So technology is not only changing patterns of customer behavior, it’s changing how society works.  This is neither good news nor bad (or perhaps it’s a bit of both)—it’s just what’s happening.

On the other hand, by participating in this online society, you are sacrificing privacy. Many people demand better customer experience, fully understanding that that means that companies know more about them.  We’re starting to see some awareness of how ‘creepy’ this can be (being stalked by advertisers in one app because you searched for something in a different app).  But at this stage the search for better customer experience is still more powerful than the need for privacy. Will the pendulum swing the other way?  Definitely, but it will take some time and a more serious revelation of how privacy has been abused than those that have already emerged.

DW: I also think that one of the drivers of loyalty that customers are looking for from a brand is trust in that brand to look after their data appropriately and use it appropriately. What we see again is that it is a business imperative to respect privacy and to use and obscure data appropriately; if the customers of an organization feel that is happening, they will be more loyal to that organization or company than to one whose approach to data they don’t trust.

DC: I totally agree with that. I’d say though that in some cases, the realization that a company has not dealt with my data appropriately comes too late. We’re starting to see a shift to companies being more proactive in communicating how they’re safeguarding your privacy so it becomes more of a selling point for the services they provide. Not only are they going to give you a better experience, they’re going to give you a safer experience as well. Up until now that need for customers to know that up front has not really been as urgent. I think based on what David just said, that’s changing.

With all the high-profile security breaches over the past few years, that’s important. On the other hand, if companies have poor service and do things that anger people, it’s as simple as this: if you’re waiting too long at the airport for your flight and you start tweeting about it, you’re helping to damage the reputation of the airline.

DC: And what we’ve seen is that some of these companies are monitoring that kind of traffic and recording who those users are that make those statements. Using social media to communicate your experience with a company can also act against your relationship with that company. Some customers have reported negative experiences after they tweet bad things and positive experiences after they tweet good things.

I think the only thing that we can deduce from this is that every type of human interaction that existed before all this technology is now happening using the technology. Just as you were careful in the real world, you have to be careful in the online world. You have to be careful about what you say, about whom and to whom—and that goes for whether you’re a consumer or a company.

Technical people still have to catch up with this a bit. Some think as long as there’s anti-virus or intrusion control on our major systems, we’re OK. What they’re not looking at is the business risk associated with, for example, a privacy breach — we’re not talking about a technical threat here, we’re talking about your business being able to survive or not.

We’re really exploring very new ethical and legislative ground here, and the whole customer experience is really going to test that in the coming years. Just how much information is too much? Just what constitutes private information? Different countries have different views of what constitutes private information, and if, as a company, I place my base of operations in one of the less restrictive countries, I can do more—but it makes me less responsible to my customers. How is that going to impact my business? These questions are still being tested.

When David and I talk in San Francisco, we won’t just be talking about how you get friendlier with your customers and give better service; what we’re really talking about is how you survive as a business in a changing world where the rules are changing every day. That’s a much bigger conversation than how technical people give better customer service—which is what the discussion was before.

You mention that there’s been a gap among companies between those that “look” digital and those that are actually “being” digital. What does that gap look like and how can companies bridge it?

DW: Effectively, the way that I try to describe it to people is that a lot of the work on digital up to now has really been about automation. It’s been taking the same approach to business and just using technology to make it more efficient. Whether that’s faster or cheaper, that’s the fundamental role that technology has played in those organizations. But now the technology has hit the point where it’s fundamentally changing the business, so the organizations that are merely looking digital are the ones putting a thin veneer over their existing business structure. Quite often if you dig beneath the surface, what you’ll find is there are still bits of paper going around, there are still people looking at a form that was entered on a website and doing something with it.

Those companies that are truly digital are actually using those digital capabilities to change the way that they do the business. If you look at some of the examples that we use—like John Deere or Burberry—all of them have really gone back to their roots, looked at what their business actually is and then figured out how they can use digital technology to change their interactions with customers, change their outcome and restructure their business completely. You see that with companies like GE standing up and saying ‘we may have been a manufacturing company but now we’re a software and analytics company.’ That whole understanding of what the change means is significant. Those that are looking digital are the ones that are saying ‘we have an e-commerce site, therefore we’re digital.’ That’s not the story.

Why has it traditionally been so difficult for IT departments to execute on technology strategies?

DW: Dave and I spend a lot of time talking to these organizations. The majority of organizations feel stuck in a very operational frame of mind. Very few of them really have a strong ability to understand the context of technology strategy within the business. They tend to think of technology as this abstract and separate item rather than something that’s used to deliver most business results.

That sounds like a case for Enterprise Architecture and for architects to be that bridge between IT and the business.

DW: The challenge is it shouldn’t be a bridge; the idea is that it should be a fundamental part of the business strategy—not a joining up, not something that you have to interpret. How does that technology deliver the business? It’s not about how to back up the business. That’s where we see the real challenge of being digital: the business people who actually understand the digital part can execute and come up with a digital strategy without necessarily having Enterprise Architects (EAs) interpret that and come up with the technology.

DC: This is correct only where architects are ‘enterprise’ architects in name but really solution or technology architects. We find that many organizations limit their architects to simply translating from the enterprise strategy to the technical solutions. As long as this remains the case, architects will continue to be focused on operational issues, reacting to business demands instead of working with the business to jointly architect the strategy. Enterprise architecture has started to change into something being called “Business Architecture”, where an EA looks at both sides of the fence at the same time (and in fact doesn’t see it as two sides) and asks what we all have to do together to make the organization successful—whether it’s operational or strategic.

To put it slightly more bluntly, the traditional IT model is when the business says ‘we need this,’ and IT builds and delivers it. That mindset has to change. IT is part of the business, and it has to be embedded in those frontline customer-facing parts of the business, not just be a technical service provider that just does whatever it’s told. To be honest, we’re in a situation now where the new technology that’s emerging is not really understood. If IT is buried in the basement somewhere, it’s going to be more difficult to make that technology work for the company. They really need to be on the frontline. What that means is that IT people have to become more business-like and more strategic.

How can technologists, customers and business work together to help solve their mutual problems?

DW: This is an interesting question, and it’s something we get asked all the time. We deal a lot with companies being challenged by that. A lot of it comes down to culture—it comes down to understanding the difference between how a business will look at prod ops and how IT still looks at projects, for example. This is why Dave says that DevOps is a start but it needs to go further. We’re constantly talking about how to start applying the techniques that people use for product development to IT, technology and digital solutions as well. Design thinking, doing ethnographic work up front, getting actual feedback from customers, A/B testing—you create those strong testing and feedback mechanisms, what works, what doesn’t work, and you don’t just assume that everything’s understood and you can write a system that does everything. What we see now is those techniques—DevOps, Agile, customer experience mapping, personas—all starting to come together, really creating that overall structure of how you understand the customer, how you understand employees and how you start delivering solutions that actually give the right outcome and the right experience to achieve what they want.

Is there a role for standards in all of this and what would that be?

DW: Very much so. One of the points we want to make is that now, when you have effectively a digitally connected ecosystem and businesses form parts of that ecosystem, all the services that are consumed are not under your control. In the old days of IT, you’d buy the hardware, you’d buy the software licenses, you’d build it and put it in a building, and that would be your interaction with your customers, even in the old web days. Now your customers link together with services or other businesses electronically. So the levels of connection, trust and understanding have become very important in terms of technical communications standards, but equally in terms of the skills and how you approach that from a business standpoint. Looking at what IT4IT does, for example, is important because you need ways to talk about how organizations should be constructed, what competencies you need and how they’re put together. Without some form of structure, you just get chaos. The idea of standards, from my point of view, is to take that chaos and give some sense of order to what’s going on.

DC: I agree with David. I would say also that we’re still going to see the importance of best practices as well as standards. To put it bluntly: standards are established and agreed ways of doing something. But much of the technology emerging today is testing the relevance of standards. Best practices (not the best name—they should be called Tested Practices or Good Practices) are those emerging practices that have been shown to work somewhere in the industry. What may have been an appropriate standard for what you did five years ago may not be appropriate for what’s going to emerge next year. There’s always going to be this tension between the established standard—what we know to be true—and the emerging standard or best practice: the things that are working that aren’t necessarily in the standard, or are beyond where it is today.

I think the industry has to become a little better at understanding the differences between standards and best practices and using them appropriately. I think what we’ve also seen is a lack of investment in best practices. We’re seeing a lot of people in the industry coming up with suggested best practices and frameworks, but it’s been a while since we’ve seen a truly independent best practice. IT4IT is a really good starting point for some new best practices to emerge. But just like any proposed practice, it will have its limitations. Instead of following it blindly, we should keep monitoring it to figure out what those limitations are and how to overcome them.

Standards will continue to be really important to keep the Wild West at bay, but at the same time you’ve got to be pushing things forward and best practices (sponsored by independent organizations) are a good way to do that.

@theopengroup #ogSFO

David Wheable, Vice President and Principal Consultant, Forrester Research Inc.
David provides research-based consulting services to BT Professionals, helping them leverage Forrester’s proprietary research and expertise to meet the ever-changing needs and expectations of their stakeholders.

David specializes in helping clients create effective and efficient strategies for their IT Service Management challenges including integrating cloud services, bring your own device (BYOD), and mobility.

Prior to joining Forrester, David worked at HP, where he served as the professional services innovation lead for the software and professional services organization, as worldwide solution lead, and as a consulting manager.

David Cannon, Vice President and Group Director, Forrester Research Inc.
David serves Infrastructure & Operations Professionals. He is a leader in the fields of IT and service strategy and has led consulting practices for BMC Software and Hewlett-Packard. He is the coauthor of the ITIL 2007 service operation book and author of the ITIL 2011 service strategy book. He is also a founder and past chairman of both itSMF South Africa and itSMF International and a past president of itSMF USA.

Prior to joining Forrester, David led the IT service management (ITSM) practice of BMC Software Global Services and led the ITSM consulting practice at Hewlett-Packard. He has educated and consulted within a broad range of organizations in the private and public sectors over the past 20 years. He has consulted in virtually every area of IT management, but he specializes in the integration of business and technology management.

David has degrees in industrial sociology and psychology from the University of South Africa and holds the ITIL Expert certificate. He is also a fellow of service management and double recipient of the itSMF Lifetime Achievement Award.

 



To Colonize Mars, Look to Standards Development

By The Open Group

In advance of The Open Group San Francisco 2017, we spoke with Keegan Kirkpatrick, one of the co-founders of RedWorks, a “NewSpace” start-up focused on building 3D-printable habitats for use on Earth and in space. Kirkpatrick will be speaking during the Open Platform 3.0™/Internet of Things (IoT) session on February 1.

Keegan Kirkpatrick believes that if we are to someday realize the dream of colonizing Mars, Enterprise Architects will play a critical role in getting us there.

Kirkpatrick defines the contemporary NewSpace industry as a group of companies that are looking to create near-term solutions that can be used on Earth, derived from solutions created for long-term use in space. With more private companies getting into the space game than ever before, Kirkpatrick believes the means to create habitable environments on the moon or on other planets isn’t nearly as far away as we might think.

“The space economy has always been 20 years away from where you’re standing now,” he says.

But with new entrepreneurs and space ventures following the lead of Elon Musk’s SpaceX, the space industry is starting to heat up, branching out beyond traditional aerospace and defense players like NASA, Boeing or Lockheed Martin.

“Now it’s more like five to ten years away,” Kirkpatrick says.

Kirkpatrick, who has a background in aerospace engineering, says RedWorks was born out of NASA’s 3D Printed Habitat Challenge, a “Centennial Challenge” where people from all kinds of backgrounds competed to create 3D printing/construction solutions for building and surviving on Mars.

“I was looking to get involved in the challenge. The idea of 3D printing habitats for Mars was fascinating to me. How do we solve the mass problem? How do we allow people to be self-sufficient on Mars once they get there?” he says.

Kirkpatrick says the company came together when he found a small 3D printing company in Lancaster, Calif., close to where he lives, and went to visit them. “About 20 minutes later, RedWorks was born,” he says. The company currently consists of Kirkpatrick, a 3D printing expert, and a geologist, along with student volunteers and a small team of engineers and technicians.

Like other NewSpace companies, RedWorks is focusing on terrestrial solutions first; both in order to create immediate value for what they’re doing and to help raise capital. As such, the company is looking to design and build homes by 3D printing low-cost materials that can be used in places that have a need for low-cost housing. The company is talking with real estate developers and urban planners and looking to areas where affordable housing might be able to be built entirely on site using their Mars-derived solutions.

“Terrestrial first is where the industry is going,” Kirkpatrick says. “You’ll see more players showing up in the next few years trying to capitalize on Earth-based challenges with space-based solutions.”

RedWorks plans to use parametric architecture models and parametric planning (design processes based on algorithmic thinking in which the relationship between elements is used to inform the design of complex structures) to create software for planning the printable communities and buildings. In the short-term, Kirkpatrick believes 3D printing can be used to create smart-city living solutions. The goal is to be able to combine 3D printing and embedded software so that people can design solutions specific to the environments where they’ll be used. (Hence the need for a geologist on their team.) Then they can build everything they need on site.

“For Mars, to make it a place that you can colonize, not just explore, you need to create the tools that people with not much of an engineering or space architecture background can use to set up a colony wherever they happen to land,” Kirkpatrick says. “The idea is if you have X number of people and you need to make a colony Y big, then the habitat design will scale everything with necessary utilities and living spaces entirely on-site. Then you can make use of the tools that you bring with you to print out a complete structure.”

Kirkpatrick says the objective is to be able to use materials native to each environment in order to create and print the structures. Because dirt and sand on Earth are fundamentally similar to the type of silicate materials found on the Moon and Mars, RedWorks is looking to develop a general-purpose silica printer that can be used to build 3D structures. That’s why they’re looking first to develop structures in desert climates, such as southern California, North Africa and the Middle East.

A role for architecture and standards

As the private NewSpace industry begins to take off, Kirkpatrick believes there will be a strong need for standards to guide the nascent industry, and for Enterprise Architects to help navigate the complexities that will come with designing the technology that will enable it.

“Standards are necessary for collaborating and managing how fast this will take off,” he says.

Kirkpatrick also believes that developing open standards for the new space industry will help NewSpace companies figure out how they can work together. Although he says many NewSpace start-ups already have an interest in collaborating, much of their work is at such an early stage that they do not yet have much incentive to do so. However, he says, “everyone realizes that collaboration will be critical for the long-term development of the industry.” Beginning to work toward standards development with an organization such as The Open Group now will help incentivize the NewSpace community to work together, and thus push the industry along even faster, Kirkpatrick says.

“Everyone’s trying to help each other as much as they can right now, but there’s not a lot of mechanisms in place to do so,” he says.

According to Kirkpatrick, it’s important to begin thinking about standards for space-related technology solutions before the industry reaches an inflection point and begins to take off quickly. Kirkpatrick expects that inflection point will occur once a launch provider like SpaceX can do full return landings of its rockets that are then ready for reuse. He expects that launch costs will begin to fall rapidly over the next five to ten years once launch providers can offer reliable reusable launch services, spurring the industry forward.

“Once you see launch costs fall by a factor of 10 or 100, the business side of the industry is going to grow like a weed. We need the infrastructure in place for everyone to work together and enable this incredible opportunity we have in space. There’s a very bright horizon ahead of us that’s just a little hard for everyone to see right now. But it’s coming faster than anyone realizes.”

@theopengroup #ogSFO

Keegan Kirkpatrick is the Team Lead and founder of RedWorks, a NewSpace startup in Lancaster, California. He has an undergraduate degree in Aerospace Engineering from Embry-Riddle Aeronautical University, and before turning entrepreneur worked as an engineer at Masten Space Systems on the Mojave Air and Spaceport.

In 2015, Keegan founded RedWorks with Paul Petros, Susan Jennings, and Lino Stavole to compete in and make it to the finals of the NASA Centennial 3D Printed Habitat Challenge. Keegan’s team is creating ways to 3D-print habitats from on-site materials, laying the groundwork for human settlement of the solar system.


Filed under digital technologies, Enterprise Architecture (EA), Future Technologies, Internet of Things, IoT, Open Platform 3.0, Standards, The Open Group, The Open Group San Francisco 2017, Uncategorized

Gaining Executive Buy-In for IT4IT™: A Conversation with Mark Bodman

By The Open Group

With many organizations undergoing digital transformation, IT departments everywhere are taking serious hits. And although technology is at the heart of many business transformations, IT has traditionally had a reputation as a cost center rather than an innovation center.

As such, executives are often skeptical when presented with yet another new IT plan or architecture for their organizations that will be better than the last. Due to the role Enterprise Architects play in bridging the gap between the business and IT, it’s often incumbent on them to make the case for big changes when needed.

Mark Bodman, Senior Product Manager at ServiceNow and formerly at HPE, has been working with and presenting the IT4IT standard, an Open Group standard, to executives for a number of years. At The Open Group San Francisco 2017 event on January 30, Bodman will offer advice on how to present IT4IT in order to gain executive buy-in. We spoke with him in advance of the conference to get a sneak peek before his session.

What are Enterprise Architects up against these days when dealing with executives and trying to promote IT-related initiatives?

The one big change that I’ve seen is the commoditization of IT. With the cloud-based economy and the ability to rent cheap compute, storage, and networking, being able to leverage commodity IT effectively is a key differentiator that will make or break an organization. At the end of the day, the companies that can exploit cheaper technology to do unique things faster are the ones that will come out ahead long-term. Companies based on legacy technologies that don’t evolve will stall out and die.

Uber and Netflix are great case studies for this trend. It’s happening every day around us—and it’s reaching a tipping point. Enterprise Architects are faced with communicating these scenarios within their own organizations—use cases like going digital, streamlining for costs, sourcing more in the cloud—all strategies required to move the needle. Enterprise Architects are the most senior technical people within IT. They bridge the gap between business and technology at the highest level, and have to figure out ‘How do I communicate and plan for these disruptions here so that we can survive in the digital era?’

It’s a Herculean task, not an easy thing to do. I’ve found there are varying degrees of success for Enterprise Architects. Sometimes, through no fault of their own, because they are dealing with politics, they can’t move the right agenda forward. Or the EA may be dealing with a Board that just wants to see financial results the next quarter and doesn’t care about long-term transformations. These are the massive challenges that Enterprise Architects deal with every day.

Why is it important to properly present a framework like IT4IT to executives right now?

It’s as important as the way changes in accounting rules have impacted organizations. The new rules and regulations that emerged in response to Enron and the other big financial failures within recent memory were quite impactful. When an IT shop is implementing services and running the IT organization as a whole, what is the operating model it uses? Why is one IT shop so different from another when we’re all facing similar challenges, using similar resources? I think it’s critically important to have a vetted industry standard to answer these questions.

Throughout my career, I’ve seen many different models for running IT from many different sources. From technology companies like HPE and IBM, to consulting companies like Deloitte, Accenture, and Bain; each has its own way of doing things. I refer to this as the ‘IT flavor of the month.’ One framework is chosen over another depending on what leadership decides for its playbook—leaders get tired of one model, or a new leader imposes the model they are familiar with, so they adopt a new model and change the entire IT operating model, which is quite disruptive.

The IT4IT standard takes the whole answer to ‘how to run IT as a business’ out of the hands of any one source. That’s why a diverse set of contributors is important—PWC and Accenture, for example, both have consulting practices for running IT shops. Seeing them contribute to an open standard that aggregates this know-how allows IT to evolve faster. When large IT vendors like ServiceNow, IBM, Microsoft, and HPE are all participating and agreeing upon the model, we can start creating solutions that are compatible with one another. The reason we have Wi-Fi in every corner of the planet, or cellular service that you can use from any phone, is because we standardized. We need to take a similar approach to running IT shops—renting commoditized services, plugging them in, and managing them with standard software. You can’t do that unless you agree on the fundamentals; the IT4IT standard provides much of this guidance.

When Enterprise Architects are thinking about presenting a framework like IT4IT, what considerations should they make as they’re preparing to present it to executives?

I like to use the word ‘contextualize,’ and the way I view the challenge is that if I contextualize our current operating model against IT4IT, how are we the same or different? What you’ll mostly find is that IT shops are somewhat aligned. A lot of the work that I’ve done with the standard over the past three years is to create material that shows IT4IT in multiple contexts. The one I prefer to start with for an executive audience shows how the de facto plan-build-run IT organizational model, which is how most IT shops are structured, maps to the IT4IT structure. Once you make that correlation, it’s a lot easier to understand how IT4IT fits across your particular organization, filling some glaring gaps in plan-build-run.
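That plan-build-run correlation can be made concrete with a simple lookup against the four IT4IT value streams (Strategy to Portfolio, Requirement to Deploy, Request to Fulfill, Detect to Correct). The phase-to-stream assignments below are a rough illustration, not a normative mapping from the standard:

```python
# Rough sketch: map a de facto plan-build-run organization onto the four
# IT4IT value streams. The assignments are illustrative, not normative.
PLAN_BUILD_RUN_TO_IT4IT = {
    "plan":  ["Strategy to Portfolio"],
    "build": ["Requirement to Deploy"],
    "run":   ["Request to Fulfill", "Detect to Correct"],
}

def gaps(org_phases):
    """Which value streams does an organization's current model leave uncovered?"""
    covered = {s for p in org_phases for s in PLAN_BUILD_RUN_TO_IT4IT.get(p, [])}
    all_streams = {s for streams in PLAN_BUILD_RUN_TO_IT4IT.values() for s in streams}
    return sorted(all_streams - covered)

print(gaps(["plan", "build"]))   # the run-side streams show up as the gap
```

Even a toy mapping like this makes the "glaring gaps" conversation concrete: whatever streams fall out of the lookup are the ones the current operating model has no owner for.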

Recently I’ve created a video blog series on YouTube called IT4IT Insights to share these contextual views. I’ve posted two videos so far, and plan to post a new video per month. I have posted one video on how Gartner’s Bi-Modal concept maps to IT4IT concepts, and another on the disruptive value that the Request to Fulfill value stream provides IT shops.

Why have executives been dismissive of frameworks like this in the past and how can that be combatted with a new approach such as IT4IT?

IT4IT is different from anything I have seen before. I think it’s the first time we have seen a comprehensive business-oriented framework created for IT as an open standard. There are some IT frameworks specific to vertical industries out there, but IT4IT is generic and addresses everything that any CIO would worry about on a daily basis. Of course they don’t teach CIOs IT4IT in school yet—it’s brand new. Many IT execs come from consulting firms where they have grown familiar with a particular IT operating model, or they were promoted through the years, establishing their own unique playbook along the way. When a new standard framework like IT4IT comes along and an Enterprise Architect shows them how different it might be from what they currently know, it’s very disruptive. IT executives got to their positions through growth and experience using what works; adopting something new like IT4IT is a tough pill to swallow.

To overcome this problem it’s important to contextualize the IT4IT concepts. I’m finding many of the large consulting organizations are just now starting to learn IT4IT—some are ahead of others. The danger is that IT4IT takes some of that unique IP away, which is a little risky for them, but I think it’s an advantage if they get on the bandwagon first and can contextually map what they do now against IT4IT. One other thing that’s important is that since IT4IT is an open standard, organizations may contribute intellectual property to the standard and be recognized as the key contributor for that content. You see some of this already with Accenture’s and PWC’s contributions. At the same time, each consulting organization will hold some of its IP back to differentiate itself where applicable. That’s why I think it’s important for people presenting IT4IT to contextualize it to their particular organization and practice. If they don’t, it’s just going to be a much harder discussion.

As with any new concept, eventually you find the first few who get it, then latch on to it and become the ‘IT4IT champion.’ It’s very important to have at least one IT4IT champion to really evangelize the IT4IT standard and drive adoption. That champion might not be in an executive position able to change things in their organization, but educating and evangelizing a better way of managing IT is an important job.

What lessons have you learned in presenting IT4IT to executives? Can you offer some tips and tricks for gaining mindshare?

I have many that I’ll talk about in January, but one thing that seems to work well is that I take a few IT4IT books into an executive briefing, usually the printed standard and the pocket guide. I’ll pass them around the room while I present the IT4IT standard. (I’m usually presenting it as part of a broader executive briefing agenda.) I usually find that the books get stuck with someone in the room who has cracked one open and recognized something of value. They will usually want to keep the book after that, and at that point I know who my champion is. I then gauge how passionate they are by making them twist my arm to keep the book. This usually works well to generate discussion of what they found valuable, in the context of their own IT organization and in front of the other executives in the room. I recently performed this trick while presenting to the CIO of a major insurance company. I passed the books around during my presentation and found them back in front of me. I thought that was it, no takers. But the CIO asked for them back once I concluded the IT4IT presentation. The CIO was my new champion, and everyone in the room knew it.

What about measurement and results? Is there enough evidence out there yet on the standard and the difference it’s making in IT departments to bring measurement into your argument to get buy in from executives?

I will present some use cases that have very clear results, though I can’t share financials. The more tangible measurements are around the use cases where we leveraged the IT4IT standard to rationalize the current IT organization and tools to identify redundancies. One of the things I learned 10 years ago, well before the IT4IT standard existed, was how to rationalize applications for an entire organization after they had gotten out of hand from a rash of M&A activity. Think about the redundancies created when two businesses merge. You’re usually merging because of a product or market you are after; some business need drives the acquisition. But all the common functions, like HR and finance, are redundant. This includes the IT technologies and applications used to manage IT, too. You don’t need two HR systems or two IT helpdesk systems; you’ve got to consolidate to a reasonable number of applications to do the work. I have tackled IT rationalization by using the IT4IT standard, going through an evaluation process to identify redundancies per functional component. In some cases we have found more than 300 tools that perform the same IT function, like monitoring. You shouldn’t need 300 different monitoring tools; that’s ridiculous. This is just one clear use case where we’ve applied IT4IT to identify similar tools and processes within IT specifically, a very compelling business case for eliminating massive redundancy.
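The rationalization exercise described above reduces to grouping a tool inventory by the IT4IT functional component each tool serves and flagging components with duplication. A sketch with a hypothetical inventory (the tool names and component labels are invented for illustration):

```python
from collections import defaultdict

# Hypothetical inventory: (tool name, functional component it serves).
inventory = [
    ("Nagios", "monitoring"), ("Zabbix", "monitoring"), ("SolarWinds", "monitoring"),
    ("Jira SD", "service desk"), ("Remedy", "service desk"),
    ("Terraform", "fulfillment"),
]

def redundancy_report(tools, threshold=2):
    """Return the components served by at least `threshold` tools."""
    by_component = defaultdict(list)
    for name, component in tools:
        by_component[component].append(name)
    return {c: ts for c, ts in by_component.items() if len(ts) >= threshold}

print(redundancy_report(inventory))   # flags 'monitoring' and 'service desk'
```

On a real engagement the inventory would come from an application portfolio repository and the component labels from the IT4IT functional component model, but the shape of the analysis is this simple.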

Does the role of standards also help in being able to make a case for IT4IT with executives? Does that lend credence to what you’re proposing and do standards matter to them?

They do in a way, because the parallel with accounting rules holds: non-standard accounting rules today might land your executives in jail. A non-standard IT shop won’t land you in jail, but being non-standard will increase the cost of everything you do and increase risk, because you’re going against the grain on something that should be a commodity. At the executive level, you need to contextualize the problem of being non-standard and show them how adopting the IT4IT standard is similar to the standardization of accounting rules.

Another benefit of standards I point to is that the standard is open: the result of vetting good ideas from many different organizations rather than making it up as you go. The man-years of experience that went into the standard, and the elegance of the result, become a compelling argument for adoption that shouldn’t be overlooked.

What else should EAs take into consideration when presenting something like IT4IT to executives?

I think the primary thing to remember is to contextualize your conversation to your executives and organization. Some executives in IT may have zero technology background; some may have come up through the ranks and still know how to program, so you’ve got to tailor the story to the audience. I presented recently to 50 CIOs in Washington D.C., so I had to contextualize the standard to show how IT4IT relates to the major changes happening in the federal market, such as the Federal Information Technology Acquisition Reform Act (FITARA), and how it supports the Federal Enterprise Architecture framework. These unique requirements had to be contextualized against the IT4IT standard so the audience understood exactly how IT4IT relates to the big challenges unique to their market.

Any last comments?

The next phase of the IT4IT standard is just taking off.  The initial group of people who were certified are now using IT4IT for training and to certify the next wave of adopters. We’re at a point now where the growth is going to take off exponentially. It takes a little time to get comfortable with something new and I’m seeing this happen more quickly in every new engagement. Enterprise Architects need to know that there’s a wealth of material out there, and folks who have been working with the IT4IT standard for a long time. There’s something new being published almost every day now.

It can sometimes take a while from first contact to critical mass adoption, but it’s happening. In my short three weeks at ServiceNow I have already had two customer conversations on IT4IT; it’s clearly relevant here too, and I have been able to show its relevance to every other IT shop and vendor over the last three years. This new IT4IT paradigm does need to soak in a bit, so don’t get frustrated about the pace of adoption and understanding. One day you might come across a need and pull out the IT4IT standard to help in some way that’s not apparent right now. It’s exciting to see people who worked on the initial phases of the standard’s development now working on their next gig. It’s encouraging to see folks in their second and even third job leveraging the IT4IT standard. This is a great indicator that the IT4IT standard is being accepted and starting to become mainstream.


Mark Bodman is an experienced, results-oriented IT4IT™ strategist with an Enterprise Architecture background, an executive adviser, thought leader, and mentor. He previously worked on cross-portfolio strategies to shape HPE’s products and services, including multi-source service brokering and IT4IT adoption. Mark recently joined ServiceNow as the outbound Application Portfolio Management Product Manager.

Hands-on experience from years of interaction with multiple organizations has given Mark a unique foundation of experience and IT domain knowledge. Mark is well versed in industry standards such as TOGAF®, an Open Group standard, COBIT, and ITIL, has implemented portfolio management and EA practices, chaired governance boards within Dell, managed products at Troux, and helped HPE customers adopt strategic transformation planning practices using reference architectures and rationalization techniques.

 

 


Filed under Digital Transformation, Enterprise Architecture, Enterprise Transformation, IT, IT4IT, Standards, The Open Group, Uncategorized

What is Open FAIR™?

By Jim Hietala, VP, Business Development and Security, The Open Group

Risk Practitioners should be informed about the Open FAIR body of knowledge, and the role that The Open Group has played in creating a set of open and vendor-neutral standards and best practices in the area of Risk Analysis. For those not familiar with The Open Group, our Security Forum has created standards and best practices in the area of Security and Risk for 20+ years. The Open Group is a consensus-based and member-driven organization. Our interest in Risk Analysis dates back many years, as our membership saw a need to provide better methods to help organizations understand the level of risk present in their IT environments. The Open Group membership includes over 550 member organizations from both the buy-side and supply-side of the IT industry. The Security Forum currently has 80+ active member organizations contributing to our work.

A History of Open FAIR and The Open Group

In 2007, Security Forum Chairman Mike Jerbic brought Factor Analysis of Information Risk (FAIR) to our attention and suggested that it might be an interesting Risk Analysis taxonomy and method to consider as a possible open standard in this area. FAIR was originally created by Jack Jones at his then-company Risk Management Insights (RMI); Jack and his partner Alex Hutton agreed to join The Open Group as members and to contribute the FAIR IP as the basis for a possible open risk taxonomy standard.

Over time, the Security Forum membership worked to create a standard comprising the relevant aspects of FAIR (initially, the FAIR Risk Taxonomy). The result of this work was the first version of the Risk Taxonomy Standard (O-RT), published in January 2009. In 2012, the Security Forum decided to create a certification program for practitioners of the FAIR methodology, and undertook two related efforts: updating the Risk Taxonomy Standard and creating a companion standard, the Risk Analysis Standard (O-RA). O-RA provides guidance on the process aspects of Risk Analysis that are lacking in O-RT, including risk measurement and calibration, the Risk Analysis process, and control considerations relating to Risk Analysis. The updated O-RT standard and the O-RA standard were published in late 2013 and are available here:

C13G Risk Analysis (O-RA)

C13K Risk Taxonomy (O-RT), Version 2.0
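The taxonomy’s central idea is decomposing risk into loss event frequency (how often loss events occur) and loss magnitude (how much each one costs). A toy Monte Carlo sketch of that decomposition follows; the distributions, parameters, and dollar figures are invented for illustration and are not part of the O-RT or O-RA standards:

```python
import random

def annual_loss_exposure(trials=50_000, seed=7):
    """Toy Monte Carlo in the spirit of the Open FAIR decomposition:
    each trial draws a loss event frequency (events/year) and a loss
    magnitude ($/event); their product is that trial's annualized loss.
    All distributions and parameters here are invented placeholders."""
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        lef = rng.triangular(0.1, 5.0, 1.0)   # loss events per year (min, max, mode)
        lm = rng.lognormvariate(10.0, 1.0)    # dollars per event, heavy-tailed
        losses.append(lef * lm)
    losses.sort()
    return {"mean": sum(losses) / trials, "p90": losses[int(0.9 * trials)]}

print(annual_loss_exposure())
```

Working from calibrated ranges rather than single-point guesses is exactly the kind of measurement discipline O-RA’s guidance on calibration addresses.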

We collectively refer to these two standards as the Open FAIR body of knowledge.  In late 2013, we also commenced operation of the Open FAIR Certification Program for Risk Analysts. In early 2014, we started development of an accreditation program for Open FAIR accredited training courses. The current list of accredited Open FAIR courses is found here. If you are with a training organization and want to explore accreditation, please feel free to contact us, and we can provide details. We have also created licensable Open FAIR courseware that can enable you to get started quickly with training on Open FAIR. Future articles will dive deeper into the Open FAIR certification program and the accredited training opportunity. It is worth noting at this point that we have also produced some hard copy Open FAIR guides that are helpful to candidates seeking to certify to Open FAIR. These are accessible via the links below, and are available at a nominal cost from our publishing partner Van Haren.

B140   Open FAIR Foundation Study Guide

G144  A Pocket Guide to the Open FAIR Body of Knowledge

Beyond the standards and certification program work, The Open Group has produced a number of other helpful publications relating to Risk, Security, and the use of Open FAIR. These include the following, all of which are available as free downloads:

W148  An Introduction to the Open FAIR Body of Knowledge

C103  FAIR – ISO/IEC 27005 Cookbook

G167  The Open FAIR™ – NIST Cybersecurity Framework Cookbook

G152  Integrating Risk and Security within a TOGAF® Enterprise Architecture

G081  Requirements for Risk Assessment Methodologies

W150  Modeling Enterprise Risk Management and Security with the ArchiMate® Language

Other Active Open FAIR Workgroups in the Security Forum

In addition to the standards and best practices described above, The Open Group has active workgroups developing the following related items.  Stay tuned for more details of these activities.   If any of the following projects are of interest to your organization, please feel free to reach out to learn more.

1) Open FAIR to STIX Mapping Whitepaper. This group is writing a whitepaper that maps the Open FAIR Risk Taxonomy Standard (O-RT) to STIX, a standard which originated at MITRE, and is being developed by OASIS.

2) Open FAIR Process Guide project – This group is writing a process guide for performing Open FAIR-based Risk Analysis. This guide fills a gap in our standards & best practices by providing a “how-to” process guide.

3) Open Source Open FAIR Risk Analysis tool – A basic Open FAIR Risk Analysis tool is being developed for students and industry.

4) Academic Program – A program is being established at The Open Group to support active student intern participation in risk activities within the Security Forum. The mission is to promote the development of the next generation of security practitioners and experience within a standards body.

5) Integration of Security and Risk into TOGAF®, an Open Group standard – This project is working to ensure that future versions of the TOGAF standard comprehensively address security and risk.

How We Do What We Do

The Open Group Security Forum is a member-led group that aims to help members meet their business objectives through the development of standards and best practices. For the past several years, the focus of our work has been in the areas of Risk Management, Security Architecture, and Information Security Management standards and best practices. ‘Member-led’ means that members drive the work program, proposing projects that help them meet their objectives as CISOs, Security Architects, Risk Managers, or operational information security staff. All of our standards and best practices guidance is developed using our open, consensus-based standards process.

The standards development process at The Open Group allows members to collaborate effectively to develop standards and best practices that address real business issues. In the area of Risk Management, most of the publications noted above were created because members saw a need to determine how to apply Open FAIR in the context of other standards or frameworks, and then leveraged the entire Security Forum membership to produce useful guidance.

It is also worth noting that we do a lot of collaborating with other parts of The Open Group, including with the Architecture Forum on the integration of Risk and Security with TOGAF®, with the ArchiMate™ Forum on the use of ArchiMate, an Open Group standard, to model Risk and Security, with the Open Platform 3.0™ Forum, and with other Forums. We also have a number of external organizations that we work with, including SIRA, ISACA, and of course the FAIR Institute in the Risk Management area.

The Path Forward for Open FAIR

Our future work in the area of Risk Analysis will likely include other cookbook guides, showing how to use Open FAIR with other standards and frameworks. We are committed to meeting the needs of the industry, and all of our work comes from members describing a need in a given area. So in the area of Risk Management, we’d love to hear from you as to what your needs are, and even more, to have you contributing to the development of new materials.

For more information, please feel free to contact me directly via email or LinkedIn.

 


Jim Hietala, Open FAIR, CISSP, GSEC, is Vice President, Business Development and Security for The Open Group, where he manages the business team, as well as Security and Risk Management programs and standards activities. He has participated in the development of several industry standards including O-ISM3, O-ESA, O-RT (Risk Taxonomy Standard), O-RA (Risk Analysis Standard), and O-ACEML. He also led the development of compliance and audit guidance for the Cloud Security Alliance v2 publication.

Jim is a frequent speaker at industry conferences. He has participated in the SANS Analyst/Expert program, having written several research white papers and participated in several webcasts for SANS. He has also published numerous articles on information security, risk management, and compliance topics in publications including CSO, The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

An IT security industry veteran, he has held leadership roles at several IT security vendors.

Jim holds a B.S. in Marketing from Southern Illinois University.


Filed under Accreditations, ArchiMate®, Certifications, Cybersecurity, Open FAIR, Open FAIR Certification, RISK Management, Security, Standards, The Open Group, TOGAF®, Uncategorized

Looking Forward to a New Year

By Steve Nunn, President & CEO, The Open Group

As another new year begins, I would like to wish our members and The Open Group community a happy, healthy and prosperous 2017! It’s been nearly 15 months since I transitioned into my new role as the CEO of The Open Group, and I can’t believe how quickly that time has gone.

As I look back, it was at The Open Group Edinburgh event in October 2015 that we launched the IT4IT™ Reference Architecture, Version 2.0. In just the short time since then, I’m pleased to report that IT4IT has garnered attention worldwide. The IT4IT Certification for People program that we launched last January—one of the first things I had the pleasure of doing as CEO—has also gained momentum quickly. Wherever I have traveled over the past year, IT4IT has been a topic of great interest, particularly in countries like India and Brazil. There is a lot of potential for the standard globally, and we can look forward to various new IT4IT guides and whitepapers as well as an update to the technical standard in the first few months of this year.

Looking back more at 2016, a number of events stood out throughout the course of the year. We were excited to welcome back Fujitsu as a Platinum member in April. The Open Group’s global reach and continued work creating standards relevant to how technology is impacting the worldwide business climate were key factors in Fujitsu’s decision to rejoin, and it’s great to have them back.

In addition to Fujitsu, we welcomed 86 new members in 2016. Our membership has been increasing steadily over the past several years—we now have more than 511 members in 42 countries. Our own footprint continues to expand, with staff and local partners now in 12 countries. We have now reached a point where not a month goes by without The Open Group hosting an event somewhere in the world. In fact, more than 66,000 people attended an Open Group event either online or in person last year. That’s a big number, and it is a reflection of the interest in the work going on inside The Open Group.

I believe this tremendous growth in membership and participation in our activities is due to a number of factors, including our focus on Enterprise Architecture and the continued uptake of TOGAF® and ArchiMate® – Open Group standards – and the ecosystems around them. In 2016, we successfully held the first TOGAF User Group meetings worldwide, and we also released the first part of the Open Business Architecture standard. Members can look forward to additions to that standard this year, as well as updates to the ArchiMate certifications to reflect the latest version of the standard, ArchiMate® 3.0.

In addition, our work with The Open Group FACE™ Consortium has had a significant impact on growth—the consortium added 13 members last year, and it is literally setting the standard for how government customers buy from suppliers in the avionics market. Indeed, such has been the success of The Open Group FACE Consortium that it will be spinning out a new consortium later this year: SOSA, the Sensor Open Systems Architecture. The FACE Consortium was also nominated for the 2017 Aviation Week Awards in Innovation for ensuring that software conforming to the FACE technical standard is open, portable, and reusable. Watch this space for more information on that in the coming months.

2017 will bring new work from our Security and Open Platform 3.0™ Forums as well. The Security and Architecture Forums are working together to integrate security architectures into TOGAF, and we can expect updates to the O-ISM3 security standard and the Open FAIR Risk Analysis and Risk Taxonomy standards later in the year. The Open Platform 3.0 Forum has been hard at work developing materials to contribute to the vast topic of convergence, including the areas of Cloud Governance, Data Lakes, and Digital Business Strategy and Customer Experience. Look for new developments in those areas throughout the course of this year.

As the ever-growing need for businesses to transform for the digital world continues to disrupt industries and governments worldwide, we expect The Open Group influence to reach far and wide. Standards can help enterprises navigate these rapid changes. I believe The Open Group vision of Boundaryless Information Flow™ is coming to fruition through the work our Forums and Working Groups are doing. Look for us to take Boundaryless Information Flow one step further in January when we announce our latest Forum, the Open Process Automation™ Forum, at our upcoming San Francisco event. This promises to be a real cross-industry activity, bringing together industries as disparate as oil and gas, mining and metals, food and beverage, pulp and paper, pharmaceutical, petrochemical, utilities, and others. Stay tuned at the end of January to learn more about what some prominent companies in these industries have in common, in addition to being members of The Open Group!

With all of these activities to look forward to in 2017—and undoubtedly many more we have yet to see—all signs point to an active, productive and fulfilling year. I look forward to working with all of you throughout the next 12 months.

Happy New Year!


Steve Nunn is President and CEO of The Open Group – a global consortium that enables the achievement of business objectives through IT standards. He is also President of the Association of Enterprise Architects (AEA).

Steve joined The Open Group in 1993, spending the majority of his time as Chief Operating Officer and General Counsel. He was also CEO of the AEA from 2010 until 2015.

Steve is a lawyer by training, has an L.L.B. (Hons) in Law with French and retains a current legal practicing certificate. Having spent most of his life in the UK, Steve has lived in the San Francisco Bay Area since 2007. He enjoys spending time with his family, walking, playing golf, 80s music, and is a lifelong West Ham United fan.

@theopengroup @stevenunn



Filed under ArchiMate®, Boundaryless Information Flow™, Business Transformation, Digital Transformation, Enterprise Architecture, FACE™, IT4IT, Open Platform 3.0, Open Process Automation, Standards, Steve Nunn, Uncategorized

TOGAF® User Group Meetings

By The Open Group

Since its inception more than two decades ago, TOGAF®, an Open Group standard, has grown to become the de facto global framework for creating Enterprise Architectures.

Thousands of companies worldwide have adopted and adapted TOGAF to transform their businesses. Facts about TOGAF include:

  • 80% of the Fortune Top 50 companies use TOGAF
  • Over 60,000 individuals hold certifications in TOGAF 9
  • TOGAF users are based in 120 countries
  • More than 60 accredited training courses available globally

The Open Group wants to ensure that TOGAF maintains its momentum worldwide, and realizes that doing so cannot be done without capturing voices beyond The Open Group membership. Additionally, a growing number of certified TOGAF professionals want to follow up their training with a forum for discussion and sharing. Thus, there is an opportunity to give TOGAF users a way to easily Share, get Enlightenment, and Express their needs (‘SEE’ TOGAF).

The starting point for The Open Group was to begin hosting TOGAF User Group Meetings, with the intention that users become increasingly involved in how the meetings are structured. Through these meetings, The Open Group gets an opportunity to Harvest ideas on use, Educate users, and gain Access to a larger user base and a broader set of Requirements (‘HEAR’ about TOGAF use).

The User Group Meetings are open to all interested people and are free to attend.

Meeting is thus a win-win for TOGAF users and The Open Group alike. This part of the story is yet to be written!

For the upcoming TOGAF® User Group Meeting in San Francisco, CA on January 30, 2017, please visit here.


Filed under Certifications, Enterprise Architecture, Enterprise Architecture (EA), Enterprise Transformation, Professional Development, Standards, The Open Group San Francisco 2017, TOGAF, TOGAF®