Category Archives: The Open Group San Francisco 2017

The Open Group San Francisco Day One Highlights

By The Open Group

The Open Group kicked off its first event of 2017 on a sunny Monday morning, January 30, in the City by the Bay, with over 200 attendees from 20 countries including Australia, Finland, Germany and Singapore.

The Open Group CEO and President Steve Nunn began the day’s proceedings with a warm welcome and the announcement of the latest version of the Open Trusted Technology Provider™ Standard (O-TTPS), a standard that specifies best practices for providers to help them mitigate the risk of tainted or counterfeit products or parts getting into the IT supply chain. A new certification program for the standard was also announced, as well as the news that the standard has recently been ratified by ISO. Nunn also announced the availability of the next version of The Open Group IT4IT™ standard, version 2.1.

Monday’s plenary focused on IT4IT and Managing the Business of IT. Bernard Golden, CEO of Navica, spoke on the topic, “Cloud Computing and Business Expectations: How the Cloud Changes Everything.” Golden, who was named one of the 10 most influential people in cloud computing by Wired magazine, began with a brief overview of the state of the computing industry today, which is largely characterized by the enormous growth of cloud computing. Golden believes that the public cloud will be the future of IT moving forward. With the speed that the cloud enables today, IT and app development have become both the bottleneck and the differentiator for IT departments. To address these bottlenecks, IT must take a multi-pronged, continuous approach that uses a combination of cloud, Agile and DevOps to address business drivers. The challenge for IT shops today, Golden says, is also to decide where to focus and what cloud services they need to build applications. To help determine what works, IT must ask whether services are above or below what he calls “the value line,” which delineates whether the services available, which are often open source, will ultimately advance the company’s goals or not, despite being low cost. IT must also be aware that the value line can present a lock-in challenge, creating tension between the availability of affordable—but potentially buggy—open-source tools and services and the ongoing value the business needs. Ultimately, Golden says, the cloud has changed everything—and IT must be willing to change with it and weigh the trade-offs between openness and potential lock-in.

Forrester Research analysts David Wheable, Vice President and Principal Consultant, and David Cannon, Vice President and Group Director, took the stage following Golden’s session to discuss “The Changing Role of IT: Strategy in the Age of the Customer.” Wheable spoke first, noting that technology has enabled a new “age of the customer,” an era where customers now have the majority of the power in the business/customer relationship.  As such, companies must now adapt to how their customers want to interact with their businesses and how customers use a company’s business applications (particularly via mobile devices) in order to survive and prevent customers from constantly changing their loyalties. Because IT strategists will not be able to predict how customers will use their applications, they must be able to put themselves in a position where they can quickly adapt to what is happening.

Cannon discussed what IT departments need to consider when it comes to strategy. To develop a viable IT strategy today, companies must consider what is valuable to the customer and how they will choose the technologies and applications that provide customers what they need. In the current IT landscape, features and quality no longer matter—instead, IT must take into account customers’ emotions, desires and immediate needs. Continuous exploitation of digital assets to deliver customer outcomes will be critical for both digital and business strategies—which Cannon argues are now essentially the same thing—moving forward. To survive in this new era, IT departments must also be able to enable customer outcomes, measure the customer experience, manage a portfolio of services, showcase business—not just technical—expertise and continue to enable service architectures that will deliver what customers need and want.

After the morning coffee break, Author and Researcher Gene Kim followed to discuss his recent book, The DevOps Handbook. His session, “The Rise of Architecture: Top Lessons Learned while Researching and Writing The DevOps Handbook,” explored the example of high performers in the tech sector and how the emergence of DevOps has influenced them. According to Kim, most IT departments are subject to a downward spiral over time due to the exponential growth of technical assets and debt, which ultimately weighs them down and affects performance. In contrast, according to Kim’s research, high-performing organizations have been able to avoid this spiral by using DevOps. Organizations utilizing DevOps are nearly three times more agile than their peers, are more reliable and are two times more likely to exceed profitability, market share and productivity goals in the marketplace. The ability to deploy small changes more frequently has been a game changer for these high-performing organizations, not only allowing them to move faster but also to create more humane working conditions and happier, more productive workers. Kim also found that fear of doing deployments is the most accurate predictor of success in organizations—those that fear deployments have less success than those that don’t.


The final session of the morning plenary was presented by Charles Betz, IT Strategist, Advisor and Author from Armstrong Process Group. Betz provided an overview of how the IT4IT framework can be used within organizations to streamline IT processes, particularly by automating tasks that no longer need to be done by hand. Standardizing IT processes also provides a way to deliver more consistent results across the entire IT value chain for better business results. Taking an iterative, team-oriented approach is also essential for managing the body of knowledge necessary for changing IT processes and creating digital transformation.

During the lunch hour, conference partners Hewlett Packard Enterprise and Simplilearn each gave separate presentations for attendees discussing the use of IT4IT for digital transformation and skills acquisition in the digital economy, respectively.

Monday afternoon, The Open Group hosted its fourth TOGAF®, an Open Group standard, User Group meeting in addition to the afternoon speaking tracks. The User Group meeting consisted of an Oxford-style debate on the pros and cons of “Create versus Reuse Architecture,” featuring Jason Uppal, Open CA Level 3 Certified Architect, QRS, and Peter Haviland, Managing Director, Head of Engineering & Architecture, Moody’s Corporation. In addition to the debate, User Group attendees had the opportunity to share use cases and stories with each other and discuss improvements to TOGAF that would benefit them in their work.

The afternoon sessions consisted of five separate tracks:

  • IT4IT in Practice – Rob Akershoek from Logicalis/Shell Information Technology International moderated a panel of experts from the morning plenary as well as sessions related to presenting IT4IT to executives, the role of EA in the IT value chain and using IT4IT with TOGAF®.
  • Digital Business & the Customer Experience – Featuring sessions on architecting digital businesses and staying ahead of disruption hosted by Ron Schuldt of Femto-data.
  • Open Platform 3.0™/Cloud – Including talks on big data analytics in hybrid cloud environments and using standards and open source for cloud customer reference architectures hosted by Heather Kreger, Distinguished Engineer and CTO International Standards, IBM.
  • Open Trusted Technology – Trusted Technology Forum Director Sally Long introduced sessions on the new O-TTPS self-assessed certification and addressing product integrity and supply chain risk.
  • Open Business Architecture – Featuring an introduction to the new preliminary Business Architecture (O-BA) standard presented by Patrice Duboe, Innovation VP, Global Architects Leader from the CTO Office at Capgemini, and Venkat Nambiyur, Director – Business Transformation, Enterprise & Cloud Architecture, SMBs at Oracle.

Monday’s proceedings concluded with an evening networking reception featuring the day’s speakers, IT professionals, industry experts and exhibitors. Thanks for the San Francisco event also go to the event sponsors, which include Premium Sponsors Good eLearning, Hewlett Packard Enterprise, Orbus Software and Simplilearn, as well as sponsors Van Haren Publishing, the Association of Enterprise Architects and San Jose State University.

@theopengroup #ogSFO



The Open Trusted Technology Provider™ Standard (O-TTPS) – Approved as ISO/IEC 20243:2015 and the O-TTPS Certification Program

By The Open Group

The increase of cybersecurity threats, along with the global nature of Information and Communication Technology (ICT), results in a threat landscape ripe for the introduction of tainted (e.g., malware-enabled or malware-capable) and counterfeit components into ICT products. This poses significant risk to customers in the operation of their business enterprises and our critical infrastructures.

A compromised electronic component or piece of malware-enabled software that lies dormant and undetected within an organization could cause tremendous damage if activated remotely. Counterfeit products can also cause significant damage to customers and providers, resulting in rogue functionality, failed or inferior products, loss of revenue and brand equity, and other critical damage.

As a result, customers now need assurances that they are buying from trusted technology providers, ones who follow best practices both in their own in-house secure development and engineering processes and in securing their out-sourced components and supply chains.

Summary

The O-TTPS, an Open Group Standard, specifies a set of best practice requirements and recommendations that ICT providers should follow throughout the full life cycle of their products from design through disposal – including their supply chains – in order to mitigate the risk of tainted and counterfeit components. The Standard is the first with a Certification Program that specifies measurable conformance criteria for both product integrity and supply chain security in ICT.

The Standard provides requirements for the full product life cycle, categorizing them further into best practice requirements for Technology Development (product development and secure engineering methods) and Supply Chain Security.


The Open Group O-TTPS Certification Program offers certificates for conformance to both the O-TTPS and ISO/IEC 20243:2015, as the two standards are equivalent. The Program identifies the successful applicant on a public registry so customers and business partners can readily identify an Open Trusted Technology Provider™ who conforms to the Standard.

The Certification Program is available to all providers in the ICT product’s supply chain, including: Original Equipment Manufacturers (OEMs), hardware and software component suppliers, integrators, Value-Add Resellers (VARs), and distributors. Thus, it offers a holistic program that not only allows customers to identify trusted business partners such as integrators or OEMs who are listed on the registry, but also allows OEMs and integrators to identify trusted business partners such as hardware and software component suppliers, VARs, and distributors from the public registry.


Target Audience

As the O-TTPS Certification Program is open to all constituents involved in a product’s life cycle – from design through disposal – including those in the product’s supply chain, the Standard and the Certification Program should be of interest to all ICT providers as well as ICT customers.

The newly published guide: O-TTPS for ICT Product Integrity and Supply Chain Security – A Management Guide, available from The Open Group Bookstore at www.opengroup.org/bookstore/catalog/g169.htm, offers guidance to managers – business managers, procurement managers, or program managers – who are considering adopting the best practices or becoming certified as an Open Trusted Technology Provider™. It provides valuable information on:

  • The best practices in the Standard, with an Appendix that includes all of the requirements
  • The business rationale for why a company should consider implementing the Standard and becoming certified
  • What an organization should understand about the Certification Program and how they can best prepare for the process
  • The differences between the options (self-assessed or third-party assessed) that are currently available for the Certification Program
  • The process steps and the terms and conditions of the certification, with pointers to the relevant supporting documents, which are freely available

The Management Guide offers a practical introduction to executives, managers, those involved directly in implementing the best practices defined in the Standard, and those who would be involved in the assessments, whether self-assessment or third-party assessment.

Further Information

The Open Trusted Technology Provider™ Standard (O-TTPS), Version 1.1 is available free-of-charge from www.opengroup.org/bookstore/catalog/c147.htm.

The technically equivalent standard – ISO/IEC 20243:2015 – is available for a fee from iso.org.

For more information on the Open Trusted Technology Provider™ Standard (O-TTPS) and the O-TTPS Certification Program, visit www.opengroup.org/ottps.

@theopengroup #ogSFO



Understanding the Customer Experience: A Conversation with Forrester Analysts David Cannon and David Wheable

By The Open Group

With more technology in the hands of consumers than ever before, customers have become increasingly demanding in terms of not only the service they receive from companies but also the experience they have with a company or brand. Today, companies must be aware of and respond to what customers are looking for in terms of what they get from a company and how they interact—or they risk losing those customers.

This is leaving many companies in a very vulnerable position, particularly when it comes to digital customer experiences. In advance of The Open Group San Francisco 2017, we spoke with David Cannon, Vice President and Group Director, and David Wheable, Vice President and Principal Consultant, both of Forrester Research, about what customer expectations look like today and what companies need to be aware of so that they can survive in an ever-changing digital landscape. Both will be keynote speakers at The Open Group event on January 30.

The customer experience is something that’s been talked about for many years. What’s different now about customers that makes their experiences with companies an even more urgent matter than in the past?

David Cannon (DC): The single most important thing that’s changed is that customers have more choice and the ability to change suppliers within literally seconds. And this is not limited to individual consumers. Enterprises can switch key systems with minimal disruption. The key to retaining customers today is to make sure their experience with you is good—if not, there’s no reason for them to stay.

David Wheable (DW): Building on that is the way we talk about digital business; many of those interactions occur digitally now. The role of technology in that experience now is key. If you don’t deliver a good digital customer experience, as Dave Cannon said, the next one in the line will get the business. I actually did that the other day—one site would not let me log in, so they lost my business and the next one got my business instantly.

DC: David’s right, with digitization, we’re not actually dealing with individuals and human beings, we’re dealing with simple, digital interfaces. This reduces any potential sense of loyalty—we just want what we want, when we want it and that’s it.

That takes away a huge part of how businesses have traditionally run—it’s that relationship they have with the customer that has often set businesses apart. Are there ways that companies can better personalize experience and counteract that loss of human interaction or do they need to also make sure they are continuing to work person-to-person?

DW: That’s an interesting question because particularly when I talk to technical people, they really don’t actually understand what the customer experience is. Forrester defines it in terms of three Es—ease, effectiveness and emotion. Technical people have generally dealt with the ease and effectiveness for many years, so that’s no problem, but what they’re really bad at thinking about is designing for emotion. So if you are trying to have a digital customer experience, digital touch points, and you still have to include the emotion side in it, that’s where the loyalty comes from. Where we see that driven is when organizations look at how the positive, painless, frictionless kinds of experiences drive that kind of loyalty. What we see now is that those companies that are thinking about this are moving away from thinking about products and services and moving toward thinking about the customer in terms of experiences, desires and outcomes, and they might only be a small part of an ecosystem that generates that experience or outcome.

DC: I’ll add to that. One of the secrets to understanding how you’re impacting that emotion is to be able to gather more information about what the customer is doing, how they’re doing it, when they’re doing it and why they’re doing it.  We have tools that can do this better than we’ve ever done it before—without even interviewing or surveying our customers.  We have to be able to infer from whatever they’re doing digitally whether that equates to a good emotion or a negative emotion. The whole area of analytics becomes more important than ever—but it’s also different than before.

To give an example, sites like Yelp or TripAdvisor give you a history of people’s experiences with a restaurant or service provider. But they don’t provide real-time information on whether the thing that upset a customer two years ago is still an issue. Unless the customer provides constructive feedback that’s visible to all, they don’t help the service provider understand what it can do to make the customer’s experience better. Customer satisfaction ratings are also limited, because they are just a snapshot of a customer at a moment in time. They don’t always tell us why the customer was (dis)satisfied, or whether they would give the same rating today.

We’re getting better at looking at real-time analytics that tell us, in real-time, what is the context, where are customers using this, why are they using this and how does that impact their experience at that time? Is there a way that we can detect a negative experience and determine exactly what’s causing it and how to change it immediately?

One technique we use is Touchpoint Analysis, which breaks down what a customer does in individual interactions and individual contexts and then figures out how to measure their experience with each touchpoint. To identify each touchpoint and then instrument it for real-time experience measurement was a huge ask, but technology is making it possible.
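As an illustration only—this is a minimal sketch, not Forrester’s actual methodology—a touchpoint analysis of this kind might aggregate an inferred experience score per touchpoint and flag the touchpoints trending negative. The event shape, the score scale and the threshold below are all assumptions:

```python
from collections import defaultdict

# Hypothetical interaction events: (customer_id, touchpoint, score), where
# score is an inferred experience signal in [-1.0, 1.0] (e.g., derived from
# dwell time, error rates, or abandonment) rather than a survey answer.
events = [
    ("c1", "login", 0.8),
    ("c2", "login", -0.6),
    ("c1", "checkout", 0.9),
    ("c3", "checkout", -0.7),
    ("c2", "checkout", -0.4),
]

def touchpoint_scores(events):
    """Average the inferred experience scores for each touchpoint."""
    by_touchpoint = defaultdict(list)
    for _, touchpoint, score in events:
        by_touchpoint[touchpoint].append(score)
    return {tp: sum(s) / len(s) for tp, s in by_touchpoint.items()}

def flag_negative(scores, threshold=0.0):
    """Return touchpoints whose average experience falls below threshold."""
    return sorted(tp for tp, avg in scores.items() if avg < threshold)

scores = touchpoint_scores(events)
print(flag_negative(scores))  # the checkout touchpoint averages negative here
```

Run continuously over a live event stream rather than a fixed list, this is one way to approximate the “detect a negative experience and determine what’s causing it” loop described above.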

Personalization and customization have been talked about for at least 20 years now. At this point are there still concerns about privacy and knowing too much about customers? And on the flip side, if companies are relying on data to determine customer interactions rather than personal contact or relationships—and granted large companies can’t rely on personal interactions with thousands of people—does that reliance on data continue the problem of taking away from the human interaction?

DC: It’s kind of a paradox. On the one hand, you’re inventing technology and you’re putting that technology in the hands of users and that distances them from you. At the same time, you’re making them more capable of engaging with you. The very technology that allows you to be more remote (work from home, etc.) is being used to create online communities, friends, go shopping, run a political campaign, etc.  So technology is not only changing patterns of customer behavior, it’s changing how society works.  This is neither good news nor bad (or perhaps it’s a bit of both)—it’s just what’s happening.

On the other hand, by participating in this online society, you are sacrificing privacy. Many people demand better customer experience, fully understanding that that means that companies know more about them.  We’re starting to see some awareness of how ‘creepy’ this can be (being stalked by advertisers in one app because you searched for something in a different app).  But at this stage the search for better customer experience is still more powerful than the need for privacy. Will the pendulum swing the other way?  Definitely, but it will take some time and a more serious revelation of how privacy has been abused than those that have already emerged.

DW: I also think that one of the drivers of loyalty customers are looking for from a brand is trust in that brand to look after their data appropriately and use it appropriately. What we see, again, is that it is a business imperative to respect privacy and to use and obscure data appropriately; if the customers of an organization feel that is happening, they will be more loyal to that organization than to one whose approach to data they don’t trust.

DC: I totally agree with that. I’d say though that in some cases, the realization that a company has not dealt with my data appropriately comes too late. We’re starting to see a shift to companies being more proactive in communicating how they’re safeguarding your privacy so it becomes more of a selling point for the services they provide. Not only are they going to give you a better experience, they’re going to give you a safer experience as well. Up until now that need for customers to know that up front has not really been as urgent. I think based on what David just said, that’s changing.

With all the high-profile security breaches over the past few years, that’s important. On the other hand, if companies have poor service and do things that anger people, it’s as simple as this: if you’re waiting too long at the airport for your flight and you start tweeting about it, you’re helping to damage the reputation of the airline.

DC: And what we’ve seen is that some of these companies are monitoring that kind of traffic and recording which users make those statements. Using social media to communicate your experience with a company can also act against your relationship with that company. Some customers have reported negative experiences after they tweet bad things and positive experiences after they tweet good things.

I think the only thing that we can deduce from this is that every type of human interaction that existed before all this technology is now happening using the technology. Just as you were careful in the real world, you have to be careful in the online world. You have to be careful about what you say, about whom and to whom—and that goes for whether you’re a consumer or a company.

Technical people still have to catch up with this a bit. Some think as long as there’s anti-virus or intrusion control on our major systems, we’re OK. What they’re not looking at is the business risk associated with, for example, a privacy breach — we’re not talking about a technical threat here, we’re talking about your business being able to survive or not.

We’re really exploring very new ethical and legislative ground here, and the whole customer experience is really going to test that in the coming years. Just how much information is too much? Just what constitutes private information? Different countries have different views of what constitutes private information, and if, as a company, I place my base of operations in one of the countries that is less strict, I can do more, but it makes me less accountable to my customers. How is that going to impact my business? These questions are still being tested.

When David and I talk in San Francisco, we won’t just be talking about how you get friendlier with your customers and give better service; what we’re really talking about is how you survive as a business in a changing world where the rules are changing every day. That’s a much bigger conversation than how technical people give better customer service—which is what the discussion was before.

You mention that there’s been a gap among companies between those that “look” digital and those that are actually “being” digital. What does that gap look like and how can companies bridge it?

DW: Effectively, the way that I try to describe it to people is that a lot of the work on digital up to now has really been about automation. It’s been taking the same approach to business and just using technology to make it more efficient. Whether that’s faster or cheaper, that’s the fundamental role technology has played in those organizations. But now the technology has hit the point where it’s fundamentally changing the business, so the organizations that are looking digital are the ones putting a thin veneer over their existing business structure. Quite often, if you dig behind the scenes, what you’ll find is that there are still bits of paper going around, there are still people looking at a form that was entered on a website and doing something with it.

Those companies that are truly digital are actually using those digital capabilities to change the way that they do the business. If you look at some of the examples that we use—like John Deere or Burberry—all of them have really gone back to their roots, looked at what their business actually is and then figured out how they can use digital technology to change their interactions with customers, change their outcome and restructure their business completely. You see that with companies like GE standing up and saying ‘we may have been a manufacturing company but now we’re a software and analytics company.’ That whole understanding of what the change means is significant. Those that are looking digital are the ones that are saying ‘we have an e-commerce site, therefore we’re digital.’ That’s not the story.

Why has it traditionally been so difficult for IT departments to execute on technology strategies?

DW: Dave and I spend a lot of time talking to these organizations. The majority of organizations feel stuck in a very operational frame of mind. Very few of them really have a strong ability to understand the context of technology strategy within the business. They tend to think of technology as this abstract and separate item rather than something that’s used to deliver most business results.

That sounds like a case for Enterprise Architecture and for architects to be that bridge between IT and the business.

DW: The challenge is that it shouldn’t be a bridge; the idea is that it should be a fundamental part of the business strategy, not a joining up, not something that you have to interpret. How does that technology deliver the business? It’s not about how to back up the business. That’s where we see the real challenge of being digital: the business people who actually understand the digital part can execute and come up with a digital strategy without necessarily needing Enterprise Architects (EAs) to interpret it and come up with the technology.

DC: This is correct only where so-called ‘enterprise’ architects are really solution or technology architects. We find that many organizations limit their architects to simply translating from the enterprise strategy to the technical solutions. As long as this remains the case, architects will continue to be focused on operational issues, reacting to business demands instead of working with the business to jointly architect the strategy. Enterprise architecture has started to change into something being called “Business Architecture,” where an EA looks at both sides of the fence at the same time (and in fact doesn’t see it as two sides) and asks what we all have to do together to make the organization successful—whether it’s operational or strategic.

To put it slightly more bluntly, the traditional IT model is when the business says ‘we need this,’ and IT builds and delivers it. That mindset has to change. IT is part of the business, and it has to be embedded in those frontline customer-facing parts of the business, not just be a technical service provider that just does whatever it’s told. To be honest, we’re in a situation now where the new technology that’s emerging is not really understood. If IT is buried in the basement somewhere, it’s going to be more difficult to make that technology work for the company. They really need to be on the frontline. What that means is that IT people have to become more business-like and more strategic.

How can technologists, customers and business work together to help solve their mutual problems?

DW: This is an interesting question, and it’s something we get asked all the time. We deal a lot with companies being challenged by that. A lot of it comes down to culture—it comes down to understanding the difference between how a business will look at product operations and how IT still looks at projects, for example. This is why Dave says that DevOps is a start but it needs to go further. We’re constantly talking about how to start applying the techniques that people use for product development to IT, technology and digital solutions as well. Design thinking, doing ethnographic work up front, getting actual feedback from customers, A/B testing—you create those strong testing and feedback mechanisms, what works, what doesn’t work, and don’t just assume that everything’s understood and you can write a system that does everything it needs to. What we see now is that those techniques—DevOps, Agile, customer experience mapping, personas—have all started coming together and really are creating that overall structure of how you understand the customer, how you understand employees and how you start delivering the solutions that actually give the right outcome and the right experience.
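To make the A/B-testing mechanic concrete, here is a minimal, hypothetical sketch—the experiment name, hashing scheme and outcome format are all assumptions, not anything Forrester prescribes. It pairs deterministic variant assignment (so a returning customer always sees the same variant) with a per-variant conversion-rate summary, the basic feedback loop behind “what works, what doesn’t work”:

```python
import hashlib

def assign_variant(user_id, experiment="checkout_flow"):
    """Deterministically assign a user to variant A or B by hashing the
    experiment name and user id, so assignment is stable across visits."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def conversion_rates(outcomes):
    """outcomes: iterable of (variant, converted) pairs.
    Returns the observed conversion rate for each variant."""
    counts = {"A": [0, 0], "B": [0, 0]}  # variant -> [trials, conversions]
    for variant, converted in outcomes:
        counts[variant][0] += 1
        counts[variant][1] += int(converted)
    return {v: (conv / n if n else 0.0) for v, (n, conv) in counts.items()}

# Feed real interaction outcomes back in to see which variant performs better:
rates = conversion_rates([("A", True), ("A", False), ("B", True)])
```

In practice a significance test would sit on top of the raw rates before declaring a winner; the point here is only the measure-and-feed-back structure, not a production experimentation platform.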

Is there a role for standards in all of this and what would that be?

DW: Very much so. One of the points we want to make is that you now effectively have a digitally connected ecosystem, and businesses form parts of that ecosystem; all the services consumed are not under your control. In the old days of IT, you’d buy the hardware, you’d buy the software licenses, you’d build it and put it in a building, and that would be your interaction with your customers, even in the old web days. Now your customers link together with services or other businesses electronically. So the levels of connection, trust and understanding have become very important in terms of technical communications standards, but equally the skills and how you approach that from a business standpoint. Looking at what IT4IT does, for example, is important because you need ways to talk about how organizations should be constructed, what competencies you need and how they’re put together. Without some form of structure, you just get chaos. The idea of standards, from my point of view, is to take that chaos and give it some sense of order.

DC: I agree with David. I would say also that we’re still going to see the importance of best practices as well as standards. To put it bluntly:  Standards are established and agreed ways of doing something.  But much of the technology emerging today is testing the relevance of standards.  Best practices (not the best name, they should be called Tested Practices or Good Practices) are those emerging practices that have been shown to work somewhere in the industry. What may be an appropriate standard for what you did five years ago may not be appropriate for what’s going to emerge next year. There’s always going to be this tension between the established standard, what we know to be true, and the emerging standard or best practice—the things that are working that aren’t necessarily in the standard or are beyond where it is today.

I think the industry has to become a little better at understanding the differences between standards and best practices and using them appropriately. We've also seen a lack of investment in best practices. A lot of people in the industry are coming up with suggested best practices and frameworks, but it's been a while since we've seen a truly independent best practice. IT4IT is a really good launching point for new best practices to emerge. But like any proposed practice, it will have its limitations. Instead of following it blindly, we should keep monitoring it to figure out what those limitations are and how to overcome them.

Standards will continue to be really important to keep the Wild West at bay, but at the same time you’ve got to be pushing things forward and best practices (sponsored by independent organizations) are a good way to do that.

@theopengroup #ogSFO

David Wheable, Vice President and Principal Consultant, Forrester Research Inc.
David provides research-based consulting services to BT Professionals, helping them leverage Forrester’s proprietary research and expertise to meet the ever-changing needs and expectations of their stakeholders.

David specializes in helping clients create effective and efficient strategies for their IT Service Management challenges including integrating cloud services, bring your own device (BYOD), and mobility.

Prior to joining Forrester, David worked at HP, where he served as the professional services innovation lead for the software and professional services organization, as worldwide solution lead, and as a consulting manager.

David Cannon, Vice President and Group Director, Forrester Research Inc.
David serves Infrastructure & Operations Professionals. He is a leader in the fields of IT and service strategy and has led consulting practices for BMC Software and Hewlett-Packard. He is the coauthor of the ITIL 2007 service operation book and author of the ITIL 2011 service strategy book. He is also a founder and past chairman of both itSMF South Africa and itSMF International and a past president of itSMF USA.

Prior to joining Forrester, David led the IT service management (ITSM) practice of BMC Software Global Services and led the ITSM consulting practice at Hewlett-Packard. He has educated and consulted within a broad range of organizations in the private and public sectors over the past 20 years. He has consulted in virtually every area of IT management, but he specializes in the integration of business and technology management.

David has degrees in industrial sociology and psychology from the University of South Africa and holds the ITIL Expert certificate. He is also a fellow of service management and double recipient of the itSMF Lifetime Achievement Award.

 


Filed under Digital Customer Experience, digital technologies, Digital Transformation, Enterprise Architecture (EA), Forrester, IT4IT, Standards, The Open Group, The Open Group San Francisco 2017, Uncategorized

To Colonize Mars, Look to Standards Development

By The Open Group

In advance of The Open Group San Francisco 2017, we spoke with Keegan Kirkpatrick, one of the co-founders of RedWorks, a "NewSpace" start-up focused on building 3D printable habitats for use on Earth and in space. Kirkpatrick will be speaking during the Open Platform 3.0™/Internet of Things (IoT) session on February 1.

Keegan Kirkpatrick believes that if we are to someday realize the dream of colonizing Mars, Enterprise Architects will play a critical role in getting us there.

Kirkpatrick defines the contemporary NewSpace industry as a group of companies that are looking to create near-term solutions that can be used on Earth, derived from solutions created for long-term use in space. With more private companies getting into the space game than ever before, Kirkpatrick believes the means to create habitable environments on the moon or on other planets isn’t nearly as far away as we might think.

“The space economy has always been 20 years away from where you’re standing now,” he says.

But with new entrepreneurs and space ventures following the lead of Elon Musk’s SpaceX, the space industry is starting to heat up, branching out beyond traditional aerospace and defense players like NASA, Boeing or Lockheed Martin.

“Now it’s more like five to ten years away,” Kirkpatrick says.

Kirkpatrick, who has a background in aerospace engineering, says RedWorks was born out of NASA’s 3D Printed Habitat Challenge, a “Centennial Challenge” where people from all kinds of backgrounds competed to create 3D printing/construction solutions for building and surviving on Mars.

“I was looking to get involved in the challenge. The idea of 3D printing habitats for Mars was fascinating to me. How do we solve the mass problem? How do we allow people to be self-sufficient on Mars once they get there?” he says.

Kirkpatrick says the company came together when he found a small 3D printing company in Lancaster, Calif., close to where he lives, and went to visit them. “About 20 minutes later, RedWorks was born,” he says. The company currently consists of Kirkpatrick, a 3D printing expert, and a geologist, along with student volunteers and a small team of engineers and technicians.

Like other NewSpace companies, RedWorks is focusing on terrestrial solutions first; both in order to create immediate value for what they’re doing and to help raise capital. As such, the company is looking to design and build homes by 3D printing low-cost materials that can be used in places that have a need for low-cost housing. The company is talking with real estate developers and urban planners and looking to areas where affordable housing might be able to be built entirely on site using their Mars-derived solutions.

“Terrestrial first is where the industry is going,” Kirkpatrick says. “You’ll see more players showing up in the next few years trying to capitalize on Earth-based challenges with space-based solutions.”

RedWorks plans to use parametric architecture models and parametric planning (design processes based on algorithmic thinking in which the relationship between elements is used to inform the design of complex structures) to create software for planning the printable communities and buildings. In the short-term, Kirkpatrick believes 3D printing can be used to create smart-city living solutions. The goal is to be able to combine 3D printing and embedded software so that people can design solutions specific to the environments where they’ll be used. (Hence the need for a geologist on their team.) Then they can build everything they need on site.

“For Mars, to make it a place that you can colonize, not just explore, you need to create the tools that people with not much of an engineering or space architecture background can use to set up a colony wherever they happen to land,” Kirkpatrick says. “The idea is if you have X number of people and you need to make a colony Y big, then the habitat design will scale everything with necessary utilities and living spaces entirely on-site. Then you can make use of the tools that you bring with you to print out a complete structure.”
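The scaling rule Kirkpatrick describes (X people, a colony Y big, with everything derived on site) can be illustrated with a toy parametric model. All numbers and names below are hypothetical assumptions for illustration, not RedWorks' actual design rules:

```python
def plan_colony(people: int,
                area_per_person_m2: float = 25.0,   # assumed living-space allowance
                module_area_m2: float = 100.0,      # assumed printable module footprint
                utility_fraction: float = 0.30):    # assumed share for utilities/corridors
    """Derive a habitat layout from headcount alone: the design scales with the inputs."""
    living_area = people * area_per_person_m2
    total_area = living_area * (1 + utility_fraction)
    modules = int(-(-total_area // module_area_m2))  # ceiling division: whole modules only
    return {"people": people,
            "living_area_m2": living_area,
            "total_area_m2": total_area,
            "modules": modules}

# A crew of 12 needs 390 m^2 in total, printed as 4 modules under these assumptions.
plan = plan_colony(12)
```

The point of the parametric approach is exactly this: change one input (headcount, terrain-driven module size) and every dependent quantity is re-derived automatically.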

Kirkpatrick says the objective is to be able to use materials native to each environment in order to create and print the structures. Because dirt and sand on Earth are fundamentally similar to the type of silicate materials found on the Moon and Mars, RedWorks is looking to develop a general-purpose silica printer that can be used to build 3D structures. That's why they're looking first to develop structures in desert climates, such as southern California, North Africa and the Middle East.

A role for architecture and standards

As the private NewSpace industry begins to take off, Kirkpatrick believes there will be a strong need for standards to guide the nascent industry, and for Enterprise Architects to help navigate the complexities that will come with designing the technology to enable it.

“Standards are necessary for collaborating and managing how fast this will take off,” he says.

Kirkpatrick also believes that developing open standards for the new space industry will help NewSpace companies figure out how they can work together. Although many NewSpace start-ups already have an interest in collaborating, he says, with much of their work in the very early stages they do not necessarily have much incentive to work together as of yet. However, "everyone realizes that collaboration will be critical for the long-term development of the industry." Beginning to work toward standards development with an organization such as The Open Group now will help incentivize the NewSpace community to work together, and thus push the industry along even faster, Kirkpatrick says.

“Everyone’s trying to help each other as much as they can right now, but there’s not a lot of mechanisms in place to do so,” he says.

According to Kirkpatrick, it’s important to begin to think about standards for space-related technology solutions before the industry reaches an inflection point and begins to take off quickly. Kirkpatrick expects that inflection point will occur once a launcher like SpaceX is able to do full return landings of its rockets that are then ready for reuse. He expects that launch costs will begin to fall rapidly over the next five to ten years once launch providers can offer reliable reusable launch services, spurring the industry forward.

“Once you see launch costs fall by a factor of 10 or 100, the business side of the industry is going to grow like a weed. We need the infrastructure in place for everyone to work together and enable this incredible opportunity we have in space. There’s a very bright horizon ahead of us that’s just a little hard for everyone to see right now. But it’s coming faster than anyone realizes.”

@theopengroup #ogSFO

Keegan Kirkpatrick is the Team Lead and founder of RedWorks, a NewSpace startup in Lancaster, California. He has an undergraduate degree in Aerospace Engineering from Embry-Riddle Aeronautical University, and before turning entrepreneur worked as an engineer at Masten Space Systems on the Mojave Air and Spaceport.

In 2015, Keegan founded RedWorks with Paul Petros, Susan Jennings, and Lino Stavole to compete in and make it to the finals of the NASA Centennial 3D Printed Habitat Challenge. Keegan’s team is creating ways to 3D-print habitats from on-site materials, laying the groundwork for human settlement of the solar system.


Filed under digital technologies, Enterprise Architecture (EA), Future Technologies, Internet of Things, IoT, Open Platform 3.0, Standards, The Open Group, The Open Group San Francisco 2017, Uncategorized

Digital Transformation and Disruption – A Conversation with Sriram Sabesan

By The Open Group

The term “disruption” has been the de rigueur description for what’s been going on in the technology industry for a number of years now. But with the pressures of a digital economy facing all industries now, each is being disrupted in numerous ways.

Although disruption is leading to new and better ways of doing things, it can also be devastating for businesses and industries if they choose to ignore the advances that digitalization is bringing. Companies that don’t want to be left behind have to adapt more quickly than ever—or learn to disrupt themselves.

Sriram Sabesan, a partner with Conexiam, believes that a certain amount of disruption, or mitigation of disruptions, can indeed be architected by an enterprise—if they have the foresight to do so. We spoke with him in advance of his session at The Open Group San Francisco 2017 event to learn more about how enterprises can architect their own disruptions.

We’ve been hearing a lot about disruption over the past few years. How do you define disruption and what is the disruption curve?

Disruption normally happens when you don’t anticipate something and the change suddenly happens to you. In fact, the changes have been happening all along, but no one has taken the time to connect the dots. To give an example, consider an individual holding a mutual fund with significant stakes in property and casualty (P&C) insurance businesses. One impact of the shared economy (Uber, Lyft, Airbnb) is that the number of ‘owners’ is likely to stay flat or see a marginal increase. This cascades into a smaller number of insured people, hence diminished revenue for the insurance provider. That impacts the stock valuation of the P&C companies, finally impacting the individual owning the mutual fund with an interest in the P&C sector. And that’s a foresight people might not have. This is not about crying ‘wolf,’ but about mitigating potential risk to an asset, at every step of the chain.

Let us take another example. Most manufacturing businesses hold a reasonable stock of spare parts for their machinery. Even at home, we hold metallic clips, nails, etc. With 3D printing, one may be able to reuse the raw materials—sheet metal, plastic, or whatever is used for the main product—to create the spare parts. At home, we wouldn’t have to stock clips, pins or nails, only raw material. 3D printing impacts the businesses producing these products today, some positively (for example, e-Nable – http://enablingthefuture.org/) and some in unknown ways.

It is about walking the chain. The company adopting a new technology or approach may not be the one getting impacted. It may not even be the industry vertical adopting the new model. Most likely it is the people taking part further down the chain who feel the cascading effect. It’s a system-of-systems game.

The Disruption Curve is based on the product-maturity ‘S-curve.’ Familiarity breeds contempt and raises expectations. As people get used to doing something a certain way, some start to notice the little annoyances, and others want to do things differently or better. For businesses, it is the necessity of creating a new revenue model. The next S-curve is born when the old S-curve approaches its top end. The best definition is given by Prof. Clayton Christensen of Harvard Business School, but the simplest interpretation could be ‘an unexpected change to the way one does things, or when someone is unseated.’ On this topic, I think everyone is trying to improve their personal productivity, whether through better disposable income, a dose of vacation or a personal moment for themselves. Any and all of these will cause a disruption.
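The product-maturity S-curve referenced here is commonly modeled with a logistic function: slow early adoption, rapid growth, then saturation. A minimal sketch (parameter values are illustrative, not from the interview):

```python
import math

def adoption(t: float, ceiling: float = 1.0,
             midpoint: float = 5.0, rate: float = 1.0) -> float:
    """Logistic S-curve: fraction of the market captured at time t."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

# Growth is near zero early, steepest at the midpoint, and flat near the
# ceiling -- the top end where, per the interview, the next S-curve is born.
early, middle, late = adoption(1), adoption(5), adoption(9)
```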

In your opinion, what have been the biggest industry disruptions to occur in the past 10 years?

Most of the changes happened in isolation in the past.  There was no significant combinatorial effect that could transcend multiple industry verticals like today.

Google disrupted search; Amazon disrupted in-store purchase models; Netflix disrupted the DVD rental market. They all leveraged the internet. Google was able to capture and connect the contents of websites and scanned copies of books, triggering the birth of ‘big data.’ Amazon, on the other side, once they had too many products and couldn’t find an ecosystem that could support the enterprise across the globe, came up with AWS. What they built internally, they also made into a commercial, external-facing product. Skype changed telephony; PayPal changed money exchange.

Growth in metallurgy and medical sciences evolved from foundations laid in the latter half of the last century: growing human body parts in the lab, implantable devices, etc. The last decade made remote, continuous monitoring of human behavior and health possible.

But the biggest change is that all these companies discovered that they actually depend on each other.  Netflix on AWS, AWS on fiber optic cable owners, both of them on the last mile internet service providers, etc.  That realization, and the resultant collaboration via open standards is the biggest of all.

All of them changed some of the fundamentals of human-to-human interaction and human-to-machine interaction.  The new model made any individual able to provide a solution rather than waiting for large enterprises to initiate changes.

Who have been the most influential disruptors—is it just the usual suspects like Uber or Airbnb or are there others disruptors that are having a greater influence on industries that people are less aware of?

It depends on the vertical. As I said before, the past decade has been limited to a single vertical.

If you think about tax filing, Intuit has been the most influential in that area with TurboTax. They made a lot of things easier: now you can take a picture of your W-2 and 80% of your filing work is done. With another product, Mint.com, they became a personal finance advisor in a non-intrusive way, working with your banks, investment accounts and credit card accounts. PayPal and Square are disruptors in the e-commerce and money movement sectors.

Each vertical had its own set of disruptors, not everyone came together. But now more and more people are coming together because the services are so interdependent. Apple with its iTunes changed the whole music industry. Amazon Kindle for books.  IBM with its Watson is changing multiple verticals.

Medical devices are also undergoing a lot of change, in terms of things that can be implanted in human beings and monitored wirelessly to give real-time information to doctors. The most common human behavior is to visit doctors only when we are not healthy, so doctors don’t have data points on the transition from a healthy state to an unhealthy state, what happened and why it happened. Now they can monitor a person and their behavior continuously. I recently read about an emergency room that used the data from a Fitbit to figure out what had happened to a patient and treat the patient very quickly. The patient wasn’t conscious, but the doctors saw the transition in the data points stored in the device and were able to make a diagnosis.

So, I guess, there are more unusual suspects and players. To name a few: Khan Academy and OpenCourseWare in education, e-Nable for exoskeletal structures, derivatives of the military’s ‘ready-to-eat meals.’ There are also new products like ‘Ok Google,’ ‘Alexa’ and ‘x.ai,’ which combine several aspects.

Your talk at The Open Group San Francisco advocates for an “architected approach” to disruption. Can disruption be architected or is there a certain amount of happenstance involved in having that kind of impact on an industry?

There is some element of happenstance.  However, most of the disruptions are architected.

An enterprise invariably architects for disruption, or reacts rapidly to mitigate disruptive threats, in order to sustain its business. There are some good examples that go unnoticed or are written off as the natural evolution of an industry.

I believe Qantas was the first airline to realize that replacing seat-mounted inflight entertainment (IFE) units with iPads saved at least 15 pounds per seat. Even after adding 40% more seats, eliminating these devices reduced the overall weight of a Boeing 777 by 7%. Simply by observing inflight human behavior and running test flights without IFEs, the airline architected this change. The moment the savings were realized, almost every airline followed. This is an example of architected change. As regulators started accepting the use of wifi devices at any altitude, compliance work done at the gate by the pilot and maintenance crew also switched to hand-held devices: less paper and faster turnaround times. The savings in weight resulted in a lower overall operating cost per flight, contributing to either lower prices or more cargo revenue for the airline.

Every enterprise can anticipate changes in human behavior, or nudge a new behavior, and build a new business model around such behaviors. Apple’s introduction of touch devices and natural interfaces is another example of well-architected and well-executed change.

There are parts of a business that need significant effort to change because of cascading impacts, say an ERP, CRM or SCM system. Even shifting them from on-premise to cloud can appear daunting. However, the industry has started to chip away at the periphery of these solutions, moving to the cloud what can be moved. The issue is not technical feasibility or the availability of new solutions; it is recognizing what to change and when to change it. Weighing the economics of the current way of doing things against the cost of change and of post-change operations will simplify decision making. The architect has to look outside the enterprise for inspiration, identify the points of friction within the enterprise, and perform a techno-economic analysis to architect a solution.
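The techno-economic comparison described above can be reduced to a simple breakeven calculation. A hypothetical sketch with made-up figures, not drawn from any real migration:

```python
def breakeven_months(current_monthly_cost: float,
                     new_monthly_cost: float,
                     migration_cost: float) -> float:
    """Months until cumulative savings from the new solution repay the cost of change."""
    monthly_saving = current_monthly_cost - new_monthly_cost
    if monthly_saving <= 0:
        return float("inf")  # the change never pays for itself on cost alone
    return migration_cost / monthly_saving

# Hypothetical: on-premise runs at 100k/month, cloud at 70k/month, and the
# migration costs 600k -> the move pays back in 20 months.
months = breakeven_months(100_000, 70_000, 600_000)
```

The decision, of course, also hinges on the non-cost factors the interview mentions (cascading impacts, timing), but a calculation like this frames the "when to change" question.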

Sometimes a group of architects or industries realizes a need for a change and collectively guides it. For example, consider The Open Group’s new Open Process Automation Forum. What would normally appear to be disconnected verticals (oil and gas, food processing, pharmaceuticals, fabric and cable manufacturing) have come together to solve process management problems, and the current equipment suppliers to these companies are also part of the forum. The way the forum works will lead to incremental changes. The results will appear to be a natural evolution of the industry, but the fact that these players have come together can itself be called a disruption to an otherwise normal way of operating. With this comes the possibility of collaboration and mutual learning between operations technology and information technology.

I know of car companies, insurance companies and highway management companies that have started quiet collaborations to explore solar panels embedded in the road and live charging of automobiles. An extended ‘what if’ scenario is using GPS to identify the availability of solar-panel-embedded roads, matched with the driving behavior of the car owner, to decide whether the charge on the car’s battery can be used as a source of power to reduce the burden on the electric grid. Last month I read an article reporting that the first solar panel road is a reality. For metering and charging of power consumption, this may not be much of a disruption. But adjoining areas like regulations, parking privileges and toll charges will be impacted. It is a question of how soon the players react: they can make the transition gradual, or suddenly wake up and call it a disruption.

Is it possible for established enterprises to be the arbiters of disruption or is that really something that has to come out of a start-up environment? Can an architected approach to disruption help established companies keep their edge?

Yes and no. The way most companies have grown is to protect what they’ve already established. A good number of organizations operate under the philosophy that failure is not an option, which implies that risk-taking has to be reduced, which in turn stifles innovation. They will innovate only within set boundaries and allowances for failure. Start-ups have the mindset that failure is an option, because they have nothing else to lose. They are looking for the right fit.

To be an arbiter, whether start-up or established enterprise, take a page from the research on design thinking and service blueprinting at Stanford University. It provides a framework for innovation, and possibly disruption, by any organization, not just start-ups. Progressive’s telemetry device is just the beginning: once customers understand the limits of privacy management, all insurance companies will change the way they rate premiums. Just look at the rapid changes the TSA made to full-body scanners. Scanned images rapidly changed from something close to a real body shape to a template outline; customer outrage forced that change.

Some big enterprises are actually working with start-ups to figure out what changes the start-ups want to do, what kind of pain points they’re offsetting. There are companies who work with an agenda to change the operating model of the whole industry. 

In the U.S., one can look at Capital One, Amazon (the retail business, not AWS), Megabus, and Old Navy for creating new business models, if not complete disruption. Expedia alumni created Glassdoor and Zillow; Expedia itself was founded on making search, comparison of competitive offers and decision-making simple. The bottom line is whether the philosophy with which an enterprise was created has become its DNA, resulting in new verticals and value creation in the eyes of investors.

It is possible to have an architected disruption approach moving forward but it comes from a place where the company defines the level of risk and change they’re willing to bring. At the end of the day, public companies are under constant pressure for quarterly results so big changes may not be possible; but they may be doing small incremental things that morph into something else that we cannot easily see.

Is architected disruption a potential new direction that Enterprise Architects can take as either a career path or as a way to show their continued relevance within the enterprise?

Yes. Let me qualify that. As things stand today, there are three kinds of architects.

Architects who guide and oversee implementation—people who have to make sure that what has been planned goes according to plan. These architects are not chartered to create or mitigate disruptions.  It is the task that is given to them that distances them from effecting big changes.

The second kind of architects focus on integrating things across businesses or departments and execute with the strategy leaders of the company.  These architects are probably on the periphery of enabling disruption or mitigating impacts of a disruption using an architected approach. These architects often react to disruptions or changes.

The third set of architects try to provide the strategy for the company’s success: creating roadmaps, operating at the edges of the corporate charter or philosophy, thinking about every moving part within and outside the enterprise. They are on the lookout for what’s happening in human behavior, in machine behavior and in automation, trying to modify the portfolio quarter by quarter, if not sooner. It is tricky for these architects to keep track of everything happening around them, so it is easy to get lost in the noise.

With the right attitude and opportunity, an architect can create a career path to move from the first kind to the third kind.  Having said that, let me be clear, all three kinds of architects are relevant and required for an enterprise to function.

Is there a role for standards in architected disruption?  

Yes. Standards provide a buffer zone to limit the impact of disruption. They also provide a transition path for adopting a new way of doing things.

Standards help in a couple of ways. The Open Group sets standards for Boundaryless Information Flow™, and at the end of the day, no business is an island. So when a payment or financial e-commerce transaction moves from a bank to a PayPal account to a mobile wallet or a phone number, you need certain communications protocols and exchange standards to be defined. What kind of failure mitigation needs to be in place also has to be defined. That’s one.
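The kind of exchange standard described, one agreed message shape regardless of whether the payment comes from a bank, PayPal or a mobile wallet, can be illustrated with a minimal sketch. The field names and channel list below are hypothetical, not taken from any real payments standard:

```python
# Toy illustration of an agreed exchange format for payment messages.
REQUIRED_FIELDS = {"payer", "payee", "amount", "currency", "channel"}
ALLOWED_CHANNELS = {"bank", "paypal", "mobile_wallet", "phone_number"}

def validate_payment(msg: dict) -> list:
    """Return a list of protocol violations; an empty list means the message conforms."""
    errors = ["missing field: " + f for f in sorted(REQUIRED_FIELDS - msg.keys())]
    if msg.get("channel") not in ALLOWED_CHANNELS:
        errors.append("unknown channel: " + repr(msg.get("channel")))
    amount = msg.get("amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        errors.append("amount must be a positive number")
    return errors
```

Any party in the ecosystem can apply the same check, which is the point of a standard: the sender and receiver do not need to be under the same control to interoperate.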

Second is supporting management decision makers, the CEOs and COOs. We have to provide them information that says, ‘If you do this within these confines, the possibility of failure goes down.’ It’s about making it easier for them to decide on and take on a change effort.

Standards thus provide a framework for adopting change, for helping management decisions mitigate risk, and for making an ecosystem work well together.

Are there any other ways that disruption can be planned for?

One way is to look at the business patterns, the economic indicators that come along with these patterns.

Would Uber have survived in the mid-to-late 1990s? Probably not, because the economy was growing and more affluent. The economic pressure of the late 2000s diminished total disposable income, so people were open to certain changes in their habits. Not only were they open in their thinking about socializing, they were open to penny-pinching as well.

There are parts of businesses that are hard to change, like the logistics management and ERP systems of an airline; the clearing house operations of banking systems; cross-border, high-value sales. There are parts of the business that can change with minimal impact. Gartner calls this concept pace layering. We have to look for such layered patterns and make the problem easier to solve. The growth part will be complemented by what’s going on outside the enterprise.

There are many examples of products that were too far ahead of their time for users to imagine or accept the change, and hence failed. Uber and Ford, despite following different approaches to delivering their products to market, each focused on the problem of mobility and on the economic and social climate, and were willing to innovate and iterate. Oxo products, for example, though they cannot technically be classified as disruptors, changed the way we look at kitchen tools; Oxo focused on user research and product fit.

So the winning formula is to focus on market and customer needs.  Start with accepting failure, test like there is no tomorrow. And at the hint of a tipping point, scale.

@theopengroup #ogSFO
Sriram Sabesan leads the Digital Transformation practice at Conexiam. He is responsible for developing best practices and standards in the areas of Social, Mobile, Analytics, Cloud and IoT (SMACIT), Customer Experience Management and governance.

Over the past 20 years, Sriram has led teams specializing in systems engineering, process engineering and architecture development across the federal, technology, manufacturing, telecommunications and financial services verticals. Managing and leading large, geographically distributed teams, Sriram has enabled clients to develop and execute strategies in response to shifts in technology or economic conditions.

Sriram has been an active member of The Open Group since 2010 and is on The Open Group Governing Board.  He has contributed to the development of Open Group standards, snapshots and white papers. He is an Open Group Certified Distinguished Architect and is certified in TOGAF® v8, Scrum Practice and Project Management.

Sriram holds a Bachelor of Science degree in Mechanical Engineering and a Master of Science (Tech) in Power and Energy. He also received diplomas in Financial and Operations Management in 1998.


Filed under digital business, digital strategy, digital technologies, Digital Transformation, Enterprise Architecture, Enterprise Architecture (EA), Open Platform 3.0, The Open Group San Francisco 2017, TOGAF®, Uncategorized

TOGAF® User Group Meetings

By The Open Group

Since its inception more than two decades ago, TOGAF®, an Open Group standard, has grown to become the de facto global framework for creating Enterprise Architectures.

Thousands of companies worldwide have adopted and adapted TOGAF to transform their businesses. Facts about TOGAF include:

  • 80% of the Fortune Top 50 companies use TOGAF
  • Over 60,000 individuals hold TOGAF 9 certifications
  • TOGAF users are based in 120 countries
  • More than 60 accredited training courses are available globally

The Open Group wants to ensure that TOGAF maintains its momentum worldwide, and realizes that doing so cannot be done without capturing voices beyond The Open Group’s membership. Additionally, a growing number of certified TOGAF professionals want to follow up their training with a forum for discussion and sharing. Thus, there is an opportunity to give TOGAF users a place to easily Share, get Enlightenment, and Express their needs (‘SEE’ TOGAF).

The starting point for The Open Group was to begin hosting TOGAF User Group Meetings, which will move in a direction where users become more involved in how the meetings are structured. Through these meetings, The Open Group gets an opportunity to Harvest ideas on use, Educate users, and gain Access to a larger user base and a broader set of Requirements (‘HEAR’ about TOGAF use).

The User Group Meetings are open to all interested people and are free to attend.

So there is a win-win for TOGAF Users to meet. This part of the story is yet to be written!

For the upcoming TOGAF® User Group Meeting in San Francisco, CA on January 30, 2017, please visit here.


 


Filed under Certifications, Enterprise Architecture, Enterprise Architecture (EA), Enterprise Transformation, Professional Development, Standards, The Open Group San Francisco 2017, TOGAF, TOGAF®