Tag Archives: The Open Group

Connect with @theopengroup on April 17 for an Identity Management Tweet Jam #ogChat

By Patty Donovan, The Open Group

In about a week, The Open Group will be hosting its very first tweet jam! In case you’re not familiar with the format, a tweet jam is a one-hour “discussion” hosted on Twitter. The purpose of a tweet jam is to share knowledge and answer questions on a chosen topic – in this case, identity management. Each tweet jam is led by a moderator (The Open Group) and a dedicated group of experts who keep the discussion flowing. The public (or anyone on Twitter interested in the topic) is free – and encouraged! – to join the discussion.

Tweet, Tweet – Come Join Us

You can join our Identity Management Tweet Jam on April 17 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST. We welcome Open Group members and interested participants from all backgrounds to participate in the session and interact with our panel of experts in the identity management space.

Here is the current line-up for our expert panel:

To access the discussion, please follow the #ogChat hashtag next Wednesday during the allotted discussion time. Other hashtags we recommend you use for this tweet jam that encompass the topics that will be discussed include:

  • Identity management: #IdM
  • Single sign-on: #SSO
  • Cloud computing: #cloud
  • Mobile: #mobile
  • IT security: #ITSec
  • Information security: #InfoSec
  • Enterprise identity: #EntID
  • Identity ecosystem: #IDecosys

Below is a list of the questions that will be addressed during the hour-long discussion:

  1. What are the biggest challenges of identity management today?
  2. What should be the role of governments and private companies in creating identity management standards?
  3. What are the barriers to developing an identity ecosystem?
  4. Identity attributes may be valuable and subject to monetization. How will this play out?
  5. How secure are single sign-on schemes through Web service providers such as Google and Facebook?
  6. Is identity management more or less secure on mobile devices?

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat hashtag.
    • Sample: “Q2: @theopengroup, attributes are absolutely more critical than biometrics #IdM #ogChat”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event, please direct them to Rod McLeod (rmcleod at bateman-group dot com). We anticipate a lively chat on April 17 and hope you will be able to join!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the US.

2 Comments

Filed under Identity Management, Tweet Jam

Why We Can’t Agree on What We Mean by “Enterprise Architecture” and Why That’s OK, At Least for Now

By Leonard Fehskens, The Open Group

Many people have commented that one of the most significant consequences of the Internet is the “democratization of commentary.” The ability to comment on subjects of interest to a community is no longer limited to those few who have access to traditional methods of broadcast communications (e.g., printed media, radio and television). At the same time, membership in such communities is no longer limited to those who are physically proximate. The result is everyone has a wide-reaching public voice now (even this blog is one such example).

The chorus of public voices speaking about Enterprise Architecture has created something of a din. Over the past several years my listening to this chorus has revealed an extraordinary diversity of opinion about what we mean by “Enterprise Architecture.” I have tried to sort out and categorize this diversity of opinion to try to understand how the Enterprise Architecture community could think so many different things about the idea that unites it. Creating a true profession of Enterprise Architecture will require that we come to some sort of convergence and agreement as to what the profession is about, and I hope that understanding the roots of this wide diversity of opinion will facilitate achieving that convergence.

At The Open Group Conference in Cannes, France later this month, I will be speaking on this subject. Here is a preview of that talk.

Assumptions and Approaches 

In many discussions about Enterprise Architecture I have seen preliminary apparent agreement rapidly disintegrate into disagreement bordering on hostility. People who initially thought they were saying the same things discovered as they explored the implications of those statements that they actually meant and understood things quite differently. How can this happen?

There seem to me to be two things that contribute to this phenomenon. The first is the assumptions we make, and the second is the approaches we adopt in defining, thinking about and talking about Enterprise Architecture. As important as the nature of these assumptions and approaches is the fact that we are almost never explicit about them. Indeed, one of the most widespread and consequential assumptions we make is that we all share the same assumptions.

To keep this article short and to avoid “stealing my own thunder” from my upcoming conference presentation, I’m going to step from the tip of one iceberg to the next, hopefully whetting your appetite for a more in-depth treatment.

How We Approach the Problem

There are an even half dozen approaches I have observed people take to defining Enterprise Architecture that have, in use, created additional problems of their own. They are:

  • The use of ambiguous language – many of the words we have borrowed from common usage to talk about Enterprise Architecture have multiple meanings.
  • Failing to understand, and account for, the difference between denotation and connotation – a word denotes its literal meaning, but it also connotes a set of associations. We may all agree explicitly on what a word denotes, but at the same time each hold, probably implicitly, very different connotative associations for the word.
  • The use of figures of speech (metaphor, simile, metonymy, synecdoche) – figures of speech are expressive rhetorical gestures, but they too often have very little practical value as models for reasoning about the subject to which they are applied.
  • Conflation – the inclusion of a related but distinct discipline as an integral part of Enterprise Architecture.
  • Mixing up roles and job definitions or job descriptions – jobs are defined to meet the needs of a specific organization and may include parts of many different roles.
  • The “blind men and the elephant” syndrome – defining something to be the part of it that we individually know.

The Many Things We Make Assumptions About

The problem with assumptions is not that we make them, but that we do so implicitly, or worse, unknowingly. Our assumptions often reflect legitimate choices that we have made, but we must not forget that there are other possible choices that others can make.

I’ve identified fifteen areas where people make assumptions that lead to sometimes radically different perspectives on Enterprise Architecture. They have to do with things like what we think “architecture,” “enterprise,” and “business” mean; what we think the geography, landscape or taxonomy of Enterprise Architecture is; how we name or think we should name architectures; what kinds of things can have architectures; what we think makes a good definition; and several more. Come to my talk at The Open Group conference in Cannes at the end of the month if you want to explore this very rich space.

What Can We Do?

It’s tempting when someone comes at a problem from a different perspective, or makes a different choice from among a number of options, to conclude that they don’t understand our position, or too often, that they are simply wrong. Enterprise Architecture is a young discipline, and it is still sorting itself out. We need to remain open to alternative perspectives, and rather than focus on our differences, look for ways to accommodate these different perspectives under unifying generalizations. The first step to doing so is to be aware of our assumptions, and to acknowledge that they are not the only assumptions that might be made.

In the words of St. Augustine, “Let us, on both sides, lay aside all arrogance. Let us not, on either side, claim that we have already discovered the truth. Let us seek it together as something which is known to neither of us. For then only may we seek it, lovingly and tranquilly, if there be no bold presumption that it is already discovered and possessed.”

Len Fehskens is Vice President of Skills and Capabilities at The Open Group. He is responsible for The Open Group’s activities relating to the professionalization of the discipline of enterprise architecture. Prior to joining The Open Group, Len led the Worldwide Architecture Profession Office for HP Services at Hewlett-Packard. Len is based in the US.

6 Comments

Filed under Conference, Enterprise Architecture

Enterprise Transformation Takes the French Riviera

By The Open Group Conference Team

The Open Group Conference in Cannes, France is just around the corner. Taking place April 23-27, the conference will bring together leading minds in technology to discuss the process of Enterprise Transformation, and the role of Enterprise Architecture (EA) and IT in Enterprise Transformation.

The French Riviera is a true playground for the rich and famous. As the location of the next Open Group Conference (not to mention the next Open Cannes Awards), it seems only fitting that we not only have an incredible venue for the event, the JW Marriott Cannes, but also our own star-studded lineup of speakers, sessions and activities that are sure to make the conference an unforgettable experience.

In addition to tutorial sessions on TOGAF and ArchiMate, the conference offers roughly 60 sessions on a variety of topics, including:

  • Enterprise Transformation, including Enterprise Architecture and SOA
  • Cybersecurity, Cloud Security and Trusted Technology for the Supply Chain
  • Cloud Computing for Business, Collaborative Cloud Frameworks and Cloud Architectures

The conference theme “Enterprise Transformation” will highlight how Enterprise Architecture can be used to truly change how companies do business and create models and architectures that help them make those changes. Keynote speakers include:

  • Dr. Alexander Osterwalder, Best-selling Author and Entrepreneur

Dr. Osterwalder is a renowned thought leader on business model design and innovation. Many executives, entrepreneurs and world-leading organizations have applied Dr. Osterwalder’s approach to strengthen their business models and achieve a competitive advantage through business model innovation. His keynote session at the conference, titled “Business Models, IT, and Enterprise Transformation,” will discuss how to use the Business Model Canvas approach to better align IT and business strategy, empower multi-disciplinary teams and contribute to Enterprise Transformation.

  • Herve Gouezel, Advisor to the CEO at BNP Paribas & Eric Boulay, Founder and CEO of Arismore

Keynote: “EA and Transformation: An Enterprise Issue, a New Role for the CIO?” will examine governance within the Enterprise and what steps need to take place to create a collaborative Enterprise.

  • Peter Haviland, Chief Architect and Head of Business Architecture Advisory Services at Ernst & Young, US

Keynote: “World Class EA 2012: Putting Your Architecture Team in the Middle of Enterprise Transformation,” will identify and discuss key activities leading practice architecture teams are performing to create and sustain value, to remain at the forefront of enterprise transformation.

  • Kirk Avery, Software Architect at Lockheed Martin & Robert Sweeney, MSMA Lead Systems Engineer at Naval Air Systems Command

Keynote: “FACE: Transforming the DoD Avionics Software Industry Through the Use of Open Standards,” will address the DoD avionics industry’s need to provide complex mission capability in less time and in an environment of shrinking government budgets.

The Common Criteria Workshop and the European Commission

We are also pleased to be hosting the first Common Criteria Workshop during the Cannes Conference. This two-day event – taking place April 25 to 26 – offers a rich opportunity to hear from distinguished speakers from the Common Criteria security community, explore viewpoints through panel discussions and work with like-minded people towards common goals.

One of the keynote speakers during the workshop is Andrea Servida, the Deputy Head of the Internet, Network and Information Security unit with the European Commission in Brussels, Belgium. With extensive experience defining and implementing strategies and policies on network and information security and critical information infrastructure protection, Mr. Servida is an ideal speaker as we kick-off the first workshop.

The Open Cannes Awards

What trip to Cannes would be complete without an awards ceremony? Presented by The Open Group, The Open Cannes Awards is an opportunity for our members to recognize each other’s accomplishments within The Open Group with a little fun during the gala ceremony on the night of Tuesday, April 24. The goal is to acknowledge the success stories, the hard work and dedication that members, either as individuals or as organizations, have devoted to The Open Group’s ideals and vision over the past decade.

We hope to see you in Cannes! For more information on the conference tracks or to register, please visit our conference registration page, and please stay tuned throughout the next month as we continue to release blog posts and information leading up to The Open Group Conference in Cannes, France!

Comments Off

Filed under Cloud, Cloud/SOA, Conference, Cybersecurity, Enterprise Architecture, Enterprise Transformation, FACE™, Semantic Interoperability, Service Oriented Architecture

The Open Group Testifies before Congress on the Supply Chain Landscape

By David Lounsbury, The Open Group

On Tuesday, March 27, I had the honor of testifying on behalf of The Open Group Trusted Technology Forum to the House Energy and Commerce Oversight and Investigations Subcommittee at their congressional hearing on IT supply chain security. The hearing focused on these major supply chain issues:

  • The key risks associated with supply chains used by federal agencies to procure IT equipment, software or services
  • The extent to which selected national security-related agencies have addressed IT supply chain risks
  • The extent to which national security-related federal agencies have determined that their telecommunications networks contain foreign-developed equipment, software or services
  • The extent to which private industry has addressed IT supply chain risks

This was the first time that an Open Group employee has testified in front of Congress, and the invitation was a testament to The Open Group’s work as a vendor-neutral certification authority for over 20 years as well as the traction that The Open Group Trusted Technology Forum (OTTF) has gained over the past year.

You can see the full session on the YouTube video embedded below. The Chair and Ranking Member’s opening statements underscored three things for me:

  • That this problem is both widespread and critical – both government agencies and many private companies are struggling to address global supply chain vulnerabilities
  • There is a clear need for collaboration and standards, as well as a need to bring transparency on conformance to such standards at all links in the supply chain.
  • The most critical issues are tainted code / malware and counterfeit products in the supply chain – exactly the focus areas of OTTF

We launched OTTF in December 2010 with the objective of reducing risks to IT products that can be introduced through vulnerable supply chain and development processes. Our goal has been to help the technology industry build with integrity and enable customer organizations and governments to buy with confidence. We have worked closely with the U.S. government throughout the process of developing the Open Trusted Technology Provider Standard (O-TTPS). The U.S. Department of Defense (DoD) was a founding member of the forum, and the impetus for the forum came out of a collaborative initiative between the DoD and industry verticals looking into cybersecurity for acquisitions. I was very gratified that the DoD witness singled out The Open Group’s efforts on OTTF and highlighted their participation in the forum.

Recognizing that a secure global supply chain is important to all governments, one of OTTF’s main objectives is to reach out to other governments around the world in much the same way it has with the U.S. To that end, forum members plan to extend an invitation to participate in the development of the standard and planned accreditation program for trusted technology providers, which will include governments, providers, integrators and component suppliers from around the world. To preview OTTF’s work, you can download the current draft of the Open Trusted Technology Provider Standard (Snapshot).

The subcommittee already had a strong background on OTTF’s mission and its current initiatives and was very interested to hear what global procurement strategies and best practices OTTF is planning to include in the O-TTPS and how these best practices could be applied within the U.S. government to ensure the security of the supply chain both nationally and globally. The subcommittee noted The Open Group’s previous work with international standards bodies such as the International Organization for Standardization (ISO) as encouraging, illustrating that the global supply chain is taking a step in the right direction under the stewardship of The Open Group.

Overall, the hearing was very positive, and the whole experience validated the work that OTTF has produced thus far. We anticipate that the standard will have a significant impact on how organizations procure large commercial off-the-shelf information and communication technology over the next few years across the global supply chain and are excited to see governments take an active interest in securing the global supply chain.


David Lounsbury is The Open Group’s Chief Technology Officer, previously VP of Collaboration Services. Dave holds three U.S. patents and is based in the U.S.

Comments Off

Filed under OTTF

OTTF Releases Snapshot of Developing Standard

By Sally Long, The Open Group

Globalization has transformed the supply chain forever. While it has brought benefits to large Commercial Off-the-Shelf (COTS) Information and Communication Technology (ICT), it has also brought considerable risk. Although most technology hardware and software products today would not exist without global development, the increase of sophisticated cyberattacks has forced technology suppliers and governments to take a more comprehensive approach to risk management in order to protect supply chain integrity and security.

The Open Group Trusted Technology Forum (OTTF) was founded to help technology companies, customers, government and supplier organizations address the risks that tainted and counterfeit products pose to organizations, and the forum took a big step in that direction this week. On March 5, OTTF announced the release of a snapshot preview of the Open Trusted Technology Provider Standard (O-TTPS) that will help global providers and acquirers of COTS ICT products by providing them with best practices that aim to enhance the security of the global supply chain.

The purpose of the snapshot is to:

  • Enable participants across the COTS ICT supply chain to understand the value in adopting best practice requirements and recommendations
  • Provide an early look at the standard so providers, component suppliers and integrators can begin planning how to implement the standard within their organizations, and so customers, including government acquirers, can differentiate those providers who adopt the standard’s practices
  • Preview the criteria for mitigating tainted or counterfeit technology products from entering the supply chain

O-TTPS Version 1.0 will be published later this year. There have been many organizations that have helped shape the initiative thus far, and we will continue to rely on the support and guidance of: Apex Assurance, atsec Information Security, Boeing, Booz Allen Hamilton, CA Technologies, Carnegie Mellon SEI, Cisco, EMC, Fraunhofer SIT, Hewlett-Packard, IBM, IDA, Juniper Networks, Kingdee, Lockheed Martin, Microsoft, MITRE, Motorola Solutions, NASA, Oracle, Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (OUSD AT&L), SAIC, Tata Consultancy Services, and U.S. Department of Defense/CIO.

We anticipate that O-TTPS will have a significant impact on how organizations procure COTS ICT products over the next few years across the global supply chain and are interested in hearing your thoughts on the snapshot and the initial direction of the standard. We welcome any feedback in the comments section below, and if you would like to help further define this standard and the conformance criteria for accreditation, please contact Mike Hickey or Chris Parnell regarding membership.

Sally Long is the Director of Consortia Services at The Open Group. She was the Release Engineering Section Manager for all collaborative, multi-vendor development projects (OSF/1, DME, DCE, and Motif) at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of OSF and X/Open under The Open Group, Sally served as the Program Director for multiple Forums within The Open Group, including: The Distributed Computing Environment (DCE) Forum, The Enterprise Management Forum, The Quality of Service (QoS) Task Force, The Real-time and Embedded Systems Forum and, most recently, the Open Trusted Technology Forum.

4 Comments

Filed under Cybersecurity, OTTF

Enterprise Architecture and Enterprise Transformation: Related But Distinct Concepts That Can Change the World

By Dana Gardner, Interarbor Solutions

For some, if you want enterprise transformation, you really need the organizing benefits of Enterprise Architecture to succeed.

For others, the elevation of Enterprise Architecture as an essential ingredient to enterprise transformation improperly conflates the role of Enterprise Architecture, and waters down Enterprise Architecture while risking its powerful contribution.

So how should we view these important roles and functions? How high into the enterprise transformation firmament should Enterprise Architecture rise? And will rising too high, in effect, melt its wings and cause it to crash back to earth and perhaps become irrelevant?

Or is enterprise transformation nowadays significantly dependent upon Enterprise Architecture, and therefore, we should make Enterprise Architecture a critical aspect for any business moving forward?

We posed these and other questions to a panel of business and EA experts at last month’s Open Group Conference in San Francisco to deeply examine the fascinating relationship between Enterprise Architecture and enterprise transformation.

The panel: Len Fehskens, Vice President of Skills and Capabilities at The Open Group; Madhav Naidu, Lead Enterprise Architect at Ciena Corp.; Bill Rouse, Professor in the School of Industrial and Systems Engineering and the College of Computing and Executive Director of the Tennenbaum Institute, all at the Georgia Institute of Technology; and Jeanne Ross, Director and Principal Research Scientist at the MIT Center for Information Systems Research.

Here are some excerpts:

Gardner: Why is enterprise transformation not significantly dependent upon Enterprise Architecture, and why would it be a disservice to bring Enterprise Architecture into the same category?

Fehskens: My biggest concern is the identification of Enterprise Architecture with enterprise transformation.

First of all, these two disciplines have different names, and there’s a reason for that. Architecture is a means to transformation, but it is not the same as transformation. Architecture enables transformation, but by itself is not enough to effect successful transformation. There are a whole bunch of other things that you have to do.

My second concern is that right now, the discipline of Enterprise Architecture is sort of undergoing — I wouldn’t call it an identity crisis — but certainly, it’s the case that we still really haven’t come to a widespread, universally shared understanding of what Enterprise Architecture really means.

My position is that they’re two separate disciplines. Enterprise Architecture is a valuable contributor to enterprise transformation, but the fact of the matter is that people have been transforming enterprises reasonably successfully for a long time without using Enterprise Architecture. So it’s not necessary, but it certainly helps. … There are other things that you need to be able to do besides developing architectures in order to successfully transform an enterprise.

Gardner: As a practitioner of Enterprise Architecture at Ciena Corp., are you finding that your role, the value that you’re bringing to your company as an enterprise architect, is transformative? Do you think that there’s really a confluence between these different disciplines at this time?

Means and ends

Naidu: Transformation itself is more like a wedding and EA is more like a wedding planner. I know we have seen many weddings without a wedding planner, but it makes it easier if you have a wedding planner, because they have gone through certain steps (as part of their experience). They walk us through those processes, those methods, and those approaches. It makes it easier.

I agree with what Len said. Enterprise transformation is different. It’s a huge task and it is the actual end. Enterprise Architecture is a profession that can help lead the transformation successfully.

Almost everybody in the enterprise is engaged in [transformation] one way or another. The enterprise architect plays more of a facilitator role. They are bringing the folks together, aligning them with the transformation, the vision of it, and then driving the transformation and building the capabilities. Those are the roles I would look at EA handling, but definitely, these are two different aspects.

Gardner: Is there something about the state of affairs right now that makes Enterprise Architecture specifically important or particularly important for enterprise transformation?

Naidu: We know many organizations that have successfully transformed without really calling a function EA and without really using help from a team called EA. But indirectly they are using the same processes, methods, and best practices. They may not be calling those things out, but they are using the best practices.

Rouse: There are two distinctions I’d like to draw. First of all, in the many transformation experiences we’ve studied, you can simplistically say there are three key issues: people, organizations, and technology, and the technology is the easy part. The people and organizations are the hard part.

The other thing is that I think what you’re talking about is the enterprise IT architecture. If I draw an Enterprise Architecture, I actually map out organizations and relationships among organizations and work and how it gets done by people, and view that as the architecture of the enterprise.

Important enabler

Sometimes, we think of an enterprise quite broadly; the architecture of the healthcare enterprise, for example, is not synonymous with information technology (IT). In fact, if you were to magically have a wonderful IT architecture throughout our healthcare system in the United States overnight, it would be quite helpful, but we would still have a problem with our system because the incentives aren’t right. The whole incentive system is messed up.

So I do think that the enterprise IT architecture is an important enabler, a crucial enabler, of many aspects of enterprise transformation. But I don’t see them as close at all in terms of thinking of them as synonymous.

Gardner: Len Fehskens, are we actually talking about IT architecture or Enterprise Architecture and what’s the key difference?

Fehskens: Well, again that’s this part of the problem, and there’s a big debate going on within the Enterprise Architecture community whether Enterprise Architecture is really about IT, in which case it probably ought to be called enterprise IT architecture or whether it’s about the enterprise as a whole.

For example, when you look at the commitment of resources to the IT function in most organizations, depending on how you count, whether you count by headcount or dollars invested or whatever, the numbers typically run about 5-10 percent. So there’s 90 percent of most organizations that is not about IT, and in the true enterprise transformation, that other 90 percent has to transform itself as well.

So part of it is just glib naming of the discipline. Certainly, what most people mean when they say Enterprise Architecture and what is actually practiced under the rubric of Enterprise Architecture is mostly about IT. That is, the implementation of the architecture, the effects of the architecture occurs primarily in the IT domain.

Gardner: But, Len, don’t TOGAF® at The Open Group and ArchiMate really step far beyond IT? Isn’t that sort of the trend?

Fehskens: It certainly is a trend, but I think we’ve still got a long way to go. Just look at the language that’s used in the architecture development method (ADM) for TOGAF, for example, and the model of an Enterprise Architecture. There’s business, information, application, and technology.

Well, three of those concepts are very much related to IT and only one of them is really about business. And mostly, the business part is about that part of the business that IT can provide support for. Yes, we do know organizations that are using TOGAF to do architecture outside of the IT realm, but the way it’s described, the way it was originally intended, is largely focused on IT.

Not a lot going on

What is going on is generally not called architecture. It’s called organizational design or management or it goes under a whole bunch of other stuff. And it’s not referred to as Enterprise Architecture, but there is a lot of that stuff happening. As I said earlier, it is essential to making enterprise transformation successful.

My personal opinion is that virtually all forms of design involve doing some architectural thinking. Whether you call it that or not, architecture is a particular aspect of the design process, and people do it without recognizing it, and therefore are probably not doing it explicitly.

But Bill made a really important observation, which is that it can’t be solely about IT. There’s lots of other stuff in the enterprise that needs to transform.

Ross: Go back to the challenge we have here of Enterprise Architecture being buried in the IT unit. Enterprise Architecture is an enterprise effort, an enterprise initiative, with enterprise impact. Because Enterprise Architecture is so often buried in IT, IT people are trying to do things and accomplish things that cannot be done within IT.

We’ve got to continue to push that Enterprise Architecture is about designing the way this company will do its business, and that it’s far beyond the scope of IT alone. I take it back to the transformation discussion. What we find is that when a company really understands Enterprise Architecture and embraces it, it will go through a transformation, because it’s not used to thinking that way and it’s not used to acting that way.

Disciplined processes

If management says, “We’re going to start using IT strategically; we’re going to start designing ourselves so that we have disciplined business processes and use data well,” then the company is embracing Enterprise Architecture, and that will lead to a transformation.

Gardner: You said that someday CIOs are going to report to the enterprise architects, and that’s the way it ought to be. Does that get closer to this notion that IT can’t do this alone, that a different level of thinking across disciplines and functions needs to occur?

Ross: I certainly think so. Look at companies that have really embraced and gotten benefits from Enterprise Architecture, like Procter & Gamble, Tetra Pak, and Maersk. At P&G, IT reports to the CIO, who is also the President of Shared Services. At Maersk and Tetra Pak, it’s the Head of Global Business Processes.

Once we get CIOs either taking on more of a business role, in charge of both process and technology, or reporting to a COO, a head of business process, a head of business transformation, or a head of shared services, then we know what it is we’re architecting, and the whole organization is designed so that architecture is a critical element.

I don’t think that title-wise, this is ever going to happen. I don’t think we’re ever going to see a CIO report to chief enterprise architect. But in practice, what we’re seeing is more CIOs reporting to someone who is, in fact, in charge of designing the architecture of the organization.

By that, I mean business processes and its use of data. When we get there, first of all, we will transform to get to that point and secondly, we’ll really start seeing some benefits and real strategic impact of Enterprise Architecture.

Gardner: There’s some cynicism and skepticism around architecture, and yet, what we’re hearing is it’s not in name only. It is important, and it’s increasingly important, even at higher and higher abstractions in the organization.

How to evangelize?

How then do you evangelize or propel architectural thinking into companies? How do you get the thinking around an architectural approach more deeply engrained in these companies?

Fehskens: Dana, I think that’s the $64,000 question. The fundamental way to get architectural thinking accepted is to demonstrate value. I mean to show that it really brings something to the party. That’s part of my concern about the conflation of enterprise transformation with Enterprise Architecture and making even bigger promises that probably can’t be kept.

In organizations that have tried Enterprise Architecture and decided that it didn’t taste good, it was because the effort didn’t actually deliver any value.

The way to get architectural thinking integrated into an organization is to use it in places where it can deliver obvious, readily apparent value in the short-term and then grow out from that nucleus. Trying to bite off more than you can chew only results in you choking. That’s the big problem we’ve had historically.

It’s about making promises that you can actually keep. Once you’ve done that, and done that consistently and repeatedly, then people will say that there’s really something to this. There’s some reason why these guys are actually delivering on a big promise.

Rouse: We ran a study recently, based on a series of successful case studies, about what competencies you need to transform an organization, and we did a survey of hundreds of top executives in the industry.

The number one and two things you need are that the top leader has a vision of where you’re going and is committed to making that happen. Without those two things, it seldom happens at all. From that perspective, I’d argue that the CIO probably already does report to the chief architect. Bill Gates and Steve Jobs architected Microsoft and Apple; Carnegie and Rockefeller architected the steel and oil industries.

If you look at the business histories of people with these very successful companies, often they had a really keen architectural sense of what the pieces were and how they needed to fit together. So if we’re going to really be in the transformation business with TOGAF and stuff, we need to be talking to the CEO, not the CIO.

Corporate strategy

Ross: I totally agree. The industries and companies that you cited, Bill, instinctively did what every company is going to need to do in the digital economy, which is to think about corporate strategy not just in terms of what products do we offer, what markets are we in, what companies do we acquire, and what things do we sell off.

At the highest level, we have to get our arms around it. Success is dependent on understanding how we are fundamentally going to operate. A lot of CEOs have deferred that responsibility to others and when that mandate is not clear, it gets very murky.

What does happen in a lot of companies, because CEOs have a lot of things to pay attention to, is that once they have stated the very high-level vision, they absolutely can put a head of business process or a head of shared services or a COO type in charge of providing the clarification, providing the day-to-day oversight, establishing the relationships in the organizations so everybody really understands how this vision is going to work. I totally agree that this goes nowhere if the CEO isn’t at least responsible for a very high-level vision.

Gardner: So if what I think I’m hearing is correct, how you do things is just as important as what you do. Because we’re in such a dynamic environment, when it comes to supply chains and communications and the way in which technology influences more and more aspects of business, it needs to be architected, rather than be left to a fiat or a linear or older organizational functioning.

So Bill Rouse, the COO, the chief operating officer, wouldn’t this person be perhaps more aligned with Enterprise Architecture in the way that we’re discussing?

Rouse: Let’s start with the basic data. We can’t find a single instance of a major enterprise transformation in a major company happening successfully without total commitment of top leadership. Organizations just don’t spontaneously transform on their own.

A lot of the ideas and a lot of the insights can come from elsewhere in the organization, but, given that the CEO is totally committed to making this happen, certainly the COO can play a crucial role in how it’s then pursued, and the COO of course will be keenly aware of a whole notion of processes and the need to understand processes.

One of the companies I work very closely with tried to merge three companies by putting in ERP. After $300 million, they walked away from the investment, because they realized they had no idea what the processes were. So the COO is a critical function here.

Just to go back to the original point, you want total commitment by the CEO. You can’t just launch the visionary message and walk away. At the same time, you need people who are actually dealing with the business processes to do a lot of the work.

Gardner: What is the proper relationship between Enterprise Architecture and enterprise transformation?

Ross: I’d say the relationship between Enterprise Architecture and enterprise transformation is two-way. If an organization feels the need for a transformation — in other words, if it feels it needs to do something — it will absolutely need Enterprise Architecture as one of the tools for accomplishing that.

It will provide the clarity the organization needs in a time of mass change. People need to know where they’re headed, and that is true in how they do their processes, how they design their data, and then how they implement IT.

It works just as well in reverse. If a company hasn’t had a clear vision of how it wants to operate, it might introduce architecture to provide some of that discipline and clarity, and that will inevitably lead to a transformation. When you go from doing what every individual or every business unit thought was best to an enterprise vision of how a company will operate, you’re imposing a transformation. So I think we are going to see these two go hand-in-hand.

What’s the relationship?

Rouse: I think enterprise transformation often involves a significant fundamental change of the Enterprise Architecture, broadly defined, which can then be enabled by the enterprise IT architecture.

Naidu: Like I mentioned in the beginning, one is end, another one is means. I look at the enterprise transformation as an end and Enterprise Architecture providing the kind of means. In one way it’s like reaching the destination using some kind of transportation mechanism. That’s how I look at the difference between EA and ET.

Fehskens: One of the fundamental principles of architecture is taking advantage of reuse when it’s appropriate. So I’m just going to reuse what everybody just said. I can’t say it better. Enterprise Architecture is a powerful tool for effecting enterprise transformation.

Jeanne is right. It’s a symmetric or bidirectional back-and-forth kind of relationship.

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.


Filed under Conference, Enterprise Architecture, Enterprise Transformation

Architecture and Change

By Leonard Fehskens, The Open Group

The enterprise transformation theme of The Open Group’s San Francisco conference reminded me of the common assertion that architecture is about change, and the implication that Enterprise Architecture is thus about enterprise transformation.

We have to be careful that we don’t make change an end in itself. We have to remember that change is a means to the end of getting something we want that is different from what we have. In the enterprise context, that something has been labeled in different ways. One is “alignment”, specifically “business/IT alignment.” Some have concluded that alignment isn’t quite the right idea, and it’s really “integration” we are pursuing. Others have suggested that “coherency” is a better characterization of what we want.

I think all of these are still just means to an end, and that end is fitness for purpose. The pragmatist in me says I don’t really care if all the parts of a system are “aligned” or “integrated” or “coherent”, as long as that system is fit for purpose, i.e., does what it’s supposed to do.

I’m sure some will argue that alignment and integration and coherency ensure that a system is “optimal” or “efficient”, but doing the wrong thing optimally or efficiently isn’t what we want systems to do. It’s easy to imagine a system that is aligned, integrated and coherent but still not fit for purpose, and it’s just as easy to imagine a system that is not aligned, not integrated and not coherent but that is fit for purpose. Of course, we can insist that alignment, integration and coherency be with respect to a system’s purpose, but if that’s the case, why don’t we say so directly? Why use words that strongly suggest internal properties of the system rather than its relationship to an external purpose?

Whatever we call it, continuous pursuit of something is ultimately the continuous failure to achieve it. It isn’t the chase that matters, it’s the catch. While I am sympathetic to the idea that there is intrinsic value in “doing architecture,” the real value is in the resulting architecture and its implementation. Until we actually implement the architecture, we can only answer the question, “Are we there yet?” with, “No, not yet”.

Let me be clear that I’m not arguing, or even assuming, that things don’t change and we don’t need to cope with change. Of course they do, and of course we do. But we should take a cue from rock climbers – the ones who don’t fall generally follow the principle “only move one limb at a time, from a secure position.” What stakeholders mean by fitness for purpose must be periodically revisited and revised. It’s fashionable to say “Enterprise Architecture is a journey, not a destination,” and this is reflected in definitions of Enterprise Architecture that refer to it as a “continuous process.” However, the fact is that the journey has to pass through specific waypoints. There may be no final destination, but there is always a next destination.

Finally, we should not forget that while the pursuit of fitness for purpose may require that some things change, it may also require that some things not change. We risk losing this insight if we conclude that the primary purpose of architecture is to enable change. The primary purpose of architecture is to ensure fitness for purpose.

For a fuller treatment of the connection between architecture and fitness for purpose, see my presentations to The Open Group Conferences in Boston, July 2010, “What ‘Architecture’ in ‘Enterprise Architecture’ Ought to Mean,” and Amsterdam, October 2010, “Deriving Execution from Strategy: Architecture and the Enterprise.”

Len Fehskens is Vice President of Skills and Capabilities at The Open Group. He is responsible for The Open Group’s activities relating to the professionalization of the discipline of enterprise architecture. Prior to joining The Open Group, Len led the Worldwide Architecture Profession Office for HP Services at Hewlett-Packard. Len is based in the US.


Filed under Enterprise Architecture, Enterprise Transformation

Open Group Security Gurus Dissect the Cloud: Higher or Lower Risk

By Dana Gardner, Interarbor Solutions

For some, any move to the Cloud — at least the public Cloud — means a higher risk for security.

For others, relying more on a public Cloud provider means better security. There’s more of a concentrated and comprehensive focus on security best practices that are perhaps better implemented and monitored centrally in the major public Clouds.

And so which is it? Is Cloud a positive or negative when it comes to cyber security? And what of hybrid models that combine public and private Cloud activities, how is security impacted in those cases?

We posed these and other questions to a panel of security experts at last week’s Open Group Conference in San Francisco to deeply examine how Cloud and security come together — for better or worse.

The panel: Jim Hietala, Vice President of Security for The Open Group; Stuart Boardman, Senior Business Consultant at KPN, where he co-leads the Enterprise Architecture Practice as well as the Cloud Computing Solutions Group; Dave Gilmour, an Associate at Metaplexity Associates and a Director at PreterLex Ltd.; and Mary Ann Mezzapelle, Strategist for Enterprise Services and Chief Technologist for Security Services at HP.

The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. The full podcast can be found here.

Here are some excerpts:

Gardner: Is this notion of going outside the firewall fundamentally a good or bad thing when it comes to security?

Hietala: It can be either. Talking to security people in large companies, frequently what I hear is that with adoption of some of those services, their policy is either let’s try and block that until we get a grip on how to do it right, or let’s establish a policy that says we just don’t use certain kinds of Cloud services. Data I see says that that’s really a failed strategy. Adoption is happening whether they embrace it or not.

The real issue is how you do that in a planned, strategic way, as opposed to letting services like Dropbox and other kinds of Cloud Collaboration services just happen. So it’s really about getting some forethought around how do we do this the right way, picking the right services that meet your security objectives, and going from there.

Gardner: Is Cloud Computing good or bad for security purposes?

Boardman: It’s simply a fact, and it’s something that we need to learn to live with.

What I’ve noticed through my own work is that a lot of enterprise security policies were written before we had Cloud, back when we had private web applications that you might call Cloud these days, and the policies tend to be directed toward staff’s private use of the Cloud.

Then you run into problems, because you read something in the policy — and if you interpret that as meaning Cloud, it means you can’t do it; and if you say it’s not Cloud, then you haven’t got any policy about it at all. Enterprises need to sit down and think, “What would it mean to us to make use of Cloud services?” and ask as well, “What are we likely to do with Cloud services?”

Gardner: Dave, is there an added impetus for Cloud providers to be somewhat more secure than enterprises?

Gilmour: It depends on the enterprise that they’re actually supplying to. If you’re in a heavily regulated industry, you have a different view of what levels of security you need and want, and therefore what you’re going to impose contractually on your Cloud supplier. That means that the different Cloud suppliers are going to have to attack different industries with different levels of security arrangements.

The problem there is that the penalty regimes are always going to say, “Well, if the security lapses, you’re going to get off with two months of not paying,” or something like that. That kind of attitude isn’t going to fly in this kind of security environment.

What I don’t understand is exactly how secure Cloud provision is going to be enabled and governed under tight regimes like that.

An opportunity

Gardner: Jim, we’ve seen in the public sector that governments are recognizing that Cloud models could be a benefit to them. They can reduce redundancy. They can control and standardize. They’re putting in place some definitions, implementation standards, and so forth. Is the vanguard of correct Cloud Computing with security in mind being managed by governments at this point?

Hietala: I’d say that they’re at the forefront. Some of these shared government services, where they stand up Cloud and make it available to lots of different departments in a government, have the ability to do what they want from a security standpoint, not relying on a public provider, and get it right from their perspective and meet their requirements. They then take that consistent service out to lots of departments that may not have had the resources to get IT security right, when they were doing it themselves. So I think you can make a case for that.

Gardner: Stuart, being involved with standards activities yourself, does moving to the Cloud provide a better environment for managing, maintaining, instilling, and improving on standards than enterprise by enterprise by enterprise? As I say, we’re looking at a larger pool and therefore that strikes me as possibly being a better place to invoke and manage standards.

Boardman: Dana, that’s a really good point, and I do agree. Also, in the security field, we have an advantage in the sense that there are quite a lot of standards out there to deal with interoperability, exchange of policy, exchange of credentials, which we can use. If we adopt those, then we’ve got a much better chance of getting those standards used widely in the Cloud world than in an individual enterprise, with an individual supplier, where it’s not negotiation, but “you use my API, and it looks like this.”

Having said that, there are a lot of well-known Cloud providers who do not currently support those standards and they need a strong commercial reason to do it. So it’s going to be a question of the balance. Will we get enough specific weight of people who are using it to force the others to come on board? And I have no idea what the answer to that is.

Gardner: We’ve also seen that cooperation is an important aspect of security, knowing what’s going on on other people’s networks, being able to share information about what the threats are, remediation, working to move quickly and comprehensively when there are security issues across different networks.

Is that a case, Dave, where having a Cloud environment is a benefit? That is to say more sharing about what’s happening across networks for many companies that are clients or customers of a Cloud provider rather than perhaps spotty sharing when it comes to company by company?

Gilmour: There is something to be said for that, Dana. Part of the issue, though, is that companies are individually responsible for their data. They’re individually responsible to a regulator or to their clients for their data. The question then becomes that as soon as you start to share a certain aspect of the security, you’re de facto sharing the weaknesses as well as the strengths.

So it’s a two-edged sword. One of the problems we have is that until we mature a little bit more, we won’t be able to actually see which side is the sharpest.

Gardner: So our premise that Cloud is good and bad for security is holding up, but I’m wondering whether the same things that make you a risk in a private setting — poor adherence to standards, no good governance, too many technologies that are not being measured and controlled, not instilling good behavior in your employees and then enforcing that — wouldn’t this be the same either way? Is it really Cloud or not Cloud, or is it good security practices or not good security practices? Mary Ann?

No accountability

Mezzapelle: You’re right. It’s a little bit of “garbage in, garbage out.” You need the basic things in place in your enterprise, which means the policies, the governance cycle, the audit, and the tracking, because none of it matters if you don’t measure it and track it, and if there is no business accountability.

David said it — each individual company is responsible for its own security, but I would say that it’s the business owner that’s responsible for the security, because they’re the ones that ultimately have to answer that question for themselves in their own business environment: “Is it enough for what I have to get done? Is the agility more important than the flexibility in getting to some systems or the accessibility for other people, as it is with some of the ubiquitous computing?”

So you’re right. If it’s an ugly situation within your enterprise, it’s going to get worse when you do outsourcing, out-tasking, or anything else you want to call within the Cloud environment. One of the things that we say is that organizations not only need to know their technology, but they have to get better at relationship management, understanding who their partners are, and being able to negotiate and manage that effectively through a series of relationships, not just transactions.

Gardner: If data, and sharing data, is so important, it strikes me that the Cloud component is going to be part of that, especially if we’re dealing with business processes across organizations: doing joins, comparing and contrasting data, crunching it and sharing it, making data actually part of the business, a revenue-generating activity. It all seems prominent and likely.

So to you, Stuart, what is the issue now with data in the Cloud? Is it good, bad, or just the same double-edged sword, and it just depends how you manage and do it?

Boardman: Dana, I don’t know whether we really want to be putting our data in the Cloud, so much as putting the access to our data into the Cloud. There are all kinds of issues you’re going to run up against as soon as you start putting your source information out into the Cloud, not least privacy and that kind of thing.

A bunch of APIs

What you can do is simply say, “What information do I have that might be interesting to people?” If it’s a private Cloud in a large organization, how can I make that available to share elsewhere in the organization? Or maybe it’s really going out into public. What a government, for example, can be thinking about is making information services available, not just what you go and get from them that they’ve already published, but “this is the information”: a bunch of APIs, if you like. I prefer to call them data services, and to make those available.

So, if you do it properly, you have a layer of security in front of your data. You’re not letting people come in and do joins across all your tables. You’re providing information. That does require you then to engage your users in what is it that they want and what they want to do. Maybe there are people out there who want to take a bit of your information and a bit of somebody else’s and mash it together, provide added value. That’s great. Let’s go for that and not try and answer every possible question in advance.
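Boardman’s “data services” idea, a narrow published API in front of the data rather than direct table access, can be sketched in a few lines of Python. This is a minimal illustration, not anything from the discussion itself; all table names, fields, and figures below are hypothetical.

```python
# A sketch of a data-service layer: callers never touch the underlying
# table; they call narrow, published functions that return only the
# curated view the data owner chose to share.

RAW_CUSTOMERS = [  # internal table: never exposed directly
    {"id": 1, "name": "Alice", "email": "alice@example.com", "region": "EU", "revenue": 120_000},
    {"id": 2, "name": "Bob", "email": "bob@example.com", "region": "US", "revenue": 95_000},
]

def revenue_by_region() -> dict:
    """Published data service: an aggregate, not the source rows."""
    totals: dict = {}
    for row in RAW_CUSTOMERS:
        totals[row["region"]] = totals.get(row["region"], 0) + row["revenue"]
    return totals

def customer_count() -> int:
    """Another narrow service: a count, with no personal data attached."""
    return len(RAW_CUSTOMERS)
```

A caller can learn the revenue totals per region, but it can never run a join against RAW_CUSTOMERS or see an email address; the service layer decides what information leaves.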

Gardner: Dave, do you agree with that, or do you think that there is a place in the Cloud for some data?

Gilmour: There’s definitely a place in the Cloud for some data. I get the impression that something like the insurance industry is going to grow out of this, where you’ll have a secondary Cloud. You’ll have secondary providers who will provide to the front-end providers. They might do things like archiving and that sort of thing.

Now, if you have that situation where your contractual relationship is two steps away, then you have to be very confident and certain of your Cloud partner, and the arrangement therefore has to encompass a very strong level of governance.

The other issue is that you then have the intersection of your governance requirements with those of the Cloud provider. Therefore you have to have a really strongly — and I hate to use the word — architected set of interfaces, so that you can understand how that governance is actually going to operate.

Gardner: Wouldn’t data perhaps be safer in the Cloud than in a poorly managed network?

Mezzapelle: There is data in the Cloud, and there will continue to be data in the Cloud, whether you want it there or not. The best organizations are going to understand that they can’t control it with the perimeter-like approach that we’ve been talking about getting away from for the last five or seven years.

So what we want to talk about is data-centric security, where you understand, based on role or context, who is going to access the information and for what reason. I think there is a better opportunity for services like storage, whether it’s for archiving or for near term use.

There are also other services that you don’t want to have to pay for 12 months out of the year, but that you might need independently. For instance, when you’re running a marketing campaign, you already share your data with some of your marketing partners. Or if you’re doing your payroll, you’re sharing that data through some of the national providers.

Data in different places

So there already is a lot of data in a lot of different places, whether you want Cloud or not, but the context is, it’s not in your perimeter, under your direct control, all of the time. The better you get at managing it wherever it is specific to the context, the better off you will be.
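The data-centric approach Mezzapelle describes, where the decision to release a piece of information depends on who is asking and in what context rather than on where the data happens to sit, can be sketched as a small policy table. The roles, contexts, and classification labels below are hypothetical illustrations, not anything from the panel.

```python
# A sketch of data-centric, role-and-context-based access: the policy is
# attached to the data's classification, not to a network perimeter.

POLICY = {
    # (role, context) -> set of data classifications that may be read
    ("analyst", "office"): {"public", "internal"},
    ("analyst", "remote"): {"public"},
    ("auditor", "office"): {"public", "internal", "confidential"},
}

def may_read(role: str, context: str, classification: str) -> bool:
    """Allow access only if the (role, context) pair covers the label."""
    return classification in POLICY.get((role, context), set())
```

With this table, an analyst in the office may read internal data, but the same analyst asking from a remote context is refused; the data’s location never enters into the decision.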

Hietala: It’s a slippery slope [when it comes to customer data]. That’s the most dangerous data to stick out in a Cloud service, if you ask me. If it’s personally identifiable information, then you get the privacy concerns that Stuart talked about. So to the extent you’re looking at putting that kind of data in a Cloud, you’re looking at the Cloud service and trying to determine whether you can apply some encryption and the sensible security controls to ensure that, if that data gets loose, you’re not ending up in the headlines of The Wall Street Journal.
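One concrete control in the spirit of Hietala’s point is to pseudonymize personally identifiable fields before records leave the perimeter, so a leaked Cloud copy exposes only opaque tokens. This is only a sketch; a real deployment would pair it with proper encryption and key management, and the key below is a placeholder.

```python
import hmac
import hashlib

# Pseudonymize PII fields with a keyed hash before storing records in
# the Cloud. The key stays on-premises, so the Cloud copy cannot be
# mapped back to real identities without it.

SECRET_KEY = b"keep-this-on-premises"  # placeholder; never ship with the data

def pseudonymize(value: str) -> str:
    """Replace a PII value with a stable, keyed token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

def scrub(record: dict, pii_fields: set) -> dict:
    """Return a copy of the record that is safer to store in the Cloud."""
    return {k: pseudonymize(v) if k in pii_fields else v
            for k, v in record.items()}
```

Because the keyed token is stable, pseudonymized records can still be joined and counted in the Cloud, while the mapping back to real names stays behind the perimeter with the key.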

Gardner: Dave, you said there will be different levels on a regulatory basis for security. Wouldn’t that also play with data? Wouldn’t there be different types of data and therefore a spectrum of security and availability to that data?

Gilmour: You’re right. If we come back to Facebook as an example, Facebook data, even if it’s data about our known customers, is stuff that people have put out there of their own will. The data that they give us, they have given to us for a purpose, and it is not for us then to distribute that data or make it available elsewhere. The fact that it may be the same data is not relevant to the discussion.

Three-dimensional solution

That’s where I think we are going to end up with not just one layer or two layers. We’re going to end up with a sort of a three-dimensional solution space. We’re going to work out exactly which chunk we’re going to handle in which way. There will be significant areas where these things crossover.

The other thing we shouldn’t forget is that data includes our software, and that’s something that people forget. Software nowadays is out in the Cloud, under current ways of running things, and you don’t even always know where it’s executing. So if you don’t know where your software is executing, how do you know where your data is?

It’s going to have to be just handled one way or another, and I think it’s going to be one of these things where it’s going to be shades of gray, because it cannot be black and white. The question is going to be, what’s the threshold shade of gray that’s acceptable.

Gardner: Mary Ann, to this notion of the different layers of security for different types of data, is there anything happening in the market that you’re aware of that’s already moving in that direction?

Mezzapelle: The experience that I have is mostly in some of the business frameworks for particular industries, like healthcare and what it takes to comply with the HIPAA regulation, or in the financial services industry, or in consumer products where you have to comply with the PCI regulations.

There has continued to be an issue around information lifecycle management, which is categorizing your data. Within a company, you might have had a document that you coded private, confidential, top secret, or whatever. So you might have had three or four levels for a document.

You’ve already talked about how complex it’s going to be as you move into trying to understand, not only for that data, that the name Mary Ann Mezzapelle happens to be in five or six different business systems, with over 100 instances around the world.

That’s the importance of something like an Enterprise Architecture that can help you understand that you’re not just talking about the technology components, but the information, what it means, and how it is prioritized or critical to the business, which sometimes comes up in a business continuity plan from a system point of view. That’s where I’ve advised clients to start: looking at how they connect the business criticality with a piece of information.

One last thing: those regulations don’t necessarily mean that you’re secure. They make for good basic health, but that doesn’t mean the data is ultimately protected. You have to do a risk assessment based on your own environment, the bad actors you expect, and the priorities based on that.

Leaving security to the end

Boardman: I just wanted to pick up here, because Mary Ann spoke about Enterprise Architecture. One of my bugbears — and I call myself an enterprise architect — is that we have a terrible habit of leaving security to the end. We don’t architect security into our Enterprise Architecture; it’s treated as a techie thing that we’ll fix at the back. There are also people in the security world who are techies, and they think that they will do it that way as well.

I don’t know how long ago it was published, but there was an activity to look at bringing the SABSA Methodology from security together with TOGAF®. There was a white paper published a few weeks ago.

The Open Group has been doing some really good work on bringing security right in to the process of EA.

Hietala: In the next version of TOGAF, which has already started, there will be a whole emphasis on making sure that security is better represented in some of the TOGAF guidance. That’s ongoing work here at The Open Group.

Gardner: As I listen, it sounds as if the in-the-Cloud versus out-of-the-Cloud security continuum is perhaps the wrong way to look at it. If you have a lifecycle approach to services and to data, then you’ll have a way in which you can approach data uses for certain instances and certain requirements, and that would then apply to a variety of private, public, and hybrid Clouds.

Is that where we need to go, perhaps have more of this lifecycle approach to services and data that would accommodate any number of different scenarios in terms of hosting access and availability? The Cloud seems inevitable. So what we really need to focus on are the services and the data.

Boardman: That’s part of it. That needs to be tied in with the risk-based approach. So if we have done that, we can then pick up on that information and we can look at a concrete situation, what have we got here, what do we want to do with it. We can then compare that information. We can assess our risk based on what we have done around the lifecycle. We can understand specifically what we might be thinking about putting where and come up with a sensible risk approach.

You may come to the conclusion in some cases that the risk is too high and the mitigation too expensive. In others, you may say, no, because we understand our information and we understand the risk situation, we can live with that, it’s fine.

Gardner: It sounds as if we are coming at this as an underwriter for an insurance company. Is that the way to look at it?

Current risk

Gilmour: That’s eminently sensible. You have the mortality tables, you have the current risk, and you just work the two together and work out what’s the premium. That’s probably a very good paradigm to give us guidance actually as to how we should approach intellectually the problem.

Mezzapelle: One of the problems is that we don’t have those actuarial tables yet. That’s a little bit of an issue for a lot of people when they talk about, “I’ve got $100 to spend on security. Where am I going to spend it this year? Am I going to spend it on firewalls? Am I going to spend it on information lifecycle management assessment? What am I going to spend it on?” Some of the research we’ve been doing at HP tries to get that into something that’s more of a statistic.

So, when you have a particular project that does a certain kind of security implementation, you can see what the business return on it is and how it actually lowers risk. We found that it’s better to spend your money on getting a better system to patch your systems than it is to do some other kind of content filtering or something like that.
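Mezzapelle’s point about turning security spend into “more of a statistic” is essentially the standard annualized loss expectancy (ALE) calculation: expected annual loss equals annual rate of occurrence times single loss expectancy. The sketch below is purely illustrative; the control names, costs, and probabilities are invented, not HP figures:

```python
# Toy model: prioritize a fixed security budget by risk reduction per
# unit of spend, using annualized loss expectancy (ALE = annual rate
# of occurrence x single loss expectancy). All figures are invented.

def ale(annual_rate, single_loss):
    """Annualized loss expectancy for one threat scenario."""
    return annual_rate * single_loss

# Candidate controls: cost, plus the ALE before and after deployment.
controls = {
    "patch management":  {"cost": 40, "ale_before": ale(2.0, 50), "ale_after": ale(0.5, 50)},
    "content filtering": {"cost": 60, "ale_before": ale(1.0, 30), "ale_after": ale(0.6, 30)},
}

# Rank controls by expected loss avoided per unit of budget.
def value_per_cost(c):
    return (c["ale_before"] - c["ale_after"]) / c["cost"]

for name, c in sorted(controls.items(), key=lambda kv: value_per_cost(kv[1]), reverse=True):
    reduction = c["ale_before"] - c["ale_after"]
    print(f"{name}: avoids {reduction:.0f} in expected annual loss "
          f"for a cost of {c['cost']} ({value_per_cost(c):.2f} per unit spent)")
```

With these invented numbers, patch management comes out ahead of content filtering, which mirrors the finding Mezzapelle describes; swapping in real loss data is exactly the actuarial-table gap she identifies.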

Gardner: Perhaps what we need is the equivalent of an Underwriters Laboratories (UL) for permeable organizational IT assets, where the security stamp of approval comes in high or low. Then you could get your insurance insight. Maybe that’s something for The Open Group to look into. Any thoughts about how standards and a consortium approach would come into that?

Hietala: I don’t know about the UL for all security things. That sounds like a risky proposition.

Gardner: It could be fairly popular and remunerative.

Hietala: It could.

Mezzapelle: An unending job.

Hietala: I will say we have one active project in the Security Forum that is looking at trying to allow organizations to measure and understand risk dependencies that they inherit from other organizations.

So if I’m outsourcing a function to XYZ corporation, being able to measure what risk am I inheriting from them by virtue of them doing some IT processing for me, could be a Cloud provider or it could be somebody doing a business process for me, whatever. So there’s work going on there.

I heard just last week about an NSF-funded project here in the U.S. to do the same sort of thing, to look at trying to measure risk in a predictable way. So there are things going on out there.
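A very simple way to picture the inherited-risk measurement Hietala describes is to treat each outsourced provider as a source of incident probability and roll them up into one figure. This is a toy model that assumes the providers fail independently; the provider names and probabilities are invented for illustration:

```python
# Toy model of risk inherited from outsourced providers. Each provider
# carries an estimated annual probability of a security incident; the
# consuming organization's inherited exposure is the probability that
# at least one provider has an incident. All figures are invented.

providers = {
    "cloud IaaS provider": 0.05,  # annual incident probability
    "payroll BPO":         0.02,
    "email SaaS":          0.03,
}

def inherited_incident_probability(probs):
    """P(at least one incident), assuming providers fail independently."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

p = inherited_incident_probability(providers.values())
print(f"Inherited annual incident probability: {p:.3f}")
```

The independence assumption is the weak point in practice (a shared upstream dependency correlates failures), which is one reason measuring these dependencies is an open research problem, as the panel notes.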

Gardner: We have to wrap up, I’m afraid, but Stuart, it seems as if currently it’s the larger public Cloud providers, the likes of Amazon and Google among others, that might be playing the role of all of these entities we are talking about. They are their own self-insurer. They are their own underwriter. They are their own risk assessor, like a UL. Do you think that’s going to continue to be the case?

Boardman: No, I think that as Cloud adoption increases, you will have a greater weight of consumer organizations who will need to do that themselves. It’s not just responsibility; it’s also accountability. At the end of the day, you’re always accountable for the data that you hold. It doesn’t matter where you put it or how many other parties it gets subcontracted out to.

The weight will change

So there’s a need to have that, and as the adoption increases, there’s less fear and more, “Let’s do something about it.” Then, I think the weight will change.

Plus, of course, there are other parties coming into this world, the world that Amazon has created. I’d imagine that HP is probably one of them as well, but all the big names in IT are moving in here, and I suspect that also for those companies there’s a differentiator in knowing how to do this properly in their history of enterprise involvement.

So yeah, I think it will change. That’s no offense to Amazon, etc. I just think that the balance is going to change.

Gilmour: Yes. I think that’s how it has to go. The question that then arises is, who is going to police the policeman and how is that going to happen? Every company is going to be using the Cloud. Even the Cloud suppliers are using the Cloud. So how is it going to work? It’s one of these ever-decreasing circles.

Mezzapelle: At this point, I think it’s going to be more evolution than revolution, but I’m also one of the people who’ve been in that part of the business — IT services — for the last 20 years and have seen it morph in a little bit different way.

Stuart is right that there’s going to be a convergence of the consumer-driven, cloud-based model, which Amazon and Google represent, with an enterprise approach that corporations like HP are representing. It’s somewhere in the middle where we can bring the service level commitments, the options for security, the options for other things that make it more reliable and risk-averse for large corporations to take advantage of it.

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.

1 Comment

Filed under Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Security Architecture

San Francisco Conference Day 2 – Enterprise Transformation: The New Role of Open Standards

By The Open Group Conference Team

The Open Group Conference in San Francisco has brought together a plenary of speakers from across the globe and disciplines. While their perspectives on enterprise architecture differ, most seem to agree that enterprise transformation is gaining momentum within the enterprise architecture community. During Day Two of the Conference in San Francisco, a number of speakers continued the discussion, examining the role that standards play in the process of fundamentally changing the enterprise.

The New Role of Open Standards

Allen Brown, President and CEO of The Open Group, set the tone for the day during his opening address, providing an overview of enterprise transformation and the role that enterprise architecture and open standards have in shaping the future.

“It’s a journey, not an event,” stated Brown. He also reinforced that enterprise transformation is not just about reducing costs – it’s about improving capabilities, functionality and communication.

In addition to highlighting the tremendous accomplishments of its over 400 member organizations, Brown showcased a number of case studies from a wide range of global enterprises that are leveraging enterprise architecture (EA). For example:

  • University Health Network in Ontario is utilizing EA as a solution for improving the quality of healthcare without increasing the cost
  • Caja Madrid relies on EA to improve the bank’s capabilities while reducing its vulnerabilities and the cost of those vulnerabilities
  • SASOL, an integrated energy company in South Africa, is utilizing EA to improve the organization’s function while reducing cost
  • Cisco is utilizing EA as it provides a common language for cross functional communication

Brown also mentioned the release of a new open standard from the FACE Consortium, which is transforming the avionics industry. According to Capt. Tracy Barkhimer, program manager for the Air Combat Electronics Program Office (PMA-209), the new standard “is quite possibly the most important innovation in Naval aviation since computers were first incorporated into airplanes. This will truly pave the way for the future.”

An Architecture-based Approach

The next plenary speaker was Bill Rouse, the Executive Director of the Tennenbaum Institute at the Georgia Institute of Technology and a professor in the College of Computing and School of Industrial and Systems Engineering. His research focuses on understanding and managing complex public-private systems such as healthcare, energy and defense, with emphasis on mathematical and computational modeling of these systems for the purpose of policy design and analysis.

Rouse posed the notion: you can be the innovator or the transformer.

Of course all businesses want to be the former. So how is architecture involved? According to Rouse, architectures are transformative by nature, enabling evidence-based decision making by looking at an enterprise’s operational systems, technical levels and socio-technical architectures. However, as he pointed out: “You have to be willing to change.”

Building a Roadmap to Solve the Problem

Tim Barnes, Chief Architect at Devon Energy, one of North America’s leading independent producers of oil and natural gas, shared his hands-on experience with enterprise architecture and the keys to the company’s success. After experiencing profound growth between 1998 and 2010, the company needed to simplify its systems to eliminate barriers that were impacting business growth and driving excessive IT costs. Barnes was chartered by Devon to develop an EA discipline for the company and leverage the EA process to reduce unnecessary complexity, help streamline the business and lower IT costs.

The Cyber Threat

Rounding out the lineup of plenary speakers was Joseph Menn, a renowned journalist in the area of cyber security and the author of Fatal System Error: The Hunt for the New Crime Lords Who are Bringing Down the Internet.

When it comes to cybercrime and security, “no one is telling us how bad it really is,” said Menn. After providing a few fear-provoking examples, and stressing that the Stuxnet affair is just a small example of things to come, Menn made it clear that government will only provide a certain level of protection – enterprises must take action to protect themselves and their intellectual property.

Comments Off

Filed under Certifications, Cybersecurity, Enterprise Architecture, Enterprise Transformation, FACE™, Standards

Cloud Interoperability and Portability Project Findings to be Showcased in San Francisco

By Mark Skilton, Capgemini

Over the past year, The Open Group has been conducting a project to assess the current state of interoperability and portability in Cloud Computing. The findings from this work will be presented at The Open Group San Francisco Conference on Wednesday, February 1 by Mark Skilton (Capgemini), Kapil Bakshi (Cisco) and Chris Harding (The Open Group) – co-chairs and members of the project team.

The work has surveyed the current range of international standards development impacting interoperability. The project then developed a set of proposed architectural reference models targeting data, application, platform, infrastructure and environment portability and interoperability for Cloud ecosystems and connectivity to non-Cloud environments.

The Open Group plans to showcase the current findings and proposed areas of development using the organization’s own international architecture standards models, and is also exploring the possibility of promoting work in this area with other leading standards bodies.

If you’re interested in learning more about this project and if you’re at the San Francisco Conference, please come to the session, “The Benefits, Challenges and Survey of Cloud Computing Interoperability and Portability” on Wednesday, February 1 at 4:00 p.m.

Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

Comments Off

Filed under Cloud, Semantic Interoperability, Standards

What’s New in ArchiMate 2.0?

By Andrew Josey, The Open Group, Henry Franken, BiZZdesign

ArchiMate® 2.0, an Open Group Standard, is an upwards-compatible evolution of ArchiMate 1.0, adding new features as well as addressing usage feedback and comments.

The ArchiMate 2.0 standard supports modeling throughout the TOGAF Architecture Development Method (ADM).

Figure 1: Correspondence between ArchiMate and the TOGAF ADM

ArchiMate 2.0 consists of:

  • The ArchiMate Core, which contains several minor improvements on the 1.0 version.
  • The Motivation extension, to model stakeholders, drivers for change, business goals, principles, and requirements. This extension mainly addresses the needs in the early TOGAF phases and the requirements management process.
  • The Implementation and Migration extension, to support project portfolio management, gap analysis, and transition and migration planning. This extension mainly addresses the needs in the later phases of the TOGAF ADM cycle.

ArchiMate 2.0 offers a modeling language to create fully integrated models of the organization’s enterprise architecture, the motivation for the enterprise architecture, and the programs, projects and migration paths to implement this enterprise architecture. In this way, full (forward and backward) traceability between the elements in the enterprise architecture, their motivations and their implementation can be obtained.
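The forward and backward traceability described above can be sketched in a few lines of code: treat model elements as nodes and ArchiMate-style relationships as directed edges, then walk the graph in both directions. This is plain Python for illustration, not ArchiMate notation, and the element names are invented:

```python
# Toy traceability graph: (source, target) pairs standing in for
# ArchiMate-style relationships between motivation, core, and
# implementation/migration elements. Element names are invented.

relations = [
    ("Goal: reduce IT cost", "Requirement: consolidate servers"),
    ("Requirement: consolidate servers", "Plateau: virtualized data center"),
    ("Plateau: virtualized data center", "Work Package: migration project"),
]

def forward(element):
    """Everything that transitively follows from an element."""
    out = set()
    for src, dst in relations:
        if src == element:
            out |= {dst} | forward(dst)
    return out

def backward(element):
    """Everything an element transitively traces back to."""
    out = set()
    for src, dst in relations:
        if dst == element:
            out |= {src} | backward(src)
    return out
```

With this, `forward("Goal: reduce IT cost")` answers “what realizes this goal?” and `backward("Work Package: migration project")` answers “why does this project exist?” — the two directions of traceability the integrated model provides.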

In the ArchiMate Core, a large number of minor improvements have been made compared to ArchiMate 1.0: inconsistencies have been removed, examples have been improved and additional text has been inserted to clarify certain aspects. Two new concepts have been added based on needs experienced by practitioners:

  • Location: To model a conceptual point or extent in space that can be assigned to structural elements and, indirectly, to behavior elements.
  • Infrastructure Function: To model the internal behavior of a node in the technology layer. This makes the technology layer more consistent with the other two layers.

The Motivation extension defines the following concepts:

  • Stakeholder: The role of an individual, team, or organization (or classes thereof) that represents their interests in, or concerns relative to, the outcome of the architecture.
  • Driver: Something that creates, motivates, and fuels the change in an organization.
  • Assessment: The outcome of some analysis of some driver.
  • Goal: An end state that a stakeholder intends to achieve.
  • Requirement: A statement of need that must be realized by a system.
  • Constraint: A restriction on the way in which a system is realized.
  • Principle: A normative property of all systems in a given context or the way in which they are realized.

For motivation elements, a limited set of relationships has been defined, partly re-used from the ArchiMate Core: aggregation (decomposition), realization, and (positive or negative) influence.

The Implementation and Migration extension defines the following concepts (and re-uses the relationships of the Core):

  • Work Package: A series of actions designed to accomplish a unique goal within a specified time.
  • Deliverable: A precisely defined outcome of a work package.
  • Plateau: A relatively stable state of the architecture that exists during a limited period of time.
  • Gap: An outcome of a gap analysis between two plateaus.

ArchiMate 2 Certification

New with ArchiMate 2.0 is the introduction of a certification program. This includes certification for people and accreditation for training courses. It also includes certification for tools supporting the ArchiMate standard.

The ArchiMate 2 Certification for People program enables professionals around the globe to demonstrate their knowledge of the ArchiMate standard. ArchiMate 2 Certification for People is achieved through an examination and practical exercises as part of an Accredited ArchiMate 2 Training Course.

The Open Group Accreditation for ArchiMate training courses provides an authoritative and independent assurance of the quality and relevance of the training courses.

The Open Group ArchiMate Tool Certification Program makes certification available to tools supporting ArchiMate. The goal of the program is to ensure that architecture artifacts created with a certified tool are conformant to the language.

Further Reading

ArchiMate 2.0 is available for online reading and download from The Open Group Bookstore at www.opengroup.org/bookstore/catalog/c118.htm.

A white paper with further details on ArchiMate 2.0 is available to download from The Open Group Bookstore at www.opengroup.org/bookstore/catalog/w121.htm.

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Henry Franken is the managing director of BiZZdesign and is chair of The Open Group ArchiMate Forum. As chair of The Open Group ArchiMate Forum, Henry led the development of the ArchiMate Version 2.0 standard. Henry is a speaker at many conferences and has co-authored several international publications and Open Group White Papers. Henry is co-founder of the BPM-Forum. At BiZZdesign, Henry is responsible for research and innovation.

Comments Off

Filed under ArchiMate®, Business Architecture, Enterprise Architecture, Standards, TOGAF, TOGAF®

2012 Open Group Predictions, Vol. 2

By The Open Group

Continuing on the theme of predictions, here are a few more, which focus on enterprise architecture, business architecture, general IT and Open Group events in 2012.

Enterprise Architecture – The Industry

By Leonard Fehskens, VP of Skills and Capabilities

Looking back at 2011 and looking forward to 2012, I see growing stress within the EA community as both the demands being placed on it and the diversity of opinions within it increase. While this stress is not likely to fracture the community, it is going to make it much more difficult for both enterprise architects and the communities they serve to make sense of EA in general, and its value proposition in particular.

As I predicted around this time last year, the conventional wisdom about EA continues to spin its wheels. At the same time, there has been a bit more progress at the leading edge than I had expected or hoped for. The net effect is that the gap between the conventional wisdom and the leading edge has widened. I expect this to continue through the next year, as progress at the leading edge is something like a snowball rolling downhill, and newcomers to the discipline will pronounce that it’s obvious the Earth is both flat and the center of the universe.

What I had not expected is the vigor with which the loosely defined concept of business architecture has been adopted as the answer to the vexing challenge of “business/IT alignment.” The big idea seems to be that the enterprise comprises “the business” and IT, and enterprise architecture comprises business architecture and IT architecture. We already know how to do the IT part, so if we can just figure out the business part, we’ll finally have EA down to a science. What’s troubling is how much of the EA community does not see this as an inherently IT-centric perspective that will not win over the “business community.” The key to a truly enterprise-centric concept of EA lies inside that black box labeled “the business” – a black box that accounts for 95% or more of the enterprise.

As if to compensate for this entrenched IT-centric perspective, the EA community has lately adopted the mantra of “enterprise transformation”, a dangerous strategy that risks promising even more when far too many EA efforts have been unable to deliver on the promises they have already made.

At the same time, there is a growing interest in professionalizing the discipline, exemplified by the membership of the Association of Enterprise Architects (AEA) passing 20,000, TOGAF® 9 certifications passing 10,000, and the formation of the Federation of Enterprise Architecture Professional Organizations (FEAPO). The challenge that we face in 2012 and beyond is bringing order to the increasing chaos that characterizes the EA space. The biggest question looming seems to be whether this should be driven by IT. If so, will we be honest about this IT focus and will the potential for EA to become a truly enterprise-wide capability be realized?

Enterprise Architecture – The Profession

By Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects

It’s an exciting time for enterprise architecture, both as an industry and as a profession. There are an abundance of trends in EA, but I wanted to focus on three that have emerged and will continue to evolve in 2012 and beyond.

  • A Defined Career Path for Enterprise Architects: Today, there is no clear career path for the enterprise architect. I’ve heard this from college students, IT and business professionals and current EAs. Up until now, the skills necessary to succeed and the roles within an organization that an EA can and should fill have not been defined. It’s imperative that we determine the skill sets EAs need and the path for EAs to acquire these skills in a linear progression throughout their career. Expect this topic to become top priority in 2012.
  • Continued EA Certification Adoption: Certification will continue to grow as EAs seek ways to differentiate themselves within the industry and to employers. Certifications and memberships through professional bodies such as the Association of Enterprise Architects will offer value to members and employers alike by identifying competent and capable architects. This growth will also be supported by EA certification adoption in emerging markets like India and China, as those countries continue to explore ways to build value and quality for current and prospective clients, and to establish more international credibility.
  • Greater Involvement from the Business: As IT investments become business driven, business executives controlling corporate strategy will need to become more involved in EA and eventually drive the process. Business executive involvement will be especially helpful when outsourcing IT processes, such as Cloud Computing. Expect to see greater interest from executives and business schools that will implement coursework and training to reflect this shift, as well as increased discussion on the value of business architecture.

Business Architecture – Part 2

By Kevin Daley, IBM and Vice-Chair of The Open Group Business Forum

Several key technologies have reached a tipping point in 2011 that will move them out of the investigation and validation by enterprise architects and into the domain of strategy and realization for business architects. Five areas where business architects will be called upon for participation and effort in 2012 are related to:

  • Cloud: This increasingly adopted and disruptive technology will help increase the speed of development and change. The business architect will be called upon to ensure the strategic relevancy of transformation in a repeatable fashion as cycle times and rollouts happen faster.
  • Social Networking / Mobile Computing: Prevalent consumer usage, global user adoption and improvements in hardware and security make this a trend that cannot be ignored. The business architect will help develop new strategies as organizations strive for new markets and broader demographic reach.
  • Internet of Things: This concept from 2000 is reaching critical mass as more and more devices become communicative. The business architect will be called on to facilitate the conversation and design efforts between operational efforts and technologies managing the flood of new and usable information.
  • Big Data and Business Intelligence: Massive amounts of previously untapped data are being exposed, analyzed and made insightful and useful. The business architect will be utilized to help contain the complexity of business possibilities while identifying tactical areas where the new insights can be integrated into existing technologies to optimize automation and business process domains.
  • ERP Resurgence and Smarter Software: Software purchasing looks to continue its 2011 trend towards broader, more intuitive and feature-rich software and applications.  The business architect will be called upon to identify and help drive getting the maximum amount of operational value and output from these platforms to both preserve and extend organizational differentiation.

The State of IT

By Dave Lounsbury, CTO

What will have a profound effect on the IT industry throughout 2012 are the twin horses of mobility and consumerization, both of which are galloping at full tilt within the IT industry right now. Key to these trends are the increased use of personal devices, as well as favorite consumer Cloud services and social networks, which drive a rapidly growing comfort among end users with both data and computational power being everywhere. This comfort brings a level of expectations to end users who will increasingly want to control how they access and use their data, and with what devices. The expectation of control and access will be increasingly brought from home to the workplace.

This has profound implications for core IT organizations. There will be less reliance on core IT services, and with that an increased expectation of “I’ll buy the services, you show me how to knit them in” as the prevalent user approach to IT – thus requiring increased attention to standards conformance. IT departments will change from being the only service providers within organizations to being a guiding force when it comes to core business processes, with IT budgets being impacted. I see a rapid tipping point in this direction in 2012.

What does this mean for corporate data? The matters of scale that have been a part of IT—the overarching need for good architecture, security, standards and governance—will now apply to a wide range of users and their devices and services. Security issues will loom larger. Data, apps and hardware are coming from everywhere, and companies will need to develop criteria for knowing whether systems are robust, secure and trustworthy. Governments worldwide will take a close look at this in 2012, but industry must take the lead to keep up with the pace of technology evolution, such as The Open Group and its members have done with the OTTF standard.

Open Group Events in 2012

By Patty Donovan, VP of Membership and Events

In 2012, we will continue to connect with members globally through all mediums available to us – our quarterly conferences, virtual and regional events and social media. Through coordination with our local partners in Brazil, China, France, Japan, South Africa, Sweden, Turkey and the United Arab Emirates, we’ve been able to increase our global footprint and connect members and non-members who may not have been able to attend the quarterly conferences with the issues facing today’s IT professionals. These events, in conjunction with our efforts in social media, have led to a rise in member participation and helped further develop The Open Group community, and we hope to have continued growth in the coming year and beyond.

We’re always open to new suggestions, so if you have a creative idea on how to connect members, please let me know! Also, please be sure to attend the upcoming Open Group Conference in San Francisco, which is taking place on January 30 through February 3. The conference will address enterprise transformation as well as other key issues in 2012 and beyond.

9 Comments

Filed under Business Architecture, Cloud, Cloud/SOA, Data management, Enterprise Architecture, Semantic Interoperability, Standards

Save the Date—The Open Group Conference San Francisco!

By Patty Donovan, The Open Group

It’s that time again to start thinking ahead to The Open Group’s first conference of 2012 to be held in San Francisco, January 30 – February 3, 2012. Not only do we have a great venue for the event, the Intercontinental Mark Hopkins (home of the famous “Top of the Mark” sky lounge—with amazing views of all of San Francisco!), but we have stellar line up for our winter conference centered on the theme of Enterprise Transformation.

Enterprise Transformation is a theme that is increasingly being used by organizations of all types to represent the change processes they implement in response to internal and external business drivers. Enterprise Architecture (EA) can be a means to Enterprise Transformation, but in most enterprises today it is not, because EA is still largely limited to the IT department and transformation must go beyond the IT department to be successful. The San Francisco conference will focus on the role that both IT and EA can play within the Enterprise Transformation process, including the following:

  • The differences between EA and Enterprise Transformation and how they relate to one another
  • The use of EA to facilitate Enterprise Transformation
  • How EA can be used to create a foundation for Enterprise Transformation that the Board and business-line managers can understand and use to their advantage
  • How EA facilitates transformation within IT, and how such transformation supports the transformation of the enterprise as a whole
  • How EA can help the enterprise successfully adapt to “disruptive technologies” such as Cloud Computing and ubiquitous mobile access

In addition, we will be featuring a line-up of keynotes by some of the top industry leaders to discuss Enterprise Transformation, as well as themes around our regular tracks of Enterprise Architecture and Professional Certification, Cloud Computing and Cybersecurity. Keynoting at the conference will be:

  • Joseph Menn, author and cybersecurity correspondent for the Financial Times (Keynote: What You’re Up Against: Mobsters, Nation-States and Blurry Lines)
  • Celso Guiotoko, Corporate Vice President and CIO, Nissan Motor Co., Ltd. (Keynote: How Enterprise Architecture is helping NISSAN IT Transformation)
  • Jeanne W. Ross, Director & Principal Research Scientist, MIT Center for Information Systems Research (Keynote: The Enterprise Architect: Architecting Business Success)
  • Lauren C. States, Vice President & Chief Technology Officer, Cloud Computing and Growth Initiatives, IBM Corp. (Keynote: Making Business Drive IT Transformation Through Enterprise Architecture)
  • Andy Mulholland, Chief Global Technical Officer, Capgemini (Keynote: The Transformed Enterprise)
  • William Rouse, Executive Director, Tennenbaum Institute at Georgia Institute of Technology (Keynote: Enterprise Transformation: An Architecture-Based Approach)

For more on the conference tracks or to register, please visit our conference registration page. And stay tuned throughout the next month for more sneak peeks leading up to The Open Group Conference San Francisco!

1 Comment

Filed under Cloud, Cloud/SOA, Cybersecurity, Data management, Enterprise Architecture, Semantic Interoperability, Standards

The Open Group Surpasses 400 Member Milestone

By Allen Brown, The Open Group

I’m pleased to announce The Open Group has recently surpassed the 400 member mark. Reaching this milestone is a true testament to the commitment our members and staff have made to promoting open standards over the past 25 years.

The Open Group’s strategy has been shaped by IT users through the development of open, vendor-neutral standards and certifications. Today’s milestone validates that this strategy is continuing to resonate, particularly with global organizations that demand greater interoperability, trusted ways to architect their information systems and qualified IT people to lead the effort.

Our members continue to collaborate on developing long-term, globally accepted solutions to the most critical IT issues facing business today. Some of the work areas include Enterprise Architecture, Cloud Computing, real-time and embedded systems, operating platforms, semantic interoperability and cybersecurity, to name a few. The members’ leadership around these issues is increasingly global through a larger roster of regional events and local offices now based in China, France, Japan, South Africa, South America, Sweden, Turkey, the United Arab Emirates, the UK and US. As a result, we now have more than 30,000 individual members participating from 400 global organizations in more than 85 countries worldwide.

This is a great milestone to end the year on, and we’re looking forward to celebrating more occasions like it resulting from the members’ hard work and contributions in 2012.

2 Comments

Filed under Enterprise Transformation, Semantic Interoperability, Standards

How EA is leading enterprise transformation in France

By Eric Boulay, The Open Group France

Earlier this week, in Paris, The Open Group France held the latest in a series of one-day conferences focused on Enterprise Architecture. As usual, the event delivered high-value content in the form of an excellent keynote presentation and case studies covering the retail, gambling, and financial industries, including two from CIOs of major French corporations.

2 Comments

Filed under Enterprise Architecture, TOGAF®

PODCAST: Standards effort points to automation via common markup language for improved IT compliance, security

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-O-ACEML Standard Effort Points to Broad Automation for Improved IT Compliance and Security Across Systems

The following is the transcript of a sponsored podcast panel discussion on the new Open Automated Compliance Expert Markup Language (O-ACEML) standard, in conjunction with The Open Group Conference, Austin 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect. Today, we present a sponsored podcast discussion in conjunction with The Open Group Conference in Austin, Texas, the week of July 18, 2011. We’re going to examine the Open Automated Compliance Expert Markup Language (O-ACEML), a newly created standard that helps enterprises automate security compliance across their systems in a consistent and cost-saving manner.

O-ACEML not only helps achieve compliance with applicable regulations, it also delivers major cost savings. From the compliance audit viewpoint, auditors can carry out similarly consistent and more capable audits in less time. Here to help us understand O-ACEML, managing automated security compliance, and how the standard is evolving are our guests. We’re here with Jim Hietala, Vice President of Security at The Open Group. Welcome back, Jim.

Jim Hietala: Thanks, Dana. Glad to be with you.

Gardner: We’re also here with Shawn Mullen. He’s a Power Software Security Architect at IBM. Welcome to the show, Shawn.

Shawn Mullen: Thank you.

Gardner: Let’s start by looking at why this is an issue. Why do O-ACEML at all? Security is such a hot topic, and the ways in which organizations grapple with regulations and compliance are just as pressing, so I assume this has now become an issue that needs some standardization. Let me throw this out to both of you: why are we doing this at all, and what are the problems we need to solve with O-ACEML?

Hietala: One of the things you’ve seen in the last 10 or 12 years, since the compliance regulations have really come to the fore, is that the more regulation there is, the more specific requirements are put down, and the more challenging it is for organizations to manage. Their IT infrastructure needs to be in compliance with whatever regulations impact them, and the cost of doing so has become significant. So anything that can be done to help automate, to drive out cost, and maybe make organizations more effective in complying with the regulations that affect them — whether it’s PCI, HIPAA, or whatever — brings a lot of benefit to large IT organizations. That’s really what drove us to look at adopting a standard in this area.

Gardner: Jim, just for those folks who are coming in as fresh, are we talking about IT security equipment and the compliance around that, or is it about the process of how you do security, or both? What are the boundaries around this effort and what it focuses on?

Manual process

Hietala: It’s both. It’s enabling the compliance of IT devices, specifically around security configuration settings, and to some extent the process. If you look at how people did compliance or managed compliance without a standard like this, without automation, it tended to be a manual process of setting configuration settings and of auditors manually checking on those settings. O-ACEML goes to the heart of trying to automate that process and drive some cost out of the equation.

Gardner: Shawn Mullen, how do you see this in terms of the need? What are the trends or environment that necessitate this?

Mullen: I agree with Jim. This has been going on a while, and we’re seeing it in both classes of customers. On the high end, we would go from customer to customer, and they would each have their own hardening scripts, their own view of what should be hardened. That might conflict with what the compliance organization wanted as far as the settings. This is a standard way of capturing what the compliance organization wants, and it also gives you an easy way to author it and to change it.

If your own corporate security requirements are more stringent, you can easily change the O-ACEML configuration so that it satisfies your more stringent corporate compliance or security policy as well as the regulatory compliance organization, and it gives you an easy way to monitor it, report on it, and see it.

In addition, on the low end, small businesses don’t have the expertise to know how to configure their systems. Quite frankly, they don’t want to be security experts. Here is an easy way to apply an XML file to harden their systems as they need to be hardened, to meet compliance or just to follow good security practice.

Gardner: One of the things that’s jumped out at me as I’ve looked into this is the rapid improvement in terms of cost or return on investment (ROI), almost to the point of being a no-brainer. Help me understand why it is so expensive and inefficient now, when it comes to security equipment audits and regulatory compliance, and what improvement this might therefore bring.

Mullen: One of the things that we’re seeing in the industry is server consolidation. If you have these hundreds, or in large organizations, thousands of systems and you have to manually configure them, it becomes a very daunting task. Because of that, it’s a one-time shot at doing this, and then the monitoring is even more difficult. With O-ACEML, it’s a way of authoring your security policy as it meets compliance or for your own security policy in pushing that out. This allows you to have a single XML and push it onto heterogeneous platforms. Everything is configured securely and consistently and it gives you a very easy way to get the tooling to monitor those systems, so they are configured correctly today. You’re checking them weekly or daily to ensure that they remain in that desired state.
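The periodic monitoring Mullen describes (configure once, then check weekly or daily that systems remain in the desired state) can be sketched as a simple drift check. The policy keys and host data below are illustrative assumptions; in practice the reported settings would come from each endpoint's O-ACEML result artifacts:

```python
# Minimal sketch of periodic compliance monitoring: compare each
# system's actual settings against the desired policy and flag drift.
# Setting names and host inventories are hypothetical.

desired_policy = {"minPasswordLength": 7, "loginRetries": 5}

# Hard-coded here for illustration; normally gathered from endpoints.
reported = {
    "web01": {"minPasswordLength": 7, "loginRetries": 5},
    "db01":  {"minPasswordLength": 6, "loginRetries": 5},
}

def find_drift(desired, reported):
    """Return {host: {setting: (expected, actual)}} for non-compliant hosts."""
    drift = {}
    for host, actual in reported.items():
        bad = {key: (want, actual.get(key))
               for key, want in desired.items()
               if actual.get(key) != want}
        if bad:
            drift[host] = bad
    return drift

print(find_drift(desired_policy, reported))
```

Run daily or weekly, the non-empty result for a host is exactly the evidence an operator needs to remediate and an auditor needs to review.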

Gardner: So it’s important not only to automate, but to be inclusive and comprehensive in the way you do it; otherwise you’re back to a manual process for at least a significant portion, and that portion might leave your compliance issues unaddressed. Is that how it works?

Mullen: We had a very interesting presentation here at The Open Group Conference yesterday. I’ll let Jim provide some of the details on that, but customers are finding the best way they can lower their compliance or their cost of meeting compliance is through automation. If you can automate any part of that compliance process, that’s going to save you time and money. If you can get rid of the manual effort with automation, it greatly reduces your cost.

Gardner: Shawn, do we have any sense in the market what the current costs are, even for something that was as well-known as Sarbanes-Oxley? How impressive, or unfortunately intimidating, are some of these costs?

Cost of compliance

Mullen: There was a very good study presented yesterday. The average annual cost for an organization to be compliant is $3 million. What was also interesting was that the cost of being non-compliant, as they called it, was $9 million.

Hietala: The figures that Shawn was referencing come out of the study by the Ponemon Institute. Larry Ponemon does lots of studies around security risk compliance cost. He authors an annual data breach study that’s pretty widely quoted in the security industry that gets to the cost of data breaches on average for companies.

In the numbers that were presented yesterday, he recently studied 46 very large companies, looking at their cost to be in compliance with the relevant regulations. It’s about $3.5 million a year, and over $9 million for companies that weren’t compliant, which suggests that companies that are actively managing toward compliance are probably a little more efficient than those that aren’t. What O-ACEML has the opportunity to do for those companies that are in compliance is help drive that $3.5 million down to something much less, by automating and taking manual labor out of the process.

Gardner: So it’s a seemingly very worthwhile effort. How do we get to where we are now, Jim, with the standard and where do we need to go? What’s the level of maturity with this?

Hietala: It’s relatively new; it was published just 60 days ago by The Open Group. The actual specification is on The Open Group website. It’s downloadable, and we would encourage both system vendors and platform vendors, as well as folks in the security management or IT-GRC space, to check it out, take a look at it, and think about adopting it as a way to exchange compliance configuration information with platforms.

We want to encourage adoption by as broad a set of vendors as we can, and we think that more adoption by the industry will help make this more available so that end users can take advantage of it.

Gardner: Back to you Shawn. Now that we’ve determined that we’re in the process of creating this, perhaps, you could set the stage for how it works. What takes place with ACEML? People are familiar with markup languages, but how does this now come to bear on this problem around compliance, automation, and security?

Mullen: Let’s take a single rule, and we’ll use a simple case like minimum password length. In PCI, for example, the minimum password length is seven. Under Sarbanes-Oxley, which relies on COBIT, the password length would be eight.

But with O-ACEML XML, it’s very easy to author a rule, and there are three segments to it. The first segment is very human-understandable: you would put something like “password length equals seven.” You can add descriptive text with it, and that’s all you have to author.

Actionable command

When that is pushed down onto a platform or system that’s O-ACEML-aware, the system is able to take that simple ACEML word or directive and map it into an actionable command relevant to that system. When it finds the mapping into the actionable command, it writes it back into the XML. That completes the second phase of the rule. It then executes that command, either to implement the setting or to check the setting.

The result of the command is then written back into the XML. So now the XML for a particular rule has the first part — the high-level directive authored by the compliance organization — how that particular system mapped it into a command, and the result of executing that command in either a setting or a checking format.

Now we have all of the artifacts we need to ensure that the system is configured correctly, and to generate audit reports. So when the auditor comes in we can say, “This is exactly how any particular system is configured and we know it to be consistent, because we can point to any particular system, get the O-ACEML XML and see all the artifacts and generate reports from that.”
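The three-phase rule lifecycle described above (author a high-level directive, let the endpoint map it to a local command, write the result back as an audit artifact) can be sketched roughly as follows. The element and attribute names, and the AIX chsec command, are illustrative assumptions rather than the actual O-ACEML schema; consult the published specification for the real format:

```python
# Illustrative sketch of the three-phase O-ACEML rule lifecycle.
# All names here are hypothetical, not the published schema.
import xml.etree.ElementTree as ET

# Phase 1: the compliance organization authors a high-level directive.
rule = ET.Element("rule")
directive = ET.SubElement(rule, "directive")
directive.set("name", "minPasswordLength")
directive.set("value", "7")
directive.set("description", "PCI DSS minimum password length")

# Phase 2: the endpoint maps the directive to a platform-specific
# command (hypothetical AIX example) and writes it back into the XML.
command = ET.SubElement(rule, "actionable-command")
command.text = "chsec -f /etc/security/user -s default -a minlen=7"

# Phase 3: the endpoint executes the command and records the result,
# completing the audit trail for this rule.
result = ET.SubElement(rule, "result")
result.set("status", "compliant")

print(ET.tostring(rule, encoding="unicode"))
```

After phase 3, the single XML document holds all three artifacts, which is what makes the audit report generation Mullen mentions possible.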

Gardner: Maybe to give a sense of how this works, we can also look at a before-and-after scenario. Maybe you could describe how things are done now, the current standard operating procedure, and then what would be the case after someone implemented a mature O-ACEML deployment.

Mullen: There are similar tools to this, but they don’t all operate exactly the same way. I’ll use an example of BigFix. If I had a particular system, they would offer a way for you to write your own scripts. You would basically be doing what you would do at the end point, but you would be doing it at the BigFix central console. You would write scripts to do the checking. You would be doing all of this work for each of your different platforms, because everyone is a little bit different.

Then you could use BigFix to push the scripts down. They would run, and hopefully you wrote your scripts correctly. You would get results back. What we want to do with ACEML is when you just put the high-level directive down to the system, it understands ACEML and it knows the proper way to do the checking.

What’s interesting about ACEML, and this is one of our differences from, for example, the Security Content Automation Protocol (SCAP), is that instead of the vendor saying, “This is how we do it,” with a repository of how the checking goes and everything like that, you let the endpoint make the determination. The endpoint is aware of what OS it is and what version it is.

For example, with IBM UNIX, which is AIX, you would say “password check at this different level.” We’ve increased our password strength, we’ve done a lot of security enhancements around that. If you push the ACEML to a newer level of AIX, it would do the checking slightly differently. So, it really relies on the platform, the device itself, to understand ACEML and understand how best to do its checking.

We see with small businesses and even some of the larger corporations that they’re maintaining their own scripts. They’re doing everything manually. They’re logging on to a system and running some of those scripts. Or, they’re not running scripts at all, but are manually making all of these settings.

It’s an extremely long and burdensome process when you start considering that there are hundreds or thousands of these systems, running different OSs. You have to find experts for your Linux systems or your HP-UX or AIX. You have to have all of those different talents and skills in these different areas, and again, the process is quite lengthy.

Gardner: Jim Hietala, it sounds like we are focusing on servers to begin with, but I imagine that this could be extended to network devices, other endpoints, other infrastructure. What’s the potential universe of applicability here?

Different classes

Hietala: The way to think about it is the universe of IT devices that are in scope for these various compliance regulations. If you think about PCI DSS, it defines pretty tightly what your cardholder data environment consists of. In terms of O-ACEML, it could be networking devices, servers, storage equipment, or any sort of IT device. Broadly speaking, it could apply to lots of different classes of computing devices.

Gardner: Back to you, Shawn. You mentioned the AIX environment. Could you explain a beginning approach that you’ve had with IBM Compliance Expert, or ICE, that might give us a clue as to how well this could work when applied even more broadly? How did that heritage in ICE develop, and what would that tell us about what we could expect with O-ACEML?

Mullen: We’ve had ICE, the AIX Compliance Expert, using this XML for a number of years now. It’s been broadly used by a lot of our customers, not only to secure AIX but to secure the virtualization environment, in particular the virtual I/O server. So we use it for that.

One of the things that ACEML brings is some of the lessons we learned from doing our own proprietary XML. It also brings some lessons we learned from looking at other compliance XML, like XCCDF. One of the things we put in there was a remediation element.

For example, PCI says that your password length should be seven, and COBIT says it should be eight. With the XML, you can blend multiple compliance requirements into a single policy, choosing the more secure setting, so that two, three, or more compliance organizations’ requirements are all satisfied, and apply that policy to a single system.

One of the things that we’re hoping vendors will gravitate toward is the ability to have a central console controlling, configuring, and monitoring their IT environment by pushing out a single XML file. It doesn’t have to push out a special XML for Linux versus AIX versus a network device. It can push that one ACEML file to all of the devices. It’s a single descriptive XML, and each device, in turn, knows how to map it to its own particular platform’s security configuration.
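As a rough illustration of this per-endpoint mapping, the same high-level directive can be interpreted by each O-ACEML-aware platform in its own terms. The platform names, directive key, and commands below are hypothetical stand-ins, not taken from the specification:

```python
# Sketch of endpoint-side interpretation: one directive is pushed to
# every device, and each platform owns its mapping to a local command.
# Commands shown are illustrative, not authoritative.

directive = {"name": "minPasswordLength", "value": 8}

PLATFORM_MAPPINGS = {
    # Hypothetical AIX mapping via chsec.
    "aix": lambda d: (
        f"chsec -f /etc/security/user -s default -a minlen={d['value']}"
    ),
    # Hypothetical Linux mapping via /etc/login.defs.
    "linux": lambda d: (
        f"sed -i 's/^PASS_MIN_LEN.*/PASS_MIN_LEN {d['value']}/' /etc/login.defs"
    ),
}

def map_directive(platform, directive):
    """Return the platform-specific command for a high-level directive."""
    return PLATFORM_MAPPINGS[platform](directive)

for platform in PLATFORM_MAPPINGS:
    print(platform, "->", map_directive(platform, directive))
```

The design point this illustrates is the one Mullen names: the console stays platform-agnostic, and the knowledge of "how to do it here" lives at the endpoint.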

Gardner: Jim Hietala, it sounds as if the low-hanging fruit here would be the compliance and automation benefit, but it also sounds as if this is comprehensive. It’s targeted at a very large set of the devices and equipment in the IT infrastructure. This could become a way of propagating new security policies, protocols, approaches, even standards, down the line. Is that part of the vision here — to be able to offer a means by which an automated propagation of future security changes could easily take place?

Hietala: Absolutely, and it goes beyond just the compliance regulations that are inflicted on us or put on us by government organizations, to defining a best-practice set of security policies in the organization, and then using this as a mechanism to push those out to your environment and to ensure that they are being followed and implemented on all the devices in your IT environment.

So it definitely goes beyond just managing compliance with these external regulations, to doing a better job of implementing the ideal security configuration settings across your environment.

Gardner: And because this is being done in an open environment like The Open Group, and because it’s inclusive of any folks or vendors or suppliers who want to take part, it sounds as if this could also cross the chasm between the enterprise IT set and the consumer, mobile, or external third-party provider set.

Is it also a possibility that we’re going beyond heterogeneity, when it comes to different platforms, but perhaps crossing boundaries into different segments of IT and what we’re seeing with the “consumerization” of IT now? I’ll ask this to either of you or both of you.

Moving to the Cloud

Hietala: I’ll make a quick comment and then turn it over to Shawn. Definitely, if you think about how this sort of a standard might apply towards services that are built in somebody’s Cloud, you could see using this as a way to both set configuration settings and check on the status of configuration settings and instances of machines that are running in a Cloud environment. Shawn, maybe you want to expand on that?

Mullen: It’s interesting that you brought this up, because this is the exact conversation we had earlier today in one of the plenary sessions. They were talking about moving your IT out into the Cloud. One of the issues, aside from just the security, was how do you prove that you are meeting these compliance requirements?

O-ACEML is a way to reach into the Cloud to find your particular system and bring back a report that you can present to your auditor. Even though you don’t own the system — it’s not in the data center in the next office; it’s off in the Cloud somewhere — you can bring back all the artifacts necessary to prove to the auditor that you are meeting the regulatory requirements.

Gardner: Jim, how do folks take further steps to either gather more information? Obviously, this would probably of interest to enterprises as well as the suppliers, vendors for professional services organizations. What are the next steps? Where can they go to get some information? What should they do to become involved?

Hietala: The standard specification is up on our website. Go to the “Publications” tab, do a search for O-ACEML, and you should find the actual technical standard document. Then you can get involved directly in the Security Forum by joining The Open Group. As the standard evolves, and as we do more with it, we certainly want more members involved in helping to guide its progress over time.

Gardner: Thoughts from you, Shawn, on that same getting involved question?

Mullen: That’s a perfect way to start. We do want to invite the different compliance organizations, everybody from the electrical power grid — they have their own view of security — to ISO, to the payment card industry. For the electrical power grid standard, for example — and ISO is the same way — what ACEML helps them with is that they don’t need to understand how Linux does it or how AIX does it. They don’t need to have that deep understanding.

In fact, the way ISO describes it in their PDF around password settings, it basically says “use good password settings” and doesn’t go into any depth beyond that. The way we architected and designed O-ACEML is that you can just say, “I want good password settings,” and it will default to what we decided. What we focused in on collectively, as an international standard in The Open Group, was that good password hygiene means you change your password every six months, the password carries at least a certain number of characters, and there is a non-alphanumeric character.

It removes the burden on these different compliance groups of being security experts and lets them just use ACEML and the default settings that The Open Group came up with. We want to reach out to those groups and show them the benefits of publishing some of their security standards in O-ACEML. Beyond that, we’ll work with them to get the standard up, and hopefully they can publish it on their website, or maybe we can publish it on The Open Group website.

Next milestones

Gardner: Well, great. We’ve been learning more about the Open Automated Compliance Expert Markup Language, more commonly known as O-ACEML. We’ve seen how it can help assure compliance with applicable regulations across different types of equipment, and how it has the opportunity to provide more security across different domains — be that Cloud, on-premises, or even partner networks — while also achieving major cost savings. We’ve also been learning how to get started and what the maturity timeline is.

Jim Hietala, what would be the next milestone? What should people expect next in terms of how this is being rolled out?

Hietala: You’ll see more from us in terms of adoption of the standard. We’re already looking at case studies and so forth, to describe in terms that everyone can understand what benefits organizations are seeing from using O-ACEML. Given the environment we’re in today, we’re reading about security breaches and hacktivism every day in the newspapers.

I think we can expect to see more regulation and more frequent revisions of regulations and standards affecting IT organizations and their security, which really makes it imperative to engineer your IT environment in such a way that you can accommodate those changes as they are brought to your organization, do so effectively, and at the least cost. Those are really the kinds of things O-ACEML has targeted, and I think there is a lot of benefit to organizations in using it.

Gardner: Shawn, one more question as a follow-up to what Jim said: not only should we expect more regulations, but we’ll see them coming from different governments and different strata of government — state, local, federal perhaps. For a multinational organization, this could be a very complex undertaking, so I’m curious whether O-ACEML could also help when it comes to managing multiple regulations across multiple jurisdictions for larger organizations.

Mullen: That was the goal when we came up with O-ACEML. Anybody can author it, and again, if a single system falls under the purview of multiple compliance requirements, we can blend them together and that system can satisfy them all. It’s an international standard, and we want it to be used by multiple compliance organizations. And compliance is a good thing; it’s just good IT governance. It will save companies money in the long run, as we saw in those statistics. The goal is to lower the cost of being compliant, so you get good IT governance at a lower cost.

Gardner: Thanks. This sponsored podcast is coming to you in conjunction with The Open Group Conference in Austin, Texas, in the week of July 18, 2011. Thanks to both our guests. Jim Hietala, the Vice President of Security at The Open Group. Thank you, Jim.

Hietala: Thank you, Dana.

Gardner: And also Shawn Mullen, Power Software Security Architect at IBM. Thank you, Shawn.

Mullen: Thank you, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com.

Copyright The Open Group 2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect™ blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.

3 Comments

Filed under Cybersecurity

The Open Group releases O-ACEML standard, automates compliance configuration

By Jim Hietala, The Open Group

The Open Group recently published the Open Automated Compliance Expert Markup Language (O-ACEML) standard. This new technical standard addresses the need to automate the process of configuring IT environments to meet compliance requirements. O-ACEML will also enable customer organizations and their auditors to streamline data gathering and reporting on compliance postures.

O-ACEML is aimed at helping organizations reduce the cost of compliance by easing manual compliance processes. The standard is an open, simple, and well-defined XML schema that allows compliance requirements to be described in machine-understandable XML, rather than requiring humans to interpret text from documents. The standard also allows for a remediation element, which enables multiple requirements (from different compliance regulations) to be blended into a single policy. An example of where this is needed would be in password length and complexity requirements, which may differ between regulations. O-ACEML allows for the most secure setting to be selected and applied, enabling all of the regulations to be met or exceeded.
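The blending idea above can be illustrated with a minimal sketch, under the assumption that for a numeric setting such as password length the larger value is the stricter one; the regulation names and setting key are illustrative:

```python
# Minimal sketch of blending per-regulation requirements into a single
# policy by keeping the strictest (here, largest) value per setting.
# Regulation names and setting keys are illustrative.

requirements = {
    "PCI DSS":     {"minPasswordLength": 7},
    "SOX (COBIT)": {"minPasswordLength": 8},
}

def blend(requirements):
    """Merge per-regulation settings, keeping the strictest value of each."""
    merged = {}
    for regulation, settings in requirements.items():
        for key, value in settings.items():
            merged[key] = max(merged.get(key, value), value)
    return merged

policy = blend(requirements)
print(policy)  # {'minPasswordLength': 8}, which meets or exceeds both
```

Note the "larger is stricter" rule is an assumption that holds for minimum-length settings; other setting types (e.g. maximum login retries) would need the opposite comparison.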

O-ACEML is intended to allow platform vendors and compliance management and IT-GRC providers to utilize a common language for exchanging compliance information. The existence of a single common standard will benefit platform vendors and compliance management tool vendors, by reducing development costs and providing a single data interchange format. Customer organizations will benefit by reducing costs for managing compliance in complex IT environments, and by increasing effectiveness. Where previously organizations might have just polled a small but representative sample of their environment to assess compliance, the existence of a standard allowing automated compliance checking makes it feasible to survey the entire environment rather than just a small sample. Organizations publishing government compliance regulations, as well as the de facto standard compliance organizations that have emerged in many industries will benefit by enabling more cost effective adoption and simpler compliance with their regulations and standards.

In terms of how O-ACEML relates to other compliance-related standards and content frameworks, it has similarities to and differences from NIST’s Security Content Automation Protocol (SCAP) and the Unified Compliance Framework (UCF). One of the main differences is that O-ACEML was architected so that a compliance organization could author its IT security requirements in a high-level language, without the need to understand the specific configuration commands and settings an OS or device will use to implement the requirement. A distinguishing capability of O-ACEML is that it gathers artifacts as it moves from the compliance organization’s directive, to the implementation on a particular device, to the result of the configuration command. The final step of this automation not only produces a computer system configured to meet or exceed the compliance requirements; it also produces an XML document from which compliance reporting can be simplified. The Open Group plans to work with NIST and the creators of the UCF to ensure interoperability and integration between O-ACEML, SCAP and UCF.

If you have responsibility for managing compliance in your organization, or if you are a vendor whose software product involves compliance or security configuration management, we invite you to learn more about O-ACEML.

An IT security industry veteran, Jim Hietala is Vice President of Security at The Open Group, where he is responsible for security programs and standards activities. He holds the CISSP and GSEC certifications. Jim is based in the U.S.

8 Comments

Filed under Cybersecurity, Standards

Announcing our new website: Building awareness of The Open Group’s standards and certifications

By Patricia Donovan, The Open Group

Those who have already visited The Open Group website today may have noticed it has a new appearance. And if you haven’t, please visit it now!

Yes, we've refined the design and encapsulated the information accumulated over the years into an easily digestible and navigable site. But the real change is in how we use it as a business tool. In many ways, our new website is an extension of the mission we set for ourselves nearly 25 years ago: to drive the creation of Boundaryless Information Flow™ by giving people access to the information they need most, in the way they expect to find it.

You may recall that in 2010 we sent out surveys asking what you, our members, find important and which features and activities you value, as well as your thoughts on compelling images, colors and other visuals. The new website, and some of the other communications you are now seeing from The Open Group, are a direct result of your input.

The new website is easier to scan, read and navigate, enabling visitors to find what they need quickly. Just as importantly, our key messages and value propositions are evident and clear. We are confident that our new web presence will improve The Open Group’s visibility and reputation as the global thought leader in the development of open, vendor-neutral standards and certifications — which will increase awareness for the valuable work done by the members who make up The Open Group Forums and Work Groups.

Additionally, the foundation has been laid to make the website a more agile, more interactive, Web 2.0 site — a tool that evolves organically, enables us to add features we were unable to offer previously, and allows us to meet your needs in real time.

I hope you will visit the new website at the same address, www.opengroup.org, and acquaint yourself with the new site. We’re quite proud of it, but we know there’s still work to do beyond today’s launch. In the coming months, we hope to continue improving the site so that it best serves you, our members.

In the meantime, please note that some pages you may have previously bookmarked may no longer work and will need to be bookmarked again; for a time, you'll still be able to access material on our former site. Finally, please send any web feedback to webfeedback@opengroup.org.

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the US.

2 Comments

Filed under Uncategorized

PODCAST: Embracing EA and TOGAF® aids companies in improving innovation, market response and governance

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-How to Leverage Advanced TOGAF 9 Use for Business Benefits

The following is the transcript of a sponsored podcast panel discussion on how to leverage advanced concepts in TOGAF® for business benefits, in conjunction with The Open Group Conference, Austin 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion in conjunction with the latest Open Group Conference in Austin, Texas, the week of July 18, 2011. We've assembled a panel to examine the maturing use of TOGAF, The Open Group Architecture Framework, and how enterprise architects and business leaders are advancing and exploiting the latest version, Version 9. We'll further explore how the full embrace of TOGAF, its principles and methodologies, is benefiting companies in their pursuit of improved innovation, responsiveness to markets, and operational governance. Is enterprise architecture (EA) joining other business transformation agents as part of a larger, extended strategic value? How? And what exactly are the best practitioners of TOGAF getting for their efforts in terms of business achievements?

Here with us to delve into advanced use and expanded benefits of EA frameworks and what that’s doing for their user organizations is Chris Forde, Vice President of Enterprise Architecture and Membership Capabilities for The Open Group, based in Shanghai. Welcome, Chris.

Chris Forde: Good morning, Dana.

Gardner: We’re also here with Jason Uppal. He is the Chief Architect at QR Systems, based in Toronto. Welcome, Jason.

Jason Uppal: Thank you, Dana.

Gardner: Jason, let’s cut to the quick here. We hear an awful lot about architecture. We hear about planning and methodologies, putting the right frameworks in place, but is TOGAF having an impact on the bottom line in the organizations that are employing it?

Uppal: One of the things about a framework like TOGAF is that, on the outside, it's just a framework. At the same time, when you apply it along with the other disciplines, it makes a big difference in the organization, partly because it allows the IT organization to step up to the plate within the enterprise as a whole and ask how it can exploit the current assets it already has, and secondly, how to make sure the new assets it brings into the organization are aligned to the business needs.

One example of where EA has had a huge impact in many of the organizations I have experience with is that, with key EA methods, we're able to capture the innovation that exists in the organization and make that innovation real, as opposed to leaving it as suggestions thrown in a box that nobody ever sees.

Gardner: What is it about capturing that innovation that gets us to something we can measure in terms of an achievable bottom-line benefit?

Evolve over time

Uppal: Say you define an end-to-end process using the Architecture Development Method (ADM) in TOGAF. What it does is give me a way to capture that innovation at the lowest level and then evolve it over time. The people who were part of the innovation at the beginning see their innovation or idea progressing through the organization, as the innovation gets aligned to value statements, the value statements get aligned to capabilities, and the capabilities to strategies and projects, and so on through to delivery.

Therefore, if I make a suggestion of some sort, that innovation or idea is seen throughout the organization through methods like the ADM, and the linkage is explicit and very visible to people. They feel comfortable that their ideas are going somewhere, not just getting stuck.

Forde: There's an additional point here, Dana, to underscore the answer Jason gave to your question. In the end result, what you want to see out of your architecture program is movement in the KPIs for the business, the business levers the business expects to be moved. Whether that relates to cost reduction or to top-line numbers or whatever, that explicit linkage through to the business levers in an architecture program is critical.

Gardner: Chris, you have a good view on the global markets and the variability of goals here. Many companies are looking to either cut cost or improve productivity. Others are looking to expand. Others are looking to manage how to keep operations afloat. There are so many different variables. How do things like the TOGAF 9 and EA have a common benefit to all of those various pursuits? What is the common denominator that makes EA so powerful?

Forde: Going back to the framework reference, what we have with TOGAF 9 is a number of assets, but primarily it’s a tool that’s available to be customized, and it’s expected to be customized.

If you come to the toolset with a problem, you need to focus the framework on the area that's going to help you get rapid value in solving your particular problem set. Once you are in that particular space, you can look at migrating out from that entry point, if that's the approach, to expanding your use of the framework, the methods, and the capabilities that are implicit and explicit in the framework to address other areas.

You can start at the top and work your way down through the framework, from this kind of über value proposition right down through delivery to the departmental level or whatever. Or you can come in at the bottom, in the infrastructure layer in IT for example, and work your way up. Or you can come in at the middle. The question is what is impeding your company's growth or your department's growth, if those are the issues facing you.

One of the reasons that this framework is so useful in so many different dimensions is that it is a framework. It’s designed to be customized, and is applicable to many different problems.

Gardner: Back to you, Jason. When we think about a beginning effort, perhaps a crawl-walk-run approach to EA and TOGAF, the promise is that further development, advancement, understanding, and implementation will lead to larger, more strategic goals.

Let’s define what it means to get to that level of maturity. When we think about an advanced user of TOGAF, what does that mean? Then, we’ll get into how they can then leverage that to further goals. But, what do we really mean by an advanced user at this point?

Advanced user

Uppal: When we think about an advanced user, in our practice we look at it from different points of view and ask what value we're delivering to the organization. It could very well be delivering value to a CTO in the organization. That is not to say it isn't an advanced user just because it's strictly focused on technology.

But then, the CTO focus allows us to concentrate on the current assets deployed in the organization and how to get the most out of them. That's an advanced user: someone who can figure out how to standardize and scale those assets so they become reusable across the organization. As we move up the food chain from a very technology-centric view to a more optimized and transformed scale, the advanced user at that point has a framework like TOGAF, with all these tools, in their back pocket.

Now, depending on the stakeholder they're working with, be that a CEO, a CFO, or a junior manager in a line of business, I can focus them on defining a specific capability they are working toward and create transition roadmaps. Once those transition roadmaps are established, I can drive them through. An advanced user in the organization is somebody who has all these tools and frameworks available to them, but at the same time is very focused on a specific value delivery point within their scope.

One beauty of TOGAF is that, because we get to define what the enterprise is and we are not told we have to interview the CEO on day one, I can define an enterprise from a manager's point of view or a CFO's point of view and work within that frame. That, to me, is an advanced user.

Gardner: When we talk about applied architecture, what does that mean? How is it that we move from concept into execution?

Uppal: The frameworks we have are well thought out. That moves the conversation away from the framework debate and very quickly into what we do with the framework. With a framework like TOGAF, if I want to apply it now, I have an executive who has defined a business strategy, which typically is a two-page PowerPoint presentation, sometimes accompanied by an Excel spreadsheet. That's a good starting point for an enterprise architect. Now I use methods like TOGAF's to define the capabilities in that strategy they are trying to optimize, where those capabilities are today, and what they want to transition to.

Very creative

This is where a framework allows me to be very creative: defining the capabilities and the transition points, and giving a roadmap to get to those transitions. That is the cleverness of architecture work, and where the real skill of an architect comes in: not in defining the framework, but in applying the framework to a specific business strategy.

Gardner: Jason, we mentioned that there is a great deal of variability in what different companies in different regions and industries need to accomplish, but one of the common questions I get these days is what to outsource and what to manage internally, and how to decide the boundaries between a core competency and extended outsourcing or hybrid computing models. How does applied architecture come to the rescue when this sort of question, which I think is fundamental to an enterprise, comes up?

Uppal: That's a great question, and one of the areas where, if architects do their job well, we can help the organization move much further along. What we do in the business space, and we have done it many times with the framework, is look at the value chain of the organization and then map it to the capabilities required.

Once we know those capabilities, I can put the question squarely to the executives and say, "Tell me which capabilities you want to be the best at. Tell me which capabilities you want to lead the market in. And tell me which capabilities you are content to be mediocre at, below the industry benchmark." Once I understand which capabilities we want to be best at, that's where I want to focus my energy. For the ones I am prepared to live with being mediocre, I can put another strategy in place and ask how to outsource them, focusing the outsourcing deal on cost and service.

This is opposed to having a very confused contract with the outsourcer, where one day I'm outsourcing for cost reasons and the next day for growth reasons. It becomes very difficult for an organization to manage such contracts and for the vendor to provide the support. That conversation at the beginning, getting executives to commit to which capabilities they want to be best at, is a good conversation for an enterprise architect.

My personal experience has been that if I get a call back from the executive, and they say they want to be best at every one of them, then I say, “Well, you really don’t have a clue what you are talking about. You can’t be super fast and super good at every single thing that you do.”

Gardner: So making those choices is what's critical. Some of the confusion I also hear about in the field is how to do a cost-benefit analysis of which processes to keep internal versus which to source through hybrid or external models.

Is there something about applied architecture and TOGAF 9 that sets up a system of record or methodology that allows the cost-benefit analysis of these situations to be made in advance? Does the planning process bring anything to the table for making proper sourcing decisions?

Capability-based planning

Uppal: Absolutely. This is where the whole capability-based planning conversation comes in. It was introduced in TOGAF 9, and we have further to go in developing the concept as we learn how best to do some of these things.

When I look at capability-based planning, I expect my executives to look at it from the point of view of opportunities and threats: what could you get out there in the industry if you had this capability in your back pocket? Don't worry about how we're going to get it first; let's decide whether it's worth getting.

Then we focus the organization on the long haul and say: if nobody in the industry has this capability today and we acquire it, what will it do for us? That provides another, long-term view of the organization and of how we are going to focus our attention on capabilities.

One of the beauties of doing EA is that when we start EA from strategic intent, that gives us a good 10-15 year view of what our business is going to be like. When we start architecture at the business strategy level, that gives us a six-month to five-year view.

Enterprise architects are very effective at holding two views of the world: a 5-, 10-, or 15-year view, and a six-month to three-year view. If we don't focus on the strategic intent, we'll never know what is possible; we would always be working on what is possible within our organization, as opposed to thinking of what is possible in the industry as a whole.

Gardner: So, in a sense, you have multiple future tracks or trajectories that you can evaluate, but without a framework, without an architectural approach, you would never be able to have that set of choices before you.

Chris Forde, any thoughts on what Jason’s been saying in terms of the sourcing and cost benefits and risks analysis that go into that?

Forde: In the kinds of environments most organizations operate in, whether government, for-profit, or not-for-profit, everybody is trying to understand what they need to be good at and what their partners are very good at that they can leverage. Their choices around this are, of course, critical.

One of the things you need to consider is that if you are going to give X out and have a partner manage and operate whatever it is, whatever process it might be, what do you have to be good at in order to make that effective? One of the things you need to be good at is managing third parties. One of the advanced uses of EA is applying the architecture to those management processes. In a mature state, you can see an effective organization managing a number of partners through an architected approach. So when we talk about what advanced users do, what I'm offering is that an advanced use of EA is its application to third-party management.

Gardner: So the emphasis is on the process, not necessarily who is executing on that process?

Framework necessary

Forde: Correct, because you need a framework. Think about what most major Fortune 500 companies in the United States do. They have multiple, multiple IT partners for application development and potentially for operations. They split the network out. They split the desktop out. This creates an amazing degree of complexity around multiple contracts. If you have an integrator, that’s great, but how do you manage the integrator?

There's a whole slew of complex problems. What we've learned over the years is that we tend to think of the original idea of "outsourcing," or whatever term is used, in the abstract, as one activity, when in fact it might involve anywhere from 5 to 25 partners. Coordinating that complexity is a major issue for organizations, and taking an architected approach to that problem is an advanced use of EA.

Gardner: So stated another way, Jason, the process is important, but the management of processes is perhaps your most important core competency. Is that fair, and how does EA support that need for a core competency of managing processes across multiple organizations?

Uppal: That's absolutely correct, and Chris is right. For example, consider two capabilities one organization decided on, one of which it wanted to be very, very good at.

We worked with a large concrete manufacturing company in the northern part of the country. If you're a concrete manufacturer, your biggest cost is cement. If you can exploit a capability to optimize the cement, substituting chemical products and getting the same performance, you can get a lot more return and higher margins on the same concrete.

In this organization, the concrete manufacturing process itself was the core competency. That had to be kept in-house. The infrastructure is essential to making the concrete, but it wasn't the organization's core competency, so it had to be outsourced. The organization therefore had to build a process for managing the outsourcer and, at the same time, a capability and process for being the best concrete manufacturer. Those were the two essential capabilities identified.

An EA framework like TOGAF allows you to build both of those capabilities, because it doesn't care. It just says: here is a capability to build, and here is a set of instructions for how you do it. The rest is the cleverness of the architect: how to use those tools to define the best possible solutions.

Gardner: Of course, it's not enough just to identify and implement myriad or complex sourcing activities; you also need to monitor them and have operational governance oversight. Is there something in TOGAF 9 specifically that lends itself to taking this into operations and then creating ongoing efficiencies as a result?

Uppal: Absolutely, because this is one of the areas in the ADM where we get to implementation governance, post-implementation governance, and value realization: how do we actually manage the architecture over its life? This is an area where TOGAF 9 has done a considerably good job, and we still have a long way to go in how we monitor what value is being realized.

Very explicit

Our governance model is very explicit about who does what and when, and how you monitor it. We have extended this conversation using TOGAF 9 many times. At the end, when the capability is deployed, the initial value statement that was created in the business architecture is given back to the executive who asked for that capability.

We say, "These are the benefits of these capabilities, and you signed off on them at the beginning. Now you're going to find out whether you got the capability. We're going to pass this into strategic planning, because this is going to be the baseline for next year's planning." So governance is not just monitoring; it is also confirming that we actually got the business results we anticipated.

Gardner: Another area of great interest to me nowadays is looking at the IT organization as it pursues things like cloud, software as a service (SaaS), and hybrid models. Does IT build a core competency in managing these multiple partners, as Chris pointed out, or does another part of the company, one that may have been dealing with outsourcing at a business-process level, teach the IT department how to do this?

Any sense from either of our panelists on whether IT becomes a leader or a laggard in how to manage these relationships, and how important is managing the IT element of that in the long run? Let’s start with you, Jason.

Uppal: It depends on the industry IT is in. For example, if you're an organization that is very engineering-focused, the engineers have a lot more experience managing outsourcing deals than the IT organization does. In that case, engineering leads this conversation.

But most organizations are service-oriented organizations where engineering has not been a primary discipline, and IT has a lot of experience managing outside contracts. In that case, the whole cloud conversation becomes a very effective conversation within the IT organization.

When we think about cloud, we have actually done cloud before. This is not a new thing, except that before we looked at it from a hosting point of view and a SaaS point of view. Now cloud is going much further, to where an entire capability is provided to you. That capability means not only that the infrastructure is shared, but that the entire industry's knowledge is embedded in it. This is becoming very popular, and rightfully so, not because it's a sexy thing to have. In healthcare, especially in countries with socialized, non-monopolized healthcare, they are sharing this knowledge in the cloud with all the hospitals. It's becoming a very productive thing, and enterprise architects are driving it, because we're thinking of capabilities, not components.

Gardner: Chris Forde, similar question. How do you see the role of IT shifting or changing as a result of the need to manage more processes across multiple sources?

Forde: It's an interesting question. I tend to agree with the earlier part of Jason's response (I'm not disagreeing with any of it, actually), but the point he made is that it's an "it depends" answer.

IT interaction

Under normal circumstances, IT organizations are very good at interacting with other technology areas of the business. From what I've seen in the organizations I've dealt with, they typically see slices of business processes rather than entire end-to-end processes. Even within IT organizations, because of the size of many organizations, you typically have some division of responsibilities. As for Jason's emphasis on capabilities and business processes, capabilities and processes of course transcend functional areas in an organization.

To the extent that a business unit or business area owns a process end to end, it may well be better positioned to manage the BPO type of things. If there's a heavy technology orientation to the process outsourcing, then you will see the IT organization involved to one extent or another.

The real question is, where is the most effective knowledge, skill, and experience around managing these outsourcing capabilities? It may be in the IT organization or it may be in the business unit, but you have to assess where that is.

That's one of the functions of an architecture approach. You need to assess what is going to make you successful in this. If what you need happens to be in the IT organization, then go with that capability. If it is more effective in the business unit, then go with that. Perhaps the answer is that you need to combine the two, or create a new functional organization for the specific purpose of managing that activity and outsourcing need.

I’m hedging a little bit, Dana, in saying that it depends.

Gardner: It certainly raises some very interesting issues. At the same time that we're seeing this big question mark around sourcing and how to do it well, we're also in a period where more organizations are becoming data-driven and looking to have deeper, more accessible, real-time analytics applied to their business decisions. Just as with sourcing, IT has an integral role here, having been perhaps the architect or implementer of warehousing, data marts, and business intelligence (BI).

Back to you, Jason. As we enter a phase where organizations are also trying to measure and get scientific and data-driven about their decisions, how does IT, and more importantly, how do TOGAF and EA help them do that?

Uppal: We have a number of experiences like that, Dana. One is a financial services organization whose entire function is managing some $100-plus billion worth of assets. In that kind of organization, the whole decision-making process is based on the data they get, and 95 percent of that data does not originate within the organization; it is vendor data coming from outside.

So in that kind of conversation, we look and say that the organization needs a capability to manage data. Once we define a capability, then we start putting metrics on this thing. What does this capability need to be able to do?

In this particular example, we put a metric on it and said that data identified in the morning must be brought into the organization by the afternoon and disposed of by the end of the day. That's how fast the data has to be procured, transformed, brought in, and delivered to the end user, who then decides whether we ever look at that data again.

Data capability

Having that high-speed data management capability in the organization is something an architect can look at and say: this is the capability you need, and now I can give you a roadmap to get to it.

Gardner: Chris Forde, how do you see the need for a data-driven enterprise coincide with IT and EA?

Forde: For most, if not all, companies, information and data are critical to their operation and planning activities: day to day, month to month, annually, and over longer time spans. So the information needs of a company are absolutely critical in any architected approach to solutioning or value-add activities.

I don’t think I would accept the assumption that the IT department is best-placed to understand what those information needs are. The IT organization may be well-placed to provide input into what technologies could be applied to those problems, but if the information needs are normally being applied to business problems, as opposed to technology problems, I would suggest that it is probably the business units that are best-placed to decide what their information needs are and how best to apply them.

The technologist’s role, at least in the model I’m suggesting, is to be supportive in that and deliver the right technology, at the right time, for the right purpose.

Gardner: Then, how would a well-advanced applied architecture methodology and framework help those business units attain their information needs, but also be in a position to exploit IT’s helping hand when needed?

Forde: It’s mostly providing the context to frame the problem in a way that it can be addressed, chunked down to reasonable delivery timeframes, and then marshaling the resources to bring that to reality.

From a pure framework and applied methodology standpoint, if you’re coming at it from an idealized situation, you’re going to be doing it from a strategic business need and you’re going to be talking to the business units about what their capability and functional needs are. And at that time, you’re really in the place of what business processes they’re dealing with and what information they need in order to accomplish what the particular set of goals is.

This is way in advance of any particular technology choice being made. That’s the idealized situation, but that’s typically what most frameworks, and in particular, the TOGAF 9 Framework from The Open Group, would go for.

Gardner: We're just beginning these conversations about advanced concepts in EA, and there will be many more offerings, much more feedback, and further collaboration around this subject at The Open Group Conference in Austin. Before we sign off, Jason, perhaps you can give us a quick encapsulation of what you will be discussing in your presentation at the conference.

Uppal: One of the things we've been looking at from the industry's point of view is that the conversation around frameworks is a done deal now; everybody has accepted that we have good-enough frameworks. We're moving to the next phase: what we do with these frameworks.

In our future conferences, we’re going to be addressing that and saying what people are specifically doing with these frameworks, not to debate the framework itself, but the application of it.

Continuous planning

In Austin we’ll be looking at how we’re using the TOGAF framework to improve ongoing annual business and IT planning. We have a specific example that we are going to bring out, where we looked at an organization that was doing once-a-year planning. That was not a very effective approach for the organization. They wanted to change to continuous planning, which means planning that happens throughout the year.

We identified four or five very specific, measurable goals that the program had, such as accuracy of your plan, business goals being achieved by the plan, time and cost to manage and govern the plan, and stakeholders’ satisfaction. Those are the areas where we are defining how a TOGAF-like framework can be applied to solve a specific problem like enterprise planning and governance.

That’s something we will be bringing to our conference in Austin, and that event will be held on the Sunday. In the future, we’ll be doing a lot more of those specific applications of a framework like TOGAF to a unique set of problems that are very tangible and quickly resonate with executives, not just in IT, but across the entire organization.

Forde: Can I follow along with a little bit of a plug here, Dana?

Gardner: Certainly.

Forde: Jason is going to be talking as a senior architect on the applied side of TOGAF on the Sunday. For the Monday plenary, this is basically the rundown: we have David Baker, a Principal from PricewaterhouseCoopers, talking about business-driven architecture for strategic transformations.

Following that, Tim Barnes, the Chief Architect at Devon Energy out of Canada, covering what they are doing from an EA perspective with their organization.

Then, we’re going to wrap up the morning with Mike Walker, the Principal Architect for EA Strategy and Architecture at Microsoft, talking about moving from IT Architecture to Enterprise Architecture.

This is a very powerful lineup of people addressing this business focus in EA and the application of it for strategic transformations, which I think are issues that many, many organizations are struggling with.

Gardner: Looking at, again, the question I started us off with, how do TOGAF and EA affect the bottom line? We’ve heard about how it affects the implementation for business transformation processes. We’ve talked about operational governance. We looked at how sourcing, business process management and implementation, and ongoing refinement are impacted. We also got into data and how analytics and information sharing are affected. Then, as Jason just mentioned, planning and strategy as a core function across a variety of different types of business problems.

So, I don’t think we can in any way say that there’s a minor impact on the bottom line from this. Last word to you, Jason.

Uppal: This is a time now for the enterprise architects to really step up to the plate and be accountable for real performance influence on the organization’s bottom line.

If we can exploit assets better than we do today, improve our planning programs, and commit to measurable, unambiguous performance indicators, that is a huge step forward for enterprise architects: moving away from technology and frameworks to real problems that resonate with executives and align business and IT.

Gardner: Well, great. You’ve been listening to a sponsored podcast discussion in conjunction with The Open Group Conference in Austin, Texas, the week of July 18, 2011.

I would like to thank our guests. We have been joined by Chris Forde, Vice President of Enterprise Architecture and Membership Capabilities for The Open Group. Thanks, Chris.

Forde: Thanks, Dana.

Gardner: And also Jason Uppal. He is the Chief Architect at QR Systems. Thank you, Jason.

Uppal: Thank you, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks for joining, and come back next time.

Jason Uppal will be presenting “Advanced Concepts in Applying TOGAF 9” at The Open Group Conference, Austin, July 18-22. Join us for best practices and case studies on Enterprise Architecture, Cloud, Security and more, presented by preeminent thought leaders in the industry.

Copyright The Open Group 2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect™ blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.

Filed under Enterprise Architecture, TOGAF®

EA Fundamentalism

By Stuart Boardman, Getronics

It’s an unfortunate fact that when times get tough, the tough, rather than get going, tend to pick on other people. What we see is that most formal and informal groups tend to turn inwards and splinter into factions, each possessing the only true gospel. When times are good, we’re all too busy doing what we actually want to do to want to waste time sniping at other folks.

Maybe this isn’t the reason, but it strikes me that in the EA blogosphere at the moment (e.g. the EA group on LinkedIn) every discussion seems to deteriorate into a debate about what the proper definition of EA is (guess how many different “right answers” there are), or which of TOGAF® or Zachman or <insert your favourite framework here> is the (only) correct framework, or why all of them are totally wrong, or, worse still, what the correct interpretation of the minutiae of some aspect of framework X might be.

Perhaps the only comfort we can draw from the current lack of proper recognition of EA by the business is the fact that the Zachmanites are actually not firing bullets at the Rheinlanders (or some other tribe). Apart from the occasional character assassination, it’s all reasonably civilized. There’s just not enough to lose. But this sort of inward looking debate gets us nowhere.

I use TOGAF®. If you use another framework that’s better suited to your purpose, I don’t have a problem with that. I use it as a framework to help me think. That’s what frameworks are for. A good framework doesn’t exclude the possibility of using other guidance and insights to address areas it doesn’t cover. For example, I make a lot of use of the Business Model Canvas from Osterwalder and Pigneur, and I draw ideas from folks like Tom Graves (who in turn has specialized the Business Model Canvas to EA). A framework (and any good methodology) is not a cookbook. If you understand what it tries to achieve, you can adapt it to fit each practical situation. You can leave the salt out. You can even leave the meat out! There are some reasonable criticisms of TOGAF® from within and outside The Open Group. But I can use TOGAF® with those in mind. And I do. One of the things I like about The Open Group is that it’s open to change – and always working on it. So the combination of The Open Group and TOGAF® and an awareness of important things coming from other directions provides me with an environment that, on the one hand, encourages rigour, and on the other, constantly challenges my assumptions.

It’s not unusual in my work to liaise with other people officially called Enterprise Architects. Some of these folks think EA is only about IT. Some of them think it’s only about abstractions. I also work with Business Architects and Business Process Architects and Business Strategists and Requirements Engineers and… I could go on for a very long time indeed. All of these people have definitions of their own scope and responsibilities, which overlap quite enough to allow not just for fundamentalism but also for serious turf wars. Just as out there in the real world, it is the fundamentalists, and those who define their identity by what they are not, who start the wars that everyone loses.

The good news is that, just about enough of the time, enough of these folks are happy to look at what we are all trying to achieve and who can bring what to the party, and will work together to produce a result that justifies our existence. And every time that happens I learn new things – things that will make me a better Enterprise Architect. So if I get noticeably irritated by the religious disputes and respond a bit unreasonably in web forum debates, I hope you’ll forgive me. I don’t like war.

By the way, credit for the “fundamentalism” analogy goes to my friend and former colleague, François Belanger. Thanks François.

Enterprise Architecture will be a major topic of discussion at The Open Group Conference, Austin, July 18-22. Join us for best practices and case studies on Enterprise Architecture, Cloud, Security and more, presented by preeminent thought leaders in the industry.

Stuart Boardman is a Senior Business Consultant with Getronics Consulting where he co-leads the Enterprise Architecture practice as well as the Cloud Computing solutions group. He is co-lead of The Open Group Cloud Computing Work Group’s Security for the Cloud and SOA project and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by the Information Security Platform (PvIB) in The Netherlands and of his previous employer, CGI. He is a frequent speaker at conferences on the topics of Cloud, SOA, and Identity.

Filed under Enterprise Architecture, TOGAF®