Tag Archives: TOGAF

Summer in the Capitol – Looking Back at The Open Group Conference in Washington, D.C.

By Jim Hietala, The Open Group

This past week in Washington D.C., The Open Group held our Q3 conference. The theme for the event was “Cybersecurity – Defend Critical Assets and Secure the Global Supply Chain,” and the conference featured a number of thought-provoking speakers and presentations.

Cybersecurity is at a critical juncture, and conference speakers highlighted the threat and attack reality and described industry efforts to move forward in important areas. The conference also featured a new capability: several of the events were Livestreamed to the Internet.

For those who did not make the event, here’s a summary of a few of the key presentations, as well as what The Open Group is doing in these areas.

Joel Brenner, attorney with Cooley, was our first keynote. Joel’s presentation was titled “Turning Us Inside-Out: Crime and Economic Espionage on our Networks.” The talk mirrored his recent book, “America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime, and Warfare,” and Joel talked about current threats to critical infrastructure, attack trends and challenges in securing information. Joel’s presentation was a wakeup call to the very real issues of IP theft and identity theft. Beyond describing the threat and attack landscape, Joel discussed some of the management challenges related to ownership of the problem, namely that the different stakeholders in addressing cybersecurity in companies, including legal, technical, management and HR, all tend to think that it is someone else’s problem. Joel argued that policy spanning the entire organization is needed to fully address the problem.

Kristin Baldwin, principal deputy, systems engineering, Office of the Assistant Secretary of Defense, Research and Engineering, described the U.S. Department of Defense (DoD) trusted defense systems strategy and challenges, including requirements to secure their multi-tiered supply chain. She also talked about how the acquisition landscape has changed over the past few years. In addition, for all programs the DoD now requires the creation of a program protection plan, which is the single focal point for security activities on the program. Kristin’s takeaways included needing a holistic approach to security, focusing attention on the threat, and avoiding risk exposure from gaps and seams. DoD’s Trusted Defense Systems Strategy provides an overarching framework for trusted systems. Stakeholder integration with acquisition, intelligence, engineering, industry and research communities is key to success. Systems engineering brings these stakeholders, risk trades, policy and design decisions together. Kristin also stressed the importance of informing leadership early and providing programs with risk-based options.

Dr. Ron Ross of NIST described a perfect storm: the proliferation of information systems and networks, combined with the increasing sophistication of threats, is resulting in a growing number of penetrations of information systems in the public and private sectors that potentially affect security and privacy. He proposed the need for an integrated project team approach to information security. Dr. Ross also provided an overview of the changes coming in NIST SP 800-53, version 4, which is presently available in draft form. He also advocated a dual protection strategy: traditional controls at network perimeters that assume attackers are outside organizational networks, plus agile defenses that assume attackers are already inside the perimeter. The objective of agile defenses is to enable operation while under attack and to minimize response times to ongoing attacks. This new approach mirrors thinking from the Jericho Forum and others on de-perimeterization and security, and is very welcome.

The Open Group Trusted Technology Forum provided a panel discussion on supply chain security issues and the approach that the forum is taking towards addressing issues relating to taint and counterfeit in products. The panel included Andras Szakal of IBM, Edna Conway of Cisco and Dan Reddy of EMC, as well as Dave Lounsbury, CTO of The Open Group. OTTF continues to make great progress in the area of supply chain security, having published a snapshot of the Open Trusted Technology Provider Framework, working to create a conformance program, and in working to harmonize with other standards activities.

Dave Hornford, partner at Conexiam and chair of The Open Group Architecture Forum, provided a thought provoking presentation titled, “Secure Business Architecture, or just Security Architecture?” Dave’s talk described the problems in approaches that are purely focused on securing against threats and brought forth the idea that focusing on secure business architecture was a better methodology for ensuring that stakeholders had visibility into risks and benefits.

Geoff Besko, CEO of Seccuris and co-leader of the security integration project for the next version of TOGAF®, delivered a presentation that looked at risk from both a positive and a negative view. He noted that senior management frequently view risk as something to be embraced, taking risks with an eye on business gains in revenue, market share or profitability, while security practitioners tend to focus on risk as something to be mitigated. Finding common ground is key here.

Katie Lewin, who is responsible for the GSA FedRAMP program, provided an overview of the program, and how it is helping raise the bar for federal agency use of secure Cloud Computing.

The conference also featured a workshop on security automation, which featured presentations on a number of standards efforts in this area, including on SCAP, O-ACEML from The Open Group, MILE, NEA, AVOS and SACM. One conclusion from the workshop was that there’s presently a gap and a need for a higher level security automation architecture encompassing the many lower level protocols and standards that exist in the security automation area.

In addition to the public conference, a number of forums of The Open Group met in working sessions to advance their work in the Capitol.

All in all, the conference clarified the magnitude of the cybersecurity threat, and the importance of initiatives from The Open Group and elsewhere to make progress on real solutions.

Join us at our next conference in Barcelona on October 22-25!

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Security Architecture, Supply chain risk, TOGAF®

TOGAF® 9 Certification Growth – Number of Individuals Certified Doubles in the Last 12 Months – Now Over 14,800

By Andrew Josey, The Open Group

The number of individuals certified in the TOGAF® 9 certification program as of July 1, 2012 is 14,851. This represents a doubling of the number of individuals certified in the last 12 months, with 7,640 new certifications during that period. The latest statistics show that certifications are now growing at two thousand individuals per quarter.
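The growth claims are easy to check from the figures given: a year earlier the total must have been 14,851 - 7,640 = 7,211, so the certified population has slightly more than doubled, while the 12-month average works out to about 1,910 new certifications per quarter (the "two thousand per quarter" figure presumably reflects the most recent run rate rather than the annual average). A quick sketch:

```python
# Sanity check of the certification growth figures quoted above.
total_july_2012 = 14_851
new_in_12_months = 7_640

total_july_2011 = total_july_2012 - new_in_12_months  # 7,211 a year earlier
growth_factor = total_july_2012 / total_july_2011     # about 2.06, i.e. doubled

avg_per_quarter = new_in_12_months / 4                # 1,910 per quarter on average

print(total_july_2011, round(growth_factor, 2), avg_per_quarter)
```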

TOGAF is being adopted globally. The top five countries are the UK, the USA, the Netherlands, Australia and India.

Here is a list of individual certifications among the top 20 countries, as of July 2012:

Rank  Individuals  Country               Percentage
 1    2444         UK                    16.2%
 2    1916         USA                   12.8%
 3    1607         Netherlands           10.8%
 4    1093         Australia              7.3%
 5     913         India                  6.1%
 6     705         Canada                 4.7%
 7     637         South Africa           4.2%
 8     524         Finland                3.5%
 9     517         France                 3.4%
10     434         China                  2.8%
11     379         Norway                 2.5%
12     344         Sweden                 2.3%
13     280         Germany                1.8%
14     271         Belgium                1.8%
15     244         United Arab Emirates   1.6%
16     224         Denmark                1.5%
17     209         Japan                  1.4%
18     176         New Zealand            1.1%
19     173         Saudi Arabia           1.1%
20     136         Czech Republic         0.9%

There are 43 TOGAF 9 training partners worldwide and 48 accredited TOGAF 9 courses. More information on TOGAF 9 Certification, including the official accredited training course calendar and a directory of certified people, can be found on The Open Group website at: http://www.opengroup.org/togaf9/cert/.

(This blog post was edited on August 16)

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.


Filed under Certifications, Enterprise Architecture, TOGAF®

ArchiMate at the Washington, D.C. Conference #ogDCA

By Iver Band, Standard Insurance Company

The Open Group offers many opportunities to learn about ArchiMate®, the fast-growing visual modeling language standard for Enterprise Architecture. ArchiMate enables enterprise architects to develop rich and clear graphical representations that are accessible to a wide range of stakeholders while providing clear direction to downstream architects and designers. Looking forward to this week’s Washington, D.C. conference, let’s examine the various sessions where attendees can learn more about this modeling language standard.

On Sunday, July 15, start with the ArchiMate 2.0 pre-conference introductory session from 4:30-5:00 p.m. ET led by BiZZdesign CEO and ArchiMate Forum Chair Henry Franken. Right afterward, from 5:00-5:30 ET, learn about ArchiMate certification along with other certifications offered by The Open Group.  Conference attendees can engage further with the language at one of the interactive Learning Lab sessions from 5:30-6:15 p.m. ET.

On Tuesday, July 17, learn how to use the ArchiMate language for architecture projects based on TOGAF®.  From 11:30-12:45 p.m. ET, I will join Henry, and together, we will present an in-depth tutorial on “Using the TOGAF Architecture Content Framework with the ArchiMate Modeling Language.” From 2:00-2:45 p.m. ET,  I will explore how to use ArchiMate to shed light on the complex interplay between people and organizations, and their often conflicting challenges, principles, goals and concerns.  My presentation “Modeling the Backstory with ArchiMate 2.0 Motivation Extension” will demonstrate this approach with a case study on improving customer service. Then, from 2:45-3:30 p.m. ET, The Business Forge Principal Neil Levette will present the session “Using the ArchiMate Standard as Tools for Modeling the Business.” Neil will explain how to use the ArchiMate language with Archi, a free tool, to model key business management mechanisms and the relationships between business motivations and operations. Finally, from 4:00-5:30 p.m. ET, Henry and I will join the “Ask the Experts: TOGAF and ArchiMate” panel to address conference attendee and Open Group member questions.

Don’t miss these opportunities to learn more about this powerful standard!

Iver Band is the vice chair of The Open Group ArchiMate Forum and is an enterprise architect at Standard Insurance Company in Portland, Oregon. Iver chose the TOGAF and ArchiMate standards for his IT organization and applies them enthusiastically to his daily responsibilities. He co-developed the initial examination content for the ArchiMate 2 Certification for People  and made other contributions to the ArchiMate 2 standard. He is TOGAF 9 Certified,  ArchiMate 2 Certified and a Certified Information Systems Security Professional.


Filed under ArchiMate®, Conference, Enterprise Architecture

Leveraging TOGAF to Deliver DoDAF Capabilities

By Chris Armstrong, Armstrong Process Group

In today’s environment of competing priorities and constrained resources, companies and government agencies are in even greater need to understand how to balance those priorities, leverage existing investments and align their critical resources to realize their business strategy. Sound appealing? It turns out that this is the fundamental goal of establishing an Enterprise Architecture (EA) capability. In fact, we have seen some of our clients position EA as the Enterprise Decision Support capability – that is, providing an architecture-grounded, fact-based approach to making business and IT decisions.

Many government agencies and contractors have been playing the EA game for some time — often in the context of mandatory compliance with architecture frameworks, such as the Federal Enterprise Architecture (FEA) and the Department of Defense Architecture Framework (DoDAF). These frameworks often focus significantly on taxonomies and reference models that organizations are required to use when describing their current state and their vision of a future state. We’re seeing a new breed of organizations that are looking past contractual compliance and want to exploit the business transformation dimension of EA.

In the Department of Defense (DoD) world, this is in part due to the new “capability driven” aspect of DoDAF version 2.0, where an organization aligns its architecture to a set of capabilities that are relevant to its mission. The addition of the Capability Viewpoint (CV) in DoDAF 2 enables organizations to describe their capability requirements and how their organization supports and delivers those capabilities. The CV also provides models for representing capability gaps and how new capabilities are going to be deployed over time and managed in the context of an overall capability portfolio.

Another critical difference in DoDAF 2 is the principle of “fit-for-purpose,” which allows organizations to select which architecture viewpoints and models to develop based on mission/program requirements and organizational context. One fundamental consequence of this is that an organization is no longer required to create all the models for each DoDAF viewpoint. They are to select the models and viewpoints that are relevant to developing and deploying their new, evolved capabilities.

While DoDAF 2 does provide some brief guidance on how to build architecture descriptions and subsequently leverage them for capability deployment and management, many organizations are seeking a more well-defined set of techniques and methods based on industry standard best practices.

This is where the effectiveness of DoDAF 2 can be significantly enhanced by integrating it with The Open Group Architecture Framework (TOGAF®) version 9.1, in particular the TOGAF Architecture Development Method (ADM). The ADM not only describes how to develop descriptions of the baseline and target architectures, but also provides considerable guidance on how to establish an EA capability and how to perform architecture roadmapping and migration planning. Most importantly, the TOGAF ADM describes how to drive the realization of the target architecture through integration with the systems engineering and solution delivery lifecycles. Lastly, TOGAF describes how to sustain an EA capability through the operation of a governance framework to manage the evolution of the architecture. In a nutshell, DoDAF 2 provides a common vocabulary for architecture content, while TOGAF provides a common vocabulary for developing and using that content.

I hope that those of you in the Washington, D.C. area will join me at The Open Group conference next week, where we’ll continue the discussion of how to deliver DoDAF capabilities using TOGAF. For those of you who can’t make it, I’m pleased to announce that The Open Group will also be delivering a Livestream of my presentation (free of charge) on Monday, July 16 at 2:45 p.m. ET.

Hope to see you there!

Chris Armstrong, president of Armstrong Process Group, Inc., is an internationally recognized thought leader in Enterprise Architecture, formal modeling, process improvement, systems and software engineering, requirements management, and iterative and agile development. Chris represents APG at The Open Group, the Object Management Group and the Eclipse Foundation.



Filed under Conference, Enterprise Architecture, TOGAF®

Adapting to an eBook World

By Chris Harding, The Open Group

Have you ever wanted to read something to prepare for a meeting while traveling, but been frustrated by the difficulty of managing paper or a bulky PC? Travelers who read for pleasure have found eBooks a very convenient way to meet their needs. This format is now becoming available for select Open Group standards and guides, so that you can read them more easily when “on the road.”

The eBook format allows the device to lay out the text, rather than trying to fit pre-formatted pages to devices of all shapes and sizes (it is based on HTML). This makes reading an eBook a much easier and more pleasant experience than trying to read a static format such as PDF on a device where the page doesn’t fit.

There are portable electronic devices designed primarily for the purpose of reading digital books – the Amazon Kindle is the best known – but eBooks can also be read on tablets, mobile phones (on which the quality can be surprisingly good) and, of course, on laptops, using free-to-download software apps. The eBook readers are, essentially, small-sized special-purpose tablets with superb text display quality and – a big advantage on a long flight – batteries that can go weeks rather than hours without re-charging. As the quality and battery life of tablets continues to improve, they are starting to overtake specialized reader devices, which have one major disadvantage: a lack of standardization.

There are a number of different eBook formats, the most prominent being EPUB, an open standard created by the International Digital Publishing Forum; KF8, the proprietary format used by the Amazon Kindle; and Mobipocket, a format that the Kindle will also handle (there is an excellent Wikipedia article comparing eBook formats; see http://en.wikipedia.org/wiki/Comparison_of_e-book_formats). You can read any of the most popular formats on a tablet (or PC, Mac, iPhone or Android device) using a software app, but you are likely to find that a specialized reader device is limited in the formats it can handle.
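To make the reflowable nature of EPUB concrete: an .epub file is simply a ZIP archive whose content documents are XHTML, which is why the reader device can lay the text out itself. The Python sketch below, with illustrative file names only (a real EPUB also needs an OPF package document and navigation data), builds a minimal EPUB-style archive in memory and lists its entries:

```python
import io
import zipfile

def make_minimal_epub() -> bytes:
    """Build a minimal EPUB-like archive in memory (illustrative only)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
        # Per the EPUB container format, 'mimetype' comes first, uncompressed.
        z.writestr("mimetype", "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        # container.xml tells the reader where to find the package document.
        z.writestr("META-INF/container.xml",
                   '<?xml version="1.0"?>\n'
                   '<container version="1.0" '
                   'xmlns="urn:oasis:names:tc:opendocument:xmlns:container">\n'
                   '  <rootfiles><rootfile full-path="OEBPS/content.opf" '
                   'media-type="application/oebps-package+xml"/></rootfiles>\n'
                   '</container>')
        # The actual text is plain XHTML -- this is what the reader reflows.
        z.writestr("OEBPS/chapter1.xhtml",
                   "<html xmlns='http://www.w3.org/1999/xhtml'>"
                   "<body><p>Reflowable text.</p></body></html>")
    return buf.getvalue()

def list_contents(epub_bytes: bytes) -> list[str]:
    """Return the archive entries, in order, of an EPUB given as bytes."""
    with zipfile.ZipFile(io.BytesIO(epub_bytes)) as z:
        return z.namelist()

print(list_contents(make_minimal_epub()))
# ['mimetype', 'META-INF/container.xml', 'OEBPS/chapter1.xhtml']
```

Because the content documents are ordinary XHTML rather than fixed pages, the same file renders comfortably on a phone, a tablet, or a dedicated reader.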

Many of the Open Group SOA Standards and Guides are now freely available in the EPUB and Mobipocket formats from The Open Group bookstore. See http://soa-standards.opengroup.org/post/eBook-Versions-of-SOA-Standards-and-Guides-5884765 for the current list. We are hoping to make all our new SOA standards and guides available in this way, and also some Open Group publications on Cloud Computing. EPUB versions of TOGAF® Version 9.1, the TOGAF 9.1 Pocket Guide and the TOGAF 9 study guides are available for purchase from The Open Group’s official publisher, Van Haren. The SOA and TOGAF EPUBs can be obtained from The Open Group bookstore at http://www.opengroup.org/bookstore/catalog.

Thirty years ago, I used to attend meetings of the CCITT (now the ITU-T) in Geneva. The trolleys that were pushed around the UN building, piled high with working documents for distribution to delegates, were an impressive sight, but the sheer weight of paper that had to be carried to and from the meetings was a real problem. Laptops with Internet access have removed the need to carry documents. Now, eBooks are making it easy to read them while traveling!

We have started to make eBook versions of our standards and guides available and are still exploring the possibilities. We’d love to hear your thoughts on what will or won’t work, and what will work best.  Please feel free to share your ideas in the comments section below.

Andrew Josey, director of standards at The Open Group, contributed to the technical aspects of this blog post. 

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Filed under Cloud/SOA, TOGAF®

Learn How Enterprise Architects Can Better Relate TOGAF and DoDAF to Bring Best IT Practices to Defense Contracts

By Dana Gardner, Interarbor Solutions

This BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference in Washington, D.C., beginning July 16. The conference focuses on Enterprise Architecture (EA), enterprise transformation, and securing global supply chains.

We’re joined by one of the main speakers at the July 16 conference, Chris Armstrong, President of Armstrong Process Group, to examine how governments in particular are using various frameworks to improve their architectural planning and IT implementations.

Armstrong is an internationally recognized thought leader in EA, formal modeling, process improvement, systems and software engineering, requirements management, and iterative and agile development.

He represents the Armstrong Process Group at The Open Group, the Object Management Group (OMG), and the Eclipse Foundation. Armstrong also co-chairs The Open Group Architecture Framework (TOGAF®) and Model Driven Architecture (MDA) process modeling efforts, as well as the TOGAF 9 Tool Certification program, all at The Open Group.

At the conference, Armstrong will examine the use of TOGAF 9 to deliver Department of Defense (DoD) Architecture Framework or DoDAF 2 capabilities. And in doing so, we’ll discuss how to use TOGAF architecture development methods to drive the development and use of DoDAF 2 architectures for delivering new mission and program capabilities. His presentation will also be Livestreamed free from The Open Group Conference. The full podcast can be found here.

Here are some excerpts:

Gardner: TOGAF and DoDAF, where have they been? Where are they going? And why do they need to relate to one another more these days?

Armstrong: TOGAF [forms] a set of essential components for establishing and operating an EA capability within an organization. And it contains three of the four key components of any EA.

First, the method by which EA work is done, including how it touches other life cycles within the organization and how it’s governed and managed. Then, there’s a skills framework that talks about the skills and experiences that the individual practitioners must have in order to participate in the EA work. Then, there’s a taxonomy framework that describes the semantics and form of the deliverables and the knowledge that the EA function is trying to manage.

One-stop shop

One of the great things that TOGAF has going for it is that, on the one hand, it’s designed to be a one-stop shop — namely, providing everything that an end-user organization might need to establish an EA practice. But it does acknowledge that there are other components, predominantly in the various taxonomies and reference models, that various end-user organizations may want to substitute or augment.

It turns out that TOGAF has a nice synergy with other taxonomies, such as DoDAF, as it provides the backdrop for how to establish the overall EA capability, how to exploit it, and put it into practice to deliver new business capabilities.

Frameworks, such as DoDAF, focus predominantly on the taxonomy, mainly the kinds of things we’re keeping track of, the semantics relationships, and perhaps some formalism on how they’re structured. There’s a little bit of method guidance within DoDAF, but not a lot. So we see the marriage of the two as a natural synergy.

Gardner: So their complementary natures allows for more particulars on the defense side, but the overall TOGAF looks at the implementation method and skills for how this works best. Is this something new, or are we just learning to do it better?

Armstrong: I think we’re seeing the state of industry advance and looking at trying to have the federal government, both United States and abroad, embrace global industry standards for EA work. Historically, particularly in the US government, a lot of defense agencies and their contractors have often been focusing on a minimalistic compliance perspective with respect to DoDAF. In order to get paid for this work or be authorized to do this work, one of our requirements is we must produce DoDAF.

People are doing that because they’ve been commanded to do it. We’re seeing a new level of awareness. There’s some synergy with what’s going on in the DoDAF space, particularly as it relates to migrating from DoDAF 1.5 to DoDAF 2.

Agencies need some method and technique guidance on exactly how to come up with the particular viewpoints that are going to be most relevant, and how to exploit what DoDAF has to offer in a way that advances the business, as opposed to merely being conformant or compliant.

Gardner: Have there been hurdles, perhaps culturally, because of the landscape of these different companies and their inability to have that boundary-less interaction. What’s been the hurdle? What’s prevented this from being more beneficial at that higher level?

Armstrong: Probably overall organizational and practitioner maturity. There certainly are a lot of very skilled organizations and individuals out there. However, we’re trying to get them all lined up with the best practice for establishing an EA capability and then operating it and using it to a business strategic advantage, something that TOGAF defines very nicely and which the DoDAF taxonomy and work products hold in very effectively.

Gardner: Help me understand, Chris. Is this discussion that you’ll be delivering on July 16 primarily for TOGAF people to better understand how to implement vis-à-vis, DoDAF, is this the other direction, or is it a two-way street?

Two-way street

Armstrong: It’s a two-way street. One of the big things that particularly the DoD space has going for it is that there’s quite a bit of maturity in the notion of formally specified models, as DoDAF describes them, and the various views that DoDAF includes.

We’d like to think that, because of that maturity, the general TOGAF community can glean a lot of benefit from the experience they’ve had: what it takes to capture these architecture descriptions, and some of the finer points about managing those assets. People within the general TOGAF community are always looking for case studies and best practices that demonstrate that what other people are doing is something they can do as well.

We also think that the federal agency community also has a lot to glean from this. Again, we’re trying to get some convergence on standard methods and techniques, so that they can more easily have resources join their teams and immediately be productive and add value to their projects, because they’re all based on a standard EA method and framework.

One of the major changes between DoDAF 1 and DoDAF 2 is the focusing on fitness for purpose. In the past, a lot of organizations felt that it was their obligation to describe all architecture viewpoints that DoDAF suggests without necessarily taking a step back and saying, “Why would I want to do that?”

So it’s trying to make the agencies think more critically about how they can be the most agile, mainly what’s the least amount of architecture description that we can invest and that has the greatest possible value. Organizations now have the discretion to determine what fitness for purpose is.

Then, there’s the whole idea in DoDAF 2, that the architecture is supposed to be capability-driven. That is, you’re not just describing architecture, because you have some tools that happened to be DoDAF conforming, but there is a new business capability that you’re trying to inject into the organization through capability-based transformation, which is going to involve people, process, and tools.

One of the nice things that TOGAF’s architecture development method has to offer is a well-defined set of activities and best practices for deciding how you determine what those capabilities are and how you engage your stakeholders to really help collect the requirements for what fit for purpose means.

Gardner: As with the private sector, it seems that everyone needs to move faster. I see you’ve been working on agile development with organizations like the OMG and Eclipse. Does doing this well — bringing the best of TOGAF and DoDAF together — enable greater agility and speed when it comes to completing a project?

Different perspectives

Armstrong: Absolutely. When you talk about what agile means to the general community, you may get a lot of different perspectives and a lot of different answers. Ultimately, we at APG feel that agility is fundamentally about how well your organization responds to change.

If you take a step back, that’s really what we think is the fundamental litmus test of the goodness of an architecture. Whether it’s an EA, a segment architecture, or a system architecture, the architects need to think thoughtfully and considerately about what things are almost certainly going to happen in the near future. I need to anticipate, and be able to work these into my architecture in such a way that when these changes occur, the architecture can respond in a timely, relevant fashion.

A lot of people think that agile is just a pseudonym for not planning, not making commitments and going around in circles forever. We call that chaos, another five-letter word. Agile, in our experience, really demands rigor and discipline.

Of course, a lot of the culture of the DoD brings that rigor and discipline, as does the experience that community has had, in particular, with formally modeling architecture descriptions. That sets those government agencies up to act agilely much more readily than others.

Gardner: Do you know of anyone that has done it successfully or is in the process? Even if you can’t name them, perhaps you can describe how something like this works?

Armstrong: First, there has been some great work done by the MITRE organization through their work in collaboration at The Open Group. They’ve written a white paper that talks about which DoDAF deliverables are likely to be useful in specific architecture development method activities. We’re going to be using that as a foundation for the talk we’re going to be giving at the conference in July.

The biggest thing that TOGAF has to offer is that a nascent organization that’s jumping into the DoDAF space may just look at it from an initial compliance perspective, saying, “We have to create an AV-1, and an OV-1, and a SvcV-5,” and so on.

Providing guidance

TOGAF will provide the guidance for what is EA. Why should I care? What kind of people do I need within my organization? What kind of skills do they need? What kind of professional certification might be appropriate to get all of the participants up on the same page, so that when we’re talking about EA, we’re all using the same language?

TOGAF also, of course, has a great emphasis on architecture governance and suggests that immediately, when you’re first propping up your EA capability, you need to put into your plan how you’re going to operate and maintain these architectural assets, once they’ve been produced, so that you can exploit them in some reuse strategy moving forward.

So, the preliminary phase of the TOGAF architecture development method provides those agencies best practices on how to get going with EA, including exactly how an organization is going to exploit what the DoDAF taxonomy framework has to offer.

Then, once an organization or a contractor is charged with doing some DoDAF work, because of a new program or a new capability, they would immediately begin executing Phase A: Architecture Vision, and follow the best practices that TOGAF has to offer.

Just what is that capability that we’re trying to describe? Who are the key stakeholders, and what are their concerns? What are their business objectives and requirements? What constraints are we going to be placed under?

Part of that is to create a high-level description of the current or baseline architecture descriptions, and then the future target state, so that all parties have at least a coarse-grained idea of where we’re at right now, and what our vision is of where we want to be.

Because this is really a high level requirements and scoping set of activities, we expect that that’s going to be somewhat ambiguous. As the project unfolds, they’re going to discover details that may cause some adjustment to that final target.

Internalize best practices

So, we’re seeing defense contractors being able to internalize some of these best practices, and really be prepared for the future so that they can win the greatest amount of business and respond as rapidly and appropriately as possible, as well as how they can exploit these best practices to affect greater business transformation across their enterprises.

Gardner: We mentioned that your discussion on these issues on July 16 will be Livestreamed for free, but you’re also doing some pre-conference and post-conference activities — webinars, and other things. Tell us how this is all coming together, and for those who are interested, how they can take advantage of these.

Armstrong: We’re certainly very privileged that The Open Group has offered us this opportunity to share this content with the community. On Monday, June 25, we’ll be delivering a webinar that focuses on architecture change management in the DoDAF space, particularly how an organization migrates from DoDAF 1 to DoDAF 2.

I’ll be joined by a couple of other people from APG, David Rice, one of our Principal Enterprise Architects who is a member of the DoDAF 2 Working Group, as well as J.D. Baker, who is the Co-chair of the OMG’s Analysis and Design Taskforce, and a member of the Unified Profile for DoDAF and MODAF (UPDM) work group, a specification from the OMG.

We’ll be talking about things that organizations need to think about as they migrate from DoDAF 1 to DoDAF 2. We’ll be focusing on some of the key points of the DoDAF 2 meta-model, namely the rearrangement of the architecture viewpoints and the architecture partitions and how that maps from the classical DoDAF 1.5 viewpoint, as well as focusing on this notion of capability-driven architectures and fitness for purpose.

We also have the great privilege after the conference to be delivering a follow-up webinar on implementation methods and techniques around advanced DoDAF architectures. Particularly, we’re going to take a closer look at something that some people may be interested in, namely tool interoperability and how the DoDAF meta-model offers that through what’s called the Physical Exchange Specification (PES).

We’ll be taking a look a little bit more closely at this UPDM thing I just mentioned, focusing on how we can use formal modeling languages based on OMG standards, such as UML, SysML, BPMN, and SoaML, to do very formal architectural modeling.

One of the big challenges with EA is that, at the end of the day, EA comes up with a set of policies, principles, assets, and best practices that talk about how the organization needs to operate and realize new solutions within that new framework. If EA doesn’t have a hand-off to the delivery method, namely systems engineering and solution delivery, then none of this architecture stuff makes any difference.

Driving the realization

We’re going to be talking a little bit about how DoDAF-based architecture description and TOGAF would drive the realization of those capabilities through traditional systems engineering and software development methods.

************

For more information on The Open Group’s upcoming conference in Washington, D.C., please visit: http://www.opengroup.org/dc2012

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.


Filed under Conference, Enterprise Architecture, Enterprise Transformation, TOGAF®

Cybersecurity Threats Key Theme at Washington, D.C. Conference – July 16-20, 2012

By The Open Group Conference Team

Identifying risks and eliminating vulnerabilities that could undermine integrity and supply chain security is a significant global challenge and a top priority for governments, vendors, component suppliers, integrators and commercial enterprises around the world.

The Open Group Conference in Washington, D.C. will bring together leading minds in technology and government policy to discuss issues around cybersecurity and how enterprises can establish and maintain the necessary levels of integrity in a global supply chain. In addition to tutorial sessions on TOGAF and ArchiMate, the conference offers approximately 60 sessions on a variety of topics, including:

  • Cybersecurity threats and key approaches to defending critical assets and securing the global supply chain
  • Information security and Cloud security for global, open network environments within and across enterprises
  • Enterprise transformation, including Enterprise Architecture, TOGAF and SOA
  • Cloud Computing for business, collaborative Cloud frameworks and Cloud architectures
  • Transforming DoD avionics software through the use of open standards

Keynote sessions and speakers include:

  • America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime and Warfare - Keynote Speaker: Joel Brenner, author and attorney at Cooley LLP
  • Meeting the Challenge of Cybersecurity Threats through Industry-Government Partnerships - Keynote Speaker: Kristin Baldwin, principal deputy, deputy assistant secretary of defense for Systems Engineering
  • Implementation of the Federal Information Security Management Act (FISMA) - Keynote Speaker: Dr. Ron Ross, project leader at NIST (TBC)
  • Supply Chain: Mitigating Tainted and Counterfeit Products - Keynote Panel: Andras Szakal, VP and CTO at IBM Federal; Daniel Reddy, consulting product manager in the Product Security Office at EMC Corporation; John Boyens, senior advisor in the Computer Security Division at NIST; Edna Conway, chief security strategist of supply chain at Cisco; and Hart Rossman, VP and CTO of Cyber Security Services at SAIC
  • The New Role of Open Standards – Keynote Speaker: Allen Brown, CEO of The Open Group
  • Case Study: Ontario Healthcare - Keynote Speaker: Jason Uppal, chief enterprise architect at QRS
  • Future Airborne Capability Environment (FACE): Transforming the DoD Avionics Software Industry Through the Use of Open Standards - Keynote Speaker: Judy Cerenzia, program director at The Open Group; Kirk Avery of Lockheed Martin; and Robert Sweeney of Naval Air Systems Command (NAVAIR)

The full program can be found here: http://www3.opengroup.org/events/timetable/967

For more information on the conference tracks or to register, please visit our conference registration page. Please stay tuned throughout the next month as we continue to release blog posts and information leading up to The Open Group Conference in Washington, D.C. and be sure to follow the conference hashtag on Twitter – #ogDCA!


Filed under ArchiMate®, Cloud, Cloud/SOA, Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Standards, Supply chain risk

Part 3 of 3: Building an Enterprise Architecture Value Proposition Using TOGAF® 9.1. and ArchiMate® 2.0

By Serge Thorn, Architecting the Enterprise

This is the third and final post in a three-part series by Serge Thorn. For more in this series, please see Part One and Part Two.

Value Management uses a combination of concepts and methods to create sustainable value for both organizations and their stakeholders. Some tools and techniques are specific to Value Management and others are generic tools that many organizations and individuals use. There exist many Value Management techniques such as cost-benefits analysis, SWOT analysis, value analysis, Pareto analysis, objectives hierarchy, function analysis system technique (FAST), and more…

The one I suggest illustrating is close to the objectives hierarchy technique, a diagrammatic process for identifying objectives in a hierarchical manner, often used in conjunction with business functions. Close, because I will use a combination of the TOGAF® 9.1 metamodel with the ArchiMate® 2.0 Business Layer, Application Layer and Motivation Extension metamodels, considering core entities such as value, business goals, objectives, business processes and functions, business and application services, and application functions and components. This approach was inspired by a presentation by Michael van den Dungen and Arjan Visser at The Open Group Conference in Amsterdam in 2010; here I am also adding some ArchiMate 2.0 concepts.

First, the entities from the TOGAF 9.1 metamodel:

Then I will consider the entities from ArchiMate 2.0. Some may be identical to TOGAF 9.1. In the Business Layer, one key concept will obviously be the value. In this case I will consider the product (“A coherent collection of services, accompanied by a contract/set of agreements, which is offered as a whole to (internal or external) customers” according to ArchiMate 2.0), as the Enterprise Architecture program. In addition to that, I would refer to business services, functions, and processes.

In the Motivation Extension Metamodel, the goals. The objective entity in TOGAF 9.1 can also be represented using the concept of “goal.”

And in the Application Layer Metamodel, application services, functions, and components.

It is important to mention that when we deliver a value proposition, we must demonstrate to the business, with concrete examples, where the benefits will be. For example: the business sees Operational Excellence and Customer Intimacy as key drivers, and you soon realize that BPM suites or CRM could support those business goals. This is why we consider the Application Layer Metamodel.

We could then use a combination of the ArchiMate 2.0 viewpoints such as: Stakeholder Viewpoint, Goal Realization Viewpoint, Motivation Viewpoint, or some other viewpoints to demonstrate the value of Enterprise Architecture for a specific business transformation program (or any other strategic initiative).

It should be mentioned that the concept of benefit does not exist in any of the metamodels.

I have added the concept as an extension to ArchiMate in the following diagram, which maps the value to a program related to the “improvement of customers’ relationships.” I have also intentionally limited the number of concepts or entities, such as processes, application services or measures.
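To make the idea of the mapping concrete, the chain from value down to application components can also be sketched programmatically. This is illustrative only: the entity names below are hypothetical stand-ins, not ArchiMate notation or entities from the actual diagram.

```python
# A minimal, hypothetical traceability chain inspired by the mapping described
# above: value -> business goal -> business process -> application service
# -> application component. All names are made-up examples.

model = {
    "value": "Improved customer relationships",
    "goals": {
        "Increase customer retention": {
            "processes": ["Handle customer complaints"],
        },
    },
    "processes": {
        "Handle customer complaints": {
            "application_services": ["Case management service"],
        },
    },
    "application_services": {
        "Case management service": {"components": ["CRM suite"]},
    },
}

def trace_value_to_components(model):
    """Walk the chain from the value down to the supporting components."""
    components = []
    for goal in model["goals"].values():
        for process in goal["processes"]:
            for service in model["processes"][process]["application_services"]:
                components.extend(model["application_services"][service]["components"])
    return components

print(trace_value_to_components(model))  # ['CRM suite']
```

Walking the chain in this way is the same argument the diagram makes visually: every application component on the roadmap can be traced back to a stated business value.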

Using these ArchiMate 2.0 modelling techniques, you can demonstrate to your stakeholders the value proposition for a business program supported by an Enterprise Architecture initiative.

As a real example, if the expected key business benefit is operational excellence through process controls, which would represent a goal, you could present such a high level diagram to explain why application components like a BPM Suite could help (detecting fraud and errors, embedding preventive controls, continuously auditing and monitoring processes, and more).

There is definitely no single way of demonstrating the value of Enterprise Architecture, and you will probably have to adapt the process, and the way you present that value, to each company you work with. Without a doubt, Enterprise Architecture contributes to the success of an organization and brings numerous benefits, but very often that value needs to be demonstrated. Using techniques such as those described above will help to justify the initiative.

The next steps will be the development of measures, metrics and KPIs to continuously monitor that value proposition.

Serge Thorn is CIO of Architecting the Enterprise.  He has worked in the IT Industry for over 25 years, in a variety of roles, which include; Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management forum) Swiss chapter and is based in Geneva, Switzerland.


Filed under ArchiMate®, Enterprise Architecture, Enterprise Transformation, TOGAF®

Part 2 of 3: Building an Enterprise Architecture Value Proposition Using TOGAF® 9.1. and ArchiMate® 2.0

By Serge Thorn, Architecting the Enterprise

This is the second post in a three-part series by Serge Thorn.

Continuing from Part One of this series, here are more examples of what an enterprise cannot achieve without Enterprise Architecture:

Reduce IT costs by consolidating, standardizing, rationalizing and integrating corporate information systems

Cost avoidance can be achieved by identifying overlapping functional scope across two or more proposed projects in an organization, or by identifying the potential cost savings in IT support from standardizing on one solution.

Consolidation can happen at various levels for architectures — for shared enterprise services, applications and information, for technologies and even data centers.

This could involve consolidating the number of database servers, application or web servers and storage devices, consolidating redundant security platforms, or adopting virtualization, grid computing and related consolidation initiatives. Consolidation may be a by-product of another technology transformation or it may be the driver of these transformations.

Whatever motivates the change, the key is to remain aligned, once again, with the overall business strategy. Enterprise architects understand where the business is going, so they can pick the appropriate consolidation strategy. Rationalization, standardization and consolidation processes help organizations understand their current enterprise maturity level and move forward on the appropriate roadmap.

More spending on innovation

Enterprise Architecture should serve as a driver of innovation. Innovation is highly important when developing a target Enterprise Architecture and in realizing the organization’s strategic goals and objectives. For example, it may help to connect the dots between business requirements and the new approaches SOA and cloud services can deliver.

Enabling strategic business goals via better operational excellence

Enterprise Architecture defines the structure and operation of an organization. The intent of Enterprise Architecture is to determine how an organization can most effectively achieve its current and future objectives. It must be designed to support an organization’s specific business strategies.

Jeanne W. Ross, Peter Weill, David C. Robertson in “Enterprise Architecture as Strategy: Creating a Foundation for Business” wrote “Companies with more-mature architectures reported greater success in achieving strategic goals” (p. 89). “This included better operational excellence, more customer intimacy, and greater product leadership” (p. 100).

Customer intimacy

Enterprises that are customer focused and aim to provide solutions for their customers should design their business model, IT systems and operational activities to support this strategy at the process level. This involves the selection of one or a few high-value customer niches, followed by an obsessive effort to get to know these customers in detail.

Greater product leadership

This approach, enabled by Enterprise Architecture, is dedicated to providing the best possible products from the perspective of the features and benefits offered to the customer. It reflects a basic philosophy that products should push performance boundaries. Products or services delivered by the business will be refined by leveraging IT to do the end customer’s job better. This will be accomplished by the delivery of new business capabilities (e.g. online websites, BI, etc.).

Comply with regulatory requirements

Enterprise Architecture helps companies to know and represent their processes and systems and how they correlate. This is fundamental for risk management and managing regulation requirements, such as those derived from Sarbanes-Oxley, COSO, HIPAA, etc.

This list could be continued, as there are many other reasons why Enterprise Architecture brings benefits to organizations. Once your benefits have been documented, you could also consider some value management techniques. TOGAF® 9.1 refers, in the Architecture Vision phase, to a target value proposition for a specific project. In the next blog, we’ll address the issue of applying the value proposition to the Enterprise Architecture initiative as a whole.

The third and final part of this blog series will discuss value management. 



Filed under ArchiMate®, Enterprise Architecture, Enterprise Transformation, TOGAF®

Part 1 of 3: Building an Enterprise Architecture Value Proposition Using TOGAF® 9.1. and ArchiMate® 2.0

By Serge Thorn, Architecting the Enterprise

This is the first post in a three-part series by Serge Thorn. 

Enterprise Architecture is regularly introduced as a program or initiative from an IT perspective, rarely considering what the costs will be or whether there will be any return on investment. This presents a particular challenge to Enterprise Architecture.

Generally speaking, IT departments have all sorts of criteria to justify projects and measure their performance. They use measurements, metrics and KPIs. Going to the solution level, they commonly use indicators such as percentage uptime for systems from the system management team, error rates for applications from the development support team or number of calls resolved on the first call from the service desk, etc. These KPIs usually are defined at an early stage and very often delivered in dashboards from various support applications.
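The IT indicators mentioned above are simple ratios, which is part of why they are so easy to put on a dashboard. A minimal sketch, with made-up example figures:

```python
# Sketch of the kinds of IT KPIs mentioned above: system uptime, application
# error rate, and first-call resolution. All numbers are invented examples.

def uptime_pct(total_minutes, downtime_minutes):
    """Percentage of time the system was available."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

def error_rate(errors, requests):
    """Application errors per request."""
    return errors / requests

def first_call_resolution_pct(resolved_first_call, total_calls):
    """Share of service-desk calls resolved on the first call."""
    return 100.0 * resolved_first_call / total_calls

# One month of hypothetical data (43,200 minutes in a 30-day month)
print(round(uptime_pct(43_200, 86), 2))            # 99.8
print(error_rate(120, 1_000_000))                  # 0.00012
print(round(first_call_resolution_pct(640, 800)))  # 80
```

The contrast with Enterprise Architecture, as the next paragraph argues, is that no equally direct formula exists for attributing business outcomes to architecture work.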

On the other hand, it is much more difficult to define and implement a quantifiable measure for Enterprise Architecture. Many activities introduced with appropriate governance will enhance the quality of the delivered products and services, but it still will be a challenge to attribute results to the quality of Enterprise Architecture efforts.

This being said, Enterprise Architects should be able to define and justify the benefits of their activities to their stakeholders, and to help executives understand how Enterprise Architecture will contribute to the primary value-adding objectives and processes, before starting the voyage. The more it is described and understood, the more the Enterprise Architecture team will gain support from the management. There are plenty of contributions that Enterprise Architecture brings and they will have to be documented and presented at an early stage.

There won’t be one single answer that demonstrates the value of an Enterprise Architecture, but there seems to be a common pattern in the feedback from the various companies I have worked with.

Without Enterprise Architecture you can probably NOT fully achieve:

IT alignment with the business goals

As one example among others, the problem with most IT plans is that they do not indicate what the business value is or what strategic or tactical business benefit the organization is planning to achieve. The simple matter is that any IT plan also needs a business metric, not only an IT metric of delivery. Another aspect is the ability to create a common vision of the future, shared by the business and IT communities.

Integration

With the rapid pace of change in the business environment, the need to transform organizations into agile enterprises that can respond quickly to change has never been greater. Methodologies and computer technologies are needed to enable rapid business and system change. The solution also lies in enterprise integration (both business and technology integration).

For business integration, we use Enterprise Architecture methodologies and frameworks to integrate functions, processes, data, locations, people, events and business plans throughout an organization. Specifically, the unification and integration of business processes and data across the enterprise and potential linkage with external partners become more and more important.

For technology integration, we may use enterprise portals, enterprise application integration (EAI/ESB), web services, service-oriented architecture (SOA) and business process management (BPM), and try to lower the number of interfaces.

Change management

In recent years the scope of Enterprise Architecture has expanded beyond the IT domain and enterprise architects are increasingly taking on broader roles relating to organizational strategy and change management. Frameworks such as TOGAF® 9.1 include processes and tools for managing both the business/people and the technology sides of an organization. Enterprise Architecture supports the creation of changes related to the various architecture domains, evaluating the impact on the enterprise, taking into account risk management, financial aspects (cost/benefit analysis), and most importantly ensuring alignment with business goals and objectives. Enterprise Architecture value is essentially tied to its ability to help companies to deal with complexity and changes.

Reduced time to market and increased IT responsiveness

Enterprise Architecture should reduce systems development, applications generation and modernization timeframes for legacy systems. It should also decrease resource requirements. All of this can be accomplished by re-using standards or existing components, such as the architecture and solution building blocks in TOGAF 9.1. Delivery time and design/development costs can also be decreased by the reuse of reference models. All that information should be managed in an Enterprise Architecture repository.

Better access to information across applications and improved interoperability

Data and information architectures manage the organization’s information assets optimally and efficiently. This supports the quality, accuracy and timely availability of data for executive and strategic business decision-making, across applications.

Readily available descriptive representations and documentation of the enterprise

Architecture is also a set of descriptive representations (i.e. “models”) relevant for describing an enterprise, such that it can be produced to management’s requirements and maintained over the period of its useful life. Using an architecture repository, developing a variety of artifacts and modelling some of the key elements of the enterprise will contribute to building this documentation.

The second part of the series will include more examples of what an enterprise cannot achieve without Enterprise Architecture. 



Filed under ArchiMate®, Enterprise Architecture, Enterprise Transformation, TOGAF®

When Was Your Last Enterprise Architecture Maturity Assessment?

By Serge Thorn, Architecting the Enterprise

Every company should plan regular architecture capability maturity assessments using a model. These should provide a framework that represents the key components of a productive enterprise architecture process. A model provides an evolutionary way to improve the overall process that starts out in an ad hoc state, transforms into an immature process, and then finally becomes a well defined, disciplined, managed and mature process. The goal is to enhance the overall odds for success of the Enterprise Architecture by identifying weak areas and providing a defined path towards improvement. As the architecture matures, it should increase the benefits it offers the organization.

Architecture maturity assessments help to determine how companies can maximize competitive advantage, identify ways of cutting costs, improve quality of services and reduce time to market. These assessments are undertaken as part of Enterprise Architecture management. There are several methodologies for assessing overall Enterprise Architecture maturity. Examples of these are the U.S. Department of Commerce ACMM, The Open Group architecture maturity model and a BSC-based Architecture Score Card presented by IFEAD. For application or technology portfolios, portfolio evaluation models can be used.

As a part of project development, assessments (in reality compliance) of architecture solutions are made against the business objectives and requirements (desired process and service structures and business models) and the constraints derived from the Enterprise Architecture context (these may be standards, principles, policies or other restrictions for solution development). Assessment and compliance of technologies are also a central part of Enterprise Architecture development projects. Finally, the development of Enterprise Architectures undergoes the scrutiny of the software development quality assurance method in use. Many IT providers have adopted a comprehensive software quality assurance approach like CMMI, or ISO/IEC 15504 (known as SPICE).

Using the Architecture Capability Maturity Model from TOGAF® 9.1 is a great way of evaluating how companies have implemented the framework and of identifying the gaps between the business vision and the business capabilities. Unfortunately, no sufficient assessment instruments or tools have been developed for TOGAF-based assessments.

Instruments or tools should contain maturity and documentation assessment questionnaires and a method on how to conduct such an assessment. In the following example you may observe four different phases on how to run an assessment:

Phase 1 would include several steps:

  • Planning & preparation workshop with the stakeholders. Stakeholders should represent both Business and IT.
  • Interviews with stakeholders based on a questionnaire covering all process areas (elements in TOGAF) or domains that have characteristics important to the development and deployment of Enterprise Architecture. Each process area could be divided into a number of practices, which are statements that describe the process area for each level of maturity, on a scale of 0 to 5. Each practice would have a set of practice indicators: evidence that the requirements for a process area to be at a given level have been met. A set of questions asked in the interviews establishes whether or not the practice indicators exist, and thus the level of maturity for the practice. If all the practices for a given level within a process area are present, then that level will be achieved. Ideally, directly relevant documentary evidence will be provided to demonstrate that the practice indicator exists. As this is not always practical, sometimes, for this exercise, only verbal evidence from subject matter experts will be considered.
  • Production of a report.
  • Calculation of a maturity score. For the representation, we use the term maturity level or organizational maturity, as described below.
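The scoring rule in the interview step above has a simple mechanical core: a process area attains level N only if every practice at level N, and at every level below it, is evidenced. A minimal sketch, with a hypothetical process area and invented practice statements:

```python
# Sketch of the maturity scoring rule described above: a process area reaches
# level N only when all practices for levels 1..N have supporting evidence.
# The process area, practices, and evidence below are hypothetical examples.

def maturity_level(practices_by_level, evidenced):
    """practices_by_level maps level -> list of practice statements;
    evidenced is the set of practices for which indicators were found.
    Returns the highest level whose practices (and all lower levels'
    practices) are all satisfied; 0 if none are."""
    level = 0
    for lvl in sorted(practices_by_level):
        if all(p in evidenced for p in practices_by_level[lvl]):
            level = lvl
        else:
            break  # a gap at this level caps the maturity score
    return level

governance = {
    1: ["EA activity exists but is ad hoc and localized"],
    2: ["The EA process is documented"],
    3: ["An EA governance board is established", "Compliance reviews are held"],
}
evidence = {
    "EA activity exists but is ad hoc and localized",
    "The EA process is documented",
}
print(maturity_level(governance, evidence))  # 2
```

In this example the organization evidences both level 3 practices only partially (or not at all), so the process area scores level 2 even if isolated higher-level practices were observed.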

Sources

  • CMMI for Development (Version 1.2, 2006)
  • Appraisal Requirements for CMMI (ARC) (Version 1.2, 2006)
  • The U.S. Department of Commerce Enterprise Architecture Capability Maturity Model (2007)
  • TOGAF® 9.1
  • NASCIO Enterprise Architecture Maturity Model (Version 1.3, 2003)

We then deliver a report which includes the maturity of each process area or element (there are more elements in this example than in Chapter 51 of TOGAF® Version 9.1).

The use of a radar chart may also be a nice way to present the results. Please see the example below:

  • Presentation of the report to the stakeholders with strengths, weaknesses, gap analysis and recommendations
  • Next steps

Phase 2 would include several steps:

  • Based on results from Phase 1, a consensus workshop would produce a roadmap and action plan with recommendations to attain the next required level of maturity.
  • The workshop would provide a tool to produce an objective view of the report provided in Phase 1. This would give stakeholders and the senior management team a detailed view of how well Enterprise Architecture is deployed in the organization, a full understanding of the business drivers and organizational issues, and a clear view of where the stakeholders want the organization to be. The outputs of this phase are priorities and an action plan that is agreed and understood by the key stakeholders in the organization. There could also be a target radar diagram, as shown below:

The updated report may then look like this:

Phase 3 would be the management of Enterprise Architecture as described in the report, and Phase 4 would be similar to Phase 1.

Conducting an evaluation of an organization’s current practices against an architecture capability maturity assessment model allows the organization to determine the level at which it currently stands. It will indicate the organization’s maturity in the area of Enterprise Architecture and highlight the practices that the organization needs to focus on in order to see the greatest improvement and the highest return on investment. The recommendation is that assessments be carried out annually.



Filed under Enterprise Architecture, TOGAF®

Open Group Security Gurus Dissect the Cloud: Higher or Lower Risk?

By Dana Gardner, Interarbor Solutions

For some, any move to the Cloud — at least the public Cloud — means a higher risk for security.

For others, relying more on a public Cloud provider means better security. There’s more of a concentrated and comprehensive focus on security best practices that are perhaps better implemented and monitored centrally in the major public Clouds.

And so which is it? Is Cloud a positive or negative when it comes to cyber security? And what of hybrid models that combine public and private Cloud activities? How is security impacted in those cases?

We posed these and other questions to a panel of security experts at last week’s Open Group Conference in San Francisco to deeply examine how Cloud and security come together — for better or worse.

The panel: Jim Hietala, Vice President of Security for The Open Group; Stuart Boardman, Senior Business Consultant at KPN, where he co-leads the Enterprise Architecture Practice as well as the Cloud Computing Solutions Group; Dave Gilmour, an Associate at Metaplexity Associates and a Director at PreterLex Ltd.; and Mary Ann Mezzapelle, Strategist for Enterprise Services and Chief Technologist for Security Services at HP.

The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. The full podcast can be found here.

Here are some excerpts:

Gardner: Is this notion of going outside the firewall fundamentally a good or bad thing when it comes to security?

Hietala: It can be either. Talking to security people in large companies, frequently what I hear is that with adoption of some of those services, their policy is either let’s try and block that until we get a grip on how to do it right, or let’s establish a policy that says we just don’t use certain kinds of Cloud services. Data I see says that that’s really a failed strategy. Adoption is happening whether they embrace it or not.

The real issue is how you do that in a planned, strategic way, as opposed to letting services like Dropbox and other kinds of Cloud Collaboration services just happen. So it’s really about getting some forethought around how do we do this the right way, picking the right services that meet your security objectives, and going from there.

Gardner: Is Cloud Computing good or bad for security purposes?

Boardman: It’s simply a fact, and it’s something that we need to learn to live with.

What I’ve noticed through my own work is a lot of enterprise security policies were written before we had Cloud, but when we had private web applications that you might call Cloud these days, and the policies tend to be directed toward staff’s private use of the Cloud.

Then you run into problems, because you read something in policy — and if you interpret that as meaning Cloud, it means you can’t do it. And if you say it’s not Cloud, then you haven’t got any policy about it at all. Enterprises need to sit down and think, “What would it mean to us to make use of Cloud services and to ask as well, what are we likely to do with Cloud services?”

Gardner: Dave, is there an added impetus for Cloud providers to be somewhat more secure than enterprises?

Gilmour: It depends on the enterprise that they’re actually supplying to. If you’re in a heavily regulated industry, you have a different view of what levels of security you need and want, and therefore what you’re going to impose contractually on your Cloud supplier. That means that the different Cloud suppliers are going to have to attack different industries with different levels of security arrangements.

The problem there is that the penalty regimes are always going to say, “Well, if the security lapses, you’re going to get off with two months of not paying” or something like that. That kind of attitude isn’t going to work in this kind of security.

What I don’t understand is exactly how secure Cloud provision is going to be enabled and governed under tight regimes like that.

An opportunity

Gardner: Jim, we’ve seen in the public sector that governments are recognizing that Cloud models could be a benefit to them. They can reduce redundancy. They can control and standardize. They’re putting in place some definitions, implementation standards, and so forth. Is the vanguard of correct Cloud Computing with security in mind being managed by governments at this point?

Hietala: I’d say that they’re at the forefront. Some of these shared government services, where they stand up Cloud and make it available to lots of different departments in a government, have the ability to do what they want from a security standpoint, not relying on a public provider, and get it right from their perspective and meet their requirements. They then take that consistent service out to lots of departments that may not have had the resources to get IT security right, when they were doing it themselves. So I think you can make a case for that.

Gardner: Stuart, being involved with standards activities yourself, does moving to the Cloud provide a better environment for managing, maintaining, instilling, and improving on standards than enterprise by enterprise by enterprise? As I say, we’re looking at a larger pool and therefore that strikes me as possibly being a better place to invoke and manage standards.

Boardman: Dana, that’s a really good point, and I do agree. Also, in the security field, we have an advantage in the sense that there are quite a lot of standards out there to deal with interoperability, exchange of policy, exchange of credentials, which we can use. If we adopt those, then we’ve got a much better chance of getting those standards used widely in the Cloud world than in an individual enterprise, with an individual supplier, where it’s not negotiation, but “you use my API, and it looks like this.”

Having said that, there are a lot of well-known Cloud providers who do not currently support those standards and they need a strong commercial reason to do it. So it’s going to be a question of the balance. Will we get enough specific weight of people who are using it to force the others to come on board? And I have no idea what the answer to that is.

Gardner: We’ve also seen that cooperation is an important aspect of security, knowing what’s going on on other people’s networks, being able to share information about what the threats are, remediation, working to move quickly and comprehensively when there are security issues across different networks.

Is that a case, Dave, where having a Cloud environment is a benefit? That is to say more sharing about what’s happening across networks for many companies that are clients or customers of a Cloud provider rather than perhaps spotty sharing when it comes to company by company?

Gilmour: There is something to be said for that, Dana. Part of the issue, though, is that companies are individually responsible for their data. They’re individually responsible to a regulator or to their clients for their data. The question then becomes: as soon as you start to share a certain aspect of the security, you’re de facto sharing the weaknesses as well as the strengths.

So it’s a two-edged sword. One of the problems we have is that until we mature a little bit more, we won’t be able to actually see which side is the sharpest.

Gardner: So our premise that Cloud is good and bad for security is holding up, but I’m wondering whether the same things that make you a risk in a private setting — poor adherence to standards, no good governance, too many technologies that are not being measured and controlled, not instilling good behavior in your employees and then enforcing that — wouldn’t this be the same either way? Is it really Cloud or not Cloud, or is it good security practices or not good security practices? Mary Ann?

No accountability

Mezzapelle: You’re right. It’s a little bit of that “garbage in, garbage out,” if you don’t have the basic things in place in your enterprise, which means the policies, the governance cycle, the audit, and the tracking, because it doesn’t matter if you don’t measure it and track it, and if there is no business accountability.

David said it — each individual company is responsible for its own security, but I would say that it’s the business owner that’s responsible for the security, because they’re the ones that ultimately have to answer that question for themselves in their own business environment: “Is it enough for what I have to get done? Is the agility more important than the flexibility in getting to some systems or the accessibility for other people, as it is with some of the ubiquitous computing?”

So you’re right. If it’s an ugly situation within your enterprise, it’s going to get worse when you do outsourcing, out-tasking, or anything else you want to call within the Cloud environment. One of the things that we say is that organizations not only need to know their technology, but they have to get better at relationship management, understanding who their partners are, and being able to negotiate and manage that effectively through a series of relationships, not just transactions.

Gardner: If data and sharing data is so important, it strikes me that Cloud component is going to be part of that, especially if we’re dealing with business processes across organizations, doing joins, comparing and contrasting data, crunching it and sharing it, making data actually part of the business, a revenue generation activity, all seems prominent and likely.

So to you, Stuart, what is the issue now with data in the Cloud? Is it good, bad, or just the same double-edged sword, and it just depends how you manage and do it?

Boardman: Dana, I don’t know whether we really want to be putting our data in the Cloud, so much as putting the access to our data into the Cloud. There are all kinds of issues you’re going to run up against, as soon as you start putting your source information out into the Cloud, not the least privacy and that kind of thing.

A bunch of APIs

What you can do is simply say, “What information do I have that might be interesting to people? If it’s a private Cloud in a large organization elsewhere in the organization, how can I make that available to share?” Or maybe it’s really going out into public. What a government, for example, can be thinking about is making information services available, not just what you go and get from them that they already published. But “this is the information,” a bunch of APIs if you like. I prefer to call them data services, and to make those available.

So, if you do it properly, you have a layer of security in front of your data. You’re not letting people come in and do joins across all your tables. You’re providing information. That does require you then to engage your users in what is it that they want and what they want to do. Maybe there are people out there who want to take a bit of your information and a bit of somebody else’s and mash it together, provide added value. That’s great. Let’s go for that and not try and answer every possible question in advance.

Gardner: Dave, do you agree with that, or do you think that there is a place in the Cloud for some data?

Gilmour: There’s definitely a place in the Cloud for some data. I get the impression that something like the insurance industry is going to emerge out of this, where you’ll have a secondary Cloud. You’ll have secondary providers who will provide to the front-end providers. They might do things like archiving and that sort of thing.

Now, if you have that situation where your contractual relationship is two steps away, then you have to be very confident and certain of your Cloud partner, and it has to actually therefore encompass a very strong level of governance.

The other issue you have is that you’ve got the intersection of your governance requirements with the Cloud provider’s. Therefore you have to have a really strongly — and I hate to use the word — architected set of interfaces, so that you can understand how that governance is actually going to operate.

Gardner: Wouldn’t data perhaps be safer in a cloud than if they have a poorly managed network?

Mezzapelle: There is data in the Cloud and there will continue to be data in the Cloud, whether you want it there or not. The best organizations are going to start understanding that they can’t control it with that perimeter-like approach that we’ve been talking about getting away from for the last five or seven years.

So what we want to talk about is data-centric security, where you understand, based on role or context, who is going to access the information and for what reason. I think there is a better opportunity for services like storage, whether it’s for archiving or for near term use.

There are also other services that you don’t want to have to pay for 12 months out of the year, but that you might need independently. For instance, when you’re running a marketing campaign, you already share your data with some of your marketing partners. Or if you’re doing your payroll, you’re sharing that data through some of the national providers.

Data in different places

So there already is a lot of data in a lot of different places, whether you want Cloud or not, but the context is, it’s not in your perimeter, under your direct control, all of the time. The better you get at managing it wherever it is specific to the context, the better off you will be.
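The data-centric security Mary Ann describes, where access depends on role and context rather than on a network perimeter, can be sketched in a few lines. The roles, classifications and policy table below are entirely hypothetical, just to illustrate the shape of such a decision.

```python
# Minimal sketch of a data-centric access decision: the check travels with
# the data (via its classification) rather than with a network perimeter.
# Roles, classifications, and the policy table are hypothetical.

POLICY = {
    # classification: roles allowed to read it, and whether access is
    # permitted from outside the trusted network (e.g. a Cloud service)
    "public":       {"roles": {"employee", "partner", "anyone"}, "external": True},
    "confidential": {"roles": {"employee", "partner"},           "external": True},
    "restricted":   {"roles": {"employee"},                      "external": False},
}

def may_access(classification, role, from_external_network):
    """Decide access based on the data's classification and the request context."""
    rule = POLICY[classification]
    if role not in rule["roles"]:
        return False
    if from_external_network and not rule["external"]:
        return False
    return True

# A partner may read confidential data from a Cloud service...
assert may_access("confidential", "partner", from_external_network=True)
# ...but restricted data never leaves the trusted network.
assert not may_access("restricted", "employee", from_external_network=True)
```

The point of the sketch is that the rule follows the information wherever it lives, which is exactly what a perimeter-based control cannot do once the data is shared with marketing partners, payroll providers, or Cloud services.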

Hietala: It’s a slippery slope [when it comes to customer data]. That’s the most dangerous data to stick out in a Cloud service, if you ask me. If it’s personally identifiable information, then you get the privacy concerns that Stuart talked about. So to the extent you’re looking at putting that kind of data in a Cloud, looking at the Cloud service and trying to determine if we can apply some encryption, apply the sensible security controls to ensure that if that data gets loose, you’re not ending up in the headlines of The Wall Street Journal.

Gardner: Dave, you said there will be different levels on a regulatory basis for security. Wouldn’t that also play with data? Wouldn’t there be different types of data and therefore a spectrum of security and availability to that data?

Gilmour: You’re right. If we come back to Facebook as an example, Facebook is data that, even if it’s data about our known customers, is stuff that they have put out there of their own will. The data that they give us, they have given to us for a purpose, and it is not for us then to distribute that data or make it available elsewhere. The fact that it may be the same data is not relevant to the discussion.

Three-dimensional solution

That’s where I think we are going to end up with not just one layer or two layers. We’re going to end up with a sort of a three-dimensional solution space. We’re going to work out exactly which chunk we’re going to handle in which way. There will be significant areas where these things crossover.

The other thing we shouldn’t forget is that data includes our software, and that’s something that people forget. Software nowadays is out in the Cloud, under current ways of running things, and you don’t even always know where it’s executing. So if you don’t know where your software is executing, how do you know where your data is?

It’s going to have to be just handled one way or another, and I think it’s going to be one of these things where it’s going to be shades of gray, because it cannot be black and white. The question is going to be, what’s the threshold shade of gray that’s acceptable.

Gardner: Mary Ann, to this notion of the different layers of security for different types of data, is there anything happening in the market that you’re aware of that’s already moving in that direction?

Mezzapelle: The experience that I have is mostly in some of the business frameworks for particular industries, like healthcare and what it takes to comply with the HIPAA regulation, or in the financial services industry, or in consumer products where you have to comply with the PCI regulations.

There has continued to be an issue around information lifecycle management, which is categorizing your data. Within a company, you might have had a document that you coded private, confidential, top secret, or whatever. So you might have had three or four levels for a document.

You’ve already talked about how complex it’s going to be as you move into trying to understand, not only for that data, that a name like Mary Ann Mezzapelle happens to be in five or six different business systems, across over 100 instances around the world.

That’s the importance of something like an Enterprise Architecture that can help you understand that you’re not just talking about the technology components, but the information, what they mean, and how they are prioritized or critical to the business, which sometimes comes up in a business continuity plan from a system point of view. That’s where I’ve advised clients on where they might start looking to how they connect the business criticality with a piece of information.

One last thing. Those regulations don’t necessarily mean that you’re secure. It makes for good basic health, but that doesn’t mean that it’s ultimately protected. You have to do a risk assessment based on your own environment and the bad actors that you expect and the priorities based on that.

Leaving security to the end

Boardman: I just wanted to pick up here, because Mary Ann spoke about Enterprise Architecture. One of my bugbears — and I call myself an enterprise architect — is that, we have a terrible habit of leaving security to the end. We don’t architect security into our Enterprise Architecture. It’s a techie thing, and we’ll fix that at the back. There are also people in the security world who are techies and they think that they will do it that way as well.

I don’t know how long ago it was published, but there was an activity to look at bringing the SABSA Methodology from security together with TOGAF®. There was a white paper published a few weeks ago.

The Open Group has been doing some really good work on bringing security right in to the process of EA.

Hietala: In the next version of TOGAF, which has already started, there will be a whole emphasis on making sure that security is better represented in some of the TOGAF guidance. That’s ongoing work here at The Open Group.

Gardner: As I listen, it sounds as if the in-the-Cloud or out-of-the-Cloud security continuum is perhaps the wrong way to look at it. If you have a lifecycle approach to services and to data, then you’ll have a way in which you can approach data uses for certain instances and certain requirements, and that would then apply to a variety of private, public, and hybrid Cloud models.

Is that where we need to go, perhaps have more of this lifecycle approach to services and data that would accommodate any number of different scenarios in terms of hosting access and availability? The Cloud seems inevitable. So what we really need to focus on are the services and the data.

Boardman: That’s part of it. That needs to be tied in with the risk-based approach. So if we have done that, we can then pick up on that information and we can look at a concrete situation, what have we got here, what do we want to do with it. We can then compare that information. We can assess our risk based on what we have done around the lifecycle. We can understand specifically what we might be thinking about putting where and come up with a sensible risk approach.

You may come to the conclusion in some cases that the risk is too high and the mitigation too expensive. In others, you may say, no, because we understand our information and we understand the risk situation, we can live with that, it’s fine.

Gardner: It sounds as if we are coming at this as an underwriter for an insurance company. Is that the way to look at it?

Current risk

Gilmour: That’s eminently sensible. You have the mortality tables, you have the current risk, and you just work the two together and work out the premium. That’s probably a very good paradigm to guide how we should approach the problem intellectually.

Mezzapelle: One of the problems is that we don’t have those actuarial tables yet. That’s a little bit of an issue for a lot of people when they talk about, “I’ve got $100 to spend on security. Where am I going to spend it this year? Am I going to spend it on firewalls? Am I going to spend it on information lifecycle management assessment? What am I going to spend it on?” Some of the research that we have been doing at HP is to try to get that into something that’s more of a statistic.

So, when you have a particular project that does a certain kind of security implementation, you can see what the business return on it is and how it actually lowers risk. We found that it’s better to spend your money on getting a better system to patch your systems than it is to do some other kind of content filtering or something like that.

Gardner: Perhaps what we need is the equivalent of an Underwriters Laboratories (UL) for permeable organizational IT assets, where the security stamp of approval comes in high or low. Then, you could get your insurance insight; maybe something for The Open Group to look into. Any thoughts about how standards and a consortium approach would come into that?

Hietala: I don’t know about the UL for all security things. That sounds like a risky proposition.

Gardner: It could be fairly popular and remunerative.

Hietala: It could.

Mezzapelle: An unending job.

Hietala: I will say we have one active project in the Security Forum that is looking at trying to allow organizations to measure and understand risk dependencies that they inherit from other organizations.

So if I’m outsourcing a function to XYZ corporation, being able to measure what risk am I inheriting from them by virtue of them doing some IT processing for me, could be a Cloud provider or it could be somebody doing a business process for me, whatever. So there’s work going on there.

I heard just last week about a NSF funded project here in the U.S. to do the same sort of thing, to look at trying to measure risk in a predictable way. So there are things going on out there.

Gardner: We have to wrap up, I’m afraid, but Stuart, it seems as if currently it’s the larger public Cloud providers, the likes of Amazon and Google among others, that might be playing the role of all of these entities we are talking about. They are their own self-insurer. They are their own underwriter. They are their own risk assessor, like a UL. Do you think that’s going to continue to be the case?

Boardman: No, I think that as Cloud adoption increases, you will have a greater weight of consumer organizations who will need to do that themselves. It’s not just responsibility, but also accountability. At the end of the day, you’re always accountable for the data that you hold. It doesn’t matter where you put it or how many other parties it is subcontracted out to.

The weight will change

So there’s a need to have that, and as the adoption increases, there’s less fear and more, “Let’s do something about it.” Then, I think the weight will change.

Plus, of course, there are other parties coming into this world, the world that Amazon has created. I’d imagine that HP is probably one of them as well, but all the big names in IT are moving in here, and I suspect that for those companies there’s a differentiator in knowing how to do this properly, given their history of enterprise involvement.

So yeah, I think it will change. That’s no offense to Amazon, etc. I just think that the balance is going to change.

Gilmour: Yes. I think that’s how it has to go. The question that then arises is, who is going to police the policeman and how is that going to happen? Every company is going to be using the Cloud. Even the Cloud suppliers are using the Cloud. So how is it going to work? It’s one of these ever-decreasing circles.

Mezzapelle: At this point, I think it’s going to be more evolution than revolution, but I’m also one of the people who’ve been in that part of the business — IT services — for the last 20 years and have seen it morph in a little bit different way.

Stuart is right that there’s going to be a convergence of the consumer-driven, cloud-based model, which Amazon and Google represent, with an enterprise approach that corporations like HP are representing. It’s somewhere in the middle where we can bring the service level commitments, the options for security, the options for other things that make it more reliable and risk-averse for large corporations to take advantage of it.

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.


Filed under Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Security Architecture

5 Tips Enterprise Architects Can Learn from the Winchester Mystery House

By E.G.Nadhan, HP Enterprise Services

Not far from where The Open Group Conference was held in San Francisco this week is the Winchester Mystery House, once the personal residence of Sarah Winchester, widow of the gun magnate William Wirt Winchester. It took 38 years to build this house. Extensions and modifications were primarily based on a localized requirement du jour. Today, the house has several functional abnormalities that have no practical explanation.

To build a house right, you need a blueprint that details what is to be built, where, why and how based on the home owner’s requirements (including cost). As the story goes, Sarah Winchester’s priorities were different. However, if we don’t follow this systematic approach as enterprise architects, we are likely to land up with some Winchester IT houses as well.

Or, have we already? Enterprises are always tempted to address the immediate problem at hand with surprisingly short timelines. Frequent implementations of sporadic, tactical additions evolve into a Winchester Architecture. Right or wrong, Sarah Winchester did this by choice. If enterprises of today land up with such architectures, it can only be by chance and not by choice.

So, here are my tips to architect by choice rather than chance:

  • Establish your principles: Fundamental architectural principles must be in place to serve as a rock-solid foundation upon which architectures are based. These principles are based on generic, common-sense tenets that are refined to apply specifically to your enterprise.
  • Install solid governance: The appropriate level of architectural governance must be in place with the participation from the stakeholders concerned. This governance must be exercised, keeping these architectural principles in context.
  • Ensure business alignment: After establishing the architectural vision, Enterprise Architecture must lead with a clear definition of the over-arching business architecture, which defines the manner in which the other architectural layers are realized. Aligning business to IT is one of the primary responsibilities of an enterprise architect.
  • Plan for continuous evaluation: Enterprise Architecture is never really done. There are constant triggers (internal and external) for implementing improvements and extensions. Consumer behavior, market trends and technological evolution can trigger aftershocks within the foundational concepts that the architecture is based upon.

Thus, it is interesting that The Open Group conference was miles away from the Winchester House. By choice, I would expect enterprise architects to go to The Open Group Conference. By chance, if you do happen by the Winchester House and are able to relate it to your Enterprise Architecture, please follow the tips above to architect by choice, and not by chance.

If you have instances where you have seen the Winchester pattern, do let me know by commenting here or following me on Twitter @NadhanAtHP.

This blog post was originally posted on HP’s Transforming IT Blog.

HP Distinguished Technologist, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Twitter handle @NadhanAtHP.


Filed under Enterprise Architecture, TOGAF®

What’s New in ArchiMate 2.0?

By Andrew Josey, The Open Group, Henry Franken, BiZZdesign

ArchiMate® 2.0, an Open Group Standard, is an upwards-compatible evolution of ArchiMate 1.0, adding new features and addressing usage feedback and comments raised.

The ArchiMate 2.0 standard supports modeling throughout the TOGAF Architecture Development Method (ADM).

Figure 1: Correspondence between ArchiMate and the TOGAF ADM

ArchiMate 2.0 consists of:

  • The ArchiMate Core, which contains several minor improvements on the 1.0 version.
  • The Motivation extension, to model stakeholders, drivers for change, business goals, principles, and requirements. This extension mainly addresses the needs in the early TOGAF phases and the requirements management process.
  • The Implementation and Migration extension, to support project portfolio management, gap analysis, and transition and migration planning. This extension mainly addresses the needs in the later phases of the TOGAF ADM cycle.

ArchiMate 2.0 offers a modeling language to create fully integrated models of the organization’s enterprise architecture, the motivation for the enterprise architecture, and the programs, projects and migration paths to implement this enterprise architecture. In this way, full (forward and backward) traceability between the elements in the enterprise architecture, their motivations and their implementation can be obtained.

In the ArchiMate Core, a large number of minor improvements have been made compared to ArchiMate 1.0: inconsistencies have been removed, examples have been improved and additional text has been inserted to clarify certain aspects. Two new concepts have been added based on needs experienced by practitioners:

  • Location: To model a conceptual point or extent in space that can be assigned to structural elements and, indirectly, to behavior elements.
  • Infrastructure Function: To model the internal behavior of a node in the technology layer. This makes the technology layer more consistent with the other two layers.

The Motivation extension defines the following concepts:

  • Stakeholder: The role of an individual, team, or organization (or classes thereof) that represents their interests in, or concerns relative to, the outcome of the architecture.
  • Driver: Something that creates, motivates, and fuels the change in an organization.
  • Assessment: The outcome of some analysis of some driver.
  • Goal: An end state that a stakeholder intends to achieve.
  • Requirement: A statement of need that must be realized by a system.
  • Constraint: A restriction on the way in which a system is realized.
  • Principle: A normative property of all systems in a given context or the way in which they are realized.

For motivation elements, a limited set of relationships has been defined, partly re-used from the ArchiMate Core: aggregation (decomposition), realization, and (positive or negative) influence.

The Implementation and Migration extension defines the following concepts (and re-uses the relationships of the Core):

  • Work Package: A series of actions designed to accomplish a unique goal within a specified time.
  • Deliverable: A precisely defined outcome of a work package.
  • Plateau: A relatively stable state of the architecture that exists during a limited period of time.
  • Gap: An outcome of a gap analysis between two plateaus.
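To illustrate how the concepts above chain together to give the forward and backward traceability described earlier, here is a deliberately simplified, hypothetical representation of a few elements and relationships. This is not the ArchiMate metamodel or any exchange format; the element names and the tiny traversal function are invented for illustration.

```python
# Hypothetical, simplified representation of ArchiMate 2.0-style elements
# and relationships, used only to illustrate traceability from an
# implementation element back to its motivation. Not the official metamodel.

from collections import namedtuple

Element = namedtuple("Element", ["kind", "name"])
Relation = namedtuple("Relation", ["source", "relation", "target"])

goal = Element("Goal", "Reduce time-to-market")
req = Element("Requirement", "Automate release pipeline")
wp = Element("WorkPackage", "CI/CD rollout project")
deliv = Element("Deliverable", "Deployment automation service")

model = [
    Relation(req, "realizes", goal),        # Motivation extension
    Relation(wp, "realizes", req),          # Implementation and Migration
    Relation(deliv, "is outcome of", wp),
]

def trace_back(element, model):
    """Walk 'realizes' links backward from an element to its motivations."""
    chain = [element]
    current = element
    while True:
        nxt = next((r.target for r in model
                    if r.source == current and r.relation == "realizes"), None)
        if nxt is None:
            return chain
        chain.append(nxt)
        current = nxt

# The work package traces back through the requirement to the goal.
assert [e.kind for e in trace_back(wp, model)] == ["WorkPackage", "Requirement", "Goal"]
```

In practice an ArchiMate tool maintains these links for you; the sketch simply shows why having goals, requirements and work packages in one integrated model makes the traceability question answerable at all.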

ArchiMate 2 Certification

New with ArchiMate 2.0 is the introduction of a certification program. This includes certification for people and accreditation for training courses. It also includes certification for tools supporting the ArchiMate standard.

The ArchiMate 2 Certification for People program enables professionals around the globe to demonstrate their knowledge of the ArchiMate standard. ArchiMate 2 Certification for People is achieved through an examination and practical exercises as part of an Accredited ArchiMate 2 Training Course.

The Open Group Accreditation for ArchiMate training courses provides an authoritative and independent assurance of the quality and relevance of the training courses.

The Open Group ArchiMate Tool Certification Program makes certification available to tools supporting ArchiMate. The goal of the program is to ensure that architecture artifacts created with a certified tool are conformant to the language.

Further Reading

ArchiMate 2.0 is available for online reading and download from The Open Group Bookstore at www.opengroup.org/bookstore/catalog/c118.htm.

A white paper with further details on ArchiMate 2.0 is available to download from The Open Group Bookstore at www.opengroup.org/bookstore/catalog/w121.htm.

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Henry Franken is the managing director of BiZZdesign and is chair of The Open Group ArchiMate Forum. As chair of The Open Group ArchiMate Forum, Henry led the development of the ArchiMate Version 2.0 standard. Henry is a speaker at many conferences and has co-authored several international publications and Open Group White Papers. Henry is co-founder of the BPM-Forum. At BiZZdesign, Henry is responsible for research and innovation.


Filed under ArchiMate®, Business Architecture, Enterprise Architecture, Standards, TOGAF, TOGAF®

Security and Cloud Computing Themes to be explored at The Open Group San Francisco Conference

By The Open Group Conference Team

Cybersecurity and Cloud Computing are two of the most pressing trends facing enterprises today. The Open Group Conference San Francisco will feature tracks on both, where attendees can learn about the latest developments in each discipline and hear practical advice for implementing secure architectures and for moving enterprises into the Cloud. Below are some of the highlights and featured speakers from both tracks.

Security

The San Francisco conference will provide an opportunity for practitioners to explore the theme of “hacktivism,” the use and abuse of IT to drive social change, and its potential impact on business strategy and Enterprise Transformation. Traditionally, IT security has focused on protecting the IT infrastructure and the integrity of the data held within. However, in a rapidly changing world where hacktivism is an enterprise’s biggest threat, how can enterprise IT security respond?

Featured speakers and panels include:

  • Steve Whitlock, Chief Security Strategist, Boeing, “Information Security in the Internet Age”
  • Jim Hietala, Vice President, Security, The Open Group, “The Open Group Security Survey Results”
  • Dave Hornford, Conexiam, and Chair, The Open Group Architecture Forum, “Overview of TOGAF® and SABSA® Integration White Paper”
  • Panel – “The Global Supply Chain: Presentation and Discussion on the Challenges of Protecting Products Against Counterfeit and Tampering”

Cloud Computing

According to Gartner, Cloud Computing is now entering the “trough of disillusionment” on its hype cycle. It is critical that organizations better understand the practical business, operational and regulatory issues associated with the implementation of Cloud Computing in order to truly maximize its potential benefits.

Featured speakers and panels include:

  • David JW Gilmour, Metaplexity Associates, “Architecting for Information Security in a Cloud Environment”
  • Chris Lockhart, Senior Enterprise Architect, UnitedHealth, “Un-Architecture: How a Fortune 25 Company Solved the Greatest IT Problem”
  • Penelope Gordon, Cloud and Business Architect, 1Plug Corporation, “Measuring the Business Performance of Cloud Products”
  • Jitendra Maan, Tata Consultancy, “Mobile Intelligence with Cloud Strategy”
  • Panel – “The Benefits, Challenges and Survey of Cloud Computing Interoperability and Portability”
    • Mark Skilton, Capgemini; Kapil Bakshi, Cisco; Jeffrey Raugh, Hewlett-Packard

Please join us in San Francisco for these speaking tracks, as well as those on our featured theme of Enterprise Transformation and the role of enterprise architecture. For more information, please go to the conference homepage: http://www3.opengroup.org/sanfrancisco2012


Filed under Cloud, Cloud/SOA, Cybersecurity, Information security, Security Architecture, Semantic Interoperability, TOGAF

How to manage requirements within the Enterprise Architecture using the TOGAF® and SABSA® frameworks

By Pascal de Koning, KPN 

You want to put your company’s business strategy into action. What’s the best way to accomplish this? This can be done in a structured manner by using an Enterprise Architecture framework like TOGAF®. TOGAF® offers an overview of business- and IT-related architectures, as well as a process model to deliver these, called the Architecture Development Method (ADM, Figure 1).

As the figure shows, Requirements Management plays a central role in the architecture work in the TOGAF® methodology. Knowing the business requirements is essential, because they dictate what is needed in the underlying architecture layer. In fact, this holds for every layer: each architecture layer fulfills the requirements defined in the layer above it. Without proper Requirements Management, the whole architecture would rest on loose sand.

Unfortunately, TOGAF® does not offer guidance on how to do Requirements Management. It stresses the importance and central role of Requirements Management, but does not describe a way to actually carry it out. This is a gap in the TOGAF® ADM. To resolve it, a requirements management method is needed that is well described and flexible enough to use at all levels of the architecture. We found this in the SABSA® (Sherwood’s Applied Business-driven Security Architecture) framework. SABSA® offers the unique Business Attribute Profiling (BAP) technique as a means to effectively carry out Requirements Management.

Business Attribute Profiling is a requirements engineering technique that translates business goals and drivers into requirements (see figure 2). Some advantages of this technique are:

  • Executive communication in non-ICT terms
  • Grouping and structuring of requirements, keeping oversight
  • Traceability mapping between business drivers, requirements and capabilities

The BAP process decomposes the business goal into its core elements. Each core element is a single business attribute. Examples of business attributes are Available, Scalable, Supported, Confidential, Traceable, etc.
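A business attribute profile can be pictured as a simple table linking each attribute to a measurement approach and a target, from which testable requirements can be traced. The sketch below is illustrative only: the attribute names follow the examples above, but the service, metrics, and targets are invented for demonstration and are not taken from the SABSA® framework.

```python
# Hypothetical Business Attribute Profile for an online payment service.
# Attribute names follow the SABSA style mentioned above; the metric and
# target values are invented examples, not normative content.

profile = {
    "Available":    {"metric": "monthly uptime",         "target": ">= 99.9%"},
    "Scalable":     {"metric": "peak transactions/sec",  "target": ">= 500"},
    "Confidential": {"metric": "encryption coverage",    "target": "100% of stored card data"},
    "Traceable":    {"metric": "audit-log completeness", "target": "every payment event logged"},
}

def trace(goal: str, attributes: dict) -> list:
    """Map a business goal to testable requirements via its attributes."""
    return [
        f"{goal}: system must be {attr} ({spec['metric']} {spec['target']})"
        for attr, spec in attributes.items()
    ]

requirements = trace("Accept online payments", profile)
```

The table form is what gives the technique its advantages: executives discuss the attribute names, while architects work from the measurable targets, and every requirement stays traceable back to the goal it decomposes.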

As business processes tend to become more Internet-based, cyber security is becoming more important every day because the business processes are increasingly vulnerable to forces outside the business. Organizations must now consider not only the processes and requirements when planning an architecture, but they also need to consider the security of that architecture. A Security Architecture consists of all the security-related drivers, requirements, services and capabilities within the Enterprise. With the adoption of the Business Attribute Profiling technique for Requirements Management, it is now possible to integrate information security into the Enterprise Architecture.

The TOGAF®-SABSA® Integration white paper elaborates on this and provides a guide that describes how TOGAF® and SABSA® can be combined so that the SABSA® business risk-driven security architecture approach is seamlessly integrated into a TOGAF®-based enterprise architecture. It can be downloaded from https://www2.opengroup.org/ogsys/jsp/publications/PublicationDetails.jsp?publicationid=12449

TOGAF® is a registered trademark of The Open Group. SABSA® is a registered trademark of The SABSA Institute.

Pascal de Koning MSc CISSP is a Senior Business Consultant with KPN Trusted Services, where he leads the security consulting practice. He is chairman of The Open Group TOGAF-SABSA Integration Working Group. He has worked on information security projects for the Dutch central government, European Union and KPN, to name just a few. Pascal has written articles for Computable and PvIB, and is a frequent speaker at conferences like RSA Europe and COSAC on the topics of Cyber Security and Enterprise Security Architecture. When not working, Pascal loves to go running.


Filed under Enterprise Architecture, Security Architecture, TOGAF®

Why do pencils have erasers?

By Andrew Josey and Garry Doherty, The Open Group

We know that TOGAF® isn’t perfect. In fact, it probably never will be, but sometimes, especially after a major release, it’s a good idea to stop and look back after it has been in use for a while… just to make sure we’ve gotten it right, and to review the standard for further clarification and improved consistency.

That’s why we’re releasing TOGAF® 9.1. It contains a set of corrections to address comments raised since the introduction of TOGAF® 9 in 2009. We have been able to address over 400 of the comments received against TOGAF® 9, resulting in over 450 changes to the standard.

The maintenance updates in TOGAF® 9.1 are based on feedback received on the framework as organizations have put it to good use over the past three years. As such, the changes are upward-compatible, adding clarification, consistency, and additional detail where needed. Some of the most significant updates include:

  • The SOA chapter (Part III, Chapter 22, Using TOGAF to Define & Govern SOAs) has been updated to include the latest Open Group SOA Work Group output providing guidance on adapting the ADM phases for SOA
  • ADM Phases E and F (Part II, Chapters 13 and 14) have been reworked to match the level of detail in other phases, and the uses of terminology for Transition Architecture, Roadmap, and Implementation Strategy clarified and made consistent
  • Corrections have been applied to aspects of the Content Metamodel (Part IV, Chapter 34, The Content Metamodel) including the metamodel diagrams
  • The concepts of levels, iterations and partitions have been clarified and made consistent. This includes a reorganization of material in Part III, Chapter 19, Applying Iteration to the ADM and Chapter 20, Applying the ADM across the Architecture Landscape, and also Part V, Chapter 40, Architecture Partitioning
  • The terms “artifact” versus “viewpoint” have been clarified and made consistent. This includes a restructuring of Part IV, Chapter 35, Architectural Artifacts
  • Changes have been made to improve general usability including:
    • The artifacts for each phase are now listed in the phase descriptions
    • Duplicate text in several places has been replaced with an appropriate reference
    • The “Objectives” sections of the phases have been reworked
    • Some artifacts have been renamed to better reflect their usage

If you’re already TOGAF® 9 certified, don’t worry about the status of your certification. The TOGAF® 9 Certification for People program has been designed to accommodate maintenance updates to the TOGAF® 9 standard, such as TOGAF® 9.1, so the impact on the program is minimal:

  • The two levels of certification remain as TOGAF® 9 Foundation and TOGAF® 9 Certified.
  • Individuals who are currently certified in the TOGAF® 9 People Certification program remain certified.

TOGAF 9.1 is available for online reading at http://www.opengroup.org/togaf/ and available in The Open Group Bookstore at http://www.opengroup.org/bookstore/catalog/g116.htm.

A detailed description of the changes between TOGAF 9 and TOGAF 9.1 is available at http://www.opengroup.org/bookstore/catalog/u112.htm.

So now you know why pencils have erasers… because perfection is a constantly moving target!


Filed under Enterprise Architecture, Standards, TOGAF, TOGAF®

The Open Group and SABSA Institute Publish TOGAF® Integration Whitepaper

By Jim Hietala, Vice President, Security, The Open Group

2011 confirmed what many in the Enterprise Architecture industry have feared – data breaches are on the rise. It’s not just the number and cost of data breaches, but the sheer volume of information that cyber criminals are able to get their hands on. Today’s organizations cannot risk being vulnerable.

To help address this issue, The Open Group Security and Architecture Forums, and the SABSA® Institute, developers of the SABSA® security and risk management framework, joined forces to explore how security methodologies and risk management approaches can be integrated with enterprise-level architectures for better protection and flexibility.

If you are an enterprise architect responsible for ensuring architectures are secure, or a security professional tasked with developing secure architectures, you’ll be interested in the work the Architecture Forum and SABSA® have done over the last 15 months, culminating in a whitepaper released today that provides a valuable contribution to the security and enterprise architecture communities.

A Project Designed to Protect

All too often vulnerabilities can occur due to lack of alignment across organizations, with security and IT experts failing to consider the entire infrastructure together rather than different parts separately.

The impetus for this project came from large enterprises and consulting organizations that frequently saw TOGAF® being used as a tool for developing enterprise architecture, and SABSA® as a tool for creating security architectures. Practitioners of either TOGAF® or SABSA® asked for guidance on how best to align these frameworks in practical usage, and on how to re-use artifacts from each.

This quote from the whitepaper sums up the rationale for the effort best:

 “For too long, information security has been considered a separate discipline, isolated from the enterprise architecture. This Whitepaper documents an approach to enhance the TOGAF® enterprise architecture methodology with the SABSA® security architecture approach and thus create one holistic architecture methodology.”

The vision for the project has been to support enterprise architects who need to take operational risk management into account, by providing guidance describing how TOGAF® and SABSA® can be combined such that the SABSA® business risk and opportunity-driven security architecture approach can be seamlessly integrated into the TOGAF® business strategy-driven approach to develop a richer, more complete enterprise architecture.

There are two important focal points for this effort, first to provide a practical approach for seamlessly integrating SABSA® security requirements and services in common TOGAF®-based architecture engagements – instead of treating security as a separate entity within the architecture.

The second focal point is to illustrate how the requirements management processes in TOGAF® can be fulfilled in their widest generic sense (i.e., not only with regard to security architecture) by application of the SABSA® concept of Business Attribute Profiling to the entire ADM process.

Download a free copy of the TOGAF® and SABSA® whitepaper here.

If you are interested in exploring TOGAF® 9, online access to the framework is available here.

Information on SABSA® may be obtained here.

A large number of individuals participated in the development of this valuable resource. Thank you to all project team members who made this effort a reality, including from the SABSA® Institute, the Open Group Architecture Forum, and the Open Group Security Forum!


Filed under Enterprise Architecture, Security Architecture, TOGAF®

What is Business Technology Management and how does it relate to Enterprise Architecture?

By Serge Thorn, Architecting the Enterprise

Business Technology Management (BTM) is not a methodology but, I would say, a concept, or perhaps the aggregation of several guidelines and techniques. It is also described as a management science that aims to unify business and technology strategies in order to extract the full potential value of business technology solutions. In a nutshell, it allows you to unify business and technology decision making. Sound familiar?

Continue reading


Filed under Enterprise Architecture, TOGAF®

How EA is leading enterprise transformation in France

By Eric Boulay, The Open Group France

Earlier this week, in Paris, The Open Group France held the latest in a series of one-day conferences focused on Enterprise Architecture. As usual, the event delivered high-value content in the form of an excellent keynote presentation and case studies. These covered the retail, gambling, and financial industries — including two from CIOs of major French corporations: Continue reading


Filed under Enterprise Architecture, TOGAF®