Tag Archives: TOGAF

The Open Group Newport Beach Conference – Early Bird Registration Ends January 4

By The Open Group Conference Team

The Open Group is busy gearing up for the Newport Beach Conference. Taking place January 28-31, 2013, the conference theme is “Big Data – The Transformation We Need to Embrace Today” and will bring together leading minds in technology to discuss the challenges and solutions facing Enterprise Architecture around the growth of Big Data. Register today!

Information is power, and we stand at a time when 90% of the data in the world today was generated in the last two years alone. Despite the sheer enormity of the task, off-the-shelf hardware, open source frameworks and the processing capacity of the Cloud mean that Big Data processing is within the cost-effective grasp of the average business. Organizations can now initiate Big Data projects without significant investment in IT infrastructure.

In addition to tutorial sessions on TOGAF® and ArchiMate®, the conference offers roughly 60 sessions on a variety of topics, including:

  • The ways that Cloud Computing is transforming the possibilities for collecting, storing, and processing big data.
  • How can you contend with Big Data in your enterprise?
  • How does Big Data enable your Business Architecture?
  • What does the Big Data revolution mean for the Enterprise Architect?
  • Real-time analysis of Big Data in the Cloud.
  • Security challenges in the world of outsourced data.
  • What is an architectural view of Security for the Cloud?

Plenary speakers include:

  • Christian Verstraete, Chief Technologist – Cloud Strategy, HP
  • Mary Ann Mezzapelle, Strategist – Security Services, HP
  • Michael Cavaretta, Ph.D, Technical Leader, Predictive Analytics / Data Mining Research and Advanced Engineering, Ford Motor Company
  • Adrian Lane, Analyst and Chief Technical Officer, Securosis
  • David Potter, Chief Technical Officer, Promise Innovation Oy
  • Ron Schuldt, Senior Partner, UDEF-IT, LLC

A full conference agenda is available here. Tracks include:

  • Architecting Big Data
  • Big Data and Cloud Security
  • Data Architecture and Big Data
  • Business Architecture
  • Distributed Services Architecture
  • EA and Disruptive Technologies
  • Architecting the Cloud
  • Cloud Computing for Business

Early Bird Registration

Early Bird registration for The Open Group Conference in Newport Beach ends January 4. Register now and save! For more information or to register: http://www.opengroup.org/event/open-group-newport-beach-2013/reg

Upcoming Conference Submission Deadlines

In addition to the Early Bird registration deadline to attend the Newport Beach conference, there are upcoming deadlines for speaker proposal submissions to Open Group conferences in Sydney, Philadelphia and London. To submit a proposal to speak, click here.

Venue | Industry Focus | Submission Deadline
Sydney (April 15-17) | Finance, Defense, Mining | January 18, 2013
Philadelphia (July 15-17) | Healthcare, Finance, Defense | April 5, 2013
London (October 21-23) | Finance, Government, Healthcare | July 8, 2013

We expect space on the agendas of these events to be at a premium, so it is important for proposals to be submitted as early as possible. Proposals received after the deadline dates will still be considered, if space is available; if not, they may be carried over to a future conference. Priority will be given to proposals received by the deadline dates and to proposals that include an end-user organization, at least as a co-presenter.

Filed under Conference

Call for Submissions

By Patty Donovan, The Open Group

The Open Group Blog is celebrating its second birthday this month! Over the past two years, our blog posts have tended to cover Open Group activities – conferences, announcements, our lovely members, etc. While several members and Open Group staff serve as regular contributors, we’d like to take this opportunity to invite our community members to contribute guest posts sharing their thoughts and insights on topics within The Open Group’s areas of expertise.

Here are a few examples of popular guest blog posts that we’ve received over the past year.

Blog posts generally run between 500 and 800 words and address topics relevant to The Open Group work groups, forums, consortia and events. Some suggested topics are listed below.

  • ArchiMate®
  • Big Data
  • Business Architecture
  • Cloud Computing
  • Conference recaps
  • DirectNet
  • Enterprise Architecture
  • Enterprise Management
  • Future of Airborne Capability Environment (FACE™)
  • Governing Board Businesses
  • Governing Board Certified Architects
  • Governing Board Certified IT Specialists
  • Identity Management
  • IT Security
  • The Jericho Forum
  • The Open Group Trusted Technology Forum (OTTF)
  • Quantum Lifecycle Management
  • Real-Time Embedded Systems
  • Semantic Interoperability
  • Service-Oriented Architecture
  • TOGAF®

If you have any questions or would like to contribute, please contact opengroup (at) bateman-group.com.

Please note that all content submitted to The Open Group blog is subject to The Open Group approval process. The Open Group reserves the right to deny publication of any contributed works. Anything published shall be copyright of The Open Group.

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

Filed under Uncategorized

Barcelona Highlights

By Steve Philp, The Open Group

Within a 15-minute walk of Camp Nou (home of FC Barcelona), The Open Group Conference “kicked off” on Monday morning with excellent plenary presentations from Scott Radeztsky of Deloitte, followed by Peter Haviland and Mick Adams of Ernst & Young and, after the break, Helen Sun of Oracle and finally Ron Tolido and Manuel Sevilla of Capgemini. You can see most of these Big Data presentations for yourself on The Open Group’s Livestream page.

The “second half” of the day was split into tracks for Big Data, Enterprise Architecture (EA), TOGAF® and ArchiMate®. Henry Franken of BiZZdesign talked about EA in terms of TOGAF and ArchiMate (you can see this on our Livestream site, too) and the other ArchiMate presentations from Peter Filip of Tatra Bank, Gerben Wierda of APG Asset Management and Mieke Mahakena of Capgemini were also well received by an enthusiastic audience. Networking and drinks followed at the end of the track sessions, and the “crowd” went away happy after day one.

Tuesday started with a plenary presentation by Dr. Robert Winter from the University of St Gallen on EA and Transformation Management. See the following clip to learn more about his presentation and his research.


This was followed by tracks on distributed services architecture, security, TOGAF 9 case studies, information architecture, quantum lifecycle management (QLM) and a new track on Practice Driven Research on Enterprise Transformation (PRET) and Trends in EA Research (TEAR). The evening entertainment on day two consisted of dinner and a spectacular flamenco dancing show at the Palacio de Flamenco – where a good time was had by all.

After the show there was also time for a number of us to watch Barcelona v. Celtic in their European Champions League match at the Camp Nou. This is the view from my seat:

The game ended in a 2-1 victory for Barcelona, and following the game there was much debate and friendly banter in the bar between the conference delegates and the Celtic fans that were staying at our hotel.

The track themes continued on day three of the conference, along with member meetings such as the TOGAF Next Version Working Group, the TOGAF Standard and ArchiMate Language Harmonization Project, the Certification Standing Committee and the TOGAF Value Realization Working Group. Member meetings of the Architecture Forum and Security Forum were held on Thursday and brought the Barcelona event to its conclusion.

At the end of the day, if your “goal” is to listen to some great presentations, network with your peers, participate in meetings and influence the generation of new IT standards, then you should get a ticket for our next fixture in Newport Beach, Calif., USA on January 28-31, 2013. The theme, again, will be Big Data.

I look forward to seeing you there!

Steve Philp is the Marketing Director at The Open Group. Over the past 20 years, Steve has worked predominantly in sales, marketing and general management roles within the IT training industry. Based in Reading, UK, he joined the Open Group in 2008 to promote and develop the organization’s skills and experience-based IT certifications. More recently, he has become responsible for corporate marketing as well as certification.

Filed under Conference

ArchiMate® 2.0 and Beyond

By The Open Group Conference Team

In this video, Henry Franken of BiZZdesign discusses ArchiMate® 2.0, the new version of the graphical modeling language for Enterprise Architecture that provides businesses with the means to communicate with different stakeholders from the business goals level to implementation scenarios.

Franken explains that the first edition allowed users to express Enterprise Architecture at its core – modeling business applications and infrastructure. ArchiMate® 2.0 has two major additions to make it fully aligned with TOGAF® – the motivation extension and the migration and planning extension. The motivation extension provides users with the ability to fully express business motivations and goals to enterprise architects; the migration and planning extension helps lay out programs and projects to make a business transition.

There are several sessions on ArchiMate® at the upcoming Open Group Conference in Barcelona. Notably, Henry Franken’s “Delivering Enterprise Architecture with TOGAF® and ArchiMate®” session on October 22 at 2:00-2:45 p.m. UTC / 8:00-8:45 a.m. EST will be livestreamed on The Open Group Website.

To view these sessions and for more information on the conference, please go to: http://www3.opengroup.org/barcelona2012

Filed under ArchiMate®, Conference, Enterprise Architecture

TOGAF® and BIAN – A strong proposition for the Banking Industry

By Thomas Obitz, KPMG

Earlier this year, a working group led by Paul Bonnie of ING and me published a white paper on the integration of TOGAF® and BIAN, the framework of the Banking Industry Architecture Network. Gartner even suggested that the white paper goes a long way toward solving the big problem of arriving at a consistent reference model for banks. So how does a white paper help practicing architects in banks?

Every enterprise architect knows the two most difficult questions in a complex transformation initiative: How to describe the architecture of an organization – how to break down its functions and services, and arrive at a model which makes sense to everybody; and where to get started – what needs to be done, and how do the outputs fit together?

For the second question, the industry has pretty much agreed on the answer – TOGAF. It is a best-practice process with tremendous acceptance in the marketplace. However, it is industry independent and therefore does not provide any models describing the specifics of a bank, or even the banking IT landscape. This gap in vertical content is a significant hurdle when attempting to get architecture initiatives off the ground.

Looking at our options within The Open Group Architecture Forum to address this challenge, we concluded that creating industry-specific variants of the TOGAF framework would have stretched resources too thin – so the Architecture Forum decided to find a partner to collaborate with. We found that partner in BIAN.

BIAN, the Banking Industry Architecture Network, publishes a reference model for the services required as building blocks in the IT landscape of a bank. Like TOGAF, it leverages the experience of its members to identify best practices, and it has the support of major banks, leading software vendors and consultancies. The current services landscape has reached a certain level of maturity, describing more than 250 services.

The white paper describes how TOGAF and BIAN fit together, and where and how to use the BIAN collateral. Adopting the frameworks together yields several key benefits:

  • The services landscape provides architects with a canvas to structure the IT landscape, to map their inherent challenges, and scope solutions quickly. Hence, it speeds up activities in the time critical mobilization phase of a transformation initiative and helps to keep momentum.
  • Once a solution has been scoped in alignment with the services landscape, vendors supporting the BIAN reference model can provide components that implement the services. Consequently, it helps in the process of vendor selection.
  • As the responsibilities of components and the business objects exchanged between them are defined, integration between components of the landscape becomes much easier, reducing integration cost and complexity.

In a recent engagement with a retail bank, I used the services landscape as the starting point for the analysis of the challenges the bank was facing and to map out potential solutions. It allowed the team to start out quickly with a structure that was accepted and natural.

So when you are looking for an approach to making a large transformation initiative fly – have a look at our paper, and use it as a tool for making your life easier. And please do give us feedback on your experiences with it via email or in the comments section of this blog post.


Thomas Obitz is a Principal Advisor with KPMG LLP in London. Building on more than 20 years of experience in the IT industry, he acts primarily as a lead architect of major initiatives, as an enterprise architect, and a business architect. He has more than 13 years of experience in the Financial Services industry, with a strong focus on Investment Banking and Capital Markets. 

Filed under Business Architecture, TOGAF®

The Open Group Barcelona Conference – Early Bird Registration ends September 21

By The Open Group Conference Team

Early Bird registration for The Open Group Conference in Barcelona ends September 21. Register now and save!

The conference runs October 22-24, 2012. On Monday, October 22, the plenary theme is “Big Data – The Next Frontier in the Enterprise,” and speakers will address the challenges and solutions facing Enterprise Architecture within the context of the growth of Big Data. Topics to be explored include:

  • How does an enterprise adopt the means to contend with Big Data within its information architecture?
  • How does Big Data enable your business architecture?
  • What are the issues concerned with real-time analysis of the data resources on the cloud?
  • What are the information security challenges in the world of outsourced and massively streamed data analytics?
  • What is the architectural view of security for cloud computing? How can you take a risk-based approach to cloud security?

Plenary speakers include:

  • Peter Haviland, head of Business Architecture, Ernst & Young
  • Ron Tolido, CTO of Application Services in Europe, Capgemini; and Manuel Sevilla, chief technical officer, Global Business Information Management, Capgemini
  • Scott Radeztsky, chief technical officer, Deloitte Analytics Innovation Centers
  • Helen Sun, director of Enterprise Architecture, Oracle

On Tuesday, October 23, Dr. Robert Winter, Institute of Information Management, University of St. Gallen, Switzerland, will kick off the day with a keynote on EA Management and Transformation Management.

Tracks include:

  • Practice-driven Research on Enterprise Transformation (PRET)
  • Trends in Enterprise Architecture Research (TEAR)
  • TOGAF® and ArchiMate® Case Studies
  • Information Architecture
  • Distributed Services Architecture
  • Holistic Enterprise Architecture Workshop
  • Business Innovation & Technical Disruption
  • Security Architecture
  • Big Data
  • Cloud Computing for Business
  • Cloud Security and Cloud Architecture
  • Agile Enterprise Architecture
  • Enterprise Architecture and Business Value
  • Setting Up A Successful Enterprise Architecture Practice

For more information or to register: http://www.opengroup.org/barcelona2012/registration

Filed under Conference

ArchiMate 2.0 – Ready for the Future of Enterprise Architecture!

By Henry Franken, BiZZdesign

Models have played an important role in business for a long time. Process models, information and data models, application landscapes, strategic models, operational models – you name it, organizations have tried it. With the rise of Enterprise Architecture (EA) as a strategic discipline in many organizations, we saw two interesting developments. First, organizations try to connect their models to gain insight into the way the enterprise works from many different perspectives. Second, models are becoming more high-level, focusing on the essence of the organization.

These developments led to the ArchiMate® language, which allows high-level modeling within a domain but also allows modeling the relations between domains. Moreover, in recognition that architecture is a communication game, a key driver for the language was to enable effective visualizations for key stakeholders based on solid architectural analyses.

The first edition of the ArchiMate language enabled organizations to create holistic architecture models with concepts from three domains: business, application and technology. With a handful of concepts and relations, this allowed organizations to model the relation between products and services, processes, supporting applications and information, as well as infrastructure. Having modeled this formally, organizations can do impact assessments, generate visualizations for various stakeholders and so on.
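To make the idea of an impact assessment over a formal model concrete, here is a purely illustrative sketch (not part of the ArchiMate standard or any particular tool; all element names are hypothetical). It treats modeled dependencies as a graph and walks it to find everything affected by a change:

```python
from collections import defaultdict, deque

# Hypothetical ArchiMate-style dependencies between business, application
# and technology elements (element names are illustrative only).
depends_on = [
    ("Payment Service", "Payment App"),     # business service realized by app
    ("Payment App", "App Server Cluster"),  # app deployed on infrastructure
    ("Order Process", "Payment Service"),   # process uses the service
    ("Order Process", "CRM App"),
]

# Reverse index: element -> elements that depend on it
dependents = defaultdict(list)
for user, used in depends_on:
    dependents[used].append(user)

def impact(element):
    """Return every element transitively affected by a change to `element`."""
    seen, queue = set(), deque([element])
    while queue:
        for up in dependents[queue.popleft()]:
            if up not in seen:
                seen.add(up)
                queue.append(up)
    return seen

print(sorted(impact("App Server Cluster")))
# ['Order Process', 'Payment App', 'Payment Service']
```

Real Enterprise Architecture tools perform this kind of analysis over the full set of typed ArchiMate relations, but the principle is the same: because the model is formal, the impact of a change can be computed rather than guessed.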

ArchiMate has recently been extended by members of the ArchiMate Forum within The Open Group, resulting in ArchiMate 2.0 – a new version of ArchiMate that is fully aligned with The Open Group Architecture Framework (TOGAF®). Two new extensions have been developed for this purpose, making sure the language now covers the entire Architecture Development Method (ADM) of TOGAF.

The new motivation extension allows organizations to graphically model the answer to the “why” question of EA: Who are key stakeholders of EA? What are their drivers? How do these drivers lead to principles and requirements that are realized in the architecture? This extension mainly aligns with the early phases of the TOGAF ADM.

The new ArchiMate 2.0 standard also has an implementation and migration extension that aligns with the later phases of the ADM. Using this extension, architects can align with project management and graphically model plateaus, projects and programs, as well as their deliverables.

One of the key strengths of ArchiMate – as well as TOGAF – is its openness – it allows practitioners worldwide to join in and help push the language forward. Indeed, we are seeing the adoption of the language, as well as certifications of practitioners grow worldwide.

The Open Group has introduced certification programs for individuals, training vendors and tool vendors, and the uptake of these programs has been very successful! We are now seeing many individuals obtain the ArchiMate 2.0 certificate, training vendors apply for training accreditation, and tool vendors implement the ArchiMate modeling language in Enterprise Architecture modeling tools, all while being certified by The Open Group.

With all these great developments within the last few years – fluent integration with TOGAF and a fast growing number of professionals using ArchiMate – I believe it is safe to say that with ArchiMate 2.0 you are ready for the future of Enterprise Architecture!

Henry Franken is the managing director of BiZZdesign and chair of The Open Group ArchiMate Forum. In that role, Henry led the development of the ArchiMate Version 2.0 standard. Henry is a speaker at many conferences and has co-authored several international publications and Open Group White Papers. Henry is co-founder of the BPM-Forum. At BiZZdesign, Henry is responsible for research and innovation.

Filed under ArchiMate®, Certifications, Enterprise Architecture, Enterprise Transformation, TOGAF®

Video Highlights Day 2 of Washington, D.C.

By The Open Group Conference Team

How can you use the tools of Enterprise Architecture and open standards to improve the capability of your company doing business? The Day 2 speakers of The Open Group Conference in Washington, D.C. addressed this question, focusing on Enterprise Transformation. Sessions included:

  • “Case Study: University Health Network (Toronto),” by Jason Uppal, chief enterprise architect at QR Systems, Inc. and winner of the 2012 Edison Award for Innovation
  • “Future Airborne Capability Environment (FACE™): Transforming the DoD Avionics Software Industry Through the Use of Open Standards,” by Judy Cerenzia, FACE™ program director at The Open Group, Kirk Avery, chief software architect at Lockheed Martin and Philip Minor, director at System of Systems of Engineering Directorate at the Office of Chief Systems Engineer, ASA(ALT)
  • “Using the TOGAF® Architecture Content Framework with the ArchiMate® Modeling Language,” by Henry Franken, CEO of BiZZdesign, and Iver Band, enterprise architect at Standard Insurance

David Lounsbury, CTO of The Open Group, summarizes some of the day’s sessions:

Filed under ArchiMate®, Business Architecture, Certifications, Conference, Cybersecurity, Enterprise Architecture, Enterprise Transformation, FACE™, Information security, TOGAF®, Uncategorized

Summer in the Capitol – Looking Back at The Open Group Conference in Washington, D.C.

By Jim Hietala, The Open Group

This past week in Washington D.C., The Open Group held our Q3 conference. The theme for the event was “Cybersecurity – Defend Critical Assets and Secure the Global Supply Chain,” and the conference featured a number of thought-provoking speakers and presentations.

Cybersecurity is at a critical juncture, and conference speakers highlighted the threat and attack reality and described industry efforts to move forward in important areas. The conference also featured a new capability, as several of the events were Livestreamed to the Internet.

For those who did not make the event, here’s a summary of a few of the key presentations, as well as what The Open Group is doing in these areas.

Joel Brenner, attorney with Cooley, was our first keynote. Joel’s presentation was titled “Turning Us Inside-Out: Crime and Economic Espionage on our Networks.” The talk mirrored his recent book, “America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime, and Warfare,” and Joel talked about current threats to critical infrastructure, attack trends and challenges in securing information. Joel’s presentation was a wakeup call to the very real issues of IP theft and identity theft. Beyond describing the threat and attack landscape, Joel discussed some of the management challenges related to ownership of the problem – namely that the different stakeholders in addressing cybersecurity in companies, including legal, technical, management and HR, all tend to think that it is someone else’s problem. Joel stated the need for policy spanning the entire organization to fully address the problem.

Kristin Baldwin, principal deputy, systems engineering, Office of the Assistant Secretary of Defense, Research and Engineering, described the U.S. Department of Defense (DoD) trusted defense systems strategy and challenges, including requirements to secure their multi-tiered supply chain. She also talked about how the acquisition landscape has changed over the past few years. In addition, for all programs the DoD now requires the creation of a program protection plan, which is the single focal point for security activities on the program. Kristin’s takeaways included needing a holistic approach to security, focusing attention on the threat, and avoiding risk exposure from gaps and seams. DoD’s Trusted Defense Systems Strategy provides an overarching framework for trusted systems. Stakeholder integration with acquisition, intelligence, engineering, industry and research communities is key to success. Systems engineering brings these stakeholders, risk trades, policy and design decisions together. Kristin also stressed the importance of informing leadership early and providing programs with risk-based options.

Dr. Ron Ross of NIST described a perfect storm: the proliferation of information systems and networks and the increasing sophistication of threats, resulting in a growing number of penetrations of information systems in the public and private sectors that potentially affect security and privacy. He proposed the need for an integrated project team approach to information security. Dr. Ross also provided an overview of the changes coming in NIST SP 800-53, version 4, which is presently available in draft form. He also advocated a dual protection strategy involving traditional controls at network perimeters, which assume attackers are outside organizational networks, combined with agile defenses, which assume attackers are already inside the perimeter. The objective of agile defenses is to enable operation while under attack and to minimize response times to ongoing attacks. This new approach mirrors thinking from the Jericho Forum and others on de-perimeterization and security, and is very welcome.

The Open Group Trusted Technology Forum (OTTF) provided a panel discussion on supply chain security issues and the approach the forum is taking toward addressing issues of taint and counterfeit in products. The panel included Andras Szakal of IBM, Edna Conway of Cisco and Dan Reddy of EMC, as well as Dave Lounsbury, CTO of The Open Group. OTTF continues to make great progress in the area of supply chain security, having published a snapshot of the Open Trusted Technology Provider Framework, working to create a conformance program, and working to harmonize with other standards activities.

Dave Hornford, partner at Conexiam and chair of The Open Group Architecture Forum, provided a thought provoking presentation titled, “Secure Business Architecture, or just Security Architecture?” Dave’s talk described the problems in approaches that are purely focused on securing against threats and brought forth the idea that focusing on secure business architecture was a better methodology for ensuring that stakeholders had visibility into risks and benefits.

Geoff Besko, CEO of Seccuris and co-leader of the security integration project for the next version of TOGAF®, delivered a presentation that looked at risk from both a positive and a negative view. He noted that senior management frequently view risk as something to embrace – taking risks with an eye on business gains in revenue, market share or profitability – while security practitioners tend to focus on risk as something to be mitigated. Finding common ground is key here.

Katie Lewin, who is responsible for the GSA FedRAMP program, provided an overview of the program, and how it is helping raise the bar for federal agency use of secure Cloud Computing.

The conference also featured a workshop on security automation, with presentations on a number of standards efforts in this area, including SCAP, O-ACEML from The Open Group, MILE, NEA, AVOS and SACM. One conclusion from the workshop was that there is presently a gap, and a need for a higher-level security automation architecture encompassing the many lower-level protocols and standards that exist in the security automation area.

In addition to the public conference, a number of forums of The Open Group met in working sessions to advance their work in the Capitol. These included:

All in all, the conference clarified the magnitude of the cybersecurity threat, and the importance of initiatives from The Open Group and elsewhere to make progress on real solutions.

Join us at our next conference in Barcelona on October 22-25!

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Security Architecture, Supply chain risk, TOGAF®

TOGAF® 9 Certification Growth – Number of Individuals Certified Doubles in the Last 12 Months – Now Over 14,800

By Andrew Josey, The Open Group

The number of individuals certified in the TOGAF® 9 certification program as of July 1st 2012 is 14,851. This represents a doubling of the number of individuals certified in the last 12 months with 7,640 new certifications during that period. The latest statistics show that certifications are now growing at two thousand individuals per quarter.
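The growth figures quoted above can be sanity-checked with a few lines of arithmetic (an illustrative sketch using only the numbers in this post):

```python
# Sanity-check of the TOGAF 9 certification growth figures quoted above.
total_july_2012 = 14851     # individuals certified as of July 1, 2012
new_last_12_months = 7640   # new certifications in the preceding 12 months

total_july_2011 = total_july_2012 - new_last_12_months
growth_factor = total_july_2012 / total_july_2011
per_quarter = new_last_12_months / 4

print(f"Total a year earlier: {total_july_2011}")
print(f"Growth over 12 months: {growth_factor:.2f}x")
print(f"Average new certifications per quarter: {per_quarter:.0f}")
```

This confirms the claims: roughly 7,200 individuals were certified a year earlier, the total has slightly more than doubled, and new certifications are running at around 1,900-2,000 per quarter.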

TOGAF is being adopted globally. The top five countries are the UK, Netherlands, USA, Australia and India.

Here is a breakdown of individual certifications among the top 20 countries, as of July 2012:

Rank | Individuals | Country | Percentage
1 | 2444 | UK | 16.2%
2 | 1916 | USA | 12.8%
3 | 1607 | Netherlands | 10.8%
4 | 1093 | Australia | 7.3%
5 | 913 | India | 6.1%
6 | 705 | Canada | 4.7%
7 | 637 | South Africa | 4.2%
8 | 524 | Finland | 3.5%
9 | 517 | France | 3.4%
10 | 434 | China | 2.8%
11 | 379 | Norway | 2.5%
12 | 344 | Sweden | 2.3%
13 | 280 | Germany | 1.8%
14 | 271 | Belgium | 1.8%
15 | 244 | United Arab Emirates | 1.6%
16 | 224 | Denmark | 1.5%
17 | 209 | Japan | 1.4%
18 | 176 | New Zealand | 1.1%
19 | 173 | Saudi Arabia | 1.1%
20 | 136 | Czech Republic | 0.9%

There are 43 TOGAF 9 training partners worldwide and 48 accredited TOGAF 9 courses. More information on TOGAF 9 certification, including the official accredited training course calendar and a directory of certified people, can be found on The Open Group website at: http://www.opengroup.org/togaf9/cert/.

(This blog post was edited on August 16)

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Filed under Certifications, Enterprise Architecture, TOGAF®

ArchiMate at the Washington, D.C. Conference #ogDCA

By Iver Band, Standard Insurance Company

The Open Group offers many opportunities to learn about ArchiMate®, the fast-growing visual modeling language standard for Enterprise Architecture. ArchiMate enables enterprise architects to develop rich and clear graphical representations that are accessible to a wide range of stakeholders while providing clear direction to downstream architects and designers. Looking forward to this week’s Washington, D.C. conference, let’s examine the various sessions where attendees can learn more about this modeling language standard.

On Sunday, July 15, start with the ArchiMate 2.0 pre-conference introductory session from 4:30-5:00 p.m. ET led by BiZZdesign CEO and ArchiMate Forum Chair Henry Franken. Right afterward, from 5:00-5:30 ET, learn about ArchiMate certification along with other certifications offered by The Open Group.  Conference attendees can engage further with the language at one of the interactive Learning Lab sessions from 5:30-6:15 p.m. ET.

On Tuesday, July 17, learn how to use the ArchiMate language for architecture projects based on TOGAF®.  From 11:30-12:45 p.m. ET, I will join Henry, and together, we will present an in-depth tutorial on “Using the TOGAF Architecture Content Framework with the ArchiMate Modeling Language.” From 2:00-2:45 p.m. ET,  I will explore how to use ArchiMate to shed light on the complex interplay between people and organizations, and their often conflicting challenges, principles, goals and concerns.  My presentation “Modeling the Backstory with ArchiMate 2.0 Motivation Extension” will demonstrate this approach with a case study on improving customer service. Then, from 2:45-3:30 p.m. ET, The Business Forge Principal Neil Levette will present the session “Using the ArchiMate Standard as Tools for Modeling the Business.” Neil will explain how to use the ArchiMate language with Archi, a free tool, to model key business management mechanisms and the relationships between business motivations and operations. Finally, from 4:00-5:30 p.m. ET, Henry and I will join the “Ask the Experts: TOGAF and ArchiMate” panel to address conference attendee and Open Group member questions.

Don’t miss these opportunities to learn more about this powerful standard!

Iver Band is the vice chair of The Open Group ArchiMate Forum and is an enterprise architect at Standard Insurance Company in Portland, Oregon. Iver chose the TOGAF and ArchiMate standards for his IT organization and applies them enthusiastically to his daily responsibilities. He co-developed the initial examination content for the ArchiMate 2 Certification for People  and made other contributions to the ArchiMate 2 standard. He is TOGAF 9 Certified,  ArchiMate 2 Certified and a Certified Information Systems Security Professional.


Filed under ArchiMate®, Conference, Enterprise Architecture

Leveraging TOGAF to Deliver DoDAF Capabilities

By Chris Armstrong, Armstrong Process Group

In today’s environment of competing priorities and constrained resources, companies and government agencies are in even greater need to understand how to balance those priorities, leverage existing investments and align their critical resources to realize their business strategy. Sound appealing? It turns out that this is the fundamental goal of establishing an Enterprise Architecture (EA) capability. In fact, we have seen some of our clients position EA as the Enterprise Decision Support capability – that is, providing an architecture-grounded, fact-based approach to making business and IT decisions.

Many government agencies and contractors have been playing the EA game for some time — often in the context of mandatory compliance with architecture frameworks, such as the Federal Enterprise Architecture (FEA) and the Department of Defense Architecture Framework (DoDAF). These frameworks often focus significantly on taxonomies and reference models that organizations are required to use when describing their current state and their vision of a future state. We’re seeing a new breed of organizations that are looking past contractual compliance and want to exploit the business transformation dimension of EA.

In the Department of Defense (DoD) world, this is in part due to the new “capability driven” aspect of DoDAF version 2.0, where an organization aligns its architecture to a set of capabilities that are relevant to its mission. The addition of the Capability Viewpoint (CV) in DoDAF 2 enables organizations to describe their capability requirements and how their organization supports and delivers those capabilities. The CV also provides models for representing capability gaps and how new capabilities are going to be deployed over time and managed in the context of an overall capability portfolio.

Another critical difference in DoDAF 2 is the principle of “fit-for-purpose,” which allows organizations to select which architecture viewpoints and models to develop based on mission/program requirements and organizational context. One fundamental consequence of this is that an organization is no longer required to create all the models for each DoDAF viewpoint. They are to select the models and viewpoints that are relevant to developing and deploying their new, evolved capabilities.

While DoDAF 2 does provide some brief guidance on how to build architecture descriptions and subsequently leverage them for capability deployment and management, many organizations are seeking a more well-defined set of techniques and methods based on industry standard best practices.

This is where the effectiveness of DoDAF 2 can be significantly enhanced by integrating it with The Open Group Architecture Framework (TOGAF®) version 9.1, in particular the TOGAF Architecture Development Method (ADM). The ADM not only describes how to develop descriptions of the baseline and target architectures, but also provides considerable guidance on how to establish an EA capability and performing architecture roadmapping and migration planning. Most important, the TOGAF ADM describes how to drive the realization of the target architecture through integration with the systems engineering and solution delivery lifecycles. Lastly, TOGAF describes how to sustain an EA capability through the operation of a governance framework to manage the evolution of the architecture. In a nutshell, DoDAF 2 provides a common vocabulary for architecture content, while TOGAF provides a common vocabulary for developing and using that content.
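The fit-for-purpose selection that makes this marriage work can be made concrete as a simple lookup from ADM phase to chosen DoDAF models. The mapping below is purely illustrative: the phase names follow TOGAF and the model identifiers follow DoDAF 2, but which models belong to which phase is a per-project decision, not a standard correspondence.

```python
# Illustrative sketch only: this phase-to-model mapping is a project decision,
# not an official TOGAF/DoDAF correspondence. The point is to make
# "fit-for-purpose" viewpoint selection an explicit, reviewable artifact
# rather than an implicit habit.
adm_to_dodaf = {
    "Preliminary": ["AV-1"],                     # overview and summary information
    "A: Architecture Vision": ["CV-1", "OV-1"],  # capability vision, operational concept
    "B: Business Architecture": ["OV-2", "OV-5a"],
    "C: Information Systems Architectures": ["SvcV-1", "DIV-2"],
}

def selected_models(phase: str) -> list[str]:
    """Return the DoDAF 2 models chosen as fit for purpose in an ADM phase."""
    return adm_to_dodaf.get(phase, [])

print(selected_models("A: Architecture Vision"))
```

Recording the selection this way keeps the "why would I want to do that?" question answerable for every model an agency commits to producing.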

I hope that those of you in the Washington, D.C. area will join me at The Open Group conference next week, where we’ll continue the discussion of how to deliver DoDAF capabilities using TOGAF. For those of you who can’t make it, I’m pleased to announce that The Open Group will also be delivering a Livestream of my presentation (free of charge) on Monday, July 16 at 2:45 p.m. ET.

Hope to see you there!

Chris Armstrong, president of Armstrong Process Group, Inc., is an internationally recognized thought leader in Enterprise Architecture, formal modeling, process improvement, systems and software engineering, requirements management, and iterative and agile development. Chris represents APG at The Open Group, the Object Management Group and the Eclipse Foundation.

 


Filed under Conference, Enterprise Architecture, TOGAF®

Adapting to an eBook World

By Chris Harding, The Open Group

Have you ever wanted to read something to prepare for a meeting while traveling, but been frustrated by the difficulty of managing paper or a bulky PC? Travelers who read for pleasure have found eBooks a very convenient way to meet their needs. This format is now becoming available for select Open Group standards and guides, so that you can read them more easily when “on the road.”

The eBook format allows the device to lay out the text, rather than trying to fit pre-formatted pages to devices of all shapes and sizes (it is based on HTML). This makes reading an eBook a much easier and more pleasant experience than trying to read a static format such as PDF on a device where the page doesn’t fit.

There are portable electronic devices designed primarily for the purpose of reading digital books – the Amazon Kindle is the best known – but eBooks can also be read on tablets, mobile phones (on which the quality can be surprisingly good) and, of course, on laptops, using free-to-download software apps. The eBook readers are, essentially, small-sized special-purpose tablets with superb text display quality and – a big advantage on a long flight – batteries that can go weeks rather than hours without re-charging. As the quality and battery life of tablets continues to improve, they are starting to overtake specialized reader devices, which have one major disadvantage: a lack of standardization.

There are a number of different eBook formats, the most prominent being EPUB, an open standard created by the International Digital Publishing Forum, KF8, the proprietary format used by Amazon Kindle, and Mobipocket, a format that the Kindle will also handle (There is an excellent Wikipedia article on eBook formats, see http://en.wikipedia.org/wiki/Comparison_of_e-book_formats). You can read any of the most popular formats on a tablet (or PC, Mac, iPhone or Android device) using a software app, but you are likely to find that a specialized reader device is limited in the formats that it can handle.
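Since EPUB is an open standard, the “based on HTML” point is easy to see directly: an EPUB file is a ZIP container holding XHTML content plus a little metadata. The sketch below builds a deliberately minimal, incomplete EPUB in memory; the file names follow the EPUB container layout, but a real publication would also need a package document, metadata, and so on.

```python
# Minimal sketch: an EPUB is a ZIP archive whose readable content is XHTML,
# which is why reader software can reflow the text for any screen size.
# Deliberately incomplete (no package metadata), just enough to show the
# container structure.
import io
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    # Per the EPUB container spec, "mimetype" comes first and is stored
    # uncompressed (ZipInfo defaults to ZIP_STORED).
    z.writestr(zipfile.ZipInfo("mimetype"), "application/epub+zip")
    z.writestr(
        "META-INF/container.xml",
        '<?xml version="1.0"?>'
        '<container version="1.0" '
        'xmlns="urn:oasis:names:tc:opendocument:xmlns:container">'
        '<rootfiles><rootfile full-path="content.opf" '
        'media-type="application/oebps-package+xml"/></rootfiles>'
        "</container>",
    )
    # The chapter itself is ordinary (X)HTML.
    z.writestr("chapter1.xhtml", "<html><body><p>Hello, eBook.</p></body></html>")

names = zipfile.ZipFile(io.BytesIO(buf.getvalue())).namelist()
print(names)  # the whole "book" is just files in a ZIP container
```

Unzipping any EPUB you own and opening the XHTML files in a browser makes the same point without writing any code.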

Many of the Open Group SOA Standards and Guides are now freely available in the EPUB and Mobipocket formats from The Open Group bookstore. See http://soa-standards.opengroup.org/post/eBook-Versions-of-SOA-Standards-and-Guides-5884765 for the current list. We are hoping to make all our new SOA standards and guides available in this way, and also some Open Group publications on Cloud Computing. EPUB versions of TOGAF® Version 9.1, the TOGAF 9.1 Pocket Guide and the TOGAF 9 study guides are available for purchase from The Open Group’s official publisher, Van Haren. The SOA and TOGAF EPUBs can be obtained from The Open Group bookstore at http://www.opengroup.org/bookstore/catalog.

Thirty years ago, I used to attend meetings of the CCITT (now the ITU-T) in Geneva. The trolleys that were pushed around the UN building, piled high with working documents for distribution to delegates, were an impressive sight, but the sheer weight of paper that had to be carried to and from the meetings was a real problem. Laptops with Internet access have removed the need to carry documents. Now, eBooks are making it easy to read them while traveling!

We have started to make eBook versions of our standards and guides available and are still exploring the possibilities. We’d love to hear your thoughts on what will or won’t work, and what will work best.  Please feel free to share your ideas in the comments section below.

Andrew Josey, director of standards at The Open Group, contributed to the technical aspects of this blog post. 

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Filed under Cloud/SOA, TOGAF®

Learn How Enterprise Architects Can Better Relate TOGAF and DoDAF to Bring Best IT Practices to Defense Contracts

By Dana Gardner, Interarbor Solutions

This BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference in Washington, D.C., beginning July 16. The conference will focus on Enterprise Architecture (EA), enterprise transformation, and securing global supply chains.

We’re joined by one of the main speakers at the July 16 conference, Chris Armstrong, President of Armstrong Process Group, to examine how governments in particular are using various frameworks to improve their architectural planning and IT implementations.

Armstrong is an internationally recognized thought leader in EA, formal modeling, process improvement, systems and software engineering, requirements management, and iterative and agile development.

He represents the Armstrong Process Group at The Open Group, the Object Management Group (OMG), and the Eclipse Foundation. At The Open Group, Armstrong also co-chairs The Open Group Architecture Framework (TOGAF®) and Model Driven Architecture (MDA) process modeling efforts, as well as the TOGAF 9 Tool Certification program.

At the conference, Armstrong will examine the use of TOGAF 9 to deliver Department of Defense (DoD) Architecture Framework or DoDAF 2 capabilities. And in doing so, we’ll discuss how to use TOGAF architecture development methods to drive the development and use of DoDAF 2 architectures for delivering new mission and program capabilities. His presentation will also be Livestreamed free from The Open Group Conference. The full podcast can be found here.

Here are some excerpts:

Gardner: TOGAF and DoDAF, where have they been? Where are they going? And why do they need to relate to one another more these days?

Armstrong: TOGAF [forms] a set of essential components for establishing and operating an EA capability within an organization. And it contains three of the four key components of any EA.

First, the method by which EA work is done, including how it touches other life cycles within the organization and how it’s governed and managed. Then, there’s a skills framework that talks about the skills and experiences that the individual practitioners must have in order to participate in the EA work. Then, there’s a taxonomy framework that describes the semantics and form of the deliverables and the knowledge that the EA function is trying to manage.

One-stop shop

One of the great things that TOGAF has going for it is that, on the one hand, it’s designed to be a one-stop shop — namely, providing everything that an end-user organization might need to establish an EA practice. But it does acknowledge that there are other components, predominantly in the various taxonomies and reference models, that various end-user organizations may want to substitute or augment.

It turns out that TOGAF has a nice synergy with other taxonomies, such as DoDAF, as it provides the backdrop for how to establish the overall EA capability, how to exploit it, and put it into practice to deliver new business capabilities.

Frameworks, such as DoDAF, focus predominantly on the taxonomy, mainly the kinds of things we’re keeping track of, the semantics relationships, and perhaps some formalism on how they’re structured. There’s a little bit of method guidance within DoDAF, but not a lot. So we see the marriage of the two as a natural synergy.

Gardner: So their complementary natures allow for more particulars on the defense side, while TOGAF overall looks at the implementation method and skills for how this works best. Is this something new, or are we just learning to do it better?

Armstrong: I think we’re seeing the state of the industry advance, and we’re looking at trying to have the federal government, both in the United States and abroad, embrace global industry standards for EA work. Historically, particularly in the US government, a lot of defense agencies and their contractors have been focusing on a minimalistic compliance perspective with respect to DoDAF. In order to get paid for this work or be authorized to do this work, one of the requirements is that we must produce DoDAF.

People are doing that because they’ve been commanded to do it. We’re seeing a new level of awareness. There’s some synergy with what’s going on in the DoDAF space, particularly as it relates to migrating from DoDAF 1.5 to DoDAF 2.

Agencies need some method and technique guidance on exactly how to come up with the particular viewpoints that are going to be most relevant, and how to exploit what DoDAF has to offer in a way that advances the business, as opposed to merely being conforming or compliant.

Gardner: Have there been hurdles, perhaps culturally, because of the landscape of these different companies and their inability to have that boundary-less interaction? What’s been the hurdle? What’s prevented this from being more beneficial at that higher level?

Armstrong: Probably overall organizational and practitioner maturity. There certainly are a lot of very skilled organizations and individuals out there. However, we’re trying to get them all lined up with the best practice for establishing an EA capability, and then operating it and using it to business strategic advantage, something that TOGAF defines very nicely and into which the DoDAF taxonomy and work products fit very effectively.

Gardner: Help me understand, Chris. Is the discussion that you’ll be delivering on July 16 primarily for TOGAF people to better understand how to implement vis-à-vis DoDAF? Is it the other direction, or is it a two-way street?

Two-way street

Armstrong: It’s a two-way street. One of the big things that particularly the DoD space has going for it is that there’s quite a bit of maturity in the notion of formally specified models, as DoDAF describes them, and the various views that DoDAF includes.

We’d like to think that, because of that maturity, the general TOGAF community can glean a lot of benefit from the experience they’ve had: what it takes to capture these architecture descriptions, and some of the finer points of managing those assets. People within the general TOGAF community are always looking for case studies and best practices that demonstrate that what other people are doing is something they can do as well.

We also think that the federal agency community also has a lot to glean from this. Again, we’re trying to get some convergence on standard methods and techniques, so that they can more easily have resources join their teams and immediately be productive and add value to their projects, because they’re all based on a standard EA method and framework.

One of the major changes between DoDAF 1 and DoDAF 2 is the focusing on fitness for purpose. In the past, a lot of organizations felt that it was their obligation to describe all architecture viewpoints that DoDAF suggests without necessarily taking a step back and saying, “Why would I want to do that?”

So it’s trying to make the agencies think more critically about how they can be the most agile: namely, what is the least amount of architecture description we can invest in that has the greatest possible value? Organizations now have the discretion to determine what fitness for purpose is.

Then, there’s the whole idea in DoDAF 2, that the architecture is supposed to be capability-driven. That is, you’re not just describing architecture, because you have some tools that happened to be DoDAF conforming, but there is a new business capability that you’re trying to inject into the organization through capability-based transformation, which is going to involve people, process, and tools.

One of the nice things that TOGAF’s architecture development method has to offer is a well-defined set of activities and best practices for deciding how you determine what those capabilities are and how you engage your stakeholders to really help collect the requirements for what fit for purpose means.

Gardner: As with the private sector, it seems that everyone needs to move faster. I see you’ve been working on agile development with organizations like the OMG and Eclipse. Is there something about doing this well — bringing the best of TOGAF and DoDAF together — that enables greater agility and speed when it comes to completing a project?

Different perspectives

Armstrong: Absolutely. When you talk about what agile means to the general community, you may get a lot of different perspectives and a lot of different answers. Ultimately, we at APG feel that agility is fundamentally about how well your organization responds to change.

If you take a step back, that’s really what we think is the fundamental litmus test of the goodness of an architecture. Whether it’s an EA, a segment architecture, or a system architecture, the architects need to think thoughtfully and considerately about what things are almost certainly going to happen in the near future. I need to anticipate, and be able to work these into my architecture in such a way that when these changes occur, the architecture can respond in a timely, relevant fashion.

A lot of people think that agile is just a pseudonym for not planning, not making commitments, and going around in circles forever. We call that chaos, another five-letter word. But agile, in our experience, really demands rigor and discipline.

Of course, a lot of the culture of the DoD brings that rigor and discipline, but so does the experience that community has had, in particular, with formally modeling architecture descriptions. That sets those government agencies up to act with agility much more readily than others.

Gardner: Do you know of anyone that has done it successfully or is in the process? Even if you can’t name them, perhaps you can describe how something like this works?

Armstrong: First, there has been some great work done by the MITRE organization through their work in collaboration at The Open Group. They’ve written a white paper that talks about which DoDAF deliverables are likely to be useful in specific architecture development method activities. We’re going to be using that as a foundation for the talk we’re going to be giving at the conference in July.

The biggest thing that TOGAF has to offer is that a nascent organization that’s jumping into the DoDAF space may just look at it from an initial compliance perspective, saying, “We have to create an AV-1, and an OV-1, and a SvcV-5,” and so on.

Providing guidance

TOGAF will provide the guidance for what is EA. Why should I care? What kind of people do I need within my organization? What kind of skills do they need? What kind of professional certification might be appropriate to get all of the participants up on the same page, so that when we’re talking about EA, we’re all using the same language?

TOGAF also, of course, has a great emphasis on architecture governance and suggests that immediately, when you’re first propping up your EA capability, you need to put into your plan how you’re going to operate and maintain these architectural assets, once they’ve been produced, so that you can exploit them in some reuse strategy moving forward.

So, the preliminary phase of the TOGAF architecture development method provides those agencies best practices on how to get going with EA, including exactly how an organization is going to exploit what the DoDAF taxonomy framework has to offer.

Then, once an organization or a contractor is charged with doing some DoDAF work, because of a new program or a new capability, they would immediately begin executing Phase A: Architecture Vision, and follow the best practices that TOGAF has to offer.

Just what is that capability that we’re trying to describe? Who are the key stakeholders, and what are their concerns? What are their business objectives and requirements? What constraints are we going to be placed under?

Part of that is to create a high-level description of the current, or baseline, architecture and of the future target state, so that all parties have at least a coarse-grained idea of where we’re at right now and what our vision is of where we want to be.

Because this is really a high-level requirements and scoping set of activities, we expect that it’s going to be somewhat ambiguous. As the project unfolds, they’re going to discover details that may cause some adjustment to that final target.

Internalize best practices

So, we’re seeing defense contractors being able to internalize some of these best practices and really be prepared for the future, so that they can win the greatest amount of business and respond as rapidly and appropriately as possible, as well as exploit these best practices to effect greater business transformation across their enterprises.

Gardner: We mentioned that your discussion of these issues on July 16 will be Livestreamed for free, but you’re also doing some pre-conference and post-conference activities — webinars, and other things. Tell us how this is all coming together and, for those who are interested, how they can take advantage of it.

Armstrong: We’re certainly very privileged that The Open Group has offered us this opportunity to share this content with the community. On Monday, June 25, we’ll be delivering a webinar that focuses on architecture change management in the DoDAF space, particularly how an organization migrates from DoDAF 1 to DoDAF 2.

I’ll be joined by a couple of other people from APG: David Rice, one of our Principal Enterprise Architects and a member of the DoDAF 2 Working Group, and J.D. Baker, co-chair of the OMG’s Analysis and Design Task Force and a member of the work group for the Unified Profile for DoDAF and MODAF (UPDM), an OMG specification.

We’ll be talking about things that organizations need to think about as they migrate from DoDAF 1 to DoDAF 2. We’ll be focusing on some of the key points of the DoDAF 2 meta-model, namely the rearrangement of the architecture viewpoints and the architecture partitions and how that maps from the classical DoDAF 1.5 viewpoint, as well as focusing on this notion of capability-driven architectures and fitness for purpose.

We also have the great privilege after the conference to be delivering a follow-up webinar on implementation methods and techniques around advanced DoDAF architectures. Particularly, we’re going to take a closer look at something that some people may be interested in, namely tool interoperability and how the DoDAF meta-model offers that through what’s called the Physical Exchange Specification (PES).

We’ll be taking a look a little bit more closely at this UPDM thing I just mentioned, focusing on how we can use formal modeling languages based on OMG standards, such as UML, SysML, BPMN, and SoaML, to do very formal architectural modeling.

One of the big challenges with EA is that, at the end of the day, EA comes up with a set of policies, principles, assets, and best practices that talk about how the organization needs to operate and realize new solutions within that new framework. If EA doesn’t have a hand-off to the delivery method, namely systems engineering and solution delivery, then none of this architecture stuff makes a bit of difference.

Driving the realization

We’re going to be talking a little bit about how DoDAF-based architecture descriptions and TOGAF can drive the realization of those capabilities through traditional systems engineering and software development methods.

************

For more information on The Open Group’s upcoming conference in Washington, D.C., please visit: http://www.opengroup.org/dc2012

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.


Filed under Conference, Enterprise Architecture, Enterprise Transformation, TOGAF®

Cybersecurity Threats Key Theme at Washington, D.C. Conference – July 16-20, 2012

By The Open Group Conference Team

Identifying risks and eliminating vulnerabilities that could undermine integrity and supply chain security is a significant global challenge and a top priority for governments, vendors, component suppliers, integrators and commercial enterprises around the world.

The Open Group Conference in Washington, D.C. will bring together leading minds in technology and government policy to discuss issues around cybersecurity and how enterprises can establish and maintain the necessary levels of integrity in a global supply chain. In addition to tutorial sessions on TOGAF and ArchiMate, the conference offers approximately 60 sessions on a variety of topics, including:

  • Cybersecurity threats and key approaches to defending critical assets and securing the global supply chain
  • Information security and Cloud security for global, open network environments within and across enterprises
  • Enterprise transformation, including Enterprise Architecture, TOGAF and SOA
  • Cloud Computing for business, collaborative Cloud frameworks and Cloud architectures
  • Transforming DoD avionics software through the use of open standards

Keynote sessions and speakers include:

  • America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime and Warfare – Keynote Speaker: Joel Brenner, author and attorney at Cooley LLP
  • Meeting the Challenge of Cybersecurity Threats through Industry-Government Partnerships – Keynote Speaker: Kristin Baldwin, principal deputy, deputy assistant secretary of defense for Systems Engineering
  • Implementation of the Federal Information Security Management Act (FISMA) – Keynote Speaker: Dr. Ron Ross, project leader at NIST (TBC)
  • Supply Chain: Mitigating Tainted and Counterfeit Products – Keynote Panel: Andras Szakal, VP and CTO at IBM Federal; Daniel Reddy, consulting product manager in the Product Security Office at EMC Corporation; John Boyens, senior advisor in the Computer Security Division at NIST; Edna Conway, chief security strategist of supply chain at Cisco; and Hart Rossman, VP and CTO of Cyber Security Services at SAIC
  • The New Role of Open Standards – Keynote Speaker: Allen Brown, CEO of The Open Group
  • Case Study: Ontario Healthcare – Keynote Speaker: Jason Uppal, chief enterprise architect at QRS
  • Future Airborne Capability Environment (FACE): Transforming the DoD Avionics Software Industry Through the Use of Open Standards – Keynote Speaker: Judy Cerenzia, program director at The Open Group; Kirk Avery of Lockheed Martin; and Robert Sweeney of Naval Air Systems Command (NAVAIR)

The full program can be found here: http://www3.opengroup.org/events/timetable/967

For more information on the conference tracks or to register, please visit our conference registration page. Please stay tuned throughout the next month as we continue to release blog posts and information leading up to The Open Group Conference in Washington, D.C. and be sure to follow the conference hashtag on Twitter – #ogDCA!


Filed under ArchiMate®, Cloud, Cloud/SOA, Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Standards, Supply chain risk

Part 3 of 3: Building an Enterprise Architecture Value Proposition Using TOGAF® 9.1. and ArchiMate® 2.0

By Serge Thorn, Architecting the Enterprise

This is the third and final post in a three-part series by Serge Thorn. For more in this series, please see Part One and Part Two.

Value Management uses a combination of concepts and methods to create sustainable value for both organizations and their stakeholders. Some tools and techniques are specific to Value Management, and others are generic tools that many organizations and individuals use. Many Value Management techniques exist, such as cost-benefit analysis, SWOT analysis, value analysis, Pareto analysis, objectives hierarchy, function analysis system technique (FAST), and more.

The one I suggest for illustration is close to the objectives hierarchy technique, a diagrammatic process for identifying objectives in a hierarchical manner, often used in conjunction with business functions. Close, because I will use a combination of the TOGAF® 9.1 metamodel with the ArchiMate® 2.0 Business Layer, Application Layer and Motivation Extension metamodels, and will consider core entities such as value, business goals, objectives, business processes and functions, business and application services, and application functions and components. This approach was inspired by the presentation by Michael van den Dungen and Arjan Visser at The Open Group Conference in Amsterdam in 2010; here I am also adding some ArchiMate 2.0 concepts.

First, the entities from the TOGAF 9.1 metamodel:

Then I will consider the entities from ArchiMate 2.0. Some may be identical to TOGAF 9.1. In the Business Layer, one key concept will obviously be the value. In this case I will consider the product (“A coherent collection of services, accompanied by a contract/set of agreements, which is offered as a whole to (internal or external) customers” according to ArchiMate 2.0), as the Enterprise Architecture program. In addition to that, I would refer to business services, functions, and processes.

From the Motivation Extension metamodel, I will use the goals. The objective entity in TOGAF 9.1 can also be represented using the ArchiMate "goal" concept.

And in the Application Layer Metamodel, application services, functions, and components.

It is important to mention that when we deliver a value proposition, we must demonstrate to the business, with concrete examples, where the benefits will be. For example: if the business sees Operational Excellence and Customer Intimacy as key drivers, you will soon realize that BPM suites or CRM systems could support those business goals. This is why we also consider the Application Layer metamodel.

We could then use a combination of ArchiMate 2.0 viewpoints, such as the Stakeholder Viewpoint, the Goal Realization Viewpoint or the Motivation Viewpoint, to demonstrate the value of Enterprise Architecture for a specific business transformation program (or any other strategic initiative).

It should be mentioned that the concept of benefit does not exist in either metamodel.

I have therefore added the concept as an extension to ArchiMate in the following diagram, which maps the value to a program related to the improvement of customer relationships. I have also intentionally limited the number of concepts and entities shown, such as processes, application services and measures.

Using these ArchiMate 2.0 modelling techniques, you can demonstrate to your stakeholders the value proposition for a business program supported by an Enterprise Architecture initiative.

As a real example, if the expected key business benefit is operational excellence through process controls (which would be represented as a goal), you could present a high-level diagram like this to explain why an application component such as a BPM Suite could help (detecting fraud and errors, embedding preventive controls, continuously auditing and monitoring processes, and more).
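The chain behind such a diagram can also be sketched in code. The following is a minimal, hypothetical model (the names are illustrative and this is not ArchiMate exchange-format syntax) tracing a goal down to the application components that realize it, in the spirit of the Goal Realization Viewpoint:

```python
# Hypothetical goal-realization chain: goal -> business process ->
# application service -> application component, with the expected benefits.
model = {
    "goal": "Operational excellence through process controls",
    "realized_by": [
        {
            "business_process": "Order-to-cash",
            "application_service": "Process monitoring and control",
            "application_component": "BPM Suite",
            "benefits": [
                "detect fraud and errors",
                "embed preventive controls",
                "continuously audit and monitor processes",
            ],
        }
    ],
}

def components_realizing(model, goal):
    """List the application components that (indirectly) realize a goal."""
    if model["goal"] != goal:
        return []
    return [r["application_component"] for r in model["realized_by"]]

print(components_realizing(model, "Operational excellence through process controls"))
# → ['BPM Suite']
```

A real model would of course live in an Enterprise Architecture repository or modelling tool rather than in ad hoc dictionaries; the point is only that each value claim is traceable along an explicit chain.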

There is definitely no single way of demonstrating the value of Enterprise Architecture, and you will probably have to adapt the process, and the way you present that value, to each company you work with. Without a doubt, Enterprise Architecture contributes to the success of an organization and brings numerous benefits, but very often its practitioners need to be able to demonstrate that value. Using techniques such as those described above will help to justify the initiative.

The next steps will be the development of measures, metrics and KPIs to continuously monitor that value proposition.

Serge Thorn is CIO of Architecting the Enterprise.  He has worked in the IT Industry for over 25 years, in a variety of roles, which include; Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management forum) Swiss chapter and is based in Geneva, Switzerland.


Filed under ArchiMate®, Enterprise Architecture, Enterprise Transformation, TOGAF®

Part 2 of 3: Building an Enterprise Architecture Value Proposition Using TOGAF® 9.1 and ArchiMate® 2.0

By Serge Thorn, Architecting the Enterprise

This is the second post in a three-part series by Serge Thorn.

Continuing from Part One of this series, here are more examples of what an enterprise cannot achieve without Enterprise Architecture:

Reduce IT costs by consolidating, standardizing, rationalizing and integrating corporate information systems

Cost avoidance can be achieved by identifying overlapping functional scope of two or more proposed projects in an organization or the potential cost savings of IT support by standardizing on one solution.

Consolidation can happen at various levels for architectures — for shared enterprise services, applications and information, for technologies and even data centers.

This could involve consolidating the number of database servers, application or web servers and storage devices, consolidating redundant security platforms, or adopting virtualization, grid computing and related consolidation initiatives. Consolidation may be a by-product of another technology transformation or it may be the driver of these transformations.

Whatever motivates the change, the key is, once again, alignment with the overall business strategy. Enterprise architects understand where the business is going, so they can pick the appropriate consolidation strategy. Rationalization, standardization and consolidation processes help organizations understand their current maturity level and move forward on the appropriate roadmap.

More spending on innovation

Enterprise Architecture should serve as a driver of innovation. Innovation is highly important when developing a target Enterprise Architecture and in realizing the organization’s strategic goals and objectives. For example, it may help to connect the dots between business requirements and the new approaches SOA and cloud services can deliver.

Enabling strategic business goals via better operational excellence

Enterprise Architecture defines the structure and operation of an organization. Its intent is to determine how an organization can most effectively achieve its current and future objectives, and it must be designed to support the organization's specific business strategies.

In "Enterprise Architecture as Strategy: Creating a Foundation for Business Execution," Jeanne W. Ross, Peter Weill and David C. Robertson wrote that "companies with more-mature architectures reported greater success in achieving strategic goals" (p. 89), including "better operational excellence, more customer intimacy, and greater product leadership" (p. 100).

Customer intimacy

Enterprises that are customer focused and aim to provide solutions for their customers should design their business model, IT systems and operational activities to support this strategy at the process level. This involves the selection of one or a few high-value customer niches, followed by an obsessive effort at getting to know these customers in detail.

Greater product leadership

This approach, enabled by Enterprise Architecture, is dedicated to providing the best possible products in terms of the features and benefits offered to the customer. It reflects a basic philosophy of products that push performance boundaries. Products or services delivered by the business are refined by leveraging IT to do the end customer's job better, accomplished through the delivery of new business capabilities (e.g., online self-service websites, business intelligence).

Comply with regulatory requirements

Enterprise Architecture helps companies to know and represent their processes and systems and how they correlate. This is fundamental for risk management and for managing regulatory requirements, such as those derived from Sarbanes-Oxley, COSO, HIPAA, etc.

This list could be continued, as there are many other reasons why Enterprise Architecture brings benefits to organizations. Once your benefits have been documented, you could also consider some value management techniques. TOGAF® 9.1 refers, in the Architecture Vision phase, to a target value proposition for a specific project. In the next post, we'll address applying the value proposition to the Enterprise Architecture initiative as a whole.

The third and final part of this blog series will discuss value management. 

Serge Thorn is CIO of Architecting the Enterprise.  He has worked in the IT Industry for over 25 years, in a variety of roles, which include; Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management forum) Swiss chapter and is based in Geneva, Switzerland.


Filed under ArchiMate®, Enterprise Architecture, Enterprise Transformation, TOGAF®

Part 1 of 3: Building an Enterprise Architecture Value Proposition Using TOGAF® 9.1 and ArchiMate® 2.0

By Serge Thorn, Architecting the Enterprise

This is the first post in a three-part series by Serge Thorn. 

When Enterprise Architecture is introduced as a program or initiative, it is regularly done from an IT perspective, rarely considering what the costs will be or whether there will be any return on investment. This presents a particular challenge to Enterprise Architecture.

Generally speaking, IT departments have all sorts of criteria to justify projects and measure their performance: measurements, metrics and KPIs. At the solution level, they commonly use indicators such as percentage uptime for systems (from the system management team), error rates for applications (from the development support team) or the number of calls resolved on first contact (from the service desk). These KPIs are usually defined at an early stage and very often delivered in dashboards from various support applications.

On the other hand, it is much more difficult to define and implement a quantifiable measure for Enterprise Architecture. Many activities introduced with appropriate governance will enhance the quality of the delivered products and services, but it still will be a challenge to attribute results to the quality of Enterprise Architecture efforts.

This being said, Enterprise Architects should be able to define and justify the benefits of their activities to their stakeholders, and to help executives understand how Enterprise Architecture will contribute to the primary value-adding objectives and processes, before starting the journey. The better this is described and understood, the more support the Enterprise Architecture team will gain from management. Enterprise Architecture brings plenty of contributions, and they will have to be documented and presented at an early stage.

There won't be one single answer to demonstrating the value of an Enterprise Architecture, but there seems to be a common pattern in the feedback from the various companies I have worked with.

Without Enterprise Architecture you can probably NOT fully achieve:

IT alignment with the business goals

As one example among others, the problem with most IT plans is that they do not indicate what the business value is or what strategic or tactical business benefit the organization expects to achieve. The simple fact is that any IT plan also needs a business metric, not only an IT metric of delivery. Another aspect is the ability to create a common vision of the future, shared by the business and IT communities.

Integration

With the rapid pace of change in the business environment, the need to transform organizations into agile enterprises that can respond quickly to change has never been greater. Methodologies and technologies are needed to enable rapid business and system change, and the solution also lies in enterprise integration (both business and technology integration).

For business integration, we use Enterprise Architecture methodologies and frameworks to integrate functions, processes, data, locations, people, events and business plans throughout an organization. Specifically, the unification and integration of business processes and data across the enterprise and potential linkage with external partners become more and more important.

For technology integration, we may use enterprise portals, enterprise application integration (EAI/ESB), web services, service-oriented architecture (SOA) and business process management (BPM), and try to reduce the number of point-to-point interfaces.

Change management

In recent years the scope of Enterprise Architecture has expanded beyond the IT domain and enterprise architects are increasingly taking on broader roles relating to organizational strategy and change management. Frameworks such as TOGAF® 9.1 include processes and tools for managing both the business/people and the technology sides of an organization. Enterprise Architecture supports the creation of changes related to the various architecture domains, evaluating the impact on the enterprise, taking into account risk management, financial aspects (cost/benefit analysis), and most importantly ensuring alignment with business goals and objectives. Enterprise Architecture value is essentially tied to its ability to help companies to deal with complexity and changes.

Reduced time to market and increased IT responsiveness

Enterprise Architecture should reduce timeframes for systems development, application generation and the modernization of legacy systems. It should also decrease resource requirements. All of this can be accomplished by reusing standards or existing components, such as the architecture and solution building blocks in TOGAF 9.1. Delivery time and design/development costs can also be decreased by reusing reference models. All of that information should be managed in an Enterprise Architecture repository.

Better access to information across applications and improved interoperability

Data and information architectures manage the organization's information assets optimally and efficiently. This supports the quality, accuracy and timely availability of data for executive and strategic business decision-making, across applications.

Readily available descriptive representations and documentation of the enterprise

An architecture is also a set of descriptive representations (i.e., "models") relevant for describing an enterprise such that it can be produced to management's requirements and maintained over its useful life. Using an architecture repository, developing a variety of artifacts and modelling key elements of the enterprise will all contribute to building this documentation.

The second part of the series will include more examples of what an enterprise cannot achieve without Enterprise Architecture. 

Serge Thorn is CIO of Architecting the Enterprise.  He has worked in the IT Industry for over 25 years, in a variety of roles, which include; Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management forum) Swiss chapter and is based in Geneva, Switzerland.


Filed under ArchiMate®, Enterprise Architecture, Enterprise Transformation, TOGAF®

When Was Your Last Enterprise Architecture Maturity Assessment?

By Serge Thorn, Architecting the Enterprise

Every company should plan regular architecture capability maturity assessments using a model that represents the key components of a productive Enterprise Architecture process. Such a model provides an evolutionary way to improve the overall process, which starts out in an ad hoc state, transforms into an immature process, and finally becomes a well-defined, disciplined, managed and mature process. The goal is to improve the overall odds of success for the Enterprise Architecture by identifying weak areas and providing a defined path towards improvement. As the architecture matures, it should increase the benefits it offers the organization.

Architecture maturity assessments help to determine how companies can maximize competitive advantage, identify ways of cutting costs, improve the quality of services and reduce time to market. These assessments are undertaken as part of Enterprise Architecture management. There are several methodologies for assessing overall Enterprise Architecture maturity, such as the U.S. Department of Commerce ACMM, The Open Group's architecture maturity model, and the BSC-based Architecture Score Card presented by IFEAD. For application or technology portfolios, portfolio evaluation models can be used.

As part of project development, assessments (in reality, compliance reviews) of architecture solutions are made against the business objectives and requirements (desired process and service structures and business models) and the constraints derived from the Enterprise Architecture context (standards, principles, policies or other restrictions on solution development). Assessment and compliance of technologies are also a central part of Enterprise Architecture development projects. Finally, the development of Enterprise Architectures undergoes the scrutiny of whatever software development quality assurance method is in use; many IT providers have adopted a comprehensive approach such as CMMI or ISO/IEC 15504 (known as SPICE).

Using the Architecture Capability Maturity Model from TOGAF® 9.1 is a great way of evaluating how companies have implemented the framework and of identifying the gaps between the business vision and the business capabilities. Unfortunately, few adequate assessment instruments or tools have been developed for TOGAF-based assessments.

Instruments or tools should contain maturity and documentation assessment questionnaires and a method for conducting the assessment. The following example shows four phases for running an assessment:

Phase 1 would include several steps:

  • Planning & preparation workshop with the stakeholders. Stakeholders should represent both Business and IT.
  • Interviews with stakeholders based on a questionnaire covering all process areas (elements in TOGAF) or domains with characteristics important to the development and deployment of Enterprise Architecture. Each process area can be divided into a number of practices: statements that describe the process area at each level of maturity, on a scale of 0 to 5. Each practice has a set of practice indicators, evidence that the requirements for a process area to be at a given level have been met. A set of interview questions establishes whether or not the practice indicators exist, and thus the level of maturity for the practice. If all the practices for a given level within a process area are present, then that level is achieved. Ideally, directly relevant documentary evidence is provided to demonstrate that a practice indicator exists; as this is not always practical, sometimes only verbal evidence from subject matter experts is considered.
  • Production of a report.
  • Calculation of a maturity score. For the representation, we use the term maturity level or organizational maturity, as described below.
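The scoring rule described above (a process area reaches a level only when all practices up to and including that level are evidenced) can be sketched as follows. The process area and its interview evidence below are hypothetical:

```python
def maturity_level(practices_met, max_level=5):
    """practices_met maps level -> list of booleans, one per practice at
    that level. A level counts only if every practice at it, and at every
    level below it, has its indicators evidenced."""
    level = 0
    for lvl in range(1, max_level + 1):
        if practices_met.get(lvl) and all(practices_met[lvl]):
            level = lvl
        else:
            break  # a gap at this level caps the maturity score
    return level

architecture_process = {  # hypothetical evidence from stakeholder interviews
    1: [True, True],
    2: [True, True, True],
    3: [True, False],  # one level-3 practice lacks documentary evidence
    4: [True],         # isolated higher-level evidence does not count
}
print(maturity_level(architecture_process))  # → 2
```

An overall organizational maturity score could then be derived, for instance as the average level across all process areas, which is the kind of figure the Phase 1 report would present.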

Sources

  • CMMI for Development (Version 1.2, 2006)
  • Appraisal Requirements for CMMI (ARC) (Version 1.2, 2006)
  • The U.S. Department of Commerce Enterprise Architecture Capability Maturity Model (2007)
  • TOGAF® 9.1
  • NASCIO Enterprise Architecture Maturity Model (Version 1.3, 2003)

We then deliver a report which includes the maturity of each process area or element (there are more elements in this example than in Chapter 51 of TOGAF® 9.1).

A radar chart may also be a nice way to present the results. Please see the example below:

  • Presentation of the report to the stakeholders with strengths, weaknesses, gap analysis and recommendations
  • Next steps

Phase 2 would include several steps:

  • Based on results from Phase 1, a consensus workshop would produce a roadmap and action plan with recommendations to attain the next required level of maturity.
  • The workshop would provide a tool to produce an objective view of the report delivered in Phase 1. This gives stakeholders and the senior management team a detailed view of how well Enterprise Architecture is deployed in the organization, a full understanding of the business drivers and organizational issues, and a clear view of where the stakeholders want the organization to be. The outputs of this phase are priorities and an action plan agreed and understood by the key stakeholders in the organization. There could also be a target radar diagram, as shown below:
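The gap analysis behind such a target diagram amounts to comparing the current and target maturity scores per process area and ranking the gaps. A minimal sketch, with entirely hypothetical process areas and scores:

```python
# Hypothetical current (Phase 1) and target (Phase 2 workshop) maturity scores.
current = {"Architecture process": 2, "Business linkage": 1,
           "Senior management involvement": 3, "Governance": 2}
target = {"Architecture process": 4, "Business linkage": 4,
          "Senior management involvement": 3, "Governance": 4}

# Gap per process area, then process areas ranked by gap size: the action
# plan would address the largest gaps first.
gaps = {area: target[area] - current[area] for area in current}
priorities = sorted(gaps, key=gaps.get, reverse=True)
print(priorities[0])  # → Business linkage
```

In a real engagement the target levels come out of the consensus workshop, not from the assessors, so that the resulting priorities are owned by the stakeholders.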

The updated report may then look like this:

Phase 3 would be the management of Enterprise Architecture as described in the report, and Phase 4 would be a re-assessment similar to Phase 1.

Conducting an evaluation of an organization's current practices against an architecture capability maturity model allows a company to determine the level at which it currently stands. It will indicate the organization's maturity in the area of Enterprise Architecture and highlight the practices the organization needs to focus on in order to see the greatest improvement and the highest return on investment. The recommendation is to carry out assessments annually.

Serge Thorn is CIO of Architecting the Enterprise.  He has worked in the IT Industry for over 25 years, in a variety of roles, which include; Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management forum) Swiss chapter and is based in Geneva, Switzerland.


Filed under Enterprise Architecture, TOGAF®

Open Group Security Gurus Dissect the Cloud: Higher or Lower Risk?

By Dana Gardner, Interarbor Solutions

For some, any move to the Cloud — at least the public Cloud — means a higher risk for security.

For others, relying more on a public Cloud provider means better security. There’s more of a concentrated and comprehensive focus on security best practices that are perhaps better implemented and monitored centrally in the major public Clouds.

And so which is it? Is Cloud a positive or negative when it comes to cyber security? And what of hybrid models that combine public and private Cloud activities, how is security impacted in those cases?

We posed these and other questions to a panel of security experts at last week’s Open Group Conference in San Francisco to deeply examine how Cloud and security come together — for better or worse.

The panel: Jim Hietala, Vice President of Security for The Open Group; Stuart Boardman, Senior Business Consultant at KPN, where he co-leads the Enterprise Architecture Practice as well as the Cloud Computing Solutions Group; Dave Gilmour, an Associate at Metaplexity Associates and a Director at PreterLex Ltd., and Mary Ann Mezzapelle, Strategist for Enterprise Services and Chief Technologist for Security Services at HP.

The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. The full podcast can be found here.

Here are some excerpts:

Gardner: Is this notion of going outside the firewall fundamentally a good or bad thing when it comes to security?

Hietala: It can be either. Talking to security people in large companies, frequently what I hear is that with adoption of some of those services, their policy is either let’s try and block that until we get a grip on how to do it right, or let’s establish a policy that says we just don’t use certain kinds of Cloud services. Data I see says that that’s really a failed strategy. Adoption is happening whether they embrace it or not.

The real issue is how you do that in a planned, strategic way, as opposed to letting services like Dropbox and other kinds of Cloud Collaboration services just happen. So it’s really about getting some forethought around how do we do this the right way, picking the right services that meet your security objectives, and going from there.

Gardner: Is Cloud Computing good or bad for security purposes?

Boardman: It’s simply a fact, and it’s something that we need to learn to live with.

What I’ve noticed through my own work is a lot of enterprise security policies were written before we had Cloud, but when we had private web applications that you might call Cloud these days, and the policies tend to be directed toward staff’s private use of the Cloud.

Then you run into problems, because you read something in policy — and if you interpret that as meaning Cloud, it means you can’t do it. And if you say it’s not Cloud, then you haven’t got any policy about it at all. Enterprises need to sit down and think, “What would it mean to us to make use of Cloud services and to ask as well, what are we likely to do with Cloud services?”

Gardner: Dave, is there an added impetus for Cloud providers to be somewhat more secure than enterprises?

Gilmour: It depends on the enterprise that they’re actually supplying to. If you’re in a heavily regulated industry, you have a different view of what levels of security you need and want, and therefore what you’re going to impose contractually on your Cloud supplier. That means that the different Cloud suppliers are going to have to attack different industries with different levels of security arrangements.

The problem there is that the penalty regimes are always going to say, “Well, if the security lapses, you’re going to get off with two months of not paying” or something like that. That kind of attitude isn’t going to work for this kind of security.

What I don’t understand is exactly how secure Cloud provision is going to be enabled and governed under tight regimes like that.

An opportunity

Gardner: Jim, we’ve seen in the public sector that governments are recognizing that Cloud models could be a benefit to them. They can reduce redundancy. They can control and standardize. They’re putting in place some definitions, implementation standards, and so forth. Is the vanguard of correct Cloud Computing with security in mind being managed by governments at this point?

Hietala: I’d say that they’re at the forefront. Some of these shared government services, where they stand up Cloud and make it available to lots of different departments in a government, have the ability to do what they want from a security standpoint, not relying on a public provider, and get it right from their perspective and meet their requirements. They then take that consistent service out to lots of departments that may not have had the resources to get IT security right, when they were doing it themselves. So I think you can make a case for that.

Gardner: Stuart, being involved with standards activities yourself, does moving to the Cloud provide a better environment for managing, maintaining, instilling, and improving on standards than enterprise by enterprise by enterprise? As I say, we’re looking at a larger pool and therefore that strikes me as possibly being a better place to invoke and manage standards.

Boardman: Dana, that’s a really good point, and I do agree. Also, in the security field, we have an advantage in the sense that there are quite a lot of standards out there to deal with interoperability, exchange of policy, exchange of credentials, which we can use. If we adopt those, then we’ve got a much better chance of getting those standards used widely in the Cloud world than in an individual enterprise, with an individual supplier, where it’s not negotiation, but “you use my API, and it looks like this.”

Having said that, there are a lot of well-known Cloud providers who do not currently support those standards and they need a strong commercial reason to do it. So it’s going to be a question of the balance. Will we get enough specific weight of people who are using it to force the others to come on board? And I have no idea what the answer to that is.

Gardner: We’ve also seen that cooperation is an important aspect of security, knowing what’s going on on other people’s networks, being able to share information about what the threats are, remediation, working to move quickly and comprehensively when there are security issues across different networks.

Is that a case, Dave, where having a Cloud environment is a benefit? That is to say more sharing about what’s happening across networks for many companies that are clients or customers of a Cloud provider rather than perhaps spotty sharing when it comes to company by company?

Gilmour: There is something to be said for that, Dana. Part of the issue, though, is that companies are individually responsible for their data. They’re individually responsible to a regulator or to their clients for their data. The question then becomes that as soon as you start to share a certain aspect of the security, you’re de facto sharing the weaknesses as well as the strengths.

So it’s a two-edged sword. One of the problems we have is that until we mature a little bit more, we won’t be able to actually see which side is the sharpest.

Gardner: So our premise that Cloud is good and bad for security is holding up, but I’m wondering whether the same things that make you a risk in a private setting — poor adhesion to standards, no good governance, too many technologies that are not being measured and controlled, not instilling good behavior in your employees and then enforcing that — wouldn’t this be the same either way? Is it really Cloud or not Cloud, or is it good security practices or not good security practices? Mary Ann?

No accountability

Mezzapelle: You’re right. It’s a little bit of that “garbage in, garbage out,” if you don’t have the basic things in place in your enterprise, which means the policies, the governance cycle, the audit, and the tracking, because it doesn’t matter if you don’t measure it and track it, and if there is no business accountability.

David said it — each individual company is responsible for its own security, but I would say that it’s the business owner that’s responsible for the security, because they’re the ones that ultimately have to answer that question for themselves in their own business environment: “Is it enough for what I have to get done? Is the agility more important than the flexibility in getting to some systems or the accessibility for other people, as it is with some of the ubiquitous computing?”

So you’re right. If it’s an ugly situation within your enterprise, it’s going to get worse when you do outsourcing, out-tasking, or anything else you want to call within the Cloud environment. One of the things that we say is that organizations not only need to know their technology, but they have to get better at relationship management, understanding who their partners are, and being able to negotiate and manage that effectively through a series of relationships, not just transactions.

Gardner: If data and sharing data is so important, it strikes me that Cloud component is going to be part of that, especially if we’re dealing with business processes across organizations, doing joins, comparing and contrasting data, crunching it and sharing it, making data actually part of the business, a revenue generation activity, all seems prominent and likely.

So to you, Stuart, what is the issue now with data in the Cloud? Is it good, bad, or just the same double-edged sword, and it just depends how you manage and do it?

Boardman: Dana, I don’t know whether we really want to be putting our data in the Cloud, so much as putting the access to our data into the Cloud. There are all kinds of issues you’re going to run up against, as soon as you start putting your source information out into the Cloud, not the least privacy and that kind of thing.

A bunch of APIs

What you can do is simply say, “What information do I have that might be interesting to people? If it’s a private Cloud in a large organization elsewhere in the organization, how can I make that available to share?” Or maybe it’s really going out into public. What a government, for example, can be thinking about is making information services available, not just what you go and get from them that they already published. But “this is the information,” a bunch of APIs if you like. I prefer to call them data services, and to make those available.

So, if you do it properly, you have a layer of security in front of your data. You’re not letting people come in and do joins across all your tables. You’re providing information. That does require you then to engage your users in what is it that they want and what they want to do. Maybe there are people out there who want to take a bit of your information and a bit of somebody else’s and mash it together, provide added value. That’s great. Let’s go for that and not try and answer every possible question in advance.

Gardner: Dave, do you agree with that, or do you think that there is a place in the Cloud for some data?

Gilmour: There’s definitely a place in the Cloud for some data. I get the impression that something like the insurance industry is going to emerge out of this, where you’ll have a secondary Cloud: secondary providers who provide to the front-end providers. They might do things like archiving and that sort of thing.

Now, if you have a situation where your contractual relationship is two steps away, then you have to be very confident and certain of your Cloud partner, and that relationship therefore has to encompass a very strong level of governance.

The other issue is that you then have the intersection of your governance requirements with those of the Cloud provider. Therefore you have to have a really strongly, and I hate to use the word, architected set of interfaces, so that you can understand how that governance is actually going to operate.

Gardner: Wouldn’t data perhaps be safer in the Cloud than on a poorly managed network?

Mezzapelle: There is data in the Cloud, and there will continue to be data in the Cloud, whether you want it there or not. The best organizations are going to recognize that they can’t control it with the perimeter-style approach that we’ve been talking about getting away from for the last five or seven years.

So what we want to talk about is data-centric security, where you understand, based on role or context, who is going to access the information and for what reason. I think there is a better opportunity for services like storage, whether it’s for archiving or for near-term use.
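A minimal sketch of the data-centric, role-and-context access model Mezzapelle describes: each access decision considers who is asking and for what purpose, independent of any network perimeter. The datasets, roles, and purposes below are invented assumptions for illustration:

```python
# Hypothetical data-centric access policy: each dataset declares which
# roles may touch it and for which purposes (the request context).
POLICY = {
    "payroll":   {"roles": {"hr"},              "purposes": {"payroll-run"}},
    "marketing": {"roles": {"hr", "marketing"}, "purposes": {"campaign"}},
}

def can_access(dataset, role, purpose):
    """Allow access only when both the role and the stated purpose match."""
    rule = POLICY.get(dataset)
    return bool(rule) and role in rule["roles"] and purpose in rule["purposes"]

print(can_access("payroll", "hr", "payroll-run"))      # True
print(can_access("payroll", "marketing", "campaign"))  # False
```

The point of the sketch is that the check travels with the data, so it applies the same way whether the information sits inside your perimeter or with a Cloud provider.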

There are also other services that you don’t want to have to pay for 12 months out of the year, but that you might need independently. For instance, when you’re running a marketing campaign, you already share your data with some of your marketing partners. Or if you’re doing your payroll, you’re sharing that data through some of the national providers.

Data in different places

So there already is a lot of data in a lot of different places, whether you want Cloud or not, but the context is, it’s not in your perimeter, under your direct control, all of the time. The better you get at managing it wherever it is specific to the context, the better off you will be.

Hietala: It’s a slippery slope [when it comes to customer data]. That’s the most dangerous data to stick out in a Cloud service, if you ask me. If it’s personally identifiable information, then you get the privacy concerns that Stuart talked about. So to the extent you’re looking at putting that kind of data in a Cloud, you should be looking at the Cloud service and trying to determine whether you can apply some encryption and other sensible security controls to ensure that, if that data gets loose, you’re not ending up in the headlines of The Wall Street Journal.
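One such "sensible security control" for PII headed to a Cloud service is pseudonymization: replace identifiers with a keyed hash before upload and keep the key on premises, so the Cloud copy is useless on its own. This is a toy sketch, not the panel's prescription; the key, field names, and record are all invented:

```python
import hashlib
import hmac

# Illustrative only: a real deployment would use proper key management,
# not a hard-coded key.
SECRET_KEY = b"keep-this-key-on-premises"

def pseudonymize(value: str) -> str:
    """Replace an identifier with a keyed hash (HMAC-SHA256 hex digest)."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "purchases": 7}

# Only the pseudonymized identifier and non-sensitive fields leave the premises.
cloud_safe = {"email": pseudonymize(record["email"]),
              "purchases": record["purchases"]}
print(cloud_safe)
```

Because HMAC is deterministic under one key, the Cloud-side records can still be joined and analyzed per pseudonym, while re-identification requires the key that never left the organization.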

Gardner: Dave, you said there will be different levels on a regulatory basis for security. Wouldn’t that also play with data? Wouldn’t there be different types of data and therefore a spectrum of security and availability to that data?

Gilmour: You’re right. If we come back to Facebook as an example, even if it’s data about our known customers, it’s information that they have put out there of their own free will. The data that they give us, they have given to us for a purpose, and it is not for us then to distribute that data or make it available elsewhere. The fact that it may be the same data is not relevant to the discussion.

Three-dimensional solution

That’s where I think we’re going to end up with not just one layer or two layers. We’re going to end up with a sort of three-dimensional solution space. We’re going to work out exactly which chunk we’re going to handle in which way. There will be significant areas where these things cross over.

The other thing we shouldn’t forget is that data includes our software, and that’s something that people forget. Software nowadays is out in the Cloud, under current ways of running things, and you don’t even always know where it’s executing. So if you don’t know where your software is executing, how do you know where your data is?

It’s going to have to be handled one way or another, and I think it’s going to be one of these things where it’s going to be shades of gray, because it cannot be black and white. The question is going to be, what’s the threshold shade of gray that’s acceptable?

Gardner: Mary Ann, to this notion of the different layers of security for different types of data, is there anything happening in the market that you’re aware of that’s already moving in that direction?

Mezzapelle: The experience that I have is mostly in some of the business frameworks for particular industries, like healthcare and what it takes to comply with the HIPAA regulation, or in the financial services industry, or in consumer products where you have to comply with the PCI regulations.

There has continued to be an issue around information lifecycle management, which is categorizing your data. Within a company, you might have had a document that you coded private, confidential, top secret, or whatever. So you might have had three or four levels for a document.

You’ve already talked about how complex it’s going to be as you move into trying to understand, not only for that data, that a name like Mary Ann Mezzapelle happens to be in five or six different business systems across 100 instances around the world.

That’s the importance of something like an Enterprise Architecture that can help you understand that you’re not just talking about the technology components, but the information: what it means and how it is prioritized or critical to the business, which sometimes comes up in a business continuity plan from a system point of view. That’s where I’ve advised clients to start looking at how they connect business criticality with a piece of information.

One last thing: those regulations don’t necessarily mean that you’re secure. They make for good basic health, but that doesn’t mean the data is ultimately protected. You have to do a risk assessment based on your own environment, the bad actors you expect, and your priorities based on that.

Leaving security to the end

Boardman: I just wanted to pick up here, because Mary Ann spoke about Enterprise Architecture. One of my bugbears, and I call myself an enterprise architect, is that we have a terrible habit of leaving security to the end. We don’t architect security into our Enterprise Architecture. It’s treated as a techie thing that we’ll fix at the end. There are also people in the security world who are techies and think they will do it that way as well.

I don’t know how long ago it was published, but there was an activity to look at bringing the SABSA Methodology from security together with TOGAF®. There was a white paper published a few weeks ago.

The Open Group has been doing some really good work on bringing security right in to the process of EA.

Hietala: In the next version of TOGAF, work on which has already started, there will be a whole emphasis on making sure that security is better represented in the TOGAF guidance. That’s ongoing work here at The Open Group.

Gardner: As I listen, it sounds as if “in the Cloud or out of the Cloud” is perhaps the wrong way to look at the security continuum. If you have a lifecycle approach to services and to data, then you’ll have a way to approach data uses for certain instances and certain requirements, and that would then apply across a variety of private Cloud, public Cloud, and hybrid Cloud environments.

Is that where we need to go, perhaps have more of this lifecycle approach to services and data that would accommodate any number of different scenarios in terms of hosting access and availability? The Cloud seems inevitable. So what we really need to focus on are the services and the data.

Boardman: That’s part of it. That needs to be tied in with the risk-based approach. So if we have done that, we can then pick up on that information and we can look at a concrete situation, what have we got here, what do we want to do with it. We can then compare that information. We can assess our risk based on what we have done around the lifecycle. We can understand specifically what we might be thinking about putting where and come up with a sensible risk approach.

You may come to the conclusion in some cases that the risk is too high and the mitigation too expensive. In others, you may say, no, because we understand our information and we understand the risk situation, we can live with that, it’s fine.

Gardner: It sounds as if we are coming at this as an underwriter for an insurance company. Is that the way to look at it?

Current risk

Gilmour: That’s eminently sensible. You have the mortality tables, you have the current risk, you work the two together, and you work out the premium. That’s probably a very good paradigm to guide how we should approach the problem intellectually.

Mezzapelle: One of the problems is that we don’t have those actuarial tables yet. That’s a bit of an issue for a lot of people when they say, “I’ve got $100 to spend on security. Where am I going to spend it this year? Am I going to spend it on firewalls? Am I going to spend it on an information lifecycle management assessment? What am I going to spend it on?” Some of the research we’ve been doing at HP tries to turn that into something more statistical.

So, when you have a particular project that does a certain kind of security implementation, you can see what the business return on it is and how it actually lowers risk. We found that it’s better to spend your money on getting a better system to patch your systems than it is to do some other kind of content filtering or something like that.
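The actuarial-style arithmetic the panel is reaching for is often expressed as annualized loss expectancy: ALE = annual rate of occurrence (ARO) times single loss expectancy (SLE). A toy comparison of two security investments might look like the following; every figure, and the assumption that each investment halves the relevant ARO, is invented for illustration:

```python
# Annualized loss expectancy: how much a given class of incident is
# expected to cost per year, before and after a security investment.
def ale(aro_per_year, sle_dollars):
    """ALE = annual rate of occurrence x single loss expectancy."""
    return aro_per_year * sle_dollars

# Hypothetical figures: better patching vs. content filtering, each
# assumed to halve the frequency of the incidents it addresses.
patching_before  = ale(aro_per_year=4.0, sle_dollars=250_000)
patching_after   = ale(aro_per_year=2.0, sle_dollars=250_000)
filtering_before = ale(aro_per_year=1.0, sle_dollars=300_000)
filtering_after  = ale(aro_per_year=0.5, sle_dollars=300_000)

print("patching reduces expected loss by", patching_before - patching_after)
print("filtering reduces expected loss by", filtering_before - filtering_after)
```

Under these made-up numbers the patching investment buys far more risk reduction per dollar, which is the kind of comparison Mezzapelle describes wanting to make statistically.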

Gardner: Perhaps what we need is the equivalent of an Underwriters Laboratories (UL) for permeable organizational IT assets, where the security stamp of approval comes in high or low. Then you could get your insurance insight. Maybe that’s something for The Open Group to look into. Any thoughts about how standards and a consortium approach would come into that?

Hietala: I don’t know about the UL for all security things. That sounds like a risky proposition.

Gardner: It could be fairly popular and remunerative.

Hietala: It could.

Mezzapelle: An unending job.

Hietala: I will say we have one active project in the Security Forum that is looking at trying to allow organizations to measure and understand risk dependencies that they inherit from other organizations.

So if I’m outsourcing a function to XYZ Corporation, I want to be able to measure what risk I’m inheriting from them by virtue of their doing some IT processing for me. It could be a Cloud provider, or it could be somebody doing a business process for me. So there’s work going on there.

I heard just last week about an NSF-funded project here in the U.S. to do the same sort of thing: to look at measuring risk in a predictable way. So there are things going on out there.

Gardner: We have to wrap up, I’m afraid, but Stuart, it seems as if currently it’s the larger public Cloud providers, the likes of Amazon and Google among others, that might be playing the role of all of these entities we’re talking about. They are their own self-insurer. They are their own underwriter. They are their own risk assessor, like a UL. Do you think that’s going to continue to be the case?

Boardman: No, I think that as Cloud adoption increases, you will have a greater weight of consumer organizations who will need to do that themselves. It’s not just a question of responsibility, but also of accountability. At the end of the day, you’re always accountable for the data that you hold. It doesn’t matter where you put it or how many other parties it gets subcontracted out to.

The weight will change

So there’s a need to have that, and as the adoption increases, there’s less fear and more, “Let’s do something about it.” Then, I think the weight will change.

Plus, of course, there are other parties coming into this world, the world that Amazon has created. I’d imagine that HP is probably one of them as well, but all the big names in IT are moving in here, and I suspect that for those companies, their history of enterprise involvement and knowing how to do this properly will be a differentiator.

So yeah, I think it will change. That’s no offense to Amazon, etc. I just think that the balance is going to change.

Gilmour: Yes, I think that’s how it has to go. The question that then arises is, who is going to police the policeman, and how is that going to happen? Every company is going to be using the Cloud. Even the Cloud suppliers are using the Cloud. So how is it going to work? It’s one of these ever-decreasing circles.

Mezzapelle: At this point, I think it’s going to be more evolution than revolution, but I’m also one of the people who have been in that part of the business, IT services, for the last 20 years and have seen it morph in slightly different ways.

Stuart is right that there’s going to be a convergence of the consumer-driven, Cloud-based model that Amazon and Google represent with the enterprise approach that corporations like HP represent. It’s somewhere in the middle that we can bring the service-level commitments, the options for security, and the other options that make it more reliable and lower-risk for large corporations to take advantage of it.

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.

Filed under Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Security Architecture