Category Archives: Security Architecture

Introducing Two New Security Standards for Risk Analysis—Part II – Risk Analysis Standard

By Jim Hietala, VP Security, The Open Group

Last week we took a look at one of the new risk standards recently introduced by The Open Group® Security Forum at The Open Group London Conference 2013, the Risk Taxonomy Technical Standard 2.0 (O-RT). Today’s blog looks at its sister standard, the Risk Analysis (O-RA) Standard, which provides risk professionals with the tools they need to perform thorough risk analyses within their organizations for better decision-making about risk.

Risk Analysis (O-RA) Standard

The new Risk Analysis Standard provides a comprehensive guide for performing effective risk analyses of scenarios within organizations using the Factor Analysis of Information Risk (FAIR™) framework. O-RA is geared toward managing the frequency and magnitude of loss that can arise from a threat, whether a human, animal or natural event; in other words, “how often bad things happen and how bad they are when they occur.” Used together, the O-RT and O-RA Standards give organizations a way to perform consistent risk modeling that not only helps explain risk factors thoroughly to stakeholders but also allows information security professionals to strengthen existing analysis methods or create better ones. O-RA may also be used in conjunction with other risk frameworks to perform risk analysis.

The O-RA Standard is also meant to provide something more than a mere assessment of risk. Many professionals within the security industry fail to distinguish between “assessing” risk and “analyzing” risk. This standard goes beyond assessment by supporting effective analyses, so that risk statements are more meaningful and defensible than the broad risk ratings (“this is a 4 on a scale of 1 to 5”) normally produced by assessments.

O-RA also lays out a standard process for approaching risk analysis that can help organizations streamline the way they approach risk measurement. By focusing on four core process elements, organizations are able to perform more effective analyses:

  • Clearly identifying and characterizing the assets, threats, controls and impact/loss elements at play within the scenario being assessed
  • Understanding the organizational context for analysis (i.e. what’s at stake from an organizational perspective)
  • Measuring/estimating various risk factors
  • Calculating risk using a model that represents a logical, rational, and useful view of what risk is and how it works.

Because measurement and calculation are essential elements of properly analyzing risk variables, an entire chapter of the standard is dedicated to how to measure and calibrate risk. This chapter lays out a number of useful approaches for establishing risk variables, including establishing baseline risk estimates and ranges; creating distribution ranges and most likely values; using Monte Carlo simulations; accounting for uncertainty; determining accuracy vs. precision and subjective vs. objective criteria; deriving vulnerability; using ordinal scales; and determining diminishing returns.
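To make this concrete, here is a minimal sketch (not taken from the standard itself) of how calibrated minimum / most likely / maximum estimates for loss event frequency and loss magnitude can be combined in a simple Monte Carlo simulation. The ranges and the use of a triangular distribution are illustrative assumptions only:

```python
import random

def annualized_loss_exposure(lef_est, lm_est, trials=10_000, seed=1):
    """Monte Carlo sketch: combine Loss Event Frequency (events per year) and
    Loss Magnitude (cost per event), each given as (min, most likely, max)."""
    rng = random.Random(seed)
    lef_low, lef_mode, lef_high = lef_est
    lm_low, lm_mode, lm_high = lm_est
    outcomes = []
    for _ in range(trials):
        lef = rng.triangular(lef_low, lef_high, lef_mode)  # simulated events per year
        lm = rng.triangular(lm_low, lm_high, lm_mode)      # simulated loss per event
        outcomes.append(lef * lm)                          # loss for this simulated year
    outcomes.sort()
    return {
        "mean": sum(outcomes) / trials,
        "p10": outcomes[int(0.10 * trials)],  # 10th percentile of annual loss
        "p90": outcomes[int(0.90 * trials)],  # 90th percentile of annual loss
    }

# Hypothetical calibrated estimates: 0.1-2 loss events per year (most likely 0.5),
# and $50k-$500k loss per event (most likely $150k).
print(annualized_loss_exposure((0.1, 0.5, 2.0), (50_000, 150_000, 500_000)))
```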

Finally, a practical, real-world example is provided to take readers through an actual risk analysis scenario. Using the FAIR model, the example outlines the process for dealing with a threat in which an HR executive at a large bank has left the user name and password that give him access to all the company’s HR systems on a Post-It note tacked onto his computer in his office, in clear view of anyone (other employees, cleaning crews, etc.) who comes into the office.

The scenario outlines four stages in assessing this risk:

  1. Stage 1: Identify Scenario Components (Scope the Analysis)
  2. Stage 2: Evaluate Loss Event Frequency (LEF)
  3. Stage 3: Evaluate Loss Magnitude (LM)
  4. Stage 4: Derive and Articulate Risk

Each step of the risk analysis process is thoroughly outlined for the scenario to show Risk Analysts how to perform an analysis using the FAIR framework. Considerable guidance is provided for Stages 2 and 3 in particular, as those are the most critical elements in determining organizational risk.
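For illustration only, the four stages might be sketched in code as follows. The scenario components echo the Post-It note example, but the point estimates are invented placeholders; the standard itself works with calibrated ranges and decomposes LEF and LM further:

```python
from dataclasses import dataclass

# Stage 1: Identify Scenario Components (scope the analysis).
@dataclass
class Scenario:
    asset: str
    threat_community: str
    effect: str

scenario = Scenario(
    asset="HR executive credentials granting access to all HR systems",
    threat_community="Insiders with office access (employees, cleaning crews, etc.)",
    effect="Confidentiality",
)

# Stage 2: Evaluate Loss Event Frequency (LEF), in loss events per year.
lef = 0.3        # hypothetical: roughly one loss event every three years

# Stage 3: Evaluate Loss Magnitude (LM), in cost per loss event.
lm = 250_000     # hypothetical: response, fines and reputation effects combined

# Stage 4: Derive and Articulate Risk, expressed here as annualized loss exposure.
ale = lef * lm
print(f"{scenario.asset}: expected loss of roughly ${ale:,.0f} per year")
```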

Ultimately, O-RA is a guide to help organizations make better decisions about which risks are critical enough to prioritize and which are less important and may not warrant attention. Greater consistency in this practice is essential: the lack of consistency in determining risk among information security professionals has been a major obstacle to giving security professionals a legitimate “seat at the table” in the boardroom alongside other business functions (finance, HR, etc.).

For our profession to evolve and grow, consistency and accurate measurement are key. Issues and solutions must be identified consistently, and comparisons and measurements must rest on solid foundations, as illustrated below.

[Figure: Chained Dependencies]

O-RA can help organizations arrive at better decisions through consistent analysis techniques and lend more legitimacy to the profession. Without a foundation from which to manage information risk, Risk Analysts and information security professionals may rely too heavily on intuition, bias, or commercial and personal agendas in their analyses and decision making. By laying out a thorough approach to risk analysis, O-RA provides not only a common foundation for performing risk analyses but also the opportunity to make better decisions and advance the security profession.

For more on the O-RA Standard or to download it, please visit: https://www2.opengroup.org/ogsys/catalog/C13G.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Introducing Two New Security Standards for Risk Analysis—Part I – Risk Taxonomy Technical Standard 2.0

By Jim Hietala, VP Security, The Open Group

At The Open Group London 2013 Conference, The Open Group® announced three new initiatives related to the Security Forum’s work around Risk Management. The first of these was the establishment of a new certification program for Risk Analysts working within the security profession, the Open FAIR Certification Program. Aimed at providing a professional certification for Risk Analysts, the program will bring a much-needed level of assuredness to companies looking to hire Risk Analysts, certifying that analysts who have completed the Open FAIR program understand the fundamentals of risk analysis and are qualified to perform that analysis.

Forming the basis of the Open FAIR certification program are two new Open Group standards, version 2.0 of the Risk Taxonomy (O-RT) standard originally introduced by the Security Forum in 2009, and a new Risk Analysis (O-RA) Standard, both of which were also announced at the London conference. These standards are the result of ongoing work around risk analysis that the Security Forum has been conducting for a number of years now in order to help organizations better understand and identify their exposure to risk, particularly when it comes to information security risk.

The Risk Taxonomy and Risk Analysis standards not only form the basis and body of knowledge for the Open FAIR certification, but provide practical advice for security practitioners who need to evaluate and counter the potential threats their organization may face.

Today’s blog will look at the first standard, the Risk Taxonomy Technical Standard, version 2.0. Next week, we’ll look at the other standard for Risk Analysis.

Risk Taxonomy (O-RT) Technical Standard 2.0

Originally published in January 2009, the O-RT is intended to provide a common language and set of references for security and business professionals who need to understand or analyze risk conditions. Version 2.0 of the standard contains a number of updates based both on feedback from professionals who have been using the standard and on research conducted by Security Forum member CXOWARE.

The majority of the changes to Version 2.0 are refinements in terminology, including changes in language that better reflect what each term encompasses. For example, the term “Control Strength” in the original standard has now been changed to “Resistance Strength” to reflect that controls used in that part of the taxonomy must be resistive in nature.

More substantive changes were made to the portion of the taxonomy that discusses how Loss Magnitude is evaluated.

Why create a taxonomy for risk? For two reasons. First, the taxonomy provides a foundation from which risk analysis can be performed and talked about. Second, a tightly defined taxonomy makes it easier to measure or estimate risk scenarios effectively, leading to better decision making, as illustrated by the following “risk management stack.”

Effective Management
↑
Well-informed Decisions
↑
Effective Comparisons
↑
Meaningful Measurements
↑
Accurate Risk Model

The complete Risk Taxonomy comprises two branches, Loss Event Frequency (LEF) and Loss Magnitude (LM), illustrated here:

[Figure: the Risk Taxonomy’s Loss Event Frequency (LEF) and Loss Magnitude (LM) branches]

Focusing solely on pure risk (which only results in loss) rather than speculative risk (which might result in either loss or profit), the O-RT is meant to help estimate the probable frequency and magnitude of future loss.

Traditionally, LM has been far more difficult to determine than LEF, in part because organizations don’t always analyze their losses, or they stick to evaluating “low hanging fruit” variables rather than delving into more complex risk factors. The new taxonomy takes a deep dive into the Loss Magnitude branch, providing guidance that will allow Risk Analysts to better tackle the difficult task of determining LM. It includes terminology outlining six specific forms of loss an organization can experience (productivity, response, replacement, fines and judgments, competitive advantage, reputation) as well as how to determine Loss Flow, a new concept in this standard.

The Loss Flow analysis helps identify how a loss may affect both primary (owners, employees, etc.) and secondary (customers, stockholders, regulators, etc.) stakeholders as a result of a threat agent’s action on an asset. The new standard provides a thorough overview on how to assess Loss Flow and identify the loss factors of any given threat.
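As a rough illustration of how the six forms of loss and the Loss Flow concept fit together, per-event estimates might be grouped by primary and secondary loss as sketched below; the figures and the exact grouping are hypothetical, not taken from the standard:

```python
# Hypothetical per-event estimates for the six forms of loss named in the O-RT.
# Primary losses fall directly on the organization; secondary losses arise from
# the reactions of secondary stakeholders (customers, regulators, etc.).
primary_loss = {
    "productivity": 20_000,          # work lost while the asset is unavailable
    "response": 35_000,              # investigation, notification, remediation
    "replacement": 5_000,            # replacing the compromised asset
}
secondary_loss = {
    "fines_and_judgments": 100_000,  # regulatory fines and legal settlements
    "competitive_advantage": 0,      # no proprietary advantage assumed lost here
    "reputation": 60_000,            # business lost through damaged reputation
}

loss_magnitude = sum(primary_loss.values()) + sum(secondary_loss.values())
print(f"Estimated Loss Magnitude per event: ${loss_magnitude:,}")
```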

Finally, the standard also includes a practical, real-world scenario to help analysts understand how to put the taxonomy to use within their organizations. O-RT provides a common linguistic foundation that allows security professionals to then perform risk analyses as outlined in the O-RA Standard.

For more on the Risk Taxonomy Standard or to download it, visit: https://www2.opengroup.org/ogsys/catalog/C13K.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Jericho Forum declares “success” and sunsets

By Ian Dobson & Jim Hietala, The Open Group

Ten years ago, the Jericho Forum set out on a mission to evangelise the issues, problems and solutions around the emerging business and security challenges of de-perimeterisation, and to provide thought-leadership on them, with the aim of one day being able to declare “job done”.

That day has now arrived.  Today, de-perimeterisation is an established “fact” – touching not just information security but all areas of modern business, including the bring your own IT phenomenon (devices, IDs, services) as well as all forms of cloud computing. It’s widely understood and quoted by the entire industry.  It has become part of today’s computing and security lexicon.

With our de-perimeterisation mission accomplished, the Jericho Forum has decided the time has come to “declare success”, celebrate it as a landmark victory in the evolution of information security, and sunset as a separate Forum in The Open Group.

Our “declare success and sunset” victory celebration on Monday 21st Oct 2013 at the Central Hall Westminster, London UK, was our valedictory announcement that the Jericho Forum will formally sunset on 1st Nov 2013.  The event included many past leading Jericho Forum members attending as guests, with awards of commemorative plaques to those whose distinctive leadership steered the information security mind-set change success that the Jericho Forum has now achieved.

For those who missed the live-streamed event, you can watch it on the livestream recording at http://new.livestream.com/opengroup/Lon13

We are fortunate to be able to pass our Jericho Forum legacy of de-perimeterisation achievements and publications into the good care of The Open Group’s Security Forum, which has undertaken to maintain the Jericho Forum’s deliverables, protect its legacy from misrepresentation, and perhaps adopt and evolve Jericho’s thought-leadership approach on future information security challenges.

Ian Dobson, Director Jericho Forum
Jim Hietala, VP Security
The Open Group
21st October 2013


Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world. In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Conference, Security Architecture

The Open Group London 2013 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

On Monday October 21st, The Open Group kicked off the first day of our Business Transformation conference in London!  Over 275 guests attended many engaging presentations by subject matter experts in finance, healthcare and government.  Attendees represented 28 countries from around the globe, including those from as far away as Colombia, the Philippines, Australia, Japan and South Africa.

Allen Brown, President and CEO of The Open Group, welcomed the prestigious group.  Allen announced that The Open Group has 67 new member organizations so far this year!

The plenary launched with “Just Exactly What is Going On in Business and Technology?” by Andy Mulholland, Former Global CTO of Capgemini, who was named one of the top 25 influential CTOs by InfoWorld.  Andy’s key topics regarding digital disruption included real drivers of change, some big and fundamental implications, business model innovation, TOGAF® and the Open Platform 3.0™ initiative.

Next up was Judith Jones, CEO, Architecting the Enterprise Ltd., with a presentation entitled “One World EA Framework for Governments – The Way Forward”.  Judith shared findings from the World Economic Forum, posing the question “what keeps 1,000 global leaders awake at night?” Many stats were presented covering over 50 global risks – economic, societal, environmental, geopolitical and technological.

Jim Hietala, VP, Security of The Open Group announced the launch of the Open FAIR Certification for People Program.  The new program brings a much-needed certification to the market which focuses on risk analysis. Key partners include CXOWARE, Architecting the Enterprise, SNA Technologies and The Unit bv.

Richard Shreeve, Consultancy Director, IPL and Angela Parratt, Head of Transformation and joint CIO, Bath and North East Somerset Council presented “Using EA to Inform Business Transformation”.  Their case study addressed the challenges of modeling complexity in diverse organizations and the EA-led approach to driving out cost and complexity while maintaining the quality of service delivery.

Allen Brown announced that the Jericho Forum® leaders together with The Open Group management have concluded that the Jericho Forum has achieved its original mission – to establish “de-perimeterization” that touches all areas of modern business.  In declaring this mission achieved, we are now in the happy position to celebrate a decade of success and move to ensuring that the legacy of the Jericho Forum is both maintained within The Open Group and continues to be built upon.  (See photo below.)

Following the plenary, the sessions were divided into tracks – Finance/Commerce, Healthcare and Tutorials/Workshops.

During the Healthcare track, one of the presenters, Larry Schmidt, Chief Technologist with HP, discussed “Challenges and Opportunities for Big Data in Healthcare”. Larry elaborated on the 4 Vs of Big Data – value, velocity, variety and veracity.

Among the many presenters in the Finance/Commerce track, Omkhar Arasaratnam, Chief Security Architect, TD Bank Group, Canada, presented “Enterprise Architecture – We Do That?: How (not) to do Enterprise Architecture at a Bank”.  Omkhar provided insight into how he took traditional, top-down, center-based architectural methodologies and applied them to a highly federated environment.

Tutorials/workshops consisted of EA Practice and Architecture Methods and Techniques.

You can view all of the plenary and many of the track presentations at livestream.com.  For those who attended, please stay tuned for the full conference proceedings.

The evening concluded with a networking reception at the beautiful and historic Central Hall Westminster.  What an interesting, insightful, collaborative day it was!

[Photo: Jericho Forum announcement at the conference]

Filed under Business Architecture, Certifications, Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Open Platform 3.0, Professional Development, RISK Management, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day Three Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

We are winding down Day 3 and gearing up for the next two days of training and workshops.  Today’s subject areas included TOGAF®, ArchiMate®, Risk Management, Innovation Management, Open Platform 3.0™ and Future Trends.

The objective of the Future Trends session was to discuss “emerging business and technical trends that will shape enterprise IT”, according to Dave Lounsbury, Chief Technical Officer of The Open Group.

This track also featured a presentation by Dr. William Lafontaine, VP High Performance Computing, Analytics & Cognitive Markets, IBM Research, who gave an overview of the “Global Technology Outlook 2013”.  He stated the Mega Trends are:  Growing Scale/Lower Barrier of Entry; Increasing Complexity/Yet More Consumable; Fast Pace; Contextual Overload.  Mike Walker, Strategies & Enterprise Architecture Advisor for HP, noted the key disrupters that will affect our future are the business of IT, technology itself, expectation of consumers and globalization.

The session concluded with an in-depth Q&A with Bill, Dave, Mike (as shown below) and Allen Brown, CEO of The Open Group.

[Photo: Day 3 Q&A session]

Other sessions included presentations by TJ Virdi (Senior Enterprise Architect, Boeing) on Innovation Management, Jack Jones (President, CXOWARE, Inc.) on Risk Management and Stephen Bennett (Executive Principal, Oracle) on Big Data.

A special thanks goes to our many sponsors during this dynamic conference: Windstream, Architecting the Enterprise, Metaplexity, BIZZdesign, Corso, Avolution, CXOWARE, Penn State – Online Program in Enterprise Architecture, and Association of Enterprise Architects.

Stay tuned for post-conference proceedings to be posted soon!  See you at our conference in London, October 21-24.

Filed under ArchiMate®, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Open Platform 3.0, RISK Management, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

Day 2 at The Open Group conference in the City of Brotherly Love, as Philadelphia is also known, was another busy and remarkable day.

The plenary started with a fascinating presentation, “Managing the Health of the Nation” by David Nash, MD, MBA, Dean of Jefferson School of Population Health.  Healthcare is the number one industry in the city of Philadelphia, with the highest number of patients in beds in the top 10 US cities. The key theme of his thought-provoking speech was “boundaryless information sharing” (sound familiar?), which will enable a healthcare system that is “safe, effective, patient-centered, timely, equitable, efficient”.

Following Dr. Nash’s presentation was the Healthcare Transformation Panel moderated by Allen Brown, CEO of The Open Group.  Participants were:  Gina Uppal (Fulbright-Killam Fellow, American University Program), Mike Lambert (Open Group Fellow, Architecting the Enterprise), Rosemary Kennedy (Associate Professor, Thomas Jefferson University), Blaine Warkentine, MD, MPH and Fran Charney (Pennsylvania Patient Safety Authority). The group brought different sets of experiences within the healthcare system and provided reactions to Dr. Nash’s speech.  All agreed on the need for fundamental change and that technology will be key.

The conference featured a spotlight on The Open Group’s newest forum, Open Platform 3.0™, presented by Dr. Chris Harding, Director of Interoperability.  Open Platform 3.0 was formed to advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises in their use of Cloud, Social, Mobile Computing and Big Data.  For more info: http://www.opengroup.org/getinvolved/forums/platform3.0

The Open Group flourishes because of people interaction and collaboration.  The accolades continued with several members being recognized for their outstanding contributions to The Open Group Trusted Technology Forum (OTTF) and the Service-Oriented Architecture (SOA) and Cloud Computing Work Groups.  To learn more about our Forums and Work Groups and how to get involved, please visit http://www.opengroup.org/getinvolved

Presentations and workshops were also held in the Healthcare, Finance and Government vertical industries. Presenters included Larry Schmidt (Chief Technologist, HP), Rajamanicka Ponmudi (IT Architect, IBM) and Robert Weisman (CEO, Build the Vision, Inc.).

Filed under ArchiMate®, Business Architecture, Cloud/SOA, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Open Platform 3.0, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day One Highlights

By Loren K.  Baynes, Director, Global Marketing Communications at The Open Group.

On Monday, July 15th, we kicked off our conference in Philadelphia. As Allen Brown, CEO of The Open Group, commented in his opening remarks, Philadelphia is the birthplace of American democracy.  This is the first time The Open Group has hosted a conference in this historical city.

Today’s plenary sessions featured keynote speakers covering topics that included the announcement of a new Open Group standard, the appointment of a new Fellow, Enterprise Architecture and Transformation, Big Data, spotlights on The Open Group Real-time Embedded Systems and Open Trusted Technology Forums, and a new initiative on Healthcare.

Allen Brown noted that The Open Group has 432 member organizations with headquarters in 32 countries and over 40,000 individual members in 126 countries.

The Open Group Vision is Boundaryless Information Flow™ achieved through global interoperability in a secure, reliable and timely manner.  But as stated by Allen, “Boundaryless does not mean there are no boundaries.  It means that boundaries are permeable to enable business.”

Allen also presented an overview of the new Dependability Through Assuredness™ Standard.  The Open Group Real-time Embedded Systems Forum is the home of this standard. More news to come!

Allen introduced Dr. Mario Tokoro (CEO of Sony Computer Systems Laboratories), who began this project in 2006. Dr. Tokoro stated, “Thank you from the bottom of my heart for understanding the need for this standard.”

Eric Sweden, MSIH MBA, Program Director, Enterprise Architecture & Governance, National Association of State CIOs (NASCIO), offered a presentation entitled “State of the States – NASCIO on Enterprise Architecture: An Emphasis on Cross-Jurisdictional Collaboration across States”.  Eric noted, “Enterprise Architecture is a blueprint for better government.” Furthermore, “Cybersecurity is a top priority for government”.

Dr. Michael Cavaretta, Technical Lead and Data Scientist with Ford Motor Company, discussed “The Impact of Big Data on the Enterprise”.  The five keys, according to Dr. Cavaretta, are “perform, analyze, assess, track and monitor”.  Please see the following transcript from a Big Data analytics podcast, hosted by The Open Group earlier this year, in which Dr. Cavaretta participated: http://blog.opengroup.org/2013/01/28/the-open-group-conference-plenary-speaker-sees-big-data-analytics-as-a-way-to-bolster-quality-manufacturing-and-business-processes/

The final presentation during Monday morning’s plenary was “Enabling Transformation Through Architecture” by Lori Summers (Director of Technology) and Amit Mayabhate (Business Architect Manager) with Fannie Mae Multifamily.

Lori stated that their organization had adopted Business Architecture and today they have an integrated team who will complete the transformation, realize value delivery and achieve their goals.

Amit noted “Traceability from the business to architecture principles was key to our design.”

In addition to the many interesting and engaging presentations, several awards were presented.  Joe Bergmann, Director, Real-time and Embedded Systems Forum, The Open Group, was appointed Fellow by Allen Brown in recognition of Joe’s major achievements over the past 20+ years with The Open Group.

Other special recognition recipients include members from Oracle, IBM, HP and Red Hat.

In addition to the plenary session, we hosted meetings on Finance, Government and Healthcare industry verticals. Today is only Day One of The Open Group conference in Philadelphia. Please stay tuned for more exciting conference highlights over the next couple days.

Filed under ArchiMate®, Business Architecture, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Security Architecture, Standards, TOGAF®

The Open Group Sydney – My Conference Highlights

By Mac Lemon, MD Australia at Enterprise Architects

Well, the dust has settled on The Open Group ‘Enterprise Transformation’ Conference, held in Sydney, Australia for the first time on April 15-20. Enterprise Architects is proud to have been recognised by The Open Group as being pivotal to the success of the event. A number of our clients, including NBN, Australia Post, QGC, RIO and Westpac, presented excellent papers on leading-edge approaches in strategy and architecture, and a number of EA’s own thought leaders, Craig Martin, Christine Stephenson and Ana Kukec, also delivered widely acclaimed papers.

Attendance at the conference was impressive and demonstrated that there is substantial appetite for a dedicated event focussed on the challenges of business and technology strategy and architecture. We saw many international visitors, both as delegates and as presenters, and there is no question that a 2014 Open Group forum will be the stand-out event in the calendar for business and technology strategy and architecture professionals.

My top 10 take-outs from the conference include the following:

  1. The universal maturing in understanding the criticality of Business Architecture and the total convergence upon Business Capability Modelling as a cornerstone of business architecture;
  2. The improving appreciation of techniques for understanding and expressing business strategy and motivation, such as strategy maps, business model canvas and business motivation modelling;
  3. That customer experience is emerging as a common driver for many transformation initiatives;
  4. While the process for establishing the case and roadmap for transformation appears well enough understood, the process for management of the blueprint through transformation is not and generally remains a major program risk;
  5. The next version of TOGAF® should offer material uplift in support for security architecture, which otherwise remains at a low level of maturity from a framework standardisation perspective;
  6. ArchiMate® is generating real interest as a preferred enterprise architecture modelling notation – and stronger alignment of the ArchiMate® and TOGAF® meta models in the next version of TOGAF® is highly anticipated;
  7. There is industry demand for recognised certification of architects to demonstrate learning alongside experience as the mark of a good architect. There remains an unsatisfied requirement for certification that falls in the gap between TOGAF® and the Open CA certification;
  8. Australia can be proud of its position in having the second highest per capita TOGAF® certification globally behind the Netherlands;
  9. While the topic of interoperability in government revealed many battle-scarred veterans convinced of the hopelessness of the cause – there remains an equal number of campaigners willing to tackle the challenge, and their free and frank exchange of views was entertaining enough to justify the price of a conference ticket;
  10. Unashamedly – Enterprise Architects remains in a league of its own in the concentration of strategy and architecture thought leadership in Australia – if not globally.

Mac Lemon is the Managing Director of Enterprise Architects Pty Ltd and is based in Melbourne, Australia.

This is an extract from Mac’s recent blog post on the Enterprise Architects web site which you can view here.

Filed under ArchiMate®, Business Architecture, Certifications, Conference, Enterprise Architecture, Enterprise Transformation, Professional Development, Security Architecture, TOGAF®

Open Group Panel Explores Changing Field of Risk Management and Analysis in the Era of Big Data

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: The Open Group Panel Explores Changing Field of Risk Management and Analysis in Era of Big Data

This is a transcript of a sponsored podcast discussion on the threats from and promise of Big Data in securing enterprise information assets, held in conjunction with The Open Group Conference in Newport Beach.

Dana Gardner: Hello, and welcome to a special thought leadership interview series coming to you in conjunction with The Open Group Conference on January 28 in Newport Beach, California.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these business transformation discussions. The conference itself is focusing on Big Data and the transformation we need to embrace today.

We’re here now with a panel of experts to explore new trends and solutions in the area of risk management and analysis. We’ll learn how large enterprises are delivering risk assessments and risk analysis, and we’ll see how Big Data can be both a source of risk to protect against and a tool for better understanding and mitigating risks.

With that, please join me in welcoming our panel. We’re here with Jack Freund, PhD, the Information Security Risk Assessment Manager at TIAA-CREF. Welcome, Jack.

Jack Freund: Hello Dana, how are you?

Gardner: I’m great. Glad you could join us.

We are also here with Jack Jones, Principal of CXOWARE. He has more than nine years of experience as a Chief Information Security Officer and is the inventor of the Factor Analysis of Information Risk (FAIR) framework. Welcome, Jack.

Jack Jones: Thank you.

Gardner: And we’re also here with Jim Hietala, Vice President, Security for The Open Group. Welcome, Jim.

Jim Hietala: Thanks, Dana.

Gardner: All right, let’s start out with looking at this from a position of trends. Why is the issue of risk analysis so prominent now? What’s different from, say, five years ago? And we’ll start with you, Jack Jones.

Jones: The information security industry has struggled with getting the attention of and support from management and businesses for a long time, and it has finally come around to the fact that the executives care about loss exposure — the likelihood of bad things happening and how bad those things are likely to be.

It’s only when we speak of those terms or those issues in terms of risk, that we make sense to those executives. And once we do that, we begin to gain some credibility and traction in terms of getting things done.

Gardner: So we really need to talk about this in the terms that a business executive would appreciate, not necessarily an IT executive.

Effects on business

Jones: Absolutely. They’re tired of hearing about vulnerabilities, hackers, and that sort of thing. It’s only when we can talk in terms of the effect on the business that it makes sense to them.

Gardner: Jack Freund, I should also point out that you have more than 14 years in enterprise IT experience. You’re a visiting professor at DeVry University and you chair a risk-management subcommittee for ISACA? Is that correct?

Freund: ISACA, yes.

Gardner: And do you agree?

Freund: The problem that we have as a profession, and I think it’s a big problem, is that we have allowed ourselves to escape the natural trend that the other IT professionals have already taken.

There was a time, years ago, when you could code in the basement, and nobody cared much about what you were doing. But now, largely speaking, developers and systems administrators are very focused on meeting the goals of the organization.

Security has been allowed to miss that boat a little. We have been allowed to hide behind this aura of a protector and of an alerter of terrible things that could happen, without really tying ourselves to the problem that the organizations are facing and how can we help them succeed in what they’re doing.

Gardner: Jim Hietala, how do you see things that are different now than a few years ago when it comes to risk assessment?

Hietala: There are certainly changes on the threat side of the landscape. Five years ago, you didn’t really have hacktivism or this notion of an advanced persistent threat (APT).

That highly skilled attacker taking aim at governments and large organizations didn’t really exist, or didn’t exist to the degree it does today. So that has changed.

You also have big changes to the IT platform landscape, all of which bring new risks that organizations need to really think about. The mobility trend, the Cloud trend, the big-data trend that we are talking about today, all of those things bring new risk to the organization.

As Jack Jones mentioned, business executives don’t want to hear about, “I’ve got 15 vulnerabilities in the mobility part of my organization.” They want to understand what’s the risk of bad things happening because of mobility, what we’re doing about it, and what’s happening to risk over time?

So it’s a combination of changes in the threats and attackers, as well as just changes to the IT landscape, that we have to take a different look at how we measure and present risk to the business.

Gardner: Because we’re at a big-data conference, do you share my perception, Jack Jones, that Big Data can be a source of risk and vulnerability, but also the analytics and the business intelligence (BI) tools that we’re employing with Big Data can be used to alert you to risks or provide a strong tool for better understanding your true risk setting or environment.

Crown jewels

Jones: You are absolutely right. You think of Big Data and, by definition, it’s where your crown jewels, and everything that leads to crown jewels from an information perspective, are going to be found. It’s like one-stop shopping for the bad guy, if you want to look at it in that context. It definitely needs to be protected. The architecture surrounding it and its integration across a lot of different platforms and such, can be leveraged and probably result in a complex landscape to try and secure.

There are a lot of ways into that data and such, but at least if you can leverage that same Big Data architecture, it’s an approach to information security. With log data and other threat and vulnerability data and such, you should be able to make some significant gains in terms of how well-informed your analyses and your decisions are, based on that data.

Gardner: Jack Freund, do you share that? How does Big Data fit into your understanding of the evolving arena of risk assessment and analysis?

Freund: If we fast-forward it five years, and this is even true today, a lot of people on the cutting edge of Big Data will tell you the problem isn’t so much building everything together and figuring out what it can do. They are going to tell you that the problem is what we do once we figure out everything that we have. This is the problem that we have traditionally had on a much smaller scale in information security. When everything is important, nothing is important.

Gardner: To follow up on that, where do you see the gaps in risk analysis in large organizations? In other words, what parts of organizations aren’t being assessed for risk and should be?

Freund: The big problem that exists today in the way that risk assessments are done is the focus on labels. We want to quickly address the low, medium, and high things and know where they are. But the problem is that there are inherent problems in the way that we think about those labels, without doing any of the analysis legwork.

I think that’s what’s really missing: that true analysis. If the system goes offline, do we lose money? If the system becomes compromised, what are the cost-accounting things that will happen that allow us to figure out how much money we’re going to lose?

That analysis work is largely missing. That’s the gap. The gap is if the control is not in place, then there’s a risk that must be addressed in some fashion. So we end up with these very long lists of horrible, terrible things that can be done to us in all sorts of different ways, without any relevance to the overall business of the organization.

Every day, our organizations are out there selling products and offering services, which is, in and of itself, its own risky venture. So tying what we do from an information security perspective to that is critical, not just for the success of the organization, but for the success of our profession.

Gardner: So we can safely say that large companies are probably pretty good at a cost-benefit analysis or they wouldn’t be successful. Now, I guess we need to ask them to take that a step further and do a cost-risk analysis, but in business terms, being mindful that their IT systems might be a much larger part of that than they had once considered. Is that fair, Jack?

Risk implications

Jones: Businesses have been making these decisions, chasing the opportunity, but generally, without any clear understanding of the risk implications, at least from the information security perspective. They will have us in the corner screaming and throwing red flags in there, and talking about vulnerabilities and threats from one thing or another.

But, we come to the table with red, yellow, and green indicators, and on the other side of the table, they’ve got numbers. Well, here is what we expect to earn in revenue from this initiative, and the information security people are saying it’s crazy. How do you normalize the quantitative revenue gain versus red, yellow, and green?

Gardner: Jim Hietala, do you see it in the same red, yellow, green or are there some other frameworks or standard methodologies that The Open Group is looking at to make this a bit more of a science?

Hietala: Probably four years ago, we published what we call the Risk Taxonomy Standard which is based upon FAIR, the management framework that Jack Jones invented. So, we’re big believers in bringing that level of precision to doing risk analysis. Having just gone through training for FAIR myself, as part of the standards effort that we’re doing around certification, I can say that it really brings a level of precision and a depth of analysis to risk analysis that’s been lacking frequently in IT security and risk management.

Gardner: We’ve talked about how organizations need to be mindful that their risks are higher and different than in the past and we’ve talked about how standardization and methodologies are important, helping them better understand this from a business perspective, instead of just a technology perspective.

But, I’m curious about a cultural and organizational perspective. Whose job should this fall under? Who is wearing the white hat in the company and can rally the forces of good and make all the bad things managed? Is this a single person, a cultural, an organizational mission? How do you make this work in the enterprise in a real-world way? Let’s go to you, Jack Freund.

Freund: The profession of IT risk management is changing. That profession will have to sit between the business and information security inclusive of all the other IT functions that make that happen.

In order to be successful sitting between these two groups, you have to be able to speak the language of both of those groups. You have to be able to understand profit and loss and capital expenditure on the business side. On the IT risk side, you have to be technical enough to do all those sorts of things.

But I think the sum total of those two things is probably only about 50 percent of the job of IT risk management today. The other 50 percent is communication. Finding ways to translate that language and to understand the needs and concerns of each side of that relationship is really the job of IT risk management.

To answer your question, I think it’s absolutely the job of IT risk management to do that. From my own experiences with the FAIR framework, I can say that using FAIR is the Rosetta Stone for speaking between those two groups.

Necessary tools

It gives you the tools necessary to speak in the insurance and risk terms that the business appreciates. And it gives you the ability to be as technical and, just nerdy, if you will, as you need to be when talking to IT security and the other IT functions, so that everybody is on the same page and everyone feels their concerns are represented in the risk-assessment functions that are happening.

Gardner: Jack Jones, can you add to that?

Jones: I agree with what Jack said wholeheartedly. I would add, though, that integration or adoption of something like this is a lot easier the higher up in the organization you go.

For CFOs traditionally, their neck is most clearly on the line for risk-related issues within most organizations. At least in my experience, if you get their ear on this and present the information security data analyses to them, they jump on board, they drive it through the organization, and it’s just brain-dead easy.

If you try to drive it up through the ranks, maybe you get an enthusiastic supporter in the information security organization, especially if it’s below the CISO level, and they try a grassroots sort of effort to bring it in, it’s a tougher thing. It can still work. I’ve seen it work very well, but, it’s a longer row to hoe.

Gardner: There has been a lot of research, and many studies and surveys, on data breaches. What are some of the best sources, or maybe not so good sources, for actually measuring this? How do you know if you’re doing it right? How do you know if you’re moving from yellow to green, instead of to red? To you, Jack Freund.

Freund: There are a couple of things in that question. The first is there’s this inherent assumption in a lot of organizations that we need to move from yellow to green, and that may not be the case. So, becoming very knowledgeable about the risk posture and the risk tolerance of the organization is a key.

That’s part of the official mindset of IT security. When you graduate an information security person today, they are minted knowing that there are a lot of bad things out there, and their goal in life is to reduce them. But, that may not be the case. The case may very well be that things are okay now, but we have bigger things to fry over here that we’re going to focus on. So, that’s one thing.

The second thing, and it’s a very good question, is how we know that we’re getting better? How do we trend that over time? Overall, measuring that value for the organization has to be able to show a reduction of a risk or at least reduction of risk to the risk-tolerance levels of the organization.

Calculating and understanding that requires something that I always phrase as we have to become comfortable with uncertainty. When you are talking about risk in general, you’re talking about forward-looking statements about things that may or may not happen. So, becoming comfortable with the fact that they may or may not happen means that when you measure them today, you have to be willing to be a little bit squishy in how you’re representing that.

In FAIR and in other academic works, they talk about using ranges to do that. So, things like high, medium, and low, could be represented in terms of a minimum, maximum, and most likely. And that tends to be very, very effective. People can respond to that fairly well.

Gathering data

Jones: With regard to the data sources, there are a lot of people out there doing these sorts of studies, gathering data. The problem that’s hamstringing that effort is the lack of a common set of definitions, nomenclature, and even taxonomy around the problem itself.

You will have one study that will have defined threat, vulnerability, or whatever differently from some other study, and so the data can’t be normalized. It really harms the utility of it. I see data out there and I think, “That looks like that can be really useful.” But, I hesitate to use it because I don’t understand. They don’t publish their definitions, approach, and how they went after it.

There’s just so much superficial thinking in the profession on this that we now have dug under the covers. Too often, I run into stuff that just can’t be defended. It doesn’t make sense, and therefore the data can’t be used. It’s an unfortunate situation.

I do think we’re heading in a positive direction. FAIR can provide a normalizing structure for that sort of thing. The VERIS framework, which, by the way, is also derived in part from FAIR, has also gained real traction in terms of the quality of the research they have done and the data they’re generating. We’re headed in the right direction, but we’ve got a long way to go.

Gardner: Jim Hietala, we’re seemingly looking at this on a company-by-company basis. But, is there a vertical industry slice or industry-wide slice where we could look at what’s happening to everyone and put some standard understanding, or measurement around what’s going on in the overall market, maybe by region, maybe by country?

Hietala: There are some industry-specific initiatives and what’s really needed, as Jack Jones mentioned, are common definitions for things like breach, exposure, loss, all those, so that the data sources from one organization can be used in another, and so forth. I think about the financial services industry. I know that there is some information sharing through an organization called the FS-ISAC about what’s happening to financial services organizations in terms of attacks, loss, and those sorts of things.

There’s an opportunity for that on a vertical-by-vertical basis. But, like Jack said, there is a long way to go on that. In some industries, healthcare for instance, you are so far from that, it’s ridiculous. In the US here, the HIPAA security rule says you must do a risk assessment. So, hospitals have done annual risk assessments, will stick the binder on the shelf, and they don’t think much about information security in between those annual risk assessments. That’s a generalization, but various industries are at different places on a continuum of maturity of their risk management approaches.

Gardner: As we get better with having a common understanding of the terms and the measurements and we share more data, let’s go back to this notion of how to communicate this effectively to those people that can use it and exercise change management as a result. That could be the CFO, the CEO, what have you, depending on the organization.

Do you have any examples? Can we look to an organization that’s done this right, examine their practices, the way they’ve communicated it, and some of the tools they’ve used, and say, “Aha, they’re headed in the right direction; maybe we could follow a little bit”? Let’s start with you, Jack Freund.

Freund: I have worked and consulted for various organizations that have done risk management at different levels. The ones that have embraced FAIR tend to be the ones that overall feel that risk is an integral part of their business strategy. And I can give a couple of examples of scenarios that have played out that I think have been successful in the way they have been communicated.

Coming to terms

The key to keep in mind with this is that one of the really important things is that when you’re a security professional, you’re again trained to feel like you need results. But, the results for the IT risk management professional are different. The results are “I’ve communicated this effectively, so I am done.” And then whatever the results are, are the results that needed to be. And that’s a really hard thing to come to terms with.

I’ve been involved in large-scale efforts to assess risk for a Cloud venture. We needed to move virtually every confidential record that we have to the Cloud in order to be competitive with the rest of our industry. If our competitors are finding ways to utilize the Cloud before us, we can lose out. So, we need to find a way to do that, and to be secure and compliant with all the laws and regulations and such.

Through that scenario, one of the things that came out was that key ownership became really, really important. We had the opportunity to look at the various control structures and we analyzed them using FAIR. What we ended up with was sort of a long-tail risk. Most people will probably do their job right over a long enough period of time. But, over that same long period of time, the odds of somebody making a mistake not in your favor are probably likely, but, not significantly enough so that you can’t make the move.

But, the problem became that the loss side, the side that typically gets ignored with traditional risk-assessment methodologies, was so significant that the organization needed to make some judgment around that, and they needed to have a sense of what we needed to do in order to minimize that.

That became a big point of discussion for us and it drove the conversation away from bad things could happen. We didn’t bury the lead. The lead was that this is the most important thing to this organization in this particular scenario.

So, let’s talk about things we can do. Are we comfortable with it? Do we need to make any sort of changes? What are some control opportunities? How much do they cost? This is a significantly more productive conversation than just, “Here is a bunch of bad things that happen. I’m going to cross my arms and say no.”

Gardner: Jack Jones, examples at work?

Jones: In an organization that I’ve been working with recently, their board of directors said they wanted a quantitative view of information security risk. They just weren’t happy with the red, yellow, green. So, they came to us, and there were really two things that drove them there. One was that they were looking at cyber insurance. They wanted to know how much cyber insurance they should take out, and how do you figure that out when you’ve got a red, yellow, green scale?

They were able to do a series of analyses on a population of the scenarios that they thought were relevant in their world, get an aggregate view of their annualized loss exposure, and make a better informed decision about that particular problem.

Gardner: I’m curious how prevalent cyber insurance is, and whether it will have a leveling effect in the industry, where people speak a common language, the equivalent of actuarial tables, but for security in the enterprise and cyber security?

Jones: One would dream and hope, but at this point, what I’ve seen out there in terms of the basis on which insurance companies are setting their premiums and such is essentially the same old “risk assessment” stuff that the industry has been doing poorly for years. It’s not based on data or any real analysis per se, at least what I’ve run into. What they do is set their premiums high to buffer themselves and typically cover as few things as possible. The question of how much value it’s providing the customers becomes a problem.

Looking to the future

Gardner: We’re coming up on our time limit. So, let’s quickly look to the future. Is there such thing as risk management as a service? Can we outsource this? Is there a way in which moving more of IT into Cloud or hybrid models would mitigate risk, because the Cloud provider would standardize? Then, many players in that environment, those who were buying those services, would be under that same umbrella? Let’s start with you Jim Hietala. What’s the future of this and what do the Cloud trends bring to the table?

Hietala: I’d start with a maxim that comes out of the financial services industry, which is that you can outsource the function, but you still own the risk. That’s an unfortunate reality. You can throw things out in the Cloud, but it doesn’t absolve you from understanding your risk and then doing things to manage it to transfer it if there’s insurance or whatever the case may be.

That’s just a reality. Organizations in the risky world we live in are going to have to get more serious about doing effective risk analysis. From The Open Group standpoint, we see this as an opportunity area.

As I mentioned, we’ve standardized the taxonomy piece of FAIR. And we really see an opportunity around the profession going forward to help the risk-analysis community by further standardizing FAIR and launching a certification program for a FAIR-certified risk analyst. That’s in demand from large organizations that are looking for evidence that people understand how to apply FAIR and use it in doing risk analyses.

Gardner: Jack Freund, looking into your crystal ball, how do you see this discipline evolving?

Freund: I always try to consider things as they exist within other systems. Risk is a system of systems. There are a series of pressures that are applied, and a series of levers that are thrown in order to release that sort of pressure.

Risk will always be owned by the organization that is offering that service. If we decide at some point that we can move to the Cloud and all these other things, we need to look to the legal system. There is a series of pressures that they are going to apply, and who is going to own that, and how that plays itself out.

If we look to the Europeans and the way that they’re managing risk and compliance, they’re still as strict as we in the United States think that they may be about things, but there’s still a lot of leeway in a lot of the ways that laws are written. You’re still being asked to do things that are reasonable. You’re still being asked to do things that are standard for your industry. But we’d still like the ability to know what that is, and I don’t think that’s going to go away anytime soon.

Judgment calls

We’re still going to have to make judgment calls. We’re still going to have to do 100 things with a budget for 10 things. Whenever that happens, you have to make a judgment call. What’s the most important thing that I care about? And that’s why risk management exists, because there’s a certain series of things that we have to deal with. We don’t have the resources to do them all, and I don’t think that’s going to change over time. Regardless of whether the landscape changes, that’s the one that remains true.

Gardner: The last word to you, Jack Jones. It sounds as if we’re continuing down the path of being mostly reactive. Is there anything you can see on the horizon that would perhaps tip the scales, so that the risk management and analysis practitioners can really become proactive and head things off before they become a big problem?

Jones: If we were to take a snapshot at any given point in time of an organization’s loss exposure, how much risk they have right then, that’s a lagging indicator of the decisions they’ve made in the past, and their ability to execute against those decisions.

We can do some great root-cause analysis around that and ask how we got there. But we can also turn that coin around and ask how good we are at making well-informed decisions and then executing against them, then asking what that implies from a risk perspective downstream.

If we understand the relationship between our current state and past and future states, and we have those linkages defined, especially if we have an analytic framework underneath it, we can do some marvelous what-if analysis.

What if this variable changed in our landscape? Let’s run a few thousand Monte Carlo simulations against that and see what comes up. What does that look like? Well, then let’s change this other variable and see which combination of dials, when we turn them, makes us most robust to change in our landscape.

But again, we can’t begin to get there, until we have this foundational set of definitions, frameworks, and such to do that sort of analysis. That’s what we’re doing with FAIR, but without some sort of framework like that, there’s no way you can get there.
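To make that what-if exercise concrete, here is a minimal sketch in Python of a FAIR-style Monte Carlo loss simulation. It is illustrative only: the triangular distributions, the parameter ranges, and the "improved control" what-if are assumptions made for this example, not figures or methods taken from the O-RA Standard or from any of the panelists' work.

```python
import random

def tri(low, most_likely, high):
    # random.triangular takes (low, high, mode); wrap it so calibrated
    # (min, most-likely, max) estimates read naturally.
    return random.triangular(low, high, most_likely)

def simulate_ale(freq, magnitude, iterations=10_000):
    """Monte Carlo estimate of annualized loss exposure.

    freq      -- (min, most-likely, max) loss events per year
    magnitude -- (min, most-likely, max) loss per event
    """
    totals = []
    for _ in range(iterations):
        events = int(round(tri(*freq)))  # sampled loss event count for the year
        totals.append(sum(tri(*magnitude) for _ in range(events)))
    totals.sort()
    return {"mean": sum(totals) / iterations, "p90": totals[int(0.9 * iterations)]}

# Illustrative only: a baseline scenario vs. a what-if in which a control
# change roughly halves the expected loss event frequency.
baseline = simulate_ale(freq=(1, 4, 12), magnitude=(50_000, 250_000, 2_000_000))
what_if = simulate_ale(freq=(0.5, 2, 6), magnitude=(50_000, 250_000, 2_000_000))
for name, result in (("baseline", baseline), ("what-if", what_if)):
    print(f"{name:9s} mean ALE ${result['mean']:,.0f}   90th percentile ${result['p90']:,.0f}")
```

With a model like this in place, turning the dials Jones describes becomes a matter of re-running the simulation with different input ranges and comparing the resulting loss distributions.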

Gardner: I am afraid we’ll have to leave it there. We’ve been talking with a panel of experts on how new trends and solutions are emerging in the area of risk management and analysis. And we’ve seen how new tools for communication and using Big Data to understand risks are also being brought to the table.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in Newport Beach, California. I’d like to thank our panel: Jack Freund, PhD, Information Security Risk Assessment Manager at TIAA-CREF. Thanks so much Jack.

Freund: Thank you, Dana.

Gardner: We’ve also been speaking with Jack Jones, Principal at CXOWARE.

Jones: Thank you. Thank you, pleasure to be here.

Gardner: And last, Jim Hietala, the Vice President for Security at The Open Group. Thanks.

Hietala: Thanks, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions; your host and moderator through these thought leadership interviews. Thanks again for listening and come back next time.


Filed under Security Architecture

Operational Resilience through Managing External Dependencies

By Ian Dobson & Jim Hietala, The Open Group

These days, organizations are rarely self-contained. Businesses collaborate through partnerships and close links with suppliers and customers. Outsourcing services and business processes, including into Cloud Computing, means that key operations an organization depends on are often fulfilled outside its control.

The challenge here is how to manage the dependencies your operations have on factors that are outside your control. The goal is to perform your risk management so that it optimizes your operational success by making your operations resilient to failures in those external dependencies.

The Open Group’s Dependency Modeling (O-DM) standard specifies how to construct a dependency model to manage risk and build trust over organizational dependencies between enterprises, and between operational divisions within a large organization. The standard involves constructing a model of the operations necessary for an organization’s success, including the dependencies that can affect each operation. Applying quantitative risk sensitivities to each dependency then reveals the operations with the highest exposure to the risk of not being successful, informing business decision-makers where investment in reducing their organization’s exposure to external risks will yield the best return.
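As a purely illustrative aside, the sketch below shows the general shape of that kind of ranking: operations with dependencies, each dependency given an estimated chance of failing, and the operations ordered by their overall exposure. It is a stand-in for the idea only; the operations, dependencies, and probabilities are invented, and the calculation is not the modeling notation or computation that the O-DM standard itself defines.

```python
# Toy illustration: rank operations by the chance that at least one of their
# external dependencies fails over some period. All values are invented.

operations = {
    "order fulfilment":  {"logistics partner": 0.05, "payment gateway": 0.02},
    "customer support":  {"outsourced call centre": 0.10, "CRM SaaS": 0.03},
    "payroll":           {"payroll bureau": 0.01, "bank interface": 0.01},
}

def exposure(dependencies):
    """Probability that at least one dependency fails, assuming independence."""
    p_all_ok = 1.0
    for p_fail in dependencies.values():
        p_all_ok *= (1.0 - p_fail)
    return 1.0 - p_all_ok

for operation, dependencies in sorted(
        operations.items(), key=lambda item: exposure(item[1]), reverse=True):
    print(f"{operation:20s} exposure to dependency failure: {exposure(dependencies):.1%}")
```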

O-DM helps you to plan for success through operational resilience, assured business continuity, and effective new controls and contingencies, enabling you to:

  • Cut costs without losing capability
  • Make the most of tight budgets
  • Build a resilient supply chain
  • Lead programs and projects to success
  • Measure, understand and manage risk from outsourcing relationships and supply chains
  • Deliver complex event analysis

The O-DM analytical process facilitates organizational agility by allowing you to easily adjust and evolve your organization’s operations model, and produces rapid results to illustrate how reducing the sensitivity of your dependencies improves your operational resilience. O-DM also allows you to drill as deep as you need to go to reveal your organization’s operational dependencies.

Training on developing operational dependency models that conform to this standard is available, as are software computation tools that automate speedy delivery of actionable results in graphic formats to support informed business decision-making.

The O-DM standard represents a significant addition to our existing Open Group Risk Management publications.

The O-DM standard may be accessed here.

Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world.  In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cybersecurity, Security Architecture

Architecting for Secure Business Collaboration

By Ian Dobson & Jim Hietala, The Open Group

The Open Group Framework for Secure Collaboration Oriented Architectures (O-SCOA) Guide provides system and security architects and designers with a blueprint specifying the requirements for secure design of enterprise architectures that support safe and secure operation, globally, over any unsecured network.

This secure COA framework was originally developed by the Jericho Forum®, a forum of The Open Group, from 2007-2009. They started with an overview paper outlining the objectives and framework concepts, and quickly followed it with a high-level COA framework that mapped the primary components – processes, services, attributes and technologies – and identified the sub-components under each. Then, over the next 18 months, the forum developed and published a series of requirements papers presenting the results of a methodical analysis of the security requirements that each sub-component should be architected to fulfill.

The O-SCOA Guide brings together an updated version of all these papers in one publication, adding the latest developments in the critical identity management component.  It also includes the business case for building Enterprise Architectures that follow the O-SCOA guidance to assure safe and secure operations between business partners over insecure global networks. Additionally, it includes the Jericho Commandments, first published in 2006, which have stood the test of time as the proven benchmark for assessing how secure any Enterprise Architecture is for operations in open systems.

The O-SCOA Guide may be downloaded here.

Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world.  In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cybersecurity, Security Architecture

Optimizing ISO/IEC 27001 Using O-ISM3

By Jim Hietala, The Open Group and Vicente Aceituno, Sistemas Informáticos Abiertos

The Open Group has just published a guide titled “Optimizing ISO/IEC 27001 using O-ISM3” that will be of interest to organizations using ISO27001/27002 as their Information Security Management System (ISMS).

By way of background, The Open Group published our Open Information Security Management Maturity Model (O-ISM3) last year. O-ISM3 brings continuous improvement to information security management, and it provides a framework for security decision-making that is top-down in nature, where security controls, security objectives and spending decisions are driven by (and aligned with) business objectives.

We have for some time now heard from information security managers that they would like a resource aimed at showing how the O-ISM3 standard could be used to manage information security alongside ISO27001/27002. This new guide provides specific guidance on this topic.

We view this as an important resource, for the following reasons:

  • O-ISM3 complements ISO27001/2 by adding the “how” dimension to information security management
  • O-ISM3 uses a process-oriented approach, defining inputs and outputs, and allowing for evaluation by process-specific metrics
  • O-ISM3 provides a framework for continuous improvement of information security processes

This resource:

  • Maps O-ISM3 and ISO27001 security objectives
  • Maps ISO27001/27002 controls and documents to O-ISM3 security processes, documents, and outputs
  • Provides a critical linkage between the controls-based approach found in ISO27001 and the process-based approach found in O-ISM3 (a simplified illustration of what such a linkage might look like follows below)
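As a rough illustration of what a process-oriented record with an ISO cross-reference might look like in practice, here is a small sketch. The process, its inputs, outputs, metrics, and control identifiers are hypothetical placeholders, not mappings taken from the guide or from either standard.

```python
from dataclasses import dataclass, field

@dataclass
class SecurityProcess:
    """An O-ISM3-style process record: what goes in, what comes out, and how
    it is measured. All values below are placeholders for illustration."""
    name: str
    inputs: list
    outputs: list
    metrics: dict                                       # metric name -> current value
    iso_controls: list = field(default_factory=list)    # related ISO 27001 control IDs (hypothetical)

patch_management = SecurityProcess(
    name="Patch management (illustrative)",
    inputs=["vendor advisories", "asset inventory"],
    outputs=["patched systems", "exception list"],
    metrics={"systems patched within SLA (%)": 94, "mean days to patch": 12},
    iso_controls=["A.x.y (placeholder)"],
)

# A continuous-improvement loop would periodically review these metrics against
# targets and adjust the process, which is the "how" dimension described above.
for metric, value in patch_management.metrics.items():
    print(f"{patch_management.name}: {metric} = {value}")
```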

If you have interest in information security management, we encourage you to have a look at Optimizing ISO/IEC 27001 using O-ISM3. The guide may be downloaded (at no cost, minimal registration required) here.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Vicente Aceituno, CISA, has 20 years experience in the field of IT and Information Security. During his career in Spain and the UK, he has worked for companies like Coopers & Lybrand, BBC News, Everis, and SIA Group. He is the main author of the Information Security Management Method ISM3, author of the information security book “Seguridad de la Información,” Director of the ISM3 Consortium (www.ism3.com) and President of the Spanish chapter of the ISSA.


Filed under Cybersecurity, Information security, Security Architecture

Summer in the Capitol – Looking Back at The Open Group Conference in Washington, D.C.

By Jim Hietala, The Open Group

This past week in Washington D.C., The Open Group held our Q3 conference. The theme for the event was “Cybersecurity – Defend Critical Assets and Secure the Global Supply Chain,” and the conference featured a number of thought-provoking speakers and presentations.

Cybersecurity is at a critical juncture, and conference speakers highlighted the threat and attack reality and described industry efforts to move forward in important areas. The conference also featured a new capability, as several of the events were Livestreamed to the Internet.

For those who did not make the event, here’s a summary of a few of the key presentations, as well as what The Open Group is doing in these areas.

Joel Brenner, attorney with Cooley, was our first keynote speaker. Joel’s presentation was titled “Turning Us Inside-Out: Crime and Economic Espionage on our Networks.” The talk mirrored his recent book, “America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime, and Warfare,” and Joel talked about current threats to critical infrastructure, attack trends and challenges in securing information. Joel’s presentation was a wakeup call to the very real issues of IP theft and identity theft. Beyond describing the threat and attack landscape, Joel discussed some of the management challenges related to ownership of the problem, namely that the different stakeholders in addressing cybersecurity in companies, including legal, technical, management and HR, all tend to think that this is someone else’s problem. Joel stated the need for policy spanning the entire organization to fully address the problem.

Kristen Baldwin, principal deputy, systems engineering, Office of the Assistant Secretary of Defense, Research and Engineering, described the U.S. Department of Defense (DoD) trusted defense systems strategy and challenges, including requirements to secure their multi-tiered supply chain. She also talked about how the acquisition landscape has changed over the past few years. In addition, for all programs the DoD now requires the creation of a program protection plan, which is the single focal point for security activities on the program. Kristen’s takeaways included needing a holistic approach to security, focusing attention on the threat, and avoiding risk exposure from gaps and seams. DoD’s Trusted Defense Systems Strategy provides an overarching framework for trusted systems. Stakeholder integration with acquisition, intelligence, engineering, industry and research communities is key to success. Systems engineering brings these stakeholders, risk trades, policy and design decisions together. Kristen also stressed the importance of informing leadership early and providing programs with risk-based options.

Dr. Ron Ross of NIST described a perfect storm: the proliferation of information systems and networks and the increasing sophistication of threats are resulting in an increasing number of penetrations of information systems in the public and private sectors, potentially affecting security and privacy. He proposed the need for an integrated project team approach to information security. Dr. Ross also provided an overview of the changes coming in NIST SP 800-53, version 4, which is presently available in draft form. He also advocated a dual protection strategy: traditional controls at network perimeters, which assume attackers are outside organizational networks, combined with agile defenses, which assume attackers are already inside the perimeter. The objective of agile defenses is to enable operation while under attack and to minimize response times to ongoing attacks. This new approach mirrors thinking from the Jericho Forum and others on de-perimeterization and security and is very welcome.

The Open Group Trusted Technology Forum provided a panel discussion on supply chain security issues and the approach that the forum is taking towards addressing issues relating to taint and counterfeit in products. The panel included Andras Szakal of IBM, Edna Conway of Cisco and Dan Reddy of EMC, as well as Dave Lounsbury, CTO of The Open Group. OTTF continues to make great progress in the area of supply chain security, having published a snapshot of the Open Trusted Technology Provider Framework, working to create a conformance program, and in working to harmonize with other standards activities.

Dave Hornford, partner at Conexiam and chair of The Open Group Architecture Forum, provided a thought provoking presentation titled, “Secure Business Architecture, or just Security Architecture?” Dave’s talk described the problems in approaches that are purely focused on securing against threats and brought forth the idea that focusing on secure business architecture was a better methodology for ensuring that stakeholders had visibility into risks and benefits.

Geoff Besko, CEO of Seccuris and co-leader of the security integration project for the next version of TOGAF®, delivered a presentation that looked at risk from both a positive and a negative view. He noted that senior management frequently view risk as something to embrace, taking risks with an eye on business gains in revenue, market share or profitability, while security practitioners tend to focus on risk as something that is to be mitigated. Finding common ground is key here.

Katie Lewin, who is responsible for the GSA FedRAMP program, provided an overview of the program, and how it is helping raise the bar for federal agency use of secure Cloud Computing.

The conference also featured a workshop on security automation, which featured presentations on a number of standards efforts in this area, including on SCAP, O-ACEML from The Open Group, MILE, NEA, AVOS and SACM. One conclusion from the workshop was that there’s presently a gap and a need for a higher level security automation architecture encompassing the many lower level protocols and standards that exist in the security automation area.

In addition to the public conference, a number of forums of The Open Group met in working sessions in the Capitol to advance their work.

All in all, the conference clarified the magnitude of the cybersecurity threat, and the importance of initiatives from The Open Group and elsewhere to make progress on real solutions.

Join us at our next conference in Barcelona on October 22-25!

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Security Architecture, Supply chain risk, TOGAF®

The Increasing Importance of Cybersecurity: The Open Group Conference in Washington, D.C.

By Jim Hietala, The Open Group

As we move through summer here in the U.S., cybersecurity continues to be top of mind, not only for security professionals, but for IT management as well as for senior managers in large organizations.

The IT security world tends to fixate on the latest breach reported or the latest vulnerability disclosed. Clearly the recent news around Stuxnet and Flame has caused a stir in the community, as professionals debate what it means to have cyberwar attacks being carried out by nations. However, there have also been other significant developments in cybersecurity that have heightened the need for better understanding of risk and security posture in large organizations.

In the U.S., the SEC recently issued guidance to public companies on disclosing the risks of cybersecurity incidents in financial reports, as well as disclosing actual breaches if there is a material effect. This is a significant new development, as there’s little that directs the attention of CEOs and boards like new financial disclosure requirements. In publicly traded organizations that have struggled to find funding for adequate risk management and IT security initiatives, IT folks will have a new impetus and mandate, likely with support from the highest levels.

The upcoming Open Group conference in Washington, D.C. on July 16-20 will explore cybersecurity, with a focus on defending critical assets and securing the global supply chain. To highlight a few of the notable presentations:

  • Joel Brenner, author of America the Vulnerable, attorney, and former senior counsel at the NSA, will keynote on Monday, July 16 and will speak on “America the Vulnerable: Inside the New Threat Matrix.”
  • Kristen Baldwin, principal deputy, DASD, Systems Engineering, and acting director, Systems Analysis, will speak on “Meeting the Challenge of Cybersecurity Threats through Industry-Government Partnerships.”
  • Dr. Ron Ross, project leader, NIST, will talk to “Integrating Cyber Security Requirements into Main Stream Organizational Mission and Business Processes.”
  • Andras Szakal, VP & CTO, IBM Federal will moderate a panel that will include Daniel Reddy, EMC; Edna Conway, Cisco; and Hart Rossman, SAIC on “Mitigating Tainted & Counterfeit Products.”
  • Dazza (Daniel) J. Greenwood, JD, MIT and CIVICS.com Consultancy Services, and Thomas Hardjono, executive director of MIT Kerberos Consortium, will discuss “Meeting the Challenge of Identity and Security.”

Apart from our quarterly conferences and member meetings, The Open Group undertakes a broad set of programs aimed at addressing challenges in information security.

Our Security Forum focuses on developing standards and best practices in the areas of information security management and secure architecture. The Real Time and Embedded Systems Forum addresses high assurance systems and dependability through work focused on MILS, software assurance, and dependability engineering for open systems. Our Trusted Technology Forum addresses supply chain issues of taint and counterfeit products through the development of the Trusted Technology Provider Framework, which is a draft standard aimed at enabling commercial off the shelf ICT products to be built with integrity, and bought with confidence. Finally, The Open Group Jericho Forum continues to provide thought leadership in the area of information security, most notably in the areas of de-perimeterization, secure cloud computing and identity management.

I hope to see you at the conference. More information about the conference, including the full program can be found here: http://www.opengroup.org/dc2012

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.



Filed under Conference, Cybersecurity, Information security, OTTF, Security Architecture

Open Group Security Gurus Dissect the Cloud: Higher or Lower Risk

By Dana Gardner, Interarbor Solutions

For some, any move to the Cloud — at least the public Cloud — means a higher risk for security.

For others, relying more on a public Cloud provider means better security. There’s more of a concentrated and comprehensive focus on security best practices that are perhaps better implemented and monitored centrally in the major public Clouds.

And so which is it? Is Cloud a positive or negative when it comes to cyber security? And what of hybrid models that combine public and private Cloud activities, how is security impacted in those cases?

We posed these and other questions to a panel of security experts at last week’s Open Group Conference in San Francisco to deeply examine how Cloud and security come together — for better or worse.

The panel: Jim Hietala, Vice President of Security for The Open Group; Stuart Boardman, Senior Business Consultant at KPN, where he co-leads the Enterprise Architecture Practice as well as the Cloud Computing Solutions Group; Dave Gilmour, an Associate at Metaplexity Associates and a Director at PreterLex Ltd.; and Mary Ann Mezzapelle, Strategist for Enterprise Services and Chief Technologist for Security Services at HP.

The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. The full podcast can be found here.

Here are some excerpts:

Gardner: Is this notion of going outside the firewall fundamentally a good or bad thing when it comes to security?

Hietala: It can be either. Talking to security people in large companies, frequently what I hear is that with adoption of some of those services, their policy is either let’s try and block that until we get a grip on how to do it right, or let’s establish a policy that says we just don’t use certain kinds of Cloud services. Data I see says that that’s really a failed strategy. Adoption is happening whether they embrace it or not.

The real issue is how you do that in a planned, strategic way, as opposed to letting services like Dropbox and other kinds of Cloud Collaboration services just happen. So it’s really about getting some forethought around how do we do this the right way, picking the right services that meet your security objectives, and going from there.

Gardner: Is Cloud Computing good or bad for security purposes?

Boardman: It’s simply a fact, and it’s something that we need to learn to live with.

What I’ve noticed through my own work is that a lot of enterprise security policies were written before we had Cloud, back when we had private web applications that you might call Cloud these days, and the policies tend to be directed toward staff’s private use of the Cloud.

Then you run into problems, because you read something in policy — and if you interpret that as meaning Cloud, it means you can’t do it. And if you say it’s not Cloud, then you haven’t got any policy about it at all. Enterprises need to sit down and think, “What would it mean to us to make use of Cloud services and to ask as well, what are we likely to do with Cloud services?”

Gardner: Dave, is there an added impetus for Cloud providers to be somewhat more secure than enterprises?

Gilmour: It depends on the enterprise that they’re actually supplying to. If you’re in a heavily regulated industry, you have a different view of what levels of security you need and want, and therefore what you’re going to impose contractually on your Cloud supplier. That means that the different Cloud suppliers are going to have to attack different industries with different levels of security arrangements.

The problem there is that the penalty regimes are always going to say, “Well, if the security lapses, you’re going to get off with two months of not paying” or something like that. That kind of attitude isn’t going to go in this kind of security.

What I don’t understand is exactly how secure Cloud provision is going to be enabled and governed under tight regimes like that.

An opportunity

Gardner: Jim, we’ve seen in the public sector that governments are recognizing that Cloud models could be a benefit to them. They can reduce redundancy. They can control and standardize. They’re putting in place some definitions, implementation standards, and so forth. Is the vanguard of correct Cloud Computing with security in mind being managed by governments at this point?

Hietala: I’d say that they’re at the forefront. Some of these shared government services, where they stand up Cloud and make it available to lots of different departments in a government, have the ability to do what they want from a security standpoint, not relying on a public provider, and get it right from their perspective and meet their requirements. They then take that consistent service out to lots of departments that may not have had the resources to get IT security right, when they were doing it themselves. So I think you can make a case for that.

Gardner: Stuart, being involved with standards activities yourself, does moving to the Cloud provide a better environment for managing, maintaining, instilling, and improving on standards than enterprise by enterprise by enterprise? As I say, we’re looking at a larger pool and therefore that strikes me as possibly being a better place to invoke and manage standards.

Boardman: Dana, that’s a really good point, and I do agree. Also, in the security field, we have an advantage in the sense that there are quite a lot of standards out there to deal with interoperability, exchange of policy, exchange of credentials, which we can use. If we adopt those, then we’ve got a much better chance of getting those standards used widely in the Cloud world than in an individual enterprise, with an individual supplier, where it’s not negotiation, but “you use my API, and it looks like this.”

Having said that, there are a lot of well-known Cloud providers who do not currently support those standards and they need a strong commercial reason to do it. So it’s going to be a question of the balance. Will we get enough specific weight of people who are using it to force the others to come on board? And I have no idea what the answer to that is.

Gardner: We’ve also seen that cooperation is an important aspect of security, knowing what’s going on on other people’s networks, being able to share information about what the threats are, remediation, working to move quickly and comprehensively when there are security issues across different networks.

Is that a case, Dave, where having a Cloud environment is a benefit? That is to say more sharing about what’s happening across networks for many companies that are clients or customers of a Cloud provider rather than perhaps spotty sharing when it comes to company by company?

Gilmour: There is something to be said for that, Dana. Part of the issue, though, is that companies are individually responsible for their data. They’re individually responsible to a regulator or to their clients for their data. The question then becomes that as soon as you start to share a certain aspect of the security, you’re de facto sharing the weaknesses as well as the strengths.

So it’s a two-edged sword. One of the problems we have is that until we mature a little bit more, we won’t be able to actually see which side is the sharpest.

Gardner: So our premise that Cloud is good and bad for security is holding up, but I’m wondering whether the same things that make you a risk in a private setting — poor adhesion to standards, no good governance, too many technologies that are not being measured and controlled, not instilling good behavior in your employees and then enforcing that — wouldn’t this be the same either way? Is it really Cloud or not Cloud, or is it good security practices or not good security practices? Mary Ann?

No accountability

Mezzapelle: You’re right. It’s a little bit of that “garbage in, garbage out,” if you don’t have the basic things in place in your enterprise, which means the policies, the governance cycle, the audit, and the tracking, because it doesn’t matter if you don’t measure it and track it, and if there is no business accountability.

David said it — each individual company is responsible for its own security, but I would say that it’s the business owner that’s responsible for the security, because they’re the ones that ultimately have to answer that question for themselves in their own business environment: “Is it enough for what I have to get done? Is the agility more important than the flexibility in getting to some systems or the accessibility for other people, as it is with some of the ubiquitous computing?”

So you’re right. If it’s an ugly situation within your enterprise, it’s going to get worse when you do outsourcing, out-tasking, or anything else you want to call within the Cloud environment. One of the things that we say is that organizations not only need to know their technology, but they have to get better at relationship management, understanding who their partners are, and being able to negotiate and manage that effectively through a series of relationships, not just transactions.

Gardner: If data and sharing data are so important, it strikes me that the Cloud component is going to be part of that, especially if we’re dealing with business processes across organizations: doing joins, comparing and contrasting data, crunching it and sharing it, making data actually part of the business, a revenue-generating activity. All of that seems prominent and likely.

So to you, Stuart, what is the issue now with data in the Cloud? Is it good, bad, or just the same double-edged sword, and it just depends how you manage and do it?

Boardman: Dana, I don’t know whether we really want to be putting our data in the Cloud, so much as putting the access to our data into the Cloud. There are all kinds of issues you’re going to run up against as soon as you start putting your source information out into the Cloud, not least privacy and that kind of thing.

A bunch of APIs

What you can do is simply say, “What information do I have that might be interesting to people? If it’s a private Cloud in a large organization elsewhere in the organization, how can I make that available to share?” Or maybe it’s really going out into public. What a government, for example, can be thinking about is making information services available, not just what you go and get from them that they already published. But “this is the information,” a bunch of APIs if you like. I prefer to call them data services, and to make those available.

So, if you do it properly, you have a layer of security in front of your data. You’re not letting people come in and do joins across all your tables. You’re providing information. That does require you then to engage your users in what is it that they want and what they want to do. Maybe there are people out there who want to take a bit of your information and a bit of somebody else’s and mash it together, provide added value. That’s great. Let’s go for that and not try and answer every possible question in advance.
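As a toy illustration of that “layer of security in front of your data” idea, the sketch below exposes purpose-built views instead of raw table access. The roles, fields, and records are invented for this example and are not drawn from any product or standard discussed here.

```python
# Toy sketch of a "data service" that sits in front of the source data: callers
# get a purpose-built view based on their role rather than raw access to the
# underlying tables. Roles, fields and records are invented for illustration.

RECORDS = [
    {"customer_id": 1, "name": "Alice", "postcode": "AB1 2CD", "card_last4": "4242"},
    {"customer_id": 2, "name": "Bob",   "postcode": "EF3 4GH", "card_last4": "0005"},
]

FIELDS_BY_ROLE = {
    "marketing_partner": {"customer_id", "postcode"},            # aggregate-friendly view
    "support_agent":     {"customer_id", "name", "card_last4"},  # operational view
}

def data_service(role, customer_id=None):
    """Return only the fields the caller's role is entitled to see."""
    allowed = FIELDS_BY_ROLE.get(role, set())
    rows = [r for r in RECORDS if customer_id is None or r["customer_id"] == customer_id]
    return [{k: v for k, v in row.items() if k in allowed} for row in rows]

print(data_service("marketing_partner"))              # no names, no card data
print(data_service("support_agent", customer_id=1))   # scoped to one customer
```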

Gardner: Dave, do you agree with that, or do you think that there is a place in the Cloud for some data?

Gilmour: There’s definitely a place in the Cloud for some data. I get the impression that something like the insurance industry is going to grow out of this, where you’ll have a secondary Cloud. You’ll have secondary providers who will provide to the front-end providers. They might do things like archiving and that sort of thing.

Now, if you have that situation where your contractual relationship is two steps away, then you have to be very confident and certain of your cloud partner, and it has to actually therefore encompass a very strong level of governance.

The other issue you have is that you’ve got then the intersection of your governance requirements with that of the cloud provider’s governance requirements. Therefore you have to have a really strongly — and I hate to use the word — architected set of interfaces, so that you can understand how that governance is actually going to operate.

Gardner: Wouldn’t data perhaps be safer in a cloud than if they have a poorly managed network?

Mezzapelle: There is data in the Cloud and there will continue to be data in the Cloud, whether you want it there or not. The best organizations are going to start understanding that they can’t control it with that perimeter-like approach that we’ve been talking about getting away from for the last five or seven years.

So what we want to talk about is data-centric security, where you understand, based on role or context, who is going to access the information and for what reason. I think there is a better opportunity for services like storage, whether it’s for archiving or for near term use.

There are also other services that you don’t want to have to pay for 12 months out of the year, but that you might need independently. For instance, when you’re running a marketing campaign, you already share your data with some of your marketing partners. Or if you’re doing your payroll, you’re sharing that data through some of the national providers.

Data in different places

So there already is a lot of data in a lot of different places, whether you want Cloud or not, but the context is, it’s not in your perimeter, under your direct control, all of the time. The better you get at managing it wherever it is specific to the context, the better off you will be.

Hietala: It’s a slippery slope [when it comes to customer data]. That’s the most dangerous data to stick out in a Cloud service, if you ask me. If it’s personally identifiable information, then you get the privacy concerns that Stuart talked about. So to the extent you’re looking at putting that kind of data in a Cloud, look at the Cloud service and try to determine whether you can apply some encryption and the sensible security controls to ensure that if that data gets loose, you’re not ending up in the headlines of The Wall Street Journal.

Gardner: Dave, you said there will be different levels on a regulatory basis for security. Wouldn’t that also play with data? Wouldn’t there be different types of data and therefore a spectrum of security and availability to that data?

Gilmour: You’re right. If we come back to Facebook as an example, Facebook is data that, even if it’s data about our known customers, it’s stuff that they have put out there of their own free will. The data that they give us, they have given to us for a purpose, and it is not for us then to distribute that data or make it available elsewhere. The fact that it may be the same data is not relevant to the discussion.

Three-dimensional solution

That’s where I think we are going to end up with not just one layer or two layers. We’re going to end up with a sort of a three-dimensional solution space. We’re going to work out exactly which chunk we’re going to handle in which way. There will be significant areas where these things crossover.

The other thing we shouldn’t forget is that data includes our software, and that’s something that people forget. Software nowadays is out in the Cloud, under current ways of running things, and you don’t even always know where it’s executing. So if you don’t know where your software is executing, how do you know where your data is?

It’s going to have to be just handled one way or another, and I think it’s going to be one of these things where it’s going to be shades of gray, because it cannot be black and white. The question is going to be, what’s the threshold shade of gray that’s acceptable.

Gardner: Mary Ann, to this notion of the different layers of security for different types of data, is there anything happening in the market that you’re aware of that’s already moving in that direction?

Mezzapelle: The experience that I have is mostly in some of the business frameworks for particular industries, like healthcare and what it takes to comply with the HIPAA regulation, or in the financial services industry, or in consumer products where you have to comply with the PCI regulations.

There has continued to be an issue around information lifecycle management, which is categorizing your data. Within a company, you might have had a document that you coded private, confidential, top secret, or whatever. So you might have had three or four levels for a document.

You’ve already talked about how complex it’s going to be as you move into trying to understand, not only that data, but the fact that the name Mary Ann Mezzapelle happens to be in five or six different business systems, over 100 instances around the world.

That’s the importance of something like an Enterprise Architecture that can help you understand that you’re not just talking about the technology components, but the information, what they mean, and how they are prioritized or critical to the business, which sometimes comes up in a business continuity plan from a system point of view. That’s where I’ve advised clients on where they might start looking to how they connect the business criticality with a piece of information.

One last thing. Those regulations don’t necessarily mean that you’re secure. It makes for good basic health, but that doesn’t mean that it’s ultimately protected. You have to do a risk assessment based on your own environment and the bad actors that you expect and the priorities based on that.

Leaving security to the end

Boardman: I just wanted to pick up here, because Mary Ann spoke about Enterprise Architecture. One of my bugbears — and I call myself an enterprise architect — is that we have a terrible habit of leaving security to the end. We don’t architect security into our Enterprise Architecture. It’s a techie thing, and we’ll fix that at the back. There are also people in the security world who are techies, and they think that they will do it that way as well.

I don’t know how long ago it was published, but there was an activity to look at bringing the SABSA Methodology from security together with TOGAF®. There was a white paper published a few weeks ago.

The Open Group has been doing some really good work on bringing security right in to the process of EA.

Hietala: In the next version of TOGAF, which has already started, there will be a whole emphasis on making sure that security is better represented in some of the TOGAF guidance. That’s ongoing work here at The Open Group.

Gardner: As I listen, it sounds as if the “in the Cloud or out of the Cloud” security continuum is perhaps the wrong way to look at it. If you have a lifecycle approach to services and to data, then you’ll have a way in which you can approach data uses for certain instances, certain requirements, and that would then apply to a variety of different private Cloud, public Cloud, and hybrid Cloud models.

Is that where we need to go, perhaps have more of this lifecycle approach to services and data that would accommodate any number of different scenarios in terms of hosting access and availability? The Cloud seems inevitable. So what we really need to focus on are the services and the data.

Boardman: That’s part of it. That needs to be tied in with the risk-based approach. So if we have done that, we can then pick up on that information and we can look at a concrete situation, what have we got here, what do we want to do with it. We can then compare that information. We can assess our risk based on what we have done around the lifecycle. We can understand specifically what we might be thinking about putting where and come up with a sensible risk approach.

You may come to the conclusion in some cases that the risk is too high and the mitigation too expensive. In others, you may say, no, because we understand our information and we understand the risk situation, we can live with that, it’s fine.

Gardner: It sounds as if we are coming at this as an underwriter for an insurance company. Is that the way to look at it?

Current risk

Gilmour: That’s eminently sensible. You have the mortality tables, you have the current risk, and you just work the two together and work out what’s the premium. That’s probably a very good paradigm to give us guidance actually as to how we should approach intellectually the problem.

Mezzapelle: One of the problems is that we don’t have those actuarial tables yet. That’s a little bit of an issue for a lot of people when they talk about, “I’ve got $100 to spend on security. Where am I going to spend it this year? Am I going to spend it on firewalls? Am I going to spend it on information lifecycle management assessment? What am I going to spend it on?” Some of the research that we have been doing at HP is trying to get that into something that’s more of a statistic.

So, when you have a particular project that does a certain kind of security implementation, you can see what the business return on it is and how it actually lowers risk. We found that it’s better to spend your money on getting a better system to patch your systems than it is to do some other kind of content filtering or something like that.
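A minimal sketch of that kind of comparison might rank candidate controls by expected loss avoided per dollar spent. Every number below is invented for illustration; real figures would have to come from the measurement work being described here.

```python
# Compare candidate security investments by expected annual loss avoided per
# dollar spent. All figures are invented for illustration.

controls = [
    # (name, cost, expected annual loss before, expected annual loss after)
    ("improved patching",   40_000, 900_000, 500_000),
    ("content filtering",   60_000, 900_000, 800_000),
    ("extra firewall tier", 30_000, 900_000, 860_000),
]

def loss_avoided_per_dollar(cost, before, after):
    return (before - after) / cost

for name, cost, before, after in sorted(
        controls, key=lambda c: loss_avoided_per_dollar(*c[1:]), reverse=True):
    print(f"{name:20s} ${loss_avoided_per_dollar(cost, before, after):.2f} "
          f"of expected loss avoided per $1 spent")
```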

Gardner: Perhaps what we need is the equivalent of an Underwriters Laboratories (UL) for permeable organizational IT assets, where the security stamp of approval comes in high or low. Then, you could get your insurance insight. Maybe that’s something for The Open Group to look into. Any thoughts about how standards and a consortium approach would come into that?

Hietala: I don’t know about the UL for all security things. That sounds like a risky proposition.

Gardner: It could be fairly popular and remunerative.

Hietala: It could.

Mezzapelle: An unending job.

Hietala: I will say we have one active project in the Security Forum that is looking at trying to allow organizations to measure and understand risk dependencies that they inherit from other organizations.

So if I’m outsourcing a function to XYZ corporation, being able to measure what risk am I inheriting from them by virtue of them doing some IT processing for me, could be a Cloud provider or it could be somebody doing a business process for me, whatever. So there’s work going on there.

I heard just last week about an NSF-funded project here in the U.S. to do the same sort of thing, to look at trying to measure risk in a predictable way. So there are things going on out there.

Gardner: We have to wrap up, I’m afraid, but Stuart, it seems as if currently it’s the larger public Cloud providers, the likes of Amazon and Google among others, that might be playing the role of all of these entities we are talking about. They are their own self-insurer. They are their own underwriter. They are their own risk assessor, like a UL. Do you think that’s going to continue to be the case?

Boardman: No, I think that as Cloud adoption increases, you will have a greater weight of consumer organizations who will need to do that themselves. You look at the question that it’s not just responsibility, but it’s also accountability. At the end of the day, you’re always accountable for the data that you hold. It doesn’t matter where you put it and how many other parties they subcontract that out to.

The weight will change

So there’s a need to have that, and as the adoption increases, there’s less fear and more, “Let’s do something about it.” Then, I think the weight will change.

Plus, of course, there are other parties coming into this world, the world that Amazon has created. I’d imagine that HP is probably one of them as well, but all the big names in IT are moving in here, and I suspect that for those companies there’s also a differentiator in knowing how to do this properly, given their history of enterprise involvement.

So yeah, I think it will change. That’s no offense to Amazon, etc. I just think that the balance is going to change.

Gilmour: Yes. I think that’s how it has to go. The question that then arises is, who is going to police the policeman and how is that going to happen? Every company is going to be using the Cloud. Even the Cloud suppliers are using the Cloud. So how is it going to work? It’s one of these never-decreasing circles.

Mezzapelle: At this point, I think it’s going to be more evolution than revolution, but I’m also one of the people who’ve been in that part of the business — IT services — for the last 20 years and have seen it morph in a little bit different way.

Stuart is right that there’s going to be a convergence of the consumer-driven, cloud-based model, which Amazon and Google represent, with an enterprise approach that corporations like HP are representing. It’s somewhere in the middle where we can bring the service level commitments, the options for security, the options for other things that make it more reliable and risk-averse for large corporations to take advantage of it.

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.


Filed under Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Security Architecture

Security and Cloud Computing Themes to be explored at The Open Group San Francisco Conference

By The Open Group Conference Team

Cybersecurity and Cloud Computing are two of the most pressing trends facing enterprises today. The Open Group Conference San Francisco will feature tracks on both trends, where attendees can learn about the latest developments in both disciplines and hear practical advice for implementing secure architectures and for moving enterprises into the Cloud. Below are some of the highlights and featured speakers from both tracks.

Security

The San Francisco conference will provide an opportunity for practitioners to explore the theme of “hacktivism,” the use and abuse of IT to drive social change, and its potential impact on business strategy and Enterprise Transformation.  Traditionally, IT security has focused on protecting the IT infrastructure and the integrity of the data held within.  However, in a rapidly changing world where hacktivism is an enterprise’s biggest threat, how can enterprise IT security respond?

Featured speakers and panels include:

  • Steve Whitlock, Chief Security Strategist, Boeing, “Information Security in the Internet Age”
  • Jim Hietala, Vice President, Security, The Open Group, “The Open Group Security Survey Results”
  • Dave Hornford, Conexiam, and Chair, The Open Group Architecture Forum, “Overview of TOGAF® and SABSA® Integration White Paper”
  • Panel – “The Global Supply Chain: Presentation and Discussion on the Challenges of Protecting Products Against Counterfeit and Tampering”

Cloud Computing

According to Gartner, Cloud Computing is now entering the “trough of disillusionment” on its hype cycle. It is critical that organizations better understand the practical business, operational and regulatory issues associated with the implementation of Cloud Computing in order to truly maximize its potential benefits.

Featured speakers and panels include:

  • David JW Gilmour, Metaplexity Associates, “Architecting for Information Security in a Cloud Environment”
  • Chris Lockhart, Senior Enterprise Architect, UnitedHealth Group, “Un-Architecture: How a Fortune 25 Company Solved the Greatest IT Problem”
  • Penelope Gordon, Cloud and Business Architect, 1Plug Corporation, “Measuring the Business Performance of Cloud Products”
  • Jitendra Maan, Tata Consultancy, “Mobile Intelligence with Cloud Strategy”
  • Panel – “The Benefits, Challenges and Survey of Cloud Computing Interoperability and Portability”
    • Mark Skilton, Capgemini; Kapil Bakshi, Cisco; Jeffrey Raugh, Hewlett-Packard

Please join us in San Francisco for these speaking tracks, as well as those on our featured theme of Enterprise Transformation and the role of enterprise architecture. For more information, please go to the conference homepage: http://www3.opengroup.org/sanfrancisco2012


Filed under Cloud, Cloud/SOA, Cybersecurity, Information security, Security Architecture, Semantic Interoperability, TOGAF

Overlapping Criminal and State Threats Pose Growing Cyber Security Threat to Global Internet Commerce, Says Open Group Speaker

By Dana Gardner, Interarbor Solutions

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference this January in San Francisco.

The conference will focus on how IT and enterprise architecture support enterprise transformation. Speakers in conference events will also explore the latest in service oriented architecture (SOA), cloud computing, and security.

We’re here now with one of the main speakers, Joseph Menn, Cyber Security Correspondent for the Financial Times and author of Fatal System Error: The Hunt for the New Crime Lords Who are Bringing Down the Internet.

Joe has covered security since 1999, for the Financial Times and, before that, for the Los Angeles Times. Fatal System Error is his third book; he also wrote All the Rave: The Rise and Fall of Shawn Fanning’s Napster.

As a lead-in to his Open Group presentation, entitled “What You’re Up Against: Mobsters, Nation-States, and Blurry Lines,” Joe explores the current cyber-crime landscape, the underground cyber-gang movement, and the motive behind governments collaborating with organized crime in cyber space. The interview is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. The full podcast can be found here.

Here are some excerpts:

Gardner: Have we entered a new period where just balancing risks and costs isn’t a sufficient bulwark against burgeoning cyber crime?

Menn: Maybe you can make your enterprise a little trickier to get into than the other guy’s enterprise, but crime pays very, very well, and in the big picture, their ecosystem is better than ours. They do capitalism better than we do. They specialize to a great extent. They reinvest in R&D.

On our end, on the good guys’ side, it’s hard if you’re a chief information security officer (CISO) or a chief security officer (CSO) to convince the top brass to pay more. You don’t really know what’s working and what isn’t. You don’t know if you’ve really been had by something that we call advanced persistent threat (APT). Even the top security minds in the country can’t be sure whether they’ve been had or not. So it’s hard to know what to spend on.

More efficient

The other side doesn’t have that problem. They’re getting more efficient in the same way that they used to lead technical innovation. They’re leading economic innovation. The freemium model is best evidenced by crimeware kits like ZeuS, where you can get versions that are pretty effective and will help you steal a bunch of money for free. Then, if you like that, there are add-ons you pay extra for — the latest and greatest that are sure to get through the antivirus systems.

Gardner: When you say “they,” who are you really talking about?

Menn: They, the bad guys? It’s largely Eastern European organized crime. In some countries, they can be caught. In other countries they can’t be caught, and there really isn’t any point in trying.

It’s a geopolitical issue, which is something that is not widely understood, because in general, officials don’t talk about it. Working on my book, and in reporting for the newspapers, I’ve met really good cyber investigators for the Secret Service and the FBI, but I’ve yet to meet one that thinks he’s going to get promoted for calling a press conference and announcing that they can’t catch anyone.

So the State Department, meanwhile, keeps hoping that the other side is going to turn a new leaf, but they’ve been hoping that for 10 or more years, and it hasn’t happened. So it’s incumbent upon the rest of us to call a spade a spade here.

What’s really going on is that Russian intelligence and, depending on who is in office at a given time, Ukrainian authorities, are knowingly protecting some of the worst and most effective cyber criminals on the planet.

Gardner: And what would be their motivation?

Menn: As a starting point, the level of garden-variety corruption over there is absolutely mind-blowing. More than 50 percent of Russian citizens responding to one survey said that they had paid a bribe to somebody in the past 12 months. But it’s gone well beyond that.

The same resources, human and technical, that are used to rob us blind are also being used in what is fairly called cyber war. The same criminal networks that are after our bank accounts were, for example, used in denial-of-service (DOS) attacks on Georgian and Estonian websites belonging to government, major media, and banks.

It’s the same guys, and it’s a “look-the-other-way” arrangement: you can do whatever crime you want, and when we call upon you to serve Mother Russia, you will do so. And that has accelerated. Just in the past couple of weeks, with the disputed elections in Russia, you’ve seen mass DoS attacks against opposition websites, mainstream media websites, and LiveJournal. It’s a pretty handy tool to have at your disposal. In my book, I provide all the evidence that would be needed to convince reasonable people.

Gardner: In your book you use the phrase “bringing down the Internet.” Is this all really a threat to the integrity of the Internet?

Menn: Well, integrity is the key word there. No, I don’t think anybody is about to stop us all from the privilege of watching skateboarding dogs on YouTube. What I mean by that is the higher-trust use of the Internet: not the way it was designed, but the way it has come to be used now for online banking, e-commerce and, increasingly, for storing corporate and, heaven help us, government secrets in the cloud. That is in very, very great trouble.

Not a prayer

I don’t think that now you can even trust transactions not to be monitored and pilfered. The latest, greatest versions of ZeuS get past multi-factor authentication and are not detected by any antivirus that’s out there. So consumers don’t have a prayer, in the words of Art Coviello, CEO of RSA, and corporations aren’t doing much better.

So the way the Internet is being used now is in very, very grave trouble and not reliable. That’s what I mean by it. If they turned all the botnets in the world on a given target, that target is gone. Against multiple root servers and DNS, they could do some serious damage. I don’t know if they could stop the whole thing, but you’re right, they don’t want to kill the golden goose. I don’t see a motivation for that.

Gardner: If we look at organized crime in historical context, we find that there has been a lot of innovation over the decades. Is that playing out on the Internet as well?

Menn: Sure. The mob does well in any place where there is a market for something and no effective legal framework to sustain it – Prohibition back in the day, prostitution, gambling, and that sort of thing.

… The Russian and Ukrainian gangs went to extortion as an early model, and ironically, some of the first websites they extorted with this threat were the offshore gambling firms. They were cash rich, they had pretty weak infrastructure, and they were wary about going to the FBI. They started by attacking those sites in 2003-04 and then moved on to more garden-variety companies. Some of them paid up, and some said, “This is going to look a little awkward in our SEC filings,” and they didn’t pay.

Once the cyber gang got big enough, sooner or later, they also wanted the protection of traditional organized crime, because those people had better connections inside the intelligence agencies and the police force and could get them protection. That’s the way it worked. It was sort of an organic alliance, rather than “Let’s develop this promising area.”

… That is what happens. Initially it was garden-variety payoffs and protection. Then, around 2007, with the attack on Estonia, these guys started proving their worth to the Kremlin, and others saw that with the attacks that ran through their system.

This has continued to evolve very rapidly. Now DoS attacks are routinely used as a tool for political repression all around the world – in Vietnam, Iran, and elsewhere you’ll see critics silenced by DoS attacks. In most cases, it’s not the spy agencies themselves, but their contract agents. They just go to their friends in these gangs and say, “Hey, do this.” What’s interesting is that both Russia and China, which we haven’t talked about as much, are in this gray area now.

In China, hacking really started out as an expression of patriotism. Some of the biggest attacks, Code Red being one of them, were against targets in countries that were perceived to have slighted China or had run into some sort of territorial flap with China, and, lo and behold, they got hacked.

In the past several years, alongside this sort of patriotic hacking, the hacking of the Western defense establishment that we are finally reading a lot about, those same guys have gone off and decided to enrich themselves as well. There were actually disputes in some of the major Chinese hacking groups. Some people said it was unethical to just go after money, and some of these early groups split over that.

In Russia, it went the other way. It started out with just a bunch of greedy criminals, and then they said, “Hey — we can do even better and be protected. You have better protection if you do some hacking for the motherland.” In China, it’s the other way. They started out hacking for the motherland, and then added, “Hey — we can get rich while serving our country.”

So they’re both sort of in the same place, and unfortunately it makes it pretty close to impossible for law enforcement in [the U.S.] to do anything about it, because it gets into political protection. What you really need is White House-level dealing with this stuff. If President Obama is going to talk to his opposite numbers about Chinese currency, Russian support of something we don’t like, or oil policy, this has got to be right up there too — or nothing is going to happen at all.

Gardner: What about the pure capitalism side, stealing intellectual property (IP) and taking over products in markets with the aid of these nefarious means? How big a deal is this now for enterprises and commercial organizations?

Menn: It is much, much worse than anybody realizes. U.S. counterintelligence officials a few weeks ago finally put out a report saying that Russia and China are deliberately stealing our IP, the IP of our companies. That’s an open secret. It’s been happening for years. You’re right, the man in the street doesn’t realize this, because companies aren’t used to fessing up. Therefore, there is little outrage and little pressure for retaliation or diplomatic engagement on these issues.

I’m cautiously optimistic that that is going to change a little bit. This year the Securities and Exchange Commission (SEC) gave very detailed guidance about when you have to disclose that you’ve been hacked. If there is a material impact on your company, you have to disclose it, even if the details are unknown.

Gardner: So the old adage of shining a light on this probably is in everyone’s best interest. Is the message, then, that keeping this quiet isn’t necessarily the right way to go?

Menn: Not only is it not the right way to go, but it’s safer to come out of the woods and fess up now. The stigma is almost gone. If you really blow the PR like Sony, then you’re going to suffer some, but I haven’t heard a lot of people say, “Boy, Google is run by a bunch of stupid idiots. They got hacked by the Chinese.”

It’s the definition of an asymmetrical fight here. There is no company that’s going to stand up against the might of the Chinese military, and nobody is going to fault them for getting nailed. Where we should fault them is for covering it up.

I think you should give the American people some credit. They realize that you’re not the bad guy if you get nailed. As I said, nobody thinks that Google has a bunch of stupid engineers. It is somewhere between extremely difficult and impossible to ward off “zero-days” and the dedicated teams working on social engineering, because TCP/IP is fundamentally broken, and it ain’t your fault.

 [These threats] are an existential threat not only to your company, but to our country and to our way of life. It is that bad. One of the problems is that in the U.S., executives tend to think a quarter or two ahead. If your source code gets stolen, your blueprints get taken, nobody might know that for a few years, and heck, by then you’re retired.

With the new SEC guidelines and some national plans in the U.K. and in the U.S., that’s not going to cut it anymore. Executives will be held accountable. This is some pretty drastic stuff. The things that you should be thinking about, if you’re in an IT-based business, include figuring out the absolutely critical, crown-jewel one, two, or three percent of your stuff, and keeping it off network machines.

Short-term price

Gardner: So we have to think differently, don’t we?

Menn: Basically, regular companies have to start thinking like banks, and banks have to start thinking like intelligence agencies. Everybody has to level up here.

Gardner: What do the intelligence agencies have to start thinking about?

Menn: The discussions that are going on now obviously include greatly increased monitoring, pushing responsibility for spotting suspicious stuff down to private enterprise, and obviously greater information sharing between private enterprise and government officials.

But, there’s some pretty outlandish stuff that’s getting kicked around, including looking the other way if you, as a company, sniff something out in another country and decide to take retaliatory action on your own. There’s some pretty sea-change stuff that’s going on.

Gardner: So that would be playing offense as well as defense?

Menn: In the Defense Authorization Act that just passed, for the first time, Congress officially blesses offensive cyber-warfare, which is something we’ve already been doing, just quietly.

We’re entering some pretty new areas here. One of the things that’s going on is that the cyber-warfare stuff, which is happening, is basically run by intelligence folks, rather than by a bunch of lawyers worrying about collateral damage and the like, and there’s almost no oversight, because intelligence agencies in general get little oversight.

Gardner: Just quickly looking to the future, we have some major trends. We have an increased movement toward mobility, cloud, big data, social. How do these big shifts in IT impact this cyber security issue?

Menn: Well, there are some that are clearly dangerous, and there are some things that are a mixed bag. Certainly, the inroads of social networking into the workplace are bad from a security point of view. Perhaps worse is the consumerization of IT, the bring-your-own-device trend, which isn’t going to go away. That’s bad, although there are obviously mitigating things you can do.

The cloud itself is a mixed bag. Certainly, in theory, it could be made more secure than what you have on premises. If you’re turning it over to the very best of the very best, they can do a lot more things than you can in terms of protecting it, particularly if you’re a smaller business.

If you look at the large-scale banks and people with health records and that sort of thing that really have to be ultra-secure, they’re not going to do this yet, because the procedures are not really set up to their specs yet. That may well come in the future. But cloud security, in my opinion, is not there yet. So that’s a mixed blessing.

Radical steps

You need to think strategically about this, and that includes some pretty radical steps. There are those who say there are two types of companies out there — those that have been hacked and those that don’t know that they’ve been hacked.

Everybody needs to take a look at this stuff beyond their immediate corporate needs and think about where we’re heading as a society. And to the extent that people are already expert in this stuff, or can become expert in it, they need to share that knowledge. That will often mean saying, “Yes, we got hacked,” publicly, but it also means educating those around them about the severity of the threat.

One of the reasons I wrote my book, and spent years doing it, is not because I felt that I could tell every senior executive what they needed to do. I wanted to educate a broader audience, because there are some pretty smart people, even in Washington, who have known about this for years and have been unable to do anything about it. We haven’t really passed anything that’s substantial in terms of legislation.

As a matter of political philosophy, I feel that if enough people on the street realize what’s going on, then quite often leaders will get in front of them and at least attempt to do the right thing. Senior executives should be thinking about educating their customers, their peers, the general public, and Washington to make sure that the stuff that passes isn’t as bad as it might otherwise be.

************

If you are interested in attending The Open Group’s upcoming conference, please register here: http://www3.opengroup.org/event/open-group-conference-san-francisco/registration

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.


Filed under Cloud, Cybersecurity, Information security, Security Architecture

How to manage requirements within the Enterprise Architecture using the TOGAF® and SABSA® frameworks

By Pascal de Koning, KPN 

You want to put your company’s business strategy into action. What’s the best way to accomplish this? This can be done in a structured manner by using an Enterprise Architecture framework like TOGAF®. TOGAF® offers an overview of business and IT-related architectures, as well as a process model to deliver these, called the Architecture Development Method (ADM, figure 1).

As the figure shows, Requirements Management plays a central role in the architecture work in the TOGAF® methodology. It’s very important to know the business requirements, because these dictate what’s needed in the underlying architecture layer. In fact, this holds for every layer: each architecture layer fulfills the requirements that are defined in the layer above. Without proper Requirements Management, the whole architecture would be built on loose sand.

Unfortunately, although TOGAF® stresses the importance and central role of Requirements Management, it does not offer a way to actually do it. This is a gap in the TOGAF® ADM. To resolve this, a requirements management method is needed that is well described and flexible enough to use at all levels of the architecture. We found this in the SABSA® (Sherwood’s Applied Business-driven Security Architecture) framework. SABSA® offers the unique Business Attribute Profiling (BAP) technique as a means to effectively carry out Requirements Management.

Business Attribute Profiling is a requirements engineering technique that translates business goals and drivers into requirements (see figure 2). Some advantages of this technique are:

  • Executive communication in non-ICT terms
  • Grouping and structuring of requirements, keeping oversight
  • Traceability mapping between business drivers, requirements and capabilities

The BAP process decomposes the business goal into its core elements. Each core element is a single business attribute. Examples of business attributes are Available, Scalable, Supported, Confidential, Traceable, etc.
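To make this concrete, here is a minimal sketch (in Python) of how a Business Attribute Profile might be captured as a simple data structure. The goal, attribute definitions, measurement approaches, and targets below are illustrative assumptions only, not content from the SABSA® standard; the point is simply to show a business goal being decomposed into measurable business attributes.

    # Illustrative only: attribute names, definitions, and targets are hypothetical,
    # not taken from the SABSA(R) standard.
    from dataclasses import dataclass, field

    @dataclass
    class BusinessAttribute:
        name: str          # e.g. "Available", "Confidential"
        definition: str    # what the attribute means for this organization
        measurement: str   # how fulfilment will be measured
        target: str        # the performance target agreed with the business

    @dataclass
    class BusinessAttributeProfile:
        business_goal: str
        attributes: list[BusinessAttribute] = field(default_factory=list)

    profile = BusinessAttributeProfile(
        business_goal="Offer retail banking services over the Internet",
        attributes=[
            BusinessAttribute(
                name="Available",
                definition="Customers can reach the service whenever they need it",
                measurement="Measured monthly uptime of the online channel",
                target="At least 99.9% per month",
            ),
            BusinessAttribute(
                name="Confidential",
                definition="Customer data is disclosed only to authorized parties",
                measurement="Number of confirmed unauthorized disclosures",
                target="Zero incidents per year",
            ),
        ],
    )

    for attribute in profile.attributes:
        print(f"{attribute.name}: {attribute.target}")

Decomposing a goal this way keeps the conversation with executives in business terms, while still producing requirements that architects can trace and measure.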

As business processes become more Internet-based, cyber security becomes more important every day, because those processes are increasingly exposed to forces outside the business. Organizations must now consider not only the processes and requirements when planning an architecture, but also the security of that architecture. A Security Architecture consists of all the security-related drivers, requirements, services and capabilities within the Enterprise. With the adoption of the Business Attribute Profiling technique for Requirements Management, it is now possible to integrate information security into the Enterprise Architecture.
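As a rough illustration of how that integration and the traceability mapping mentioned above might look in practice, the sketch below records hypothetical links from a business driver, through a business attribute, to the requirements and security capabilities that fulfil it, and then traces a capability back to the driver it supports. All driver, requirement, and capability names here are assumptions made for the example only.

    # Hypothetical traceability records: (driver, attribute, requirement, capability).
    # None of these names come from TOGAF(R) or SABSA(R); they are illustrative only.
    traceability = [
        ("Protect customer trust", "Confidential",
         "Encrypt customer data in transit", "TLS termination service"),
        ("Protect customer trust", "Confidential",
         "Restrict access to customer records", "Role-based access control"),
        ("Serve customers around the clock", "Available",
         "Survive the loss of a single data center", "Active-active deployment"),
    ]

    def drivers_for_capability(capability: str):
        """Trace a deployed capability back to the business drivers it supports."""
        for driver, attribute, requirement, cap in traceability:
            if cap == capability:
                yield driver, attribute, requirement

    for hit in drivers_for_capability("Role-based access control"):
        print(hit)

A mapping like this is what lets an architect answer, in business terms, why a given security capability exists and what is at stake if it is removed.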

The TOGAF®-SABSA® Integration white paper elaborates on this and provides a guide that describes how TOGAF® and SABSA® can be combined such that the SABSA® business risk-driven security architecture approach is seamlessly integrated into a TOGAF®-based enterprise architecture. It can be downloaded from https://www2.opengroup.org/ogsys/jsp/publications/PublicationDetails.jsp?publicationid=12449

TOGAF® is a registered trademark of The Open Group.  SABSA® is a registered trademark of The SABSA Institute.

Pascal de Koning MSc CISSP is a Senior Business Consultant with KPN Trusted Services, where he leads the security consulting practice. He is chairman of The Open Group TOGAF-SABSA Integration Working Group. He has worked on information security projects for the Dutch central government, European Union and KPN, to name just a few. Pascal has written articles for Computable and PvIB, and is a frequent speaker at conferences like RSA Europe and COSAC on the topics of Cyber Security and Enterprise Security Architecture. When not working, Pascal loves to go running.


Filed under Enterprise Architecture, Security Architecture, TOGAF®

The Open Group and SABSA Institute Publish TOGAF® Integration Whitepaper

By Jim Hietala, Vice President, Security, The Open Group

2011 confirmed what many in the Enterprise Architecture industry have feared – data breaches are on the rise. It’s not just the number and cost of data breaches, but the sheer volume of information that cyber criminals are able to get their hands on. Today’s organizations cannot risk being vulnerable.

To help address this issue, The Open Group Security and Architecture Forums and the SABSA® Institute, developers of the SABSA® security and risk management framework, joined forces to explore how security methodologies and risk management approaches can be integrated with enterprise-level architectures for better protection and flexibility.

If you are an enterprise architect responsible for ensuring architectures are secure, or a security professional tasked with developing secure architectures, you’ll be interested in the work the Architecture Forum and SABSA® have done over the last 15 months, culminating in a whitepaper released today that provides a valuable contribution to the security and enterprise architecture communities.

A Project Designed to Protect

All too often, vulnerabilities occur due to a lack of alignment across organizations, with security and IT experts considering different parts of the infrastructure separately rather than the whole together.

The impetus for this project came from large enterprises and consulting organizations that frequently saw TOGAF® being used as a tool for developing enterprise architecture, and SABSA® as a tool for creating security architectures. Practitioners of either TOGAF® or SABSA® asked for guidance on how best to align these frameworks in practical usage, and on how to re-use artifacts from each.

This quote from the whitepaper sums up the rationale for the effort best:

 “For too long, information security has been considered a separate discipline, isolated from the enterprise architecture. This Whitepaper documents an approach to enhance the TOGAF® enterprise architecture methodology with the SABSA® security architecture approach and thus create one holistic architecture methodology.”

The vision for the project has been to support enterprise architects who need to take operational risk management into account, by providing guidance describing how TOGAF® and SABSA® can be combined such that the SABSA® business risk and opportunity-driven security architecture approach can be seamlessly integrated into the TOGAF® business strategy-driven approach to develop a richer, more complete enterprise architecture.

There are two important focal points for this effort. The first is to provide a practical approach for seamlessly integrating SABSA® security requirements and services in common TOGAF®-based architecture engagements, instead of treating security as a separate entity within the architecture.

The second focal point is to illustrate how the requirements management processes in TOGAF® can be fulfilled in their widest generic sense (i.e., not only with regard to security architecture) by application of the SABSA® concept of Business Attribute Profiling to the entire ADM process.

Download a free copy of the TOGAF® and SABSA® whitepaper here.

If you are interested in exploring TOGAF® 9, online access to the framework is available here.

Information on SABSA® may be obtained here.

A large number of individuals participated in the development of this valuable resource. Thank you to all project team members who made this effort a reality, including from the SABSA® Institute, the Open Group Architecture Forum, and the Open Group Security Forum!


Filed under Enterprise Architecture, Security Architecture, TOGAF®

The Open Group updates Enterprise Security Architecture, guidance and reference architecture for information security

By Jim Hietala, The Open Group

One of two key focus areas for The Open Group Security Forum is security architecture. The Security Forum has several ongoing projects in this area, including our TOGAF® and SABSA integration project, which will produce much needed guidance on how to use these frameworks together.

When the Network Application Consortium ceased operating a few years ago, The Open Group agreed to bring the intellectual property from the organization into our Security Forum, along with extending membership to the former NAC members. While the NAC did great work in information security, one publication from the NAC stood out as a highly valuable resource. This document, Enterprise Security Architecture (ESA), A Framework and Template for Policy-Driven Security, was originally published by the NAC in 2004, and provided valuable guidance to IT architects and security architects. At the time it was first published, the ESA document filled a void in the IT security community by describing important information security functions, and how they related to each other in an overall enterprise security architecture. ESA was at the time unique in describing information security architectural concepts, and in providing examples in a reference architecture format.

The IT environment has changed significantly over the past several years since the original publication of the ESA document. Major changes that have affected information security architecture in this time include the increased usage of mobile computing devices, increased need to collaborate (and federation of identities among partner organizations), and changes in the threats and attacks.

Members of the Security Forum, having realized the need to revisit the document and update its guidance to address these changes, have significantly rewritten the document to provide new and revised guidance. Significant changes to the ESA document have been made in the areas of federated identity, mobile device security, designing for malice, and new categories of security controls including data loss prevention and virtualization security.

In keeping with the many changes to our industry, The Open Group Security Forum has now updated and published a significant revision to the Enterprise Security Architecture (O-ESA), which you can access and download (for free, minimal registration required) here; or purchase a hardcover edition here.

Our thanks to the many members of the Security Forum (and former NAC members) who contributed to this work, and in particular to Stefan Wahe who guided the revision, and to Gunnar Peterson, who managed the project and provided significant updates to the content.

An IT security industry veteran, Jim Hietala is Vice President of Security at The Open Group, where he is responsible for security programs and standards activities. He holds the CISSP and GSEC certifications. Jim is based in the U.S.


Filed under Security Architecture