Monthly Archives: December 2013

ArchiMate® 2 Certification reaches the 1000th certification milestone

By Andrew Josey, The Open Group

We’re pleased to announce that the ArchiMate Certification for People program has reached the significant milestone of 1,000 individual certifications, with individuals certified in 30 different countries, as shown in the world map below.

[Figure: ArchiMate 1000 world map showing certified individuals by country]

The top 10 countries are:

Netherlands: 458 (45.8%)
UK: 104 (10.4%)
Belgium: 76 (7.6%)
Australia: 35 (3.5%)
Germany: 32 (3.2%)
Norway: 30 (3.0%)
Sweden: 30 (3.0%)
USA: 27 (2.7%)
Poland: 16 (1.6%)
Slovakia: 13 (1.3%)

The vision for the ArchiMate 2 Certification Program is to define and promote a market-driven education and certification program to support the ArchiMate modeling language standard.

More information on the program is available at the ArchiMate 2 Certification site at http://www.opengroup.org/certifications/archimate/

Details of the ArchiMate 2 Examinations are available at: http://www.opengroup.org/certifications/archimate/docs/exam

The calendar of Accredited ArchiMate 2 Training courses is available at: http://www.opengroup.org/archimate/training-calendar/

The ArchiMate 2 Certification register can be found at https://archimate-cert.opengroup.org/certified-individuals

ArchiMate is a registered trademark of The Open Group.

 Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Filed under ArchiMate®, Certifications, Enterprise Architecture

Measuring the Immeasurable: You Have More Data Than You Think You Do

By Jim Hietala, Vice President, Security, The Open Group

According to a recent study by the Ponemon Institute, the average U.S. company experiences more than 100 successful cyber-attacks each year, at a cost of $11.6M. The study also found that enabling security technologies can reduce those losses by nearly $4M, and that instituting security governance reduces costs by an average of $1.5M.

In light of mounting attacks and security breaches, executives are increasingly asking security and risk professionals to provide analyses of individual company risk and loss estimates. For example, the U.S. healthcare sector has for some time been required by the HIPAA Security Rule to perform annual risk assessments. The recent HITECH Act also added security breach notification and disclosure requirements, increased enforcement in the form of audits, and increased penalties in the form of fines. Despite these federal requirements, the prospect of measuring risk and doing risk analyses can be a daunting task, one that leaves even the best of us with a case of “analysis paralysis.”

Many IT experts agree that we are nearing a time when risk analysis is not only the norm, but when those risk figures may well be used to cast blame (or as part of a defense in a lawsuit) if and when catastrophic security breaches cost consumers, investors and companies significant losses.

In the past, many companies have been reluctant to perform risk analyses because of the perception that IT security risk is intangible and therefore too difficult to measure. But if IT departments may soon be held accountable for breaches, wouldn’t you want to be able to determine your risk and the threats potentially facing your organization?

In his book, How to Measure Anything, father of Applied Information Economics Douglas Hubbard points out that immeasurability is an illusion and that organizations do, in fact, usually have the information they need to create good risk analyses. Part of the misperception of immeasurability stems from a lack of understanding of what measurement is actually meant to be. According to Hubbard, most people, and executives in particular, expect measurement and analysis to produce an “exact” number—as in, “our organization has a 64.5 percent chance of having a denial of service attack next year.”

Hubbard argues that, as risk analysts, we need to look at measurement the way scientists do: measurement is meant to reduce uncertainty about a quantity based on observation, not to produce certainty. Proper measurement should not produce an exact number, but rather a range of possibility, as in “our organization has a 30-60 percent chance of having a denial of service attack next year.” Realistic measurement of risk is far more likely when risk is expressed as a probability distribution with a range of outcomes than as a single number or single outcome.
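To make the idea of a range concrete, here is a minimal sketch (our illustration, not Hubbard’s method or anything taken from the O-RA standard) of how two assumed calibrated 90% ranges, one for the number of attack attempts per year and one for the chance that an attempt succeeds, might be combined with simple Monte Carlo sampling to produce a range of probabilities rather than a single figure. All interval values, names and the uniform sampling are illustrative assumptions.

    import random

    # Illustrative sketch only: turn two assumed calibrated 90% ranges into a
    # *range* of probabilities for "at least one successful denial of service
    # attack next year", instead of a single point estimate.
    ATTEMPTS_RANGE = (2, 12)       # assumed 90% interval: attack attempts per year
    SUCCESS_RANGE = (0.05, 0.25)   # assumed 90% interval: chance an attempt succeeds

    def sample_annual_probability(rng: random.Random) -> float:
        """Draw one plausible world and return P(at least one success that year)."""
        attempts = rng.uniform(*ATTEMPTS_RANGE)
        p_success = rng.uniform(*SUCCESS_RANGE)
        return 1.0 - (1.0 - p_success) ** attempts

    def probability_range(trials: int = 50_000, seed: int = 42):
        rng = random.Random(seed)
        draws = sorted(sample_annual_probability(rng) for _ in range(trials))
        # Report the middle 90% of simulated outcomes as the estimated range.
        return draws[int(0.05 * trials)], draws[int(0.95 * trials)]

    if __name__ == "__main__":
        low, high = probability_range()
        print(f"Estimated chance of a successful attack next year: {low:.0%} to {high:.0%}")

Reporting the 5th to 95th percentile of the simulated outcomes is one simple way to arrive at the kind of “30-60 percent” statement described above.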

The problem that most often produces “analysis paralysis” is not just the question of how to derive those numbers but also how to get to the information that will help produce those numbers. If you’ve been tasked, for instance, with determining the risk of a breach that has never happened to your organization before, perhaps a denial of service attack against your web presence, how can you make an accurate determination about something that hasn’t happened in the past? Where do you get your data to do your analysis? How do you model that analysis?

In an article published in CSO Magazine, Hubbard argues that organizations have far more data than they think, and that they need less data than they believe in order to do proper analyses. Hubbard says that IT departments, in particular, have become so used to having information stored in databases they can easily query that they forget there are many other sources to gather data from. Just because something hasn’t happened yet, and you haven’t been gathering historical data on it and socking it away in your database, doesn’t mean you have no data or that you can’t find what you need to measure your risk. Even in the age of Big Data, there is plenty of useful data outside of the big database.

You will still need to gather that data, but only enough to be able to measure it accurately, not necessarily precisely. In our recently published Open Group Risk Assessment Standard (O-RA), this is called calibration of estimates. Calibration provides a method for making good estimates, which are necessary for deriving a measured range of probability for risk. Section 3 of the O-RA standard provides a comprehensive look at how best to come up with calibrated estimates, as well as how to determine other risk factors using the FAIR (Factor Analysis of Information Risk) model.
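As a rough illustration of how calibrated ranges can feed a loss estimate, the sketch below combines two assumed ranges in the spirit of FAIR’s top-level factors, loss event frequency and loss magnitude, into a distribution of annual loss. It is not the O-RA calibration procedure itself; the interval values, uniform sampling and percentile choices are placeholders for the example.

    import random

    # Illustrative sketch only: combine assumed ranges for loss event frequency
    # and loss magnitude into a distribution of annual loss, reported as percentiles.
    LEF_RANGE = (0.5, 4.0)                # assumed 90% interval: loss events per year
    MAGNITUDE_RANGE = (50_000, 400_000)   # assumed 90% interval: loss per event, USD

    def simulate_annual_loss(rng: random.Random) -> float:
        """One simulated year: draw an event count, then sum per-event losses."""
        frequency = rng.uniform(*LEF_RANGE)
        # Convert the sampled frequency into a whole number of events for the year.
        events = int(frequency) + (1 if rng.random() < frequency - int(frequency) else 0)
        return sum(rng.uniform(*MAGNITUDE_RANGE) for _ in range(events))

    def loss_percentiles(trials: int = 100_000, seed: int = 7) -> dict:
        rng = random.Random(seed)
        losses = sorted(simulate_annual_loss(rng) for _ in range(trials))

        def pick(q):
            return losses[int(q * (trials - 1))]

        return {"p10": pick(0.10), "p50": pick(0.50), "p90": pick(0.90)}

    if __name__ == "__main__":
        for name, value in loss_percentiles().items():
            print(f"{name}: ${value:,.0f}")

In practice, Open FAIR practitioners would use calibrated expert estimates and more suitable distributions (for example, PERT) rather than the uniform draws assumed here, but the overall structure is the same: ranges go in, and a distribution of outcomes comes out.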

So where do you get your data if it’s not already stored and easily accessible in a database? There are numerous sources you can turn to, both external and internal. You just have to do the research to find them. For example, even if your company hasn’t experienced a denial of service attack, many others have—what was their experience when it happened? This information is out there online—you just need to search for it. Industry reports are another source of information. Verizon, for one, publishes its annual Data Breach Investigations Report. DatalossDB publishes an open data breach incident database that provides information on data loss incidents worldwide. Many vendors publish annual security reports and issue regular security advisories. Security publications and analyst firms such as CSO, Gartner, Forrester and Securosis all publish research reports from which data can be gleaned.

Then there’s your internal information. Chances are your IT department has records you can use—they likely count how many laptops are lost or stolen each year. You should also look to the experts within your company to help. Other people can provide a wealth of valuable information for use in your analysis. You can also look to the data you do have on related or similar attacks as a gauge.

Chances are, you already have the data you need or you can easily find it online. Use it.

With the ever-growing list of threats and risks organizations face today, we are fast reaching a time when failing to measure risk will no longer be acceptable—in the boardroom or even by governments.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Cybersecurity, Data management, Information security, Open FAIR Certification, RISK Management, Uncategorized

ArchiMate® 2.1 Specification Maintenance Release

By Andrew Josey, The Open Group

We’re pleased to announce the latest release of the ArchiMate modeling language specification.

ArchiMate® 2.1, an Open Group standard, is an updated release of the ArchiMate Specification, addressing comments raised since the introduction of Issue 2.0 in 2012. It retains the major features and structure of ArchiMate 2.0, adding further detail and clarification, thereby preserving existing investment in the ArchiMate modeling language. In this blog, we take a brief look at what has changed[1].

The changes in this release are as follows:

  1. Additional explanatory text has been added in section 2.6 describing the ArchiMate Framework, its layers and aspects.
  2. Corrections have been made to figures throughout the specification for consistency with the text, including metamodel diagrams, concept diagrams and example models.
  3. An explanation has been added describing the use of colors within the specification. This makes it clear that the metamodel diagrams use colors to distinguish the different aspects of the ArchiMate Framework, and that within the models there are no formal semantics assigned to colors.
  4. Within the three layers, the concepts are now classified according to the aspects of the ArchiMate Framework: Active Structure Concepts (instead of Structural Concepts), Behavioral Concepts, and Passive Structure Concepts (instead of Informational Concepts).
  5. Duplicate text has been removed from the layers (for example, Meaning was defined in both Section 3.4 and Section 3.4.2).
  6. In the Layers, a number of concept diagrams have been corrected to show all the permitted symbols for the concept; for example, Business Interface, Application Service, and Infrastructure Service.
  7. In the Architecture Viewpoints, the aspects for each viewpoint are now classified as per the ArchiMate Framework into Active Structure, Behavior, or Passive Structure.
  8. In the Architecture Viewpoints, a number of Concepts and Relationships diagrams have been updated to correct the relationships shown; similarly, a number of example diagrams have been corrected (for example, the use of a Communication Path to connect two nodes).
  9. In the Language Extension Mechanisms chapter, it has been made clear that specialization can also be applied to Relationships.
  10. In the Motivation Extension, it has been made clear that the association relationship can be used to connect motivation elements.
  11. The status of the appendices has been made clear; Appendix A is informative, whereas Appendix B is normative.
  12. Appendix B, the Relationship Tables, has had a number of corrections applied.

More information on the ArchiMate 2.1 Specification, including additional resources, can be obtained from The Open Group website here: http://www.opengroup.org/subjectareas/enterprise/archimate

[1] A detailed listing of the changes is available separately as Document U132, ArchiMate® 2.0 Technical Corrigendum 1: http://www.opengroup.org/bookstore/catalog/U132

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Filed under ArchiMate®, Enterprise Architecture