
How the Open Trusted Technology Provider Standard (O-TTPS) and Accreditation Will Help Lower Cyber Risk

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

Changing business dynamics and enabling technologies

In 2008, IBM introduced the concept of a “Smarter Planet.” The Smarter Planet initiative focused, in part, on the evolution of globalization against the backdrop of changing business dynamics and enabling technologies. A key concept was the need for infrastructure to be tightly integrated, interconnected, and intelligent, thereby facilitating collaboration between people, government and businesses in order to meet the world’s growing appetite for data and automation. Since then, many industries and businesses have adopted this approach, including the ICT (information and communications technology) industries that support the global technology manufacturing supply chain.

Intelligent and interconnected critical systems

This transformation has infused technology into virtually all aspects of our lives, and involves, for example, government systems, the electric grid and healthcare. Most of these technological solutions are made up of hundreds or even thousands of components that are sourced from the growing global technology supply chain.

In the global technology economy, no single technology vendor or integrator can provide a complete solution on its own. It is no longer cost-competitive to design all of the electronic components, printed circuit boards, card assemblies, or other sub-assemblies in-house. Adapting to a changing marketplace by balancing response time against cost efficiency drives ever wider use of OEM (original equipment manufacturer) products.

As a result, most technology providers procure from a myriad of global component suppliers, who very often rely on similarly complex supply chains to source their own components. Every enterprise has a supplier network, each supplier has its own supply chain, and those sub-tier suppliers have supply chains of their own. The resulting technology supply chain is a deep network of integrated suppliers.

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Cybersecurity by design: Addressing risk in a sustainable way across the ecosystem

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fit for purpose is manifested in two essential ways:

- Does the product meet essential functional requirements?
- Has the product or component been produced by a trustworthy provider?

Of course, the leaders or owners of these systems have to do their part to achieve security and safety: e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats. Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

Assuring the quality and integrity of mission-critical technology

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

The very process of manufacturing technology is not immune to cyber-attack. Supply chain attacks are typically motivated by monetary gain: either inflicting massive economic damage in pursuit of global economic advantage, or seeding targets with malware that gives attackers unfettered access.

It is for this reason that the global technology manufacturing industry must establish practices that mitigate this risk by raising the cost barriers to launching such attacks and increasing the likelihood that they are caught before their effects become irreversible. As these threats evolve, the global ICT industry must deploy enhanced security through advanced, automated cyber intelligence analysis. As critical infrastructure becomes more automated, integrated and essential to critical functions, the technology supply chain that surrounds it must be treated as a principal theme of the overall global security and risk mitigation strategy.

A global, agile, and scalable approach to supply chain security

Certainly, the manner in which technologies are invented, produced, and sold demands a global, agile, and scalable approach to supply chain assurance. Any technology supply chain security standard that hopes to be widely adopted must be flexible and country-agnostic. The very nature of the global supply chain (massively segmented and diverse) requires an approach that provides practicable guidance while avoiding being overly prescriptive. Such an approach requires aggregating industry practices that have proven beneficial and effective at mitigating risk.

The OTTF (The Open Group Trusted Technology Forum) is an increasingly recognized and promising industry initiative to establish best practices that mitigate the risk of technology supply chain attacks. Facilitated by The Open Group, a recognized international standards and certification body, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Current membership includes many of the most well-known technology vendors, integrators, and technology assessment laboratories.

The benefits of O-TTPS for governments and enterprises

IBM is currently a member of the OTTF and has been honored to hold the Chair for the last three years.  Governments and enterprises alike will benefit from the work of the OTTF. Technology purchasers can use the Open Trusted Technology Provider™ Standard (O-TTPS) and Framework best-practice recommendations to guide their strategies.

A wide range of technology vendors can use O-TTPS approaches to build security and integrity into their end-to-end supply chains. The first version of the O-TTPS is focused on mitigating the risk of maliciously tainted and counterfeit technology components or products. Note that a maliciously tainted product is one that has been produced by the provider and acquired through reputable channels, but which has been tampered with maliciously. A counterfeit product is one produced other than by or for the provider, or supplied through a non-reputable channel, and represented as legitimate. The OTTF is currently working on a program that will accredit technology providers who conform to the O-TTPS. IBM expects to complete pilot testing of the program by 2014.

IBM has actively supported the formation of the OTTF and the development of the O-TTPS for several reasons. These include but are not limited to the following:

- The Forum was established within a trusted and respected international standards body – The Open Group.
- The Forum was founded, in part, through a true public-private partnership in which government members actively participate.
- The OTTF membership includes some of the most mature and trusted commercial technology manufacturers and vendors. A primary objective of the OTTF has been harmonization with other standards groups such as ISO (International Organization for Standardization) and Common Criteria.

The O-TTPS defines a framework of organizational guidelines and best practices that enhance the security and integrity of COTS ICT. The first version of the O-TTPS is focused on mitigating certain risks of maliciously tainted and counterfeit products within the technology development / engineering lifecycle. These best practices are equally applicable for systems integrators; however, the standard is intended to primarily address the point of view of the technology manufacturer.

O-TTPS requirements

The O-TTPS requirements are divided into three categories:

1. Development / Engineering Process and Method
2. Secure Engineering Practices
3. Supply Chain Security Practices

The O-TTPS is intended to establish a normalized set of criteria against which a technology provider, component supplier, or integrator can be assessed. The standard is divided into categories that define best practices for engineering development practices, secure engineering, and supply chain security and integrity intended to mitigate the risk of maliciously tainted and counterfeit components.
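To make the categories concrete, here is a small, purely hypothetical sketch of how an assessor might track evidence against them. The three category names come from the standard; the data structure, helper function and example entries are our illustration, not part of the O-TTPS or its accreditation program.

```python
from collections import defaultdict

# Hypothetical sketch: the category names are from the O-TTPS; everything
# else (the structure, helper, and example evidence) is illustrative only.
evidence = defaultdict(list)  # category -> list of (practice, conforms?)

def record(category: str, practice: str, conforms: bool) -> None:
    """Record an assessed practice under one of the O-TTPS categories."""
    evidence[category].append((practice, conforms))

record("Supply Chain Security Practices",
       "components sourced only from vetted suppliers", True)
record("Secure Engineering Practices",
       "threat modeling performed for each release", False)

for category, items in evidence.items():
    met = sum(1 for _, ok in items if ok)
    print(f"{category}: {met}/{len(items)} assessed practices conform")
```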

The accreditation program

As part of the process for developing the accreditation criteria and policy, the OTTF established a pilot accreditation program. The purpose of the pilot was to take a handful of companies through the accreditation process and remediate any potential process or interpretation issues. IBM participated in the O-TTPS accreditation pilot to accredit a very significant segment of its software product portfolio: the Application Infrastructure Middleware Division (AIM), which includes the flagship WebSphere product line. The AIM pilot started in mid-2013, completed in the first week of 2014, and was formally recognized as accredited in the first week of February 2014.

IBM is currently leveraging the value of the O-TTPS and working to accredit additional development organizations. Some of the lessons learned during the IBM AIM initial O-TTPS accreditation include:

- Conduct a pre-assessment against the O-TTPS before formally entering accreditation. This allows for remediation of any gaps, and reduces assessment costs and schedule.
- Start with a segment of your development portfolio that has mature secure engineering practices and processes. This helps an organization address accreditation requirements and facilitates interactions with the third-party lab.
- Use your first successful O-TTPS accreditation to create templates that will help drive data gathering and validate practices, establishing a repeatable process as your organization undertakes additional accreditations.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM’s industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). Andras holds undergraduate degrees in Biology and Computer Science and a Master’s Degree in Computer Science from James Madison University. He has been a driving force behind IBM’s adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.


Filed under Accreditations, Cybersecurity, government, O-TTF, O-TTPS, OTTF, RISK Management, Standards, supply chain, Supply chain risk

The Open Group San Francisco 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Day two, February 4th, of The Open Group San Francisco conference kicked off with a welcome and opening remarks from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects.

Nunn introduced Allen Brown, President and CEO of The Open Group, who provided highlights from The Open Group’s last quarter.  As of Q4 2013, The Open Group had 45,000 individual members in 134 countries hailing from 449 member companies in 38 countries worldwide. Ten new member companies have already joined The Open Group in 2014, and 24 members joined in the last quarter of 2013, with the first member company joining from Vietnam. In addition, 6,500 individuals attended events sponsored by The Open Group in Q4 2013 worldwide.

Updates were provided on The Open Group’s ongoing work, including the FACE™ Consortium, DirectNet® Waveform Standard, Architecture Forum, ArchiMate® Forum, Open Platform 3.0™ Forum and Security Forum.

Of note was the ongoing development of TOGAF® and the introduction of a three-volume work comprising individual volumes outlining the TOGAF framework, guidance, and tools and techniques for the standard, as well as collaborative work that allows the ArchiMate® modeling language to be used for risk management in enterprise architectures.

In addition, the Open Platform 3.0 Forum has already put together 22 business use cases outlining ROI and business value for various uses related to technology convergence. The Cloud Work Group’s Cloud Reference Architecture has also been submitted to ISO for international standards certification, and the Security Forum has introduced the Open FAIR certification program for individuals in risk management.

The morning plenary centered on The Open Group’s Dependability through Assuredness™ (O-DA) Framework, which was released last August.

Speaking first about the framework was Dr. Mario Tokoro, Founder and Executive Advisor for Sony Computer Science Laboratories. Dr. Tokoro gave an overview of the Dependable Embedded OS project (DEOS), a large national project in Japan originally intended to strengthen the country’s embedded systems. After considerable research, the project leaders discovered they needed to consider whether large, open systems could be dependable when it came to business continuity, accountability and ensuring consistency throughout the systems’ lifecycle. Because the boundaries of large open systems are ever-changing, the project leaders knew they must put together dependability requirements that could accommodate constant change, allow for continuous service and provide continuous accountability for the systems based on consensus. As a result, they put together a framework to address both the change accommodation cycle and failure response cycles for large systems – this framework was donated to The Open Group’s Real-Time Embedded Systems Forum and released as the O-DA standard.

Dr. Tokoro’s presentation was followed by a panel discussion on the O-DA standard. Moderated by Dave Lounsbury, VP and CTO of The Open Group, the panel included Dr. Tokoro; Jack Fujieda, Founder and CEO of ReGIS, Inc.; T.J. Virdi, Senior Enterprise IT Architect at Boeing; and Bill Brierly, Partner and Senior Consultant, Conexiam. The panel discussed the importance of openness for systems, reiterating the conference theme of boundaries, and the realities of having standards that can ensure openness and dependability at the same time. They also discussed how the O-DA standard provides end-to-end requirements for system architectures that account for both accommodating changes within a system and maintaining accountability for it.

Lounsbury concluded the track by reiterating that assuring systems’ dependability is fundamental not only to The Open Group mission of Boundaryless Information Flow™ and interoperability but also to preventing large system failures.

Tuesday’s late morning sessions were split into two tracks, with one track continuing the Dependability through Assuredness theme hosted by Joe Bergmann, Forum Chair of The Open Group’s Real-Time and Embedded Systems Forum. In this track, Fujieda and Brierly furthered the discussion of O-DA, outlining the philosophy and vision behind the standard, as well as providing a roadmap for it.

In the morning Business Innovation & Transformation track, Alan Hakimi, Consulting Executive, Microsoft, presented “Zen and the Art of Enterprise Architecture: The Dynamics of Transformation in a Complex World.” Hakimi emphasized that transformation needs to focus on a holistic view of an organization’s ecosystem and motivations, economics, culture and existing systems to help foster real change. Drawing on Buddhist philosophy, he presented an eightfold path to transformation that can allow enterprise architects to approach transformation and discuss it with other architects and business constituents in a way that is meaningful to them and allows for complexity and balance.

This was followed by “Building the Knowledge-Based Enterprise,” a session given by Bob Weisman, Head Management Consultant for Build the Vision.

Tuesday’s afternoon sessions centered on a number of topics, including Business Innovation and Transformation, Risk Management, ArchiMate®, TOGAF tutorials and case studies, and Professional Development.

In the ArchiMate® track, Vadim Polyakov of Inovalon, Inc., presented “Implementing an EA Practice in an Agile Enterprise,” a case study centered on how his company integrated its enterprise architecture with the principles of agile development and how it customized the ArchiMate framework as part of the process.

The Risk Management track featured William Estrem, President, Metaplexity Associates, and Jim May of Windsor Software discussing how the Open FAIR Standard can be used in conjunction with TOGAF 9.1 to enhance risk management in organizations in their session, “Integrating Open FAIR Risk Analysis into the Enterprise Architecture Capability.” Jack Jones, President of CXOWARE, also discussed the best ways for “Communicating the Value Proposition” for cohesive enterprise architectures to business managers using risk management scenarios.

The plenary sessions and many of the track sessions from today’s tracks can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day culminated with dinner and a Lion Dance performance in honor of Chinese New Year performed by Leung’s White Crane Lion & Dragon Dance School of San Francisco.

We would like to express our gratitude for the support of our sponsors: BIZZDesign, Corso, Good e-Learning, I-Server and Metaplexity Associates.


O-DA standard panel discussion with Dave Lounsbury, Bill Brierly, Dr. Mario Tokoro, Jack Fujieda and TJ Virdi


Filed under Conference, Enterprise Architecture, Enterprise Transformation, Standards, TOGAF®, Uncategorized

Measuring the Immeasurable: You Have More Data Than You Think You Do

By Jim Hietala, Vice President, Security, The Open Group

According to a recent study by the Ponemon Institute, the average U.S. company experiences more than 100 successful cyber-attacks each year at a cost of $11.6M. By enabling security technologies, those companies can reduce losses by nearly $4M, and by instituting security governance they can reduce costs by an average of $1.5M.

In light of increasing attacks and security breaches, executives are increasingly asking security and risk professionals to provide analyses of individual company risk and loss estimates. For example, the U.S. healthcare sector has been required by the HIPAA Security rule to perform annual risk assessments for some time now. The recent HITECH Act also added security breach notification and disclosure requirements, increased enforcement in the form of audits and increased penalties in the form of fines. Despite federal requirements, the prospect of measuring risk and doing risk analyses can be a daunting task that leaves even the best of us with a case of “analysis paralysis.”

Many IT experts agree that we are nearing a time where risk analysis is not only becoming the norm, but when those risk figures may well be used to cast blame (or be used as part of a defense in a lawsuit) if and when there are catastrophic security breaches that cost consumers, investors and companies significant losses.

In the past, many companies have been reluctant to perform risk analyses due to the perception that measuring IT security risk is too difficult because it’s intangible. But if IT departments could soon become accountable for breaches, don’t you want to be able to determine your risk and the threats potentially facing your organization?

In his book How to Measure Anything, Douglas Hubbard, the father of Applied Information Economics, points out that immeasurability is an illusion and that organizations do, in fact, usually have the information they need to create good risk analyses. Part of the misperception of immeasurability stems from a lack of understanding of what measurement is actually meant to be. According to Hubbard, most people, and executives in particular, expect measurement and analysis to produce an “exact” number—as in, “our organization has a 64.5 percent chance of having a denial of service attack next year.”

Hubbard argues that, as risk analysts, we need to look at measurement more like how scientists look at things—measurement is meant to reduce uncertainty—not to produce certainty—about a quantity based on observation.  Proper measurement should not produce an exact number, but rather a range of possibility, as in “our organization has a 30-60 percent chance of having a denial of service attack next year.” Realistic measurement of risk is far more likely when expressed as a probability distribution with a range of outcomes than in terms of one number or one outcome.
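To make that concrete, here is a minimal sketch (our illustration, not Hubbard’s or The Open Group’s) of treating the chance of an attack as a calibrated range rather than an exact number. The 30-60 percent bounds echo the example above; the triangular distribution and the 0.45 most-likely value are assumptions of the sketch.

```python
import random

# Calibrated range for the annual probability of a DoS attack. The bounds
# come from the example in the text; the most-likely value is assumed.
LOW, LIKELY, HIGH = 0.30, 0.45, 0.60

trials = 100_000
hits = 0
for _ in range(trials):
    p = random.triangular(LOW, HIGH, LIKELY)  # uncertainty about p itself
    hits += random.random() < p               # does the attack occur this trial?

print(f"simulated chance of a DoS attack next year: about {hits / trials:.0%}")
# The value of the exercise is not the single printed number but the
# explicit, defensible 30-60% input range behind it.
```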

The problem that most often produces “analysis paralysis” is not just the question of how to derive those numbers but also how to get to the information that will help produce those numbers. If you’ve been tasked, for instance, with determining the risk of a breach that has never happened to your organization before, perhaps a denial of service attack against your web presence, how can you make an accurate determination about something that hasn’t happened in the past? Where do you get your data to do your analysis? How do you model that analysis?

In an article published in CSO Magazine, Hubbard argues that organizations have far more data than they think they do and they actually need less data than they may believe they do in order to do proper analyses. Hubbard says that IT departments, in particular, have gotten so used to having information stored in databases that they can easily query, they forget there are many other sources to gather data from. Just because something hasn’t happened yet and you haven’t been gathering historical data on it and socking it away in your database doesn’t mean you either don’t have any data or that you can’t find what you need to measure your risk. Even in the age of Big Data, there is plenty of useful data outside of the big database.

You will still need to gather that data, but you just need enough to be able to measure it accurately, not necessarily precisely. In our recently published Open Group Risk Analysis (O-RA) Standard, this is called calibration of estimates. Calibration provides a method for making good estimates, which are necessary for deriving a measured range of probability for risk. Section 3 of the O-RA Standard provides a comprehensive look at how best to come up with calibrated estimates, as well as how to determine other risk factors using the FAIR (Factor Analysis of Information Risk) model.

So where do you get your data if it’s not already stored and easily accessible in a database? There are numerous sources you can turn to, both external and internal. You just have to do the research to find them. For example, even if your company hasn’t experienced a DNS attack, many others have—what was their experience when it happened? This information is out there online—you just need to search for it. Industry reports are another source. Verizon, for one, publishes its annual Data Breach Investigations Report. DatalossDB publishes an open database of data breach incidents worldwide. Many vendors publish annual security reports and issue regular security advisories. Security publications and analyst firms such as CSO, Gartner, Forrester or Securosis all have research reports from which data can be gleaned.

Then there’s your internal information. Chances are your IT department has records you can use—they likely count how many laptops are lost or stolen each year. You should also look to the experts within your company to help. Other people can provide a wealth of valuable information for use in your analysis. You can also look to the data you do have on related or similar attacks as a gauge.

Chances are, you already have the data you need or you can easily find it online. Use it.

With the ever-growing list of threats and risks organizations face today, we are fast reaching a time when failing to measure risk will no longer be acceptable—in the boardroom or even by governments.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cybersecurity, Data management, Information security, Open FAIR Certification, RISK Management, Uncategorized

Introducing Two New Security Standards for Risk Analysis—Part II – Risk Analysis Standard

By Jim Hietala, VP Security, The Open Group

Last week we took a look at one of the new risk standards recently introduced by The Open Group® Security Forum at The Open Group London Conference 2013, the Risk Taxonomy Technical Standard 2.0 (O-RT). Today’s blog looks at its sister standard, the Risk Analysis (O-RA) Standard, which gives risk professionals the tools they need to perform thorough risk analyses within their organizations for better decision-making about risk.

Risk Analysis (O-RA) Standard

The new Risk Analysis Standard provides a comprehensive guide for performing effective risk analyses within organizations using the Factor Analysis of Information Risk (FAIR™) framework. O-RA is geared toward managing the frequency and magnitude of loss that can arise from a threat, whether human, animal or natural event–in other words, “how often bad things happen and how bad they are when they occur.” Used together, the O-RT and O-RA Standards give organizations a way to perform consistent risk modeling that can not only help thoroughly explain risk factors to stakeholders but also allow information security professionals to strengthen existing analysis methods or create better ones. O-RA may also be used in conjunction with other risk frameworks to perform risk analysis.

The O-RA Standard is also meant to provide something more than a mere assessment of risk. Many professionals within the security industry often fail to distinguish between “assessing” risk and “analyzing” it. This standard goes beyond assessment by supporting effective analyses, so that risk statements are less vulnerable to problems and are more meaningful and defensible than assessments that provide only broad risk ratings (“this is a 4 on a scale of 1 to 5”).

O-RA also lays out a standard process for approaching risk analysis that can help organizations streamline the way they approach risk measurement. By focusing on four core process elements, organizations are able to perform more effective analyses:

  • Clearly identifying and characterizing the assets, threats, controls and impact/loss elements at play within the scenario being assessed
  • Understanding the organizational context for analysis (i.e. what’s at stake from an organizational perspective)
  • Measuring/estimating various risk factors
  • Calculating risk using a model that represents a logical, rational, and useful view of what risk is and how it works.

Because measurement and calculation are essential elements of properly analyzing risk variables, an entire chapter of the standard is dedicated to how to measure and calibrate risk. This chapter lays out a number of useful approaches for establishing risk variables, including establishing baseline risk estimates and ranges; creating distribution ranges and most likely values; using Monte Carlo simulations; accounting for uncertainty; determining accuracy vs. precision and subjective vs. objective criteria; deriving vulnerability; using ordinal scales; and determining diminishing returns.
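As a rough illustration of what such a calculation can look like, here is a small Monte Carlo sketch in the spirit of that guidance. It is not code from the O-RA Standard; the triangular distributions and every input value are invented stand-ins for properly calibrated estimates.

```python
import random

def draw(low: float, likely: float, high: float) -> float:
    """Draw from a triangular distribution built from calibrated estimates."""
    return random.triangular(low, high, likely)

def simulate(trials: int = 100_000):
    losses = []
    for _ in range(trials):
        lef = draw(0.1, 0.5, 4.0)            # loss events per year (assumed)
        lm = draw(10_000, 75_000, 500_000)   # loss per event in $ (assumed)
        losses.append(lef * lm)              # annualized loss for this trial
    losses.sort()
    return (losses[trials // 10],            # 10th percentile
            losses[trials // 2],             # median
            losses[9 * trials // 10])        # 90th percentile

low, mid, high = simulate()
print(f"annualized loss exposure: ${low:,.0f} to ${high:,.0f} (median ${mid:,.0f})")
```

The output is a range with a most-likely region, matching the standard’s emphasis on distributions of outcomes rather than single-point answers.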

Finally, a practical, real-world example is provided to take readers through an actual risk analysis scenario. Using the FAIR model, the example outlines the process for dealing with a threat in which an HR executive at a large bank has left the user name and password that allow him access to all the company’s HR systems on a Post-It note tacked onto his computer in his office, in clear view of anyone (other employees, cleaning crews, etc.) who comes into the office.

The scenario outlines four stages in assessing this risk:

  1. Stage 1: Identify Scenario Components (Scope the Analysis)
  2. Stage 2: Evaluate Loss Event Frequency (LEF)
  3. Stage 3: Evaluate Loss Magnitude (LM)
  4. Stage 4: Derive and Articulate Risk

Each step of the risk analysis process is thoroughly outlined for the scenario to provide Risk Analysts an example of how to perform an analysis process using the FAIR framework. Considerable guidance is provided for stages 2 and 3, in particular, as those are the most critical elements in determining organizational risk.

Ultimately, the O-RA is a guide to help organizations make better decisions about which risks are the most critical for the organization to prioritize and pay attention to versus those that are less important and may not warrant attention. It is critical for Risk Analysts and organizations to become more consistent in this practice because lack of consistency in determining risk among information security professionals has been a major obstacle in allowing security professionals a more legitimate “seat at the table” in the boardroom with other business functions (finance, HR, etc.) within organizations.

For our profession to evolve and grow, consistency and accurate measurement are key. Issues and solutions must be identified consistently, and comparisons and measurements must be based on solid foundations, as illustrated below.

[Figure: Chained Dependencies]

O-RA can help organizations arrive at better decisions through consistent analysis techniques as well as provide more legitimacy within the profession.  Without a foundation from which to manage information risk, Risk Analysts and information security professionals may rely too heavily on intuition, bias, commercial or personal agendas for their analyses and decision making. By outlining a thorough foundation for Risk Analysis, O-RA provides not only a common foundation for performing risk analyses but the opportunity to make better decisions and advance the security profession.

For more on the O-RA Standard or to download it, please visit: https://www2.opengroup.org/ogsys/catalog/C13G.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Introducing Two New Security Standards for Risk Analysis—Part I – Risk Taxonomy Technical Standard 2.0

By Jim Hietala, VP Security, The Open Group

At The Open Group London 2013 Conference, The Open Group® announced three new initiatives related to the Security Forum’s work around Risk Management. The first of these was the establishment of a new certification program for Risk Analysts working within the security profession, the Open FAIR Certification Program. Aimed at providing a professional certification for Risk Analysts, the program will bring a much-needed level of assurance to companies looking to hire Risk Analysts, certifying that analysts who have completed the Open FAIR program understand the fundamentals of risk analysis and are qualified to perform that analysis.

Forming the basis of the Open FAIR certification program are two new Open Group standards, version 2.0 of the Risk Taxonomy (O-RT) standard originally introduced by the Security Forum in 2009, and a new Risk Analysis (O-RA) Standard, both of which were also announced at the London conference. These standards are the result of ongoing work around risk analysis that the Security Forum has been conducting for a number of years now in order to help organizations better understand and identify their exposure to risk, particularly when it comes to information security risk.

The Risk Taxonomy and Risk Analysis standards not only form the basis and body of knowledge for the Open FAIR certification, but provide practical advice for security practitioners who need to evaluate and counter the potential threats their organization may face.

Today’s blog will look at the first standard, the Risk Taxonomy Technical Standard, version 2.0. Next week, we’ll look at the other standard for Risk Analysis.

Risk Taxonomy (O-RT) Technical Standard 2.0

Originally published in January 2009, the O-RT is intended to provide a common language and set of references for security and business professionals who need to understand or analyze risk conditions. Version 2.0 of the standard contains a number of updates, based both on feedback from professionals who have been using the standard and on research conducted by Security Forum member CXOWARE.

The majority of the changes to Version 2.0 are refinements in terminology, including changes in language that better reflect what each term encompasses. For example, the term “Control Strength” in the original standard has now been changed to “Resistance Strength” to reflect that controls used in that part of the taxonomy must be resistive in nature.

More substantive changes were made to the portion of the taxonomy that discusses how Loss Magnitude is evaluated.

Why create a taxonomy for risk? For two reasons. First, the taxonomy provides a foundation from which risk analysis can be performed and talked about. Second, a tightly defined taxonomy improves the ability to effectively measure or estimate risk scenarios, leading to better decision making, as illustrated by the following “risk management stack”:

Effective Management
    ↑
Well-informed Decisions
    ↑
Effective Comparisons
    ↑
Meaningful Measurements
    ↑
Accurate Risk Model

The complete Risk Taxonomy comprises two branches: Loss Event Frequency (LEF) and Loss Magnitude (LM), illustrated here:

[Figure: the two branches of the Risk Taxonomy, Loss Event Frequency (LEF) and Loss Magnitude (LM)]

Focusing solely on pure risk (which only results in loss) rather than speculative risk (which might result in either loss or profit), the O-RT is meant to help estimate the probable frequency and magnitude of future loss.

Traditionally, LM has been far more difficult to determine than LEF, in part because organizations don’t always perform analyses on their losses, or they stick to evaluating “low hanging fruit” variables rather than delving into more complex risk factors. The new taxonomy takes a deep dive into the Loss Magnitude branch of the risk analysis taxonomy, providing guidance that will allow Risk Analysts to better tackle the difficult task of determining LM. It includes terminology outlining six specific forms of loss an organization can experience (productivity, response, replacement, fines and judgments, competitive advantage, reputation) as well as how to determine Loss Flow, a new concept in this standard.

The Loss Flow analysis helps identify how a loss may affect both primary (owners, employees, etc.) and secondary (customers, stockholders, regulators, etc.) stakeholders as a result of a threat agent’s action on an asset. The new standard provides a thorough overview on how to assess Loss Flow and identify the loss factors of any given threat.
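As a purely hypothetical sketch of how these ideas might be tallied in practice, the following enumerates the six forms of loss and splits estimates across primary and secondary stakeholders. The six form names come from the standard; the stakeholder split and every figure are invented for illustration.

```python
from enum import Enum

class LossForm(Enum):  # the six forms of loss named in the O-RT
    PRODUCTIVITY = "productivity"
    RESPONSE = "response"
    REPLACEMENT = "replacement"
    FINES_AND_JUDGMENTS = "fines and judgments"
    COMPETITIVE_ADVANTAGE = "competitive advantage"
    REPUTATION = "reputation"

# Loss flow: the same event can affect primary stakeholders (owners,
# employees) and secondary stakeholders (customers, regulators) differently.
# All dollar figures below are invented placeholders.
loss_estimates = {
    "primary": {LossForm.RESPONSE: 40_000, LossForm.PRODUCTIVITY: 25_000},
    "secondary": {LossForm.REPUTATION: 150_000},
}

for stakeholders, forms in loss_estimates.items():
    total = sum(forms.values())
    print(f"{stakeholders} stakeholders: ${total:,} across {len(forms)} form(s) of loss")
```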

Finally, the standard also includes a practical, real-world scenario to help analysts understand how to put the taxonomy to use within their organizations. The O-RT provides a common linguistic foundation that allows security professionals to then perform risk analyses as outlined in the O-RA Standard.

For more on the Risk Taxonomy Standard or to download it, visit: https://www2.opengroup.org/ogsys/catalog/C13K.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Open FAIR Certification Launched

By Jim Hietala, The Open Group, VP of Security

The Open Group today announced the new Open FAIR Certification Program aimed at Risk Analysts, bringing a much-needed professional certification to the market that is focused on the practice of risk analysis. The Risk Taxonomy and Risk Analysis standards, both standards of The Open Group, constitute the body of knowledge for the certification program. They advance the risk analysis profession by defining a standard taxonomy for risk and by describing the process aspects of a rigorous risk analysis.

We believe that this new risk analyst certification program will bring significant value to risk analysts and to organizations seeking to hire qualified risk analysts. Adoption of these two risk standards from The Open Group will help produce more effective and useful risk analyses. The program clearly reflects the growing need in our industry for professionals who understand risk analysis fundamentals. Furthermore, the mature processes and due diligence The Open Group applies to its standards and certification programs will help make organizations comfortable with the groundbreaking concepts and methods underlying FAIR. The program will also help professionals looking to differentiate themselves by demonstrating the ability to take a “business perspective” on risk.

In order to become certified, Risk Analysts must pass an Open FAIR certification exam. All certification exams are administered through Prometric, Inc. Exam candidates can start the registration process by visiting Prometric’s Open Group Test Sponsor Site www.prometric.com/opengroup.  With 4,000 testing centers in its IT channel, Prometric brings Open FAIR Certification to security professionals worldwide. For more details on the exam requirements visit http://www.opengroup.org/certifications/exams.

Training courses will be delivered through an Open Group accredited channel. The accreditation of Open FAIR training courses will be available from November 1st 2013.

Our thanks to all of the members of the risk certification working group who worked tirelessly over the past 15 months to bring this certification program, along with a new risk analysis standard and a revised risk taxonomy standard to the market. Our thanks also to the sponsors of the program, whose support is important to building this program. The Open FAIR program sponsors are Architecting the Enterprise, CXOWARE, SNA, and The Unit.

Lastly, if you are involved in risk analysis, we encourage you to consider becoming Open FAIR certified, and to get involved in the risk analysis program at The Open Group. We have plans to develop an advanced level of Open FAIR certification, and we also see a great deal of best practices guidance that is needed by the industry.

For more information on the Open FAIR certification program visit http://www.opengroup.org/certifications/openfair

You may also wish to attend a webcast scheduled for 7th November, 4pm BST, that will provide an overview of the Open FAIR certification program, as well as an overview of the two risk standards. You can register here.


Jim Hietala, CISSP, GSEC, is Vice President, Security for The Open Group, where he manages all security and risk management programs and standards activities, including the Security Forum and the Jericho Forum.  He has participated in the development of several industry standards including O-ISM3, O-ESA, Risk Taxonomy Standard, and O-ACEML. He also led the development of compliance and audit guidance for the Cloud Security Alliance v2 publication.

Jim is a frequent speaker at industry conferences. He has participated in the SANS Analyst/Expert program, having written several research white papers and participated in several webcasts for SANS. He has also published numerous articles on information security, risk management, and compliance topics in publications including CSO, The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

An IT security industry veteran, he has held leadership roles at several IT security vendors.

Jim holds a B.S. in Marketing from Southern Illinois University.


Filed under Conference, Cybersecurity, Open FAIR Certification, Standards

3 Steps to Proactively Address Board-Level Security Concerns

By E.G. Nadhan, HP

Last month, I shared the discussions that ensued in a Tweet Jam conducted by The Open Group on Big Data and Security, where the key takeaway was: Protecting data is good; protecting information generated from Big Data is priceless. Security concerns around Big Data have grown to the point of becoming a board-level concern, as explained in this article in ComputerWorldUK. Board-level concerns must be addressed proactively, and to do so enterprises must provide the business justification for the proactive steps required to address them.


At The Open Group Conference in Sydney in April, the session on “Which information risks are shaping our lives?” by Stephen Singam, Chief Technology Officer, HP Enterprise Security Services, Australia provides great insight on this topic.  In this session, Singam analyzes the current and emerging information risks while recommending a proactive approach to address them head-on with adversary-centric solutions.

The 3 steps that enterprises must take to proactively address security concerns are below:

Computing the cost of cyber-crime

The HP Ponemon 2012 Cost of Cyber Crime Study revealed that cyber attacks have more than doubled over a three-year period, with the financial impact increasing by nearly 40 percent. Here are the key takeaways from this research:

  • Cyber-crimes continue to be costly. The average annualized cost of cyber-crime for 56 organizations is $8.9 million per year, with a range of $1.4 million to $46 million.
  • Cyber attacks have become common occurrences. Companies experienced 102 successful attacks per week and 1.8 successful attacks per company per week in 2012.
  • The most costly cyber-crimes are those caused by denial of service, malicious insiders and web-based attacks.
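As a quick sanity check (our arithmetic, assuming the 102 weekly attacks are counted across the full 56-organization sample), the two per-week figures are consistent with each other:

```python
organizations = 56
attacks_per_week_total = 102
print(f"{attacks_per_week_total / organizations:.1f} attacks per company per week")
# prints 1.8, matching the study's per-company figure
```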

When computing the cost of cyber-crime, enterprises must address direct, indirect and opportunity costs that result from the loss or theft of information, disruption to business operations, revenue loss and destruction of property, plant and equipment. The following phases of combating cyber-crime must also be factored in to comprehensively determine the total cost:

  1. Detection of patterns of behavior indicating an impending attack through sustained monitoring of the enabling infrastructure
  2. Investigation of the security violation upon occurrence to determine the underlying root cause and take appropriate remedial measures
  3. Incident response to address the immediate situation at hand, communicate the incidence of the attack and raise all applicable alerts
  4. Containment of the attack by controlling its proliferation across the enterprise
  5. Recovery from the damages incurred as a result of the attack to ensure ongoing business operations based upon the business continuity plans in place
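A toy roll-up of how those cost elements might combine into a single total-cost estimate follows; the phase names mirror the list above, while every dollar figure is an invented placeholder.

```python
# All figures are invented placeholders for illustration only.
phase_costs = {
    "detection": 120_000,
    "investigation": 80_000,
    "incident response": 60_000,
    "containment": 40_000,
    "recovery": 200_000,
}
loss_costs = {  # direct, indirect and opportunity costs
    "information loss or theft": 300_000,
    "business disruption": 250_000,
    "revenue loss": 150_000,
    "property, plant and equipment damage": 50_000,
}

total = sum(phase_costs.values()) + sum(loss_costs.values())
print(f"estimated total cost of the incident: ${total:,}")
```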

Identifying proactive steps that can be taken to address cyber-crime

  1. “Better get security right,” says HP Security Strategist Mary Ann Mezzapelle in her keynote on Big Data and Security at The Open Group Conference in Newport Beach. Asserting that proactive risk management is the most effective approach, Mezzapelle challenged enterprises to proactively question the presence of shadow IT, data ownership, usage of security tools and standards while taking a comprehensive approach to security end-to-end within the enterprise.
  2. In this ZDNet article, Art Gilliland suggested learning from cyber criminals and understanding their methods, since the very frameworks enterprises strive to comply with (such as ISO and PCI) set a low bar for security that adversaries capitalize on.
  3. Andy Ellis discussed managing risk with psychology instead of brute force in his keynote at the 2013 RSA Conference.
  4. At the same conference, in another keynote, world-renowned game designer and inventor of SuperBetter, Jane McGonigal suggested that the “collective intelligence” generated by gaming can be applied to combat security concerns.
  5. In this interview, Bruce Schneier, renowned security guru and author of several books including Liars & Outliers, suggested, “Bad guys are going to invent new stuff — whether we want them to or not.” Should we take a cue from Hollywood and consider the inception of an OODA loop into the security hacker’s mind?

The balancing act

Can enterprises afford to take such proactive steps? Or more importantly, can they afford not to?

Enterprises must define their risk management strategy and determine the proactive steps that are best in alignment with their business objectives and information security standards.  This will enable organizations to better assess the cost of execution for such measures.  While the actual cost is likely to vary by enterprise, inaction is not an acceptable alternative.  Like all other critical corporate initiatives, these proactive measures must receive the board-level attention they deserve.

Enterprises must balance the cost of executing such proactive measures against the potential cost of data loss and reputational harm. This will ensure that the right proactive measures are taken with executive support.

How about you?  Has your enterprise taken the steps to assess the cost of cybercrime?  Have you considered various proactive steps to combat cybercrime?  Share your thoughts with me in the comments section below.

HP Distinguished Technologist E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for The Open Group Cloud Computing Governance project. Twitter handle: @NadhanAtHP.


Filed under Conference

Beyond Big Data

By Chris Harding, The Open Group

The big bang that started The Open Group Conference in Newport Beach was, appropriately, a presentation related to astronomy. Chris Gerty gave a keynote on Big Data at NASA, where he is Deputy Program Manager of the Open Innovation Program. He told us how visualizing deep space and its celestial bodies created understanding and enabled new discoveries. Everyone who attended felt inspired to explore the universe of Big Data during the rest of the conference. And that exploration – as is often the case with successful space missions – left us wondering what lies beyond.

The Big Data Conference Plenary

The second presentation on that Monday morning brought us down from the stars to the nuts and bolts of engineering. Mechanical devices require regular maintenance to keep functioning. Processing the mass of data generated during their operation can improve safety and cut costs. For example, airlines can overhaul aircraft engines when it needs doing, rather than on a fixed schedule that has to be frequent enough to prevent damage under most conditions, but might still fail to anticipate failure in unusual circumstances. David Potter and Ron Schuldt lead two of The Open Group initiatives, Quantum Lifecycle Management (QLM) and the Universal Data Element Framework (UDEF). They explained how a semantic approach to product lifecycle management can facilitate the big-data processing needed to achieve this aim.

Chris Gerty was then joined by Andras Szakal, vice-president and chief technology officer at IBM US Federal IMT, Robert Weisman, chief executive officer of Build The Vision, and Jim Hietala, vice-president of Security at The Open Group, in a panel session on Big Data that was moderated by Dana Gardner of Interarbor Solutions. As always, Dana facilitated a fascinating discussion. Key points made by the panelists included: the trend to monetize data; the need to ensure veracity and usefulness; the need for security and privacy; the expectation that data warehouse technology will exist and evolve in parallel with map/reduce “on-the-fly” analysis; the importance of meaningful presentation of the data; integration with cloud and mobile technology; and the new ways in which Big Data can be used to deliver business value.

More on Big Data

In the afternoons of Monday and Tuesday, and on most of Wednesday, the conference split into streams. These have presentations that are more technical than the plenary, going deeper into their subjects. It’s a pity that you can’t be in all the streams at once. (At one point I couldn’t be in any of them, as there was an important side meeting to discuss the UDEF, which is in one of the areas that I support as forum director). Fortunately, there were a few great stream presentations that I did manage to get to.

On the Monday afternoon, Tom Plunkett and Janet Mostow of Oracle presented a reference architecture that combined Hadoop and NoSQL with traditional RDBMS, streaming, and complex event processing, to enable Big Data analysis. One application that they described was to trace the relations between particular genes and cancer. This could have big benefits in disease prediction and treatment. Another was to predict the movements of protesters at a demonstration through analysis of communications on social media. The police could then concentrate their forces in the right place at the right time.

Jason Bloomberg, president of Zapthink – now part of Dovel – is always thought-provoking. His presentation featured the need for governance vitality to cope with ever changing tools to handle Big Data of ever increasing size, “crowdsourcing” to channel the efforts of many people into solving a problem, and business transformation that is continuous rather than a one-time step from “as is” to “to be.”

Later in the week, I moderated a discussion on Architecting for Big Data in the Cloud. We had a well-balanced panel made up of TJ Virdi of Boeing, Mark Skilton of Capgemini and Tom Plunkett of Oracle. They made some excellent points. Big Data analysis provides business value by enabling better understanding, leading to better decisions. The analysis is often an iterative process, with new questions emerging as answers are found. There is no single application that does this analysis and provides the visualization needed for understanding, but there are a number of products that can be used to assist. The role of the data scientist in formulating the questions and configuring the visualization is critical. Reference models for the technology are emerging but there are as yet no commonly-accepted standards.

The New Enterprise Platform

Jogging is a great way of taking exercise at conferences, and I was able to go for a run most mornings before the meetings started at Newport Beach. Pacific Coast Highway isn’t the most interesting of tracks, but on Tuesday morning I was soon up in Castaways Park, pleasantly jogging through the carefully-nurtured natural coastal vegetation, with views over the ocean and its margin of high-priced homes, slipways, and yachts. I reflected as I ran that we had heard some interesting things about Big Data, but it is now an established topic. There must be something new coming over the horizon.

The answer to what this might be was suggested in the first presentation of that day’s plenary. Mary Ann Mezzapelle, security strategist for HP Enterprise Services, talked about the need to get security right for Big Data and the Cloud. But her scope was actually wider. She spoke of the need to secure the “third platform” – the term coined by IDC to describe the convergence of social, cloud and mobile computing with Big Data.

Securing Big Data

Mary Ann’s keynote was not about the third platform itself, but about what should be done to protect it. The new platform brings with it a new set of security threats, and the increasing scale of operation makes it increasingly important to get the security right. Mary Ann presented a thoughtful analysis founded on a risk-based approach.

She was followed by Adrian Lane, chief technology officer at Securosis, who pointed out that Big Data processing using NoSQL has a different architecture from traditional relational data processing, and requires different security solutions. This does not necessarily mean new techniques; existing techniques can be used in new ways. For example, Kerberos may be used to secure inter-node communications in map/reduce processing. Adrian’s presentation completed the Tuesday plenary sessions.

Service Oriented Architecture

The streams continued after the plenary. I went to the Distributed Services Architecture stream, which focused on SOA.

Bill Poole, enterprise architect at JourneyOne in Australia, described how to use the graphical architecture modeling language ArchiMate® to model service-oriented architectures. He illustrated this using a case study of a global mining organization that wanted to consolidate its two existing bespoke inventory management applications into a single commercial off-the-shelf application. It’s amazing how a real-world case study can make a topic come to life, and the audience certainly responded warmly to Bill’s excellent presentation.

Ali Arsanjani, chief technology officer for Business Performance and Service Optimization, and Heather Kreger, chief technology officer for International Standards, both at IBM, described the range of SOA standards published by The Open Group and available for use by enterprise architects. Ali was one of the brains that developed the SOA Reference Architecture, and Heather is a key player in international standards activities for SOA, where she has helped The Open Group’s Service Integration Maturity Model and SOA Governance Framework to become international standards, and is working on an international standard SOA reference architecture.

Cloud Computing

To start Wednesday’s Cloud Computing streams, TJ Virdi, senior enterprise architect at The Boeing Company, discussed use of TOGAF® to develop an Enterprise Architecture for a Cloud ecosystem. A large enterprise such as Boeing may use many Cloud service providers, enabling collaboration between corporate departments, partners, and regulators in a complex ecosystem. Architecting for this is a major challenge, and The Open Group’s TOGAF for Cloud Ecosystems project is working to provide guidance.

Stuart Boardman of KPN gave a different perspective on Cloud ecosystems, with a case study from the energy industry. An ecosystem may not necessarily be governed by a single entity, and the participants may not always be aware of each other. Energy generation and consumption in the Netherlands is part of a complex international ecosystem involving producers, consumers, transporters, and traders of many kinds. A participant may be involved in several ecosystems in several ways: a farmer for example, might consume energy, have wind turbines to produce it, and also participate in food production and transport ecosystems.

Penelope Gordon of 1-Plug Corporation explained how choice and use of business metrics can impact Cloud service providers. She worked through four examples: a start-up Software-as-a-Service provider requiring investment, an established company thinking of providing its products as cloud services, an IT department planning to offer an in-house private Cloud platform, and a government agency seeking budget for government Cloud.

Mark Skilton, director at Capgemini in the UK, gave a presentation titled “Digital Transformation and the Role of Cloud Computing.” He covered a very broad canvas of business transformation driven by technological change, and illustrated his theme with a case study from the pharmaceutical industry. New technology enables new business models, giving competitive advantage. Increasingly, the introduction of this technology is driven by the business rather than the IT side of the enterprise, and it poses major challenges for both sides. But what new technologies are in question? Mark’s presentation had Cloud in the title, but also featured social and mobile computing, and Big Data.

The New Trend

On Thursday morning I took a longer run, to and round Balboa Island. With only one road in or out, its main street of shops and restaurants is not a through route and the island has the feel of a real village. The SOA Work Group Steering Committee had found an excellent, and reasonably priced, Italian restaurant there the previous evening. There is a clear resurgence of interest in SOA, partly driven by the use of service orientation – the principle, rather than particular protocols – in Cloud Computing and other new technologies. That morning I took the track round the shoreline, and was reminded a little of Dylan Thomas’s “fishingboat-bobbing sea.” Fishing here is for leisure rather than livelihood, but I suspected that the fishermen, like those of Thomas’s little Welsh village, spend more time in the bar than on the water.

I thought about how the conference sessions had indicated an emerging trend. This is not a new technology but the combination of four current technologies to create a new platform for enterprise IT: Social, Cloud, and Mobile computing, and Big Data. Mary Ann Mezzapelle’s presentation had referenced IDC’s “third platform.” Other discussions had mentioned Gartner’s “Nexus of Forces,” the combination of Social, Cloud and Mobile computing with information, which Gartner says is transforming the way people and businesses relate to technology and will become a key differentiator of business and technology management. Mark Skilton had included these same four technologies in his presentation. Great minds, and analyst corporations, think alike!

I thought also about the examples and case studies in the stream presentations. Areas as diverse as healthcare, manufacturing, energy and policing are using the new technologies. Clearly, they can deliver major business benefits. The challenge for enterprise architects is to maximize those benefits through pragmatic architectures.

Emerging Standards

On the way back to the hotel, I remarked again on something I had noticed before: how beautifully neat and carefully maintained the front gardens bordering the sidewalk are. I almost felt that I was running through a public botanical garden. Is there some ordinance requiring people to keep their gardens tidy, with severe penalties for anyone who leaves a lawn or hedge unclipped? Is a miserable defaulter fitted with a ball and chain, not to be removed until the untidy vegetation has been properly trimmed, with nail clippers? Apparently not. People here keep their gardens tidy because they want to. The best standards are like that: universally followed, without use or threat of sanction.

Standards are an issue for the new enterprise platform. Apart from the underlying standards of the Internet, there really aren’t any. The area isn’t even mapped out. Vendors of Social, Cloud, Mobile, and Big Data products and services are trying to stake out as much valuable real estate as they can. They have no interest yet in boundaries with neatly-clipped hedges.

This is a stage that every new technology goes through. Then, as it matures, the vendors understand that their products and services have much more value when they conform to standards, just as properties have more value in an area where everything is neat and well-maintained.

It may be too soon to define those standards for the new enterprise platform, but it is certainly time to start mapping out the area, to understand its subdivisions and how they inter-relate, and to prepare the way for standards. Following the conference, The Open Group has announced a new Forum, provisionally titled Open Platform 3.0, to do just that.

The SOA and Cloud Work Groups

Thursday was my final day of meetings at the conference. The plenary and streams presentations were done. This day was for working meetings of the SOA and Cloud Work Groups. I also had an informal discussion with Ron Schuldt about a new approach for the UDEF, following up on the earlier UDEF side meeting. The conference hallways, as well as the meeting rooms, often see productive business done.

The SOA Work Group discussed a certification program for SOA professionals, and an update to the SOA Reference Architecture. The Open Group is working with ISO and the IEEE to define a standard SOA reference architecture that will have consensus across all three bodies.

The Cloud Work Group had met earlier to further the TOGAF for Cloud ecosystems project. Now it worked on its forthcoming white paper on business performance metrics. It also – though this was not on the original agenda – discussed Gartner’s Nexus of Forces, and the future role of the Work Group in mapping out the new enterprise platform.

Mapping the New Enterprise Platform

At the start of the conference we looked at how to map the stars. Big Data analytics enables people to visualize the universe in new ways, reach new understandings of what is in it and how it works, and point to new areas for future exploration.

As the conference progressed, we found that Big Data is part of a convergence of forces. Social, mobile, and Cloud Computing are being combined with Big Data to form a new enterprise platform. The development of this platform, and its roll-out to support innovative applications that deliver more business value, is what lies beyond Big Data.

At the end of the conference we were thinking about mapping the new enterprise platform. This will not require sophisticated data processing and analysis. It will take discussions to create a common understanding, and detailed committee work to draft the guidelines and standards. This work will be done by The Open Group’s new Open Platform 3.0 Forum.

The next Open Group conference is in the week of April 15, in Sydney, Australia. I’m told that there’s some great jogging there. More importantly, we’ll be reflecting on progress in mapping Open Platform 3.0, and thinking about what lies ahead. I’m looking forward to it already.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Protecting Data is Good. Protecting Information Generated from Big Data is Priceless

By E.G. Nadhan, HP

This was the key message that came out of The Open Group® Big Data Security Tweet Jam on Jan 22 at 9:00 a.m. PT, which addressed several key questions centered on Big Data and security. Here is my summary of the observations made in the context of these questions.

Q1. What is Big Data security? Is it different from data security?

Big Data security is really about information security. Big Data typically lives outside the corporate perimeter, and IT today is not prepared to adequately monitor its sheer volume – brontobytes of data. Long retention periods could also violate compliance mandates. Note that storing Big Data in the Cloud changes the game, with increased risks of leaks, loss and breaches.

Information resulting from the analysis of the data is even more sensitive, and therefore higher risk – especially when it is Personally Identifiable Information from an Internet of devices, where utility must be balanced against privacy.

At the end of the day, it is all about governance – or, as they say, “It’s the data, stupid! Govern it.”

Q2. Any thoughts about security systems as producers of Big Data, e.g., voluminous systems logs?

Data gathered from information security logs is valuable, but the rules for protecting it are the same. Security logs will also be a good source for detecting patterns of customer usage.

Q3. Most Big Data stacks have no built-in security. What does this mean for securing Big Data?

There is an added level of complexity, because security must span apps, the network and all endpoints. Having standards to establish identity, metadata and trust would go a long way. The quality of the data could also be a security issue – has it been tampered with? Are you being gamed? Note that enterprises have varying security needs around their business data.

Q4. How is the industry dealing with the social and ethical uses of consumer data gathered via Big Data?

Big Data is still nascent, and ground rules for handling the information are yet to be established. Privacy issues will be key when companies market to consumers. Organizations are seeking forgiveness rather than permission. Regulatory bodies are getting involved due to consumer pressure. Abuse of power from access to Big Data is likely to trigger more incentives to attack or embarrass. Note that ‘abuse’ to some is just business to others.

Q5. What lessons from basic data security and cloud security can be implemented in Big Data security?

Security testing is even more vital for Big Data. Limit access to specific devices, not just user credentials. Don’t assume security through obscurity for the sensors producing Big Data inputs – they will be targets.

Q6. What are some best practices for securing Big Data? What are orgs doing now and what will organizations be doing 2-3 years from now?

Current best practices include:

  • Treat Big Data as your most valuable asset
  • Encrypt everything by default, with proper key management, enforcement of policies, and tokenized logs (see the sketch after this list)
  • Ask your Cloud and Big Data providers the right questions – ultimately, YOU are responsible for security
  • If you cannot establish trust with the data source, assume the data needs verification and cleanup before it is used for decisions
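
One way to read “tokenized logs” (my own sketch, not something prescribed during the jam – the key handling and helper below are hypothetical): replace sensitive values with keyed, non-reversible tokens before they are written to the log, so the logs remain useful for spotting usage patterns without exposing the raw data.

    import hmac
    import hashlib

    # Hypothetical key; in practice this would come from your key-management system
    SECRET_KEY = b"replace-with-a-managed-key"

    def tokenize(value: str) -> str:
        """Deterministically map a sensitive value to a non-reversible token."""
        return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

    # The same user always yields the same token, so patterns are still visible
    print("login ok user=" + tokenize("alice@example.com"))

Because the token is deterministic, analysts can still correlate events by user without ever seeing the underlying identity.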

Future best practices:

  • Enterprises treat information like data today; in the future they will respect it as their most valuable asset
  • CIOs will eventually become Chief Officer for Information

Q7. We’re nearing the end of today’s tweet jam. Any last thoughts on Big Data security?

Adrian Lane, who participated in the tweet jam, will be keynoting at The Open Group Conference in Newport Beach next week, and has written a good best-practices paper on securing Big Data.

I have been part of multiple tweet chats specific to security, as well as one on Information Optimization. Recently, I also conducted the first Open Group Web Jam, internal to the Cloud Work Group. What I liked about this Big Data Security Tweet Jam is that it brought two key domains together, highlighting the intersection points. There were great contributions from subject matter experts, forcing participants to think about one domain in the context of the other.

In a way, this post is actually synthesizing valuable information from raw data in the tweet messages – and therefore needs to be secured!

What are your thoughts on the observations made in this tweet jam? What measures are you taking to secure Big Data in your enterprise?

I really enjoyed this tweet jam and would strongly encourage you to actively participate in upcoming tweet jams hosted by The Open Group.  You get to interact with a wide spectrum of knowledgeable practitioners listed in this summary post.

E.G. Nadhan, HP Distinguished Technologist and Cloud Advisor, has more than 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project, and also the founding co-chair for The Open Group Cloud Computing Governance project. Connect with Nadhan on: Twitter, Facebook, LinkedIn and Journey Blog.

 


#ogChat Summary – Big Data and Security

By Patty Donovan, The Open Group

The Open Group hosted a tweet jam (#ogChat) to discuss Big Data security. In case you missed the conversation, here is a recap of the event.

The Participants

A total of 18 participants joined in the hour-long discussion, including:

Q1 What is #BigData #security? Is it different from #data security? #ogChat

Participants seemed to agree that while Big Data security is similar to data security, it is more extensive. Two major factors to consider: sensitivity and scalability.

  • @dustinkirkland At the core it’s the same – sensitive data – but the difference is in the size and the length of time this data is being stored. #ogChat
  • @jim_hietala Q1: Applying traditional security controls to BigData environments, which are not just very large info stores #ogChat
  • @TheTonyBradley Q1. The value of analyzing #BigData is tied directly to the sensitivity and relevance of that data–making it higher risk. #ogChat
  • @AdrianLane Q1 Securing #BigData is different. Issues of velocity, scale, elasticity break many existing security products. #ogChat
  • @editingwhiz #Bigdata security is standard information security, only more so. Meaning sampling replaced by complete data sets. #ogchat
  • @Dana_Gardner Q1 Not only is the data sensitive, the analysis from the data is sensitive. Secret. On the QT. Hush, hush. #BigData #data #security #ogChat
    • @Technodad @Dana_Gardner A key point. Much #bigdata will be public – the business value is in cleanup & analysis. Focus on protecting that. #ogChat

Q2 Any thoughts about #security systems as producers of #BigData, e.g., voluminous systems logs? #ogChat

Most agreed that security systems should be setting an example for producing secure Big Data environments.

  • @dustinkirkland Q2. They should be setting the example. If the data is deemed important or sensitive, then it should be secured and encrypted. #ogChat
  • @TheTonyBradley Q2. Data is data. Data gathered from information security logs is valuable #BigData, but rules for protecting it are the same. #ogChat
  • @elinormills Q2 SIEM is going to be big. will drive spending. #ogchat #bigdata #security
  • @jim_hietala Q2: Well instrumented IT environments generate lots of data, and SIEM/audit tools will have to be managers of this #BigData #ogchat
  • @dustinkirkland @theopengroup Ideally #bigdata platforms will support #tokenization natively, or else appdevs will have to write it into apps #ogChat

Q3 Most #BigData stacks have no built in #security. What does this mean for securing #BigData? #ogChat

The lack of built-in security paints a target on Big Data. While not all enterprise data is sensitive, housing it insecurely runs the risk of compromise. Furthermore, security solutions need to be not only effective but also scalable, as the data will continue to get bigger.

  • @elinormills #ogchat big data is one big hacker target #bigdata #security
    • @editingwhiz @elinormills #bigdata may be a huge hacker target, but will hackers be able to process the chaff out of it? THAT takes $$$ #ogchat
    • @elinormills @editingwhiz hackers are innovation leaders #ogchat
    • @editingwhiz @elinormills Yes, hackers are innovation leaders — in security, but not necessarily dataset processing. #eweeknews #ogchat
  • @jim_hietala Q3:There will be a strong market for 3rd party security tools for #BigData – existing security technologies can’t scale #ogchat
  • @TheTonyBradley Q3. When you take sensitive info and store it–particularly in the cloud–you run the risk of exposure or compromise. #ogChat
  • @editingwhiz Not all enterprises have sensitive business data they need to protect with their lives. We’re talking non-regulated, of course. #ogchat
  • @TheTonyBradley Q3. #BigData is sensitive enough. The distilled information from analyzing it is more sensitive. Solutions need to be effective. #ogChat
  • @AdrianLane Q3 It means identifying security products that don’t break big data – i.e. they scale or leverage #BigData #ogChat
    • @dustinkirkland @AdrianLane #ogChat Agreed, this is where certifications and partnerships between the 3rd party and #bigdata vendor are essential.

Q4 How is the industry dealing with the social and ethical uses of consumer data gathered via #BigData? #ogChat #privacy

Participants agreed that the industry needs to improve when it comes to the social and ethical uses of consumer data gathered through Big Data. If the data is easily accessible, hackers will be attracted. No matter what, the cost of a breach is far greater than the cost of any preventative solution.

  • @dustinkirkland Q4. #ogChat Sadly, not well enough. The recent Instagram uproar was well publicized but such abuse of social media rights happens every day.
    • @TheTonyBradley @dustinkirkland True. But, they’ll buy the startups, and take it to market. Fortune 500 companies don’t like to play with newbies. #ogChat
    • @editingwhiz Disagree with this: Fortune 500s don’t like to play with newbies. We’re seeing that if the IT works, name recognition irrelevant. #ogchat
    • @elinormills @editingwhiz @thetonybradley ‘hacker’ covers lot of ground, so i would say depends on context. some of my best friends are hackers #ogchat
    • @Technodad @elinormills A core point- data from sensors will drive #bigdata as much as enterprise data. Big security, quality issues there. #ogChat
  • @Dana_Gardner Q4 If privacy is a big issue, hacktivism may crop up. Power of #BigData can also make it socially onerous. #data #security #ogChat
  • @dustinkirkland Q4. The cost of a breach is far greater than the cost (monetary or reputation) of any security solution. Don’t risk it. #ogChat

Q5 What lessons from basic #datasecurity and #cloud #security can be implemented in #BigData security? #ogChat

The principles are the same, just on a larger scale. The biggest risks come from cutting corners because of the size and complexity of the data gathered. As hackers (like Anonymous) get better, security must improve regardless of data size.

  • @TheTonyBradley Q5. Again, data is data. The best practices for securing and protecting it stay the same–just on a more massive #BigData scale. #ogChat
  • @Dana_Gardner Q5 Remember, this is in many ways unchartered territory so expect the unexpected. Count on it. #BigData #data #security #ogChat
  • @NadhanAtHP A5 @theopengroup – Security Testing is even more vital when it comes to #BigData and Information #ogChat
  • @TheTonyBradley Q5. Anonymous has proven time and again that most existing data security is trivial. Need better protection for #BigData. #ogChat

Q6 What are some best practices for securing #BigData? What are orgs doing now, and what will orgs be doing 2-3 years from now? #ogChat

While some argued that encrypting everything is the key, and others encouraged putting pressure on Big Data providers, most agreed that a multi-step security infrastructure is necessary. It’s not just the data that needs to be secured, but also the transportation and analysis processes.

  • @dustinkirkland Q6. #ogChat Encrypting everything, by default, at least at the fs layer. Proper key management. Policies. Logs. Hopefully tokenized too.
  • @dustinkirkland Q6. #ogChat Ask tough questions of your #cloud or #bigdata provider. Know what they are responsible for and who has access to keys. #ogChat
    • @elinormills Agreed–> @dustinkirkland Q6. #ogChat Ask tough questions of your #cloud or #bigdataprovider. Know what they are responsible for …
  • @Dana_Gardner Q6 Treat most #BigData as a crown jewel, see it as among most valuable assets. Apply commensurate security. #data #security #ogChat
  • @elinormills Q6 govt level crypto minimum, plus protect all endpts #ogchat #bigdata #security
  • @TheTonyBradley Q6. Multi-faceted issue. Must protect raw #BigData, plus processing, analyzing, transporting, and resulting distilled analysis. #ogChat
  • @Technodad If you don’t establish trust with data source, you need to assume data needs verification, cleanup before it is used for decisions. #ogChat

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Questions for the Upcoming Big Data Security Tweet Jam on Jan. 22

By Patty Donovan, The Open Group

Last week, we announced our upcoming tweet jam on Tuesday, January 22 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. GMT, which will examine the impact of Big Data on security and how it will change the security landscape.

Please join us next Tuesday, January 22! The discussion will be moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct. We welcome Open Group members and interested participants from all backgrounds to join the session. Our panel of experts will include:

  • Elinor Mills, former CNET reporter and current director of content and media strategy at Bateman Group (@elinormills)
  • Jaikumar Vijayan, Computerworld (@jaivijayan)
  • Chris Preimesberger, eWEEK (@editingwhiz)
  • Tony Bradley, PC World (@TheTonyBradley)
  • Michael Santarcangelo, Security Catalyst Blog (@catalyst)

The discussion will be guided by these six questions:

  1. What is #BigData security? Is it different from #data #security? #ogChat
  2. Any thoughts about #security systems as producers of #BigData, e.g., voluminous systems logs? #ogChat
  3. Most #BigData stacks have no built in #security. What does this mean for securing #BigData? #ogChat
  4. How is the industry dealing with the social and ethical uses of consumer data gathered via #BigData? #ogChat #privacy
  5. What lessons from basic data security and #cloud #security can be implemented in #BigData #security? #ogChat
  6. What are some best practices for securing #BigData? #ogChat

To join the discussion, please follow the #ogChat hashtag during the allotted discussion time. Other hashtags we recommend you use during the event include:

  • Information Security: #InfoSec
  • Security: #security
  • BYOD: #BYOD
  • Big Data: #BigData
  • Privacy: #privacy
  • Mobile: #mobile
  • Compliance: #compliance

For more information about the tweet jam, guidelines and general background information, please visit our previous blog post: http://blog.opengroup.org/2013/01/15/big-data-security-tweet-jam/

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com), or leave a comment below. We anticipate a lively chat and hope you will be able to join us!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

 


Big Data Security Tweet Jam

By Patty Donovan, The Open Group

On Tuesday, January 22, The Open Group will host a tweet jam examining the topic of Big Data and its impact on the security landscape.

Recently, Big Data has been dominating the headlines, with coverage of everything from how to manage and process it to the way it will impact your organization’s IT roadmap. As 2012 came to a close, analyst firm Gartner predicted that data will help drive IT spending to $3.8 trillion in 2014. Knowing the phenomenon is here to stay, enterprises face the new and daunting challenge of securing Big Data. Big Data security also raises other questions, such as: Is Big Data security different from data security? How will enterprises handle Big Data security? What is the best approach to Big Data security?

It is yet to be seen whether Big Data will revolutionize enterprise security, but it will certainly change execution – if it hasn’t already. Please join us for our upcoming Big Data Security tweet jam, where leading security experts will discuss the merits of Big Data security.

Please join us on Tuesday, January 22 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. GMT for a tweet jam, moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct, that will discuss and debate the issues around Big Data security. Key areas that will be addressed during the discussion include: data security, privacy, compliance, security ethics and, of course, Big Data. We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of IT security experts, analysts and thought leaders, led by Jim Hietala (@jim_hietala) and Dave Lounsbury (@Technodad) of The Open Group. To access the discussion, please follow the #ogChat hashtag during the allotted discussion time.

And for those of you who are unfamiliar with tweet jams, here is some background information:

What Is a Tweet Jam?

A tweet jam is a one-hour “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on Big Data security. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone on Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat hashtag.
    • Sample: “Q1 enterprises will have to make significant adjustments moving forward to secure Big Data environments #ogChat”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com). We anticipate a lively chat and hope you will be able to join!

 

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


2013 Open Group Predictions, Vol. 1

By The Open Group

A big thank you to all of our members and staff who have made 2012 another great year for The Open Group. There were many notable achievements this year, including the release of ArchiMate 2.0, the launch of the Future Airborne Capability Environment (FACE™) Technical Standard and the publication of the SOA Reference Architecture (SOA RA) and the Service-Oriented Cloud Computing Infrastructure Framework (SOCCI).

As we wrap up 2012, we couldn’t help but look towards what is to come in 2013 for The Open Group and the industries we’re a part of. Without further ado, here they are:

Big Data
By Dave Lounsbury, Chief Technical Officer

Big Data is on top of everyone’s mind these days. Consumerization, mobile smart devices, and expanding retail and sensor networks are generating massive amounts of data on behavior, environment, location and buying patterns – producing what is being called “Big Data.” In addition, as the use of personal devices and social networks continues to gain popularity, so does the expectation of having access to such data, and the computational power to use it, anytime, anywhere. Organizations will turn to IT to restructure their services to meet this growing expectation of control over, and access to, data.

Organizations must embrace Big Data to drive their decision-making and to provide the optimal mix of services to customers. Big Data is becoming so big that the big challenge is how to use it to make timely decisions. IT naturally focuses on collecting data, so Big Data itself is not the issue. To allow humans to keep on top of this flood of data, the industry will need to move away from programming computers to store and process data, towards teaching computers how to assess large amounts of uncorrelated data and draw inferences from it on their own. We also need to start thinking about the skills people need in the IT world, not only to handle Big Data, but to make it actionable. Do we need “Data Architects,” and if so, what would their role be?

In 2013, we will see the beginning of the Intellectual Computing era. IT will play an essential role in this new era and will need to help enterprises look at uncorrelated data to find the answer.

Security

By Jim Hietala, Vice President of Security

As 2012 comes to a close, some of the big developments in security over the past year include:

  • Continuation of hacktivism attacks.
  • Increase in significant and persistent threats targeting government and large enterprises. The notable U.S. National Strategy for Trusted Identities in Cyberspace started to make progress in the second half of the year, in terms of industry and government movement to address fundamental security issues.
  • Security breaches discovered by third parties, where the affected organizations had no idea that they had been breached. Data from the 2012 Verizon report suggests that 92 percent of companies breached were notified by a third party.
  • Acknowledgement from senior U.S. cybersecurity professionals that organizations fall into two groups: those that know they’ve been penetrated, and those that have been penetrated, but don’t yet know it.

In 2013, we’ll no doubt see more of the same on the attack front, plus increased focus on mobile attack vectors. We’ll also see more focus on detective security controls, reflecting greater awareness of the threat and of the reality that many large organizations have already been penetrated – responding appropriately therefore requires far more attention to detection and incident response.

We’ll also likely see the U.S. move forward with cybersecurity guidance from the executive branch, in the form of a Presidential directive. New national cybersecurity legislation seemed to come close to happening in 2012, and when it failed to become a reality, there were many indications that the administration would make something happen by executive order.

Enterprise Architecture

By Leonard Fehskens, Vice President of Skills and Capabilities

Preparatory to my looking back at 2012 and forward to 2013, I reviewed what I wrote last year about 2011 and 2012.

Probably the most significant thing from my perspective is that so little has changed. In fact, I think in many respects the confusion about what Enterprise Architecture (EA) and Business Architecture are about has gotten worse.

The stress within the EA community continues to grow, as both the demands being placed on it and the diversity of opinion within it increase. This year, I saw a lot more concern about the value proposition for EA, but not a lot of (read “almost no”) convergence on what that value proposition is.

Last year I wrote “As I expected at this time last year, the conventional wisdom about Enterprise Architecture continues to spin its wheels.”  No need to change a word of that. What little progress at the leading edge was made in 2011 seems to have had no effect in 2012. I think this is largely a consequence of the dust thrown in the eyes of the community by the ascendance of the concept of “Business Architecture,” which is still struggling to define itself.  Business Architecture seems to me to have supplanted last year’s infatuation with “enterprise transformation” as the means of compensating for the EA community’s entrenched IT-centric perspective.

I think this trend and the quest for a value proposition are symptomatic of the same thing — the urgent need for Enterprise Architecture to make its case to its stakeholder community, especially to the people who are paying the bills. Something I saw in 2011 that became almost epidemic in 2012 is conflation — the inclusion under the Enterprise Architecture umbrella of nearly anything with the slightest taste of “business” to it. This has had the unfortunate effect of further obscuring the unique contribution of Enterprise Architecture, which is to bring architectural thinking to bear on the design of human enterprise.

So, while I’m not quite mired in the slough of despond, I am discouraged by the community’s inability to advance the state of the art. In a private communication to some colleagues I wrote, “the conventional wisdom on EA is at about the same state of maturity as 14th century cosmology. It is obvious to even the most casual observer that the earth is both flat and the center of the universe. We debate what happens when you fall off the edge of the Earth, and is the flat earth carried on the back of a turtle or an elephant?  Does the walking of the turtle or elephant rotate the crystalline sphere of the heavens, or does the rotation of the sphere require the turtlephant to walk to keep the earth level?  These are obviously the questions we need to answer.”

Cloud

By Chris Harding, Director of Interoperability

2012 has seen the establishment of Cloud Computing as a mainstream resource for enterprise architects and the emergence of Big Data as the latest hot topic, likely to be mainstream for the future. Meanwhile, Service-Oriented Architecture (SOA) has kept its position as an architectural style of choice for delivering distributed solutions, and the move to ever more powerful mobile devices continues. These trends have been reflected in the activities of our Cloud Computing Work Group and in the continuing support by members of our SOA work.

The use of Cloud, Mobile Computing, and Big Data to deliver on-line systems that are available anywhere at any time is setting a new norm for customer expectations. In 2013, we will see the development of Enterprise Architecture practice to ensure the consistent delivery of these systems by IT professionals, and to support the evolution of creative new computing solutions.

IT systems are there to enable the business to operate more effectively. Customers expect constant on-line access through mobile and other devices. Business organizations work better when they focus on their core capabilities, and let external service providers take care of the rest. On-line data is a huge resource, so far largely untapped. Distributed, Cloud-enabled systems, using Big Data, and architected on service-oriented principles, are the best enablers of effective business operations. There will be a convergence of SOA, Mobility, Cloud Computing, and Big Data as they are seen from the overall perspective of the enterprise architect.

Within The Open Group, the SOA and Cloud Work Groups will continue their individual work, and will collaborate with other forums and work groups, and with outside organizations, to foster the convergence of IT disciplines for distributed computing.


#ogChat Summary – 2013 Security Priorities

By Patty Donovan, The Open Group

Totaling 446 tweets, yesterday’s 2013 Security Priorities Tweet Jam (#ogChat) saw a lively discussion on the future of security in 2013 and became our most successful tweet jam to date. In case you missed the conversation, here’s a recap of yesterday’s #ogChat!

The event was moderated by former CNET security reporter Elinor Mills, and there was a total of 28 participants including:

Here is a high-level snapshot of yesterday’s #ogChat:

Q1 What’s the biggest lesson learned by the security industry in 2012? #ogChat

The consensus among participants was that 2012 was a year of going back to the basics. There are many basic vulnerabilities within organizations that still need to be addressed, and it affects every aspect of an organization.

  • @Dana_Gardner Q1 … Security is not a product. It’s a way of conducting your organization, a mentality, affects all. Repeat. #ogChat #security #privacy
  • @Technodad Q1: Biggest #security lesson of 2102: everyone is in two security camps: those who know they’ve been penetrated & those who don’t. #ogChat
  • @jim_hietala Q1. Assume you’ve been penetrated, and put some focus on detective security controls, reaction/incident response #ogChat
  • @c7five Lesson of 2012 is how many basics we’re still not covering (eg. all the password dumps that showed weak controls and pw choice). #ogChat

Q2 How will organizations tackle #BYOD security in 2013? Are standards needed to secure employee-owned devices? #ogChat

Participants debated over the necessity of standards. Most agreed that standards and policies are key in securing BYOD.

  • @arj Q2: No “standards” needed for BYOD. My advice: collect as little information as possible; use MDM; create an explicit policy #ogChat
  • @Technodad @arj Standards are needed for #byod – but operational security practices more important than technical standards. #ogChat
  • @AWildCSO Organizations need to develop a strong asset management program as part of any BYOD effort. Identification and Classification #ogChat
  • @Dana_Gardner Q2 #BYOD forces more apps & data back on servers, more secure; leaves devices as zero client. Then take that to PCs too. #ogChat #security
  • @taosecurity Orgs need a BYOD policy for encryption & remote wipe of company data; expect remote compromise assessment apps too @elinormills #ogChat

Q3 In #BYOD era, will organizations be more focused on securing the network, the device, or the data? #ogChat

There was disagreement here. Some emphasized focusing on protecting data, while others argued that it is the devices and networks that need protecting.

  • @taosecurity Everyone claims to protect data, but the main ways to do so remain protecting devices & networks. Ignores code sec too. @elinormills #ogChat
  • @arj Q3: in the BYOD era, the focus must be on the data. Access is gated by employee’s entitlements + device capabilities. #ogChat
  • @Technodad @arj Well said. Data sec is the big challenge now – important for #byod, #cloud, many apps. #ogChat
  • @c7five Organization will focus more on device management while forgetting about the network and data controls in 2013. #ogChat #BYOD

Q4 What impact will using 3rd party #BigData have on corporate security practices? #ogChat

Participants agreed that using third parties will force organizations to rely on security provided by those parties. They also acknowledged that data must be secure in transit.

  • @daviottenheimer Q4 Big Data will redefine perimeter. have to isolate sensitive data in transit, store AND process #ogChat
  • @jim_hietala Q4. 3rd party Big Data puts into focus 3rd party risk management, and transparency of security controls and control state #ogChat
  • @c7five Organizations will jump into 3rd party Big Data without understanding of their responsibilities to secure the data they transfer. #ogChat
  • @Dana_Gardner Q4 You have to trust your 3rd party #BigData provider is better at #security than you are, eh? #ogChat  #security #SLA
  • @jadedsecurity @Technodad @Dana_Gardner has nothing to do with trust. Data that isn’t public must be secured in transit #ogChat
  • @AWildCSO Q4: with or without bigdata, third party risk management programs will continue to grow in 2013. #ogChat

Q5 What will global supply chain security look like in 2013? How involved should governments be? #ogChat

Participants agreed that supply chains are an emerging security issue and that governments need to get involved. Customers will also start to understand what they are responsible for securing themselves.

  • @jim_hietala Q5. supply chain emerging as big security issue, .gov’s need to be involved, and Open Group’s OTTF doing good work here #ogChat
  • @Technodad Q5: Governments are going to act- issue is getting too important. Challenge is for industry to lead & minimize regulatory patchwork. #ogChat
  • @kjhiggins Q5: Customers truly understanding what they’re responsible for securing vs. what cloud provider is. #ogChat

Q6 What are the biggest unsolved issues in Cloud Computing security? #ogChat

Cloud security is a big issue. Most agreed that Cloud security is opaque and needs to become more transparent. When Cloud providers claim they are secure, consumers and organizations put blind trust in them, making the problem worse.

  • @jadedsecurity @elinormills Q6 all of them. Corps assume cloud will provide CIA and in most cases even fails at availability. #ogChat
  • @jim_hietala Q6. Transparency of security controls/control state, cloud risk management, protection of unstructured data in cloud services #ogChat
  • @c7five Some PaaS cloud providers advertise security as something users don’t need to worry about. That makes the problem worse. #ogChat

Q7 What should be the top security priorities for organizations in 2013? #ogChat

Top security priorities varied. Priorities highlighted in the discussion included: focusing on creating a culture that promotes secure activity; prioritizing security spending based on risk; focusing on where the data resides; and bringing third-party risk management to the forefront. One suggestion – counting and classifying incidents, and measuring time from detection to containment – is illustrated in the sketch after the list.

  • @jim_hietala Q7. prioritizing security spend based on risks, protecting data, detective controls #ogChat
  • @Dana_Gardner Q7 Culture trumps technology and business. So make #security policy adherence a culture that is defined and rewarded. #ogChat #security
  • @kjhiggins Q7 Getting a handle on where all of your data resides, including in the mobile realm. #ogChat
  • @taosecurity Also for 2013: 1) count and classify your incidents & 2) measure time from detection to containment. Apply Lean principles to both. #ogChat
  • @AWildCSO Q7: Asset management, third party risk management, and risk based controls for 2013. #ogChat
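
A toy illustration of that last suggestion (ours, not from the jam – the incident records below are hypothetical):

    from datetime import datetime
    from statistics import mean

    FMT = "%Y-%m-%d %H:%M"
    incidents = [  # hypothetical incident log
        {"id": "INC-001", "detected": "2013-01-03 09:15", "contained": "2013-01-03 17:40"},
        {"id": "INC-002", "detected": "2013-01-19 22:05", "contained": "2013-01-21 08:30"},
    ]

    def hours_to_contain(incident: dict) -> float:
        """Elapsed hours from detection to containment for one incident."""
        start = datetime.strptime(incident["detected"], FMT)
        end = datetime.strptime(incident["contained"], FMT)
        return (end - start).total_seconds() / 3600

    times = [hours_to_contain(i) for i in incidents]
    print(f"incidents: {len(times)}, mean detection-to-containment: {mean(times):.1f} hours")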

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Operational Resilience through Managing External Dependencies

By Ian Dobson & Jim Hietala, The Open Group

These days, organizations are rarely self-contained. Businesses collaborate through partnerships and close links with suppliers and customers. Outsourcing services and business processes, including into Cloud Computing, means that key operations that an organization depends on are often fulfilled outside their control.

The challenge here is how to manage the dependencies your operations have on factors that are outside your control. The goal is to perform your risk management in a way that optimizes operational success by making it resilient to failures in external dependencies.

The Open Group’s Dependency Modeling (O-DM) standard specifies how to construct a dependency model to manage risk and build trust over organizational dependencies between enterprises – and between operational divisions within a large organization. The standard involves constructing a model of the operations necessary for an organization’s success, including the dependencies that can affect each operation. Applying quantitative risk sensitivities to each dependency then reveals the operations with the highest exposure to the risk of not being successful, informing business decision-makers where investment in reducing the organization’s exposure to external risks will give the best return.
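
To give a feel for the approach – this is a toy illustration of the general idea, not the O-DM standard’s actual notation or computation, and all the names and numbers are invented – consider scoring each operation by how sensitive it is to each external dependency and how likely that dependency is to fail:

    # Toy model: sensitivity of each operation to each external dependency (0..1)
    operations = {
        "fulfil-orders":     {"courier-network": 0.9, "payment-gateway": 0.7},
        "invoice-customers": {"payment-gateway": 0.8, "outsourced-billing": 0.6},
    }
    # Hypothetical probability that each dependency fails over the planning period
    failure_risk = {"courier-network": 0.05, "payment-gateway": 0.02, "outsourced-billing": 0.10}

    # Exposure of an operation = sum of (sensitivity x dependency failure risk)
    exposure = {
        op: sum(sens * failure_risk[dep] for dep, sens in deps.items())
        for op, deps in operations.items()
    }
    for op, score in sorted(exposure.items(), key=lambda kv: -kv[1]):
        print(f"{op}: exposure {score:.3f}")  # highest first: invest here for best return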

O-DM helps you to plan for success through operational resilience, assured business continuity, and effective new controls and contingencies, enabling you to:

  • Cut costs without losing capability
  • Make the most of tight budgets
  • Build a resilient supply chain
  • Lead programs and projects to success
  • Measure, understand and manage risk from outsourcing relationships and supply chains
  • Deliver complex event analysis

The O-DM analytical process facilitates organizational agility by allowing you to easily adjust and evolve your organization’s operations model, and produces rapid results to illustrate how reducing the sensitivity of your dependencies improves your operational resilience. O-DM also allows you to drill as deep as you need to go to reveal your organization’s operational dependencies.

Training in the development of operational dependency models conforming to the O-DM standard is available, as are software computation tools that automate speedy delivery of actionable results in graphic formats to facilitate informed business decision-making.

The O-DM standard represents a significant addition to our existing Open Group Risk Management publications:

The O-DM standard may be accessed here.

Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world.  In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Questions for the Upcoming 2013 Security Priorities Tweet Jam – Dec. 11

By Patty Donovan, The Open Group

Last week, we announced our upcoming tweet jam on Tuesday, December 11 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. GMT, which will examine the topic of IT security and what is in store for 2013.

Please join us next Tuesday, December 11! The discussion will be moderated by Elinor Mills (@elinormills), former CNET security reporter, and we welcome Open Group members and interested participants from all backgrounds to join the session. Our panel of experts will include:

The discussion will be guided by these seven questions:

  1. What’s the biggest lesson learned by the security industry in 2012? #ogChat
  2. How will organizations tackle #BYOD security in 2013? Are standards needed to secure employee-owned devices? #ogChat
  3. In #BYOD era, will organizations be more focused on securing the network, the device, or the data? #ogChat
  4. What impact will using 3rd party #BigData have on corporate security practices? #ogChat
  5. What will global supply chain security look like in 2013? How involved should governments be? #ogChat
  6. What are the biggest unsolved issues in cloud computing security? #ogChat
  7. What should be the top security priorities for organizations in 2013? #ogChat

To access the discussion, please follow the #ogChat hashtag during the allotted discussion time. Other hashtags we recommend you use during the event include:

  • Information Security: #InfoSec
  • Security: #security
  • BYOD: #BYOD
  • Big Data: #BigData
  • Privacy: #privacy
  • Mobile: #mobile
  • Supply Chain: #supplychain

For more information about the tweet jam topic (security), guidelines and general background information on the event, please visit our previous blog post: http://blog.opengroup.org/2012/11/26/2013-security-priorities-tweet-jam/

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com), or leave a comment below. We anticipate a lively chat and hope you will be able to join us!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


The Open Group Newport Beach Conference – Early Bird Registration Ends January 4

By The Open Group Conference Team

The Open Group is busy gearing up for the Newport Beach Conference. Taking place January 28-31, 2013, the conference theme is “Big Data – The Transformation We Need to Embrace Today” and will bring together leading minds in technology to discuss the challenges and solutions facing Enterprise Architecture around the growth of Big Data. Register today!

Information is power, and we stand at a time when 90% of the data in the world today was generated in the last two years alone. Despite the sheer enormity of the task, off-the-shelf hardware, open-source frameworks and the processing capacity of the Cloud mean that Big Data processing is within the cost-effective grasp of the average business. Organizations can now initiate Big Data projects without significant investment in IT infrastructure.

In addition to tutorial sessions on TOGAF® and ArchiMate®, the conference offers roughly 60 sessions on a variety of topics, including:

  • The ways that Cloud Computing is transforming the possibilities for collecting, storing, and processing big data.
  • How to contend with Big Data in your Enterprise?
  • How does Big Data enable your Business Architecture?
  • What does the Big Data revolution mean for the Enterprise Architect?
  • Real-time analysis of Big Data in the Cloud.
  • Security challenges in the world of outsourced data.
  • What is an architectural view of Security for the Cloud?

Plenary speakers include:

  • Christian Verstraete, Chief Technologist – Cloud Strategy, HP
  • Mary Ann Mezzapelle, Strategist – Security Services, HP
  • Michael Cavaretta, Ph.D, Technical Leader, Predictive Analytics / Data Mining Research and Advanced Engineering, Ford Motor Company
  • Adrian Lane, Analyst and Chief Technical Officer, Securosis
  • David Potter, Chief Technical Officer, Promise Innovation Oy
  • Ron Schuldt, Senior Partner, UDEF-IT, LLC

A full conference agenda is available here. Tracks include:

  • Architecting Big Data
  • Big Data and Cloud Security
  • Data Architecture and Big Data
  • Business Architecture
  • Distributed Services Architecture
  • EA and Disruptive Technologies
  • Architecting the Cloud
  • Cloud Computing for Business

Early Bird Registration

Early Bird registration for The Open Group Conference in Newport Beach ends January 4. Register now and save! For more information or to register: http://www.opengroup.org/event/open-group-newport-beach-2013/reg

Upcoming Conference Submission Deadlines

In addition to the Early Bird registration deadline to attend the Newport Beach conference, there are upcoming deadlines for speaker proposal submissions to Open Group conferences in Sydney, Philadelphia and London. To submit a proposal to speak, click here.

  • Sydney (April 15-17) – Industry focus: Finance, Defense, Mining – Submission deadline: January 18, 2013
  • Philadelphia (July 15-17) – Industry focus: Healthcare, Finance, Defense – Submission deadline: April 5, 2013
  • London (October 21-23) – Industry focus: Finance, Government, Healthcare – Submission deadline: July 8, 2013

We expect space on the agendas of these events to be at a premium, so it is important for proposals to be submitted as early as possible. Proposals received after the deadline dates will still be considered, if space is available; if not, they may be carried over to a future conference. Priority will be given to proposals received by the deadline dates and to proposals that include an end-user organization, at least as a co-presenter.


Data Protection Today and What’s Needed Tomorrow

By Ian Dobson and Jim Hietala, The Open Group

Technology today allows thieves to copy sensitive data, leaving the original in place and thus avoiding detection. One needn’t look far in today’s headlines to understand why protection of data is critical going forward. As this recent article from Bloomberg points out, penetrations of corporate IT systems with the aim of extracting sensitive information, IP and other corporate data are rampant. Despite the existence of data breach and data privacy laws in the U.S., EU and elsewhere, this issue is still not well publicized. The article cites specific intrusions at large consumer-products companies, the EU itself, law firms and a nuclear power plant.

Published in October 2012, the Jericho Forum® Data Protection white paper reviews the state of data protection today and where it should be heading to meet tomorrow’s business needs. The Open Group’s Jericho Forum contends that future data protection solutions must aim to provide stronger, more flexible protection mechanisms around the data itself.

The white paper argues that some of the current issues with data protection are:

  • It is too global and remote to be effective
  • Protection is neither granular nor interoperable enough
  • It’s not integrated with Centralized Authorization Services
  • Weak security services are relied on for enforcement

Refreshingly, it explains not only why, but also how. The white paper reviews the key issues surrounding data protection today; describes properties that data protection mechanisms should include to meet current and future requirements; considers why current technologies don’t deliver what is required; and proposes a set of data protection principles to guide the design of effective solutions.

It goes on to describe how data protection has evolved to where it is today, and outlines a series of target stages for progressively moving the industry forward to deliver the stronger, more flexible protection solutions that business managers are already demanding their IT systems managers provide. Businesses require these solutions to ensure that appropriate data protection levels are wrapped around the rapidly increasing volumes of confidential information shared with their business partners, suppliers, customers and outworkers/contractors on a daily basis.
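
To illustrate the direction of travel – protection wrapped around the data itself, with the policy travelling alongside it – here is a deliberately simplified sketch (ours, not a mechanism from the white paper). It uses the open-source Python “cryptography” package, and the policy format is invented for illustration:

    import json
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # in a real design, issued by a central authorization service
    f = Fernet(key)

    # Policy and ciphertext are bound into one object that travels together
    record = {
        "policy": {"classification": "confidential", "share-with": ["partner-a"]},
        "payload": f.encrypt(b"2013 acquisition shortlist").decode("ascii"),
    }
    packet = json.dumps(record)

    # A receiver checks the policy before obtaining the key and decrypting
    received = json.loads(packet)
    if "partner-a" in received["policy"]["share-with"]:
        print(f.decrypt(received["payload"].encode("ascii")).decode("utf-8"))

The point is not the particular cipher, but that the granularity of protection follows the data wherever it goes, rather than relying on the strength of whichever perimeter happens to surround it.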

Having mapped out an evolutionary path for moving data protection forward in the direction our industry needs, we are now planning the best approaches for achieving each successive stage of protection. The Jericho Forum welcomes anyone who wants to join us on this important journey.

Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating members' efforts to achieve their goals in our challenging information security world. In the Security Forum, his focus is on supporting the development of open standards and guides on security architectures and the management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in the future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Call for Submissions

By Patty Donovan, The Open Group

The Open Group Blog is celebrating its second birthday this month! Over the past two years, our blog posts have tended to cover Open Group activities: conferences, announcements, our lovely members, etc. While several members and Open Group staff serve as regular contributors, we'd like to take this opportunity to invite our community members to share their thoughts and expertise as guest contributors on topics related to The Open Group's areas of focus.

Here are a few examples of popular guest blog posts that we've received over the past year.

Blog posts generally run between 500 and 800 words and address topics relevant to The Open Group workgroups, forums, consortiums and events. Some suggested topics are listed below.

  • ArchiMate®
  • Big Data
  • Business Architecture
  • Cloud Computing
  • Conference recaps
  • DirectNet
  • Enterprise Architecture
  • Enterprise Management
  • Future of Airborne Capability Environment (FACE™)
  • Governing Board Businesses
  • Governing Board Certified Architects
  • Governing Board Certified IT Specialists
  • Identity Management
  • IT Security
  • The Jericho Forum
  • The Open Group Trusted Technology Forum (OTTF)
  • Quantum Lifecycle Management
  • Real-Time Embedded Systems
  • Semantic Interoperability
  • Service-Oriented Architecture
  • TOGAF®

If you have any questions or would like to contribute, please contact opengroup (at) bateman-group.com.

Please note that all content submitted to The Open Group blog is subject to The Open Group approval process. The Open Group reserves the right to deny publication of any contributed works. Anything published shall be copyright of The Open Group.

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Optimizing ISO/IEC 27001 Using O-ISM3

By Jim Hietala, The Open Group and Vicente Aceituno, Sistemas Informáticos Abiertos

The Open Group has just published a guide titled “Optimizing ISO/IEC 27001 using O-ISM3” that will be of interest to organizations using ISO27001/27002 as their Information Security Management System (ISMS).

By way of background, The Open Group published the Open Information Security Management Maturity Model (O-ISM3) last year. O-ISM3 brings continuous improvement to information security management, and it provides a top-down framework for security decision-making in which security controls, security objectives and spending decisions are driven by, and aligned with, business objectives.

For some time now, information security managers have told us they would like a resource showing how the O-ISM3 standard can be used to manage information security alongside ISO27001/27002. This new guide provides specific guidance on exactly that.

We view this as an important resource, for the following reasons:

  • O-ISM3 complements ISO27001/2 by adding the “how” dimension to information security management
  • O-ISM3 uses a process-oriented approach, defining inputs and outputs and allowing each process to be evaluated by process-specific metrics (sketched in code after this list)
  • O-ISM3 provides a framework for continuous improvement of information security processes
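
As a rough sketch of what that process orientation looks like in practice, the following Python fragment models a security process that declares its inputs and outputs and is scored by a process-specific metric. The patch-management example, its inputs and outputs, and its metric are invented for illustration; they are not definitions taken from the O-ISM3 standard.

    # A sketch of O-ISM3's process orientation: every security process declares
    # its inputs and outputs and is evaluated by a process-specific metric.
    # The patch-management example below is an illustrative placeholder, not a
    # definition taken from the O-ISM3 standard.
    from dataclasses import dataclass

    @dataclass
    class SecurityProcess:
        name: str
        inputs: list[str]
        outputs: list[str]

        def attainment(self, achieved: int, target: int) -> float:
            # Process-specific metric: fraction of the agreed target achieved.
            return achieved / target if target else 0.0

    patching = SecurityProcess(
        name="Patch management",
        inputs=["vendor advisories", "asset inventory"],
        outputs=["patched systems", "exception reports"],
    )

    # e.g., 92 of 100 in-scope systems patched within the agreed window
    print(f"{patching.name}: {patching.attainment(92, 100):.0%}")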

This resource:

  • Maps O-ISM3 and ISO27001 security objectives
  • Maps ISO27001/27002 controls and documents to O-ISM3 security processes, documents, and outputs
  • Provides a critical linkage between the controls-based approach of ISO27001 and the process-based approach of O-ISM3 (see the coverage-check sketch after this list)
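
One simple use of such a mapping, sketched below in Python, is coverage checking: for each ISO27001 control, list the O-ISM3 processes that implement it and flag any control with no process behind it. The control references and process labels in this fragment are placeholders for illustration; the actual mapping is the one published in the guide.

    # A toy illustration of putting a control-to-process mapping to work: list
    # the O-ISM3 processes behind each ISO27001 control and flag controls that
    # no process currently covers. Every entry below is an invented placeholder;
    # the guide itself contains the actual mapping.
    CONTROL_TO_PROCESSES = {
        "A.10.4.1 Controls against malicious code": ["OSP-malware (placeholder)"],
        "A.11.2.1 User registration": ["OSP-access (placeholder)"],
        "A.13.2.1 Responsibilities and procedures": [],  # unmapped: a coverage gap
    }

    def coverage_gaps(mapping: dict[str, list[str]]) -> list[str]:
        # Controls with no O-ISM3 process behind them deserve investigation.
        return [control for control, processes in mapping.items() if not processes]

    for control in coverage_gaps(CONTROL_TO_PROCESSES):
        print(f"no process mapped for: {control}")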

If you are interested in information security management, we encourage you to have a look at Optimizing ISO/IEC 27001 using O-ISM3. The guide may be downloaded (at no cost, minimal registration required) here.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Vicente Aceituno, CISA, has 20 years’ experience in the field of IT and information security. During his career in Spain and the UK, he has worked for companies including Coopers & Lybrand, BBC News, Everis, and SIA Group. He is the main author of the Information Security Management Method ISM3, author of the information security book “Seguridad de la Información,” Director of the ISM3 Consortium (www.ism3.com) and President of the Spanish chapter of the ISSA.
