
New Health Data Deluges Require Secure Information Flow Enablement Via Standards, Says The Open Group’s New Healthcare Director

By The Open Group

Below is the transcript of The Open Group podcast on how new devices and practices have the potential to expand the information available to Healthcare providers and facilities.

Listen to the podcast here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview coming to you in conjunction with The Open Group’s upcoming event, Enabling Boundaryless Information Flow™ July 21-22, 2014 in Boston.

Gardner: I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator for the series of discussions from the conference on Boundaryless Information Flow, Open Platform 3.0™, Healthcare, and Security issues.

One area of special interest is the Healthcare arena, and Boston is a hotbed of innovation and adaptation for how technology, Enterprise Architecture, and standards can improve communication and collaboration among Healthcare ecosystem players.

And so, we’re joined by a new Forum Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry to rapid change.

With that, please join me now in welcoming our guest. We’re here with Jason Lee, Healthcare and Security Forums Director at The Open Group. Welcome, Jason.

Jason Lee: Thank you so much, Dana. Good to be here.

Gardner: Great to have you. I’m looking forward to the Boston conference and want to remind our listeners and readers that it’s not too late to sign up. You can learn more at http://www.opengroup.org.

Jason, let’s start by talking about the relationship between Boundaryless Information Flow, which is a major theme of the conference, and healthcare. Healthcare is perhaps the killer application for Boundaryless Information Flow.

Lee: Interesting, I haven’t heard it referred to that way, but healthcare is 17 percent of the US economy. It’s upwards of $3 trillion. The costs of healthcare are a problem, not just in the United States, but all over the world, and there are a great number of inefficiencies in the way we practice healthcare.

We don’t necessarily intend to be inefficient, but there are so many places and people involved in healthcare, it’s very difficult to get them to speak the same language. It’s almost as if you’re in a large house with lots of different rooms, and every room you walk into they speak a different language. To get information to flow from one room to the other requires some active efforts and that’s what we’re undertaking here at The Open Group.

Gardner: What is it about the current collaboration approaches that don’t work? Obviously, healthcare has been around for a long time and there have been different players involved. What’s the hurdle? What prevents a nice, seamless, easy flow and collaboration in information that gets better outcomes? What’s the holdup?

Lee: There are many ways to answer that question, because there are many barriers. Perhaps the simplest is the transformation of healthcare from a paper-based industry to a digital industry. Everyone has walked into an office, looked behind the people at the front desk, and seen file upon file and row upon row of folders, information that’s kept in a written format.

When there’s been movement toward digitizing that information, not everyone has used the same system. It’s almost like trains running on different gauges of track. Obviously, if the track going east to west is a different gauge than the track going north to south, then trains aren’t going to be able to travel on both. In the same way, healthcare information does not flow easily from one office to another or from one provider to another.

Gardner: So not only do we have disparate strategies for collecting and communicating health data, but we’re also seeing much larger amounts of data coming from a variety of new and different places. Some of them now even involve sensors inside of patients themselves or devices that people will wear. So is the data deluge, the volume, also an issue here?

Lee: Certainly. I heard recently that an integrated health plan, which has multiple hospitals involved, contains more elements of data than the Library of Congress. As information is collected at multiple points in time, over a relatively short period of time, you really do have a data deluge. Figuring out how to find your way through all the data and look at the most relevant for the patient is a great challenge.

Gardner: I suppose the bad news is that there is this deluge of data, but it’s also good news, because more data means more opportunity for analysis, a better ability to predict and determine best practices, and also provide overall lower costs with better patient care.

So it seems like the stakes are rather high here to get this right, to not just crumble under a volume or an avalanche of data, but to master it, because it’s perhaps the future. The solution is somewhere in there too.

Lee: No question about it. At The Open Group, our focus is on solutions. We, like others, put a great deal of effort into describing the problems, but our real work is figuring out how to bring IT technologies to bear on business problems, how to encourage different parts of organizations to speak to one another and across organizations to speak the same language, and how to operate using common standards and language. That’s really what we’re all about.

And it is, in a large sense, part of the process of helping to bring healthcare into the 21st Century. A number of industries are a couple of decades ahead of healthcare in the way they use large datasets — big data, some people refer to it as. I’m talking about companies like big department stores and large online retailers. They really have stepped up to the plate and are using that deluge of data in ways that are very beneficial to them, and healthcare can do the same. We’re just not quite at the same level of evolution.

Gardner: And to your point, the stakes are so much higher. Retail is, of course, a big deal in the economy, but as you pointed out, healthcare is a much larger segment. So just making modest improvements in communication, collaboration, or data analysis can reap huge rewards.

Lee: Absolutely true. There is the cost side of things, but there is also the quality side. So there are many ways in which healthcare can improve through standardization and coordinated development, using modern technology that can not only reduce cost but improve quality at the same time.

Gardner: I’d like to get into a few of the hotter trends, but before we do, it seems that The Open Group has recognized the importance here by devoting the entire second day of its conference in Boston, July 22, to Healthcare.

Maybe you could give us a brief overview of what participants, and even those who come in online and view recorded sessions of the conference at http://new.livestream.com/opengroup, should expect. What’s going to happen on July 22?

Lee: We have a packed day. We’re very excited to have Dr. Joe Kvedar, a physician at Partners HealthCare and Founding Director of the Center for Connected Health, as our first plenary speaker. The title of his presentation is “Making Health Additive.” Dr. Kvedar is a widely respected expert on mobile health, which is currently the Healthcare Forum’s top work priority. As mobile medical devices become ever more available and diversified, they will enable consumers to know more about their own health and wellness. A great deal of potentially useful health data will be generated. How this information can be used, not just by consumers but also by the healthcare establishment that takes care of them as patients, will become a question of increasing importance. It will become an area where standards development and The Open Group can be very helpful.

Our second plenary speaker, Proteus Duxbury, Chief Technology Officer at Connect for Health Colorado, will discuss a major feature of the Affordable Care Act, the health insurance exchanges, which are designed to bring health insurance to tens of millions of people who previously did not have access to it. Mr. Duxbury is going to talk about how Enterprise Architecture, which is really about getting to solutions by helping the IT folks talk to the business folks and vice versa, has helped the State of Colorado develop its Health Insurance Exchange.

After the plenaries, we will break into three tracks, one of which is Healthcare-focused. In this track there will be three presentations, all of which discuss how Enterprise Architecture and the approach to Boundaryless Information Flow can help healthcare and healthcare decision-makers become more effective and efficient.

One presentation will focus on the transformation of care delivery at the Visiting Nurse Service of New York. Another will address stewarding healthcare transformation using Enterprise Architecture, focusing on one of our Platinum members, Oracle, and a company called Intelligent Medical Objects, and how they’re working together in a productive way, bringing IT and healthcare decision-making together.

Then, the final presentation in this track will focus on the development of an Enterprise Architecture-based solution at an insurance company. The payers, or the insurers, the big companies that are responsible for paying bills and collecting premiums, have a very important role in the healthcare system that extends beyond administration of benefits. Yet payers are not always recognized for their key responsibilities and capabilities in the area of clinical improvements and cost improvements.

With the increase in payer data brought on in large part by the adoption of a new coding system, the ICD-10, which will come online this year, a huge amount of additional data, including clinical data, will become available. At The Open Group, we consider payers, the health insurance companies (some of which are integrated with providers), as very important stakeholders in the big picture.

In the afternoon, we’re going to switch gears a bit and have a speaker talk about the challenges, the barriers, the “pain points” in introducing new technology into healthcare systems. The focus will return to remote or mobile medical devices and the predictable but challenging barriers to getting newly generated health information to flow to doctors’ offices and into patients’ records, electronic health records, and hospitals’ data-keeping and data-sharing systems.

We’ll have a panel of experts that responds to these pain points, these challenges, and then we’ll draw heavily from the audience, who we believe will be very, very helpful, because they bring a great deal of expertise in guiding us in our work. So we’re very much looking forward to the afternoon as well.

Gardner: It’s really interesting. A couple of these different plenaries and discussions in the afternoon come back to this user-generated data. Jason, we really seem to be on the cusp of a whole new level of information that people will be able to generate about themselves through their lifestyles and new connected devices.

We hear from folks like Apple, Samsung, Google, and Microsoft. They’re all pulling together information and making it easier for people to not only monitor their exercise, but their diet, and maybe even start to use sensors to keep track of blood sugar levels, for example.

In fact, a new Flurry Analytics survey showed a 62 percent increase in the use of health and fitness applications on popular mobile devices over the last six months, compared to a 33 percent increase for applications in general. So health and fitness applications are being adopted roughly 87 percent faster (62 is about 1.9 times 33).

Tell me a little bit how you see this factoring in. Is this a mixed blessing? Will so much data generated from people in addition to the electronic medical records, for example, be a bad thing? Is this going to be a garbage in, garbage out, or is this something that could potentially be a game-changer in terms of how people react to their own data and then bring more data into the interactions they have with care providers?

Lee: It’s always a challenge to predict what the market is going to do, but I think that’s a remarkable statistic that you cited. My prediction is that the increased volume of person-generated data from mobile health devices is going to be a game-changer. This view also reflects how the Healthcare Forum members (who include representatives from Capgemini, Philips, IBM, Oracle and HP) view the future.

The commercial demand for mobile medical devices, things that can be worn, embedded, or swallowed, as in pills, as you mentioned, is growing ever stronger. The software and the applications that will be developed to be used with these devices are going to grow by leaps and bounds. As you say, there are big players getting involved. Already some of the pedometer-type devices that measure the number of steps taken in a day have captured the interest of many, many people. Even David Sedaris, serious guy that he is, was writing about it recently in The New Yorker.

What we will find is that many of the health indicators that we used to have to go to the doctor or nurse or lab to get information on will become available to us through these remote devices.

There will be a question, of course, as to the reliability and validity of the information, to your point about garbage in, garbage out, but I think standards development will help here. This, again, is where The Open Group comes in. We might also see the FDA exercising its role in ensuring safety here, as well as other organizations, in determining which devices are reliable.

The Open Group is working in the area of mobile data and the information systems that are developed around it, and their ability to (a) talk to one another and (b) talk to the data devices and infrastructure used in doctors’ offices and in hospitals. This is called interoperability, and it’s certainly lacking in this country.

There are already problems around interoperability and connectivity of information in the healthcare establishment as it is now. When patients and consumers start collecting their own data, and the patient is put at the center of the nexus of healthcare, then the question becomes how does that information that patients collect get back to the doctor/clinician in ways in which the data can be trusted and where the data are helpful?

After all, if a patient is wearing a medical device, there is the opportunity to collect data, about blood sugar levels let’s say, throughout the day. And this is really taking healthcare outside of the four walls of the clinic and bringing information to bear that can be very, very useful to clinicians and beneficial to patients.
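
To make the interoperability point concrete, here is a minimal sketch, in Python, of what one such device reading might look like when expressed in a standards-based format. It borrows the structure of an HL7 FHIR Observation resource, one widely used healthcare data standard; the patient reference and device name are hypothetical placeholders, and this is an illustration rather than a complete FHIR implementation.

    import json
    from datetime import datetime, timezone

    # One blood-glucose reading from a wearable device, shaped as an HL7
    # FHIR-style Observation so any standards-aware system can interpret it.
    # The patient reference and device name are hypothetical placeholders.
    observation = {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            # LOINC is the standard vocabulary used to identify measurements;
            # 2339-0 is "Glucose [Mass/volume] in Blood".
            "coding": [{"system": "http://loinc.org", "code": "2339-0",
                        "display": "Glucose [Mass/volume] in Blood"}]
        },
        "subject": {"reference": "Patient/example-123"},
        "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
        "device": {"display": "Example wearable glucose monitor"},
        "valueQuantity": {"value": 95, "unit": "mg/dL",
                          "system": "http://unitsofmeasure.org",
                          "code": "mg/dL"},
    }

    # The serialized form could be sent to any FHIR-capable clinical system.
    print(json.dumps(observation, indent=2))

Because every field is drawn from a shared vocabulary, a receiving hospital system does not need to know anything about the particular device that produced the reading, which is exactly the gauge-matching problem described earlier.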

In short, the rapid market dynamic in mobile medical devices, and in the software and hardware that facilitate interoperability, begs for standards-based solutions that reduce costs and improve quality, all while putting the patient at the center. This is The Open Group Healthcare Forum’s sweet spot.

Gardner: It seems to me a real potential game-changer as well, and that something like Boundaryless Information Flow and standards will play an essential role. Because one of the big question marks with many of the ailments in a modern society has to do with lifestyle and behavior.

So often, the providers of the care only really have the patient’s responses to questions, but imagine having a trove of data at their disposal, a 360-degree view of the patient to then further the cause of understanding what’s really going on, on a day-to-day basis.

But then, it’s also having a two-way street: being able to deliver, perhaps in an automated fashion, reinforcements, incentives, and information back to the patient in real time about behavior and lifestyle. So it strikes me as something quite promising, and I look forward to hearing more about it at the Boston conference.

Any other thoughts on this issue about patient flow of data, not just among and between providers and payers, for example, or providers in an ecosystem of care, but with the patient as the center of it all, as you said?

Lee: As more mobile medical devices come to the market, we’ll find that consumers own multiple types of devices, at least some of which collect multiple types of data. So even for the patient, at the center of their own healthcare information collection, there can be barriers to having one device talk to another. If a patient wants to keep their own personal health record, there may be difficulties in bringing all that information into one place.

So the interoperability issue, the need for standards, guidelines, and voluntary consensus among stakeholders about how information is represented becomes an issue, not just between patients and their providers, but for individual consumers as well.

Gardner: And also the cloud providers. There will be a variety of large organizations with cloud-modeled services, and they are going to need to be, in some fashion, brought together, so that a complete 360-degree view of the patient is available when needed. It’s going to be an interesting time.

Of course, we’ve also looked at many other industries and tried to have a cloud synergy, a cloud-of-clouds approach to data and also to transactions. So it’s interesting how what’s going on in multiple industries is common, but it strikes me that, again, the scale and the impact of the healthcare industry make it a leader now, and perhaps a driver for some of these long-overdue standardization activities.

Lee: It could become a leader. There is no question about it. Moreover, there is a lot Healthcare can learn from other industries: from the mistakes they have made, the lessons they have learned, and the best practices they have developed (on both the content and process sides). And there are issues, around security in particular, where Healthcare will be at the leading edge in trying to figure out how much is enough, how much is too much, and what kinds of solutions work.

There’s a great future ahead here. It’s not going to be without bumps in the road, but organizations like The Open Group are designed, and experienced enough, to help multiple stakeholders come together and have the conversations that they need to have in order to push forward and solve some of these problems.

Gardner: Well, great. I’m sure there will be a lot more about how to actually implement some of those activities at the conference. Again, that’s going to be in Boston, beginning on July 21, 2014.

We’ll have to leave it there. We’re about out of time. We’ve been talking with a new Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry to rapid change. And, as we’ve heard, that might very well spill over into other industries as well.

So we’ve seen how innovation and adaptation around technology, Enterprise Architecture and standards can improve the communication and collaboration among Healthcare ecosystem players.

It’s not too late to register for The Open Group Boston 2014 (http://www.opengroup.org/boston2014) and join the conversation via Twitter #ogchat #ogBOS, where you will be able to learn more about Boundaryless Information Flow, Open Platform 3.0, Healthcare and other relevant topics.

So a big thank you to our guest. We’ve been joined by Jason Lee, Healthcare and Security Forums Director at The Open Group. Thanks so much, Jason.

Lee: Thank you very much.


The Power of APIs – Join The Open Group Tweet Jam on Wednesday, July 9th

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The face of technology is evolving at breakneck speed, driven by demand from consumers and businesses alike for more robust, intuitive and integrated service offerings. APIs (application programming interfaces) have made this possible by offering greater interoperability between otherwise disparate software and hardware systems. While there are clear benefits to their use, how do today’s security- and value-conscious enterprises take advantage of this new interoperability without exposing themselves?

On Wednesday, July 9th at 9:00 am PT/12:00 pm ET/5:00 pm GMT, please join us for a tweet jam that will explore how APIs are changing the face of business today, and how to prepare for their implementation in your enterprise.

APIs are at the heart of how today’s technologies communicate with one another, and have been influential in enabling new levels of development for social, mobility and beyond. The business benefits of APIs are endless, as are the opportunities to explore how they can be effectively used and developed.

There is reason to maintain a certain level of caution, however, as recent security issues involving open APIs have impacted overall confidence and sustainability.

This tweet jam will look at the business benefits of APIs, as well as potential vulnerabilities and weak points that you should be wary of when integrating them into your Enterprise Architecture.
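
As a concrete illustration of the kind of weak points the jam will discuss, the sketch below shows two baseline controls an enterprise might apply when exposing an API endpoint: authenticating callers and validating input before acting on it. It is a minimal, hedged Python sketch; the environment-variable name and record-identifier format are hypothetical, and this illustrates the principle rather than a recommended production design.

    import hmac
    import os
    import re

    # Illustrative only: two baseline controls for an exposed API --
    # authenticate the caller, then validate input before acting on it.
    API_KEY = os.environ.get("EXAMPLE_API_KEY", "")  # never hard-code secrets

    def is_authorized(presented_key: str) -> bool:
        # Constant-time comparison avoids leaking the key through timing.
        return bool(API_KEY) and hmac.compare_digest(presented_key, API_KEY)

    RECORD_ID = re.compile(r"[A-Za-z0-9-]{1,36}")  # whitelist, not blacklist

    def handle_request(presented_key: str, record_id: str) -> str:
        if not is_authorized(presented_key):
            return "401 Unauthorized"   # fail closed
        if not RECORD_ID.fullmatch(record_id):
            return "400 Bad Request"    # reject malformed input early
        return "200 OK: record " + record_id + " would be fetched here"

    print(handle_request("wrong-key", "abc-123"))  # -> 401 Unauthorized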

We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought-leaders from The Open Group, including Jason Lee, Healthcare and Security Forums Director; Jim Hietala, Vice President of Security; David Lounsbury, CTO; and Dr. Chris Harding, Director for Interoperability and Director of the Open Platform 3.0™ Forum. To access the discussion, please follow the hashtag #ogchat during the allotted discussion time.

Interested in joining The Open Group Security Forum? Register your interest here.

What Is a Tweet Jam?

A tweet jam is a 45-minute “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone on Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Here are some helpful guidelines for taking part in the tweet jam:

  • Please introduce yourself (name, title and organization)
  • Use the hashtag #ogchat following each of your tweets
  • Begin your tweets with the question number to which you are responding
  • Please refrain from individual product/service promotions – the goal of the tweet jam is to foster an open and informative dialogue
  • Keep your commentary focused, thoughtful and on-topic

If you have any questions prior to the event or would like to join as a participant, please contact George Morin (@GMorin81 or george.morin@hotwirepr.com).

We look forward to a spirited discussion and hope you will be able to join!


3 Steps to Proactively Address Board-Level Security Concerns

By E.G. Nadhan, HP

Last month, I shared the discussions that ensued in a Tweet Jam conducted by The Open Group on Big Data and Security, where the key takeaway was: protecting data is good; protecting information generated from Big Data is priceless. Security concerns around Big Data have grown to the point where they are now a board-level concern, as explained in this article in ComputerworldUK. Enterprises must address board-level concerns proactively, and to do so they must provide the business justification for the proactive steps required.


At The Open Group Conference in Sydney in April, the session “Which information risks are shaping our lives?” by Stephen Singam, Chief Technology Officer, HP Enterprise Security Services, Australia, provided great insight into this topic. In that session, Singam analyzed current and emerging information risks and recommended a proactive approach to addressing them head-on with adversary-centric solutions.

The three steps that enterprises must take to proactively address security concerns are below:

Computing the cost of cyber-crime

The HP Ponemon 2012 Cost of Cyber Crime Study revealed that cyber attacks have more than doubled over a three-year period, with the financial impact increasing by nearly 40 percent. Here are the key takeaways from this research:

  • Cyber-crimes continue to be costly. The average annualized cost of cyber-crime for 56 organizations is $8.9 million per year, with a range of $1.4 million to $46 million.
  • Cyber attacks have become common occurrences. Companies experienced 102 successful attacks per week and 1.8 successful attacks per company per week in 2012.
  • The most costly cyber-crimes are those caused by denial of service, malicious insiders and web-based attacks.

When computing the cost of cyber-crime, enterprises must address the direct, indirect and opportunity costs that result from the loss or theft of information, disruption to business operations, revenue loss and destruction of property, plant and equipment. The following phases of combating cyber-crime must also be factored in to comprehensively determine the total cost (a simple cost model is sketched after this list):

  1. Detection of patterns of behavior indicating an impending attack through sustained monitoring of the enabling infrastructure
  2. Investigation of the security violation upon occurrence to determine the underlying root cause and take appropriate remedial measures
  3. Incident response to address the immediate situation at hand, communicate the incidence of the attack, and raise all applicable alerts
  4. Containment of the attack by controlling its proliferation across the enterprise
  5. Recovery from the damages incurred as a result of the attack to ensure ongoing business operations based upon the business continuity plans in place
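
To see how these pieces combine, here is a minimal Python sketch of a total-cost calculation along the lines described above. All figures are hypothetical placeholders for illustration, not data from the Ponemon study.

    # Hypothetical figures only: a minimal model of the total cost of one
    # cyber-crime incident, combining direct, indirect and opportunity costs
    # with the cost of each phase of combating the attack.
    incident_costs = {
        "direct": 250_000,       # e.g., forensics, legal fees, notification
        "indirect": 120_000,     # e.g., staff time diverted from operations
        "opportunity": 400_000,  # e.g., revenue lost while systems were down
    }

    phase_costs = {
        "detection": 60_000,
        "investigation": 90_000,
        "incident_response": 45_000,
        "containment": 30_000,
        "recovery": 150_000,
    }

    total = sum(incident_costs.values()) + sum(phase_costs.values())
    print(f"Total cost of incident: ${total:,}")  # $1,145,000 with these inputs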

Identifying proactive steps that can be taken to address cyber-crime

  1. “Better get security right,” says HP Security Strategist Mary Ann Mezzapelle in her keynote on Big Data and Security at The Open Group Conference in Newport Beach. Asserting that proactive risk management is the most effective approach, Mezzapelle challenged enterprises to proactively question the presence of shadow IT, data ownership, usage of security tools and standards while taking a comprehensive approach to security end-to-end within the enterprise.
  2. In this ZDNet article, Art Gilliland suggested learning from cyber criminals and understanding their methods, since the very frameworks enterprises strive to comply with (such as ISO and PCI) set a low bar for security that adversaries capitalize on.
  3. Andy Ellis discussed managing risk with psychology instead of brute force in his keynote at the 2013 RSA Conference.
  4. At the same conference, in another keynote, world-renowned game designer and inventor of SuperBetter, Jane McGonigal suggested that applying the “collective intelligence” that gaming generates can combat security concerns.
  5. In this interview, Bruce Schneier, renowned security guru and author of several books including Liars & Outliers, suggested, “Bad guys are going to invent new stuff — whether we want them to or not.” Should we take a cue from Hollywood and consider the inception of an OODA loop into the security hacker’s mind?

The balancing act

Can enterprises afford to take such proactive steps? Or more importantly, can they afford not to?

Enterprises must define their risk management strategy and determine the proactive steps that are best in alignment with their business objectives and information security standards.  This will enable organizations to better assess the cost of execution for such measures.  While the actual cost is likely to vary by enterprise, inaction is not an acceptable alternative.  Like all other critical corporate initiatives, these proactive measures must receive the board-level attention they deserve.

Enterprises must balance the cost of executing such proactive measures against the potential cost of data loss and reputational harm. This will ensure that the right proactive measures are taken with executive support.

How about you?  Has your enterprise taken the steps to assess the cost of cybercrime?  Have you considered various proactive steps to combat cybercrime?  Share your thoughts with me in the comments section below.

HP Distinguished Technologist E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for The Open Group Cloud Computing Governance project. Twitter handle @NadhanAtHP.


Quick Hit Thoughts from RSA Conference 2013

By Joshua Brickman, CA Technologies

I have a great job at CA Technologies, I can’t deny it. Working in CA Technologies’ Federal Certification Program Office, I have the responsibility of knowing which certifications, accreditations, mandates, etc. are relevant, and then helping get them implemented.

One of the responsibilities (and benefits) of my job is getting to go to great conferences like the RSA Security Conference, which just wrapped last week. This year I was honored to be selected by the Program Committee to speak twice at the event. Both talks fit well into the Policy and Government track at the show.

First I was on a panel with a distinguished group of senior leaders from both industry and government. The title of the session was “Certification of Products or Accreditation of Organizations: Which to Do?” The idea was to discuss the advantages and disadvantages of individual product certifications versus looking at an entire company or business unit. Since I’ve led CA through many product certifications (certs) and have been involved in accreditation programs as well, my role was to bring a real-world industry perspective to the panel. The point I tried to make was that product certs (like Common Criteria – CC) add value, but only for the specific purpose that they are designed for (security functions). We’ve seen CC expanding beyond just security-enforcing products, and that’s concerning. Product certs are expensive, time consuming and take away from time that could be spent on innovation. We want to do CC when it will be long lasting and add value.

On the idea of accreditation of organizations, I first talked about CMMI and my views on its challenges. I then shifted to the Open Trusted Technology Forum (OTTF), a forum of The Open Group that I’ve written about before, and said that the accreditation program that group is building is more focused than CMMI. OTTF is building something that, when adopted by industry and their suppliers, will provide assurance that technology is being built the right way (best practices) and will give acquirers confidence that products bought from vendors that have the OTTF mark can be trusted. The overall conclusion of the panel was that accreditation of organizations and certification of products both have a place, and that it is important that the value is understood by buyers and vendors.

A couple of days later, I presented with Mary Ann Davidson, CSO of Oracle. The main point of the talk was to give the industry perspective on the mandates, legislation and regulations, all seemingly focused on technology providers, that aim to solve the cyber security issues we see every day. We agreed that sometimes regulations make sense, but that a clear problem definition, clear language and limited scope are the path to success and acceptance. We also encouraged government to get involved with industry via public/private partnerships, like The Open Group Trusted Technology Forum.

Collaboration is the key to fighting the cyber security battle. If you are interested in hearing more about ways to get involved in building a safer and more productive computing environment, feel free to contact me or leave a comment on this blog. Cybersecurity is a complicated issue and there were well over 20,000 security professionals discussing it at RSA Conference. We’d love to hear your views as well.

 This blog post was originally published on the CA Technologies blog.


Joshua Brickman, PMP (Project Management Professional), runs CA Technologies’ Federal Certifications Program. He has led CA through the successful evaluation of sixteen products through the Common Criteria over the last six years (in both the U.S. and Canada). He is also a Steering Committee member of The Open Group Trusted Technology Forum (OTTF), The Open Group consortium focused on supply chain integrity and security, and he runs CA Technologies’ Accessibility Program.


Beyond Big Data

By Chris Harding, The Open Group

The big bang that started The Open Group Conference in Newport Beach was, appropriately, a presentation related to astronomy. Chris Gerty gave a keynote on Big Data at NASA, where he is Deputy Program Manager of the Open Innovation Program. He told us how visualizing deep space and its celestial bodies created understanding and enabled new discoveries. Everyone who attended felt inspired to explore the universe of Big Data during the rest of the conference. And that exploration – as is often the case with successful space missions – left us wondering what lies beyond.

The Big Data Conference Plenary

The second presentation on that Monday morning brought us down from the stars to the nuts and bolts of engineering. Mechanical devices require regular maintenance to keep functioning. Processing the mass of data generated during their operation can improve safety and cut costs. For example, airlines can overhaul aircraft engines when it needs doing, rather than on a fixed schedule that has to be frequent enough to prevent damage under most conditions, but might still fail to anticipate failure in unusual circumstances. David Potter and Ron Schuldt lead two of The Open Group initiatives, Quantum Lifecycle Management (QLM) and the Universal Data Element Framework (UDEF). They explained how a semantic approach to product lifecycle management can facilitate the big-data processing needed to achieve this aim.

Chris Gerty was then joined by Andras Szakal, vice-president and chief technology officer at IBM US Federal IMT, Robert Weisman, chief executive officer of Build The Vision, and Jim Hietala, vice-president of Security at The Open Group, in a panel session on Big Data that was moderated by Dana Gardner of Interarbor Solutions. As always, Dana facilitated a fascinating discussion. Key points made by the panelists included: the trend to monetize data; the need to ensure veracity and usefulness; the need for security and privacy; the expectation that data warehouse technology will exist and evolve in parallel with map/reduce “on-the-fly” analysis; the importance of meaningful presentation of the data; integration with cloud and mobile technology; and the new ways in which Big Data can be used to deliver business value.
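
For readers unfamiliar with the map/reduce analysis the panel mentioned, here is a toy Python sketch of the pattern: a map phase that emits key-value pairs, a shuffle that groups them by key, and a reduce phase that combines each group. Production systems such as Hadoop run these same phases in parallel across many machines; the log lines here are invented for illustration.

    from collections import defaultdict
    from functools import reduce

    # Count how often each term appears across a set of log lines.
    logs = ["error disk full", "login ok", "error timeout", "login ok"]

    # Map phase: emit (key, 1) pairs, one per word.
    mapped = [(word, 1) for line in logs for word in line.split()]

    # Shuffle phase: group the pairs by key.
    grouped = defaultdict(list)
    for key, value in mapped:
        grouped[key].append(value)

    # Reduce phase: combine each key's values into a single count.
    counts = {key: reduce(lambda a, b: a + b, values)
              for key, values in grouped.items()}

    print(counts)  # {'error': 2, 'disk': 1, 'full': 1, 'login': 2, 'ok': 2, 'timeout': 1}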

More on Big Data

In the afternoons of Monday and Tuesday, and on most of Wednesday, the conference split into streams. These have presentations that are more technical than the plenary, going deeper into their subjects. It’s a pity that you can’t be in all the streams at once. (At one point I couldn’t be in any of them, as there was an important side meeting to discuss the UDEF, which is one of the areas that I support as forum director.) Fortunately, there were a few great stream presentations that I did manage to get to.

On the Monday afternoon, Tom Plunkett and Janet Mostow of Oracle presented a reference architecture that combined Hadoop and NoSQL with traditional RDBMS, streaming, and complex event processing, to enable Big Data analysis. One application that they described was to trace the relations between particular genes and cancer. This could have big benefits in disease prediction and treatment. Another was to predict the movements of protesters at a demonstration through analysis of communications on social media. The police could then concentrate their forces in the right place at the right time.

Jason Bloomberg, president of Zapthink – now part of Dovel – is always thought-provoking. His presentation featured the need for governance vitality to cope with ever-changing tools to handle Big Data of ever-increasing size, “crowdsourcing” to channel the efforts of many people into solving a problem, and business transformation that is continuous rather than a one-time step from “as is” to “to be.”

Later in the week, I moderated a discussion on Architecting for Big Data in the Cloud. We had a well-balanced panel made up of TJ Virdi of Boeing, Mark Skilton of Capgemini and Tom Plunkett of Oracle. They made some excellent points. Big Data analysis provides business value by enabling better understanding, leading to better decisions. The analysis is often an iterative process, with new questions emerging as answers are found. There is no single application that does this analysis and provides the visualization needed for understanding, but there are a number of products that can be used to assist. The role of the data scientist in formulating the questions and configuring the visualization is critical. Reference models for the technology are emerging but there are as yet no commonly-accepted standards.

The New Enterprise Platform

Jogging is a great way of taking exercise at conferences, and I was able to go for a run most mornings before the meetings started at Newport Beach. Pacific Coast Highway isn’t the most interesting of tracks, but on Tuesday morning I was soon up in Castaways Park, pleasantly jogging through the carefully-nurtured natural coastal vegetation, with views over the ocean and its margin of high-priced homes, slipways, and yachts. I reflected as I ran that we had heard some interesting things about Big Data, but it is now an established topic. There must be something new coming over the horizon.

The answer to what this might be was suggested in the first presentation of that day’s plenary. Mary Ann Mezzapelle, security strategist for HP Enterprise Services, talked about the need to get security right for Big Data and the Cloud. But her scope was actually wider. She spoke of the need to secure the “third platform” – the term coined by IDC to describe the convergence of social, cloud and mobile computing with Big Data.

Securing Big Data

Mary Ann’s keynote was not about the third platform itself, but about what should be done to protect it. The new platform brings with it a new set of security threats, and the increasing scale of operation makes it increasingly important to get the security right. Mary Ann presented a thoughtful analysis founded on a risk-based approach.

She was followed by Adrian Lane, chief technology officer at Securosis, who pointed out that Big Data processing using NoSQL has a different architecture from traditional relational data processing, and requires different security solutions. This does not necessarily mean new techniques; existing techniques can be used in new ways. For example, Kerberos may be used to secure inter-node communications in map/reduce processing. Adrian’s presentation completed the Tuesday plenary sessions.

Service Oriented Architecture

The streams continued after the plenary. I went to the Distributed Services Architecture stream, which focused on SOA.

Bill Poole, enterprise architect at JourneyOne in Australia, described how to use the graphical architecture modeling language ArchiMate® to model service-oriented architectures. He illustrated this using a case study of a global mining organization that wanted to consolidate its two existing bespoke inventory management applications into a single commercial off-the-shelf application. It’s amazing how a real-world case study can make a topic come to life, and the audience certainly responded warmly to Bill’s excellent presentation.

Ali Arsanjani, chief technology officer for Business Performance and Service Optimization, and Heather Kreger, chief technology officer for International Standards, both at IBM, described the range of SOA standards published by The Open Group and available for use by enterprise architects. Ali was one of the brains that developed the SOA Reference Architecture, and Heather is a key player in international standards activities for SOA, where she has helped The Open Group’s Service Integration Maturity Model and SOA Governance Framework to become international standards, and is working on an international standard SOA reference architecture.

Cloud Computing

To start Wednesday’s Cloud Computing streams, TJ Virdi, senior enterprise architect at The Boeing Company, discussed use of TOGAF® to develop an Enterprise Architecture for a Cloud ecosystem. A large enterprise such as Boeing may use many Cloud service providers, enabling collaboration between corporate departments, partners, and regulators in a complex ecosystem. Architecting for this is a major challenge, and The Open Group’s TOGAF for Cloud Ecosystems project is working to provide guidance.

Stuart Boardman of KPN gave a different perspective on Cloud ecosystems, with a case study from the energy industry. An ecosystem may not necessarily be governed by a single entity, and the participants may not always be aware of each other. Energy generation and consumption in the Netherlands is part of a complex international ecosystem involving producers, consumers, transporters, and traders of many kinds. A participant may be involved in several ecosystems in several ways: a farmer for example, might consume energy, have wind turbines to produce it, and also participate in food production and transport ecosystems.

Penelope Gordon of 1-Plug Corporation explained how choice and use of business metrics can impact Cloud service providers. She worked through four examples: a start-up Software-as-a-Service provider requiring investment, an established company thinking of providing its products as cloud services, an IT department planning to offer an in-house private Cloud platform, and a government agency seeking budget for government Cloud.

Mark Skilton, director at Capgemini in the UK, gave a presentation titled “Digital Transformation and the Role of Cloud Computing.” He covered a very broad canvas of business transformation driven by technological change, and illustrated his theme with a case study from the pharmaceutical industry. New technology enables new business models, giving competitive advantage. Increasingly, the introduction of this technology is driven by the business, rather than the IT side of the enterprise, and it has major challenges for both sides. But what new technologies are in question? Mark’s presentation had Cloud in the title, but also featured social and mobile computing, and Big Data.

The New Trend

On Thursday morning I took a longer run, to and round Balboa Island. With only one road in or out, its main street of shops and restaurants is not a through route and the island has the feel of a real village. The SOA Work Group Steering Committee had found an excellent, and reasonably priced, Italian restaurant there the previous evening. There is a clear resurgence of interest in SOA, partly driven by the use of service orientation – the principle, rather than particular protocols – in Cloud Computing and other new technologies. That morning I took the track round the shoreline, and was reminded a little of Dylan Thomas’s “fishing boat bobbing sea.” Fishing here is for leisure rather than livelihood, but I suspected that the fishermen, like those of Thomas’s little Welsh village, spend more time in the bar than on the water.

I thought about how the conference sessions had indicated an emerging trend. This is not a new technology but the combination of four current technologies to create a new platform for enterprise IT: Social, Cloud, and Mobile computing, and Big Data. Mary Ann Mezzapelle’s presentation had referenced IDC’s “third platform.” Other discussions had mentioned Gartner’s “Nexus of Forces,” the combination of Social, Cloud and Mobile computing with information that Gartner says is transforming the way people and businesses relate to technology, and will become a key differentiator of business and technology management. Mark Skilton had included these same four technologies in his presentation. Great minds, and analyst corporations, think alike!

I thought also about the examples and case studies in the stream presentations. Areas as diverse as healthcare, manufacturing, energy and policing are using the new technologies. Clearly, they can deliver major business benefits. The challenge for enterprise architects is to maximize those benefits through pragmatic architectures.

Emerging Standards

On the way back to the hotel, I remarked again on what I had noticed before, how beautifully neat and carefully maintained the front gardens bordering the sidewalk are. I almost felt that I was running through a public botanical garden. Is there some ordinance requiring people to keep their gardens tidy, with severe penalties for anyone who leaves a lawn or hedge unclipped? Is a miserable defaulter fitted with a ball and chain, not to be removed until the untidy vegetation has been properly trimmed, with nail clippers? Apparently not. People here keep their gardens tidy because they want to. The best standards are like that: universally followed, without use or threat of sanction.

Standards are an issue for the new enterprise platform. Apart from the underlying standards of the Internet, there really aren’t any. The area isn’t even mapped out. Vendors of Social, Cloud, Mobile, and Big Data products and services are trying to stake out as much valuable real estate as they can. They have no interest yet in boundaries with neatly-clipped hedges.

This is a stage that every new technology goes through. Then, as it matures, the vendors understand that their products and services have much more value when they conform to standards, just as properties have more value in an area where everything is neat and well-maintained.

It may be too soon to define those standards for the new enterprise platform, but it is certainly time to start mapping out the area, to understand its subdivisions and how they inter-relate, and to prepare the way for standards. Following the conference, The Open Group has announced a new Forum, provisionally titled Open Platform 3.0, to do just that.

The SOA and Cloud Work Groups

Thursday was my final day of meetings at the conference. The plenary and streams presentations were done. This day was for working meetings of the SOA and Cloud Work Groups. I also had an informal discussion with Ron Schuldt about a new approach for the UDEF, following up on the earlier UDEF side meeting. The conference hallways, as well as the meeting rooms, often see productive business done.

The SOA Work Group discussed a certification program for SOA professionals, and an update to the SOA Reference Architecture. The Open Group is working with ISO and the IEEE to define a standard SOA reference architecture that will have consensus across all three bodies.

The Cloud Work Group had met earlier to further the TOGAF for Cloud ecosystems project. Now it worked on its forthcoming white paper on business performance metrics. It also – though this was not on the original agenda – discussed Gartner’s Nexus of Forces, and the future role of the Work Group in mapping out the new enterprise platform.

Mapping the New Enterprise Platform

At the start of the conference we looked at how to map the stars. Big Data analytics enables people to visualize the universe in new ways, reach new understandings of what is in it and how it works, and point to new areas for future exploration.

As the conference progressed, we found that Big Data is part of a convergence of forces. Social, mobile, and Cloud Computing are being combined with Big Data to form a new enterprise platform. The development of this platform, and its roll-out to support innovative applications that deliver more business value, is what lies beyond Big Data.

At the end of the conference we were thinking about mapping the new enterprise platform. This will not require sophisticated data processing and analysis. It will take discussions to create a common understanding, and detailed committee work to draft the guidelines and standards. This work will be done by The Open Group’s new Open Platform 3.0 Forum.

The next Open Group conference is in the week of April 15, in Sydney, Australia. I’m told that there’s some great jogging there. More importantly, we’ll be reflecting on progress in mapping Open Platform 3.0, and thinking about what lies ahead. I’m looking forward to it already.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Open Group Panel Explores Changing Field of Risk Management and Analysis in the Era of Big Data

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: The Open Group Panel Explores Changing Field of Risk Management and Analysis in Era of Big Data

This is a transcript of a sponsored podcast discussion on the threats from, and promise of, Big Data in securing enterprise information assets, in conjunction with The Open Group Conference in Newport Beach.

Dana Gardner: Hello, and welcome to a special thought leadership interview series coming to you in conjunction with The Open Group Conference on January 28 in Newport Beach, California.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these business transformation discussions. The conference itself is focusing on Big Data: the transformation we need to embrace today.

We’re here now with a panel of experts to explore new trends and solutions in the area of risk management and analysis. We’ll learn how large enterprises are delivering risk assessments and risk analysis, and we’ll see how Big Data can be both a source of risk to protect against and a tool for better understanding and mitigating risk.

With that, please join me in welcoming our panel. We’re here with Jack Freund, PhD, the Information Security Risk Assessment Manager at TIAA-CREF. Welcome, Jack.

Jack Freund: Hello Dana, how are you?

Gardner: I’m great. Glad you could join us.

We are also here with Jack Jones, Principal of CXOWARE. He has more than nine years of experience as a Chief Information Security Officer and is the inventor of the Factor Analysis of Information Risk (FAIR) framework. Welcome, Jack.

Jack Jones: Thank you.

Gardner: And we’re also here with Jim Hietala, Vice President, Security for The Open Group. Welcome, Jim.

Jim Hietala: Thanks, Dana.

Gardner: All right, let’s start out with looking at this from a position of trends. Why is the issue of risk analysis so prominent now? What’s different from, say, five years ago? And we’ll start with you, Jack Jones.

Jones: The information security industry has struggled with getting the attention of and support from management and businesses for a long time, and it has finally come around to the fact that the executives care about loss exposure — the likelihood of bad things happening and how bad those things are likely to be.

It’s only when we speak of those terms or those issues in terms of risk, that we make sense to those executives. And once we do that, we begin to gain some credibility and traction in terms of getting things done.

Gardner: So we really need to talk about this in the terms that a business executive would appreciate, not necessarily an IT executive.

Effects on business

Jones: Absolutely. They’re tired of hearing about vulnerabilities, hackers, and that sort of thing. It’s only when we can talk in terms of the effect on the business that it makes sense to them.

Gardner: Jack Freund, I should also point out that you have more than 14 years in enterprise IT experience. You’re a visiting professor at DeVry University and you chair a risk-management subcommittee for ISACA? Is that correct?

Freund: ISACA, yes.

Gardner: And do you agree?

Freund: The problem that we have as a profession, and I think it’s a big problem, is that we have allowed ourselves to escape the natural evolution that other IT professions have already gone through.

There was a time, years ago, when you could code in the basement, and nobody cared much about what you were doing. But now, largely speaking, developers and systems administrators are very focused on meeting the goals of the organization.

Security has been allowed to miss that boat a little. We have been allowed to hide behind this aura of a protector and of an alerter of terrible things that could happen, without really tying ourselves to the problem that the organizations are facing and how can we help them succeed in what they’re doing.

Gardner: Jim Hietala, how do you see things that are different now than a few years ago when it comes to risk assessment?

Hietala: There are certainly changes on the threat side of the landscape. Five years ago, you didn’t really have hacktivism or this notion of an advanced persistent threat (APT).

That highly skilled attacker taking aim at governments and large organizations didn’t really exist – or didn’t exist to the degree it does today. So that has changed.

You also have big changes to the IT platform landscape, all of which bring new risks that organizations need to really think about. The mobility trend, the Cloud trend, the big-data trend that we are talking about today, all of those things bring new risk to the organization.

As Jack Jones mentioned, business executives don’t want to hear about, “I’ve got 15 vulnerabilities in the mobility part of my organization.” They want to understand what’s the risk of bad things happening because of mobility, what we’re doing about it, and what’s happening to risk over time?

So it’s a combination of changes in the threats and attackers, as well as changes to the IT landscape, that means we have to take a different look at how we measure and present risk to the business.

Gardner: Because we’re at a big-data conference, do you share my perception, Jack Jones, that Big Data can be a source of risk and vulnerability, but that the analytics and business intelligence (BI) tools we’re employing with Big Data can also be used to alert you to risks and provide a strong tool for better understanding your true risk setting or environment?

Crown jewels

Jones: You are absolutely right. You think of Big Data and, by definition, it’s where your crown jewels, and everything that leads to the crown jewels from an information perspective, are going to be found. It’s like one-stop shopping for the bad guy, if you want to look at it in that context. It definitely needs to be protected, and its architecture, and its integration across a lot of different platforms, will probably result in a complex landscape to try to secure.

There are a lot of ways into that data, but if you can leverage that same Big Data architecture as an approach to information security, with log data and other threat and vulnerability data, you should be able to make some significant gains in terms of how well-informed your analyses and your decisions are, based on that data.

Gardner: Jack Freund, do you share that? How does Big Data fit into your understanding of the evolving arena of risk assessment and analysis?

Freund: If we fast-forward it five years, and this is even true today, a lot of people on the cutting edge of Big Data will tell you the problem isn’t so much building everything together and figuring out what it can do. They are going to tell you that the problem is what we do once we figure out everything that we have. This is the problem that we have traditionally had on a much smaller scale in information security. When everything is important, nothing is important.

Gardner: To follow up on that, where do you see the gaps in risk analysis in large organizations? In other words, what parts of organizations aren’t being assessed for risk and should be?

Freund: The big problem that largely exists today in the way risk assessments are done is the focus on labels. We want to quickly address the low, medium, and high things and know where they are. But there are inherent problems in the way we think about those labels without doing any of the analysis legwork.

I think what's really missing is that true analysis. If the system goes offline, do we lose money? If the system becomes compromised, what are the cost-accounting consequences that allow us to figure out how much money we're going to lose?

That analysis work is largely missing, and that's the gap. The prevailing assumption is that if a control is not in place, then there's a risk that must be addressed in some fashion. So we end up with these very long lists of horrible, terrible things that can be done to us in all sorts of different ways, without any relevance to the overall business of the organization.

Every day, our organizations are out there selling products and offering services, which is, in and of itself, its own risky venture. So tying what we do from an information security perspective to that is critical, not just for the success of the organization, but for the success of our profession.

Gardner: So we can safely say that large companies are probably pretty good at cost-benefit analysis or they wouldn't be successful. Now, I guess we need to ask them to take that a step further and do a cost-risk analysis, in business terms, being mindful that their IT systems might be a much larger part of that than they had once considered. Is that fair, Jack?

Risk implications

Jones: Businesses have been making these decisions, chasing the opportunity, but generally without any clear understanding of the risk implications, at least from the information security perspective. They'll have us in the corner screaming, throwing red flags, and talking about vulnerabilities and threats from one thing or another.

But we come to the table with red, yellow, and green indicators, and on the other side of the table, they've got numbers: "Here is what we expect to earn in revenue from this initiative," while the information security people are saying it's crazy. How do you normalize a quantitative revenue gain against red, yellow, and green?

Gardner: Jim Hietala, do you see it in the same red, yellow, green terms, or are there other frameworks or standard methodologies that The Open Group is looking at to make this a bit more of a science?

Hietala: Probably four years ago, we published what we call the Risk Taxonomy Standard, which is based upon FAIR (Factor Analysis of Information Risk), the risk analysis framework that Jack Jones created. So we're big believers in bringing that level of precision to doing risk analysis. Having just gone through training for FAIR myself, as part of the standards effort that we're doing around certification, I can say that it brings a level of precision and a depth of analysis to risk analysis that has frequently been lacking in IT security and risk management.

Gardner: We’ve talked about how organizations need to be mindful that their risks are higher and different than in the past and we’ve talked about how standardization and methodologies are important, helping them better understand this from a business perspective, instead of just a technology perspective.

But I'm curious about the cultural and organizational perspective. Whose job should this fall under? Who is wearing the white hat in the company and can rally the forces of good to get all the bad things managed? Is this a single person, a cultural mission, an organizational mission? How do you make this work in the enterprise in a real-world way? Let's go to you, Jack Freund.

Freund: The profession of IT risk management is changing. That profession will have to sit between the business and information security, inclusive of all the other IT functions that make that happen.

In order to be successful sitting between these two groups, you have to be able to speak the language of both of those groups. You have to be able to understand profit and loss and capital expenditure on the business side. On the IT risk side, you have to be technical enough to do all those sorts of things.

But I think the sum total of those two things is probably only about 50 percent of the job of IT risk management today. The other 50 percent is communication. Finding ways to translate that language and to understand the needs and concerns of each side of that relationship is really the job of IT risk management.

To answer your question, I think it’s absolutely the job of IT risk management to do that. From my own experiences with the FAIR framework, I can say that using FAIR is the Rosetta Stone for speaking between those two groups.

Necessary tools

It gives you the tools necessary to speak in the insurance and risk terms that the business appreciates. And it gives you the ability to be as technical and, if you will, as nerdy as you need to be in order to talk to IT security and the other IT functions, to make sure everybody is on the same page and everyone feels their concerns are represented in the risk-assessment functions that are happening.

Gardner: Jack Jones, can you add to that?

Jones: I agree with what Jack said wholeheartedly. I would add, though, that integration or adoption of something like this is a lot easier the higher up in the organization you go.

Traditionally, it's CFOs whose necks are most clearly on the line for risk-related issues within most organizations. At least in my experience, if you get their ear on this and present the information security analyses to them, they jump on board, they drive it through the organization, and it's just brain-dead easy.

If you try to drive it up through the ranks – maybe you get an enthusiastic supporter in the information security organization, especially below the CISO level, who tries a grassroots effort to bring it in – it's a tougher thing. It can still work. I've seen it work very well, but it's a longer row to hoe.

Gardner: There has been a lot of research, and many studies and surveys, on data breaches. What are some of the best sources – or maybe not-so-good sources – for actually measuring this? How do you know if you're doing it right? How do you know if you're moving from yellow to green, instead of to red? To you, Jack Freund.

Freund: There are a couple of things in that question. The first is that there's an inherent assumption in a lot of organizations that we need to move from yellow to green, and that may not be the case. So becoming very knowledgeable about the risk posture and the risk tolerance of the organization is key.

That's part of the official mindset of IT security. When you graduate an information security person today, they are minted knowing that there are a lot of bad things out there, and that their goal in life is to reduce them. But that may not be the case. The case may very well be that things are okay now, but that we have bigger fish to fry over here that we're going to focus on. So that's one thing.

The second thing, and it's a very good question, is how do we know that we're getting better? How do we trend that over time? Measuring that value for the organization means being able to show a reduction in risk, or at least a reduction of risk to the risk-tolerance levels of the organization.

Calculating and understanding that requires something I always phrase as "becoming comfortable with uncertainty." When you're talking about risk in general, you're making forward-looking statements about things that may or may not happen. Becoming comfortable with the fact that they may or may not happen means that when you measure them today, you have to be willing to be a little bit squishy in how you represent them.

In FAIR and in other academic work, they talk about using ranges to do that. So things like high, medium, and low could be represented in terms of a minimum, maximum, and most likely. That tends to be very effective; people can respond to it fairly well.
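
To make that concrete, here is a minimal Python sketch of expressing a single loss estimate as a (minimum, most likely, maximum) range and sampling it with a triangular distribution. All figures are invented for illustration; this is not drawn from FAIR training material.

```python
import random

# Hypothetical calibrated estimate for one loss scenario, expressed as a
# range rather than a single point: (minimum, most likely, maximum) loss
# in dollars. All numbers are invented.
LOSS_ESTIMATE = (50_000, 200_000, 1_500_000)

def sample_losses(estimate, trials=10_000):
    """Sample the estimate with a triangular distribution over the range."""
    low, mode, high = estimate
    return sorted(random.triangular(low, high, mode) for _ in range(trials))

samples = sample_losses(LOSS_ESTIMATE)
print(f"median loss:          ${samples[len(samples) // 2]:,.0f}")
print(f"90th percentile loss: ${samples[int(len(samples) * 0.9)]:,.0f}")
```

The point is the shape of the answer: a percentile drawn from a distribution, rather than a single color on a heat map.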

Gathering data

Jones: With regard to the data sources, there are a lot of people out there doing these sorts of studies, gathering data. The problem that’s hamstringing that effort is the lack of a common set of definitions, nomenclature, and even taxonomy around the problem itself.

You'll have one study that defines threat, vulnerability, or whatever differently from some other study, so the data can't be normalized, and that really harms its utility. I see data out there and think, "That looks like it could be really useful." But I hesitate to use it, because they don't publish their definitions or their approach, so I don't understand how they went after it.

There's just so much superficial thinking in the profession on this, once you dig under the covers. Too often, I run into stuff that just can't be defended. It doesn't make sense, and therefore the data can't be used. It's an unfortunate situation.

I do think we're heading in a positive direction. FAIR can provide a normalizing structure for that sort of thing. The VERIS framework, which, by the way, is also derived in part from FAIR, has gained real traction in terms of the quality of the research and the data it is generating. We're headed in the right direction, but we've got a long way to go.

Gardner: Jim Hietala, we're seemingly looking at this on a company-by-company basis. But is there a vertical industry slice, or an industry-wide slice, where we could look at what's happening to everyone and put some standard understanding or measurement around what's going on in the overall market, maybe by region or by country?

Hietala: There are some industry-specific initiatives, but what's really needed, as Jack Jones mentioned, are common definitions for things like breach, exposure, and loss, so that the data sources from one organization can be used by another, and so forth. I think about the financial services industry. I know that there is some information sharing through an organization called the FS-ISAC about what's happening to financial services organizations in terms of attacks, losses, and those sorts of things.

There's an opportunity for that on a vertical-by-vertical basis. But, like Jack said, there is a long way to go. Some industries, healthcare for instance, are so far from that it's ridiculous. In the US, the HIPAA security rule says you must do a risk assessment. So hospitals do an annual risk assessment, stick the binder on the shelf, and don't think much about information security between those annual assessments. That's a generalization, but various industries are at different places on a continuum of maturity in their risk management approaches.

Gardner: As we get better with having a common understanding of the terms and the measurements and we share more data, let’s go back to this notion of how to communicate this effectively to those people that can use it and exercise change management as a result. That could be the CFO, the CEO, what have you, depending on the organization.

Do you have any examples? Can we look to an organization that's done this right, examine its practices and the way it's communicated them, along with some of the tools it's used, and say, "Aha, they're headed in the right direction; maybe we could follow a little bit"? Let's start with you, Jack Freund.

Freund: I have worked and consulted for various organizations that have done risk management at different levels. The ones that have embraced FAIR tend to be the ones that overall feel that risk is an integral part of their business strategy. And I can give a couple of examples of scenarios that have played out that I think have been successful in the way they have been communicated.

Coming to terms

The key thing to keep in mind is that, as a security professional, you're trained to feel like you need results. But the results for the IT risk management professional are different. The result is, "I've communicated this effectively, so I am done." Then whatever the outcomes are, are the outcomes that needed to be. And that's a really hard thing to come to terms with.

I’ve been involved in large-scale efforts to assess risk for a Cloud venture. We needed to move virtually every confidential record that we have to the Cloud in order to be competitive with the rest of our industry. If our competitors are finding ways to utilize the Cloud before us, we can lose out. So, we need to find a way to do that, and to be secure and compliant with all the laws and regulations and such.

Through that scenario, one of the things that came out was that key ownership became really, really important. We had the opportunity to look at the various control structures, and we analyzed them using FAIR. What we ended up with was sort of a long-tail risk. Most people will probably do their job right over a long enough period of time. But over that same long period, the odds of somebody making a mistake that's not in your favor become likely – though not so likely that you can't make the move.

But the problem was that the loss side, the side that typically gets ignored by traditional risk-assessment methodologies, was so significant that the organization needed to make a judgment around it, and they needed a sense of what to do to minimize it.

That became a big point of discussion for us, and it drove the conversation away from "bad things could happen." We didn't bury the lead. The lead was that this is the most important thing to this organization in this particular scenario.

So, let's talk about things we can do. Are we comfortable with it? Do we need to make any changes? What are some control opportunities? How much do they cost? This is a significantly more productive conversation than just, "Here is a bunch of bad things that can happen. I'm going to cross my arms and say no."

Gardner: Jack Jones, examples at work?

Jones: In an organization that I’ve been working with recently, their board of directors said they wanted a quantitative view of information security risk. They just weren’t happy with the red, yellow, green. So, they came to us, and there were really two things that drove them there. One was that they were looking at cyber insurance. They wanted to know how much cyber insurance they should take out, and how do you figure that out when you’ve got a red, yellow, green scale?

They were able to do a series of analyses on a population of the scenarios that they thought were relevant in their world, get an aggregate view of their annualized loss exposure, and make a better informed decision about that particular problem.
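
The panel doesn't spell out the mechanics of that aggregate analysis, so here is only a toy version of what such a calculation might look like in Python. Every scenario name and number is invented, and the model is deliberately simplified rather than a full FAIR analysis.

```python
import random

# Invented loss scenarios: each maps to a loss-event frequency range
# (events per year) and a loss-magnitude range (dollars per event),
# both expressed as (minimum, most likely, maximum).
SCENARIOS = {
    "lost laptop":        ((2.0, 5.0, 12.0), (5_000, 25_000, 100_000)),
    "web app breach":     ((0.05, 0.2, 1.0), (100_000, 750_000, 5_000_000)),
    "insider data theft": ((0.01, 0.1, 0.5), (250_000, 1_000_000, 8_000_000)),
}

def simulate_year():
    """One simulated year of aggregate loss across all scenarios.
    Simplification: fractional event counts stand in for a discrete
    event-count draw (a fuller model would use, e.g., a Poisson draw)."""
    total = 0.0
    for (f_min, f_ml, f_max), (m_min, m_ml, m_max) in SCENARIOS.values():
        events = random.triangular(f_min, f_max, f_ml)
        loss_per_event = random.triangular(m_min, m_max, m_ml)
        total += events * loss_per_event
    return total

years = sorted(simulate_year() for _ in range(10_000))
print(f"expected annualized loss: ${sum(years) / len(years):,.0f}")
print(f"95th percentile year:     ${years[int(len(years) * 0.95)]:,.0f}")
```

A percentile of the aggregate loss distribution is one defensible starting point for sizing a cyber insurance limit, which is the decision the transcript describes.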

Gardner: I'm curious how prevalent cyber insurance is, and whether it will have a leveling effect on the industry, where people speak a common language – the equivalent of actuarial tables, but for enterprise and cyber security?

Jones: One would dream and hope, but at this point, what I've seen in terms of the basis on which insurance companies set their premiums is essentially the same old "risk assessment" stuff the industry has been doing poorly for years. It's not based on data or any real analysis, at least in what I've run into. What they do is set their premiums high to buffer themselves, and typically cover as few things as possible. How much value that provides the customer becomes a problem.

Looking to the future

Gardner: We're coming up on our time limit. So, let's quickly look to the future. Is there such a thing as risk management as a service? Can we outsource this? Is there a way in which moving more of IT into Cloud or hybrid models would mitigate risk, because the Cloud provider would standardize? Then many players in that environment, those buying the services, would be under that same umbrella? Let's start with you, Jim Hietala. What's the future of this, and what do the Cloud trends bring to the table?

Hietala: I'd start with a maxim that comes out of the financial services industry: you can outsource the function, but you still own the risk. That's an unfortunate reality. You can throw things out into the Cloud, but it doesn't absolve you from understanding your risk and then doing things to manage it, or to transfer it through insurance or whatever the case may be.

That’s just a reality. Organizations in the risky world we live in are going to have to get more serious about doing effective risk analysis. From The Open Group standpoint, we see this as an opportunity area.

As I mentioned, we’ve standardized the taxonomy piece of FAIR. And we really see an opportunity around the profession going forward to help the risk-analysis community by further standardizing FAIR and launching a certification program for a FAIR-certified risk analyst. That’s in demand from large organizations that are looking for evidence that people understand how to apply FAIR and use it in doing risk analyses.

Gardner: Jack Freund, looking into your crystal ball, how do you see this discipline evolving?

Freund: I always try to consider things as they exist within other systems. Risk is a system of systems. There are a series of pressures that are applied, and a series of levers that are thrown in order to release that sort of pressure.

Risk will always be owned by the organization offering the service. If we decide at some point that we can move to the Cloud and all these other things, we need to look to the legal system: it will apply its own series of pressures, which will shape who owns that risk and how it plays itself out.

If we look to the Europeans and the way they manage risk and compliance, they're still as strict as we in the United States think they may be about things, but there's still a lot of leeway in the way many of the laws are written. You're still being asked to do things that are reasonable, things that are standard for your industry. But we'd still like the ability to know what that is, and I don't think that's going to go away anytime soon.

Judgment calls

We’re still going to have to make judgment calls. We’re still going to have to do 100 things with a budget for 10 things. Whenever that happens, you have to make a judgment call. What’s the most important thing that I care about? And that’s why risk management exists, because there’s a certain series of things that we have to deal with. We don’t have the resources to do them all, and I don’t think that’s going to change over time. Regardless of whether the landscape changes, that’s the one that remains true.

Gardner: The last word to you, Jack Jones. It sounds as if we’re continuing down the path of being mostly reactive. Is there anything you can see on the horizon that would perhaps tip the scales, so that the risk management and analysis practitioners can really become proactive and head things off before they become a big problem?

Jones: If we were to take a snapshot at any given point in time of an organization’s loss exposure, how much risk they have right then, that’s a lagging indicator of the decisions they’ve made in the past, and their ability to execute against those decisions.

We can do some great root-cause analysis around that and ask how we got there. But we can also turn that coin around and ask how good we are at making well-informed decisions and then executing against them, and then ask what that implies from a risk perspective downstream.

If we understand the relationship between our current state and past and future states, and have those linkages defined, especially with an analytic framework underneath, we can do some marvelous what-if analysis.

What if this variable changed in our landscape? Let's run a few thousand Monte Carlo simulations against that and see what comes up. What does that look like? Then let's change this other variable and see which combination of dials, when we turn them, makes us most robust to change in our landscape.

But again, we can't begin to get there until we have this foundational set of definitions and frameworks to do that sort of analysis. That's what we're doing with FAIR; without some sort of framework like that, there's no way you can get there.
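
As a rough illustration of that what-if loop (again with invented numbers and a deliberately simplified model), the sketch below reruns a single scenario with one variable changed – an assumed control that roughly halves event frequency – and compares expected annual loss:

```python
import random

def expected_annual_loss(freq, magnitude, trials=10_000):
    """Mean annual loss for one scenario; ranges are (min, most likely, max)."""
    total = 0.0
    for _ in range(trials):
        events = random.triangular(freq[0], freq[2], freq[1])
        per_event = random.triangular(magnitude[0], magnitude[2], magnitude[1])
        total += events * per_event
    return total / trials

# Baseline: an invented breach scenario.
baseline = expected_annual_loss((0.05, 0.2, 1.0), (100_000, 750_000, 5_000_000))

# What-if: assume a proposed control roughly halves event frequency,
# leaving loss magnitude unchanged. Rerun the simulation and compare.
with_control = expected_annual_loss((0.025, 0.1, 0.5), (100_000, 750_000, 5_000_000))

print(f"baseline expected annual loss:     ${baseline:,.0f}")
print(f"with-control expected annual loss: ${with_control:,.0f}")
print(f"estimated annual risk reduction:   ${baseline - with_control:,.0f}")
```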

Gardner: I am afraid we’ll have to leave it there. We’ve been talking with a panel of experts on how new trends and solutions are emerging in the area of risk management and analysis. And we’ve seen how new tools for communication and using Big Data to understand risks are also being brought to the table.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in Newport Beach, California. I’d like to thank our panel: Jack Freund, PhD, Information Security Risk Assessment Manager at TIAA-CREF. Thanks so much Jack.

Freund: Thank you, Dana.

Gardner: We’ve also been speaking with Jack Jones, Principal at CXOWARE.

Jones: Thank you. Thank you, pleasure to be here.

Gardner: And last, Jim Hietala, the Vice President for Security at The Open Group. Thanks.

Hietala: Thanks, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions; your host and moderator through these thought leadership interviews. Thanks again for listening and come back next time.


Protecting Data is Good. Protecting Information Generated from Big Data is Priceless

By E.G. Nadhan, HP

This was the key message that came out of The Open Group® Big Data Security Tweet Jam on Jan 22 at 9:00 a.m. PT, which addressed several key questions centered on Big Data and security. Here is my summary of the observations made in the context of these questions.

Q1. What is Big Data security? Is it different from data security?

Big Data security is more about information security. Big Data typically lives outside the corporate perimeter, and IT is not prepared today to adequately monitor its sheer volume, measured in brontobytes. Long-term storage periods could violate compliance mandates, and storing Big Data in the Cloud changes the game, with increased risks of leaks, loss, and breaches.

Information resulting from the analysis of the data is even more sensitive, and therefore higher risk – especially when it is Personally Identifiable Information on an Internet of devices, which requires a balance between utility and privacy.

At the end of the day, it is all about governance or as they say, “It’s the data, stupid! Govern it.”

Q2. Any thoughts about security systems as producers of Big Data, e.g., voluminous systems logs?

Data gathered from information security logs is valuable, but the rules for protecting it are the same. Security logs will also be a good source for detecting patterns of customer usage.

Q3. Most Big Data stacks have no built-in security. What does this mean for securing Big Data?

There is an added level of complexity because Big Data spans apps, networks, and all endpoints. Having standards to establish identity, metadata, and trust would go a long way. The quality of the data could also be a security issue: has it been tampered with? Are you being gamed? Note that enterprises have varying security needs around their business data.
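
The tampering question lends itself to a small example. Here is a minimal sketch, assuming the data producer and the ingest pipeline share a secret key (a real deployment would distribute keys through proper key management, not a source-code constant): the producer signs each record with an HMAC, and ingest rejects any record whose tag doesn't verify.

```python
import hashlib
import hmac

# Placeholder key: distribute through real key management in practice.
SHARED_KEY = b"replace-me-via-real-key-management"

def sign_record(record: bytes) -> str:
    """Producer side: compute an HMAC-SHA256 tag for a record."""
    return hmac.new(SHARED_KEY, record, hashlib.sha256).hexdigest()

def verify_record(record: bytes, tag: str) -> bool:
    """Ingest side: accept only records whose tag verifies."""
    expected = hmac.new(SHARED_KEY, record, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

record = b'{"sensor": "s-42", "reading": 17.3}'
tag = sign_record(record)
assert verify_record(record, tag)                                      # intact
assert not verify_record(b'{"sensor": "s-42", "reading": 99.9}', tag)  # tampered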

Q4. How is the industry dealing with the social and ethical uses of consumer data gathered via Big Data?

Big Data is still nascent, and ground rules for handling the information are yet to be established. Privacy issues will be key when companies market to consumers. Organizations are seeking forgiveness rather than permission, and regulatory bodies are getting involved due to consumer pressure. Abuse of power stemming from access to Big Data is likely to create more incentives to attack or embarrass. Note that "abuse" to some is just business to others.

Q5. What lessons from basic data security and cloud security can be implemented in Big Data security?

Security testing is even more vital for Big Data. Limit access to specific devices, not just user credentials. And don't assume security via obscurity for the sensors producing Big Data inputs – they will be targets.
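
As a tiny, hypothetical sketch of the device point (user names and device IDs are invented), an access check can require an enrolled device in addition to a verified credential:

```python
# Hypothetical access check: require both a verified credential and a
# device that was enrolled out of band. All names are invented.
REGISTERED_DEVICES = {
    "alice": {"laptop-8821", "phone-1042"},
}

def authorize(user: str, device_id: str, credentials_ok: bool) -> bool:
    """Grant access only when the credential verifies AND the device is enrolled."""
    return credentials_ok and device_id in REGISTERED_DEVICES.get(user, set())

print(authorize("alice", "laptop-8821", True))     # True
print(authorize("alice", "unknown-tablet", True))  # False: valid user, unenrolled device
```

Enrollment out of band is what gives the device ID any meaning; the check itself is trivial.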

Q6. What are some best practices for securing Big Data? What are orgs doing now and what will organizations be doing 2-3 years from now?

Current best practices include:

  • Treat Big Data as your most valuable asset
  • Encrypt everything by default, proper key management, enforcement of policies, tokenized logs (see the tokenization sketch after this list)
  • Ask your Cloud and Big Data providers the right questions – ultimately, YOU are responsible for security
  • If you are unable to establish trust with the data source, assume the data needs verification and cleanup before it is used for decisions
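
As a sketch of the "tokenized logs" practice above (the field pattern and key handling are simplified assumptions), sensitive values can be replaced with stable keyed tokens before log lines ever reach storage:

```python
import hashlib
import hmac
import re

# Placeholder key; keep it in a key management system in practice.
TOKEN_KEY = b"example-tokenization-key"

def tokenize(value: str) -> str:
    """Replace a sensitive value with a stable, keyed, non-reversible token."""
    digest = hmac.new(TOKEN_KEY, value.encode(), hashlib.sha256).hexdigest()
    return "tok_" + digest[:12]

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def tokenize_log_line(line: str) -> str:
    """Tokenize email addresses before the line is written to storage."""
    return EMAIL.sub(lambda m: tokenize(m.group()), line)

print(tokenize_log_line("login ok for alice@example.com from 10.0.0.7"))
# prints something like: login ok for tok_1f3a9c0d2b4e from 10.0.0.7
```

Because the hash is keyed and deterministic, the same address always maps to the same token, so events can still be correlated across log lines without exposing the raw value.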

Future best practices:

  • Enterprises treat information like mere data today; in the future, they will respect it as their most valuable asset
  • CIOs will eventually become Chief Officers for Information

Q7. We're nearing the end of today's tweet jam. Any last thoughts on Big Data security?

Adrian Lane, who participated in the tweet jam, will be keynoting at The Open Group Conference in Newport Beach next week and has written a good best-practices paper on securing Big Data.

I have been part of multiple tweet chats specific to security, as well as one on Information Optimization. Recently, I also conducted the first Open Group Web Jam, internal to The Cloud Work Group. What I liked about this Big Data Security Tweet Jam is that it brought two key domains together, highlighting the intersection points. There was great contribution from subject matter experts, forcing participants to think about one domain in the context of the other.

In a way, this post is actually synthesizing valuable information from raw data in the tweet messages – and therefore needs to be secured!

What are your thoughts on the observations made in this tweet jam? What measures are you taking to secure Big Data in your enterprise?

I really enjoyed this tweet jam and would strongly encourage you to actively participate in upcoming tweet jams hosted by The Open Group.  You get to interact with a wide spectrum of knowledgeable practitioners listed in this summary post.

HP Distinguished Technologist and Cloud Advisor E.G. Nadhan has more than 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project, and is also the founding co-chair for The Open Group Cloud Computing Governance project. Connect with Nadhan on: Twitter, Facebook, LinkedIn and Journey Blog.

 


#ogChat Summary – Big Data and Security

By Patty Donovan, The Open Group

The Open Group hosted a tweet jam (#ogChat) to discuss Big Data security. In case you missed the conversation, here is a recap of the event.

The Participants

A total of 18 participants joined in the hour-long discussion.

Q1 What is #BigData #security? Is it different from #data security? #ogChat

Participants seemed to agree that while Big Data security is similar to data security, it is more extensive. Two major factors to consider: sensitivity and scalability.

  • @dustinkirkland At the core it’s the same – sensitive data – but the difference is in the size and the length of time this data is being stored. #ogChat
  • @jim_hietala Q1: Applying traditional security controls to BigData environments, which are not just very large info stores #ogChat
  • @TheTonyBradley Q1. The value of analyzing #BigData is tied directly to the sensitivity and relevance of that data–making it higher risk. #ogChat
  • @AdrianLane Q1 Securing #BigData is different. Issues of velocity, scale, elasticity break many existing security products. #ogChat
  • @editingwhiz #Bigdata security is standard information security, only more so. Meaning sampling replaced by complete data sets. #ogchat
  • @Dana_Gardner Q1 Not only is the data sensitive, the analysis from the data is sensitive. Secret. On the QT. Hush, hush. #BigData #data #security #ogChat
    • @Technodad @Dana_Gardner A key point. Much #bigdata will be public – the business value is in cleanup & analysis. Focus on protecting that. #ogChat

Q2 Any thoughts about #security systems as producers of #BigData, e.g., voluminous systems logs? #ogChat

Most agreed that security systems should be setting an example for producing secure Big Data environments.
  • @dustinkirkland Q2. They should be setting the example. If the data is deemed important or sensitive, then it should be secured and encrypted. #ogChat
  • @TheTonyBradley Q2. Data is data. Data gathered from information security logs is valuable #BigData, but rules for protecting it are the same. #ogChat
  • @elinormills Q2 SIEM is going to be big. will drive spending. #ogchat #bigdata #security
  • @jim_hietala Q2: Well instrumented IT environments generate lots of data, and SIEM/audit tools will have to be managers of this #BigData #ogchat
  • @dustinkirkland @theopengroup Ideally #bigdata platforms will support #tokenization natively, or else appdevs will have to write it into apps #ogChat

Q3 Most #BigData stacks have no built in #security. What does this mean for securing #BigData? #ogChat

The lack of built-in security paints a target on Big Data. While not all enterprise data is sensitive, housing it insecurely runs the risk of compromise. Furthermore, security solutions need to be not only effective but also scalable, as the data will continue to get bigger.

  • @elinormills #ogchat big data is one big hacker target #bigdata #security
    • @editingwhiz @elinormills #bigdata may be a huge hacker target, but will hackers be able to process the chaff out of it? THAT takes $$$ #ogchat
    • @elinormills @editingwhiz hackers are innovation leaders #ogchat
    • @editingwhiz @elinormills Yes, hackers are innovation leaders — in security, but not necessarily dataset processing. #eweeknews #ogchat
  • @jim_hietala Q3:There will be a strong market for 3rd party security tools for #BigData – existing security technologies can’t scale #ogchat
  • @TheTonyBradley Q3. When you take sensitive info and store it–particularly in the cloud–you run the risk of exposure or compromise. #ogChat
  • @editingwhiz Not all enterprises have sensitive business data they need to protect with their lives. We’re talking non-regulated, of course. #ogchat
  • @TheTonyBradley Q3. #BigData is sensitive enough. The distilled information from analyzing it is more sensitive. Solutions need to be effective. #ogChat
  • @AdrianLane Q3 It means identifying security products that don’t break big data – i.e. they scale or leverage #BigData #ogChat
    • @dustinkirkland @AdrianLane #ogChat Agreed, this is where certifications and partnerships between the 3rd party and #bigdata vendor are essential.

Q4 How is the industry dealing with the social and ethical uses of consumer data gathered via #BigData? #ogChat #privacy

Participants agreed that the industry needs to improve when it comes to the social and ethical uses of consumer data gathered through Big Data. If the data is easily accessible, hackers will be attracted to it. No matter what, the cost of a breach is far greater than the cost of any preventative solution.

  • @dustinkirkland Q4. #ogChat Sadly, not well enough. The recent Instagram uproar was well publicized but such abuse of social media rights happens every day.
    • @TheTonyBradley @dustinkirkland True. But, they’ll buy the startups, and take it to market. Fortune 500 companies don’t like to play with newbies. #ogChat
    • @editingwhiz Disagree with this: Fortune 500s don’t like to play with newbies. We’re seeing that if the IT works, name recognition irrelevant. #ogchat
    • @elinormills @editingwhiz @thetonybradley ‘hacker’ covers lot of ground, so i would say depends on context. some of my best friends are hackers #ogchat
    • @Technodad @elinormills A core point- data from sensors will drive #bigdata as much as enterprise data. Big security, quality issues there. #ogChat
  • @Dana_Gardner Q4 If privacy is a big issue, hacktivism may crop up. Power of #BigData can also make it socially onerous. #data #security #ogChat
  • @dustinkirkland Q4. The cost of a breach is far greater than the cost (monetary or reputation) of any security solution. Don’t risk it. #ogChat

Q5 What lessons from basic #datasecurity and #cloud #security can be implemented in #BigData security? #ogChat

The principles are the same, just on a larger scale. The biggest risks come from cutting corners because of the size and complexity of the data gathered. As hackers (like Anonymous) get better, security must keep pace, regardless of data size.

  • @TheTonyBradley Q5. Again, data is data. The best practices for securing and protecting it stay the same–just on a more massive #BigData scale. #ogChat
  • @Dana_Gardner Q5 Remember, this is in many ways unchartered territory so expect the unexpected. Count on it. #BigData #data #security #ogChat
  • @NadhanAtHP A5 @theopengroup – Security Testing is even more vital when it comes to #BigData and Information #ogChat
  • @TheTonyBradley Q5. Anonymous has proven time and again that most existing data security is trivial. Need better protection for #BigData. #ogChat

Q6 What are some best practices for securing #BigData? What are orgs doing now, and what will orgs be doing 2-3 years from now? #ogChat

While some argued that encrypting everything is the key, and others encouraged putting pressure on Big Data providers, most agreed that a multi-step security infrastructure is necessary. It's not just the data that needs to be secured, but also the transport and analysis processes.

  • @dustinkirkland Q6. #ogChat Encrypting everything, by default, at least at the fs layer. Proper key management. Policies. Logs. Hopefully tokenized too.
  • @dustinkirkland Q6. #ogChat Ask tough questions of your #cloud or #bigdata provider. Know what they are responsible for and who has access to keys. #ogChat
    • @elinormills Agreed–> @dustinkirkland Q6. #ogChat Ask tough questions of your #cloud or #bigdataprovider. Know what they are responsible for …
  • @Dana_Gardner Q6 Treat most #BigData as a crown jewel, see it as among most valuable assets. Apply commensurate security. #data #security #ogChat
  • @elinormills Q6 govt level crypto minimum, plus protect all endpts #ogchat #bigdata #security
  • @TheTonyBradley Q6. Multi-faceted issue. Must protect raw #BigData, plus processing, analyzing, transporting, and resulting distilled analysis. #ogChat
  • @Technodad If you don’t establish trust with data source, you need to assume data needs verification, cleanup before it is used for decisions. #ogChat

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Questions for the Upcoming Big Data Security Tweet Jam on Jan. 22

By Patty Donovan, The Open Group

Last week, we announced our upcoming tweet jam on Tuesday, January 22 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST, which will examine the impact of Big Data on security and how it will change the security landscape.

Please join us next Tuesday, January 22! The discussion will be moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct. We welcome Open Group members and interested participants from all backgrounds to join the session. Our panel of experts will include:

  • Elinor Mills, former CNET reporter and current director of content and media strategy at Bateman Group (@elinormills)
  • Jaikumar Vijayan, Computerworld (@jaivijayan)
  • Chris Preimesberger, eWEEK (@editingwhiz)
  • Tony Bradley, PC World (@TheTonyBradley)
  • Michael Santarcangelo, Security Catalyst Blog (@catalyst)

The discussion will be guided by these six questions:

  1. What is #BigData security? Is it different from #data #security? #ogChat
  2. Any thoughts about #security systems as producers of #BigData, e.g., voluminous systems logs? #ogChat
  3. Most #BigData stacks have no built in #security. What does this mean for securing BigData? #ogChat
  4. How is the industry dealing with the social and ethical uses of consumer data gathered via #BigData? #ogChat #privacy
  5. What lessons from basic data security and #cloud #security can be implemented in #BigData #security? #ogChat
  6. What are some best practices for securing #BigData? #ogChat

To join the discussion, please follow the #ogChat hashtag during the allotted discussion time. Other hashtags we recommend you use during the event include:

  • Information Security: #InfoSec
  • Security: #security
  • BYOD: #BYOD
  • Big Data: #BigData
  • Privacy: #privacy
  • Mobile: #mobile
  • Compliance: #compliance

For more information about the tweet jam, guidelines and general background information, please visit our previous blog post: http://blog.opengroup.org/2013/01/15/big-data-security-tweet-jam/

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com), or leave a comment below. We anticipate a lively chat and hope you will be able to join us!


Big Data Security Tweet Jam

By Patty Donovan, The Open Group

On Tuesday, January 22, The Open Group will host a tweet jam examining the topic of Big Data and its impact on the security landscape.

Recently, Big Data has been dominating the headlines, with coverage of everything from how to manage and process it to the way it will impact your organization’s IT roadmap. As 2012 came to a close, analyst firm Gartner predicted that data will help drive IT spending to $3.8 trillion in 2014. Knowing the phenomenon is here to stay, enterprises face a new and daunting challenge: how to secure Big Data. Big Data security also raises other questions, such as: Is Big Data security different from data security? How will enterprises handle Big Data security? What is the best approach to Big Data security?

It’s yet to be seen if Big Data will necessarily revolutionize enterprise security, but it certainly will change execution – if it hasn’t already. Please join us for our upcoming Big Data Security tweet jam where leading security experts will discuss the merits of Big Data security.

Please join us on Tuesday, January 22 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. GMT for a tweet jam, moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct, that will discuss and debate the issues around Big Data security. Key areas that will be addressed during the discussion include: data security, privacy, compliance, security ethics and, of course, Big Data. We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of IT security experts, analysts and thought leaders led by Jim Hietala (@jim_hietala) and Dave Lounsbury (@Technodad) of The Open Group. To access the discussion, please follow the #ogChat hashtag during the allotted discussion time.

And for those of you who are unfamiliar with tweet jams, here is some background information:

What Is a Tweet Jam?

A tweet jam is a one hour “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on Big Data security. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat hashtag.
    • Sample: “Q1 enterprises will have to make significant adjustments moving forward to secure Big Data environments #ogChat”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com). We anticipate a lively chat and hope you will be able to join!

 


Data Governance: A Fundamental Aspect of IT

By E.G. Nadhan, HP

In an earlier post, I had explained how you can build upon SOA governance to realize Cloud governance.  But underlying both paradigms is a fundamental aspect that we have been dealing with ever since the dawn of IT—and that’s the data itself.

In fact, IT used to be referred to as “data processing.” Despite the continuing evolution of IT through various platforms, technologies, architectures and tools, at the end of the day IT is still processing data. However, the data has taken multiple shapes and forms—both structured and unstructured. And Cloud Computing has opened up opportunities to process and store structured and unstructured data. There has been a need for data governance since the day data processing was born, and today, it’s taken on a whole new dimension.

“It’s the economy, stupid,” was a campaign slogan coined to win a critical election in the United States in 1992. Today, the campaign slogan for governance in the land of IT should be, “It’s the data, stupid!”

Let us challenge ourselves with a few questions. Consider them the what, why, when, where, who and how of data governance.

What is data governance? It is the mechanism by which we ensure that the right corporate data is available to the right people, at the right time, in the right format, with the right context, through the right channels.

Why is data governance needed? The Cloud, social networking and user-owned devices (BYOD) have acted as catalysts, triggering unprecedented growth in data in recent years. We need to control and understand the data we are dealing with in order to process it effectively and securely.

When should data governance be exercised? Well, when shouldn’t it be? Data governance kicks in at the source, where the data enters the enterprise. It continues across the information lifecycle, as data is processed and consumed to address business needs. And it is also essential when data is archived and/or purged.

Where does data governance apply? It applies to all business units and across all processes. Data governance has a critical role to play at the point of storage—the final checkpoint before data is stored as “golden” in a database. Data governance also applies across all layers of the architecture:

  • Presentation layer where the data enters the enterprise
  • Business logic layer where the business rules are applied to the data
  • Integration layer where data is routed
  • Storage layer where data finds its home

Who does data governance apply to? It applies to all business leaders, consumers, generators and administrators of data. It is a good idea to identify stewards who own key data domains. Stewards must ensure that their data domains abide by the enterprise architectural principles, and they should continuously analyze the impact of various business events on their domains.

How is data governance applied? Data governance must be exercised at the enterprise level, with federated governance to individual business units and data domains. It should be proactively exercised when a new process, application, repository or interface is introduced, since existing data is likely to be impacted. In the absence of effective data governance, data is likely to be duplicated, either by chance or by choice.
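
As a toy illustration of that storage checkpoint (all field names are invented, and a real gate would enforce far richer policy), a governance check can validate required fields and reject duplicates before a record is accepted as golden:

```python
# Toy governance gate at the storage checkpoint. All field names are invented.
GOLDEN_STORE = {}  # key -> record already accepted as "golden"
REQUIRED_FIELDS = {"customer_id", "source_system", "updated_at"}

class GovernanceError(Exception):
    """Raised when a record fails a governance check."""

def governance_gate(record: dict) -> None:
    """Validate required fields and reject duplicates before storing as golden."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise GovernanceError(f"rejected: missing fields {sorted(missing)}")
    key = record["customer_id"]
    if key in GOLDEN_STORE:
        raise GovernanceError(f"rejected: duplicate of golden record {key}")
    GOLDEN_STORE[key] = record

governance_gate({"customer_id": "C-100", "source_system": "crm",
                 "updated_at": "2013-01-22"})
```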

In our data universe, “informationalization” yields valuable intelligence that enables effective decision-making and analysis. However, even having the best people, process and technology is not going to yield the desired outcomes if the underlying data is suspect.

How about you? How is the data in your enterprise? What governance measures do you have in place? I would like to know.

A version of this blog post was originally published on HP’s Journey through Enterprise IT Services blog.


2013 Open Group Predictions, Vol. 1

By The Open Group

A big thank you to all of our members and staff who have made 2012 another great year for The Open Group. There were many notable achievements this year, including the release of ArchiMate 2.0, the launch of the Future Airborne Capability Environment (FACE™) Technical Standard and the publication of the SOA Reference Architecture (SOA RA) and the Service-Oriented Cloud Computing Infrastructure Framework (SOCCI).

As we wrap up 2012, we couldn’t help but look towards what is to come in 2013 for The Open Group and the industries we‘re a part of. Without further ado, here they are:

Big Data
By Dave Lounsbury, Chief Technical Officer

Big Data is on top of everyone’s mind these days. Consumerization, mobile smart devices, and expanding retail and sensor networks are generating massive amounts of data on behavior, environment, location, buying patterns and more – producing what is being called “Big Data”. In addition, as the use of personal devices and social networks continues to gain popularity, so does the expectation of having access to such data and the computational power to use it anytime, anywhere. Organizations will turn to IT to restructure its services to meet the growing expectation of control over, and access to, data.

Organizations must embrace Big Data to drive their decision-making and to provide the optimal mix of services to customers. Big Data is becoming so big that the big challenge is how to use it to make timely decisions. IT naturally focuses on collecting data, so Big Data itself is not an issue. To allow humans to keep on top of this flood of data, industry will need to move away from programming computers to store and process data, and toward teaching computers how to assess large amounts of uncorrelated data and draw inferences from that data on their own. We also need to start thinking about the skills people need in the IT world, not only to handle Big Data, but to make it actionable. Do we need “Data Architects,” and if so, what would their role be?

In 2013, we will see the beginning of the Intellectual Computing era. IT will play an essential role in this new era and will need to help enterprises look at uncorrelated data to find the answer.

Security

By Jim Hietala, Vice President of Security

As 2012 comes to a close, some of the big developments in security over the past year include:

  • Continuation of hacktivism attacks.
  • Increase of significant and persistent threats targeting government and large enterprises.
  • The notable U.S. National Strategy for Trusted Identities in Cyberspace started to make progress in the second half of the year, in terms of industry and government movement to address fundamental security issues.
  • Security breaches were discovered by third parties, where the organizations affected had no idea that they were breached. Data from the 2012 Verizon report suggests that 92 percent of companies breached were notified by a third party.
  • Acknowledgement from senior U.S. cybersecurity professionals that organizations fall into two groups: those that know they’ve been penetrated, and those that have been penetrated, but don’t yet know it.

In 2013, we’ll no doubt see more of the same on the attack front, plus increased focus on mobile attack vectors. We’ll also see more focus on detective security controls, reflecting greater awareness of the threat and of the reality that many large organizations have already been penetrated, and that responding appropriately therefore requires far more attention on detection and incident response.

We’ll also likely see the U.S. move forward with cybersecurity guidance from the executive branch, in the form of a Presidential directive. New national cybersecurity legislation seemed to come close to happening in 2012, and when it failed to become a reality, there were many indications that the administration would make something happen by executive order.

Enterprise Architecture

By Leonard Fehskens, Vice President of Skills and Capabilities

Preparatory to my looking back at 2012 and forward to 2013, I reviewed what I wrote last year about 2011 and 2012.

Probably the most significant thing from my perspective is that so little has changed. In fact, I think in many respects the confusion about what Enterprise Architecture (EA) and Business Architecture are about has gotten worse.

The stress within the EA community continues to grow as both the demands placed on it and the diversity of opinion within it increase. This year, I saw a lot more concern about the value proposition for EA, but not a lot of (read “almost no”) convergence on what that value proposition is.

Last year I wrote “As I expected at this time last year, the conventional wisdom about Enterprise Architecture continues to spin its wheels.”  No need to change a word of that. What little progress at the leading edge was made in 2011 seems to have had no effect in 2012. I think this is largely a consequence of the dust thrown in the eyes of the community by the ascendance of the concept of “Business Architecture,” which is still struggling to define itself.  Business Architecture seems to me to have supplanted last year’s infatuation with “enterprise transformation” as the means of compensating for the EA community’s entrenched IT-centric perspective.

I think this trend and the quest for a value proposition are symptomatic of the same thing — the urgent need for Enterprise Architecture to make its case to its stakeholder community, especially to the people who are paying the bills. Something I saw in 2011 that became almost epidemic in 2012 is conflation — the inclusion under the Enterprise Architecture umbrella of nearly anything with the slightest taste of “business” to it. This has had the unfortunate effect of further obscuring the unique contribution of Enterprise Architecture, which is to bring architectural thinking to bear on the design of human enterprise.

So, while I’m not quite mired in the slough of despond, I am discouraged by the community’s inability to advance the state of the art. In a private communication to some colleagues I wrote, “the conventional wisdom on EA is at about the same state of maturity as 14th century cosmology. It is obvious to even the most casual observer that the earth is both flat and the center of the universe. We debate what happens when you fall off the edge of the Earth, and is the flat earth carried on the back of a turtle or an elephant?  Does the walking of the turtle or elephant rotate the crystalline sphere of the heavens, or does the rotation of the sphere require the turtlephant to walk to keep the earth level?  These are obviously the questions we need to answer.”

Cloud

By Chris Harding, Director of Interoperability

2012 has seen the establishment of Cloud Computing as a mainstream resource for enterprise architects and the emergence of Big Data as the latest hot topic, likely to be mainstream for the future. Meanwhile, Service-Oriented Architecture (SOA) has kept its position as an architectural style of choice for delivering distributed solutions, and the move to ever more powerful mobile devices continues. These trends have been reflected in the activities of our Cloud Computing Work Group and in the continuing support by members of our SOA work.

The use of Cloud, Mobile Computing, and Big Data to deliver on-line systems that are available anywhere at any time is setting a new norm for customer expectations. In 2013, we will see the development of Enterprise Architecture practice to ensure the consistent delivery of these systems by IT professionals, and to support the evolution of creative new computing solutions.

IT systems are there to enable the business to operate more effectively. Customers expect constant on-line access through mobile and other devices. Business organizations work better when they focus on their core capabilities, and let external service providers take care of the rest. On-line data is a huge resource, so far largely untapped. Distributed, Cloud-enabled systems, using Big Data, and architected on service-oriented principles, are the best enablers of effective business operations. There will be a convergence of SOA, Mobility, Cloud Computing, and Big Data as they are seen from the overall perspective of the enterprise architect.

Within The Open Group, the SOA and Cloud Work Groups will continue their individual work, and will collaborate with other forums and work groups, and with outside organizations, to foster the convergence of IT disciplines for distributed computing.


#ogChat Summary – 2013 Security Priorities

By Patty Donovan, The Open Group

Totaling 446 tweets, yesterday’s 2013 Security Priorities Tweet Jam (#ogChat) saw a lively discussion on the future of security in 2013 and became our most successful tweet jam to date. In case you missed the conversation, here’s a recap of yesterday’s #ogChat!

The event was moderated by former CNET security reporter Elinor Mills, and there were a total of 28 participants.

Here is a high-level snapshot of yesterday’s #ogChat:

Q1 What’s the biggest lesson learned by the security industry in 2012? #ogChat

The consensus among participants was that 2012 was a year of going back to the basics. There are many basic vulnerabilities within organizations that still need to be addressed, and it affects every aspect of an organization.

  • @Dana_Gardner Q1 … Security is not a product. It’s a way of conducting your organization, a mentality, affects all. Repeat. #ogChat #security #privacy
  • @Technodad Q1: Biggest #security lesson of 2012: everyone is in two security camps: those who know they’ve been penetrated & those who don’t. #ogChat
  • @jim_hietala Q1. Assume you’ve been penetrated, and put some focus on detective security controls, reaction/incident response #ogChat
  • @c7five Lesson of 2012 is how many basics we’re still not covering (eg. all the password dumps that showed weak controls and pw choice). #ogChat

Q2 How will organizations tackle #BYOD security in 2013? Are standards needed to secure employee-owned devices? #ogChat

Participants debated over the necessity of standards. Most agreed that standards and policies are key in securing BYOD.

  • @arj Q2: No “standards” needed for BYOD. My advice: collect as little information as possible; use MDM; create an explicit policy #ogChat
  • @Technodad @arj Standards are needed for #byod – but operational security practices more important than technical standards. #ogChat
  • @AWildCSO Organizations need to develop a strong asset management program as part of any BYOD effort. Identification and Classification #ogChat
  • @Dana_Gardner Q2 #BYOD forces more apps & data back on servers, more secure; leaves devices as zero client. Then take that to PCs too. #ogChat #security
  • @taosecurity Orgs need a BYOD policy for encryption & remote wipe of company data; expect remote compromise assessment apps too @elinormills #ogChat

Q3 In #BYOD era, will organizations be more focused on securing the network, the device, or the data? #ogChat

There was disagreement here. Some emphasized focusing on protecting data, while others argued that it is the devices and networks that need protecting.

  • @taosecurity Everyone claims to protect data, but the main ways to do so remain protecting devices & networks. Ignores code sec too. @elinormills #ogChat
  • @arj Q3: in the BYOD era, the focus must be on the data. Access is gated by employee’s entitlements + device capabilities. #ogChat
  • @Technodad @arj Well said. Data sec is the big challenge now – important for #byod, #cloud, many apps. #ogChat
  • @c7five Organization will focus more on device management while forgetting about the network and data controls in 2013. #ogChat #BYOD

Q4 What impact will using 3rd party #BigData have on corporate security practices? #ogChat

Participants agreed that using third parties will force organizations to rely on security provided by those parties. They also acknowledged that data must be secure in transit.

  • @daviottenheimer Q4 Big Data will redefine perimeter. have to isolate sensitive data in transit, store AND process #ogChat
  • @jim_hietala Q4. 3rd party Big Data puts into focus 3rd party risk management, and transparency of security controls and control state #ogChat
  • @c7five Organizations will jump into 3rd party Big Data without understanding of their responsibilities to secure the data they transfer. #ogChat
  • @Dana_Gardner Q4 You have to trust your 3rd party #BigData provider is better at #security than you are, eh? #ogChat  #security #SLA
  • @jadedsecurity @Technodad @Dana_Gardner has nothing to do with trust. Data that isn’t public must be secured in transit #ogChat
  • @AWildCSO Q4: with or without bigdata, third party risk management programs will continue to grow in 2013. #ogChat

Q5 What will global supply chain security look like in 2013? How involved should governments be? #ogChat

Participants saw supply chains as an emerging security issue in which governments need to get involved, and noted that customers will also start to understand what they are responsible for securing themselves.

  • @jim_hietala Q5. supply chain emerging as big security issue, .gov’s need to be involved, and Open Group’s OTTF doing good work here #ogChat
  • @Technodad Q5: Governments are going to act- issue is getting too important. Challenge is for industry to lead & minimize regulatory patchwork. #ogChat
  • @kjhiggins Q5: Customers truly understanding what they’re responsible for securing vs. what cloud provider is. #ogChat

Q6 What are the biggest unsolved issues in Cloud Computing security? #ogChat

Cloud security is a big issue. Most agreed that Cloud security remains opaque and needs to become more transparent. When Cloud providers claim they are secure, consumers and organizations put blind trust in them, making the problem worse.

  • @jadedsecurity @elinormills Q6 all of them. Corps assume cloud will provide CIA and in most cases even fails at availability. #ogChat
  • @jim_hietala Q6. Transparency of security controls/control state, cloud risk management, protection of unstructured data in cloud services #ogChat
  • @c7five Some PaaS cloud providers advertise security as something users don’t need to worry about. That makes the problem worse. #ogChat

Q7 What should be the top security priorities for organizations in 2013? #ogChat

Top security priorities varied. Those highlighted in the discussion included: creating a culture that promotes secure activity; prioritizing security spending based on risk; focusing on where the data resides; and bringing third-party risk management to the forefront.

  • @jim_hietala Q7. prioritizing security spend based on risks, protecting data, detective controls #ogChat
  • @Dana_Gardner Q7 Culture trumps technology and business. So make #security policy adherence a culture that is defined and rewarded. #ogChat #security
  • @kjhiggins Q7 Getting a handle on where all of your data resides, including in the mobile realm. #ogChat
  • @taosecurity Also for 2013: 1) count and classify your incidents & 2) measure time from detection to containment. Apply Lean principles to both. #ogChat
  • @AWildCSO Q7: Asset management, third party risk management, and risk based controls for 2013. #ogChat

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

1 Comment

Filed under Tweet Jam

Operational Resilience through Managing External Dependencies

By Ian Dobson & Jim Hietala, The Open Group

These days, organizations are rarely self-contained. Businesses collaborate through partnerships and close links with suppliers and customers. Outsourcing services and business processes, including into Cloud Computing, means that key operations an organization depends on are often fulfilled outside its control.

The challenge is how to manage the dependencies your operations have on factors outside your control. The goal is to perform risk management in a way that optimizes operational success by making your operations resilient to failures in those external dependencies.

The Open Group’s Dependency Modeling (O-DM) standard specifies how to construct a dependency model to manage risk and build trust over organizational dependencies between enterprises – and between operational divisions within a large organization. The standard involves constructing a model of the operations necessary for an organization’s success, including the dependencies that can affect each operation. Applying quantitative risk sensitivities to each dependency then reveals the operations with the highest exposure to risk of failure, informing business decision-makers where investment in reducing the organization’s exposure to external risks will yield the best return.
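
To make the idea concrete, here is a minimal sketch in Python of a dependency model with quantitative risk sensitivities. The operations, numbers, and the simple exposure formula below are our own illustration, not the normative O-DM calculation:

    # A minimal sketch of dependency modeling in the spirit of O-DM. The data
    # structures and the exposure formula are illustrative assumptions only.
    from dataclasses import dataclass

    @dataclass
    class Dependency:
        name: str
        failure_probability: float  # chance the external factor fails (0..1)
        sensitivity: float          # how strongly the operation depends on it (0..1)

    @dataclass
    class Operation:
        name: str
        dependencies: list

        def exposure(self) -> float:
            # Probability that at least one weighted dependency lets the
            # operation down, assuming independent failures.
            survives = 1.0
            for d in self.dependencies:
                survives *= 1.0 - d.failure_probability * d.sensitivity
            return 1.0 - survives

    operations = [
        Operation("Order fulfilment", [
            Dependency("Cloud hosting provider", 0.05, 0.9),
            Dependency("Logistics partner", 0.10, 0.7),
        ]),
        Operation("Payroll", [
            Dependency("Outsourced payroll bureau", 0.02, 1.0),
        ]),
    ]

    # Rank operations by exposure so investment can target the riskiest first.
    for op in sorted(operations, key=lambda o: o.exposure(), reverse=True):
        print(f"{op.name}: exposure {op.exposure():.1%}")

Ranking operations by exposure in this way is what points decision-makers at the dependencies where investment in resilience buys the most.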

O-DM helps you to plan for success through operational resilience, assured business continuity, and effective new controls and contingencies, enabling you to:

  • Cut costs without losing capability
  • Make the most of tight budgets
  • Build a resilient supply chain
  • Lead programs and projects to success
  • Measure, understand and manage risk from outsourcing relationships and supply chains
  • Deliver complex event analysis

The O-DM analytical process facilitates organizational agility by allowing you to easily adjust and evolve your organization’s operations model, and produces rapid results to illustrate how reducing the sensitivity of your dependencies improves your operational resilience. O-DM also allows you to drill as deep as you need to go to reveal your organization’s operational dependencies.

Training in developing operational dependency models that conform to this standard is available, as are software computation tools that automate speedy delivery of actionable results in graphic formats to facilitate informed business decision-making.

The O-DM standard represents a significant addition to our existing Open Group Risk Management publications.

The O-DM standard may be accessed here.

Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world.  In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

1 Comment

Filed under Cybersecurity, Security Architecture

Questions for the Upcoming 2013 Security Priorities Tweet Jam – Dec. 11

By Patty Donovan, The Open Group

Last week, we announced our upcoming tweet jam on Tuesday, December 11 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. GMT, which will examine the topic of IT security and what is in store for 2013.

Please join us next Tuesday, December 11! The discussion will be moderated by Elinor Mills (@elinormills), former CNET security reporter, and we welcome Open Group members and interested participants from all backgrounds to join the session alongside our panel of experts.

The discussion will be guided by these seven questions:

  1. What’s the biggest lesson learned by the security industry in 2012? #ogChat
  2. How will organizations tackle #BYOD security in 2013? Are standards needed to secure employee-owned devices? #ogChat
  3. In #BYOD era, will organizations be more focused on securing the network, the device, or the data? #ogChat
  4. What impact will using 3rd party #BigData have on corporate security practices? #ogChat
  5. What will global supply chain security look like in 2013? How involved should governments be? #ogChat
  6. What are the biggest unsolved issues in cloud computing security? #ogChat
  7. What should be the top security priorities for organizations in 2013? #ogChat

To access the discussion, please follow the #ogChat hashtag during the allotted discussion time. Other hashtags we recommend you use during the event include:

  • Information Security: #InfoSec
  • Security: #security
  • BYOD: #BYOD
  • Big Data: #BigData
  • Privacy: #privacy
  • Mobile: #mobile
  • Supply Chain: #supplychain

For more information about the tweet jam topic (security), guidelines and general background information on the event, please visit our previous blog post: http://blog.opengroup.org/2012/11/26/2013-security-priorities-tweet-jam/

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com), or leave a comment below. We anticipate a lively chat and hope you will be able to join us!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

Comments Off

Filed under Tweet Jam

The Open Group Newport Beach Conference – Early Bird Registration Ends January 4

By The Open Group Conference Team

The Open Group is busy gearing up for the Newport Beach Conference. Taking place January 28-31, 2013, the conference theme is “Big Data – The Transformation We Need to Embrace Today” and will bring together leading minds in technology to discuss the challenges and solutions facing Enterprise Architecture around the growth of Big Data. Register today!

Information is power, and we stand at a time when 90% of the data in the world today was generated in the last two years alone. Despite the sheer enormity of the task, off-the-shelf hardware, open-source frameworks, and the processing capacity of the Cloud mean that Big Data processing is within the cost-effective grasp of the average business. Organizations can now initiate Big Data projects without significant investment in IT infrastructure.

In addition to tutorial sessions on TOGAF® and ArchiMate®, the conference offers roughly 60 sessions on a variety of topics, including:

  • The ways that Cloud Computing is transforming the possibilities for collecting, storing, and processing big data.
  • How to contend with Big Data in your Enterprise?
  • How does Big Data enable your Business Architecture?
  • What does the Big Data revolution mean for the Enterprise Architect?
  • Real-time analysis of Big Data in the Cloud.
  • Security challenges in the world of outsourced data.
  • What is an architectural view of Security for the Cloud?

Plenary speakers include:

  • Christian Verstraete, Chief Technologist – Cloud Strategy, HP
  • Mary Ann Mezzapelle, Strategist – Security Services, HP
  • Michael Cavaretta, Ph.D, Technical Leader, Predictive Analytics / Data Mining Research and Advanced Engineering, Ford Motor Company
  • Adrian Lane, Analyst and Chief Technical Officer, Securosis
  • David Potter, Chief Technical Officer, Promise Innovation Oy
  • Ron Schuldt, Senior Partner, UDEF-IT, LLC

A full conference agenda is available here. Tracks include:

  • Architecting Big Data
  • Big Data and Cloud Security
  • Data Architecture and Big Data
  • Business Architecture
  • Distributed Services Architecture
  • EA and Disruptive Technologies
  • Architecting the Cloud
  • Cloud Computing for Business

Early Bird Registration

Early Bird registration for The Open Group Conference in Newport Beach ends January 4. Register now and save! For more information or to register: http://www.opengroup.org/event/open-group-newport-beach-2013/reg

Upcoming Conference Submission Deadlines

In addition to the Early Bird registration deadline to attend the Newport Beach conference, there are upcoming deadlines for speaker proposal submissions to Open Group conferences in Sydney, Philadelphia and London. To submit a proposal to speak, click here.

  • Sydney (April 15-17): Finance, Defense, Mining; submission deadline January 18, 2013
  • Philadelphia (July 15-17): Healthcare, Finance, Defense; submission deadline April 5, 2013
  • London (October 21-23): Finance, Government, Healthcare; submission deadline July 8, 2013

We expect space on the agendas of these events to be at a premium, so it is important for proposals to be submitted as early as possible. Proposals received after the deadline dates will still be considered, if space is available; if not, they may be carried over to a future conference. Priority will be given to proposals received by the deadline dates and to proposals that include an end-user organization, at least as a co-presenter.

Comments Off

Filed under Conference

Data Protection Today and What’s Needed Tomorrow

By Ian Dobson and Jim Hietala, The Open Group

Technology today allows thieves to copy sensitive data, leaving the original in place and thus avoiding detection. One needn’t look far in today’s headlines to understand why protection of data is critical going forward. As this recent article from Bloomberg points out, penetrations of corporate IT systems with the aim of extracting sensitive information, IP and other corporate data are rampant. Despite the existence of data breach and data privacy laws in the U.S., EU and elsewhere, this issue is still not well publicized. The article cites specific intrusions at large consumer products companies, the EU itself, law firms and a nuclear power plant.

Published in October 2012, the Jericho Forum® Data Protection white paper reviews the state of data protection today and where it should be heading to meet tomorrow’s business needs. The Open Group’s Jericho Forum contends that future data protection solutions must aim to provide stronger, more flexible protection mechanisms around the data itself.

The white paper argues that some of the current issues with data protection are:

  • It is applied too globally and remotely to be effective
  • It is neither granular nor interoperable enough
  • It is not integrated with Centralized Authorization Services
  • It relies on weak security services for enforcement

Refreshingly, it explains not only why, but also how. The white paper reviews the key issues surrounding data protection today; describes properties that data protection mechanisms should include to meet current and future requirements; considers why current technologies don’t deliver what is required; and proposes a set of data protection principles to guide the design of effective solutions.

It goes on to describe how data protection has evolved to where it is today, and outlines a series of target stages for progressively moving the industry forward to deliver the stronger, more flexible protection solutions that business managers are already demanding their IT systems managers provide. Businesses require these solutions to ensure appropriate data protection levels are wrapped around the rapidly increasing volumes of confidential information shared with their business partners, suppliers, customers and outworkers/contractors on a daily basis.
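
As a rough illustration of what protection “around the data itself” can mean, the sketch below wraps data with its own key and an explicit access policy, so the protection travels with the data rather than relying on the perimeter. It assumes Python’s widely used cryptography package; the policy format and helper names are invented for this example and are not taken from the white paper:

    # Illustrative sketch: wrap data with its own key and an explicit access
    # policy, so protection travels with the data. The policy format and
    # helpers are invented here; a real solution would integrate with a
    # centralized authorization service, as the white paper recommends.
    from cryptography.fernet import Fernet

    def protect(data: bytes, policy: dict) -> dict:
        key = Fernet.generate_key()
        package = {
            "policy": policy,  # who may open this data
            "ciphertext": Fernet(key).encrypt(data),
        }
        # In practice the key would be escrowed with an authorization service
        # that releases it only when the policy is satisfied; returning it
        # alongside the package here is purely for demonstration.
        return {"package": package, "key": key}

    def unprotect(package: dict, key: bytes, requester_role: str) -> bytes:
        if requester_role not in package["policy"]["allowed_roles"]:
            raise PermissionError("access policy denies this requester")
        return Fernet(key).decrypt(package["ciphertext"])

    bundle = protect(b"quarterly forecast", {"allowed_roles": ["finance"]})
    print(unprotect(bundle["package"], bundle["key"], "finance"))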

Having mapped out an evolutionary path for moving data protection forward in the direction our industry needs, we are now planning optimum approaches for achieving each successive stage of protection. The Jericho Forum welcomes folks who want to join us in this important journey.


Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world.  In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

1 Comment

Filed under Cybersecurity

#ogChat Summary – The Future of BYOD

By Patty Donovan, The Open Group

With over 400 tweets flying back and forth, last week’s BYOD Tweet Jam (#ogChat) saw a fast-paced, lively discussion on the future of the bring your own device (BYOD) trend and its implications in the enterprise. In case you missed the conversation, here’s a recap of last week’s #ogChat!

There were 29 participants in total.

Here is a high-level snapshot of the #ogChat:

Q1 What are the quantifiable benefits of BYOD? What are the major risks of #BYOD, and do these risks outweigh the benefits? #ogChat

Participants generally agreed that the main risk of BYOD is data security and benefits include cost and convenience.

  • @MobileGalen Data policy is core because that’s where the real value is in business. Affects access and intrusion/hacking of course secondarily #ogChat
  • @technodad Q1 #BYOD transcends time/space boundaries – necessary for a global business. #ogChat
  • @AWildCSO Q1 Risks: Risk to integrity and availability of corporate IT systems – malware into enterprise from employee owned devices #ogChat

Q2 What are the current security issues with #BYOD, and how should organizations go about securing those devices? #ogChat

The most prominent issue discussed was who owns responsibility for security. Many couldn’t agree on whether responsibility falls on the user or the organization.

  • @AWildCSO Q2: Main issue is the confidentiality of data. Not a new issue, has been around a while, especially since the advent of networking. #ogChat
  • @cebess .@MobileGalen Right — it’s about the data not the device. #ogChat
  • @AppsTechNews Q2 Not knowing who’s responsible? Recent ITIC/KnowBe4 survey: 37% say corporation responsible for #BYOD security; 39% say end user #ogChat
  • @802dotchris @MobileGalen there’s definitely a “golden ratio” of functionality to security and controls @IDGTechTalk #ogChat
  • @MobileGalen #ogChat Be careful about looking for mobile mgmt tools as your fix. Most are about disablement not enablement. Start w enable, then protect.

Q3 How can an organization manage corporate data on employee owned devices, while not interfering with data owned by an employee? #ogChat

Most participants agreed that securing corporate data is a priority but were stumped when it came to maintaining personal data privacy. Some suggested that organizations will have no choice but to interfere with personal data, but all agreed that no matter what the policy, it needs to be clearly communicated to employees.

  • @802dotchris @jim_hietala in our research, we’re seeing more companies demand app-by-app wipe or other selective methods as MDM table stakes #ogChat
  • @AppsTechNews Q3 Manage the device, manage & control apps running on it, and manage data within those apps – best #BYOD solutions address all 3 #ogChat
  • @JonMoger @theopengroup #security #ogChat #BYOD is a catalyst for a bigger trend driven by cultural shift that affects HR, legal, finance, LOB.
  • @bobegan I am a big believer in people, and I think most employees feel that they own a piece of corporate policy #ogChat
  • @mobilityofficer @theopengroup Q3: Sometimes you have no choice but to interfere with private data but you must communicate that to employees #ogChat

Q4 How does #BYOD contribute to the creation or use of #BigData in the enterprise? What role does #BYOD play in #BigData strategy? #ogChat

Participants exchanged opinions on the relationship between BYOD and Big Data, leaving much room for future discussion.

  • @technodad Q4 #bigdata created by mobile, geotagged, realtime apps is gold dust for business analytics & marketing. Smart orgs will embrace it. #ogChat
  • @cebess .@technodad Context is king. The device in the field has quite a bit of contextual info. #ogChat
  • @bobegan @cebess Right, a mobile strategy, including BYOD, is really about information supply chain management. Must include many audiences #ogChat

Q5 What best practices can orgs implement to provide #BYOD flexibility and also maintain control and governance over corporate data? #ogChat

When discussing best practices, it became clear that no matter what, organizations must educate employees and be consistent with business priorities. Furthermore, if data is precious, treat it that way.

  • @AWildCSO Q5: Establish policies and processes for the classification, ownership and custodianship of information assets. #ogChat
  • @MobileGalen #ogChat: The more precious your info, the less avail it should be, BYOD or not. Use containered apps for sensitive, local access for secret
  • @JonMoger @theopengroup #BYOD #ogChat 1. Get the right team to own 2. Educate mgmt on risks & opps 3. Set business priorities 4. Define policies

Q6 How will organizations embrace or reject #BYOD moving forward? Will they have a choice or will employees dictate use? #ogChat

While understanding the security risks, most participants embraced BYOD as a big trend that will eventually become the standard moving forward.

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

Comments Off

Filed under Tweet Jam

Questions for the Upcoming BYOD Tweet Jam – Sept. 18

By Patty Donovan, The Open Group

Earlier this week, we announced our upcoming tweet jam on Tuesday, September 18 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST, which will examine the topic of Bring-Your-Own-Device (BYOD) and current approaches to managing it.

Please join us next Tuesday! We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of experts.

To access the discussion, please follow the #ogChat hashtag during the allotted discussion time. Other hashtags we recommend you use during the event include:

  • BYOD: #BYOD
  • Mobile: #mobile
  • Social Media: #socialmedia
  • Smartphone: #smartphone
  • iPhone: #iPhone
  • Apple: #Apple
  • Android: #Android (Twitter Account @Android)
  • Tablet: #tablet
  • iPad: #iPad
  • Security: #security
  • Big Data: #BigData
  • Privacy: #privacy
  • Open Group Conference, Barcelona: #ogBCN

Below is the list of questions that will guide the hour-long discussion:

  1. What are the quantifiable benefits of BYOD? What are the major risks of #BYOD, and do these risks outweigh the benefits? #ogChat
  2. What are the current security issues with #BYOD, and how should organizations go about securing those devices? #ogChat
  3. How can an organization manage corporate data on employee owned devices, while not interfering with data owned by an employee? #ogChat
  4. How does #BYOD contribute to the creation or use of #BigData in the enterprise? What role does #BYOD play in #BigData strategy? #ogChat
  5. What best practices can orgs implement to provide #BYOD flexibility and also maintain control and governance over corporate data? #ogChat
  6. How will organizations embrace or reject #BYOD moving forward? Will they have a choice or will employees dictate use? #ogChat

For more information about the tweet jam topic (BYOD), guidelines and general background information on the event, please visit our previous blog post: http://blog.opengroup.org/2012/09/10/the-future-of-byod-tweet-jam/

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com), or leave a comment below. We anticipate a lively chat and hope you will be able to join!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

Comments Off

Filed under Tweet Jam

Optimizing ISO/IEC 27001 Using O-ISM3

By Jim Hietala, The Open Group and Vicente Aceituno, Sistemas Informáticos Abiertos

The Open Group has just published a guide titled “Optimizing ISO/IEC 27001 using O-ISM3” that will be of interest to organizations using ISO27001/27002 as their Information Security Management System (ISMS).

By way of background, The Open Group published our Open Information Security Management Maturity Model (O-ISM3) last year. O-ISM3 brings continuous improvement to information security management, and it provides a framework for security decision-making that is top-down in nature, where security controls, security objectives and spending decisions are driven by (and aligned with) business objectives.

We have for some time now heard from information security managers that they would like a resource aimed at showing how the O-ISM3 standard could be used to manage information security alongside ISO27001/27002. This new guide provides specific guidance on this topic.

We view this as an important resource, for the following reasons:

  • O-ISM3 complements ISO27001/2 by adding the “how” dimension to information security management
  • O-ISM3 uses a process-oriented approach, defining inputs and outputs, and allowing for evaluation by process-specific metrics
  • O-ISM3 provides a framework for continuous improvement of information security processes

This resource:

  • Maps O-ISM3 and ISO27001 security objectives
  • Maps ISO27001/27002 controls and documents to O-ISM3 security processes, documents, and outputs
  • Provides a critical linkage between the controls-based approach found in ISO27001 and the process-based approach found in O-ISM3
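
To give a feel for the process-oriented, metrics-driven approach described above, here is a minimal Python sketch of a security process defined by its inputs, outputs, and a process-specific metric. The process, measurements, and target below are invented for this example, not drawn from the guide:

    # Illustrative sketch of a process defined by inputs, outputs, and a
    # process-specific metric, in the spirit of O-ISM3's process orientation.
    # The process, values, and target are assumptions for this example.
    from dataclasses import dataclass, field

    @dataclass
    class SecurityProcess:
        name: str
        inputs: list
        outputs: list
        measurements: list = field(default_factory=list)  # e.g., hours per incident

        def record(self, value: float) -> None:
            self.measurements.append(value)

        def metric(self) -> float:
            # Average of the recorded measurements for this process.
            return sum(self.measurements) / len(self.measurements)

    patching = SecurityProcess(
        name="Patch management",
        inputs=["vendor advisories", "asset inventory"],
        outputs=["patched systems", "exception reports"],
    )
    for hours in (12, 36, 20):  # time from advisory to patch, per incident
        patching.record(hours)

    # Continuous improvement: compare the metric against a target and adjust.
    target_hours = 24
    print(f"{patching.name}: average {patching.metric():.0f}h against a {target_hours}h target")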

If you have interest in information security management, we encourage you to have a look at Optimizing ISO/IEC 27001 using O-ISM3. The guide may be downloaded (at no cost, minimal registration required) here.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Vicente Aceituno, CISA, has 20 years’ experience in the field of IT and Information Security. During his career in Spain and the UK, he has worked for companies like Coopers & Lybrand, BBC News, Everis, and SIA Group. He is the main author of the Information Security Management Method ISM3, author of the information security book “Seguridad de la Información,” Director of the ISM3 Consortium (www.ism3.com) and President of the Spanish chapter of the ISSA.

3 Comments

Filed under Cybersecurity, Information security, Security Architecture