
Beyond Big Data

By Chris Harding, The Open Group

The big bang that started The Open Group Conference in Newport Beach was, appropriately, a presentation related to astronomy. Chris Gerty gave a keynote on Big Data at NASA, where he is Deputy Program Manager of the Open Innovation Program. He told us how visualizing deep space and its celestial bodies created understanding and enabled new discoveries. Everyone who attended felt inspired to explore the universe of Big Data during the rest of the conference. And that exploration – as is often the case with successful space missions – left us wondering what lies beyond.

The Big Data Conference Plenary

The second presentation on that Monday morning brought us down from the stars to the nuts and bolts of engineering. Mechanical devices require regular maintenance to keep functioning. Processing the mass of data generated during their operation can improve safety and cut costs. For example, airlines can overhaul aircraft engines when maintenance is actually needed, rather than on a fixed schedule that must be frequent enough to prevent damage under most conditions, yet might still fail to anticipate failure in unusual circumstances. David Potter and Ron Schuldt lead two of The Open Group’s initiatives, Quantum Lifecycle Management (QLM) and the Universal Data Element Framework (UDEF). They explained how a semantic approach to product lifecycle management can facilitate the big-data processing needed to achieve this aim.

Chris Gerty was then joined by Andras Szakal, vice-president and chief technology officer at IBM US Federal IMT, Robert Weisman, chief executive officer of Build The Vision, and Jim Hietala, vice-president of Security at The Open Group, in a panel session on Big Data that was moderated by Dana Gardner of Interarbor Solutions. As always, Dana facilitated a fascinating discussion. Key points made by the panelists included: the trend to monetize data; the need to ensure veracity and usefulness; the need for security and privacy; the expectation that data warehouse technology will exist and evolve in parallel with map/reduce “on-the-fly” analysis; the importance of meaningful presentation of the data; integration with cloud and mobile technology; and the new ways in which Big Data can be used to deliver business value.

More on Big Data

In the afternoons of Monday and Tuesday, and on most of Wednesday, the conference split into streams. These featured presentations that were more technical than the plenary, going deeper into their subjects. It’s a pity that you can’t be in all the streams at once. (At one point I couldn’t be in any of them, as there was an important side meeting to discuss the UDEF, which is one of the areas I support as forum director.) Fortunately, there were a few great stream presentations that I did manage to get to.

On the Monday afternoon, Tom Plunkett and Janet Mostow of Oracle presented a reference architecture that combined Hadoop and NoSQL with traditional RDBMS, streaming, and complex event processing, to enable Big Data analysis. One application that they described was to trace the relations between particular genes and cancer. This could have big benefits in disease prediction and treatment. Another was to predict the movements of protesters at a demonstration through analysis of communications on social media. The police could then concentrate their forces in the right place at the right time.

Jason Bloomberg, president of ZapThink – now part of Dovel – is always thought-provoking. His presentation featured the need for governance vitality to cope with ever-changing tools that handle Big Data of ever-increasing size, “crowdsourcing” to channel the efforts of many people into solving a problem, and business transformation that is continuous rather than a one-time step from “as is” to “to be.”

Later in the week, I moderated a discussion on Architecting for Big Data in the Cloud. We had a well-balanced panel made up of TJ Virdi of Boeing, Mark Skilton of Capgemini and Tom Plunkett of Oracle. They made some excellent points. Big Data analysis provides business value by enabling better understanding, leading to better decisions. The analysis is often an iterative process, with new questions emerging as answers are found. There is no single application that does this analysis and provides the visualization needed for understanding, but there are a number of products that can be used to assist. The role of the data scientist in formulating the questions and configuring the visualization is critical. Reference models for the technology are emerging but there are as yet no commonly-accepted standards.

The New Enterprise Platform

Jogging is a great way of taking exercise at conferences, and I was able to go for a run most mornings before the meetings started at Newport Beach. Pacific Coast Highway isn’t the most interesting of tracks, but on Tuesday morning I was soon up in Castaways Park, pleasantly jogging through the carefully-nurtured natural coastal vegetation, with views over the ocean and its margin of high-priced homes, slipways, and yachts. I reflected as I ran that we had heard some interesting things about Big Data, but it is now an established topic. There must be something new coming over the horizon.

The answer to what this might be was suggested in the first presentation of that day’s plenary. Mary Ann Mezzapelle, security strategist for HP Enterprise Services, talked about the need to get security right for Big Data and the Cloud. But her scope was actually wider. She spoke of the need to secure the “third platform” – the term coined by IDC to describe the convergence of social, cloud and mobile computing with Big Data.

Securing Big Data

Mary Ann’s keynote was not about the third platform itself, but about what should be done to protect it. The new platform brings with it a new set of security threats, and the increasing scale of operation makes it increasingly important to get the security right. Mary Ann presented a thoughtful analysis founded on a risk-based approach.

She was followed by Adrian Lane, chief technology officer at Securosis, who pointed out that Big Data processing using NoSQL has a different architecture from traditional relational data processing, and requires different security solutions. This does not necessarily mean new techniques; existing techniques can be used in new ways. For example, Kerberos may be used to secure inter-node communications in map/reduce processing. Adrian’s presentation completed the Tuesday plenary sessions.

Service Oriented Architecture

The streams continued after the plenary. I went to the Distributed Services Architecture stream, which focused on SOA.

Bill Poole, enterprise architect at JourneyOne in Australia, described how to use the graphical architecture modeling language ArchiMate® to model service-oriented architectures. He illustrated this using a case study of a global mining organization that wanted to consolidate its two existing bespoke inventory management applications into a single commercial off-the-shelf application. It’s amazing how a real-world case study can make a topic come to life, and the audience certainly responded warmly to Bill’s excellent presentation.

Ali Arsanjani, chief technology officer for Business Performance and Service Optimization, and Heather Kreger, chief technology officer for International Standards, both at IBM, described the range of SOA standards published by The Open Group and available for use by enterprise architects. Ali was one of the brains that developed the SOA Reference Architecture, and Heather is a key player in international standards activities for SOA, where she has helped The Open Group’s Service Integration Maturity Model and SOA Governance Framework to become international standards, and is working on an international standard SOA reference architecture.

Cloud Computing

To start Wednesday’s Cloud Computing streams, TJ Virdi, senior enterprise architect at The Boeing Company, discussed use of TOGAF® to develop an Enterprise Architecture for a Cloud ecosystem. A large enterprise such as Boeing may use many Cloud service providers, enabling collaboration between corporate departments, partners, and regulators in a complex ecosystem. Architecting for this is a major challenge, and The Open Group’s TOGAF for Cloud Ecosystems project is working to provide guidance.

Stuart Boardman of KPN gave a different perspective on Cloud ecosystems, with a case study from the energy industry. An ecosystem may not necessarily be governed by a single entity, and the participants may not always be aware of each other. Energy generation and consumption in the Netherlands is part of a complex international ecosystem involving producers, consumers, transporters, and traders of many kinds. A participant may be involved in several ecosystems in several ways: a farmer, for example, might consume energy, have wind turbines to produce it, and also participate in food production and transport ecosystems.

Penelope Gordon of 1-Plug Corporation explained how choice and use of business metrics can impact Cloud service providers. She worked through four examples: a start-up Software-as-a-Service provider requiring investment, an established company thinking of providing its products as cloud services, an IT department planning to offer an in-house private Cloud platform, and a government agency seeking budget for government Cloud.

Mark Skilton, director at Capgemini in the UK, gave a presentation titled “Digital Transformation and the Role of Cloud Computing.” He covered a very broad canvas of business transformation driven by technological change, and illustrated his theme with a case study from the pharmaceutical industry. New technology enables new business models, giving competitive advantage. Increasingly, the introduction of this technology is driven by the business, rather than the IT side of the enterprise, and it poses major challenges for both sides. But what new technologies are in question? Mark’s presentation had Cloud in the title, but also featured social and mobile computing, and Big Data.

The New Trend

On Thursday morning I took a longer run, to and around Balboa Island. With only one road in or out, its main street of shops and restaurants is not a through route, and the island has the feel of a real village. The SOA Work Group Steering Committee had found an excellent, and reasonably priced, Italian restaurant there the previous evening. There is a clear resurgence of interest in SOA, partly driven by the use of service orientation – the principle, rather than particular protocols – in Cloud Computing and other new technologies. That morning I took the track round the shoreline, and was reminded a little of Dylan Thomas’s “fishingboat-bobbing sea.” Fishing here is for leisure rather than livelihood, but I suspected that the fishermen, like those of Thomas’s little Welsh village, spend more time in the bar than on the water.

I thought about how the conference sessions had indicated an emerging trend. This is not a new technology but the combination of four current technologies to create a new platform for enterprise IT: Social, Cloud, and Mobile computing, and Big Data. Mary Ann Mezzapelle’s presentation had referenced IDC’s “third platform.” Other discussions had mentioned Gartner’s “Nexus of Forces,” the combination of Social, Cloud and Mobile computing with information that Gartner says is transforming the way people and businesses relate to technology, and will become a key differentiator of business and technology management. Mark Skilton had included these same four technologies in his presentation. Great minds, and analyst corporations, think alike!

I thought also about the examples and case studies in the stream presentations. Areas as diverse as healthcare, manufacturing, energy and policing are using the new technologies. Clearly, they can deliver major business benefits. The challenge for enterprise architects is to maximize those benefits through pragmatic architectures.

Emerging Standards

On the way back to the hotel, I remarked again on what I had noticed before, how beautifully neat and carefully maintained the front gardens bordering the sidewalk are. I almost felt that I was running through a public botanical garden. Is there some ordinance requiring people to keep their gardens tidy, with severe penalties for anyone who leaves a lawn or hedge unclipped? Is a miserable defaulter fitted with a ball and chain, not to be removed until the untidy vegetation has been properly trimmed, with nail clippers? Apparently not. People here keep their gardens tidy because they want to. The best standards are like that: universally followed, without use or threat of sanction.

Standards are an issue for the new enterprise platform. Apart from the underlying standards of the Internet, there really aren’t any. The area isn’t even mapped out. Vendors of Social, Cloud, Mobile, and Big Data products and services are trying to stake out as much valuable real estate as they can. They have no interest yet in boundaries with neatly-clipped hedges.

This is a stage that every new technology goes through. Then, as it matures, the vendors understand that their products and services have much more value when they conform to standards, just as properties have more value in an area where everything is neat and well-maintained.

It may be too soon to define those standards for the new enterprise platform, but it is certainly time to start mapping out the area, to understand its subdivisions and how they inter-relate, and to prepare the way for standards. Following the conference, The Open Group has announced a new Forum, provisionally titled Open Platform 3.0, to do just that.

The SOA and Cloud Work Groups

Thursday was my final day of meetings at the conference. The plenary and streams presentations were done. This day was for working meetings of the SOA and Cloud Work Groups. I also had an informal discussion with Ron Schuldt about a new approach for the UDEF, following up on the earlier UDEF side meeting. The conference hallways, as well as the meeting rooms, often see productive business done.

The SOA Work Group discussed a certification program for SOA professionals, and an update to the SOA Reference Architecture. The Open Group is working with ISO and the IEEE to define a standard SOA reference architecture that will have consensus across all three bodies.

The Cloud Work Group had met earlier to further the TOGAF for Cloud ecosystems project. Now it worked on its forthcoming white paper on business performance metrics. It also – though this was not on the original agenda – discussed Gartner’s Nexus of Forces, and the future role of the Work Group in mapping out the new enterprise platform.

Mapping the New Enterprise Platform

At the start of the conference we looked at how to map the stars. Big Data analytics enables people to visualize the universe in new ways, reach new understandings of what is in it and how it works, and point to new areas for future exploration.

As the conference progressed, we found that Big Data is part of a convergence of forces. Social, mobile, and Cloud Computing are being combined with Big Data to form a new enterprise platform. The development of this platform, and its roll-out to support innovative applications that deliver more business value, is what lies beyond Big Data.

At the end of the conference we were thinking about mapping the new enterprise platform. This will not require sophisticated data processing and analysis. It will take discussions to create a common understanding, and detailed committee work to draft the guidelines and standards. This work will be done by The Open Group’s new Open Platform 3.0 Forum.

The next Open Group conference is in the week of April 15, in Sydney, Australia. I’m told that there’s some great jogging there. More importantly, we’ll be reflecting on progress in mapping Open Platform 3.0, and thinking about what lies ahead. I’m looking forward to it already.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Open Group Panel Explores Changing Field of Risk Management and Analysis in the Era of Big Data

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: The Open Group Panel Explores Changing Field of Risk Management and Analysis in Era of Big Data

This is a transcript of a sponsored podcast discussion on the threats from and promise of Big Data in securing enterprise information assets, in conjunction with The Open Group Conference in Newport Beach.

Dana Gardner: Hello, and welcome to a special thought leadership interview series coming to you in conjunction with The Open Group Conference on January 28 in Newport Beach, California.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these business transformation discussions. The conference itself is focusing on Big Data, the transformation we need to embrace today.

We’re here now with a panel of experts to explore new trends and solutions in the area of risk management and analysis. We’ll learn how large enterprises are delivering risk assessments and risk analysis, and we’ll see how Big Data can be both a source of risk to protect against and a tool for better understanding and mitigating risks.

With that, please join me in welcoming our panel. We’re here with Jack Freund, PhD, the Information Security Risk Assessment Manager at TIAA-CREF. Welcome, Jack.

Jack Freund: Hello Dana, how are you?

Gardner: I’m great. Glad you could join us.

We are also here with Jack Jones, Principal of CXOWARE. He has more than nine years of experience as a Chief Information Security Officer and is the inventor of the Factor Analysis of Information Risk (FAIR) framework. Welcome, Jack.

Jack Jones: Thank you. And we’re also here with Jim Hietala, Vice President, Security for The Open Group. Welcome, Jim.

Jim Hietala: Thanks, Dana.

Gardner: All right, let’s start out by looking at this from the position of trends. Why is the issue of risk analysis so prominent now? What’s different from, say, five years ago? And we’ll start with you, Jack Jones.

Jones: The information security industry has struggled with getting the attention of and support from management and businesses for a long time, and it has finally come around to the fact that the executives care about loss exposure — the likelihood of bad things happening and how bad those things are likely to be.

It’s only when we speak of those issues in terms of risk that we make sense to those executives. And once we do that, we begin to gain some credibility and traction in terms of getting things done.

Gardner: So we really need to talk about this in the terms that a business executive would appreciate, not necessarily an IT executive.

Effects on business

Jones: Absolutely. They’re tired of hearing about vulnerabilities, hackers, and that sort of thing. It’s only when we can talk in terms of the effect on the business that it makes sense to them.

Gardner: Jack Freund, I should also point out that you have more than 14 years in enterprise IT experience. You’re a visiting professor at DeVry University and you chair a risk-management subcommittee for ISACA? Is that correct?

Freund: ISACA, yes.

Gardner: And do you agree?

Freund: The problem that we have as a profession, and I think it’s a big problem, is that we have allowed ourselves to escape the natural trend that the other IT professions have already followed.

There was a time, years ago, when you could code in the basement, and nobody cared much about what you were doing. But now, largely speaking, developers and systems administrators are very focused on meeting the goals of the organization.

Security has been allowed to miss that boat a little. We have been allowed to hide behind this aura of a protector and an alerter of terrible things that could happen, without really tying ourselves to the problems that organizations are facing and how we can help them succeed in what they’re doing.

Gardner: Jim Hietala, how do you see things that are different now than a few years ago when it comes to risk assessment?

Hietala: There are certainly changes on the threat side of the landscape. Five years ago, you didn’t really have hacktivism or this notion of an advanced persistent threat (APT).

That highly skilled attacker taking aim at governments and large organizations didn’t really exist, or didn’t exist to the degree it does today. So that has changed.

You also have big changes to the IT platform landscape, all of which bring new risks that organizations need to really think about. The mobility trend, the Cloud trend, the big-data trend that we are talking about today, all of those things bring new risk to the organization.

As Jack Jones mentioned, business executives don’t want to hear about, “I’ve got 15 vulnerabilities in the mobility part of my organization.” They want to understand what’s the risk of bad things happening because of mobility, what we’re doing about it, and what’s happening to risk over time?

So it’s a combination of changes in the threats and attackers, as well as changes to the IT landscape, that means we have to take a different look at how we measure and present risk to the business.

Gardner: Because we’re at a Big Data conference, do you share my perception, Jack Jones, that Big Data can be a source of risk and vulnerability, but also that the analytics and business intelligence (BI) tools we’re employing with Big Data can be used to alert you to risks and provide a strong tool for better understanding your true risk setting or environment?

Crown jewels

Jones: You are absolutely right. You think of Big Data and, by definition, it’s where your crown jewels, and everything that leads to the crown jewels from an information perspective, are going to be found. It’s like one-stop shopping for the bad guy, if you want to look at it in that context. It definitely needs to be protected. The architecture surrounding it, and its integration across a lot of different platforms, can be leveraged and will probably result in a complex landscape to try to secure.

There are a lot of ways into that data. But if you can leverage that same Big Data architecture as an approach to information security, with log data and other threat and vulnerability data, you should be able to make some significant gains in how well-informed your analyses and your decisions are.
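
To make that idea concrete, here is a minimal, single-machine sketch of the kind of security analytics being described: counting failed logins per source address in syslog-style logs and flagging the heavy hitters. The log format, regular expression, and threshold are illustrative assumptions, not anything the panel specified; at Big Data scale the same count-and-threshold logic would run as a distributed job over the full log corpus.

```python
# A minimal sketch of security-log analytics, assuming syslog-style
# sshd lines such as:
#   "Jan 28 09:15:02 host sshd[411]: Failed password for root from 10.0.0.7 port 22 ssh2"
import re
from collections import Counter

FAILED = re.compile(r"Failed password for .+ from (\S+)")

def suspicious_sources(log_lines, threshold=100):
    """Return source addresses with at least `threshold` failed logins."""
    counts = Counter()
    for line in log_lines:
        match = FAILED.search(line)
        if match:
            counts[match.group(1)] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}

# Usage sketch: suspicious_sources(open("/var/log/auth.log"))
```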

Gardner: Jack Freund, do you share that? How does Big Data fit into your understanding of the evolving arena of risk assessment and analysis?

Freund: If we fast-forward five years, and this is even true today, a lot of people on the cutting edge of Big Data will tell you the problem isn’t so much pulling everything together and figuring out what it can do. They are going to tell you that the problem is what we do once we figure out everything that we have. This is the problem that we have traditionally had on a much smaller scale in information security. When everything is important, nothing is important.

Gardner: To follow up on that, where do you see the gaps in risk analysis in large organizations? In other words, what parts of organizations aren’t being assessed for risk and should be?

Freund: The big problem that exists largely today in the way risk assessments are done is the focus on labels. We want to quickly address the low, medium, and high things and know where they are. But there are inherent problems in the way that we think about those labels, without doing any of the analysis legwork.

What’s really missing is that true analysis. If the system goes offline, do we lose money? If the system becomes compromised, what are the cost-accounting things that will happen that allow us to figure out how much money we’re going to lose?

That analysis work is largely missing. That’s the gap. The assumption is that if a control is not in place, then there’s a risk that must be addressed in some fashion. So we end up with these very long lists of horrible, terrible things that can be done to us in all sorts of different ways, without any relevance to the overall business of the organization.

Every day, our organizations are out there selling products and offering services, which is, in and of itself, a risky venture. So tying what we do from an information security perspective to that is critical, not just for the success of the organization, but for the success of our profession.

Gardner: So we can safely say that large companies are probably pretty good at cost-benefit analysis or they wouldn’t be successful. Now, I guess we need to ask them to take that a step further and do a cost-risk analysis, but in business terms, being mindful that their IT systems might be a much larger part of that than they had once considered. Is that fair, Jack?

Risk implications

Jones: Businesses have been making these decisions, chasing the opportunity, but generally, without any clear understanding of the risk implications, at least from the information security perspective. They will have us in the corner screaming and throwing red flags in there, and talking about vulnerabilities and threats from one thing or another.

But, we come to the table with red, yellow, and green indicators, and on the other side of the table, they’ve got numbers. Well, here is what we expect to earn in revenue from this initiative, and the information security people are saying it’s crazy. How do you normalize the quantitative revenue gain versus red, yellow, and green?

Gardner: Jim Hietala, do you see it in the same red, yellow, green terms, or are there other frameworks or standard methodologies that The Open Group is looking at to make this a bit more of a science?

Hietala: Probably four years ago, we published what we call the Risk Taxonomy Standard, which is based upon FAIR, the framework that Jack Jones invented. So we’re big believers in bringing that level of precision to doing risk analysis. Having just gone through training for FAIR myself, as part of the standards effort that we’re doing around certification, I can say that it really brings a level of precision and a depth of analysis that have frequently been lacking in IT security and risk management.

Gardner: We’ve talked about how organizations need to be mindful that their risks are higher and different than in the past and we’ve talked about how standardization and methodologies are important, helping them better understand this from a business perspective, instead of just a technology perspective.

But I’m curious about the cultural and organizational perspective. Whose job should this fall under? Who is wearing the white hat in the company and can rally the forces of good to get all the bad things managed? Is this a single person’s job, or a cultural, organizational mission? How do you make this work in the enterprise in a real-world way? Let’s go to you, Jack Freund.

Freund: The profession of IT risk management is changing. That profession will have to sit between the business and information security, inclusive of all the other IT functions that make that happen.

In order to be successful sitting between these two groups, you have to be able to speak the language of both of those groups. You have to be able to understand profit and loss and capital expenditure on the business side. On the IT risk side, you have to be technical enough to do all those sorts of things.

But I think the sum total of those two things is probably only about 50 percent of the job of IT risk management today. The other 50 percent is communication. Finding ways to translate that language and to understand the needs and concerns of each side of that relationship is really the job of IT risk management.

To answer your question, I think it’s absolutely the job of IT risk management to do that. From my own experiences with the FAIR framework, I can say that using FAIR is the Rosetta Stone for speaking between those two groups.

Necessary tools

It gives you the tools necessary to speak in the insurance and risk terms that the business appreciates. And it gives you the ability to be as technical and, if you will, as nerdy as you need to be in order to talk to IT security and the other IT functions, to make sure everybody is on the same page and everyone feels their concerns are represented in the risk-assessment functions that are happening.

Gardner: Jack Jones, can you add to that?

Jones: I agree with what Jack said wholeheartedly. I would add, though, that integration or adoption of something like this is a lot easier the higher up in the organization you go.

CFOs, traditionally, are the ones whose necks are most clearly on the line for risk-related issues within most organizations. At least in my experience, if you get their ear on this and present the information security data analyses to them, they jump on board, they drive it through the organization, and it’s just brain-dead easy.

If you try to drive it up through the ranks – maybe you get an enthusiastic supporter in the information security organization, especially if it’s below the CISO level, and they try a grassroots sort of effort to bring it in – it’s a tougher thing. It can still work. I’ve seen it work very well, but it’s a longer row to hoe.

Gardner: There has been a lot of research, studies, and surveys on data breaches. What are some of the best sources, or maybe not-so-good sources, for actually measuring this? How do you know if you’re doing it right? How do you know if you’re moving from yellow to green, instead of to red? To you, Jack Freund.

Freund: There are a couple of things in that question. The first is that there’s this inherent assumption in a lot of organizations that we need to move from yellow to green, and that may not be the case. So becoming very knowledgeable about the risk posture and the risk tolerance of the organization is key.

That’s part of the official mindset of IT security. When you graduate an information security person today, they are minted knowing that there are a lot of bad things out there, and their goal in life is to reduce them. But that may not be the case. The case may very well be that things are okay now, but we have bigger fish to fry over here that we’re going to focus on. So, that’s one thing.

The second thing, and it’s a very good question, is how do we know that we’re getting better? How do we trend that over time? Overall, measuring that value for the organization has to show a reduction of risk, or at least a reduction of risk to within the risk-tolerance levels of the organization.

Calculating and understanding that requires something that I always phrase as becoming comfortable with uncertainty. When you are talking about risk in general, you’re talking about forward-looking statements about things that may or may not happen. So becoming comfortable with the fact that they may or may not happen means that when you measure them today, you have to be willing to be a little bit squishy in how you’re representing that.

In FAIR and in other academic works, they talk about using ranges to do that. So, things like high, medium, and low, could be represented in terms of a minimum, maximum, and most likely. And that tends to be very, very effective. People can respond to that fairly well.
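
As a rough illustration of that range-based approach, the sketch below draws loss event frequency and loss magnitude from triangular (minimum, most likely, maximum) distributions and derives a distribution of annualized loss. The numbers are invented for illustration, and this is a simplification of FAIR’s factor taxonomy rather than the standard itself.

```python
# A minimal sketch of range-based risk estimation with purely
# illustrative numbers; not FAIR's full decomposition.
import random

EVENTS_PER_YEAR = (0.1, 0.5, 2.0)              # (min, most likely, max) frequency
LOSS_PER_EVENT = (50_000, 200_000, 1_000_000)  # (min, most likely, max) loss, $

def simulate_annual_loss():
    f_low, f_mode, f_high = EVENTS_PER_YEAR
    m_low, m_mode, m_high = LOSS_PER_EVENT
    frequency = random.triangular(f_low, f_high, f_mode)
    magnitude = random.triangular(m_low, m_high, m_mode)
    return frequency * magnitude

losses = sorted(simulate_annual_loss() for _ in range(10_000))
print(f"median annualized loss exposure: ${losses[5_000]:,.0f}")
print(f"95th percentile: ${losses[9_500]:,.0f}")
```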

Gathering data

Jones: With regard to the data sources, there are a lot of people out there doing these sorts of studies, gathering data. The problem that’s hamstringing that effort is the lack of a common set of definitions, nomenclature, and even taxonomy around the problem itself.

You will have one study that has defined threat, vulnerability, or whatever differently from some other study, and so the data can’t be normalized. It really harms the utility of it. I see data out there and I think, “That looks like it could be really useful.” But I hesitate to use it because I don’t understand it: they don’t publish their definitions, their approach, and how they went after it.

There’s just so much superficial thinking in the profession on this that, once we dig under the covers, too often I run into stuff that just can’t be defended. It doesn’t make sense, and therefore the data can’t be used. It’s an unfortunate situation.

I do think we’re heading in a positive direction. FAIR can provide a normalizing structure for that sort of thing. The VERIS framework, which, by the way, is also derived in part from FAIR, has gained real traction in terms of the quality of the research they have done and the data they’re generating. We’re headed in the right direction, but we’ve got a long way to go.

Gardner: Jim Hietala, we’re seemingly looking at this on a company-by-company basis. But is there a vertical-industry slice or industry-wide slice where we could look at what’s happening to everyone and put some standard understanding or measurement around what’s going on in the overall market, maybe by region, maybe by country?

Hietala: There are some industry-specific initiatives, and what’s really needed, as Jack Jones mentioned, are common definitions for things like breach, exposure, and loss, so that the data from one organization can be used in another, and so forth. I think about the financial services industry. I know that there is some information sharing through an organization called the FS-ISAC about what’s happening to financial services organizations in terms of attacks, loss, and those sorts of things.

There’s an opportunity for that on a vertical-by-vertical basis. But, like Jack said, there is a long way to go. In some industries, healthcare for instance, you are so far from that, it’s ridiculous. In the US here, the HIPAA security rule says you must do a risk assessment. So hospitals do an annual risk assessment, stick the binder on the shelf, and don’t think much about information security in between. That’s a generalization, but various industries are at different places on a continuum of maturity in their risk management approaches.

Gardner: As we get better with having a common understanding of the terms and the measurements and we share more data, let’s go back to this notion of how to communicate this effectively to those people that can use it and exercise change management as a result. That could be the CFO, the CEO, what have you, depending on the organization.

Do you have any examples? Can we look to an organization that’s done this right, examine their practices, the way they’ve communicated it, and some of the tools they’ve used, and say, “Aha, they’re headed in the right direction; maybe we could follow a little bit”? Let’s start with you, Jack Freund.

Freund: I have worked and consulted for various organizations that have done risk management at different levels. The ones that have embraced FAIR tend to be the ones that overall feel that risk is an integral part of their business strategy. And I can give a couple of examples of scenarios that have played out that I think have been successful in the way they have been communicated.

Coming to terms

The key thing to keep in mind is that, as a security professional, you’re trained to feel like you need results. But the results for the IT risk management professional are different. The results are: “I’ve communicated this effectively, so I am done.” And then whatever the results are, those are the results that needed to be. That’s a really hard thing to come to terms with.

I’ve been involved in large-scale efforts to assess risk for a Cloud venture. We needed to move virtually every confidential record that we have to the Cloud in order to be competitive with the rest of our industry. If our competitors are finding ways to utilize the Cloud before us, we can lose out. So, we need to find a way to do that, and to be secure and compliant with all the laws and regulations and such.

Through that scenario, one of the things that came out was that key ownership became really, really important. We had the opportunity to look at the various control structures, and we analyzed them using FAIR. What we ended up with was sort of a long-tail risk. Most people will probably do their job right over a long enough period of time. But over that same long period of time, the odds of somebody making a mistake that’s not in your favor are likely, though not significant enough that you can’t make the move.

But, the problem became that the loss side, the side that typically gets ignored with traditional risk-assessment methodologies, was so significant that the organization needed to make some judgment around that, and they needed to have a sense of what we needed to do in order to minimize that.

That became a big point of discussion for us, and it drove the conversation away from “bad things could happen.” We didn’t bury the lead. The lead was that this is the most important thing to this organization in this particular scenario.

So, let’s talk about things we can do. Are we comfortable with it? Do we need to make any sort of changes? What are some control opportunities? How much do they cost? This is a significantly more productive conversation than just, “Here is a bunch of bad things that happen. I’m going to cross my arms and say no.”

Gardner: Jack Jones, examples at work?

Jones: In an organization that I’ve been working with recently, their board of directors said they wanted a quantitative view of information security risk. They just weren’t happy with the red, yellow, green. So, they came to us, and there were really two things that drove them there. One was that they were looking at cyber insurance. They wanted to know how much cyber insurance they should take out, and how do you figure that out when you’ve got a red, yellow, green scale?

They were able to do a series of analyses on a population of the scenarios that they thought were relevant in their world, get an aggregate view of their annualized loss exposure, and make a better informed decision about that particular problem.

Gardner: I’m curious how prevalent cyber insurance is, and whether it is going to be a leveling effect in the industry, where people speak a common language – the equivalent of actuarial tables, but for security in the enterprise and cyber security?

Jones: One would dream and hope, but at this point, what I’ve seen out there, in terms of the basis on which insurance companies are setting their premiums, is essentially the same old “risk assessment” stuff that the industry has been doing poorly for years. It’s not based on data or any real analysis per se, at least in what I’ve run into. What they do is set their premiums high to buffer themselves, and typically cover as few things as possible. How much value that provides the customers becomes a problem.

Looking to the future

Gardner: We’re coming up on our time limit. So let’s quickly look to the future. Is there such a thing as risk management as a service? Can we outsource this? Is there a way in which moving more of IT into Cloud or hybrid models would mitigate risk, because the Cloud provider would standardize? Then many players in that environment, those who were buying those services, would be under that same umbrella. Let’s start with you, Jim Hietala. What’s the future of this, and what do the Cloud trends bring to the table?

Hietala: I’d start with a maxim that comes out of the financial services industry, which is that you can outsource the function, but you still own the risk. That’s an unfortunate reality. You can throw things out into the Cloud, but it doesn’t absolve you from understanding your risk and then doing things to manage it or transfer it, if there’s insurance or whatever the case may be.

That’s just a reality. Organizations in the risky world we live in are going to have to get more serious about doing effective risk analysis. From The Open Group standpoint, we see this as an opportunity area.

As I mentioned, we’ve standardized the taxonomy piece of FAIR. And we really see an opportunity around the profession going forward to help the risk-analysis community by further standardizing FAIR and launching a certification program for a FAIR-certified risk analyst. That’s in demand from large organizations that are looking for evidence that people understand how to apply FAIR and use it in doing risk analyses.

Gardner: Jack Freund, looking into your crystal ball, how do you see this discipline evolving?

Freund: I always try to consider things as they exist within other systems. Risk is a system of systems. There are a series of pressures that are applied, and a series of levers that are thrown in order to release that sort of pressure.

Risk will always be owned by the organization that is offering that service. If we decide at some point that we can move to the Cloud and all these other things, we need to look to the legal system: the series of pressures it is going to apply, who is going to own that risk, and how that plays itself out.

If we look to the Europeans and the way that they’re managing risk and compliance, they’re still as strict as we in the United States think they may be about things, but there’s still a lot of leeway in the way that many laws are written. You’re still being asked to do things that are reasonable. You’re still being asked to do things that are standard for your industry. But we’d still like the ability to know what that is, and I don’t think that’s going to go away anytime soon.

Judgment calls

We’re still going to have to make judgment calls. We’re still going to have to do 100 things with a budget for 10 things. Whenever that happens, you have to make a judgment call. What’s the most important thing that I care about? And that’s why risk management exists, because there’s a certain series of things that we have to deal with. We don’t have the resources to do them all, and I don’t think that’s going to change over time. Regardless of whether the landscape changes, that’s the one that remains true.

Gardner: The last word to you, Jack Jones. It sounds as if we’re continuing down the path of being mostly reactive. Is there anything you can see on the horizon that would perhaps tip the scales, so that the risk management and analysis practitioners can really become proactive and head things off before they become a big problem?

Jones: If we were to take a snapshot at any given point in time of an organization’s loss exposure, how much risk they have right then, that’s a lagging indicator of the decisions they’ve made in the past, and their ability to execute against those decisions.

We can do some great root-cause analysis around that and ask how we got there. But we can also turn that coin around and ask how good we are at making well-informed decisions and then executing against them, asking what that implies from a risk perspective downstream.

If we understand the relationship between our current state and past and future states, and we have those linkages defined, especially if we have an analytic framework underneath it, we can do some marvelous what-if analysis.

What if this variable changed in our landscape? Let’s run a few thousand Monte Carlo simulations against that and see what comes up. What does that look like? Well, then let’s change this other variable and then see which combination of dials, when we turn them, make us most robust to change in our landscape.
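
A minimal sketch of that kind of what-if analysis, reusing the illustrative ranges from the earlier snippet: turn one hypothetical dial (here, a control that reduces loss event frequency), rerun the simulation, and compare the tails of the resulting loss distributions.

```python
# Hedged what-if sketch: vary one dial and rerun the Monte Carlo runs.
import random

def annual_loss(frequency_reduction=0.0):
    # Same illustrative ranges as before; the dial is a hypothetical
    # control that cuts loss event frequency by the given fraction.
    frequency = random.triangular(0.1, 2.0, 0.5) * (1 - frequency_reduction)
    magnitude = random.triangular(50_000, 1_000_000, 200_000)
    return frequency * magnitude

for reduction in (0.0, 0.25, 0.5):
    runs = sorted(annual_loss(reduction) for _ in range(5_000))
    p95 = runs[int(len(runs) * 0.95)]
    print(f"frequency cut by {reduction:.0%}: 95th-percentile annual loss ${p95:,.0f}")
```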

But again, we can’t begin to get there until we have this foundational set of definitions, frameworks, and such to do that sort of analysis. That’s what we’re doing with FAIR; without some sort of framework like that, there’s no way you can get there.

Gardner: I am afraid we’ll have to leave it there. We’ve been talking with a panel of experts on how new trends and solutions are emerging in the area of risk management and analysis. And we’ve seen how new tools for communication and using Big Data to understand risks are also being brought to the table.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in Newport Beach, California. I’d like to thank our panel: Jack Freund, PhD, Information Security Risk Assessment Manager at TIAA-CREF. Thanks so much Jack.

Freund: Thank you, Dana.

Gardner: We’ve also been speaking with Jack Jones, Principal at CXOWARE.

Jones: Thank you. Thank you, pleasure to be here.

Gardner: And last, Jim Hietala, the Vice President for Security at The Open Group. Thanks.

Hietala: Thanks, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions; your host and moderator through these thought leadership interviews. Thanks again for listening and come back next time.


Protecting Data is Good. Protecting Information Generated from Big Data is Priceless

By E.G. Nadhan, HP

This was the key message that came out of The Open Group® Big Data Security Tweet Jam on Jan 22 at 9:00 a.m. PT, which addressed several key questions centered on Big Data and security. Here is my summary of the observations made in the context of these questions.

Q1. What is Big Data security? Is it different from data security?

Big Data security is more about information security. It is typically external to the corporate perimeter, and IT is not prepared today to adequately monitor its sheer volume, measured in brontobytes. The long time periods involved in storing it could violate compliance mandates. Note that storing Big Data in the Cloud changes the game, with increased risks of leaks, loss, and breaches.

Information resulting from the analysis of the data is even more sensitive, and therefore higher risk – especially when it is Personally Identifiable Information on an Internet of devices, requiring a balance between utility and privacy.

At the end of the day, it is all about governance or as they say, “It’s the data, stupid! Govern it.”

Q2. Any thoughts about security systems as producers of Big Data, e.g., voluminous systems logs?

Data gathered from information security logs is valuable, but the rules for protecting it are the same. Security logs will be a good source for detecting patterns of customer usage.

Q3. Most Big Data stacks have no built-in security. What does this mean for securing Big Data?

There is an added level of complexity, because it goes across apps and networks, plus all endpoints. Having standards to establish identity, metadata, and trust would go a long way. The quality of the data could also be a security issue: has it been tampered with, are you being gamed, etc.? Note that enterprises have varying needs for security around their business data.

Q4. How is the industry dealing with the social and ethical uses of consumer data gathered via Big Data?

Big Data is still nascent, and ground rules for handling the information are yet to be established. Privacy issues will be key when companies market to consumers. Organizations are seeking forgiveness rather than permission. Regulatory bodies are getting involved due to consumer pressure. Abuse of power from access to Big Data is likely to trigger more incentives to attack or embarrass. Note that ‘abuse’ to some is just business to others.

Q5. What lessons from basic data security and cloud security can be implemented in Big Data security?

Security testing is even more vital for Big Data. Limit access to specific devices, not just user credentials. Don’t assume security via obscurity for the sensors producing Big Data inputs – they will be targets.

Q6. What are some best practices for securing Big Data? What are orgs doing now and what will organizations be doing 2-3 years from now?

Current best practices include:

  • Treat Big Data as your most valuable asset
  • Encrypt everything by default, with proper key management, enforcement of policies, and tokenized logs (see the sketch after this list)
  • Ask your Cloud and Big Data providers the right questions – ultimately, YOU are responsible for security
  • Assume data needs verification and cleanup before it is used for decisions, if you are unable to establish trust with the data source
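
The tokenized-logs practice lends itself to a short illustration. Below is a minimal sketch assuming an in-memory token vault; a production system would use a hardened vault and proper key management. The point is simply that sensitive values are swapped for opaque tokens before they ever reach the log stream.

```python
# A minimal tokenization sketch; the in-memory vault is an
# illustrative stand-in for a hardened, access-controlled store.
import secrets

_vault = {}  # token -> original value, kept out of the log stream

def tokenize(value: str) -> str:
    token = "tok_" + secrets.token_hex(8)
    _vault[token] = value
    return token

def log_line(user_email: str, action: str) -> str:
    # Only the opaque token ever appears in log output.
    return f"user={tokenize(user_email)} action={action}"

print(log_line("alice@example.com", "ran_bigdata_query"))
```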

Future best practices:

  • Enterprises treat information like data today; in the future, they will respect it as their most valuable asset
  • CIOs will eventually become the Chief Officer for Information

Q7. We’re nearing the end of today’s tweet jam. Any last thoughts on Big Data security?

Adrian Lane, who participated in the tweet jam, will be keynoting at The Open Group Conference in Newport Beach next week, and he wrote a good best-practices paper on securing Big Data.

I have been part of multiple tweet chats specific to security, as well as one on Information Optimization. Recently, I also conducted the first Open Group Web Jam, internal to The Cloud Work Group. What I liked about this Big Data Security Tweet Jam is that it brought two key domains together, highlighting the intersection points. There were great contributions from subject matter experts, forcing participants to think about one domain in the context of the other.

In a way, this post is actually synthesizing valuable information from raw data in the tweet messages – and therefore needs to be secured!

What are your thoughts on the observations made in this tweet jam? What measures are you taking to secure Big Data in your enterprise?

I really enjoyed this tweet jam and would strongly encourage you to actively participate in upcoming tweet jams hosted by The Open Group.  You get to interact with a wide spectrum of knowledgeable practitioners listed in this summary post.

HP Distinguished Technologist and Cloud Advisor, E.G. Nadhan has more than 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project, and is also the founding co-chair for The Open Group Cloud Computing Governance project. Connect with Nadhan on: Twitter, Facebook, LinkedIn and Journey Blog.

 


#ogChat Summary – Big Data and Security

By Patty Donovan, The Open Group

The Open Group hosted a tweet jam (#ogChat) to discuss Big Data security. In case you missed the conversation, here is a recap of the event.

The Participants

A total of 18 participants joined in the hour-long discussion.

Q1 What is #BigData #security? Is it different from #data security? #ogChat

Participants seemed to agree that while Big Data security is similar to data security, it is more extensive. Two major factors to consider: sensitivity and scalability.

  • @dustinkirkland At the core it’s the same – sensitive data – but the difference is in the size and the length of time this data is being stored. #ogChat
  • @jim_hietala Q1: Applying traditional security controls to BigData environments, which are not just very large info stores #ogChat
  • @TheTonyBradley Q1. The value of analyzing #BigData is tied directly to the sensitivity and relevance of that data–making it higher risk. #ogChat
  • @AdrianLane Q1 Securing #BigData is different. Issues of velocity, scale, elasticity break many existing security products. #ogChat
  • @editingwhiz #Bigdata security is standard information security, only more so. Meaning sampling replaced by complete data sets. #ogchat
  • @Dana_Gardner Q1 Not only is the data sensitive, the analysis from the data is sensitive. Secret. On the QT. Hush, hush. #BigData #data #security #ogChat
    • @Technodad @Dana_Gardner A key point. Much #bigdata will be public – the business value is in cleanup & analysis. Focus on protecting that. #ogChat

Q2 Any thoughts about #security systems as producers of #BigData, e.g., voluminous systems logs? #ogChat

Most agreed that security systems should be setting an example for producing secure Big Data environments.
  • @dustinkirkland Q2. They should be setting the example. If the data is deemed important or sensitive, then it should be secured and encrypted. #ogChat
  • @TheTonyBradley Q2. Data is data. Data gathered from information security logs is valuable #BigData, but rules for protecting it are the same. #ogChat
  • @elinormills Q2 SIEM is going to be big. will drive spending. #ogchat #bigdata #security
  • @jim_hietala Q2: Well instrumented IT environments generate lots of data, and SIEM/audit tools will have to be managers of this #BigData #ogchat
  • @dustinkirkland @theopengroup Ideally #bigdata platforms will support #tokenization natively, or else appdevs will have to write it into apps #ogChat

Q3 Most #BigData stacks have no built in #security. What does this mean for securing #BigData? #ogChat

The lack of built-in security makes Big Data a target. While not all enterprise data is sensitive, housing it insecurely runs the risk of compromise. Furthermore, security solutions need to be not only effective but also scalable, as data will continue to get bigger.

  • @elinormills #ogchat big data is one big hacker target #bigdata #security
    • @editingwhiz @elinormills #bigdata may be a huge hacker target, but will hackers be able to process the chaff out of it? THAT takes $$$ #ogchat
    • @elinormills @editingwhiz hackers are innovation leaders #ogchat
    • @editingwhiz @elinormills Yes, hackers are innovation leaders — in security, but not necessarily dataset processing. #eweeknews #ogchat
  • @jim_hietala Q3:There will be a strong market for 3rd party security tools for #BigData – existing security technologies can’t scale #ogchat
  • @TheTonyBradley Q3. When you take sensitive info and store it–particularly in the cloud–you run the risk of exposure or compromise. #ogChat
  • @editingwhiz Not all enterprises have sensitive business data they need to protect with their lives. We’re talking non-regulated, of course. #ogchat
  • @TheTonyBradley Q3. #BigData is sensitive enough. The distilled information from analyzing it is more sensitive. Solutions need to be effective. #ogChat
  • @AdrianLane Q3 It means identifying security products that don’t break big data – i.e. they scale or leverage #BigData #ogChat
    • @dustinkirkland @AdrianLane #ogChat Agreed, this is where certifications and partnerships between the 3rd party and #bigdata vendor are essential.

Q4 How is the industry dealing with the social and ethical uses of consumer data gathered via #BigData? #ogChat #privacy

Participants agreed that the industry needs to improve when it comes to dealing with the social and ethical uses of consumer data gathered through Big Data. If the data is easily accessible, hackers will be attracted. No matter what, the cost of a breach is far greater than that of any preventative solution.

  • @dustinkirkland Q4. #ogChat Sadly, not well enough. The recent Instagram uproar was well publicized but such abuse of social media rights happens every day.
    • @TheTonyBradley @dustinkirkland True. But, they’ll buy the startups, and take it to market. Fortune 500 companies don’t like to play with newbies. #ogChat
    • @editingwhiz Disagree with this: Fortune 500s don’t like to play with newbies. We’re seeing that if the IT works, name recognition irrelevant. #ogchat
    • @elinormills @editingwhiz @thetonybradley ‘hacker’ covers lot of ground, so i would say depends on context. some of my best friends are hackers #ogchat
    • @Technodad @elinormills A core point- data from sensors will drive #bigdata as much as enterprise data. Big security, quality issues there. #ogChat
  • @Dana_Gardner Q4 If privacy is a big issue, hacktivism may crop up. Power of #BigData can also make it socially onerous. #data #security #ogChat
  • @dustinkirkland Q4. The cost of a breach is far greater than the cost (monetary or reputation) of any security solution. Don’t risk it. #ogChat

Q5 What lessons from basic #datasecurity and #cloud #security can be implemented in #BigData security? #ogChat

The principles are the same, just on a larger scale. The biggest risks come from cutting corners due to the size and complexity of the data gathered. As hackers (like Anonymous) get better, security must keep pace, regardless of the data size.

  • @TheTonyBradley Q5. Again, data is data. The best practices for securing and protecting it stay the same–just on a more massive #BigData scale. #ogChat
  • @Dana_Gardner Q5 Remember, this is in many ways unchartered territory so expect the unexpected. Count on it. #BigData #data #security #ogChat
  • @NadhanAtHP A5 @theopengroup – Security Testing is even more vital when it comes to #BigData and Information #ogChat
  • @TheTonyBradley Q5. Anonymous has proven time and again that most existing data security is trivial. Need better protection for #BigData. #ogChat

Q6 What are some best practices for securing #BigData? What are orgs doing now, and what will orgs be doing 2-3 years from now? #ogChat

While some argued that encrypting everything is the key, and others encouraged putting pressure on Big Data providers, most agreed that a multi-layered security infrastructure is necessary. It is not just the data that needs to be secured, but also the transportation and analysis processes around it. A short sketch of the "encrypt and tokenize by default" advice appears after the tweets below.

  • @dustinkirkland Q6. #ogChat Encrypting everything, by default, at least at the fs layer. Proper key management. Policies. Logs. Hopefully tokenized too.
  • @dustinkirkland Q6. #ogChat Ask tough questions of your #cloud or #bigdata provider. Know what they are responsible for and who has access to keys. #ogChat
    • @elinormills Agreed–> @dustinkirkland Q6. #ogChat Ask tough questions of your #cloud or #bigdataprovider. Know what they are responsible for …
  • @Dana_Gardner Q6 Treat most #BigData as a crown jewel, see it as among most valuable assets. Apply commensurate security. #data #security #ogChat
  • @elinormills Q6 govt level crypto minimum, plus protect all endpts #ogchat #bigdata #security
  • @TheTonyBradley Q6. Multi-faceted issue. Must protect raw #BigData, plus processing, analyzing, transporting, and resulting distilled analysis. #ogChat
  • @Technodad If you don’t establish trust with data source, you need to assume data needs verification, cleanup before it is used for decisions. #ogChat
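Here is that sketch: a minimal Python illustration of tokenizing sensitive fields before records reach the cluster. It assumes the open-source cryptography package (Fernet) for encryption; the token vault and record layout are hypothetical stand-ins for a real key-management and tokenization service, not any participant's actual setup.

    # Hedged sketch: tokenize sensitive fields and keep only encrypted values
    # in a vault, so the analytics cluster never stores raw identifiers.
    import secrets
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice, fetched from a key-management service
    fernet = Fernet(key)
    token_vault = {}              # stand-in for a secured token-to-value store

    def tokenize(value: str) -> str:
        """Replace a sensitive value with a random token; the vault keeps it encrypted."""
        token = secrets.token_hex(16)
        token_vault[token] = fernet.encrypt(value.encode())
        return token

    def prepare_record(record: dict, sensitive: set) -> dict:
        """Tokenize the sensitive fields of a record before loading it into the store."""
        return {k: tokenize(v) if k in sensitive else v for k, v in record.items()}

    safe = prepare_record({"user": "alice@example.com", "clicks": 42}, {"user"})
    print(safe)   # {'user': '9f3c…', 'clicks': 42}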

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Filed under Tweet Jam

Questions for the Upcoming Big Data Security Tweet Jam on Jan. 22

By Patty Donovan, The Open Group

Last week, we announced our upcoming tweet jam on Tuesday, January 22 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. GMT, which will examine the impact of Big Data on security and how it will change the security landscape.

Please join us next Tuesday, January 22! The discussion will be moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct. We welcome Open Group members and interested participants from all backgrounds to join the session. Our panel of experts will include:

  • Elinor Mills, former CNET reporter and current director of content and media strategy at Bateman Group (@elinormills)
  • Jaikumar Vijayan, Computerworld (@jaivijayan)
  • Chris Preimesberger, eWEEK (@editingwhiz)
  • Tony Bradley, PC World (@TheTonyBradley)
  • Michael Santarcangelo, Security Catalyst Blog (@catalyst)

The discussion will be guided by these six questions:

  1. What is #BigData security? Is it different from #data #security? #ogChat
  2. Any thoughts about #security systems as producers of #BigData, e.g., voluminous systems logs? #ogChat
  3. Most #BigData stacks have no built in #security. What does this mean for securing #BigData? #ogChat
  4. How is the industry dealing with the social and ethical uses of consumer data gathered via #BigData? #ogChat #privacy
  5. What lessons from basic data security and #cloud #security can be implemented in #BigData #security? #ogChat
  6. What are some best practices for securing #BigData? #ogChat

To join the discussion, please follow the #ogChat hashtag during the allotted discussion time. Other hashtags we recommend you use during the event include:

  • Information Security: #InfoSec
  • Security: #security
  • BYOD: #BYOD
  • Big Data: #BigData
  • Privacy: #privacy
  • Mobile: #mobile
  • Compliance: #compliance

For more information about the tweet jam, guidelines and general background information, please visit our previous blog post: http://blog.opengroup.org/2013/01/15/big-data-security-tweet-jam/

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com), or leave a comment below. We anticipate a lively chat and hope you will be able to join us!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.



Filed under Tweet Jam

Big Data Security Tweet Jam

By Patty Donovan, The Open Group

On Tuesday, January 22, The Open Group will host a tweet jam examining the topic of Big Data and its impact on the security landscape.

Recently, Big Data has been dominating the headlines, with coverage of everything from how to manage and process it to how it will affect your organization’s IT roadmap. As 2012 came to a close, analyst firm Gartner predicted that data will help drive IT spending to $3.8 trillion in 2014. Knowing the phenomenon is here to stay, enterprises face the new and daunting challenge of securing Big Data. That challenge raises further questions: Is Big Data security different from data security? How will enterprises handle Big Data security? What is the best approach to Big Data security?

It remains to be seen whether Big Data will revolutionize enterprise security, but it will certainly change how security is executed – if it hasn’t already. Please join us for our upcoming Big Data Security tweet jam, where leading security experts will discuss the merits of Big Data security.

Please join us on Tuesday, January 22 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. GMT for a tweet jam, moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct, that will discuss and debate the issues around Big Data security. Key areas to be addressed include data security, privacy, compliance, security ethics and, of course, Big Data. We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of IT security experts, analysts and thought leaders, led by Jim Hietala (@jim_hietala) and Dave Lounsbury (@Technodad) of The Open Group. To access the discussion, please follow the #ogChat hashtag during the allotted discussion time.

And for those of you who are unfamiliar with tweet jams, here is some background information:

What Is a Tweet Jam?

A tweet jam is a one-hour “discussion” hosted on Twitter. The purpose of this tweet jam is to share knowledge and answer questions on Big Data security. Each tweet jam is led by a moderator and a dedicated group of experts who keep the discussion flowing. The public (or anyone on Twitter with an interest in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat hashtag.
    • Sample: “Q1 enterprises will have to make significant adjustments moving forward to secure Big Data environments #ogChat”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com). We anticipate a lively chat and hope you will be able to join!


Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Filed under Tweet Jam

Data Governance: A Fundamental Aspect of IT

By E.G. Nadhan, HP

In an earlier post, I explained how you can build upon SOA governance to realize Cloud governance. But underlying both paradigms is a fundamental aspect that we have been dealing with since the dawn of IT: the data itself.

In fact, IT used to be referred to as “data processing.” Despite the continuing evolution of IT through various platforms, technologies, architectures and tools, at the end of the day IT is still processing data. That data, however, has taken multiple shapes and forms, both structured and unstructured, and Cloud Computing has opened up new opportunities to process and store it. There has been a need for data governance since the day data processing was born; today, it has taken on a whole new dimension.

“It’s the economy, stupid” was a campaign slogan coined to win a critical election in the United States in 1992. Today, the campaign slogan for governance in the land of IT should be, “It’s the data, stupid!”

Let us challenge ourselves with a few questions. Consider them the what, why, when, where, who and how of data governance.

What is data governance? It is the mechanism by which we ensure that the right corporate data is available to the right people, at the right time, in the right format, with the right context, through the right channels.
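As a minimal illustration of that definition, the sketch below gates access by role, data domain and time of day – the right data, to the right people, at the right time. The roles, domains and policy table are hypothetical, invented purely for illustration.

    from datetime import datetime, time
    from typing import Optional

    # Hypothetical policy table: (role, data domain) -> permitted time window.
    POLICIES = {
        ("analyst", "sales_orders"): (time(8, 0), time(18, 0)),
        ("steward", "sales_orders"): (time(0, 0), time(23, 59)),
    }

    def may_access(role: str, domain: str, at: Optional[datetime] = None) -> bool:
        """Grant access only if a policy exists and the request falls in its window."""
        window = POLICIES.get((role, domain))
        if window is None:
            return False              # deny by default: no policy, no access
        at = at or datetime.now()
        start, end = window
        return start <= at.time() <= end

    print(may_access("analyst", "sales_orders"))  # True during business hours
    print(may_access("intern", "sales_orders"))   # False: no policy entry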

Why is data governance needed? The Cloud, social networking and user-owned devices (BYOD) have acted as catalysts, triggering unprecedented data growth in recent years. We need to understand and control the data we are dealing with in order to process it effectively and securely.

When should data governance be exercised? Well, when shouldn’t it be? Data governance kicks in at the source, where the data enters the enterprise. It continues across the information lifecycle, as data is processed and consumed to address business needs. And it is also essential when data is archived and/or purged.

Where does data governance apply? It applies to all business units and across all processes. Data governance has a critical role to play at the point of storage – the final checkpoint before data is accepted as the “golden” record in a database (a sketch of such a checkpoint follows the list below). Data governance also applies across all layers of the architecture:

  • Presentation layer where the data enters the enterprise
  • Business logic layer where the business rules are applied to the data
  • Integration layer where data is routed
  • Storage layer where data finds its home
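Here is the checkpoint sketch promised above: a hedged illustration of validating a record before it is accepted as “golden.” The field names and rules are hypothetical examples, not taken from any particular system.

    # Hypothetical field rules for the "golden" record checkpoint.
    GOLDEN_RULES = {
        "customer_id": lambda v: isinstance(v, str) and len(v) == 10,
        "country":     lambda v: v in {"US", "GB", "DE"},   # illustrative list
    }

    def checkpoint(record: dict) -> list:
        """Return rule violations; an empty list means the record may be stored."""
        problems = []
        for field, rule in GOLDEN_RULES.items():
            if field not in record:
                problems.append("missing field: " + field)
            elif not rule(record[field]):
                problems.append("invalid value for %s: %r" % (field, record[field]))
        return problems

    print(checkpoint({"customer_id": "C000012345", "country": "US"}))  # []
    print(checkpoint({"country": "FR"}))  # reports both violations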

Who does data governance apply to? It applies to all business leaders, consumers, generators and administrators of data. It is a good idea to identify stewards who own key data domains. Stewards must ensure that their data domains abide by the enterprise architecture principles, and they should continuously analyze the impact of business events on their domains.

How is data governance applied? Data governance must be exercised at the enterprise level, with federated governance to individual business units and data domains. It should be applied proactively whenever a new process, application, repository or interface is introduced, because existing data is likely to be affected. In the absence of effective data governance, data is likely to be duplicated, either by chance or by choice.
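One lightweight way to spot such duplication – sketched below under the assumption that records can be serialized to JSON – is to fingerprint each record with a canonical hash and compare fingerprints across repositories. The repository names and records here are hypothetical.

    import hashlib
    import json

    def fingerprint(record: dict) -> str:
        """Hash a record with sorted keys so field order cannot hide a duplicate."""
        canonical = json.dumps(record, sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()

    seen = {}
    for repo, record in [
        ("crm",     {"name": "Acme", "city": "Austin"}),
        ("billing", {"city": "Austin", "name": "Acme"}),  # same data, reordered
    ]:
        fp = fingerprint(record)
        if fp in seen:
            print("duplicate: %s repeats data already held in %s" % (repo, seen[fp]))
        else:
            seen[fp] = repo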

In our data universe, “informationalization” yields valuable intelligence that enables effective decision-making and analysis. However, even having the best people, process and technology is not going to yield the desired outcomes if the underlying data is suspect.

How about you? How is the data in your enterprise? What governance measures do you have in place? I would like to know.

A version of this blog post was originally published on HP’s Journey through Enterprise IT Services blog.

HP Distinguished Technologist and Cloud Advisor E.G. Nadhan has more than 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and the founding co-chair for The Open Group Cloud Computing Governance project. Connect with Nadhan on: Twitter, Facebook, LinkedIn and Journey Blog.


Filed under Cloud, Cloud/SOA