As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we’re here now to learn more about how to leverage Platform 3.0 as more than an IT shift, and as a business game-changer. It will be a big topic at next week’s conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes, or technical changes, going on that are bringing together a lot of factors. They’re turning into this sort of super-saturated solution of ideas and possibilities, and this emerging idea that this represents a new platform. I think it’s a pretty fundamental change.

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and presenting the information in a way such that people can make decisions on it. Given all that we’re starting to realize, we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in trying to bring Platform 3.0 together is to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt and evolve, and what business processes they need to put in place, in order to take maximum advantage of this change in the way that we look at information.

Harding: Enterprises have to keep up with the way that things are moving in order to keep their positions in their industries. Enterprises can’t afford to be working with yesterday’s technology. It’s a case of being able to understand the information that they’re presented, and make the best decisions.

We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale, the whole three Vs of volume, velocity, and variety, on its own could perhaps be a game-changing shift in the market. The drive of mobile devices into the lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

It has to do with not just one solution, not one subscription model, because we’re now into this subscription-model era … the subscription economy, as one group tends to describe it. Now, we’re looking not just at providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 is addressing this point by bringing this together. Just look at the numbers. Look at the scale that we’re dealing with: 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion mobile subscriptions, according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We’ve had massive growth in the scale of mobile data traffic and internet data expansion. Mobile data traffic is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data-center traffic, combining network and internet-based storage, will reach 6.6 zettabytes annually, and nearly two-thirds of this will be cloud-based by 2016. This is only going to grow as social networking reaches nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a 2.55 billion global audience, by 2017, another extraordinary figure, from an eMarketing.com study.

It is not surprising that many industry analysts are seeing growth in the converging technologies of mobility, social computing, big data, and cloud at 30 to 40 percent, and that the shift to B2C commerce, which passed $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases emerging in the open-source world, but there are also the analytics and the data-stewardship aspects of it.

We need to bring in mechanisms, so the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of the enterprise IT or where end users will drive the selection of what they’re going to do with analytic tools and recommendation tools, to take the data and turn it into information. One of the things you can’t do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally — how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points. The shape of data has changed, but there’s also the question of why IT should get involved. We’re seeing a shift in the constituency of who is using this data.

We have the Chief Marketing Officer and the Chief Procurement Officer and other key line of business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We’ve got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be sorted into new types of mobile devices, new types of data intelligence, and ways of delivering this kind of service.

I read recently in MIT Sloan Management Review an article that asked what the role of the CIO is. There is still the critical role of managing the security, compliance, and performance of these systems. But there’s also a socialization of IT, and this is where positioning architectures that are cross-platform is key to delivering real value to the business users in the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it’s not only that a whole set of different people are getting involved with different kinds of information, but there’s also a step change in the speed with which all this is delivered. It’s no longer the case that you can say, “Oh well, we need some kind of information system to manage this information. We’ll procure it and get a program written,” and a year later it will be in place delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It’s really a case of having the platform, both the standard technology platform and the systems for using it, the business processes, understood and in place.

Then, you can do all these things quickly and build on what people have learned in the past, and not go out into all sorts of new experimental things that might not lead anywhere. It’s a case of building up the standard platform on industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practice and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. First is platform scalability: understanding what you can do with your current legacy systems when introducing cloud computing or big data, and the infrastructure that gives you what we call multiplexing of resources. We’re very much seeing this idea of introducing scalable platform-resource management, and you see that a lot with the heritage of virtualization.

Going into networking and network scalability, a lot of the customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it’s all about connectivity in the field. I meet a number of clients who are saying, “We’ve got this cloud service,” or “This service is in a certain area of my country. If I move to another part of the country, or I’m traveling, I can’t get connectivity.” That’s the big issue of scaling.

Another one is application programming interfaces (APIs). What we’re seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities of what Chris Anderson of Wired used to call the “long tail effect.” It is now a reality in terms of building that kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. These four levels are where companies need to start thinking for IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formative stages of the “third platform,” or Platform 3.0, for The Open Group and for the industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 Forum. We’re leveraging a lot of the components that have been done previously by the work of the members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of Things.

First step

Our first step is to bring those things together to make sure that we’ve got a foundation to depart from. The next thing is that, through our Platform 3.0 Forum and its Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from the technological aspects and what standards are needed to, taking a cue from our previous cloud working group, the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them.

What we’re really working toward in Philadelphia is to set up an exchange of ideas among the people who can, from the buy side, bring in their use cases and, from the supply side, bring in their ideas about what the technology possibilities are, and bring those together and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We’ve heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, we think about how powerful the data can be when processed properly, when recommendations can be delivered to the right place at the right time. But we also recognize that there are limits to a manual, or even human-level, approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, it seems there are already some early examples of bringing cloud, data, social, mobile, and granular interactions together, where we’ve begun to see how a recommendation engine could be brought to bear. I’m thinking about the Siri capability at Apple and even some of the examples of the Watson technology at IBM.

So to our panel, are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a supercomputer or a data center of supercomputers, brought to bear on almost any problem instantly, with the result delivered directly to a center, a smartphone, any number of end points?

It seems that the potential here is mind-boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet. The advent of IPv6 and the explosion in multimedia services will start to drive it.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, insight into what’s happening, and the predictive nature of these, it becomes something much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing of data, traffic, devices, and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it’s something that’s really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly, in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but for businesses, that alone would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to gain significant advantage and first-mover benefits?

Harding: Businesses are always taking stock. They understand their environments. They understand how the world that they live in is changing, and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, “So now this is where we could make a change to our business.” It’s the vision moment, where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and I’m very happy to say this, it’s something that computers aren’t going to be able to do for a very long time yet. It’s going to really be down to business people to do this as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling the gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years, but in the next couple of technology cycles, that we’ll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data and all of the medical textbooks ever written are data. Pick your industry, and there is a huge knowledge base that humans must currently keep on top of.

This approach, and these advances in recommendation engines driven by the availability of big data, are going to produce profound changes in the way knowledge workers perform their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.

The Open Group July Conference Emphasizes Value of Placing Structure and Agility Around Enterprise Risk Reduction Efforts

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15 in Philadelphia.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sectors.

We’re here now with a panel of experts to explore new trends and solutions in the area of anticipating risk, and how to better manage organizations with that knowledge. We’ll learn how enterprises are better delivering risk assessment and, one hopes, defenses, in the current climate of challenging cybersecurity. And we’ll see how predicting risks and potential losses accurately is an essential ingredient in enterprise transformation.

With that, please join me in welcoming our panel. We’re here with Jack Freund, the Information Security Risk Assessment Manager at TIAA-CREF. Jack has spent over 14 years in enterprise IT, is a visiting professor at DeVry University, and also chairs a risk-management subcommittee for ISACA. Welcome back, Jack.

Jack Freund: Glad to be here, Dana. Thanks for having me.

Gardner: We’re also here with Jack Jones. He is the Principal at CXOWARE, and he has more than nine years of experience as a Chief Information Security Officer (CISO). He is also the inventor of the FAIR risk analysis framework. Welcome, Jack.

Jack Jones: Thank you very much.

Gardner: We’re also here with Jim Hietala. He is the Vice President, Security, at The Open Group. Welcome, Jim.

Jim Hietala: Thanks, Dana, good to be here.

Gardner: Let’s start with you, Jim. It’s been about six months since we spoke about these issues around risk assessment and understanding risk accurately, and it’s hard to imagine things getting any better in the last six months. There’s been a lot of news and interesting developments in the cyber-security landscape.

So has this heightened interest? What are The Open Group and others doing in this field of risk assessment and accuracy, determining what your losses might be and how that can be a useful tool?

Hietala: I would say it has. Certainly, in the cyber security world in the past six or nine months, we’ve seen more and more discussion of the threats that are out there. We’ve got nation-state types of threats that are very concerning, very serious, and that organizations have to consider.

With what’s happening, you’ve seen the US Administration and President Obama direct the National Institute of Standards and Technology (NIST) to develop a new cybersecurity framework. Certainly on the government side of things, there is an increased focus on what can be done to increase the level of cybersecurity throughout the country in critical infrastructure. So my short answer would be yes, there is more interest in coming up with ways to accurately measure and assess risk so that we can then deal with it.

Gardner: Jack Jones, do you also see a maturity going on, or are we just hearing more in the news and therefore there is a perception shift? How do you see things? How have things changed, in your perception, over the last six to nine months?

Jones: I continue to see growth and maturity, especially in areas of understanding the fundamental nature of risk and exploration of quantitative methods for it. A few years ago, that would have seemed unrealistic at best, and outlandish at worst in many people’s eyes. Now, they’re beginning to recognize that it is not only pragmatic, but necessary in order to get a handle on much of what we have to do from a prioritization perspective.

Gardner: Jack Freund, are you seeing an elevation in the attention being paid to risk issues inside companies and larger organizations? Is this something that’s getting the attention of all the people it should?

Freund: We’re entering a phase where there is going to be increased regulatory oversight over very nearly everything. When that happens, all eyes are going to turn to IT and IT risk-management functions to answer the question of whether we’re handling the right things. Without quantifying risk, you’re going to have a very hard time telling your board of directors that you’re handling the right things the way a reasonable company should.

As those regulators start to see and compare among other companies, they’ll find that these companies over here are doing risk quantification, and you’re not. You’re putting yourself at a competitive disadvantage by not being able to provide those same sorts of services.

Gardner: So you’re saying that the market itself hasn’t been enough to drive this, and that regulation is required?

Freund: It’s probably a stronger driver than market forces at this point. The market is always going to be able to help push that to a more prominent role, but especially in information security, if you’re not experiencing primary losses as a result of these sorts of things, then you have to look to economic externalities, which are largely put in play by regulatory forces here in the United States.

Jones: To support Jack’s statement that regulators are becoming more interested in this too, just in the last 60 days, I’ve spent time training people at two regulatory agencies on FAIR. So they’re becoming more aware of these quantitative methods, and their level of interest is rising.

Gardner: Jack Jones, this is probably a good time for us to explain a little bit more about FAIR. For those listeners who might not be that familiar with it, please take a moment to give us the high-level overview of what FAIR is.

Jones: Sure, just a thumbnail sketch of it. It’s, first and foremost, a model for what risk is and how it works. It’s a decomposition of the factors that make up risk. If you can measure or estimate the value of those factors, you can derive risk quantitatively in dollars and cents.

You see a lot of “risk quantification” based on ordinal scales — 1, 2, 3, 4, 5 scales, that sort of thing. But that’s actually not quantitative. If you dig into it, there’s no way you could defend a mathematical analysis based on those ordinal approaches. So FAIR is this model for risk that enables true quantitative analysis in a very pragmatic way.

Gardner: FAIR stands for Factor Analysis of Information Risk. Is that correct?

Jones: That is correct.
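Jones’s point about deriving risk in dollars and cents can be made concrete with a short simulation. What follows is a minimal sketch, not the real FAIR ontology: the reduction to two factors (loss event frequency and loss magnitude), the triangular distributions, and every numeric range here are assumptions chosen purely for illustration.

```python
import random

def simulate_ale(freq, magnitude, iterations=10_000, seed=1):
    """Monte Carlo sketch of annualized loss exposure (ALE).

    freq and magnitude are (minimum, most likely, maximum) estimates:
    loss events per year and dollars lost per event. Triangular
    distributions are a deliberate simplification; FAIR itself
    decomposes these factors much further.
    """
    rng = random.Random(seed)
    f_lo, f_mode, f_hi = freq
    m_lo, m_mode, m_hi = magnitude
    annual_losses = sorted(
        rng.triangular(f_lo, f_hi, f_mode) * rng.triangular(m_lo, m_hi, m_mode)
        for _ in range(iterations)
    )
    return {
        "mean": sum(annual_losses) / iterations,        # expected annual loss
        "p90": annual_losses[int(0.9 * iterations)],    # 90th-percentile loss
    }

# Illustrative estimates only: 1-4 loss events per year, $50k-$500k per event.
ale = simulate_ale((1, 2, 4), (50_000, 120_000, 500_000))
```

For these made-up ranges the simulated mean lands around half a million dollars a year, with the 90th percentile well above it, exactly the kind of dollars-and-cents statement that an ordinal 1-to-5 scale cannot support.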

Gardner: Jim Hietala, in addition to a very interesting and dynamic cybersecurity landscape, we also have major trends gaining traction in big data, cloud computing, and mobile. There’s lots going on in the IT world. Perhaps IT’s very nature, its roles and responsibilities, are shifting. Is doing risk assessment and management becoming part and parcel of the core competency of IT, and is that a fairly big departure from the past?

Hietala: As to the first question, it’s having to become kind of a standard practice within IT. When you look at outsourcing your IT operations to a cloud-service provider, you have to consider the security risks in that environment. What do they look like and how do we measure them?

It’s the same thing for things like mobile computing. You really have to look at the risks of folks carrying tablets and smartphones and understand the risks associated with those, and the same goes for big data. For any of these large-scale changes to our IT infrastructure, you’ve got to understand what it means from a security and risk standpoint.

Gardner: Jack Freund or Jack Jones, any thoughts about the changing role of IT as a service and service-level agreement brokering aspects of IT aligned with risk assessment?

Freund: I read an interesting article this morning about a school district that is doing something they call bring your own technology (BYOT). For anybody who has been involved in these sorts of efforts in the corporate world, that should sound very familiar. But I want to think culturally about this. When you have students working out how to do these sorts of things and becoming accustomed to being able to bring current technology, oh my gosh. When they get to the corporate world and start to work, they’re going to expect the same levels of service.

To answer your earlier question, absolutely. We have to find a way to embed risk assessment, which is really just a way to inform decision-making, in how we adopt all of these technological changes to increase market position and to make ourselves more competitive. That’s important.

Whether that’s an embedded function within IT or it’s an overarching function that exists across multiple business units, there are different models that work for different size companies and companies of different cultural types. But it has to be there. It’s absolutely critical.

Gardner: Jack Jones, how do you come down on this shifting role of IT in risk assessment, something that’s their responsibility? Are they embracing that, or maybe wishing it away?

Jones: It depends on whom you talk to. Some of them would certainly like to wish it away. I don’t think IT’s role in this idea for risk assessment and such has really changed. What is changing is the level of visibility and interest within the organization, the business side of the organization, in the IT risk position.

Previously, they were more or less tucked away in a dark corner. People just threw money at it and hoped bad things didn’t happen. Now, you’re getting a lot more board-level interest in IT risk, and with that visibility comes a responsibility, but also a certain amount of danger, if they’re doing it really badly, if they’re incredibly immature in how they approach risk.

They’re going to look pretty foolish in front of the board. Unfortunately, I’ve seen that play out. It’s never pretty and it’s never good news for the IT folks. They’re realizing that they need to come up to speed a little bit from a risk perspective, so that they won’t look like fools when they’re in front of these executives.

They’re used to seeing quantitative measures of opportunities and operational issues of risk of various natures. If IT comes to the table with a red, yellow, green chart, the board is left to wonder, first how to interpret that, and second, whether these guys really get it. I’m not sure the role has changed, but I think the responsibilities and level of expectations are changing.

Gardner: Part of what FAIR does, and risk analysis in general, is to identify potential losses and put some dollars on the potential downside. That gives IT a tool to rationalize the investments that are needed. Are you seeing knowledge of potential losses act as an incentive for spending on modernization?

Jones: Absolutely. One organization I worked with recently had certain deficiencies from the security perspective that they were aware of, but that were going to be very problematic to fix. They had identified technology and process solutions that they thought would take them a long way towards a better risk position. But it was a very expensive proposition, and they didn’t have money in the IT or information security budget for it.

So, we did a current-state analysis using FAIR to determine how much loss exposure they had on an annualized basis. Then, we said, "If you put this solution into place, given how it affects the frequency and magnitude of loss that you'd expect to experience, here's what your new annualized loss exposure would be." It turned out to be a multimillion-dollar reduction in annualized loss exposure for a cost of a few hundred thousand dollars.

When they took that business case to management, it was a no-brainer, and management signed the check in a hurry. So they ended up being in a much better position.

If they had gone to executive management saying, "Well, we've got high risk, and if we buy this set of stuff we'll have low or medium risk," it would have been a much less convincing and understandable business case for the executives. There's reason to expect that it would have been challenging to get that sort of funding, given how tight their corporate budgets were. So, yes, it can be incredibly effective in those business cases.
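The kind of before-and-after comparison Jones describes can be sketched in a few lines. FAIR factors annualized loss exposure (ALE) into loss event frequency and loss magnitude, each estimated as a calibrated range. The numbers and the triangular-distribution simplification below are hypothetical illustrations, not the organization's actual figures or the full FAIR ontology:

```python
import random

def simulate_ale(lef_min, lef_most, lef_max, lm_min, lm_most, lm_max, trials=10_000):
    """Monte Carlo estimate of annualized loss exposure (ALE).

    LEF = loss event frequency (events/year), LM = loss magnitude
    ($/event), each given as a min / most-likely / max range and
    modeled here with a triangular distribution for simplicity.
    """
    random.seed(42)  # reproducible illustration
    total = 0.0
    for _ in range(trials):
        lef = random.triangular(lef_min, lef_max, lef_most)
        lm = random.triangular(lm_min, lm_max, lm_most)
        total += lef * lm  # loss exposure for one simulated year
    return total / trials

# Hypothetical current state: 2-8 loss events/year, $200k-$1.2M each.
current = simulate_ale(2, 4, 8, 200_000, 500_000, 1_200_000)
# Hypothetical state after the proposed control: fewer, smaller events.
improved = simulate_ale(0.2, 0.5, 1, 100_000, 250_000, 600_000)

print(f"Current ALE:          ${current:,.0f}")
print(f"Improved ALE:         ${improved:,.0f}")
print(f"Annualized reduction: ${current - improved:,.0f}")
```

With inputs like these, the annualized reduction comes out in the millions against a control costing a few hundred thousand dollars, which is the shape of the business case described above.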

Gardner: Correct me if I'm wrong, but you have a book in the works since we last spoke. Jack, maybe you could tell us a bit about that and how it comes to bear on these issues?

Freund: Well, the book is currently being written. Jack Jones and I have entered into a contract with Elsevier, and we'll be preparing the manuscript over the summer and winter. Probably by the second quarter of next year, we'll have something that we can share with everybody. It's something that has been a long time coming. I know Jack has wanted to write this for a long time.

We wanted to build a conversational book around how to assess risk using FAIR, and that's an important distinction from other books in the market today, which really dig into a lot of the mathematical stuff. I'm speaking personally here, but I wanted to build a book that gave practitioners the risk tools to handle the common challenges and common opposition they face every day, and to understand how to apply the concepts in FAIR in a very tangible way.

Gardner: Very good. What about the conference itself? We're coming up very rapidly on The Open Group Conference. What should we expect in terms of your presentations and training activities?

Jones: I think it will be a good time. People will be pleased with the quality of the presentations and some of the new information that they'll get to see and experience. As you said, we're offering FAIR training as part of the conference. It's a two-day session, with an opportunity afterwards to take the certification exam.

If history is any indication, people will get a lot out of the training. We get a lot of very positive remarks about a number of different things. One, they never imagined that risk could be interesting. They're also surprised that it's not, as one friend of mine calls it, "rocket surgery." It's relatively straightforward and intuitive stuff. It's just that, as a profession, we haven't had this frame of reference before, or some of the methods that we apply to make it practical and defensible.

So we’ve gotten great feedback in the past, and I think people will be pleasantly surprised at what they experienced.

Freund: One of the things I always say about FAIR training is that it's a real red pill/blue pill moment, in reference to the old Matrix movies. I took FAIR training several years ago with Jack, and I always tease him that it's ruined me for other risk assessment methods. Once you learn how to do it right, it's very obvious which methods are wrong, why you can't use them to assess risk, and why they're problematic.

I’m joking. It’s really great and valuable training, and now I use it every day. It really does open your eyes to the problems and the risk assessment portion of IT today, and gives a very practical and actionable things to do in order to be able to fix that, and to provide value to your organization.

Gardner: Jim Hietala, the emphasis in terms of vertical industries at the conference is on finance, government, and healthcare. They seem to be the right groups to be factoring in more standardization and understanding of risk. Tell me how it comes together. Why is The Open Group looking at vertical industries at this time?

Hietala: Specific to risk, if I can talk about that for a second, the healthcare world, at least here in the US, has new security rules, and one of the first requirements is to perform an annual risk assessment. So it's currently relevant to that industry.

It’s the same thing with finance. One of the regulations around financial organizations tells them that, in terms of information security, they need to do a risk assessment. In government, clearly there has been a lot of emphasis on understanding risk and mitigating it throughout various government sectors.

In terms of The Open Group and verticals, we've done lots of great work in areas such as Enterprise Architecture and security. In terms of our conferences, we've evolved things over the last year or so to start looking at what's unique to particular verticals.

It started in the mining industry. We set up a mining, metals, and exploration forum that looked at IT and architecture issues related specifically to that sector. We started that work several years ago, and now we're looking at other industries and starting to assess what's unique in healthcare, for example. We've got a one-day workshop in Philadelphia on the Tuesday of the conference, looking at IT and transformation opportunities in the healthcare sector.

That’s how we got to this point, and we’ll see more of that from The Open Group in the future.

Gardner: Are there any updates that we should be aware of in terms of activities within The Open Group and other organizations working on standards, taxonomy, and definitions when it comes to risk?

Hietala: I’ll take that and dive into that. We at The Open Group originally published a risk taxonomy standard based on FAIR four years ago. Over time, we’ve seen greater adoption by large companies and we’ve also seen the need to extend what we’re doing there. So we’re updating the risk taxonomy standard, and the new version of that should be published by the end of this summer.

We also saw within the industry the need for a certification program for risk analysts, so that they'd be trained in quantitative risk assessment using FAIR. We're working on that program, and we'll be talking more about it in Philadelphia.

Along the way, as we were building the certification program, we realized that there was a missing piece in terms of the body of knowledge. So we created a second standard, a companion to the taxonomy, called the Risk Analysis Standard, which looks more at the process issues and at how to do risk analysis using FAIR. That standard will also be available by the end of the summer, and combined, those two standards will form the body of knowledge that we'll be testing against in the certification program when it goes live later this year.

Gardner: Jack Freund, it seems that between regulatory developments, the need for maturity in these enterprises, and the standardization being brought to bear by groups such as The Open Group, this is becoming quite a bit more of a science and less of an art.

What does that bring to organizations in terms of a bottom-line effect? I wonder if there's a use case or an example you could mention and explain that would help people better understand what they get back when they go through these processes and gain this better maturity around risk?

Freund: I’m not an attorney, but I have had a lot of lawyers tell me — I think Jim had mentioned before in his vertical conversation — that a lot of the regulations start with performing annual risk assessment and then choose controls based upon that. They’re not very prescriptive that way.

One of the things it drives in organizations, more than anything else, is a sense of confidence that we've got things covered. When the leadership of these organizations understands that you're doing what a reasonable company would do to manage risk this way, you have fewer fire drills. Nobody likes to walk into work and have to deal with a hundred different things.

We’re moving hard drives out of printers and fax machines, what are we doing around scanning and vulnerabilities, and all of those various things that every single day can inundate you with worry, as opposed to focusing on the things that matter.

I like a folksy saying that sort of sums things up pretty well — a dime holding up a dollar. You have all these little bitty squabbly issues that get in the way of really focusing on reducing risk in your organization in meaningful ways and focusing on the things that matter.

Using approaches like FAIR drives a lot of value into your organization, because you're freeing up mind share in your executives to focus on things that really matter.

Gardner: Jack Jones, a similar question, any examples that exemplify the virtues of doing the due diligence and having some of these systems and understanding in place?

Jones: I have an example related to Jack Freund's point about being able to focus and prioritize. One organization I was working with had identified a significant risk issue and was considering three different mitigation options that had been proposed. One was "best practice," and the other two were less commonly considered for that particular issue.

An analysis showed with real clarity that option B, one of the non-best-practice options, would reduce risk every bit as effectively as best practice, but at a much lower cost. The organization then got to make an informed decision about whether they were going to be herd followers or be more cost-effective in risk management.

Unfortunately, there’s always danger in not following the herd. If something happens downstream, and you didn’t follow best practice, you’re often asked to explain why you didn’t follow the herd.

That was part of the analysis too, but at the end of the day, management got to make a decision on how they wanted to behave. They chose to not follow best practice and be more cost-effective in using their money. When I asked them why they felt comfortable with that, they said, “Because we’re comfortable with the rigor in your analysis.”

To your earlier question about art versus science: first of all, in most organizations there would have been no question. They would have said, "We must follow best practice." They wouldn't even have examined the options, and management wouldn't have had the opportunity to make that decision.

Furthermore, even if they had "examined" those options using a more subjective, artistic approach, somebody's wet finger in the air, management almost certainly would not have felt comfortable with a non-best-practice approach. So the more scientific, more rigorous approach that something like FAIR provides gives you all kinds of opportunity to make informed decisions and to feel more comfortable about those decisions.

Gardner: It really sounds as if there’s a synergistic relationship between a lot of the big-data and analytics investments that are being made for a variety of reasons, and also this ability to bring more science and discipline to risk analysis.

How do those come together, Jack Jones? Are the dots being connected in these large organizations? Can they take more of what they garner from big data and business intelligence (BI) and apply it to these risk assessment activities? Is that happening yet?

Jones: It’s just beginning to. It’s very embryonic, and there are only probably a couple of organizations out there that I would argue are doing that with any sort of effectiveness. Imagine that — they’re both using FAIR.

But when you think about BI or any sort of analytics, there are really two halves to the equation. One is data and the other is models. You can have all the data in the world, but if your models stink, you can't be effective. And, of course, vice versa: if you've got a great model and zero data, you've got challenges there as well.

Being able to combine the two, good data and effective models, puts you in a much better place. As an industry, we aren't there yet. We've got some really interesting things going on, and there's a lot of potential, but people have to leverage that data effectively and make sure they're using a model that makes sense.

There are some models out there that, frankly, are just so badly broken that all the data in the world isn't going to help you. The models will grossly misinform you. So people have to be careful, because data is great, but if you're applying it to a bad model, you're in trouble.

Gardner: We are coming up near the end of our half hour. Jack Freund, for those organizations that are looking to get started, to get more mature, perhaps start leveraging some of their investments in areas like big data, in addition to attending The Open Group Conference or watching some of the plenary sessions online, what tips do you have for getting started? Are there some basic building blocks that should be in place or ways in which to get the ball rolling when it comes to a better risk analysis?

Freund: Strong personality matters in this. They have to have some sort of evangelist in the organization who cares enough about it to drive it through to completion. That’s a stake on the ground to say, “Here is where we’re going to start, and here is the path that we are going to go on.”

When you start doing that sort of thing, even if leadership changes and other things happen, you have a strong commitment from the organization to keep moving forward on these sorts of things.

I spend a lot of my time integrating FAIR with other methodologies. One of the messaging points that I keep saying all the time is that what we are doing is implementing a discipline around how we choose our risk rankings. That’s one of the great things about FAIR. It’s universally compatible with other assessment methodologies, programs, standards, and legislation that allows you to be consistent and precise around how you’re connecting to everything else that your organization cares about.

Concerns around operational risk integration are important as well. Driving that through to completion in the organization has a lot to do with finding sponsorship and then building a program to completion. But even absent that high-level sponsorship, because FAIR allows you to build a discipline around how you choose rankings, you can also build it from the bottom up. You can have groups of people who are FAIR-trained building full risk analyses, or simply picking ranges: 1, 2, 3, 4, or high, medium, low. Then, when questioned, you have the ability to say, "We think this is a medium, because it met the frequency and magnitude criteria we've established using FAIR."

Different organizations culturally are going to have different ways to implement and to structure quantitative risk analysis. In the end it’s an interesting and reasonable path to get to risk utopia.
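Freund's bottom-up approach, where a high/medium/low label is derived from explicit frequency and magnitude criteria rather than picked by gut feel, can be sketched as follows. The thresholds and the simple frequency-times-magnitude rollup are hypothetical illustrations, not FAIR-prescribed values:

```python
def qualitative_rating(lef_per_year, loss_magnitude_usd):
    """Map a FAIR-style estimate (loss event frequency in events/year
    and loss magnitude in $/event) onto a qualitative ranking using
    pre-agreed, documented criteria.

    The dollar thresholds below are hypothetical; the point is that
    the bucket is derived from explicit criteria, so "medium" can be
    defended when questioned.
    """
    ale = lef_per_year * loss_magnitude_usd  # annualized loss exposure
    if ale >= 1_000_000:
        return "high"
    if ale >= 100_000:
        return "medium"
    return "low"

print(qualitative_rating(4, 500_000))    # frequent, costly events
print(qualitative_rating(1, 200_000))    # moderate exposure
print(qualitative_rating(0.5, 50_000))   # rare, inexpensive events
```

The output format stays compatible with whatever ranking scheme the organization already uses, which is the "universally compatible" property Freund describes.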

Gardner: Jack Jones, any thoughts from your perspective on a good way to get started, maybe even through the lens of the verticals The Open Group has targeted for this conference: finance, government, and healthcare? Are there any specific things to consider at the outset of your risk analysis journey in any of the three verticals?

Jones: A good place to start is with the materials The Open Group has made available on the risk taxonomy and in the soon-to-be-published Risk Analysis Standard.

Another resource I recommend to everybody I talk to is a book called 'How to Measure Anything' by Douglas Hubbard. If someone is even the least bit interested in actually measuring risk in quantitative terms, they owe it to themselves to read that book. It puts into layman's terms some very important concepts and approaches that are tremendously helpful. That's an important resource for people to consider too.

As far as within organizations, some will have a relatively mature enterprise risk management program at the corporate level, outside of IT. It can be hit-and-miss, but there can be some very good resources there in terms of people and processes that the organization has already adopted. You have to be careful, though, because some of those enterprise risk management programs, even though they may have been in place for years, and thus, one would think, have matured over time, have only dug a really deep ditch of bad practices and misconceptions.

So it’s worth having the conversation with those folks to gauge how clueful are they, but don’t assume that just because they have been in place for a while and they have some specific title or something like that that they really understand risk at that level.

Gardner: Well, very good. I'm afraid we'll have to leave it there. We've been talking with a panel of experts about new trends and solutions in the area of anticipating risk, and how to better manage organizations with that knowledge. We've seen how enterprises are delivering better risk assessments, or beginning to, as they face challenges in cybersecurity and undergo the larger undertaking of enterprise transformation.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in July 2013 in Philadelphia. There's more information on The Open Group website about attending the conference, watching the live stream, or accessing resources by downloading the conference app.

So with that thanks to our panel. We’ve been joined by Jack Freund. He is the Information Security Risk Assessment Manager at TIAA-CREF. Thank you so much, Jack.

Freund: Thank you Dana.

Gardner: And also Jack Jones, the Principal at CXOWARE. Thank you, sir.

Jones: It’s been my pleasure. Thanks.

Gardner: And then also lastly, Jim Hietala, Vice President, Security at The Open Group. Thank you Jim.

Hietala: Thank you, Dana.

Gardner: And this is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout this thought leadership interview series. Thanks again for listening, and come back next time.


Three laws of the next Internet of Things – the new platforming evolution in computing

By Mark Skilton, Global Director at Capgemini

There is a wave of new devices and services, growing in strength, that is extending the boundary of what is possible in today's internet-driven economy and lifestyle. A striking feature is the link between apps on smartphones and tablets and the ability to connect not just to websites but also to data collection sensors and to intelligent analysis of that information. A key driver has been the improvement in the cost-performance curve of information technology, not just in CPU and storage but also in the ready availability and affordability of highly powerful computing and mass storage in mobile devices; coupled with access to complex sensors, advanced optics, and screen displays, the result is a potentially truly immersive experience. This is a long way from the early days of radio frequency identity (RFID) tags, the forerunner of this evolution. Digitization of information, and the interpretation of its meaning, is everywhere, moving into a range of industries and augmented services that create new possibilities and value. A key challenge is how to understand this growth of devices, sensors, content, and services across the myriad platforms and permutations it can bring. Consider the following domains:

·         Energy conservation

o   Through home and building energy management

·         Lifestyle activity

o   Motion sensors: accelerometers, ambient light sensors, moisture sensors, gyroscopes, and proximity sensors.

·          Lifestyle health

o   Heart rate, blood oxygen levels, respiratory rate, and heart rate variability for cardiorespiratory monitoring are some of the potential that connected devices offer.

·         Medical Health

o   Biomedical sensing for patient care and elderly care management: heart, lung, and kidney dialysis; medical valve and organ implants; orthopaedic implants; and brain-image scanning. Devices can monitor elderly physical activity, blood pressure, and other factors unobtrusively and proactively. These aim to drive improvements in prevention, testing, early detection, surgery, and treatment, helping improve quality of life and addressing rising medical costs and the societal impact of an aging population.

·         Transport

o   Precision global positioning, local real-time image perception and interpretation sensing, and dynamic electromechanical control systems.

·         Materials science engineering and manufacturing

o   Strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing, and 3D printing. 3D printing has the potential to revolutionize automated manufacturing; through distributed services over the internet, manufacturing can potentially be accessed by anyone.

·         Physical Safety and security

o   Examples include parental protection of children: controlling children's access to their mobile phones via a PC, using web-based applications to monitor and control mobile and computing access. Or keyless entry using your phone: Wi-Fi, Bluetooth, and internet network apps and devices that automate the locking of physical doors and entry, remotely or in proximity.

·         Remote activity and swarming robotics

o   The development of autonomous robotics to respond to and support exploration and services in harsh or inaccessible environments; disabled support through robotic prosthetics and communication synthesis; and swarming robots that fly, or that mimic group behavior, nature, and decision making.

These are just the tip of what is possible: the early commercial ventures that are starting to drive new ways to think about information technology and application services.

A key feature I noticed in all these devices is that they augment previous layers of technology by sitting on top of them and adding extra value. While the long shadow of the first-generation giants of the public internet (Apple, Google, Amazon) often gives the impression that success requires a controlled platform and an investment of millions, these new technologies use existing infrastructure and operate across a federated, distributed architecture that represents a new kind of platforming paradigm of multiple systems.

Perhaps a paradigm of new technology cycles is that as new technology arrives, it cannibalizes older technologies. Clearly nothing is immune to this trend, not even the cloud. I'll call it the evolution of a kind of technology law (a feature I saw in Charles Fine's book Clockspeed, http://www.businessforum.com/clockspeed.html, but adapted here as a function of compound cannibalization and augmentation). I think Big Data is an example of such a shift, as augmented informatics enables major next-generation plays for added-value services.

These devices and sensors can work with existing infrastructure services and resources, but they also create a new kind of computing architecture that involves many technologies, standards, and systems: what was earlier called "system of systems" integration (examples are seen in the defence sector, http://www.bctmod.army.mil/SoSI/sosi.html, and in digital ecosystems in the government sector, http://www.eurativ.com/specialreport-skills/kroes-europe-needs-digital-ecosy-interview-517996).

While a sensor device can replace the existing thermostat in your house, your lighting, or the access locks on your doors, these devices offer a new kind of augmented experience, providing information and insight that enable better control of the wider environment, or of the actions and decisions taken within a context.

This leads to a second feature of these devices: the ability to learn and adapt based on inputs and the environment. This probably has an even larger impact than the first feature, the use of existing infrastructure, because the ability to change outcomes is a revolution in information. The previous idea of static information, with humans making sense of the data, is being replaced by the active pursuit of automated intelligence from the machines we build. Earlier design paradigms that needed to define declarative services, what IT calls CRUD (Create, Read, Update, Delete), as predefined and managed transactions are being replaced by machine learning algorithms that seek to build a second generation of intelligent services, services that alter their results with the passage of time and usage characteristics.
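The contrast between a static CRUD-style setpoint and a service that learns from its environment can be shown with a toy sketch. The exponential-moving-average rule below is a deliberately simple stand-in for the learning algorithms the text describes (a real product such as Nest uses far richer models), and all names and numbers are hypothetical:

```python
class AdaptiveThermostat:
    """Toy contrast with a static CRUD-style service: instead of only
    storing a user-defined setpoint, the device updates its target
    from observed user overrides, so its behavior changes with usage
    over time.
    """

    def __init__(self, initial_setpoint=20.0, learning_rate=0.3):
        self.setpoint = initial_setpoint      # degrees Celsius
        self.learning_rate = learning_rate    # how fast it adapts

    def observe_override(self, user_chosen_temp):
        # Nudge the learned setpoint toward what the user actually
        # chose (an exponential moving average).
        self.setpoint += self.learning_rate * (user_chosen_temp - self.setpoint)

t = AdaptiveThermostat()
for temp in [22, 22, 23, 22]:  # a week of evening overrides
    t.observe_override(temp)
print(round(t.setpoint, 1))    # has drifted from 20.0 toward the observed preference
```

A CRUD service would still report 20.0 until someone explicitly updated the record; the adaptive version's answer is a function of its history, which is the shift in design paradigm described above.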

This leads me to a third effect, which became apparent in the discussion of lifestyle services versus medical and active device management. For lifestyle devices, a key feature is the ability to blend in with personal activity to enable new insight into behavior and lifestyle choices, to passively and actively monitor or take action, not always to affect the behavior itself: that is, to provide an unobtrusive, ubiquitous presence. Moving this idea further, it is also about the way devices can merge into and become integrated within the context of the user or the environmental setting. Biomedical devices that augment patient care and wellbeing are one example that can have a real and substantive impact on quality of life, as well as on the cost efficiency of care programs, with an aging population to support.

An interesting side effect of these trends is the cultural dilemma these devices and sensors bring in the intrusion on personal data and privacy. Yet once the meaning and value of this telemetry on safety, health, or material factors is perceived to be for the good of the individual and the community, the adoption of such services may become more pronounced and reinforced: a virtuous circle of accelerated adoption, a key characteristic of successful growth, and a kind of conditioning feedback that creates positive reinforcement. While a key underpinning feature is the ability of the device and sensor to have an unobtrusive, ubiquitous presence, this overall effect is central to the idea of effective system-of-systems integration and Boundaryless Information Flow™ (The Open Group).

These trends I see as the three laws of the next Internet of Things, describing a next-generation platforming strategy and evolution.

It's clear that sensors and devices are merging in ways that will cut across industries. Motion and temperature sensors in one industry will see application in another. Services from one industry may connect with those of other industries as combinations of services, lifestyles, and effects.


Formal and informal communities, both physical and virtual, will be connected through sensors and devices that pervade the social, technological, and commercial environments. This will drive further growth in the mass of data and digitized information, with the gradual semantic representation of this information into meaningful context. App services will develop increasing intelligence and awareness of the multiplicity of data, its content, and its metadata, adding new insight and services to the infrastructure fabric. This is a new platforming paradigm that may be constructed from one or many systems and architectures, from macro- down to micro- and nano-level systems technologies.

The three laws I describe may be recast in a lighter, tongue-in-cheek way by comparing them to Isaac Asimov's famous three laws of robotics. This is just an illustration, but it implies that the sequence of laws in some fashion protects users, resources, and the environment out of an altruistic motive. That may be the case in some system feedback loops that seek this goal, but commercial microeconomic considerations are often more the driver. However, I can't help thinking that this hints at what may be the first stepping stone to the eventuality of such laws.

Three laws of the next generation of The Internet of Things – a new platforming architecture

Law 1. A device, sensor or service may operate in an environment if it can augment infrastructure.

Law 2. A device, sensor or service must be able to learn and adapt its response to the environment, as long as it is not in conflict with the First Law.

Law 3. A device, sensor or service must have an unobtrusive, ubiquitous presence, such that it does not conflict with the First or Second Laws.

References

·         Energy conservation

o   The example of Nest (http://www.nest.com), the learning thermostat founded by Tony Fadell, ex-iPod hardware designer and head of Apple's iPod and iPhone division. The device monitors and learns about energy usage in a building and adapts and controls the use of energy for improved carbon and cost efficiency.

·         Lifestyle activity

o   Motion sensors: accelerometers, ambient light sensors, moisture sensors, gyroscopes, and proximity sensors. Examples include the Jawbone UP (https://jawbone/up) and Fitbit (http://www.fitbit.com).

·          Lifestyle health

o   Heart rate, blood oxygen levels, respiratory rate, and heart rate variability for cardiorespiratory monitoring are some of the potential that connected devices such as Zensorium (http://www.zensorium.com) offer.

·         Medical Health

o   Biomedical sensing for patient care and elderly care management: heart, lung, and kidney dialysis; medical valve and organ implants; orthopaedic implants; and brain-image scanning. Devices can monitor elderly physical activity, blood pressure, and other factors unobtrusively and proactively (http://www.nytimes.com/2010/07/29/garden/29parents.html?pagewanted-all). These aim to drive improvements in prevention, testing, early detection, surgery, and treatment, helping improve quality of life and addressing rising medical costs and the societal impact of an aging population.

·         Transport

o   Precision global positioning, local real-time image perception and interpretation sensing, and dynamic electromechanical control systems. Examples include Toyota's advanced IT systems that help drivers avoid road accidents (http://www.toyota.com/safety/) and the Google driverless car (http://www.forbes.com/sites/chenkamul/2013/01/22/fasten-your-seatbelts-googles-driverless-car-is-worth-trillions/).

• Materials science, engineering and manufacturing: Strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing and 3D printing. 3D printing has the potential to revolutionize automated manufacturing; delivered as distributed services over the internet, manufacturing can potentially be accessed by anyone.

• Physical safety and security: Alpha Blue (http://www.alphablue.co.uk), which controls children's access to their mobile phones from a PC, is an example of parental protection using web-based applications to monitor and control mobile and computing access. Another is keyless entry using your phone: Wi-Fi, Bluetooth and internet network apps and devices that automate the locking of a physical door remotely or in proximity, such as Lockitron (https://www.lockitron.com).

• Remote activity and swarming robotics: The development of autonomous robotics to respond to and support exploration and services in harsh or inaccessible environments. Examples include the NASA Mars Curiosity rover, whose active control programs determine remote actions on the red planet, where a signal delay of 13 minutes, 48 seconds each way at EDL (roughly 30 minutes round trip) limits how quickly an event can be detected and reacted to from Earth (http://blogs.eas.int/mex/2012/08/05/time-delay-betrween-mars-and-earth/, http://www.nasa.gov/mission_pages/mars/main/imdex.html). Other examples include disabled support through robotic prosthetics and communication synthesis (http://disabilitynews.com/technology/prosthetic-robotic-arm-can-feel/), and swarming robots that fly or mimic group behavior, such as those at the University of Pennsylvania (http://www.reuters.com/video/2012/03/20/flying-robot-swarms-the-future-of-search?videoId-232001151) and the Natural Robotics Lab at The University of Sheffield, UK (http://www.sheffield.ac.uk/news/nr/sheffield-centre-robotic-gross-natural-robotics-lab-1.265434).
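The "learn and adapt" behavior in the energy conservation example above can be sketched as a toy model. The per-hour exponential-moving-average rule below is purely illustrative, not Nest's actual algorithm, and the class and parameter names are my own invention:

```python
# Toy sketch of a "learning" thermostat in the spirit of the Nest example.
# The per-hour exponential-moving-average rule is purely illustrative.
class LearningThermostat:
    def __init__(self, default_temp=20.0, alpha=0.3):
        self.alpha = alpha  # how quickly learned preferences adapt
        self.preferred = {h: default_temp for h in range(24)}

    def observe(self, hour, user_set_temp):
        """Blend each manual adjustment into the learned preference."""
        old = self.preferred[hour]
        self.preferred[hour] = (1 - self.alpha) * old + self.alpha * user_set_temp

    def target(self, hour):
        """Temperature the device would set for this hour of day."""
        return round(self.preferred[hour], 1)

t = LearningThermostat()
for _ in range(10):      # the user repeatedly turns it down to 18 C at 11 pm
    t.observe(23, 18.0)
print(t.target(23))      # has drifted from the 20.0 default toward 18.0
print(t.target(8))       # unobserved hours keep the default, 20.0
```

The same pattern satisfies Law 2 in miniature: the device adapts its response to repeated environmental feedback without requiring explicit reprogramming.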

 Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

1 Comment

Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0

The era of “Internet aware systems and services” – the multiple-data, multi-platform and multi-device and sensors world

By Mark Skilton, Global Director at Capgemini

Communications + Data protocols and the Next Internet of Things Multi-Platform solutions

Much of the discussion on the "internet of things" has been around industry-sector examples of device and sensor services; I have listed examples of these at the end of this paper. Central to this emerging trend are not just sector point solutions but three key technical issues driving a new industry-sector digital services strategy that brings these together into a coherent whole.

  1. How combinations of system technology platforms are converging, enabling composite business processes that are mobile, content- and transaction-rich, with near-real-time persistence and interactivity
  2. The development of emerging "non-web-browser" protocols for sensor-driven machine data, which extend new types of data into internet-connected business and social integration
  3. The development of "connected systems" that combine multiple services across platforms into new digital services, creating new business and technology services

I want to illustrate this by focusing on three topics: multi-platforming strategies, communication protocols and examples of connected systems.

I want to show that this is not the simple "three or four step model" I often see, where mobile plus applications plus cloud are presented as a solution but result in silos of data and platform integration challenges. New processing methods for big data platforms, distributed stream computing and in-memory database services, for example, are changing the nature of business analytics, in particular marketing and sales strategic planning and insight. New feedback systems collecting social and machine-learning data are creating new types of business growth opportunities in context-aware services that augment skills and services.

The major solutions in the digital ecosystem today incorporate an ever-growing mix of devices and platforms that offer new user experiences and forms of organization. This can be seen across almost all industry sectors, and horizontally between them. The following diagram is a simplified view I want to use to illustrate the fundamental structures that are forming.


Multiple devices that offer simple to complex visualization and on-board application services.

Multiple sensors that can economically detect, measure and monitor most physical phenomena: light, heat, energy, chemical and radiological, in both non-biological and biological systems.

Physical and virtual communities of formal and informal relationships. These human- and/or machine-based associations can now search for and discover data and resources autonomously across an internet of many different types of data.

Physical and virtual infrastructure representing the servers, storage, databases, networks and other resources that constitute one or more platforms and environments. This infrastructure is now more complex in that it is both distributed and federated across multiple domains: mobile platforms, cloud computing platforms, social network platforms, big data platforms and embedded sensor platforms. The sense of a single infrastructure is both correct and incorrect, in that it is a combined state and set of resources that may or may not be within the span of control of an individual or organization.

Single and multi-tenanted application services that operate in transactional, semi-deterministic or non-deterministic ways, driving logical processing, formatting, interpretation, computation and other processing of data and results across one-to-many, many-to-one or many-to-many platforms and endpoints.
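As an illustrative sketch, the structural elements just described (devices, sensors and application services grouped into platforms) might be modeled as follows. All of the class, platform and service names here are my own invented examples, not from the article:

```python
from dataclasses import dataclass, field

# Illustrative data model of the structural elements described above:
# devices, sensors and application services grouped into platforms that
# may federate across domains (mobile, cloud, social, big data, ...).
@dataclass
class Platform:
    name: str
    devices: list = field(default_factory=list)   # e.g. ["smartphone"]
    sensors: list = field(default_factory=list)   # e.g. ["gyroscope"]
    services: list = field(default_factory=list)  # application services

    def interconnects(self, other):
        """Toy rule: two platforms federate when they share a service."""
        return bool(set(self.services) & set(other.services))

mobile = Platform("mobile", devices=["smartphone"],
                  sensors=["gyroscope", "GPS"],
                  services=["telemetry", "auth"])
cloud = Platform("cloud", services=["telemetry", "analytics"])
social = Platform("social", services=["identity"])

print(mobile.interconnects(cloud))   # True: both expose "telemetry"
print(mobile.interconnects(social))  # False: no shared service
```

Even this toy model makes the "single versus combined infrastructure" point concrete: each platform is self-contained, yet value emerges from the services that span them.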

The key to thinking in multiple platforms is to establish the context of how these fundamental platform services drive interactions for many industries and for business and social networks and services. This is changing: because these services are interconnected, the very basis of what defines a platform is shifting from a single-platform to a multi-platform concept.

This diagram illustrates some of these relationships and arrangements. It is just one example of a digital ecosystem pattern; there can be other arrangements of these system use cases to meet different needs and outcomes.

I use this model to illustrate some of the key digital strategies to consider in empowering communities, driving value-for-money strategies or establishing a joined-up device and sensor strategy for new mobile knowledge workers. This is particularly relevant to the decision-making processes of key business stakeholders today, in sales, marketing, procurement, design, sourcing, supply and operations, up to board level, as well as to IT strategy, service integration and engineering.

Taking one key stakeholder example, the Chief Marketing Officer (CMO) is central to strategic channel and product development and brand management. The CMO typically seeks to develop customer zones, supplier zones, marketplace trading communities, social networking communities and behavior insight leadership. These are critical drivers for a successful company presence, product and service brand, and market growth, as well as for managing and aligning IT cost and spend to what the business needs. This creates a new kind of digital marketing infrastructure to drive new customer and marketing value. The following diagram illustrates types of marketing services that raise questions over the types of platforms needed for single and multiple data sources, data quality and fidelity.

These interconnected issues affect the efficacy and relevancy of marketing services, which must work at the speed, timeliness and point of contact necessary to add and create customer and stakeholder value.

What all these new converged technologies have in common is communications: not just HTTP protocols, but a wider bandwidth of frequencies that is blurring together what can now be connected.

These protocols include Wi-Fi and other wireless systems and standards, not just in the voice and speech band but also in the collection and use of other types of telemetry relating to other senses and detectors.

All of these share common issues of device and sensor compatibility, discovery and pairing, and security compatibility and controls.

Examples of communication standards for multiple services:

  • Wireless: WLAN, Bluetooth, ZigBee, Z-Wave, Wireless USB
  • Proximity: smartcard (passive, active, vicinity card)
  • Infrared: IrDA
  • Satellite: GPS
  • Mobile: 3G, 4G LTE, cell, femtocell, GSM, CDMA, WiMAX
  • RFID: RF, LF, HF bands
  • Encryption: WEP, WPA, WPA2, WPS, other

These communication protocols impact the design and connectivity of system-to-system services. These standards relate to the operability of the services that can be used in the context of a platform, and how they are delivered and used by consumers and providers. How does the data and service connect with the platform? How does the service content get collected, formatted, processed and transmitted between the source and target platform? How do these devices and sensors work to support extended and remote mobile and platform services? Which distributed workloads work best on a mobile platform or a sensor platform, and which are better distributed to a dedicated or shared platform that may be cloud-computing- or appliance-based?
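Two of these design questions can be given a minimal sketch. The protocol names, encryption modes and placement thresholds below are invented for illustration only:

```python
# Toy pairing check: a device and gateway can pair when they share at
# least one protocol and the device's encryption mode is acceptable.
def can_pair(device_protocols, gateway_protocols,
             device_encryption, accepted_encryption):
    shared = set(device_protocols) & set(gateway_protocols)
    return bool(shared) and device_encryption in accepted_encryption

# Toy workload placement: latency-critical work stays on the device,
# bulk processing goes to a shared cloud platform (thresholds invented).
def place_workload(data_mb, max_latency_ms):
    if max_latency_ms < 50:
        return "device"
    if data_mb > 1000:
        return "cloud"
    return "edge gateway"

print(can_pair({"ZigBee", "Bluetooth"}, {"Bluetooth", "Wi-Fi"},
               "WPA2", {"WPA", "WPA2"}))                 # True
print(place_workload(data_mb=5000, max_latency_ms=500))  # cloud
```

Real pairing and placement decisions involve far more factors (radio range, power budgets, data sovereignty), but the shape of the decision is the same: intersect capabilities, then route work to the platform that satisfies the constraints.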

Answering these questions is key to providing a consistent and powerful digital service strategy that is flexible and capable of exploiting, scaling and operating with these new system and inter-system capabilities.

This becomes central to a new generation of internet-aware data and services representing the digital ecosystem that delivers new business and consumer experiences on and across platforms.

This results in a new kind of user experience and presence strategy, one that takes the "single voice of the customer" to a new level, working across mobiles, tablets and other devices and sensors that translate and create new forms of information and experience for consumers and providers. Combined with new sensors covering, for example, positional, physical and biomedical data, this becomes a reality in the new generation of digital services. Smartphones today have a price point that includes many built-in sensors: precision technologies measuring physical and biological data sources. Building these into new feedback and decision analytics creates a whole new set of possibilities in real-time and near-real-time augmented services, as well as new levels of insight into resource use and behavior.

The scale and range of data types (text, voice, video, image, semi-structured, unstructured, knowledge, metadata, contracts, IP) about social, business and physical environments have moved beyond the early days of RFID tags to encompass new internet-aware sensors, systems, devices and services. This is not just the "tabs and pads" of mobiles and tablets but a growing presence in the "boards, places and spaces" that make up physical environments, turning them into part of the interactive experience and sensory input of service interaction. This now extends to the massive scale of terrestrial communications that connect across the planet and beyond, in the case of NASA for example, but also right down to the micro, nano, pico and quantum levels in the case of molecular and nanotech engineering. All of these are now part of the modern technological landscape that is pushing the barriers of what is possible in today's digital ecosystem.

The conclusion is that strategic planning needs insight into the nature of the new infrastructures and applications that will support these multisystem workloads.
I illustrate this in the following diagram with what I call the "multi-platforming" framework, which represents this emerging new ecosystem of services.

Digital Service = k × ∑ Platforms + ∑ Connections

k = a coefficient measuring how open or closed the service is, and its potential value

Digital Ecosystem = e × ∑ Digital Services

e = a coefficient of how diverse and dynamic the ecosystem and its service participants are
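Read as a toy calculation, the two expressions can be exercised numerically. The coefficients and value scores below are invented purely for illustration; the original formulas do not prescribe any units:

```python
# Toy numerical reading of the two expressions: k and e are coefficients
# in [0, 1], and the platform/connection/service terms carry invented
# "value" scores rather than any real measurement.
def digital_service(k, platform_values, connection_values):
    return k * sum(platform_values) + sum(connection_values)

def digital_ecosystem(e, service_values):
    return e * sum(service_values)

s1 = digital_service(k=0.8, platform_values=[10, 5], connection_values=[2, 3])
s2 = digital_service(k=0.5, platform_values=[8], connection_values=[1])
eco = digital_ecosystem(e=0.9, service_values=[s1, s2])
print(s1, s2)         # 17.0 5.0
print(round(eco, 1))  # 19.8
```

The structure matters more than the numbers: a closed service (low k) discounts the value of the platforms it sits on, and a static ecosystem (low e) discounts the combined value of its services.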

In future blogs I will explore the impact on enterprise architecture and digital strategy, and the emergence of a new kind of architecture I call Ecosystem Architecture.

Examples of new general industry-sector services for the Internet of Things

 Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

4 Comments

Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0

Driving Boundaryless Information Flow in Healthcare

By E.G. Nadhan, HP

I look forward with great interest to the upcoming Open Group conference on EA & Enterprise Transformation in Finance, Government & Healthcare in Philadelphia in July 2013. In particular, I am interested in the sessions planned on topics related to the healthcare industry. This industry faces several challenges: uncontrolled medical costs, legislative pressures, increased plan participation, and the improved longevity of individuals. Come to think of it, these challenges are not that different from those faced when defining a comprehensive enterprise architecture. So can the fundamental principles of enterprise architecture be applied to resolving these challenges in the healthcare industry? The Open Group certainly thinks so.

Enterprise Architecture is a discipline, methodology, and practice for translating business vision and strategy into the fundamental structures and dynamics of an enterprise at various levels of abstraction. As defined by TOGAF®, enterprise architecture needs to be developed through multiple phases. These include Business Architecture, Applications, Information, and Technology Architecture. All this must be in alignment with the overall vision. The TOGAF Architecture Development Method enables a systematic approach to addressing these challenges while simplifying the problem domain.

This approach to the development of enterprise architecture can be applied to the complex problem domain that manifests itself in healthcare. Thus, it is no surprise that The Open Group is sponsoring the Population Health Working Group, which has a vision to enable "boundaryless information flow" between the stakeholders that participate in healthcare delivery. Check out the presentation delivered by Larry Schmidt, Chief Technologist, Health and Life Sciences Industries, HP US, at The Open Group conference in Philadelphia.

As a Platinum member of The Open Group, HP has co-chaired the release of multiple standards, including the first technical cloud standard. The Open Group is also leading the definition of the Cloud Governance Framework. Having co-chaired these projects, I look forward to the launch of the Population Health Working Group with great interest.

Given the role of information in today's landscape, "boundaryless information flow" between the stakeholders that participate in healthcare delivery is vital. At the same time, how about injecting a healthy dose of innovation? Enterprise architects are best positioned for innovation, a point made in a post triggered by Forrester analyst Brian Hopkins's thoughts on this topic. The Open Group, with its multifaceted representation from a wide array of enterprises, provides incredible opportunities for innovation in the context of the complex landscape of the healthcare industry. Take a look at the steps taken by HP Labs to innovate and improve patient care one day at a time.

I would strongly encourage you to attend Schmidt’s session, as well as the Healthcare Transformation Panel moderated by Open Group CEO, Allen Brown at this conference.

How about you? What are some of the challenges that you are facing within the Healthcare industry today? Have you applied Enterprise Architecture development methods to problem domains in other industries? Please let me know.

Connect with Nadhan on: Twitter, Facebook, Linkedin and Journey Blog.

A version of this blog post originally appeared on the HP Enterprise Services Blog.

HP Distinguished Technologist and Cloud Advisor E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair of The Open Group SOCCI project and of The Open Group Cloud Computing Governance project.

3 Comments

Filed under Business Architecture, Cloud, Cloud/SOA, Conference, Enterprise Architecture, Healthcare, TOGAF®

The Open Group Sydney – My Conference Highlights

By Mac Lemon, MD Australia at Enterprise Architects


Well, the dust has settled now with the conclusion of The Open Group 'Enterprise Transformation' conference, held in Sydney, Australia for the first time on April 15-20. Enterprise Architects is proud to have been recognised by The Open Group as pivotal to the success of this event. A number of our clients, including NBN, Australia Post, QGC, RIO and Westpac, presented excellent papers on leading-edge approaches in strategy and architecture, and a number of EA's own thought leaders, Craig Martin, Christine Stephenson and Ana Kukec, also delivered widely acclaimed papers.

Attendance at the conference was impressive and demonstrated that there is substantial appetite for a dedicated event focussed on the challenges of business and technology strategy and architecture. We saw many international visitors, both as delegates and presenting papers, and there is no question that a 2014 Open Group forum will be the stand-out event in the calendar for business and technology strategy and architecture professionals.

My top 10 take-outs from the conference include the following:

  1. The universal maturing in understanding the criticality of Business Architecture, and the total convergence upon Business Capability Modelling as a cornerstone of business architecture;
  2. The improving appreciation of techniques for understanding and expressing business strategy and motivation, such as strategy maps, the business model canvas and business motivation modelling;
  3. That customer experience is emerging as a common driver for many transformation initiatives;
  4. While the process for establishing the case and roadmap for transformation appears well enough understood, the process for management of the blueprint through transformation is not and generally remains a major program risk;
  5. The next version of TOGAF® should offer a material uplift in support for security architecture, which otherwise remains at low levels of maturity from a framework standardisation perspective;
  6. ArchiMate® is generating real interest as a preferred enterprise architecture modelling notation, and stronger alignment of the ArchiMate® and TOGAF® meta-models in the next version of TOGAF® is highly anticipated;
  7. There is industry demand for recognised certification of architects to demonstrate learning alongside experience as the mark of a good architect. There remains an unsatisfied requirement for certification that falls in the gap between TOGAF® and the Open CA certification;
  8. Australia can be proud of its position in having the second highest per capita TOGAF® certification globally behind the Netherlands;
  9. While the topic of interoperability in government revealed many battle-scarred veterans convinced of the hopelessness of the cause, there remain an equal number of campaigners willing to tackle the challenge, and their free and frank exchange of views was entertaining enough to justify the price of a conference ticket;
  10. Unashamedly – Enterprise Architects remains in a league of its own in the concentration of strategy and architecture thought leadership in Australia – if not globally.

Mac Lemon is the Managing Director of Enterprise Architects Pty Ltd and is based in Melbourne, Australia.

This is an extract from Mac’s recent blog post on the Enterprise Architects web site which you can view here.

Comments Off

Filed under ArchiMate®, Business Architecture, Certifications, Conference, Enterprise Architecture, Enterprise Transformation, Professional Development, Security Architecture, TOGAF, TOGAF®

Connect at The Open Group Conference in Sydney (#ogSYD) via Social Media

By The Open Group Conference Team

By attending The Open Group’s conferences, attendees are able to learn from industry experts, understand the latest technologies and standards and discuss and debate current industry trends. One way to maximize the benefits is to make technology work for you. If you are attending The Open Group Conference in Sydney next week, we’ve put together a few tips on how to leverage social media to make networking at the conference easier, quicker and more effective.

Using Twitter at #ogSYD

Twitter is a real-time news-sharing tool that anyone can use. The official hashtag for the conference is #ogSYD. This enables anybody, whether they are physically attending the event or not, to follow what’s happening at The Open Group Conference in Sydney in real-time and interact with each other.

Before the conference, be sure to update your Twitter account to monitor #ogSYD and, of course, to tweet about the conference.

Using Facebook at The Open Group Conference in Sydney

You can also track what is happening at the conference on The Open Group Facebook page. We will be posting photos from conference events throughout the week. If you're willing to share your photos with us, we're happy to post them to our page with a photo credit. Please email your photos, captions, full name and organization to photo (at) opengroup.org!

LinkedIn during The Open Group Conference in Sydney

Motivated by one of the sessions? Interested in what your peers have to say? Start a discussion on The Open Group LinkedIn Group page. We’ll also be sharing interesting topics and questions related to The Open Group Conference as it is happening. If you’re not a member already, requesting membership is easy. Simply go to the group page and click the “Join Group” button. We’ll accept your request as soon as we can!

Blogging during The Open Group Conference in Sydney

Stay tuned for conference recaps here on The Open Group blog. In case you missed a session or you weren’t able to make it to Sydney, we’ll be posting the highlights and recaps on the blog. If you are attending the conference and would like to submit a recap of your own, please contact ukopengroup (at) hotwirepr.com.

If you have any questions about social media usage at the conference, feel free to tweet the conference team @theopengroup.

Comments Off

Filed under Conference

The Open Group Speakers Discuss Enterprise Architecture, Business Architecture and Enterprise Transformation

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: Expert Panel Explores Enterprise Architecture and Business Architecture as Enterprise Transformation Agents, or read the transcript here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership interview series, coming to you in conjunction with The Open Group Conference on April 15, in Sydney, Australia.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these business transformation discussions. The conference, The Open Group’s first in Australia, will focus on “How Does Enterprise Architecture Transform an Enterprise?” And there will be special attention devoted to how enterprise transformation impacts such vertical industries as finance and defense, as well as exploration, mining, and minerals. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

We’re here now with two of the main speakers at the conference — Hugh Evans, the Chief Executive Officer of Enterprise Architects, a specialist enterprise architecture (EA) firm based in Melbourne, Australia; and Craig Martin, Chief Operations Officer and Chief Architect at Enterprise Architects.

As some background, Hugh is both the founder and CEO at Enterprise Architects. His professional experience blends design and business, having started out in traditional architecture, computer games design, and digital media, before moving into enterprise IT and business transformation.
In 1999, Hugh founded the IT Strategy Architecture Forum, which included chief architects from most of the top 20 companies in Australia. He has also helped found the Australian Architecture Body of Knowledge and the London Architecture Leadership Forum in the UK.

Since starting Enterprise Architects in 2002, Hugh has grown the team to more than 100 people, with offices in Australia, the UK, and the U.S.
With a career spanning more than 20 years, Craig has held executive positions in the communications, high-tech, media, entertainment, and government markets, and has operated as an Enterprise Architect and Chief Consulting Architect.

In 2012, Craig became COO of Enterprise Architects to improve the global scalability of the organization, but he is also a key thought leader for strategy and architecture practices for all their clients and also across the EA field.

Craig has been a strong advocate of finding differentiation in businesses through identifying new mixes of business capabilities in those organizations. He advises that companies that do not optimize how they reassemble their capabilities will struggle, and he also believes that business decision making should be driven by economic lifecycles.

So welcome to you both. How are you doing?

Hugh Evans: Great, Dana. Good morning, and welcome, everyone.

Craig Martin: Thanks very much for having us.

Big-picture perspective

Gardner: I look forward to our talk. Let's look at this first from a big-picture perspective and then drill down into what you'll be getting into at the conference in a couple of weeks. What are some of the big problems that businesses are facing and need to solve, and how can architecture-level solutions really benefit them? I'll open this up to both Hugh and Craig.

Evans: Thanks very much, Dana. I’ll start with the trend in the industry around fast-paced change and disruptive innovation. You’ll find that many organizations, many industries, at the moment in the U.S., Australia, and around the world are struggling with the challenges of how to reinvent themselves with an increasing number of interesting and innovative business models coming through. For many organizations, this means that they need to wrap their arms around an understanding of their current business activities and what options they’ve got to leverage their strategic advantages.

We’re seeing business architecture as a tool for business model innovation, and on the other side, we’re also seeing business architecture as a tool that’s being used to better manage risk, compliance, security, and new technology trends around things like cloud, big data, and so on.

Martin: Yes, there is a strong drive within the industry to try and reduce complexity.  As organizations are growing, the business stakeholders are confronted with a large amount of information, especially within the architecture space. We’re seeing that they’re struggling with this complexity and have to make accurate and efficient business decisions on all this information.

What we are seeing, and based upon what Hugh has already discussed, is that some of those industry drivers are around disruptive business models. For example, we’re seeing it with the likes of higher education, the utility space, and financial services space, which are the dominant three.
There is a lot of change occurring in those spaces, and businesses are looking for ways to make them more agile to adapt to that change, and looking towards disciplined architecture and the business-architecture discipline to try and help them in that process.

Gardner: I think I know a bit about how we got here — computing, globalization, outsourcing, companies expanding across borders, the ability to enter new markets freely, and dealing with security, but also great opportunity. Did I miss anything? Is there anything about the past 10 or 15 years in business practices that have led now to this need for a greater emphasis on that strategic architectural level of thinking?

Martin: A lot has to do with basically building blocks. We’ve seen a journey that’s travelled within the architecture disciplines specifically. We call it the commodification of the business, and we’ve seen that maturity in the IT space. A lot of processes that used to be innovative in our business are now becoming fairly utility and core to the business. In any Tier 1 organization, a lot of the processes that used to differentiate them are now freely available in a number of vendor platforms, and any of their competitors can acquire those.

Looking for differentiation

So they are looking for that differentiation, the ability to be able to differentiate themselves from their competitors, and away from that sort of utility space. That’s a shift that’s beginning to occur. Because a lot of those IT aspects have become industrialized, that’s also moving up into the business space.

In other words, how can we now take complex mysteries in the business space and codify them? How can we create building blocks for them, so that organizations can work with those building blocks effectively and string them together in different ways to solve more complex business problems?

Evans: I’ll add to that Dana. EA is now around 30 years old, but the rise in EA has really come from the need for IT systems to interoperate and to create common standards and common understanding within an organization for how an IT estate is going to come together and deliver the right type of business value.

Through the ’90s we saw the proliferation of technologies as a result of the extension of distributed computing models and the emergence of the Internet. We’ve seen now the ubiquity of the Internet and technology across business. The same sort of concepts that ring true in technology architecture extend out into the business, around how the business interoperates with its components.

The need to change very fast for business, which is occurring now in the current economy, with the entrepreneurship and the innovation going on, is seeing this type of thinking come to the fore. This type of thinking enables organizations to change more rapidly. The architecture itself won’t make the organization change rapidly, but it will provide the appropriate references and enable people to have the right conversations to make that happen.

Gardner: So architecture can come as a benefit when the complexity kicks in. When you try to change an organization, you don’t get lost along the way. Give me a sense about what sort of paybacks your clients get when they do this correctly, and what happens when you don’t do this very well?

Evans: Business architecture, as well as strategic architecture, is still quite a nascent capability for organizations, and many organizations are really still trying to get a grip on it. The general rule is that organizations don’t manage this so well at the moment, but they are looking to improve in this area, because of the obvious, even intuitive, payoffs that come from being better organized.

You end up spending less money, because you’re a more efficient organization, and you end up delivering better value to customers, because you’re a more effective organization. That need for efficiency and effectiveness makes this area worth the investment.
The tangible benefits that we’re seeing across our customers include a reduced cost of the IT estate.

Meeting profiles

You have improved security and improved compliance, because organizations can see where their capabilities are meeting the various risk and compliance profiles, and you are also seeing organizations bring products to market quicker. The ability to move through the product management process, bring products to market more rapidly, and respond to customer need more rapidly puts organizations in front and makes them more competitive.

The sorts of industries we’re seeing acting in this area would include the postal industry, where they are moving from traditional mail to parcels as a result of the move toward online retailing. You’re also seeing it in the telco sector and in the banking and finance sector.
In the banking and finance sector, we’ve also seen a lot of this investment driven by the merger and acquisition (M&A) activity that’s come out of the financial crisis in various countries where we operate. These organizations are getting real value from understanding where the enterprise boundaries are, how they bring the business together, how they better integrate the organizations and acquisitions, and how they better divest.

Martin: We’re seeing, especially at the strategic level, that the architecture discipline is able to give business decision makers a view into different strategic scenarios. For example, where a number of environmental factors and market pressures would have been inputs into a discussion around how to change a business, we’re also seeing business decision makers getting a lot of value from running those scenarios through an actual hypothesis of the business model.

For example, they could be considering four or five different strategic scenarios, and what we are seeing is that, using the architecture discipline, it’s showing them effectively what those scenarios look like as they cascade through the business. It’s showing the impact on capabilities, on people, on approaches and technologies, and the impact on capital expenditures (CAPEX) and operational expenditures (OPEX). Those views of each strategic scenario allow them to pull the trigger on the better scenario to pursue, before they’ve invested all that effort and analysis only to discover it wasn’t the right decision in the first place. So that might be referred to as the strategic enablement piece.

We’re also seeing a lot of value for organizations within the portfolio space. We traditionally get questions like, “I have 180 projects out there. Am I doing the right things? Are those the right 180 projects, and are they going to help me achieve the types of CAPEX and OPEX reductions that I am looking for?”

With the architecture discipline, you don’t take a portfolio lens into what’s occurring within the business. You take an architectural lens, and you’re able to give executives an overview of exactly where the spend is occurring. You give them an overview of where the duplication is occurring, and where the loss of cohesion is occurring.

Common problems

A common problem we find when we go in to do these types of engagements is the amount of duplication occurring across a number of projects. In a worst-case scenario, 75 percent of the projects are all trying to do the same thing, on the same capability, with the same processes.
So there’s a reduction of complexity and of duplicated effort to be had across these organizations by bringing that work together into more synergistic streams.

We’re also seeing a lot of value occurring up at the customer experience space. That is really taking a strong look at this customer experience view, which is less around all of the underlying building blocks and capabilities of an organization and more around what sort of experiences we want to give our customers, what type of product offerings we must assemble, and what underlying building blocks of the organization must be assembled to enable those offerings and value propositions.

That sort of traceability through the cycle gives you a view of what levers you must pull to optimize your customer experience. Organizations are seeing a lot of value there and that’s basically increasing their effectiveness in the market and having a direct impact on their market share.
And one thing we see time and time again, regardless of the driver behind the investment in the architecture project, is the team interacting and building a coalition for action and for change. That’s the most impressive thing that we get to see.

Gardner: Let’s drill down a little bit into some of what you’ll be discussing at the conference in Sydney in April. One of the things that’s puzzling to me, when I go to these Open Group Conferences, is to better understand the relationship between business architecture and IT architecture and where they converge and where they differ. Perhaps you could offer some insights and maybe tease out what some discussion points for that would be at the conference.

Martin: That’s actually quite a hot topic. In general, the architecture discipline has grown out of the IT space, and that’s a good progression for it to take, because we’re seeing the fruits of that discipline in how IT components have been industrialized. We’re seeing the fruits of that in complex enterprise resource planning (ERP) systems, the modularization of those ERP systems, and their ability to be customized and adapted to businesses. It’s a fairly mature space, and the natural progression is to apply those same thinking patterns back up into the business space.

For this to work effectively, when somebody asks a question like that, we normally respond with a “depends” statement. We have in this organization a thing called the mandate curve, which relates to the mandate within the business. What is the organization looking to solve?

Are they looking to build an HR management system? Are they looking to gain efficiencies from an enterprise-wide ERP solution? Are they looking to reduce the value-chain losses they’re incurring on a monthly basis? Are they looking to improve customer experience across a group of companies? Or are they looking to improve shareholder value across the organization for an M&A, or maybe to reduce the cost-to-income ratio?

Problem spaces

Those are some of the problem spaces, and we often get into that mind space to ask, “Those are the problems that you are solving, but what mandate is given to architecture to solve them?” We often find that the mandate for the IT architecture space is sitting beneath the CIO, and the CIO tends to use business architecture as a communication tool with business. In other words, to understand business better, to begin to apply architecture rigor to the business process.

Evans: It’s interesting, Dana. I spent a lot of time last year in the UK, working with the team across a number of business-architecture requirements. We were building business-architecture teams. We were also delivering some projects where the initial investigation was a business-architecture piece, and we also ran some executive roundtables in the UK.

One thing that struck me in that investigation was the separation that existed between the business-architecture community and the traditional enterprise and technology architecture or IT architecture communities in the organizations we were dealing with.
One insurance company, in particular, that was building a business-architecture team was looking for people who didn’t necessarily have an architecture background but could potentially apply architectural thinking. They were looking for deep business-domain knowledge inside the various aspects of the insurance organization they were looking to cover.

So to your question about the relationship between business architecture and IT architecture, where they converge and how they differ, it’s our view that business architecture is a subset of the broader EA picture and that these are actually integrated and unified disciplines.
However, in practice you’ll often find quite a separation between these two groups. I think the major reason is that the drivers creating the investment for business architecture are now coming from outside of IT, and to some extent IT is replicating that investment to build the capability to engage with the business, so that it can have a more strategic discussion rather than just take orders from the business.

I think that over this year, we’re going to see more convergence between these two groups, and that’s certainly something that we are looking to foster in EA.

Gardner: I just came back from The Open Group Conference in California a few weeks ago, where the topic was focused largely on big data, but analysis was certainly a big part of that. Now, business analysis and business analysts, I suppose, are also part of this ecosystem. Are they subsets of the business architect? How do you see the role of business analysts now fitting into this, given the importance of data and the ability for organizations to manage data with new efficiency and scale?

Martin: Once again, that’s also a hot topic. There is a convergence occurring, and we see that across the landscape, when it comes to the number of frameworks and standards that people certify on. Ultimately, it comes to this knife-edge point, in which we need to interact with the business stakeholder and we need to elicit requirements from that stakeholder and be able to model them successfully.
The business-analysis community is slightly more mature in this particular space. They have, for example, the Business Analysis Body of Knowledge (BABOK). Within that space, they leverage a competency model, which in effect goes through a cycle from an entry-level BA right up to what they refer to as the generalist BA, which is where they see the start of the business-architecture role.

Career path

There’s a career path from a traditional business analyst role, which is around requirements elicitation and requirements management and tends to be quite project focused: dropping down into project environments, understanding stakeholder needs and requirements, modeling and documenting those, and helping the IT teams model the data flows and data structures, but with a specific link into the business space.

As you move up that curve, you get into the business-architecture space, which is a broader structural view around how all the building blocks fit together. In other words, it’s a far broader view than the one the traditional business analyst would take, and it looks across a number of different domains. The business architect tends to focus a lot on, as you mentioned, the information space, and we see a difference between the information space and the data space.

So the business architect is looking at performance, market-related aspects, and the customer and information domains, as well as the business processes and functional aspects of an organization. You can see that business analysts could almost be seen as the soldiers of these functions. In other words, they’re the people in the trenches seeing what’s working on a day-to-day basis. They’ve got a number of tools they’re equipped with, which the BABOK, for example, has given them, and they use all sorts of techniques to elicit requirements from various business stakeholders, until they move up that curve into the business-architecture and strategic-architecture space.

Evans: There’s an interesting pattern that I’ve noticed with the business-analyst-to-business-architect career journey and the traditional IT track, where you see a number of people move into solution architect roles. A solution architect might work on one project, then across multiple projects, and ultimately on a program, and a number of those people then pop out to a much broader enterprise view as they go through their careers.

The business analyst is, in many respects, tracking that journey: business analysts might focus on a single project and its requirements, then look across at a higher-level view, and possibly get to a point where they have a strong domain understanding that can drive high-level strategic discussions within the organization.

There is certainly a pattern emerging, and there are great opportunities for business analysts to come across into the architecture sphere. However, I believe that the broader EA discipline does need to make the effort to bridge that gap. Architecture needs to come across and find those connection points with the analyst community and help to elevate and converge the two sides.

Gardner: Craig, in your presentation at The Open Group Conference in Sydney, what do you hope to accomplish, and will this issue of how the business analyst fits in be prominent in that?

Martin: It’s a general theme that we’re running right up to the conference. We have a couple of webinars that deal specifically with this topic, leading up to the plenary talk at The Open Group Conference, which looks at how we can use the tools of the architecture discipline to achieve the types of outcomes we’ve spoken about here.

Building cohesion

In other words, how do I build cohesion in an organization? How do I look at different types of scenarios that I can execute against? What are the better ways to assemble all the efforts in my organization to achieve those outcomes? That’s taking us through a variety of examples that will be quite visual.

We’ll also be addressing the specific role of where we see the career path and the complementary nature of the business analyst and business architect, as they travel through the cycle of trying to operate at a strategic level and as a strategic enabler within the organization.

Gardner: Maybe you could also help me better understand something. When organizations decide that this is the right thing for them — as you mentioned earlier, this is still somewhat nascent — what are some good foundational considerations to get started? What needs to be put in place? Maybe it’s a mindset. How do you often find that enterprises get beyond the inertia and into this discussion about architecture and about the strategic benefits of it?

Martin: Once again, it’s a “depends” answer. For example, we often have two market segments. A Tier 1 company typically wants to build the capability itself, so there’s a journey we need to take it on around how to build a business-architecture capability while delivering the actual outcomes.

Tier 2 and Tier 3 clients often don’t necessarily want to build that type of capability, so we would focus directly on the outcomes. And those outcomes start with two views. Traditionally, we’re seeing this driven almost bottom-up, as the sponsors of these types of exercises try to gain credibility within the organization.

That relates to helping the clients build what we refer to as the utility of the business-architecture space. Our teams go in and, in effect, build a bunch of what we refer to as anchor models to try and get a consistent representation of the business and a consistent language occurring across the entire enterprise, not just within a specific project.

And that gives them a common language they can use to talk about, for example, common capabilities and the common outcomes they’re looking to achieve. In other words, it’s not just a bunch of building blocks, but the actual outcome of each of those building blocks and how it maps to something like a business-motivation model.

They also look within each of those building blocks to see the resources that create them: things like people, process, and tools. How do we mix those resources in the right way to achieve the types of outcomes the business is looking for? Normally, the first path we go through is to get that consistent language established within the organization.

As an organization matures, that artifact starts to lose its value. But because it has created a consistent language in the organization, you can now overlay a variety of different views to give business people insights. Ultimately, they don’t necessarily want all these models; they want insight into their organizations to enable them to make decisions.

We can overlay objectives, current project spend, CAPEX, and OPEX. We can overlay where duplication is occurring, where overspend is occurring, and where conflict is occurring at a global scale around duplicated effort. Questions about the impact on costs, reductions, and efficiencies can all be answered by merely overlaying a variety of views across this common language.
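Purely as an illustration of the overlay idea Martin describes, here is a minimal sketch: a shared capability "anchor model" with a spend view layered over it to surface duplication. All capability names, project names, and figures are invented for the example, not drawn from the discussion.

```python
# Illustrative sketch only: a tiny "anchor model + overlays" example.
# Capability and project names and dollar figures are hypothetical.

# The shared anchor model: a consistent list of business capabilities.
capabilities = ["customer onboarding", "claims processing", "billing"]

# One overlay view: which projects invest in which capability ($k CAPEX).
project_spend = {
    "customer onboarding": [("Project A", 400), ("Project B", 350)],
    "claims processing": [("Project C", 900)],
    "billing": [],
}

# Insight 1: duplication -- capabilities funded by more than one project.
duplicated = [c for c in capabilities if len(project_spend[c]) > 1]

# Insight 2: total spend per capability, using the same common language.
totals = {c: sum(amount for _, amount in project_spend[c]) for c in capabilities}

print("Duplicated effort on:", duplicated)
print("Spend per capability ($k):", totals)
```

The point of the sketch is that once every view is keyed to the same capability list, any number of overlays (objectives, CAPEX, OPEX, risk) can be compared without re-mapping the business each time.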

Elevating the value

That starts to elevate the value of these types of artifacts, and we start to see our business sponsors walking into meetings with these overlays in hand, having conversations with their colleagues specifically around the insights drawn from these artifacts. We want the architecture to tell the story, not lengthy PowerPoint presentations; as people look at these artifacts, they see the insights that come directly from them.

The third and final part is often around the business reaching a level of maturity where they’re starting to use these types of artifacts and are looking for different ways they can now mix and assemble them. That’s normally a sign of a mature organization and a mature business-architecture practice.

They have the building blocks. They’ve seen the value and the types of insights these can provide. Are there different ways I can string together my capabilities to achieve different outcomes? Maybe I have different critical success factors that I’m looking to achieve. Maybe there are new shifts or new pressures coming in from the environment. How can I assemble the underlying structures of my organization to better cope with them? That’s the third phase we take customers through, once they get to that level of maturity.

Evans: Just to add to that, Dana, I agree with Craig on the point that if you show the business what can actually be delivered, such as views on a page that elicit the right types of discussions and demonstrate the issues, then when they see what they’re going to get, their eyes typically light up and they say, “I want one of those things.”

The thing with architecture that I’ve noticed over the years is that architecture is done by a lot of very intelligent people, who have great insights and great understanding, but it’s not enough just to know the answer. You have to know how to engage somebody with the material. So when the architecture content coming through is engaging, clear, understandable, and consumable by a variety of stakeholders, they go, “That’s what I want. I want one of those.”

So my advice to somebody who is going down this path is that if they want to get support and sponsorship for this sort of thing, make sure they get some good examples of what gets delivered when it’s done well, as that’s a great way to actually get people behind it.

Gardner: I’m afraid we will have to leave it there. We’ve been talking with Hugh Evans, the CEO of Enterprise Architects, a specialist EA firm in Melbourne; and Craig Martin, the COO and Chief Architect at Enterprise Architects. Thanks to you both.

Evans: Thanks very much, Dana. It has been a pleasure.

Martin: Thank you, Dana.

Gardner: This BriefingsDirect discussion comes to you in conjunction with The Open Group Conference, the first in Australia, on April 15 in Sydney. The focus will be on “How Does Enterprise Architecture Transform an Enterprise?”

So thanks again to both Hugh and Craig, and I know they will be joined by many more thought leaders and speakers on the EA subject and other architecture issues at the conference, and I certainly encourage our readers and listeners to attend that conference, if they’re in the Asia-Pacific region.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator through these thought leadership interviews. Thanks again for listening, and come back next time.

Filed under ArchiMate®, Business Architecture, Conference, Enterprise Architecture, Professional Development, TOGAF®

The Open Group Conference in Sydney Plenary Sessions Preview

By The Open Group Conference Team

Taking place April 15-18, 2013, The Open Group Conference in Sydney will bring together industry experts to discuss the evolving role of Enterprise Architecture and how it transforms the enterprise. As the conference quickly approaches, let’s take a deeper look into the plenary sessions that kick-off day one and two. And if you haven’t already, register for The Open Group Conference in Sydney today!

Enterprise Transformation and the Role of Open Standards

By Allen Brown, President & CEO, The Open Group

Enterprise transformation seems to be gathering momentum within the Enterprise Architecture community. The term “enterprise transformation” suggests the process of fundamentally changing an enterprise. Sometimes the transformation is dramatic, but for most organizations it is a steady process. Allen will kick off the conference by discussing how to set expectations, the planning process for enterprise transformation and the role of standards, and provide an overview of ongoing projects by The Open Group’s members.

TOGAF® as a Powerful Tool to Kick Start Business Transformation

By Peter Haviland, Chief Business Architect, and Martin Keywood, Partner, Ernst & Young

Business transformation is a tricky beast. It requires many people to work together toward a singular vision, and even more people to be aligned to an often multi-year execution program throughout which personal and organizational priorities will change. As a firm with considerable Business Architecture and transformation experience, Ernst & Young (EY) deploys multi-disciplinary teams of functional and technical experts and uses a number of approaches, anchored on the TOGAF® framework, to address these issues. This is necessary to get a handle on the complexity inherent in today’s business environment, so that stakeholders are aligned and remain actively engaged, past investments in both processes and systems can be maximized, and transformation programs are set up for success and can be driven with sustained momentum.

In this session Peter and Martin will take us through EY’s Transformation Design approach – an approach that, within 12 weeks, can define a transformation vision, get executives on board, create a high-level multi-domain architecture, broadly outline transformation alternatives, and provide initial estimates of the work packages necessary to achieve transformation. They will also share case studies and metrics from applying the approach in the financial services, oil and gas, and professional services sectors. The session should interest executives looking to increase buy-in among their peers, as well as professionals charged with stakeholder engagement and alignment. It will also show how to use the TOGAF framework in this situation.

Building a More Cohesive Organization Using Business Architecture

By Craig Martin, COO & Chief Architect, Enterprise Architects

In shifting the focus away from Enterprise Architecture being seen purely as an IT discipline, organizations are beginning to formalize the development of Business Architecture practices and outcomes. The Open Group has differentiated between business, IT and enterprise architects through various working groups and certification tracks. However, industry at present is grappling to understand where the discipline of Business Architecture resides in the business and what value it can provide separate from the traditional project-based business analysis focus.

Craig will provide an overview of some of the critical questions being asked by businesses and how these are addressed through Business Architecture. Using both method and case study examples, he will show an approach to building more cohesion across the business landscape. Craig will focus on the use of business motivation models, strategic scenario planning and capability-based planning techniques to provide input into the strategic planning process.

Other plenary speakers include:

  • Capability Based Strategic Planning in Transforming a Mining Environment by David David, EA Manager, Rio Tinto
  • Development of the National Broadband Network IT Architecture – A Greenfield Telco Transformation by Roger Venning, Chief IT Architect, NBN Co. Ltd
  • Business Architecture in Finance Panel moderated by Chris Forde, VP Enterprise Architecture, The Open Group

More details about the conference can be found here: http://www.opengroup.org/sydney2013

Filed under Conference

3 Steps to Proactively Address Board-Level Security Concerns

By E.G. Nadhan, HP

Last month, I shared the discussions that ensued in a Tweet Jam conducted by The Open Group on Big Data and Security, where the key takeaway was: protecting data is good; protecting information generated from Big Data is priceless. Security concerns around Big Data have grown to the point that they are now a board-level concern, as explained in this article in ComputerWorldUK. Enterprises must address board-level concerns proactively, and to do so they must provide the business justification for the proactive steps required.


At The Open Group Conference in Sydney in April, the session “Which information risks are shaping our lives?” by Stephen Singam, Chief Technology Officer, HP Enterprise Security Services, Australia, will provide great insight into this topic. In this session, Singam analyzes the current and emerging information risks while recommending a proactive approach to address them head-on with adversary-centric solutions.

The 3 steps that enterprises must take to proactively address these security concerns are:

Computing the cost of cyber-crime

The HP Ponemon 2012 Cost of Cyber Crime Study revealed that cyber attacks have more than doubled over a three-year period, with the financial impact increasing by nearly 40 percent. Here are the key takeaways from this research:

  • Cyber-crimes continue to be costly. The average annualized cost of cyber-crime for 56 organizations is $8.9 million per year, with a range of $1.4 million to $46 million.
  • Cyber attacks have become common occurrences. Companies experienced 102 successful attacks per week and 1.8 successful attacks per company per week in 2012.
  • The most costly cyber-crimes are those caused by denial of service, malicious insiders and web-based attacks.

When computing the cost of cyber-crime, enterprises must address direct, indirect and opportunity costs that result from the loss or theft of information, disruption to business operations, revenue loss and destruction of property, plant and equipment. The following phases of combating cyber-crime must also be factored in to comprehensively determine the total cost:

  1. Detection of patterns of behavior indicating an impending attack through sustained monitoring of the enabling infrastructure
  2. Investigation of the security violation upon occurrence to determine the underlying root cause and take appropriate remedial measures
  3. Incident response to address the immediate situation at hand, communicate the incidence of the attack, and raise all applicable alerts
  4. Containment of the attack by controlling its proliferation across the enterprise
  5. Recovery from the damages incurred as a result of the attack to ensure ongoing business operations based upon the business continuity plans in place
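As a rough illustration of how the cost components and response phases above might be rolled up into a total, here is a minimal sketch. All category names and dollar figures are hypothetical placeholders, not figures from the Ponemon study; a real model would draw on an organization's own incident records.

```python
# A minimal sketch of a cyber-crime cost model with invented figures.

# Direct, indirect, and opportunity costs (annualized, USD).
cost_components = {
    "information_loss_or_theft": 1_200_000,
    "business_disruption": 800_000,
    "revenue_loss": 600_000,
    "property_damage": 150_000,
}

# Per-phase response costs for the five phases listed above.
phase_costs = {
    "detection": 300_000,
    "investigation": 250_000,
    "incident_response": 200_000,
    "containment": 180_000,
    "recovery": 400_000,
}

# Total cost = loss-related components plus the cost of every response phase.
total_cost = sum(cost_components.values()) + sum(phase_costs.values())
print(f"Estimated annualized cost of cyber-crime: ${total_cost:,}")
```

The structure, not the numbers, is the point: omitting any phase (say, containment) understates the true total, which is why the post insists all five be factored in.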

Identifying proactive steps that can be taken to address cyber-crime

  1. “Better get security right,” says HP Security Strategist Mary Ann Mezzapelle in her keynote on Big Data and Security at The Open Group Conference in Newport Beach. Asserting that proactive risk management is the most effective approach, Mezzapelle challenged enterprises to proactively question the presence of shadow IT, data ownership, and the usage of security tools and standards, while taking a comprehensive, end-to-end approach to security within the enterprise.
  2. In this ZDNet article, Art Gilliland suggested learning from cyber criminals and understanding their methods, since the very frameworks enterprises strive to comply with (such as ISO and PCI) set a low bar for security that adversaries capitalize on.
  3. Andy Ellis discussed managing risk with psychology instead of brute force in his keynote at the 2013 RSA Conference.
  4. At the same conference, in another keynote, world-renowned game designer and inventor of SuperBetter, Jane McGonigal, suggested that applying the “collective intelligence” that gaming generates can combat security concerns.
  5. In this interview, Bruce Schneier, renowned security guru and author of several books including Liars & Outliers, suggested, “Bad guys are going to invent new stuff — whether we want them to or not.” Should we take a cue from Hollywood and consider the inception of an OODA loop into the security hacker’s mind?

The Balancing Act

Can enterprises afford to take such proactive steps? Or more importantly, can they afford not to?

Enterprises must define their risk management strategy and determine the proactive steps that best align with their business objectives and information security standards. This will enable organizations to better assess the cost of executing such measures. While the actual cost will vary by enterprise, inaction is not an acceptable alternative. Like all other critical corporate initiatives, these proactive measures must receive the board-level attention they deserve.

Enterprises must balance the cost of executing such proactive measures against the potential cost of data loss and reputational harm. This will ensure that the right proactive measures are taken with executive support.
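One common way to frame that balance is the annualized loss expectancy (ALE) formula from classical risk analysis: ALE = single loss expectancy (SLE) times annual rate of occurrence (ARO). The sketch below is illustrative only; the dollar figures and occurrence rates are assumptions, not benchmarks from any study.

```python
# A hedged sketch of weighing a proactive control against expected loss,
# using the classical ALE formula. All figures are invented assumptions.

def annualized_loss_expectancy(sle: float, aro: float) -> float:
    """Expected annual loss: single loss expectancy x annual rate of occurrence."""
    return sle * aro

# Expected annual loss without the proactive measure in place.
ale_without = annualized_loss_expectancy(sle=2_000_000, aro=0.9)

# Residual expected loss if the measure cuts incident frequency sharply.
ale_with = annualized_loss_expectancy(sle=2_000_000, aro=0.2)

control_cost = 500_000  # assumed annual cost of the proactive measure
net_benefit = (ale_without - ale_with) - control_cost

print(f"Risk reduction: ${ale_without - ale_with:,.0f}")
print(f"Control cost:   ${control_cost:,}")
print(f"Net benefit:    ${net_benefit:,.0f}")
```

Under these assumed numbers the measure pays for itself; the same arithmetic can just as easily show a control that costs more than the risk it removes, which is exactly the board-level trade-off described above.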

How about you?  Has your enterprise taken steps to assess the cost of cyber-crime?  Have you considered various proactive steps to combat cyber-crime?  Share your thoughts with me in the comments section below.

HP Distinguished Technologist E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair of The Open Group SOCCI project and the founding co-chair of The Open Group Cloud Computing Governance project. Twitter handle: @NadhanAtHP.

Filed under Conference

Join us for The Open Group Conference in Sydney – April 15-18

By The Open Group Conference Team

The Open Group is busy gearing up for the Sydney conference, which will take place on April 15-18, 2013. With over 2,000 Association of Enterprise Architects (AEA) members in Australia, Sydney is an ideal setting for industry experts from around the world to gather and discuss the evolution of Enterprise Architecture and its role in transforming the enterprise. Be sure to register today!

The conference offers roughly 60 sessions on a variety of topics, including:

  • Cloud infrastructure as an enabler of innovation in enterprises
  • Simplifying data integration in the government and defense sectors
  • Merger transformation with TOGAF® framework and ArchiMate® modeling language
  • Measuring and managing cybersecurity risks
  • Pragmatic IT road-mapping with ArchiMate modeling language
  • The value of Enterprise Architecture certification within a professional development framework

Plenary speakers will include:

  • Allen Brown, President & CEO, The Open Group
  • Peter Haviland, Chief Business Architect, with Martin Keywood, Partner, Ernst & Young
  • David David, EA Manager, Rio Tinto
  • Roger Venning, Chief IT Architect, NBN Co. Ltd
  • Craig Martin, COO & Chief Architect, Enterprise Architects
  • Chris Forde, VP Enterprise Architecture, The Open Group

The full conference agenda is available here. Tracks include:

  • Finance & Commerce
  • Government & Defense
  • Energy & Natural Resources

And topics of discussion include, but are not limited to:

  • Cloud
  • Business Transformation
  • Enterprise Architecture
  • Technology & Innovation
  • Data Integration/Information Sharing
  • Governance & Security
  • Architecture Reference Models
  • Strategic Planning
  • Distributed Services Architecture

Upcoming Conference Submission Deadlines

Would you like a chance to speak at an Open Group conference? There are upcoming deadlines for speaker proposal submissions for the conferences in Philadelphia and London. To submit a proposal to speak, click here.

Venue                     | Industry Focus                            | Submission Deadline
Philadelphia (July 15-17) | Healthcare, Finance, Government & Defense | April 5, 2013
London (October 21-23)    | Finance, Government, Healthcare           | July 8, 2013

 

The agendas for Philadelphia and London are filling up fast, so it is important to submit proposals as early as possible. Proposals received after the deadline dates will still be considered, space permitting; if not, proposals may be carried over to a future conference. Priority will be given to proposals received by the deadline dates and to proposals that include an end-user organization, at least as a co-presenter.

Comments Off

Filed under Conference

Beyond Big Data

By Chris Harding, The Open Group

The big bang that started The Open Group Conference in Newport Beach was, appropriately, a presentation related to astronomy. Chris Gerty gave a keynote on Big Data at NASA, where he is Deputy Program Manager of the Open Innovation Program. He told us how visualizing deep space and its celestial bodies created understanding and enabled new discoveries. Everyone who attended felt inspired to explore the universe of Big Data during the rest of the conference. And that exploration – as is often the case with successful space missions – left us wondering what lies beyond.

The Big Data Conference Plenary

The second presentation on that Monday morning brought us down from the stars to the nuts and bolts of engineering. Mechanical devices require regular maintenance to keep functioning. Processing the mass of data generated during their operation can improve safety and cut costs. For example, airlines can overhaul aircraft engines when the data shows it is needed, rather than on a fixed schedule that has to be frequent enough to prevent damage under most conditions, but might still fail to anticipate failure in unusual circumstances. David Potter and Ron Schuldt lead two of The Open Group’s initiatives, Quantum Lifecycle Management (QLM) and the Universal Data Element Framework (UDEF). They explained how a semantic approach to product lifecycle management can facilitate the big-data processing needed to achieve this aim.
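The condition-based maintenance idea can be sketched very simply: trigger an overhaul when sensor data crosses a degradation threshold, rather than on a fixed calendar. Here is a minimal, hypothetical Python sketch — the readings, window size, and threshold are invented for illustration, not taken from the QLM work:

```python
# Hedged sketch of condition-based maintenance: flag an engine for
# overhaul when measured degradation crosses a threshold, instead of
# overhauling on a fixed schedule. All values below are invented.

def needs_overhaul(vibration_readings, threshold=0.8):
    """Flag an engine for overhaul when the rolling average of its
    last three vibration samples exceeds the allowed threshold."""
    window = vibration_readings[-3:]       # last three sensor samples
    rolling_avg = sum(window) / len(window)
    return rolling_avg > threshold

healthy = [0.31, 0.29, 0.33, 0.30]
degrading = [0.40, 0.72, 0.85, 0.95]

print(needs_overhaul(healthy))     # False: keep flying
print(needs_overhaul(degrading))   # True: schedule maintenance now
```

The real payoff of the semantic approach described in the talk is that readings like these can be interpreted consistently across the many systems that produce them.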

Chris Gerty was then joined by Andras Szakal, vice-president and chief technology officer at IBM US Federal IMT, Robert Weisman, chief executive officer of Build The Vision, and Jim Hietala, vice-president of Security at The Open Group, in a panel session on Big Data that was moderated by Dana Gardner of Interarbor Solutions. As always, Dana facilitated a fascinating discussion. Key points made by the panelists included: the trend to monetize data; the need to ensure veracity and usefulness; the need for security and privacy; the expectation that data warehouse technology will exist and evolve in parallel with map/reduce “on-the-fly” analysis; the importance of meaningful presentation of the data; integration with cloud and mobile technology; and the new ways in which Big Data can be used to deliver business value.

More on Big Data

In the afternoons of Monday and Tuesday, and on most of Wednesday, the conference split into streams. These have presentations that are more technical than the plenary, going deeper into their subjects. It’s a pity that you can’t be in all the streams at once. (At one point I couldn’t be in any of them, as there was an important side meeting to discuss the UDEF, which is in one of the areas that I support as forum director). Fortunately, there were a few great stream presentations that I did manage to get to.

On the Monday afternoon, Tom Plunkett and Janet Mostow of Oracle presented a reference architecture that combined Hadoop and NoSQL with traditional RDBMS, streaming, and complex event processing, to enable Big Data analysis. One application that they described was to trace the relations between particular genes and cancer. This could have big benefits in disease prediction and treatment. Another was to predict the movements of protesters at a demonstration through analysis of communications on social media. The police could then concentrate their forces in the right place at the right time.

Jason Bloomberg, president of Zapthink – now part of Dovel – is always thought-provoking. His presentation featured the need for governance vitality to cope with ever changing tools to handle Big Data of ever increasing size, “crowdsourcing” to channel the efforts of many people into solving a problem, and business transformation that is continuous rather than a one-time step from “as is” to “to be.”

Later in the week, I moderated a discussion on Architecting for Big Data in the Cloud. We had a well-balanced panel made up of TJ Virdi of Boeing, Mark Skilton of Capgemini and Tom Plunkett of Oracle. They made some excellent points. Big Data analysis provides business value by enabling better understanding, leading to better decisions. The analysis is often an iterative process, with new questions emerging as answers are found. There is no single application that does this analysis and provides the visualization needed for understanding, but there are a number of products that can be used to assist. The role of the data scientist in formulating the questions and configuring the visualization is critical. Reference models for the technology are emerging but there are as yet no commonly-accepted standards.

The New Enterprise Platform

Jogging is a great way of taking exercise at conferences, and I was able to go for a run most mornings before the meetings started at Newport Beach. Pacific Coast Highway isn’t the most interesting of tracks, but on Tuesday morning I was soon up in Castaways Park, pleasantly jogging through the carefully-nurtured natural coastal vegetation, with views over the ocean and its margin of high-priced homes, slipways, and yachts. I reflected as I ran that we had heard some interesting things about Big Data, but it is now an established topic. There must be something new coming over the horizon.

The answer to what this might be was suggested in the first presentation of that day’s plenary. Mary Ann Mezzapelle, security strategist for HP Enterprise Services, talked about the need to get security right for Big Data and the Cloud. But her scope was actually wider. She spoke of the need to secure the “third platform” – the term coined by IDC to describe the convergence of social, cloud and mobile computing with Big Data.

Securing Big Data

Mary Ann’s keynote was not about the third platform itself, but about what should be done to protect it. The new platform brings with it a new set of security threats, and the increasing scale of operation makes it increasingly important to get the security right. Mary Ann presented a thoughtful analysis founded on a risk-based approach.

She was followed by Adrian Lane, chief technology officer at Securosis, who pointed out that Big Data processing using NoSQL has a different architecture from traditional relational data processing, and requires different security solutions. This does not necessarily mean new techniques; existing techniques can be used in new ways. For example, Kerberos may be used to secure inter-node communications in map/reduce processing. Adrian’s presentation completed the Tuesday plenary sessions.

Service Oriented Architecture

The streams continued after the plenary. I went to the Distributed Services Architecture stream, which focused on SOA.

Bill Poole, enterprise architect at JourneyOne in Australia, described how to use the graphical architecture modeling language ArchiMate® to model service-oriented architectures. He illustrated this using a case study of a global mining organization that wanted to consolidate its two existing bespoke inventory management applications into a single commercial off-the-shelf application. It’s amazing how a real-world case study can make a topic come to life, and the audience certainly responded warmly to Bill’s excellent presentation.

Ali Arsanjani, chief technology officer for Business Performance and Service Optimization, and Heather Kreger, chief technology officer for International Standards, both at IBM, described the range of SOA standards published by The Open Group and available for use by enterprise architects. Ali was one of the brains that developed the SOA Reference Architecture, and Heather is a key player in international standards activities for SOA, where she has helped The Open Group’s Service Integration Maturity Model and SOA Governance Framework to become international standards, and is working on an international standard SOA reference architecture.

Cloud Computing

To start Wednesday’s Cloud Computing streams, TJ Virdi, senior enterprise architect at The Boeing Company, discussed use of TOGAF® to develop an Enterprise Architecture for a Cloud ecosystem. A large enterprise such as Boeing may use many Cloud service providers, enabling collaboration between corporate departments, partners, and regulators in a complex ecosystem. Architecting for this is a major challenge, and The Open Group’s TOGAF for Cloud Ecosystems project is working to provide guidance.

Stuart Boardman of KPN gave a different perspective on Cloud ecosystems, with a case study from the energy industry. An ecosystem may not necessarily be governed by a single entity, and the participants may not always be aware of each other. Energy generation and consumption in the Netherlands is part of a complex international ecosystem involving producers, consumers, transporters, and traders of many kinds. A participant may be involved in several ecosystems in several ways: a farmer for example, might consume energy, have wind turbines to produce it, and also participate in food production and transport ecosystems.

Penelope Gordon of 1-Plug Corporation explained how choice and use of business metrics can impact Cloud service providers. She worked through four examples: a start-up Software-as-a-Service provider requiring investment, an established company thinking of providing its products as cloud services, an IT department planning to offer an in-house private Cloud platform, and a government agency seeking budget for government Cloud.

Mark Skilton, director at Capgemini in the UK, gave a presentation titled “Digital Transformation and the Role of Cloud Computing.” He covered a very broad canvas of business transformation driven by technological change, and illustrated his theme with a case study from the pharmaceutical industry. New technology enables new business models, giving competitive advantage. Increasingly, the introduction of this technology is driven by the business, rather than the IT side of the enterprise, and it has major challenges for both sides. But what new technologies are in question? Mark’s presentation had Cloud in the title, but also featured social and mobile computing, and Big Data.

The New Trend

On Thursday morning I took a longer run, to and round Balboa Island. With only one road in or out, its main street of shops and restaurants is not a through route and the island has the feel of a real village. The SOA Work Group Steering Committee had found an excellent, and reasonably priced, Italian restaurant there the previous evening. There is a clear resurgence of interest in SOA, partly driven by the use of service orientation – the principle, rather than particular protocols – in Cloud Computing and other new technologies. That morning I took the track round the shoreline, and was reminded a little of Dylan Thomas’s “fishingboat-bobbing sea.” Fishing here is for leisure rather than livelihood, but I suspected that the fishermen, like those of Thomas’s little Welsh village, spend more time in the bar than on the water.

I thought about how the conference sessions had indicated an emerging trend. This is not a new technology but the combination of four current technologies to create a new platform for enterprise IT: Social, Cloud, and Mobile computing, and Big Data. Mary Ann Mezzapelle’s presentation had referenced IDC’s “third platform.” Other discussions had mentioned Gartner’s “Nexus of forces,” the combination of Social, Cloud and Mobile computing with information that Gartner says is transforming the way people and businesses relate to technology, and will become a key differentiator of business and technology management. Mark Skilton had included these same four technologies in his presentation. Great minds, and analyst corporations, think alike!

I thought also about the examples and case studies in the stream presentations. Areas as diverse as healthcare, manufacturing, energy and policing are using the new technologies. Clearly, they can deliver major business benefits. The challenge for enterprise architects is to maximize those benefits through pragmatic architectures.

Emerging Standards

On the way back to the hotel, I remarked again on what I had noticed before, how beautifully neat and carefully maintained the front gardens bordering the sidewalk are. I almost felt that I was running through a public botanical garden. Is there some ordinance requiring people to keep their gardens tidy, with severe penalties for anyone who leaves a lawn or hedge unclipped? Is a miserable defaulter fitted with a ball and chain, not to be removed until the untidy vegetation has been properly trimmed, with nail clippers? Apparently not. People here keep their gardens tidy because they want to. The best standards are like that: universally followed, without use or threat of sanction.

Standards are an issue for the new enterprise platform. Apart from the underlying standards of the Internet, there really aren’t any. The area isn’t even mapped out. Vendors of Social, Cloud, Mobile, and Big Data products and services are trying to stake out as much valuable real estate as they can. They have no interest yet in boundaries with neatly-clipped hedges.

This is a stage that every new technology goes through. Then, as it matures, the vendors understand that their products and services have much more value when they conform to standards, just as properties have more value in an area where everything is neat and well-maintained.

It may be too soon to define those standards for the new enterprise platform, but it is certainly time to start mapping out the area, to understand its subdivisions and how they inter-relate, and to prepare the way for standards. Following the conference, The Open Group has announced a new Forum, provisionally titled Open Platform 3.0, to do just that.

The SOA and Cloud Work Groups

Thursday was my final day of meetings at the conference. The plenary and streams presentations were done. This day was for working meetings of the SOA and Cloud Work Groups. I also had an informal discussion with Ron Schuldt about a new approach for the UDEF, following up on the earlier UDEF side meeting. The conference hallways, as well as the meeting rooms, often see productive business done.

The SOA Work Group discussed a certification program for SOA professionals, and an update to the SOA Reference Architecture. The Open Group is working with ISO and the IEEE to define a standard SOA reference architecture that will have consensus across all three bodies.

The Cloud Work Group had met earlier to further the TOGAF for Cloud ecosystems project. Now it worked on its forthcoming white paper on business performance metrics. It also – though this was not on the original agenda – discussed Gartner’s Nexus of Forces, and the future role of the Work Group in mapping out the new enterprise platform.

Mapping the New Enterprise Platform

At the start of the conference we looked at how to map the stars. Big Data analytics enables people to visualize the universe in new ways, reach new understandings of what is in it and how it works, and point to new areas for future exploration.

As the conference progressed, we found that Big Data is part of a convergence of forces. Social, mobile, and Cloud Computing are being combined with Big Data to form a new enterprise platform. The development of this platform, and its roll-out to support innovative applications that deliver more business value, is what lies beyond Big Data.

At the end of the conference we were thinking about mapping the new enterprise platform. This will not require sophisticated data processing and analysis. It will take discussions to create a common understanding, and detailed committee work to draft the guidelines and standards. This work will be done by The Open Group’s new Open Platform 3.0 Forum.

The next Open Group conference is in the week of April 15, in Sydney, Australia. I’m told that there’s some great jogging there. More importantly, we’ll be reflecting on progress in mapping Open Platform 3.0, and thinking about what lies ahead. I’m looking forward to it already.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.

2 Comments

Filed under Conference

The Open Group Panel Explores How the Big Data Era Now Challenges the IT Status Quo

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: The Open Group panel explores how the Big Data era now challenges the IT status quo, or view the on-demand video recording on this discussion here: http://new.livestream.com/opengroup/events/1838807.

We recently assembled a panel of experts to explore how Big Data changes the status quo for architecting the enterprise. The bottom line from the discussion is that large enterprises should not just wade into Big Data as an isolated function, but should anticipate the strategic effects and impacts of Big Data — as well as the simultaneous complicating factors of Cloud Computing and mobile — as soon as possible.

The panel consisted of Robert Weisman, CEO and Chief Enterprise Architect at Build The Vision; Andras Szakal, Vice President and CTO of IBM’s Federal Division; Jim Hietala, Vice President for Security at The Open Group, and Chris Gerty, Deputy Program Manager at the Open Innovation Program at NASA. I served as the moderator.

And this special thought leadership interview series comes to you in conjunction with The Open Group Conference recently held in Newport Beach, California. The conference focused on “Big Data – the transformation we need to embrace today.”

Threaded factors

An interesting thread for me throughout the conference was to factor where Big Data begins and plain old data, if you will, ends. Of course, it’s going to vary quite a bit from organization to organization.

But Gerty from NASA, part of our panel, provided a good example: It’s when you run out of gas with your old data methods, and your ability to deal with the data — and it’s not just the size of the data itself.

Therefore, Big Data means doing things differently — not just managing the velocity, volume, and variety of the data, but thinking about data fundamentally differently. And we need to think about security, risk and governance. If it’s a “boundaryless organization” when it comes to your data — as a product, a service, or a resource — then which data should be exposed, which should be opened, and which should be closely guarded all need to be factored, determined and implemented.

Here are some excerpts from the on-stage discussion:

Dana Gardner: You mentioned that Big Data to you is not a factor of the size, because NASA’s dealing with so much. It’s when you run out of steam, as it were, with the methodologies. Maybe you could explain more. When do you know that you’ve actually run out of steam with the methodologies?

Gerty: When we collect data, we have some sort of goal in mind of what we might get out of it. When we put the pieces from the data together, either it doesn’t fit as well as you thought, or you are successful and you continue to do the same thing, gathering archives of information.

At that point, when you realize there might even be something else that you want to do with the data, different from what you planned originally, that’s when you have to pivot a little bit and say, “Now I need to treat this as a living archive. It may live beyond me.” At that point, I think you treat it as setting up the infrastructure for being used later, whether by you or someone else. That’s an important transition to make, and might be what one could define as Big Data.

Gardner: Andras, does that square with where you are in your government interactions — that data now becomes a different type of resource, and that you need to know when to do things differently?

Szakal: The importance of data hasn’t changed. The data itself, the veracity of the data, is still important. Transactional data will always need to exist. The difference is that you certainly have the three or four Vs, depending on how you look at it, but the importance of data is in its veracity, and in your ability to understand or use that data before its shelf life runs out.

Some data has a shelf life that’s long lived. Other data has very little shelf life, and you would use different approaches to utilize that information. It’s ultimately not about the data itself, but about gaining deep insight into that data. So it’s not storing data or manipulating data, but applying analytical capabilities to it.

Gardner: Bob, we’ve seen the price points on storage go down so dramatically. We’ve seen people decide to hold on to data that they wouldn’t have before, simply because they can, and they can afford to do so. That means we need to try to extract value from that data. From the perspective of an enterprise architect, how are things different now, vis-à-vis this much larger set and variety of data, when it comes to planning and executing as architects?

Weisman: One of the major issues is that organizations are normally holding two orders of magnitude more data than they need. It’s a huge overhead, both in terms of the applications architecture, which has a code base larger than it should be, and the technology architecture, which is supporting a horrendous number of servers and a whole lot of technology that they don’t need.

The issue for the architect is to figure out what data is useful, institute a governance process so that you can have data lifecycle management and proper disposition, focus the organization on the information, data and knowledge that will provide business value, and help them innovate and gain a competitive advantage.
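Weisman’s point about lifecycle management and disposition can be sketched as a simple retention policy: classify each data set, then retain or dispose of it according to its class. A hypothetical Python sketch — the class names and retention periods are invented, not drawn from any real governance framework:

```python
from datetime import date, timedelta

# Hedged sketch of a data-disposition policy: each record class gets a
# retention period; anything older is flagged for disposal. The classes
# and periods below are hypothetical examples.
RETENTION = {
    "transactional":  timedelta(days=7 * 365),  # kept long for audit
    "clickstream":    timedelta(days=90),       # short shelf life
    "derived_report": timedelta(days=365),
}

def disposition(record_class, created, today):
    """Return 'retain' or 'dispose' for one data set."""
    age = today - created
    return "dispose" if age > RETENTION[record_class] else "retain"

today = date(2013, 2, 1)
print(disposition("clickstream", date(2012, 1, 1), today))    # dispose
print(disposition("transactional", date(2012, 1, 1), today))  # retain
```

The point is not the mechanics but that the policy is explicit: once each class has a stated shelf life, the two-orders-of-magnitude overhead becomes something the architecture can actually shed.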

Can’t afford it

And in terms of government, just improve service delivery, because there’s waste right now on information infrastructure, and we can’t afford it anymore.

Gardner: So it’s difficult to know what to keep and what not to keep. I’ve actually spoken to a few people lately who want to keep everything, just because they want to mine it, and they are willing to spend the money and effort to do that.

Jim Hietala, when people do get to this point of trying to decide what to keep, what not to keep, and how to architect properly for that, they also need to factor in security. It shouldn’t come later in the process; it should come early. What are some of the precepts that you think are important in applying good security practices to Big Data?

Hietala: One of the big challenges is that many of the big-data platforms weren’t built from the get-go with security in mind. So some of the controls that you’ve had available in your relational databases aren’t there when you move over to the Big Data platforms — the access control and authorization mechanisms don’t exist today.

Planning the architecture, and looking at bringing in third-party controls to give you the security mechanisms that you are used to in your older platforms, is something that organizations are going to have to do. It’s really an evolving and emerging thing at this point.

Gardner: There are a lot of unknown unknowns out there, as we discovered with our tweet chat last month. Some people think that the data is just data, and you apply the same security to it. Do you think that’s the case with Big Data? Is it just another follow-through of what you always did with data in the first place?

Hietala: I would say yes, at a conceptual level, but it’s like what we saw with virtualization. When there was a mad rush to virtualize everything, many of those traditional security controls didn’t translate directly into the virtualized world. The same thing is true with Big Data.

When you’re talking about those volumes of data — applying encryption, applying various security controls — you have to think about how those things are going to scale. That may require new solutions from new technologies and that sort of thing.

Gardner: Chris Gerty, when it comes to that governance, security, and access control, are there any lessons that you’ve learned that you are aware of in terms of the best of openness, but also with the ability to manage the spigot?

Gerty: Spigot is probably a dangerous term to use, because it implies that all data is treated the same. The sooner that you can tag the data as either sensitive or not, mostly coming from the person or team that’s developed or originated the data, the better.

Kicking the can

Once you have it on a hard drive, once you get crazy about storing everything, if you don’t know where it came from, you’re forced to put it into a secure environment. And that’s just kicking the can down the road. It’s really a disservice to people who might use the data in a useful way to address their problems.

We constantly have satellites that are made for one purpose. They send all the data down. It’s controlled either for security or for intellectual property (IP), so someone can write a paper. Then, after the project doesn’t get funded or it just comes to a nice graceful close, there is that extra step, which is almost a responsibility of the originators, to make it useful to the rest of the world.
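Gerty’s tag-at-origination idea can be sketched as attaching a sensitivity label the moment a record is created, so that release decisions can be made per record rather than locking everything in a secure environment. A hypothetical Python sketch — the labels and records are invented for illustration:

```python
# Hedged sketch of tagging data at origination: the producer attaches a
# sensitivity label, so downstream consumers can release open data
# without a blanket lockdown. Labels and records are invented examples.

def originate(payload, sensitive):
    """The data's originator tags each record as it is created."""
    label = "restricted" if sensitive else "open"
    return {"payload": payload, "sensitivity": label}

records = [
    originate("telemetry frame 1", sensitive=False),
    originate("crew medical log", sensitive=True),
    originate("telemetry frame 2", sensitive=False),
]

# At release time, only records tagged open leave the secure environment.
releasable = [r["payload"] for r in records if r["sensitivity"] == "open"]
print(releasable)   # ['telemetry frame 1', 'telemetry frame 2']
```

Tagging at the source avoids exactly the can-kicking described above: untagged data defaults to the secure environment, while labeled open data stays usable by others.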

Gardner: Let’s look at Big Data through the lens of some other major trends right now. Let’s start with Cloud. You mentioned that at NASA, you have your own private Cloud that you’re using a lot, of course, but you’re also now dabbling in commercial and public Clouds. Frankly, the price points that these Cloud providers are offering for storage and data services are pretty compelling.

So we should expect more data to go to the Cloud. Bob, from your perspective, as organizations and architects have to think about data in this hybrid Cloud on-premises off-premises, moving back and forth, what do you think enterprise architects need to start thinking about in terms of managing that, planning for the right destination of data, based on the right mix of other requirements?

Weisman: It’s a good question. As you said, the price point is compelling, but the security and privacy of the information is something else that has to be taken into account. Where is that information going to reside? You have to have very stringent service-level agreements (SLAs) and in certain cases, you might say it’s a price point that’s compelling, but the risk analysis that I have done means that I’m going to have to set up my own private Cloud.

Right now, everybody’s saying the public Cloud is going to be the way to go. Vendors are going to have to be very sensitive to that, and many are, at this point in time, addressing a lot of the needs of some of their large client bases. So it’s not one-size-fits-all, and it’s more than just a price for service. Architecture can bring down the price pretty dramatically, even within an enterprise.

Gardner: Andras, how do the Cloud and Big Data come together in a way that’s intriguing to you?

Szakal: Actually it’s a great question. We could take the rest of the 22 minutes talking on this one question. I helped lead the President’s Commission on Big Data that Steve Mills from IBM and — I forget the name of the executive from SAP — led. We intentionally tried to separate Cloud from Big Data architecture, primarily because we don’t believe that, in all cases, Cloud is the answer to all things Big Data. You have to define the architecture that’s appropriate for your business needs.

However, it also depends on where the data is born. Take many of the investments IBM has made into enterprise market management, for example, Coremetrics, several of these services that we now offer for helping customers understand deep insight into how their retail market or supply chain behaves.

Born in the Cloud

All of that information is born in the Cloud. But if you’re talking about actually using Cloud as infrastructure and moving around huge sums of data or constructing some of these solutions on your own, then some of the ideas that Bob conveyed are absolutely applicable.

I think it becomes prohibitive to do that and easier to stand up a hybrid environment for managing the amount of data. But I think that you have to think about whether your data is real-time data, whether it’s data that you could apply some of these new technologies like Hadoop to, Hadoop MapReduce-type solutions, or whether it’s traditional data warehousing.

Data warehouses are going to continue to exist and they’re going to continue to evolve technologically. You’re always going to use a subset of data in those data warehouses, and it’s going to be an applicable technology for many years to come.

Gardner: So suffice it to say, an enterprise architect who is well versed in both Cloud infrastructure requirements, technologies, and methods, as well as Big Data, will probably be in quite high demand. That specialization in one or the other isn’t as valuable as being able to cross-pollinate between them.

Szakal: Absolutely. It’s enabling our architects and finding deep individuals who have this unique set of skills, analytics, mathematics, and business. Those individuals are going to be the future architects of the IT world, because analytics and Big Data are going to be integrated into everything that we do and become part of the business processing.

Gardner: Well, that’s a great segue to the next topic that I am interested in, and it’s around mobility as a trend and also application development. The reason I lump them together is that I increasingly see developers being tasked with mobile first.

When you create a new app, you have to remember that this is going to run in the mobile tier and you want to make sure that the requirements, the UI, and the complexity of that app don’t go beyond the ability of the mobile app and the mobile user. This is interesting to me, because data now has a different relationship with apps.

We used to think of apps as creating data and then the data would be stored and it might be used or integrated. Now, we have applications that are simply there in order to present the data and we have the ability now to present it to those mobile devices in the mobile tier, which means it goes anywhere, everywhere all the time.

Let me start with you, Jim, because it’s security and risk, but it’s also just rethinking the way we use data in a mobile tier. If we can do it safely, and that’s a big IF, how important should it be for organizations to start thinking about making this data available to all of these devices and letting it pour out into that mobile tier as much as possible?

Hietala: In terms of enabling the business, it’s very important. There are a lot of benefits that accrue from accessing your data from whatever device you happen to be on. To me, it is that question of “if,” because now there’s a whole set of problems to be solved around data floating around on Android, iOS, or whatever the platform is, and around the organization being able to lock down its data on those devices, regardless of whether it’s the organization’s device or my device. That’s a set of issues the security industry is just starting to get its arms around today.

Mobile ability

Gardner: Chris, any thoughts about this mobile ability that the data gets more valuable the more you can use it and apply it, and then the more you can apply it, the more data you generate that makes the data more valuable, and we start getting into that positive feedback loop?

Gerty: Absolutely. It’s almost an appreciation of what more people could do if they could get to the problem. We’re getting to the point where, if it’s available on your desktop, you’re going to find a way to make it available on your device.

Those same security questions probably need to be answered anyway, but making it mobile-compatible is almost an acknowledgment that there will be someone who wants to use it. So let me go that extra step to make it compatible and see what I get from them. It’s more of a cultural benefit that you get from making things compatible with mobile.

Gardner: Any thoughts about what developers should be thinking in trying to bring the fruits of Big Data, through these analytics, to more users, rather than just the BI folks or those who are good at SQL queries? Does this change the game by making an application on a mobile device simple and powerful, while accessing this real-time, updated treasure trove of data?

Gerty: I always think of the astronaut on the moon. He’s got a big, bulky glove and he might have a heads-up display in front of him, but he really needs to know exactly a certain piece of information at the right moment, dealing with bandwidth issues, dealing with the environment, a foggy helmet, whatever.

It’s very analogous to what the day-to-day professional will do, trying to find that one quick e-mail he needs, or which meeting to go to, which one is more important, and it all comes down to putting your developer in the shoes of the user. So anytime you can get interaction between the two, that’s valuable.

Weisman: From an Enterprise Architecture point of view, my background is mainly defense and government, and defense mobile computing has been around for decades. So you’ve always been dealing with that.

The main thing is that, in many cases, the whole presentation layer is turning into another architecture domain, with information visualization, your security controls, and an integrated identity-management capability.

It’s like what you were saying about the astronaut getting it right. He doesn’t need to know everything that’s happening in the world. He needs to know, from his heads-up display, the stuff that’s relevant to him.

So it’s getting the right information to the person, in an authorized manner, in a way that he can visualize and make sense of, be it straight data, analytics, or whatever. The presentation layer, ergonomics, and visual communication are going to become very important in the future for that. There are also a lot of problems to solve. Rather than doing it at the application level, you’re doing it entirely in one layer.

Governance and security

Gardner: So clearly the implications of data are cutting across how we think about security, how we think about UI, how we factor in mobility. What we now think about in terms of governance and security, we have to do differently than we did with older data models.

Jim Hietala, what about the impact on spurring people toward more virtualized desktop delivery, if you don’t want to have the data on that end device, if you want to solve some of the issues around control and governance, and if you want to be able to manage just how much data gets into that UI, not too much, not too little?

Do you think that some of the concerns we’re addressing will push people to look even harder, maybe more aggressively, at desktop and application virtualization, as they say, keep it on the server and deliver out just the deltas?

Hietala: That’s an interesting point. I’ve run across a startup in the last month or two that is doing just that. The whole value proposition is to virtualize the environment. You get virtual gold images. You don’t have to worry about what’s actually happening on the physical device, and you know when the devices connect. The security threat goes away. So we may see more of that as a solution.

Gardner: Andras, do you see that some of the implications of Big Data, far-fetched as it may be, are propelling people to cultivate their servers more and virtualize their apps, their data, and their desktop right up to the end devices?

Szakal: Yeah, I do. I see IBM providing solutions for virtual desktops, but I think it was really a security question you were asking. You’re certainly going to see an increasing number of virtualized desktop environments.

Ultimately, our networks still are not stable enough, or at a high enough bandwidth, to really make that a useful exercise for all but the most menial users in the enterprise. From a security point of view, there is a lot still to be solved.

And part of the challenge in the Cloud environment that we see today is the proliferation of virtual machines (VMs) and the inability to actually contain the security controls within those machines and across these machines from an enterprise perspective. So we’re going to see more solutions proliferate in this area and to try to solve some of the management issues, as well as the security issues, but we’re a long ways away from that.

Gerty: Big Data, by itself, isn’t magical. It doesn’t have the answers just by being big. If you need more, you need to pry deeper into it. That’s the example. They realized early enough that they were able to make something good.

Gardner: Jim Hietala, any thoughts about examples that illustrate where we’re going and why this is so important?

Hietala: Being a security guy, I tend to talk about scare stories, horror stories. One example from last year struck me. One of the major retailers here in the U.S. hit the news for having predicted, through customer purchase behavior, when people were pregnant.

They could look and see that, based on a basket of around 20 products, if you were buying 15 of them and your purchase behavior had changed, they could tell. The privacy implications of that are somewhat concerning.

One example was that this retailer was sending out coupons related to somebody being pregnant. The teenage girl who was pregnant hadn’t told her family yet. Her father found the coupons. There was alarm in the household, and at the local retail store, when the father went and confronted them.

Privacy implications

There are privacy implications from the use of Big Data. When you get powerful new technology in marketing people’s hands, things sometimes go awry. So I’d throw that out just as a cautionary tale that there is that aspect to this. When you can see across people’s buying transactions, things like that, there are privacy considerations that we’ll have to think about, and that we really need to think about as an industry and a society.


Filed under Conference

On Demand Broadcasts from Day One at The Open Group Conference in Newport Beach

By The Open Group Conference Team

Since not everyone could make the trip to The Open Group Conference in Newport Beach, we’ve put together a recap of day one’s plenary speakers. Stay tuned for more recaps coming soon!

Big Data at NASA

In his talk titled, “Big Data at NASA,” Chris Gerty, deputy program manager, Open Innovation Program, National Aeronautics and Space Administration (NASA), discussed how Big Data is being interpreted by the next generation of rocket scientists. Chris presented a few lessons learned from his experiences at NASA:

  1. A traditional approach is not always the best approach. A tried-and-proven method may not translate to new problems. Creating more programs for more data to store on bigger hard drives is not always effective. We need to address the never-ending challenges that lie ahead as society shifts into the information age.
  2. Plan for openness. Based on a government directive, Chris’s team looked to answer questions by asking the right people. For example, NASA asked the people gathering data on a satellite to determine which data was most important, which enabled NASA to narrow its focus and solve problems. Furthermore, by realizing what can also be useful to the public and what tools have already been developed by the public, open source development can benefit the masses. Through collaboration, governments and citizens can work together to solve some of humanity’s biggest problems.
  3. Embrace the enormity of the universe. Look for Big Data where no one else is looking, by deploying sensors and information-gathering tools in new places. If people continue to be scared of Big Data, we will be resistant to gathering more of it. By finding Big Data where it has yet to be discovered, we can solve problems and innovate.

To view Chris’s presentation, please watch the broadcasted session here: http://new.livestream.com/opengroup/Gerty-NPB13

Bringing Order to the Chaos

David Potter, chief technical officer at Promise Innovation, and Ron Schuldt, senior partner at UDEF-IT, LLC, discussed how The Open Group’s evolving Quantum Lifecycle Management (QLM) standard, coupled with its complementary Universal Data Element Framework (UDEF) standard, helps bring order to the terminology chaos that faces Big Data implementations.

The QLM standard provides a framework for the aggregation of lifecycle data from a multiplicity of sources to add value to the decision making process. Gathering mass amounts of data is useless if it cannot be analyzed. The QLM framework provides a means to interpret the information gathered for business intelligence. The UDEF allows each piece of data to be paired with an unambiguous key to provide clarity. By partnering with the UDEF, the QLM framework is able to separate itself from domain-specific semantic models. The UDEF also provides a ready-made key for international language support. As an open standard, the UDEF is data model independent and as such supports normalization across data models.
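The unambiguous-key idea behind the UDEF can be sketched in a few lines: each field in a local data model carries a model-independent key, so two systems can recognize the same concept under different local names. The key strings and field names below are invented for illustration and are not real UDEF identifiers.

```python
# Hypothetical sketch of the UDEF idea: every field in a local data model
# is paired with an unambiguous, model-independent key, so two systems can
# recognize the same concept under different local names. The key strings
# and field names below are invented, not real UDEF identifiers.

udef_registry = {
    "key.0137": "Person given name",
    "key.0138": "Person family name",
}

# Two systems store the same concept under different field names.
system_a = {"first_name": ("key.0137", "Ada")}
system_b = {"prenom": ("key.0137", "Ada")}

def same_concept(field_a, field_b):
    """Two fields describe the same concept if they share a UDEF-style key."""
    return field_a[0] == field_b[0]

match = same_concept(system_a["first_name"], system_b["prenom"])
concept = udef_registry[system_a["first_name"][0]]
```

Because the keys live outside any one data model, harmonization between funding partners (as in the Compassion International example below) becomes a matter of agreeing on keys rather than renaming fields.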

One example of successful implementation is by Compassion International. The organization needed to find a balance between information that should be kept internal (e.g., payment information) and information that should be shared with its international sponsors. In this instance, UDEF was used as a structured process for harmonizing the terms used in IT systems between funding partners.

The beauty of the QLM framework and UDEF integration is that they are flexible and can be applied to any product, domain and industry.

To view David and Ron’s presentation, please watch the broadcasted session here: http://new.livestream.com/opengroup/potter-NPB13

Big Data – Panel Discussion

Moderated by Dana Gardner of Interarbor Solutions, a panel of Robert Weisman, Build The Vision; Andras Szakal, IBM; Jim Hietala, The Open Group; and Chris Gerty, NASA, discussed the implications of Big Data and what it means for business architects and enterprise architects.

Big Data is not about the size of the data but about analyzing it. Robert mentioned that most organizations store more data than they need or use; from an enterprise architect’s perspective, it’s important to focus on the analysis of the data and to provide information that will ultimately aid the business in some way. When it comes to security, Jim explained that newer Big Data platforms are not built with security in mind. While data is data, many security controls don’t translate to new platforms or scale with the influx of data.

Cloud Computing is Big Data-ready, and price can be compelling, but there are significant security and privacy risks. Robert brought up the argument over public and private Cloud adoption, and said, “It’s not one size fits all.” But can Cloud and Big Data come together? Andras explained that Cloud is not the almighty answer to Big Data. Every organization needs to find the Enterprise Architecture that fits its needs.

The fruits of Big Data can be useful to more than just business intelligence professionals. With the trend of mobility and application development in mind, Chris suggested that developers keep users in mind. Big Data can be used to tell us many different things, but it’s about finding out what is most important and relevant to users in a way that is digestible.

Finally, the panel discussed how Big Data is bringing about big changes in almost every aspect of an organization. It is important not to generalize, but to customize: every enterprise needs an architecture that fits its needs. Each organization finds importance in different facets of the data gathered, and security is different at every organization. With all that in mind, the panel agreed that focusing on the analytics is the key.

To view the panel discussion, please watch the broadcasted session here: http://new.livestream.com/opengroup/events/1838807


Filed under Conference

Capturing The Open Group Conference in Newport Beach

By The Open Group Conference Team

It is time to announce the winners of the Newport Beach Photo Contest! For those of you who were unable to attend, conference attendees submitted some of their best photos for a chance to win one free conference pass to one of The Open Group’s global conferences over the next year, a prize valued at more than $1,000/€900.

Southern California is known for its palm trees and warm sandy beaches. While Newport Beach is best recognized for its high-end real estate and its association with the popular television show “The OC,” enterprise architects invaded the beach and boating town for The Open Group Conference.

The contest ended Friday at noon PDT, and it is time to announce the winners…

Best of The Open Group Conference in Newport Beach – For any photo taken during conference activities

The winner is Henry Franken, BiZZdesign!

 Henry Franken 01 BiZZdesign table

A busy BiZZdesign exhibitor booth

The Real OC Award – For best photo taken in or around Newport Beach

The winner is Andrew Josey, The Open Group!

 Andrew Josey 02

A local harbor in Newport Beach, Calif.

Thank you to all those who participated in this contest – whether it was submitting one of your own photos or voting for your favorites. Please visit The Open Group’s Facebook page to view all of the submissions and conference photos.

We’re always trying to improve our programs, so if you have any feedback regarding the photo contest, please email photo@opengroup.org or leave a comment below. We’ll see you in Sydney!


Filed under Conference

The Open Group Conference Plenary Speaker Sees Big-Data Analytics as a Way to Bolster Quality, Manufacturing and Business Processes

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: The Open Group Keynoter Sees Big-Data Analytics as a Way to Bolster Quality, Manufacturing and Business Processes

This is a transcript of a sponsored podcast discussion on Big Data analytics and its role in business processes, in conjunction with The Open Group Conference in Newport Beach.

Dana Gardner: Hello, and welcome to a special thought leadership interview series coming to you in conjunction with The Open Group® Conference on January 28 in Newport Beach, California.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these business transformation discussions. The conference will focus on big data and the transformation we need to embrace today.

We are here now with one of the main speakers at the conference; Michael Cavaretta, PhD, Technical Leader of Predictive Analytics for Ford Research and Advanced Engineering in Dearborn, Michigan.

We’ll see how Ford has exploited the strengths of big data analytics by directing them internally to improve business results. In doing so, they scour the metrics from the company’s best processes across myriad manufacturing efforts and through detailed outputs from in-use automobiles, all to improve and help transform their business.

Cavaretta has led multiple data-analytics projects at Ford to break down silos inside the company and to define Ford’s most fruitful datasets. Ford has successfully aggregated customer feedback and extracted internal data to predict how new features and technologies will best improve its cars.

As a lead-in to his Open Group presentation, Michael and I will now explore how big data is fostering business transformation by allowing deeper insights into more types of data efficiently, and thereby improving processes, quality control, and customer satisfaction.

With that, please join me in welcoming Michael Cavaretta. Welcome to BriefingsDirect, Michael.

Michael Cavaretta: Thank you very much.

Gardner: Your upcoming presentation for The Open Group Conference is going to describe some of these new approaches to big data and how they offer valuable insights into internal operations, and therefore a way to make a better product. To start, what’s different now in being able to get at this data and do this type of analysis compared with, say, five years ago?

Cavaretta: The biggest difference has to do with the cheap availability of storage and processing power, where a few years ago people were very much concentrated on filtering down the datasets that were being stored for long-term analysis. There has been a big sea change with the idea that we should just store as much as we can and take advantage of that storage to improve business processes.

Gardner: That sounds right on the money, but how did we get here? How did we get to the point where we could go from those technology advances, as you say, better storage, networks, being able to move big datasets, to wringing out benefits? What’s the process behind the benefit?

Cavaretta: The process behind the benefits has to do with a sea change in the attitude of organizations, particularly IT within large enterprises. There’s this idea that you don’t need to spend so much time figuring out what data you want to store and worrying about the cost associated with it, and can think more about data as an asset. There is value in being able to store it, and in being able to go back and extract different insights from it. This comes from really cheap storage, access to parallel-processing machines, and great software.

Gardner: It seems to me that for a long time, the mindset was that data is simply the output from applications, with applications being primary and the data being almost an afterthought. It seems like we’ve sort of flipped that. The data now is perhaps as important, even more important, than the applications. Does that seem to hold true?

Cavaretta: Most definitely, and we’ve had a number of interesting engagements where people have thought about the data that’s being collected. When we talk to them about big data, storing everything at the lowest level of transactions, and what could be done with that, their eyes light up and they really begin to get it.

Gardner: I suppose earlier, when cost considerations and technical limitations were at work, we would just go for the tip of the iceberg. Now, as you say, we can get almost all the data. So, is this a matter of getting at more data, different types of data, bringing in unstructured data, all of the above? How much are you really going after here?

Cavaretta: I like to talk to people about the possibilities that big data provides, and I always tell them that I have yet to encounter a circumstance where somebody has given me too much data. You can pull in all this information and then answer a variety of questions, because you don’t have to worry that something has been thrown out. You have everything.

You may have 100 questions, and each one of the questions uses a very small portion of the data. Those questions may use different portions of the data, a very small piece, but they’re all different. If you go in thinking, “We’re going to answer the top 20 questions and we’re just going to hold data for that,” that leaves so much on the table, and you don’t get any value out of it.

Gardner: I suppose too that we can think about small samples or small datasets and aggregate them or join them. We have new software capabilities to do that efficiently, so that we’re able to not just look for big honking, original datasets, but to aggregate, correlate, and look for a lifecycle level of data. Is that fair as well?

Cavaretta: Definitely. We’re a big believer in mash-ups and we really believe that there is a lot of value in being able to take even datasets that are not specifically big-data sizes yet, and then not go deep, not get more detailed information, but expand the breadth. So it’s being able to augment it with other internal datasets, bridging across different business areas as well as augmenting it with external datasets.

A lot of times you can take something that is maybe a few hundred thousand records or a few million records, and then by the time you’re joining it, and appending different pieces of information onto it, you can get the big dataset sizes.
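That widening-by-joining idea can be sketched as follows. The dataset names, keys, and fields below are hypothetical; the pattern is simply appending fields from internal and external sources onto a modest base dataset via a shared key.

```python
# A minimal sketch of the "mash-up" idea: widen a modest dataset by
# joining other sources on a shared key, rather than collecting more rows.
# All dataset names, keys, and fields here are invented for illustration.

sales = [
    {"vin": "V1", "units": 3},
    {"vin": "V2", "units": 5},
]
quality = {"V1": {"defects": 0}, "V2": {"defects": 1}}            # internal source
weather = {"V1": {"region_temp": 21}, "V2": {"region_temp": 35}}  # external source

def mash_up(rows, *sources):
    """Append fields from each source onto every row, keyed by VIN."""
    enriched = []
    for row in rows:
        merged = dict(row)
        for source in sources:
            merged.update(source.get(row["vin"], {}))
        enriched.append(merged)
    return enriched

wide = mash_up(sales, quality, weather)
```

Each join adds columns rather than rows, which is how a few million records can reach big-data sizes once several internal and external sources are appended.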

Gardner: Just to be clear, you’re unique. The conventional wisdom for big data is to look at what your customers are doing, or just the external data. You’re really looking primarily at internal data, while also availing yourself of what external data might be appropriate. Maybe you could describe a little bit about your organization, what you do, and why this internal focus is so important for you.

Cavaretta: I’m part of a larger department that is housed over in the research and advanced-engineering area at Ford Motor Company, and we’re about 30 people. We work as internal consultants, kind of like Capgemini or Ernst & Young, but only within Ford Motor Company. We’re responsible for going out and looking for different opportunities from the business perspective to bring advanced technologies. So, we’ve been focused on the area of statistical modeling and machine learning for I’d say about 15 years or so.

And in this time, we’ve had a number of engagements where we’ve talked with different business customers, and people have said, “We’d really like to do this.” Then, we’d look at the datasets that they have and say, “Wouldn’t it be great if we had this? Now we have to wait six months or a year.”

These new technologies are really changing the game from that perspective. We can turn on the complete fire-hose, and then say that we don’t have to worry about that anymore. Everything is coming in. We can record it all. We don’t have to worry about if the data doesn’t support this analysis, because it’s all there. That’s really a big benefit of big-data technologies.

Gardner: If you’ve been doing this for 15 years, you must be demonstrating a return on investment (ROI) or a value proposition back to Ford. Has that value proposition been changing? Do you expect it to change? What might be your real value proposition two or three years from now?

Cavaretta: The real value proposition definitely is changing as things are being pushed down in the company to lower-level analysts who are really interested in looking at things from a data-driven perspective. From when I first came in to now, the biggest change has been when Alan Mulally came into the company, and really pushed the idea of data-driven decisions.

Before, we were getting a lot of interest from people who were very focused on the data that they had internally. After that, they got a lot of questions from their management and from upper-level directors and vice presidents saying, “We’ve got all these data assets. We should be getting more out of them.” This strategic perspective has really changed a lot of what we’ve done in the last few years.

Gardner: As I listen to you, Michael, it occurs to me that you are applying this data-driven mentality more deeply. As you pointed out earlier, you’re also going after all the data, all the information, whether that’s internal or external.

In the case of an automobile company, you’re looking at the factory, the dealers, what drivers are doing, what the devices within the automobile are telling you, factoring that back into design relatively quickly, and then repeating the process. Are we getting to the point where this Holy Grail notion of a total feedback loop across the lifecycle of a major product like an automobile is really within our grasp? Are we getting there, or is this still theoretical? Can we pull it all together and make it a science?

Cavaretta: The theory is there. The question has more to do with the actual implementation and the practicality of it. We’re still talking about a lot of data. Even with new, advanced technologies and techniques, that’s a lot of data to store, a lot of data to analyze, and a lot of data to make sure we can mash up appropriately.

And while I think the potential is there and the theory is there, there is also work in being able to get the data from multiple sources. Everything you can get back from the vehicle, fantastic. Now, if you marry that up with internal data, is it survey data, manufacturing data, quality data? What do you want to go after first? We can’t do everything all at the same time.

Our perspective has been: let’s make sure that we identify the highest-value, greatest-ROI areas, then begin to take some of the major datasets we have, push them, and get more detail. Mash them up appropriately and really prove out the value of the technologies.

Gardner: Clearly, there’s a lot more to come in terms of where we can take this, but I suppose it’s useful to have a historic perspective and context as well. I was thinking about some of the early quality gurus like Deming and some of the movement towards quality like Six Sigma. Does this fall within that same lineage? Are we talking about a continuum here over that last 50 or 60 years, or is this something different?

Cavaretta: That’s a really interesting question. From the perspective of analyzing data, using data appropriately, I think there is a really good long history, and Ford has been a big follower of Deming and Six Sigma for a number of years now.

The difference, though, is this idea that you don’t have to worry so much upfront about getting the data. If you’re doing this right, you have the data right there, and this has some great advantages. You don’t have to wait until you’ve accumulated enough history to look for certain patterns. Then again, it also has a disadvantage: you’ve got so much data that it’s easy to find things that could be spurious correlations, or models that don’t make any sense.

The piece that is required is good domain knowledge, in particular when you are talking about making changes in the manufacturing plant. It’s very appropriate to look at things and be able to talk with people who have 20 years of experience to say, “This is what we found in the data. Does this match what your intuition is?” Then, take that extra step.

Gardner: Tell me a little about a day in the life of your organization and your team, to let us know what you do. How do you go about making more data available and then reaching some of these higher-level benefits?

Cavaretta: We’re very much focused on interacting with the business. Mostly, we work on pilot projects with our business customers to bring advanced analytics and big data technologies to bear against their problems. So we work in what we call a push-and-pull model.

We go out and investigate technologies and say, “These are technologies that Ford should be interested in.” Then, we look internally for business customers who would be interested in them. So, we’re pushing the technologies.

From the pull perspective, we’ve had so many successful engagements in such good contacts and good credibility within the organization that we’ve had people come to us and say, “We’ve got a problem. We know this has been in your domain. Give us some help. We’d love to be able to hear your opinions on this.”

So there’s pull from the business side, and our job is to match up those two pieces. It’s best when we’re looking at a particular technology, somebody comes to us, and we say, “Oh, this is a perfect match.”

Those types of opportunities have been increasing in the last few years, and we’ve been very happy with the number of internal customers that have really been very excited about the areas of big data.

Gardner: Because this is The Open Group Conference, with an audience that’s familiar with the IT side of things, I’m curious how this relates to software and software development. There are so many more millions of lines of code in automobiles these days, with software being more important than just about everything else. Are you applying a lot of what you’re doing to the software side of the house, or are the agile methods, feedback loops, and performance-management issues a separate domain, or is there crossover here?

Cavaretta: There’s some crossover. The biggest area that we’ve been focused on has been taking information, whether from internal business processes or from the vehicle, and being able to bring it back in to derive value. We have very good contacts in the Ford IT group, and they have been fantastic to work with in bringing interesting tools and technology to bear, and then looking at moving those into production and the best way to do that.

A fantastic development has been this idea that we’re using some of the more agile techniques in this space and Ford IT has been pushing this for a while. It’s been fantastic to see them work with us and be able to bring these techniques into this new domain. So we’re pushing the envelope from two different directions.

Gardner: It sounds like you will be meeting up at some point with a complementary nature to your activities.

Cavaretta: Definitely.

Gardner: Let’s move on to this notion of the “Internet of things,” a very interesting concept that a lot of people talk about. It seems relevant to what we’ve been discussing. We have sensors in these cars, wireless transfer of data, more and more opportunity for location information to be brought to bear: where cars are, how they’re driven, speed information, all sorts of metrics, maybe making those available through cloud providers that assimilate this data.

So let’s not go too deep, because this is a multi-hour discussion all on its own, but how is this notion of the Internet of things being brought to bear on your gathering of big data and applying it to the analytics in your organization?

Cavaretta: It is a huge area, not only from the internal process perspective – RFID tags within the manufacturing plants and out on the plant floor – but also all of the information that’s being generated by the vehicle itself.

The Ford Energi generates about 25 gigabytes of data per hour. So you can imagine selling a couple of million vehicles in the near future, with that amount of data being generated. There are huge opportunities within that, and there are also some interesting opportunities having to do with opening up some of these systems for third-party developers. OpenXC is an initiative that we have going on at Research and Advanced Engineering.

We have a lot of data coming from the vehicle. There’s a huge number of sensors and processors being added to vehicles. There’s data being generated there, as well as communication between the vehicle and your cell phone, and communication between vehicles.

There’s a group in Ann Arbor, Michigan, the University of Michigan Transportation Research Institute (UMTRI), that’s investigating that, as well as communication between the vehicle and, let’s say, a home system. It lets the home know that you’re on your way and that it’s time to increase the temperature if it’s winter, or cool the house if it’s summer. The data being generated there is invaluable and could be used for a lot of benefits, both from the corporate perspective and for the broader environment.

Gardner: Just to put a stake in the ground on this, how much data do cars typically generate? Do you have a sense of what now is the case, an average?

Cavaretta: The Energi, according to the latest information that I have, generates about 25 gigabytes per hour. Different vehicles are going to generate different amounts, depending on the number of sensors and processors on the vehicle. But the biggest key has to do with not necessarily where we are right now but where we will be in the near future.

With the amount of information that’s being generated from the vehicles, a lot of it is just internal stuff. The question is how much information should be sent back for analysis to find different patterns. That becomes really interesting as you look at external sensors – temperature, humidity. You can know when the windshield wipers go on, take that information, and mash it up with other external data sources too. It’s a very interesting domain.
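The wiper mash-up Cavaretta describes boils down to a keyed join between in-vehicle events and an external feed. This is a minimal sketch only; the data, field names, and grid-cell scheme are invented for illustration:

```python
# Hypothetical mash-up: join in-vehicle wiper events with an external
# weather feed on (hour, grid cell). All data and field names invented.

wiper_events = [
    {"hour": "2013-01-28T09", "cell": "A4", "wipers_on": True},
    {"hour": "2013-01-28T10", "cell": "B2", "wipers_on": False},
]
weather_feed = {
    ("2013-01-28T09", "A4"): "rain",
    ("2013-01-28T10", "B2"): "clear",
}

# Enrich each event with the reported conditions for its hour and cell.
enriched = [
    {**event, "reported": weather_feed.get((event["hour"], event["cell"]))}
    for event in wiper_events
]

# Count events where wiper state agrees with the reported weather.
agree = sum(
    (e["wipers_on"] and e["reported"] == "rain")
    or (not e["wipers_on"] and e["reported"] != "rain")
    for e in enriched
)
print(f"{agree}/{len(enriched)} events consistent with weather feed")
```

The same join shape scales from two records to millions; a distributed framework would simply partition the keys across machines.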

Gardner: So clearly, it’s multiple gigabytes per hour per vehicle and probably going much higher.

Cavaretta: Easily.
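Those figures lend themselves to a quick back-of-the-envelope calculation. The sketch below uses the 25 GB/hour number quoted in the interview; the fleet size, average driving hours, and retained fraction are illustrative assumptions, not Ford numbers:

```python
# Back-of-the-envelope fleet data volume, based on the ~25 GB/hour
# per-vehicle figure quoted in the interview. Fleet size, driving
# hours, and the fraction retained for analysis are hypothetical.

GB_PER_VEHICLE_HOUR = 25
FLEET_SIZE = 2_000_000          # "a couple of million vehicles" (illustrative)
DRIVING_HOURS_PER_DAY = 1       # assumed average per vehicle
RETAINED_FRACTION = 0.001       # assume 0.1% is sent back for analysis

raw_gb_per_day = GB_PER_VEHICLE_HOUR * FLEET_SIZE * DRIVING_HOURS_PER_DAY
retained_tb_per_day = raw_gb_per_day * RETAINED_FRACTION / 1024

print(f"Raw fleet output: {raw_gb_per_day / 1024 / 1024:.1f} PB/day")
print(f"Retained for analysis: {retained_tb_per_day:.1f} TB/day")
```

Even retaining a thousandth of the raw stream leaves tens of terabytes a day to store and analyze, which is why Cavaretta’s question of how much to send back matters.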

Gardner: Let’s move forward now for those folks who have been listening and are interested in bringing this to bear on their organizations and their vertical industries, from the perspective of skills, mindset, and culture. Are there standards, certification, or professional organizations that you’re working with in order to find the right people?

It’s a big question. Let’s look at what skills you target for your group, and in what ways you think you can improve on that. Then, we’ll get into some of those larger issues about culture and mindset.

Cavaretta: The skills that we have in our department, in particular on our team, are in the area of computer science, statistics, and some good old-fashioned engineering domain knowledge. We’ve really gone about this from a training perspective. Aside from a few key hires, it’s really been an internally developed group.

The biggest advantage that we have is that we can go out and be very targeted with the training that we do. There are so many good tools out there, especially in the open-source realm, that we can spin things up with relatively low cost and low risk, and do a number of experiments in the area. That’s really the way that we push the technologies forward.

Gardner: Why The Open Group? Why is that a good forum for your message, and for your research here?

Cavaretta: The biggest reason is the focus on the enterprise. There are a lot of advantages and business cases in large enterprises, where there are many systems and where a relatively small improvement can make a large difference on the bottom line.

Talking with The Open Group really gives me an opportunity to bring people on board with the idea that there should be a difference in mindset. It’s not, “Here’s a way that data is being generated; try to conceive of some questions we can ask, and we’ll store that too.” Instead, let’s just take everything, worry about it later, and then find the value.

Gardner: I’m sure the viewers of your presentation on January 28 will be gathering a lot of great insights. A lot of the people that attend The Open Group conferences are enterprise architects. What do you think those enterprise architects should be taking away from this? Is there something about their mindset that should shift in recognizing the potential that you’ve been demonstrating?

Cavaretta: It’s important for them to be thinking about data as an asset, rather than as a cost. You may even have to spend some money, and it may feel a little bit unsafe without really solid ROI at the beginning. Then, move toward pulling that information in and storing it in a way that allows not just the high-level data scientists to access it and provide value, but also people who are interested in the data overall. Those are very important pieces.

The last one is how you take a big-data project – something where you’re not storing data in the traditional business intelligence (BI) framework that an enterprise has developed – and connect it to the BI systems to provide value through those mash-ups. Those are really important areas that still need some work.

Gardner: Another big constituency within The Open Group community is business architects. Is there something about mindset and culture, getting back to that topic, that those business-level architects should consider? Do you really need to change the way you think about planning and resource allocation in a business setting, based on the fruits of what you’re doing with big data?

Cavaretta: I really think so. The digital asset that you have can be monetized to change the way the business works, and that could be done by creating new assets that then can be sold to customers, as well as improving the efficiencies of the business.

The idea that everything is going to be very well-defined, and that a lot of work has to go into making sure the data has high quality before it’s stored – I think those things need to change somewhat. As you’re pulling the data in and thinking about long-term storage, the real problem is access to the information, rather than just storing it.

Gardner: Interesting that you brought up that notion that the data becomes a product itself and even a profit center perhaps.

Cavaretta: Exactly. There are many companies, especially large enterprises, that are looking at their data assets and wondering what can they do to monetize this, not only to just pay for the efficiency improvement but as a new revenue stream.

Gardner: We’re almost out of time. For those organizations that want to get started on this, are there any 20/20-hindsight or Monday-morning-quarterback insights you can provide? How do you get started? Do you appoint a leader? Do you need a strategic roadmap, a shift in culture or mindset, pilot programs? How would you recommend that people begin the process?

Cavaretta: We’re definitely huge believers in pilot projects and proofs of concept, and we like to develop roadmaps by doing. So get out there. Understand that it’s going to be messy. Understand that it may be a little bit more costly, and the ROI isn’t going to be there at the beginning.

But get your feet wet. Start doing some experiments, and then, as those experiments turn from just experimentation into really providing real business value, that’s the time to start looking at a more formal aspect and more formal IT processes. But you’ve just got to get going at this point.

Gardner: I would think that the competitive forces are out there. If you are in a competitive industry, and those that you compete against are doing this and you are not, that could spell some trouble.

Cavaretta:  Definitely.

Gardner: We’ve been talking with Michael Cavaretta, PhD, Technical Leader of Predictive Analytics at Ford Research and Advanced Engineering in Dearborn, Michigan. Michael and I have been exploring how big data is fostering business transformation by allowing deeper insights into more types of data and all very efficiently. This is improving processes, updating quality control and adding to customer satisfaction.

Our conversation today comes as a lead-in to Michael’s upcoming plenary presentation. He is going to be talking on January 28 in Newport Beach, California, as part of The Open Group conference.

You will hear more from Michael and other global leaders on big data, who will be gathering at this conference to talk about business transformation. So a big thank you to Michael for joining us in this fascinating discussion. I really enjoyed it, and I look forward to your presentation on the 28th.

Cavaretta: Thank you very much.

Gardner: And I would encourage our listeners and readers to attend the conference or follow more of the threads in social media from the event. Again, it’s going to be happening from January 27 to January 30 in Newport Beach, California.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator through the thought leadership interviews. Thanks again for listening, and come back next time.

1 Comment

Filed under Conference, Uncategorized

How Should we use Cloud?

By Chris Harding, The Open Group

How should we use Cloud? This is the key question at the start of 2013.

The Open Group® conferences in recent years have thrown light on “What is Cloud?” and “Should we use Cloud?” It is time to move on.

Cloud as a Distributed Processing Platform

The question is an interesting one, because the answer is not necessarily, “Use Cloud resources just as you would use in-house resources.” Of course, you can use Cloud processing and storage to replace or supplement what you have in-house, and many companies are doing just that. You can also use the Cloud as a distributed computing platform, on which a single application instance can use multiple processing and storage resources, perhaps spread across many countries.

It’s a bit like contracting a company to do a job, rather than hiring a set of people. If you hire a set of people, you have to worry about who will do what when. Contract a company, and all that is taken care of. The company assembles the right people, schedules their work, finds replacements in case of sickness, and moves them on to other things when their contribution is complete.

This doesn’t only make things easier, it also enables you to tackle bigger jobs. Big Data is the latest technical phenomenon. Big Data can be processed effectively by parceling the work out to multiple computers. Cloud providers are beginning to make the tools to do this available, using distributed file systems and map-reduce. We do not yet have, “Distributed Processing as a Service” – but that will surely come.
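The map-reduce pattern mentioned above can be sketched in a few lines of single-process Python. A real framework such as Hadoop runs the same phases across many machines over a distributed file system; this toy word count only shows the shape of the work being parceled out:

```python
from collections import defaultdict
from itertools import chain

# Toy map-reduce word count. In a real framework, each phase runs on
# many machines over a distributed file system; here each phase is a
# plain function, purely to illustrate the pattern.

def map_phase(document):
    # Emit (word, 1) pairs for one input split.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(mapped_pairs):
    # Group intermediate values by key, as the framework would before
    # handing each key's values to a reducer.
    groups = defaultdict(list)
    for key, value in chain.from_iterable(mapped_pairs):
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Combine each key's values into a final count.
    return {word: sum(counts) for word, counts in groups.items()}

documents = ["big data in the cloud", "the cloud as a platform"]
counts = reduce_phase(shuffle(map(map_phase, documents)))
print(counts["the"], counts["cloud"])
```

Because the map calls are independent and the reduce calls are independent per key, a provider can spread both phases across as many machines as the data demands.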

Distributed Computing at the Conference

Big Data is the main theme of the Newport Beach conference. The plenary sessions have keynote presentations on Big Data, including the crucial aspect of security, and there is a Big Data track that explores in depth its use in Enterprise Architecture.

There are also Cloud tracks that explore the business aspects of using Cloud and the use of Cloud in Enterprise Architecture, including a session on its use for Big Data.

Service orientation is generally accepted as a sound underlying principle for systems using both Cloud and in-house resources. The Service Oriented Architecture (SOA) movement focused initially on its application within the enterprise. We are now looking to apply it to distributed systems of all kinds. This may require changes to specific technology and interfaces, but not to the fundamental SOA approach. The Distributed Services Architecture track contains presentations on the theory and practice of SOA.

Distributed Computing Work in The Open Group

Many of the conference presentations are based on work done by Open Group members in the Cloud Computing, SOA and Semantic Interoperability Work Groups, and in the Architecture, Security and Jericho Forums. The Open Group enables people to come together to develop standards and best practices for the benefit of the architecture community. We have active Work Groups and Forums working on artifacts such as a Cloud Computing Reference Architecture, a Cloud Portability and Interoperability Guide, and a Guide to the use of TOGAF® framework in Cloud Ecosystems.

The Open Group Conference in Newport Beach

Our conferences provide an opportunity for members and non-members to discuss ideas together. This happens not only in presentations and workshops, but also in informal discussions during breaks and after the conference sessions. These discussions benefit future work at The Open Group. They also benefit the participants directly, enabling them to bring to their enterprises ideas that they have sounded out with their peers. People from other companies can often bring new perspectives.

Most enterprises now know what Cloud is. Many have identified specific opportunities where they will use it. The challenge now for enterprise architects is determining how best to do this, either by replacing in-house systems, or by using the Cloud’s potential for distributed processing. This is the question for discussion at The Open Group Conference in Newport Beach. I’m looking forward to an interesting conference!

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Filed under Cloud, Conference

The Open Group Photo Contest: Document the Magic at the Newport Beach Conference!

By The Open Group Conference Team

It’s that time again! The Open Group is busily preparing for the Newport Beach Conference, taking place Jan. 28-31, 2013. As you begin packing, charge up your smartphones and bring your digital cameras: We’ll be hosting The Open Group Photo Contest once again! The prize is a free pass to attend any one of the Open Group conferences in 2013!

The contest is open to all Newport Beach Conference attendees. Here are the details for those of you who have yet to participate or need a refresher on our guidelines.

The categories will include:

  • The Real O.C. Award – any photo taken in or around Newport Beach.
  • The Newport Beach Conference Award – any photo taken during the conference. This includes photos of keynote speakers, candid photos of Open Group members, group sessions, etc.

Participants can submit photos via Twitter using the hashtag #ogPhoto, or via email to photo@opengroup.org.  Please include your full name and the photo’s category upon submission. The submission period will end on Friday, February 8 at 5:00 p.m. PT, with the winner to be announced the following week.

All photos will be uploaded to The Open Group’s Facebook page. Facebook members can vote by “liking” a photo; photos with the most “likes” in each category will win the contest. Photos will be uploaded in real-time, so the sooner you submit a photo, the more time members will have to vote on it.

Below are previous photo contest winners from the Barcelona Conference in 2012:

Modernista Award: For best photo taken in or around Barcelona

Winner: Craig Heath, Franklin Heath


“Barcelona Sky from the Fundació Joan Miró”

Best of Barcelona Conference Award:  For any photo taken during conference activities

Winner: Leonardo Ramirez, DuxDiligens


A flamenco dancer at the Tuesday night event


Filed under Conference

2013 Open Group Predictions, Vol. 2

By The Open Group

Continuing on the theme of predictions, here are a few more, which focus on global IT trends, business architecture, OTTF and Open Group events in 2013.

Global Enterprise Architecture

By Chris Forde, Vice President of Enterprise Architecture and Membership Capabilities

Cloud is no longer a bleeding edge technology – most organizations are already well on their way to deploying cloud technology.  However, Cloud implementations are resurrecting a perennial problem for organizations—integration. Now that Cloud infrastructures are being deployed, organizations are having trouble integrating different systems, especially with systems hosted by third parties outside their organization. What will happen when two, three or four technical delivery systems are hosted on AND off premise? This presents a looming integration problem.

As we see more and more organizations buying into cloud infrastructures, we’ll see an increase in cross-platform integration architectures globally in 2013. The role of the enterprise architect will become more complex. Architects must not only ensure that systems are integrated properly; they also need to figure out how to integrate outsourced teams and services and determine responsibility across all systems. Additionally, outsourcing and integration will lead to increased focus on security in the coming year, especially in the healthcare and financial sectors. When so many people are involved, and responsibility is shared or lost in the process, gaping holes can go unnoticed. As data is increasingly shared between organizations and current trends escalate, security will also become more and more of a concern. Integration may yield great rewards architecturally, but it also means greater exposure to vulnerabilities outside of your firewall.

Within the Architecture Forum, we will be working on improvements to the TOGAF® standard throughout 2013, as well as an effort to continue to harmonize the TOGAF specification with the ArchiMate® modelling language.  The Forum also expects to publish a whitepaper on application portfolio management in the new year, as well as be involved in the upcoming Cloud Reference Architecture.

In China, The Open Group is progressing well. In 2013, we’ll continue translating The Open Group website, books and whitepapers from English to Chinese. Partnerships and Open CA certification will remain in the forefront of global priorities, as well as enrolling TOGAF trainers throughout Asia Pacific as Open Group members. There are a lot of exciting developments arising, and we will keep you updated as we expand our footprint in China and the rest of Asia.

Open Group Events in 2013

By Patty Donovan, Vice President of Membership and Events

In 2013, the biggest change for us will be our quarterly summit. The focus will shift toward an emphasis on verticals. This new focus will debut at our April event in Sydney, where the vertical themes include Mining, Government, and Finance. Additional vertical themes we plan to cover throughout the year include Healthcare, Transportation, and Retail, to name a few. We will also continue to increase the number of our popular Livestream sessions, as we have seen an extremely positive reaction to them, as well as to all of our On-Demand sessions – where you can listen to best-selling authors and industry leaders who participated as keynote and track speakers throughout the year.

Regarding social media, we made big strides in 2012 and will continue to make this a primary focus for The Open Group. If you haven’t already, please “like” us on Facebook, follow us on Twitter, join the chat (#ogchat) during one of our Security-focused Tweet Jams, and join our LinkedIn Group. And if you have the time, we’d love for you to contribute to The Open Group blog.

We’re always open to new suggestions, so if you have a creative idea on how we can improve your membership, Open Group events, webinars, or podcasts, please let me know! Also, please be sure to attend the upcoming Open Group Conference in Newport Beach, Calif., taking place January 28-31. The conference will address Big Data.

Business Architecture

By Steve Philp, Marketing Director for Open CA and Open CITS

Business Architecture is still a relatively new discipline, but in 2013 I think it will continue to grow in prominence and visibility from an executive perspective. C-Level decision makers are not just looking at operational efficiency initiatives and cost reduction programs to grow their future revenue streams; they are also looking at market strategy and opportunity analysis.

Business Architects are extremely valuable to an organization when they understand market and technology trends in a particular sector. They can then work with business leaders to develop strategies based on the capabilities and positioning of the company to increase revenue, enhance their market position and improve customer loyalty.

Senior management recognizes that technology also plays a crucial role in how organizations can achieve their business goals. A major role of the Business Architect is to help merge technology with business processes to help facilitate this business transformation.

There are a number of key technology areas for 2013 where Business Architects will be called upon to engage with the business such as Cloud Computing, Big Data and social networking. Therefore, the need to have competent Business Architects is a high priority in both the developed and emerging markets and the demand for Business Architects currently exceeds the supply. There are some training and certification programs available based on a body of knowledge, but how do you establish who is a practicing Business Architect if you are looking to recruit?

The Open Group is trying to address this issue and has incorporated a Business Architecture stream into The Open Group Certified Architect (Open CA) program. There has already been significant interest in this stream from both organizations and practitioners alike. This is because Open CA is a skills- and experience-based program that recognizes, at different levels, those individuals who are actually performing in a Business Architecture role. You must complete a candidate application package and be interviewed by your peers. Achieving certification demonstrates your competency as a Business Architect and therefore will stand you in good stead for both next year and beyond.

You can view the conformance criteria for the Open CA Business Architecture stream at https://www2.opengroup.org/ogsys/catalog/X120.

Trusted Technology

By Sally Long, Director of Consortia Services

The interdependency of all countries on global technology providers, and technology providers’ dependencies on component suppliers around the world, is more evident than ever before. The need to work together in a vendor-neutral, country-neutral environment to assure there are standards for securing technology development and supply-chain operations will become increasingly apparent in 2013. Securing the global supply chain cannot be done in a vacuum by a few providers or a few governments; it must be achieved by all governments, providers, component suppliers and integrators working together, and it must be done through open standards and accreditation programs that demonstrate conformance to those standards and are available to everyone.

The Open Group’s Trusted Technology Forum is providing that open, vendor and country-neutral environment, where suppliers from all countries and governments from around the world can work together in a trusted collaborative environment, to create a standard and an accreditation program for securing the global supply chain. The Open Trusted Technology Provider Standard (O-TTPS) Snapshot (Draft) was published in March of 2012 and is the basis for our 2013 predictions.

We predict that in 2013:

  • Version 1.0 of the O-TTPS (Standard) will be published.
  • Version 1.0 will be submitted to the ISO PAS process, and will likely become part of the ISO/IEC 27036 standard, where Part 5 of that ISO standard is already reserved for the O-TTPS work.
  • An O-TTPS Accreditation Program – open to all providers, component suppliers, and integrators – will be launched.
  • The Forum will continue the trend of increased member participation from governments and suppliers around the world.


Filed under Business Architecture, Conference, Enterprise Architecture, O-TTF, OTTF

The Open Group Newport Beach Conference – Early Bird Registration Ends January 4

By The Open Group Conference Team

The Open Group is busy gearing up for the Newport Beach Conference. Taking place January 28-31, 2013, the conference theme is “Big Data – The Transformation We Need to Embrace Today” and will bring together leading minds in technology to discuss the challenges and solutions facing Enterprise Architecture around the growth of Big Data. Register today!

Information is power, and we stand at a time when 90% of the data in the world today was generated in the last two years alone. Despite the sheer enormity of the task, off-the-shelf hardware, open-source frameworks, and the processing capacity of the Cloud mean that Big Data processing is within the cost-effective grasp of the average business. Organizations can now initiate Big Data projects without significant investment in IT infrastructure.

In addition to tutorial sessions on TOGAF® and ArchiMate®, the conference offers roughly 60 sessions on a variety of topics, including:

  • The ways that Cloud Computing is transforming the possibilities for collecting, storing, and processing big data.
  • How to contend with Big Data in your Enterprise?
  • How does Big Data enable your Business Architecture?
  • What does the Big Data revolution mean for the Enterprise Architect?
  • Real-time analysis of Big Data in the Cloud.
  • Security challenges in the world of outsourced data.
  • What is an architectural view of Security for the Cloud?

Plenary speakers include:

  • Christian Verstraete, Chief Technologist – Cloud Strategy, HP
  • Mary Ann Mezzapelle, Strategist – Security Services, HP
  • Michael Cavaretta, Ph.D, Technical Leader, Predictive Analytics / Data Mining Research and Advanced Engineering, Ford Motor Company
  • Adrian Lane, Analyst and Chief Technical Officer, Securosis
  • David Potter, Chief Technical Officer, Promise Innovation Oy
  • Ron Schuldt, Senior Partner, UDEF-IT, LLC

A full conference agenda is available here. Tracks include:

  • Architecting Big Data
  • Big Data and Cloud Security
  • Data Architecture and Big Data
  • Business Architecture
  • Distributed Services Architecture
  • EA and Disruptive Technologies
  • Architecting the Cloud
  • Cloud Computing for Business

Early Bird Registration

Early Bird registration for The Open Group Conference in Newport Beach ends January 4. Register now and save! For more information or to register: http://www.opengroup.org/event/open-group-newport-beach-2013/reg

Upcoming Conference Submission Deadlines

In addition to the Early Bird registration deadline to attend the Newport Beach conference, there are upcoming deadlines for speaker proposal submissions to Open Group conferences in Sydney, Philadelphia and London. To submit a proposal to speak, click here.

  • Sydney (April 15-17) – Industry focus: Finance, Defense, Mining – Submission deadline: January 18, 2013
  • Philadelphia (July 15-17) – Industry focus: Healthcare, Finance, Defense – Submission deadline: April 5, 2013
  • London (October 21-23) – Industry focus: Finance, Government, Healthcare – Submission deadline: July 8, 2013

We expect space on the agendas of these events to be at a premium, so it is important for proposals to be submitted as early as possible. Proposals received after the deadline dates will still be considered, if space is available; if not, they may be carried over to a future conference. Priority will be given to proposals received by the deadline dates and to proposals that include an end-user organization, at least as a co-presenter.


Filed under Conference