
The Open Group Panel: Internet of Things – Opportunities and Obstacles

Below is the transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data.

Listen to the podcast.

Dana Gardner: Hello, and welcome to a special BriefingsDirect thought leadership interview series coming to you in conjunction with the recent The Open Group Boston 2014 conference, held July 21 in Boston.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow.

We’re going to now specifically delve into the Internet of Things with a panel of experts. The conference has examined how Open Platform 3.0™ leverages the combined impacts of cloud, big data, mobile, and social. But to each of these now we can add a new cresting wave of complexity and scale as we consider the rapid explosion of new devices, sensors, and myriad endpoints that will be connected using internet protocols, standards and architectural frameworks.

This means more data, more cloud connectivity and management, and an additional tier of “things” that are going to be part of the mobile edge — and extending that mobile edge ever deeper into even our own bodies.

When we think about inputs to these social networks — those are going to increase as well. Not only will people be tweeting, your devices may very well tweet, too, using social networks to communicate. Perhaps your toaster will soon be sending you a tweet about your English muffins being ready each morning.

The Internet of Things is more than the “things” – it means a higher order of software platforms. For example, if we are going to operate data centers with new dexterity thanks to software-defined networking (SDN) and storage (SDS) — indeed the entire data center being software-defined (SDDC) — then why not a software-defined automobile, or factory floor, or hospital operating room — or even a software-defined city block or neighborhood?

And so how does this all actually work? Does it easily spin out of control? Or does it remain under proper management and governance? Do we have unknown unknowns about what to expect with this new level of complexity, scale, and volume of input devices?

Will architectures arise that support the scale involved, ensure interoperability, and provide governance for the Internet of Things — rather than just letting each type of device do its own thing?

To help answer some of these questions, The Open Group assembled a distinguished panel to explore the practical implications and limits of the Internet of Things. So please join me in welcoming Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC, and a primary representative to the Industrial Internet Consortium; Penelope Gordon, Emerging Technology Strategist at 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM, and Dave Lounsbury, Chief Technical Officer at The Open Group.

Jean-Francois, we have heard about this notion of “cities as platforms,” and I think the public sector might offer us some opportunity to look at what is going to happen with the Internet of Things, and then extrapolate from that to understand what might happen in the private sector.

Hypothetically, the public sector has a lot to gain. It doesn’t face the same constraints of commercial market development, the profit motive, and that sort of thing. Tell us a little bit about what the opportunity is in the public sector for smart cities.

Jean-Francois Barsoum: It’s immense. The first thing I want to do is link to something that Marshall Van Alstyne (Professor at Boston University and Researcher at MIT) had talked about, because I was thinking about his way of approaching platforms and thinking about how cities represent an example of that.

You don’t have customers; you have citizens. Cities are starting to see themselves as platforms, as ways to communicate with their customers, their citizens, to get information from them and to communicate back to them. But the complexity with cities is that, as good a platform as they could be, they’re relatively rigid. They’re legislated into existence, and what they’re responsible for is written into law. It’s not really a market.

Chris Harding (Forum Director of The Open Group Open Platform 3.0) earlier mentioned, for example, water and traffic management. Cities could benefit greatly by managing traffic a lot better.

Part of the issue is that you might have a state or provincial government that looks after highways. You might have the central part of the city that looks after arterial networks. You might have a borough that would look after residential streets, and these different platforms end up not talking to each other.

They gather their own data. They put in their own widgets to collect information that concerns them, but do not necessarily share with their neighbor. One of the conditions that Marshall said would favor the emergence of a platform had to do with how much overlap there would be in your constituents and your customers. In this case, there’s perfect overlap. It’s the same citizen, but they have to carry an Android and an iPhone, despite the fact it is not the best way of dealing with the situation.

The complexities are commensurate with the amount of benefit you could get if you could solve them.

Gardner: So more interoperability issues?

Barsoum: Yes.

More hurdles

Gardner: More hurdles, and when you say commensurate, you’re saying that the opportunity is huge, but the hurdles are huge and we’re not quite sure how this is going to unfold.

Barsoum: That’s right.

Gardner: Let’s go to an area where the opportunity outstrips the challenge: manufacturing. Said, what is the opportunity for the software-defined factory floor to realize huge efficiencies and apply algorithmic benefits to how management occurs across the domains of supply chain, distribution, and logistics? It seems to me that this is a no-brainer. It’s such an opportunity that the solution must be found.

Said Tabet: When it comes to manufacturing, the opportunities are probably much bigger. It’s an area where a lot of progress has already been made and work is still going on. There are two ways to look at it.

One is the internal side of it, where you have improvements of business processes. For example, similar to what Jean-Francois said, in a lot of the larger companies that have factories all around the world, you’ll see such improvements at the individual factory level. You still have those silos at that level.

Now with this new technology, with this connectedness, those improvements are going to be made across factories, and there’s a learning aspect to it in terms of trying to manage that data so that, in fact, they do a better job. We still have to deal with interoperability, of course, and additional issues that could be jurisdictional, etc.

However, there is that learning that allows them to improve their processes across factories. Maintenance is one of them, as well as creating new products, and connecting better with their customers. We can see a lot of examples in the marketplace. I won’t mention names, but there are lots of them out there with the large manufacturers.

Gardner: We’ve had just-in-time manufacturing and lean processes for quite some time, trying to compress the supply chain and distribution networks, but these haven’t necessarily been done through public networks, the internet, or standardized approaches.

But if we’re to benefit, we’re going to need to be able to be platform companies, not just product companies. How do you go from being a proprietary set of manufacturing protocols and approaches to this wider, standardized interoperability architecture?

Tabet: That’s a very good question, because now we’re talking about that connection to the customer. Take an airline and a jet-engine manufacturer, for example. The engine’s activity is monitored during the whole flight, and the moment the plane lands, that data is made available. There could be improvements and maybe solutions available as soon as the plane lands.

Interoperability

That requires interoperability. It requires Platform 3.0 for example. If you don’t have open platforms, then you’ll deal with the same hurdles in terms of proprietary technologies and integration in a silo-based manner.

Gardner: Penelope, you’ve been writing about the obstacles to decision-making that might become apparent as big data becomes more prolific and people try to capture all the data about all the processes and analyze it. That’s a little bit of a departure from the way we’ve made decisions in organizations, public and private, in the past.

Of course, one of the bigger tenets of Internet of Things is all this great data that will be available to us from so many different points. Is there a conundrum of some sort? Is there an unknown obstacle for how we, as organizations and individuals, can deal with that data? Is this going to be chaos, or is this going to be all the promises many organizations have led us to believe around big data in the Internet of Things?

Penelope Gordon: It’s something that has just been accelerated. This is not a new problem in terms of decision-making styles not matching the inputs that are being provided into the decision-making process.

Former US President Bill Clinton was known for delaying making decisions. He’s a head-type decision-maker, and so he would always want more data and more data. That just gets into a never-ending loop, because as people collect data for him, there is always more data that you can collect, particularly on the quantitative side. Whereas, if it is distilled down and presented very succinctly and then balanced with the qualitative, that allows intuition to come to the fore, and you can make optimal decisions in that fashion.

Conversely, if you have someone who is a heart-type or gut-type decision-maker and you present them with a lot of data, their first response is to ignore the data. It’s just too much for them to take in. Then you end up going completely with whatever you feel, or whatever your instinct tells you, is the correct decision. If you’re talking about strategic decisions, where you’re making a decision that’s going to influence your direction five years down the road, that could be a very wrong decision to make, a very expensive decision, and as you said, it could be chaos.

It just brings to mind Dr. Seuss’s The Cat in the Hat, with Thing One and Thing Two. So, as we talk about the Internet of Things, we need to keep in mind that we need some sort of structure to tie this back to, and an understanding of what we are trying to do with these things.

Gardner: Openness is important, and governance is essential. Then, we can start moving toward higher-order business platform benefits. But, so far, our panel has been a little bit cynical. We’ve heard that the opportunity and the challenges are commensurate in the public sector and that in manufacturing we’re moving into a whole new area of interoperability, when we think about reaching out to customers and having a boundary that is managed between internal processes and external communications.

And we’ve heard that an overload of data could become a very serious problem and that we might not get benefits from big data through the Internet of Things, but perhaps even stumble and have less quality of decisions.

So Dave Lounsbury of The Open Group, will the same level of standardization work? Do we need a new type of standards approach, a different type of framework, or is this a natural course following what we have done in the past?

Different level

Dave Lounsbury: We need to look at the problem at a different level than we institutionally think about an interoperability problem. The Internet of Things is riding two very powerful waves, one of which is Moore’s Law: these sensors, actuators, and network components get smaller and smaller. Now we can put Ethernet in a light switch, a tag, or something like that.

The other is Metcalfe’s Law, which says that the value of all this connectivity goes up with the square of the number of connected points. That applies to the connection of the things, but more importantly to the connection of the data.

The trouble is, as we have said, that there’s so much data here. The question is how you manage it and how you keep control over it so that you actually get business value from it. That’s going to require this new concept of a platform, not just to connect the data, but to aggregate it, correlate it, as you said, and present it in ways that let people make decisions however they want.

Also, because of the raw volume, we have to start thinking about machine agency. We have to think about the system actually making the routine decisions or giving advice to the humans who are actually doing it. Those are important parts of the solution beyond just a simple “How do we connect all the stuff together?”

Gardner: We might need a higher order of intelligence, now that we have reached this border of what we can do with our conventional approaches to data, information, and process.

Thinking about where this works best first in order to then understand where it might end up later, I was intrigued again this morning by Professor Van Alstyne. He mentioned that in healthcare, we should expect major battles, that there is a turf element to this, that the organization, entity or even commercial corporation that controls and manages certain types of information and access to that information might have some very serious platform benefits.

The openness element now is something to look at, and I’ll come back to the public sector. Is there a degree of openness that we could legislate or regulate to require enough control to prevent the next generation of lock-in, which might not be lock-in to a platform but to the access to data, information, and endpoints? Where in the public sector might we look for a leadership position to establish needed openness, and not just interoperability?

Barsoum: I’m not even sure where to start answering that question. To take healthcare as an example, I certainly didn’t write the bible on healthcare IT systems and if someone did write that, I think they really need to publish it quickly.

We have a single-payer system in Canada, and you would think that would be relatively easy to manage. There is one entity that manages paying the doctors, and everybody gets covered the same way. Therefore, the data should be easily shared among all the players and it should be easy for you to go from your doctor, to your oncologist, to whomever, and maybe to your pharmacy, so that everybody has access to this same information.

We don’t have that and we’re nowhere near having that. If I look to other areas in the public sector, areas where we’re beginning to solve the problem are ones where we face a crisis, and so we need to address that crisis rapidly.

Possibility of improvement

In the transportation infrastructure, we’re getting to that point where the infrastructure we have just doesn’t meet the needs. There’s a constraint in terms of money, and we can’t put much more money into the structure. Then, there are new technologies that are coming in. Chris had talked about driverless cars earlier. They’re essentially throwing a wrench into the works or may be offering the possibility of improvement.

On any given piece of infrastructure, you could fit twice as many driverless cars as cars with human drivers in them. Given that set of circumstances, the governments are going to find they have no choice but to share data in order to be able to manage those. Are there cases where we could go ahead of a crisis in order to manage it? I certainly hope so.

Gardner: How about allowing some of the natural forces of marketplaces, behavior, groups, maybe even chaos theory, where if sufficient openness is maintained there will be some kind of a pattern that will emerge? We need to let this go through its paces, but if we have artificial barriers, that might be thwarted or power could go to places that we would regret later.

Barsoum: I agree. People often focus on structure: the governance doesn’t work, so we should find some way to change the governance of transportation. London has done a very good job of that. They’ve created something called Transport for London that manages everything related to transportation. It doesn’t matter if it’s taxis, bicycles, pedestrians, boats, cargo trains, or whatever; they manage it.

You could do that, but it requires a lot of political effort. The other way to go about doing it is saying, “I’m not going to mess with the structures. I’m just going to require you to open and share all your data.” So, you’re creating a new environment where the governance, the structures, don’t really matter so much anymore. Everybody shares the same data.

Gardner: Said, to the private sector example of manufacturing, you still want to have a global fabric of manufacturing capabilities. This is requiring many partners to work in concert, but with a vast new amount of data and new potential for efficiency.

How do you expect that openness will emerge in the manufacturing sector? How will interoperability play when you don’t have to wait for legislation, but you do need to have cooperation and openness nonetheless?

Tabet: It comes back to the question you asked Dave about standards. I’ll just give you some examples. For example, in the automotive industry, there have been some activities in Europe around specific standards for communication.

The Europeans came to the US and started to have discussions, and the Japanese have interest, as well as the Chinese. That shows that, because there is a common interest in creating these new models from a business standpoint, these challenges have to be dealt with together.

Managing complexity

When we talk about the amounts of data, what we now call big data, and what we are going to see in about five years or so, you can’t even imagine it. How do we manage that complexity, which is multidimensional? We talked about this sort of platform and, beyond that, the capability and the data that will be there. From that point of view, openness is the only way to go.

There’s no way that we can stay away from it and still be able to work in silos in that new environment. There are lots of things that we take for granted today. I invite some of you to go back and read articles from 10 years ago that try to predict the future in technology in the 21st century. Look at your smart phones. Adoption is there, because the business models are there, and we can see that progress moving forward.

Collaboration is a must, because the problem is multidimensional. It’s not just manufacturing, like jet engines, car manufacturers, or agriculture, where you have very specific areas. They really have to work with their customers and the customers of their customers.


Gardner: Dave, I have a question for both you and Penelope. I’ve seen some instances where there has been a cooperative endeavor for accessing data and then making it available as a service, whether it’s an API, a data set, access to a data library, or even a set of analytics applications. The Ocean Observatories Initiative is one example: it has created a sensor network across the oceans and makes the resulting data available.

Do you think we should expect to see an intermediary organization level that gets between the sensors and the consumers, or even the controllers of the processes? Is there a model inherent in that that we might look to — something like that cooperative data structure that in some ways creates structure and governance, but also allows for freedom? It’s sort of an entity that we don’t have yet in many organizations or many ecosystems, and that needs to evolve.

Lounsbury: We’re already seeing that in the marketplace. If you look at the commercial and social Internet of Things area, we’re starting to see intermediaries or brokers cropping up that will connect the silo of my Android ecosystem to the ecosystem of package tracking, or something like that. There are dozens and dozens of these cropping up.

In fact, you now see APIs even into a silo of what you might consider a proprietary system, and what people are doing is building a layer on top of those APIs that intermediates the data.

This is happening on a point-to-point basis now, but you can easily see the path forward. That’s going to expand to large amounts of data that people will share through a third party. I can see this being a whole new emerging market much as what Google did for search. You could see that happening for the Internet of Things.
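As a rough illustration of that broker pattern, here is a minimal Python sketch of an intermediary that aggregates readings from two hypothetical device silos behind one normalized interface; the class names and fields are invented for illustration, not any vendor’s actual API.

```python
# Illustrative sketch only: a small broker that intermediates between two
# hypothetical device "silos". The classes and field names are invented and
# do not correspond to any vendor's real API.

class ThermostatSilo:
    def read(self):
        # Stand-in for a call to a proprietary thermostat cloud service.
        return {"device": "thermostat-01", "temp_f": 71.5}


class PackageTrackerSilo:
    def read(self):
        # Stand-in for a call to a separate package-tracking service.
        return {"device": "tracker-17", "status": "in_transit"}


class Broker:
    """Aggregates readings from separate silos into one normalized stream."""

    def __init__(self, silos):
        self.silos = silos

    def snapshot(self):
        events = []
        for silo in self.silos:
            raw = silo.read()
            # Normalize each silo's schema into a common shape for consumers.
            events.append({"source": raw.pop("device"), "payload": raw})
        return events


if __name__ == "__main__":
    broker = Broker([ThermostatSilo(), PackageTrackerSilo()])
    for event in broker.snapshot():
        print(event)
```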

Gardner: Penelope, do you have any thoughts about how that would work? Is there a mutually assured benefit that would allow people to want to participate and cooperate with that third entity? Should they have governance and rules about good practices, best practices for that intermediary organization? Any thoughts about how data can be managed in this sort of hierarchical model?

Nothing new

Gordon: First, I’ll contradict it a little bit. To me, a lot of this is nothing new, particularly coming from a marketing strategy perspective, with business intelligence (BI). Having various types of intermediaries, who are not only collecting the data, but then doing what we call data hygiene, synthesis, and even correlation of the data has been around for a long time.

It was interesting, when I looked at a recent listing of the big-data companies, that some notable companies were excluded from that list — companies like Nielsen. Nielsen’s been collecting data for a long time. Harte-Hanks is another one that collects a tremendous amount of information and sells it to companies.

That leads into another part of this. We’re seeing an increasing amount of opportunity that involves taking public sources of data and then providing synthesis on top of them. What remains to be seen is how much of the output of that is going to be provided for “free”, as opposed to “fee”. We’re going to see a lot more companies figuring out creative ways of extracting more value out of data and then charging directly for that, rather than using it as an indirect way of generating traffic.

Gardner: We’ve seen examples of how this has been in place. Does it scale, and will the governance, or lack of governance, that might be in the market now sustain us through the transition into Platform 3.0 and the Internet of Things?

Gordon: That aspect leads on to “you get what you pay for”. If you’re using a free source of data, you don’t have any guarantee that it comes from authoritative sources. Often, what we’re getting now is something somebody put in a blog post, which then gets referenced elsewhere, but there was nothing to go back to. It’s a shaky supply chain for data.

You need to think about the data supply and that is where the governance comes in. Having standards is going to increasingly become important, unless we really address a lot of the data illiteracy that we have. A lot of people do not understand how to analyze data.

One aspect of that is that a lot of people expect we have to do full population surveys, as opposed to representative sampling, which gives much more accurate and much more cost-effective collection of data. That’s just one example, and we do need a lot more in governance and standards.
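As a simple, hedged illustration of that sampling point, the Python sketch below uses synthetic numbers, not real survey data, to show a small random sample recovering essentially the same average that a full census would produce.

```python
# Illustration only, using synthetic data: estimate a population average from
# a representative random sample instead of surveying every member.
import random

random.seed(42)

# Pretend these are one million individual usage measurements.
population = [random.gauss(100, 15) for _ in range(1_000_000)]
census_mean = sum(population) / len(population)

# Survey only 1,000 randomly chosen members (0.1% of the population).
sample = random.sample(population, 1_000)
sample_mean = sum(sample) / len(sample)

print(f"Full census mean : {census_mean:.2f}")
print(f"Sample estimate  : {sample_mean:.2f}")
# The sample estimate typically lands within a fraction of a point of the
# census figure, at a tiny fraction of the collection cost.
```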

Gardner: What would you like to see changed most in order for the benefits and rewards of the Internet of Things to develop and overcome the drawbacks, the risks, the downside? What, in your opinion, would you like to see happen to make this a positive, rapid outcome? Let’s start with you Jean-Francois.

Barsoum: There are things that I have seen cities start to do now. There are a couple of examples: Philadelphia is one, and Barcelona does this too. Rather than do the typical request for proposal (RFP), where they say, “This is the kind of solution we’re looking for, and here are our parameters. Can you tell us how much it is going to cost to build?”, they come to you with the problem and say, “Here is the problem I want to fix. Here are my priorities, and you’re at liberty to decide how best to fix the problem, but tell us how much that would cost.”

If you do that and you combine it with access to the public data that is available — if public sector opens up its data — you end up with a very powerful combination that liberates a lot of creativity. You can create a lot of new business models. We need to see much more of that. That’s where I would start.

More education

Tabet: I agree with Jean-Francois on that. What I’d like to add is that I think we need to push the relation a little further. We need more education, to your point earlier, around the data and the capabilities.

We need these platforms that we can leverage a little bit further with the analytics, with machine learning, and with all of these capabilities that are out there. We have to also remember, when we talk about the Internet of Things, it is things talking to each other.

So it is not just human-to-machine communication. Machine-to-machine automation will go further than that, and we need more innovation and more work in this area, particularly more activity from governments. We’ve seen some, but it is a little bit frail from that point of view right now.

Gardner: Dave Lounsbury, thoughts about what needs to happen in order to keep this on the tracks?

Lounsbury: We’ve touched on a lot of them already. Thank you for mentioning the machine-to-machine part, because there are plenty of projections that show that it’s going to be the dominant form of Internet communication, probably within the next four years.

So we need to start thinking about that and moving beyond our traditional models of humans talking through interfaces to a set of services. We need to identify the building blocks of capability that you need to manage, not only the information flow and the skilled person who is going to produce it, but also the machine-to-machine interactions.

Gordon: I’d like to see not so much focus on data management, but focus on what the data is managing and helping us to do. Focusing on the machine-to-machine communication and the devices is great, but the focus should not be on the devices or on the machines; it should be on what they can accomplish by communicating, what you can accomplish with the devices, and then reverse-engineer from that.

Gardner: Let’s go to some questions from the audience. The first one asks about the higher order of intelligence we mentioned earlier. It could be artificial intelligence, perhaps, but they ask whether that’s really the issue. Is the nature of the data substantially different, or are we just creating more of the same, so that it is a storage, plumbing, and processing problem? What, if anything, are we lacking in our current analytics capabilities that is holding us back from exploiting the Internet of Things?

Gordon: I’ve definitely seen that. That has a lot to do with not setting your decision objectives and your decision criteria ahead of time so that you end up collecting a whole bunch of data, and the important data gets lost in the mix. There is a term “data smog.”

Most important

The solution is to figure out, before you go collecting data, what data is most important to you. If you can’t collect certain kinds of data that are important to you directly, then think about how to indirectly collect that data and how to get proxies. But don’t try to go and collect all the data for that. Narrow in on what is going to be most important and most representative of what you’re trying to accomplish.

Gardner: Does anyone want to add to this idea of understanding what current analytics capabilities are lacking, if we have to adopt and absorb the Internet of Things?

Barsoum: There is one element around projection into the future. We’ve been very good at analyzing historical information to understand what’s been happening in the past. We need to become better at projecting into the future, and obviously we’ve been doing that for some time already.

But so many variables are changing. Just to take the driverless car as an example. We’ve been collecting data from loop detectors, radar detectors, and even Bluetooth antennas to understand how traffic moves in the city. But we need to think harder about what that means and how we understand the city of tomorrow is going to work. That requires more thinking about the data, a little bit like what Penelope mentioned, how we interpret that, and how we push that out into the future.

Lounsbury: I have to agree with both. It’s not about statistics. We can use historical data. It helps with a lot of things, but one of the major issues we still deal with today is the question of semantics, the meaning of the data. This goes back to your point, Penelope, around the relevance and the context of that information – how you get what you need when you need it, so you can make the right decisions.

Gardner: Our last question from the audience goes back to Jean-Francois’s comments about the Canadian healthcare system. I imagine it applies to almost any healthcare system around the world. But it asks why interoperability is so difficult to achieve, when we have the power of the purse, that is the market. We also supposedly have the power of the legislation and regulation. You would think between one or the other or both that interoperability, because the stakes are so high, would happen. What’s holding it up?

Barsoum: There are a couple of reasons. One, in the particular case of healthcare, is privacy, but that is one you could see elsewhere as well. As soon as you talk about interoperability in the health sector, people start wondering where their data is going to go, how accessible it is going to be, and to whom.

You need to put a certain number of controls on top of that. What is happening in parallel is that you have people who own some data, who believe they derive some power from owning that data, and that they will lose that power if they share it. That can come from doctors, hospitals, anywhere.

So there’s a certain amount of change management you have to get beyond. Everybody has to focus on the welfare of the patient and understand that that has to be the priority. But you also have to understand the welfare of the different stakeholders in the system and make sure that you do not forget about them, because if you forget about them, they will find some way to slow you down.

Use of an ecosystem

Lounsbury: To me, that’s a perfect example of what Marshall Van Alstyne talked about this morning. It’s the change from a focus on product to a focus on an ecosystem. Healthcare traditionally has been very focused on a doctor providing a product to a patient, or a caregiver providing a product to a patient. Now, we’re starting to see that the only way we’re able to do this is through the use of an ecosystem.

That’s a hard transition. It’s a business-model transition. I will put in a plug here for The Open Group Healthcare vertical, which is looking at that from an architecture perspective. I see that our Forum Director Jason Lee is over here. So if you want to explore that more, please see him.

Gardner: I’m afraid we will have to leave it there. We’ve been discussing the practical implications of the Internet of Things and how it is now set to add a new dimension to Open Platform 3.0 and Boundaryless Information Flow.

We’ve heard how new thinking about interoperability will be needed to extract the value and orchestrate order out of the chaos with such vast new scales of inputs and whole new categories of information.

So with that, a big thank you to our guests: Said Tabet, Chief Technology Officer for Governance, Risk and Compliance Strategy at EMC; Penelope Gordon, Emerging Technology Strategist at 1Plug Corp.; Jean-Francois Barsoum, Senior Managing Consultant for Smarter Cities, Water and Transportation at IBM; and Dave Lounsbury, Chief Technical Officer at The Open Group.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on Open Platform 3.0 and Boundaryless Information Flow at The Open Group Conference, recently held in Boston. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript.

Transcript of The Open Group podcast exploring the challenges and ramifications of the Internet of Things, as machines and sensors collect vast amounts of data. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2014. All rights reserved.


The Internet of Things is the New Media

By Dave Lounsbury, Chief Technical Officer, The Open Group

A tip of the hat to @artbourbon for pointing out the article “Principles for Open Innovation and Open Leadingship” by Peter Vander Auwera, which led to a TED Talk by Joi Ito with his “Nine Principles of the Media Lab”. Something in this presentation struck me:

“Media is plural for Medium, Medium is something in which you can express yourself. The medium was hardware, screens, robots, etc. Now the medium is society, ecosystem, journalism,… Our work looks more like social science.”

Great changes in society often go hand-in-hand with advances in communications, which in turn are tied to improvements in scale or portability of media. Think the printing press, television or even the development of paint in tubes which allowed impressionist painters to get out of the studios to paint water lilies and wheat fields.


We are seeing a similar advance in the next generation of the Internet. Traditionally, humans interact with computer systems and networks through visual media, like screens of varying sizes and printed material. However, this is changing: Sensors and actuators are shrinking in size and price, and there has been an explosion of devices, new services and applications that network these together into larger systems  to increase their value through Metcalfe’s law. We interact with the actions of these sensors not just with our eyes, but other senses as well – a simple example is the feeling of warmth as your house adjusts its temperature as you arrive home.
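As a rough, back-of-the-envelope illustration of that Metcalfe’s Law point, the short Python sketch below counts the potential pairwise links as endpoints are added; the endpoint counts are arbitrary examples.

```python
# Rough illustration of Metcalfe's Law: the number of potential pairwise
# connections grows roughly with the square of the number of endpoints.
# The endpoint counts below are arbitrary examples.

def potential_links(n: int) -> int:
    """Number of distinct pairs among n connected endpoints: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} endpoints -> {potential_links(n):>12,} potential links")
```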

These devices, and the platforms that orchestrate their interactions, are the media in which the next generation of the Internet will be painted. We call it the Internet of Things today, or maybe the Internet of Everything – but in the long run, it will just be the Internet. The expression of connectivity through sensors and devices will soon become as commonplace as social media is today.

Join the conversation! @theopengroup #ogchat

David is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, David leads the delivery of The Open Group’s proven collaboration processes for collaboration and certification both within the organization and in support of third-party consortia.

David holds a degree in Electrical Engineering from Worcester Polytechnic Institute, and is holder of three U.S. patents.


The Digital Ecosystem Paradox – Learning to Move to Better Digital Design Outcomes

By Mark Skilton, Professor of Practice, Information Systems Management, Warwick Business School

Do digital technologies raise quality and improve efficiency, but at the same time drive higher costs of service, as more advanced solutions and capabilities become available, demanding higher entry investment and maintenance costs?

Many new digital technologies introduce a step change in performance that would have been cost-prohibitive in previous technology generations. But in some industries the technology cost per outcome has been steadily rising.

In the healthcare market, the rising cost per treatment of healthcare technology was highlighted in an MIT Technology Review article (1). New drugs for treating depression, left-ventricular assist devices, and implantable defibrillators may be raising the overall cost of health care; yet how do we value this if patient quality of life is improving and lives are being extended, while lower-cost drugs and vaccines may be enabling better overall patient outcomes?

In the smart city a similar story is unfolding, where governments and organizations are seeking paths to use digitization to drive improvements in jobs and productivity, better lifestyles, and environmental sustainability. While opportunities exist to reduce energy bills and improve transport and office spaces, with savings of 40% to 60% in consumption and efficiency, the complexity and cost of connecting different residential, corporate office, transport, and other living spaces require digital initiatives that are coordinated and managed (the U-city experience in South Korea (2)).

These digital paradoxes represent the digital ecosystem challenge: to maximise what these new digital technologies can do to augment objects, services, places and spaces, while taking account of the size of the addressable market that all these solutions can serve.


What we see is that technology can be both a driver of the physical and digital economy, through lowering the price per function in computer storage, compute, access and application technology, and a creator of new value; conversely, efforts to drive that new value are having different degrees of success across industries.

Creating value in the digital economy

The digital economy is at a tipping point. A growing 30% of business is shifting online to search for and engage with consumers, markets and transactions, taking account of retail, mobile and the impact on supply channels (3); 80% of transport, real estate and hotelier activity is processed through websites (4); and over 70% of companies and consumers are experiencing cyber-privacy challenges (5), (6). Meanwhile, digital media in social networks, mobile devices, sensors, and the explosion of big data and cloud computing networks are interconnecting potentially everything everywhere – amounting to a new digital “ecosystem”.

Disruptive business models across industries and new consumer innovation are increasingly built around new digital technologies such as social media, mobility, big data, cloud computing, and the emerging Internet of Things with its sensors, networks and machine intelligence (MISQ Digital Strategy Special Issue (7)).

These trends have significantly enhanced the relevance and significance of IT in its role and impact on business and market value at local, regional and global scale.

With IT budgets increasingly shifting from traditional IT towards the marketing functions and business users of these digital services, there is a growing role for these technologies to work together in new, connected ways.

Driving better digital design outcomes

In this new age, digital technologies are combining in new ways to drive new value for individuals, enterprises, communities and societies. The key is in understanding the value that each of these technologies can bring individually, and the mechanisms for creating additive value when they are used appropriately and cost-effectively to drive brand, manage cyber risk, and build consumer engagement and economic growth.


Value-in-use, value in contextualization

Each digital technology has the potential to enable better contextualization of the consumer experience and the value added by providers. Each industry market has emerging combinations of technologies that can be developed to enable focused value.

Examples of these include:

  • Social media networks: creating enhanced co-presence
  • Big data: providing uniqueness profiling, and targeting advice and preferences in context
  • Mobility: creating location-context services and awareness
  • Cloud: enabling access to resources and services
  • Sensors: creating real-time feedback and responsiveness
  • Machine intelligence: enabling insight and higher decision quality

Together these digital technologies can build generative effects that, when applied in context, can enable higher-value outcomes in digital workspaces.


Value in Contextualization

The value is not in whether these technologies, objects, consumers or providers sit inside or outside the enterprise or market. Such distinctions are out of context unless they are related to the situation and to the consumer’s needs and wants. The issue is how to apply and contextualize the user experience and the enterprise and social environment so as to make the best use of them and maximise the outcomes in a specific setting, from the role perspective.

For the medical roles of patient and clinician, the aim of digitization is to use mobile devices and wearable monitoring as efficiently and effectively as possible to raise patient outcome quality and manage health service costs. Especially in developing countries and remote areas, where infrastructure and investment costs are constraints, how can technologies reach patients and improve the quality of health at an effective price point?

This phenomenon is widespread and growing across all industry sectors: the connected automobile, with in-car entertainment and route-planning services; tele-health, which offers remote patient-care monitoring and personalized responses; smart buildings and smart cities that optimize energy consumption and work environments; and smart retail, with interactive product tags for instant customer mobile information, feedback and in-store promotions, and automated supply chains. The convergence of these technologies requires a response from all businesses.

These issues are not going to go away; the statistics from analysts describe a new era of a digital industrial economy (8). What is common is the prediction that the next twenty to fifty years will see demand for new digital technologies, and their adoption, double or triple.


Platforming and designing better digital outcomes

Developing effective digital workspaces will be fundamental to the value and use of these technologies. There will not be absolute winners and losers as a result of the digital paradox. What is at stake is how the cost and innovation of these technologies can be leveraged to fit specific outcomes.

Understanding the architecting practices will be essential in realizing the digital enterprise. Central to this is how to develop ways to contextualize digital technologies to enable this value for consumers and customers (Value and Worth – creating new markets in the digital economy (9)). Platforming will be a central IT strategy; we see it already emerging in early generations of digital marketplaces, mobile app ecosystems, and emerging cross-connecting services in health, automotive, retail and other sectors seeking to create joined-up value.

Digital technologies will enable new forms of digital workspaces to support new outcomes. By driving contextualized offers that meet and stimulate consumer behaviors and demand, a richer and more effective value experience and growth potential is possible.

The challenge ahead

The evolution of digital technologies will enable many new types of architecture and platform. How these are constructed into meaningful solutions is both the opportunity and the task ahead.

The challenge for both business and IT practitioners is to understand the practical uses and advantages, as well as the pitfalls and challenges, of these digital technologies:

  • What can be done using digital technologies to enhance customer experience and employee productivity, and to sell more products and services?
  • Where to position in a digital market, and how to create generative, reinforcing positive behavior and feedback for better market branding?
  • Who are the beneficiaries of the digital economy, and what is the impact on the roles and jobs of business and IT professionals?
  • Why do enterprises and industry marketplaces need to understand the disruptive effects of these digital technologies, and how can they leverage them for competitive advantage?
  • How to architect and design robust digital solutions that support the enterprise, its supply chain, and extended consumers, customers and providers?

References

  1. http://www.technologyreview.com/news/518876/the-costly-paradox-of-health-care-technology/
  2. http://www.kyoto-smartcity.com/result_pdf/ksce2014_hwang.pdf
  3. http://www.smartinsights.com/digital-marketing-strategy/online-retail-sales-growth/
  4. http://www.statisticbrain.com/internet-travel-hotel-booking-statistics/
  5. http://www.fastcompany.com/3019097/fast-feed/63-of-americans-70-of-milennials-are-cybercrime-victims
  6. https://www.kpmg.com/Global/en/IssuesAndInsights/ArticlesPublications/Documents/cyber-crime.pdf
  7. http://www.misq.org/contents-37-2
  8. http://www.gartner.com/newsroom/id/2602817
  9. http://www2.warwick.ac.uk/fac/sci/wmg/mediacentre/wmgnews/?newsItem=094d43a23d3fbe05013d835d6d5d05c6

 

Digital Health

As the cost of healthcare rises, the population ages, and medical advances enable people to live longer with improved quality of life, the health sector, together with governments and private industry, is increasingly using digital technologies to manage the rising costs of healthcare while improving patient survival and quality outcomes.

Digital Health Technologies

mHealth, TeleHealth and Translation-to-Bench Health services are just some of the innovative medical technology practices creating new Connected Health Digital Ecosystems.

These systems connect mobile phones, wearable health-monitoring devices, and remote emergency alerts to clinician response, and feed back into big-data research for next-generation healthcare.

The case for digital change

UN Department of Economic and Social Affairs

“World population is projected to reach 8.92 billion in 2050 and 9.22 billion in 2075. Life expectancy is expected to range from 66 to 97 years by 2100.”

OECD Organization for Economic Cooperation and Development

The cost of healthcare is 8 to 17% of GDP in developed countries. But overall healthcare spending is falling, while population growth, life expectancy, and aging are increasing.

 

Smart cities

The desire to use digital technologies to improve buildings, reduce pollution and crime, improve transport, create employment, provide better education, and offer new ways to launch business start-ups is at the core of the outcomes that drive city growth in the “Smart Cities” digital ecosystem.

Smart city digital technologies

Embedded sensors for building energy management, smart ID badges, and mobile apps for location-based advice and services that support social media communities, improved traffic planning, and citizen service response are just some of the ways digital technologies are changing the physical city into the digital metropolis hubs of tomorrow.

The case for digital change

WHO World Health Organization

“By the middle of the 21st century, the urban population will almost double globally. By 2030, 6 out of every 10 people will live in a city, and by 2050, this proportion will increase to 7 out of 10 people.”

UN Intergovernmental Panel on Climate Change (IPCC)

“In 2010, the building sector accounted for around 32% of final energy use, with energy demand projected to approximately double and CO2 emissions to increase by 50–150% by mid-century.”

IATA International Air Transport Association

“Airline Industry Forecast 2013-2017 show that airlines expect to see a 31% increase in passenger numbers between 2012 and 2017. By 2017 total passenger numbers are expected to rise to 3.91 billion—an increase of 930 million passengers over the 2.98 billion carried in 2012.”

Professor Mark Skilton, Professor of Practice in Information Systems Management, Warwick Business School, has over twenty years’ experience in Information Technology and business consulting to many of the top Fortune 1000 companies across many industry sectors, working in over 25 countries at C-suite and board level to transform their operations and IT value. Mark’s career has included CIO, CTO and Director roles for several FMCG, Telecoms, Media and Engineering organizations, and most recently Global Strategic Office roles in the big five consulting organizations, focusing on digital strategy and new multi-sourcing innovation models for the public and private sectors. He is currently a part-time Professor of Practice at Warwick Business School, UK, where he teaches outsourcing and the intervention of new digital business models and CIO excellence practices with leading industry practitioners.

Mark’s current research and industry leadership engagement interests are in digital ecosystems and the convergence of social media networks, big data, mobility, cloud computing and M2M Internet of Things to enable digital workspaces. This work has focused on defining new value models for digitizing products, workplaces, transport, and consumer and provider contextual services. He has spoken and published internationally on these subjects and is currently writing a book on the Digital Economy Series.

Since 2010 Mark has held international standards-body roles in The Open Group, serving as co-chair of Cloud Computing and leading Open Platform 3.0™ initiatives and standards publications. Mark is active in the ISO JC38 distributed-architecture standards and in the Hub-of-All-Things (HAT), a multi-disciplinary project funded by the Research Councils UK Digital Economy Programme. Mark is also active in cyber security forums at Warwick University, Ovum Security Summits and INFOSEC. He has spoken at the EU Commission on the Digital Ecosystems Agenda and is currently an EU Commission Competition Judge on Smart Outsourcing Innovation.

 

 

 

 

 


Evolving Business and Technology Toward an Open Platform 3.0™

By Dave Lounsbury, Chief Technical Officer, The Open Group

The role of IT within the business is one that constantly evolves and changes. If you’ve been in the technology industry long enough, you’ve likely had the privilege of seeing IT grow to become integral to how businesses and organizations function.

In his recent keynote “Just Exactly What Is Going On in Business and Technology?” at The Open Group London Conference in October, Andy Mulholland, former Global Chief Technology Officer at Capgemini, discussed how the role of IT has changed from being traditionally internally focused (inside the firewall, proprietary, a few massive applications, controlled by IT) to one that is increasingly externally focused (outside the firewall, open systems, lots of small applications, increasingly controlled by users). This is due to the rise of a number of disruptive forces currently affecting the industry, such as BYOD, Cloud, social media tools, Big Data, the Internet of Things and cognitive computing. As Mulholland pointed out, IT today is about how people are using technology in the front office. They are bringing their own devices, they are using apps to get outside of the firewall, and they are moving further and further away from traditional “back office” IT.

Due to the rise of the Internet, the client/server model of the 1980s and 1990s that kept everything within the enterprise is no more. That model has been subsumed by a model in which development is fast and iterative and information is constantly being pushed and pulled primarily from outside organizations. The current model is also increasingly mobile, allowing users to get the information they need anytime and anywhere from any device.

At the same time, there is a push from business and management for increasingly rapid turnaround times and smaller scale projects that are, more often than not, being sourced via Cloud services. The focus of these projects is on innovating business models and acting in areas where the competition does not act. These forces are causing polarization within IT departments between internal IT operations based on legacy systems and new external operations serving buyers in business functions that are sourcing their own services through Cloud-based apps.

Just as UNIX® provided a standard platform for applications on single computers and the combination of servers, PCs and the Internet provided a second platform for web apps and services, we now need a new platform to support the apps and services that use cloud, social, mobile, big data and the Internet of Things. Rather than merely aligning with business goals or enabling business, the next platform will be embedded within the business as an integral element bringing together users, activity and data. To work properly, this must be a standard platform so that these things can work together effectively and at low cost, providing vendors a worthwhile market for their products.

Industry pundits have already begun to talk about this layer of technology. Gartner calls it the “Nexus of Forces.” IDC calls it the “third platform.” At The Open Group, we refer to it as Open Platform 3.0™, and earlier this year we announced a new Forum to address how organizations can adopt and support these technologies. Open Platform 3.0 is meant to enable organizations (including standards bodies, users and vendors) to coordinate their approaches to the new business models and IT practices driving the new platform, in order to support a new generation of interoperable business solutions.

As is always the case with technologies, a point is reached where technical innovation must transition to business benefit. Open Platform 3.0 is, in essence, the next evolution of computing. To help the industry sort through these changes and create vendor-neutral standards that foster the cohesive adoption of new technologies, The Open Group must also evolve its focus and standards to respond to where the industry is headed.

The work of the Open Platform 3.0 Forum has already begun. Initial actions for the Forum have been identified and were shared during the London conference. Our recent survey on Convergent Technologies confirmed the need to address these issues. Of those surveyed, 95 percent of respondents felt that converged technologies were an opportunity for business, and 84 percent of solution providers are already dealing with two or more of these technologies in combination. Respondents also saw vendor lock-in as a potential hindrance to using these technologies, underscoring the need for an industry standard that will address interoperability. In addition to the survey, the Forum has also produced an initial Business Scenario to begin to address these industry needs and formulate requirements for this new platform.

If you have any questions about Open Platform 3.0 or if you would like to join the new Forum, please contact Chris Harding (c.harding@opengroup.org) for queries regarding the Forum or Chris Parnell (c.parnell@opengroup.org) for queries regarding membership.

 

Dave is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, Dave leads the delivery of The Open Group’s proven collaboration processes for collaboration and certification both within the organization and in support of third-party consortia. Dave holds a degree in Electrical Engineering from Worcester Polytechnic Institute, and is holder of three U.S. patents.

 

 

1 Comment

Filed under Cloud, Data management, Future Technologies, Open Platform 3.0, Standards, Uncategorized, UNIX

Secure Integration of Convergent Technologies – a Challenge for Open Platform 3.0™

By Dr. Chris Harding, The Open Group

The results of The Open Group Convergent Technologies survey point to secure integration of the technologies as a major challenge for Open Platform 3.0. This and other input forms the basis for the definition of the platform, which was discussed at The Open Group conference in London.

Survey Highlights

Here are some of the highlights from The Open Group Convergent Technologies survey.

  • 95% of respondents felt that the convergence of technologies such as social media, mobility, cloud, big data, and the Internet of Things represents an opportunity for business.
  • Mobility currently has the greatest take-up of these technologies, and the Internet of Things has the least.
  • 84% of those from companies creating solutions want to deal with two or more of the technologies in combination.
  • Developing potential customers’ understanding of the technologies is the first problem that solution creators must overcome, followed by integrating with products, services and solutions from other suppliers, and using more than one technology in combination.
  • Respondents saw security, vendor lock-in, integration and regulatory compliance as the main problems for users of software that enables these convergent technologies to be used for business purposes.
  • When users are considered separately from other respondents, security and vendor lock-in stand out particularly strongly as issues.

The full survey report is available at: https://www2.opengroup.org/ogsys/catalog/R130

Open Platform 3.0

Analysts forecast that the convergence of technical phenomena including mobility, cloud, social media, and big data will drive growth in the use of information technology through 2020. Open Platform 3.0 is an initiative that will advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to use these technologies.

The survey confirms the value of an open platform to protect users of these technologies from vendor lock-in. It also shows that security is a key concern that must be addressed, that the platform must make the technologies easy to use, and that it must enable them to be used in combination.

Understanding the Requirements

The Open Group is conducting other work to develop an understanding of the requirements of Open Platform 3.0. This includes:

  • The Open Platform 3.0 Business Scenario, which was recently published and is available from https://www2.opengroup.org/ogsys/catalog/R130
  • A set of business use cases, currently in development
  • A high-level round-table meeting to gain the perspective of CIOs, who will be key stakeholders.

These requirements inputs have been part of the discussion at The Open Group Conference, which took place in London this week. Monday’s keynote presentation by Andy Mulholland, former Global CTO at Capgemini, on “Just Exactly What Is Going on in Business and Technology?” included the conclusions from the round-table meeting. This week’s presentation and panel discussion on the requirements for Open Platform 3.0 covered all of the inputs.

Delivering the Platform

Review of the inputs at the conference was followed by a members’ meeting of the Open Platform 3.0 Forum, to start developing the architecture of Open Platform 3.0 and to plan the delivery of the platform definition. The aim is to have a snapshot of the definition early in 2014 and to deliver the first version of the standard a year later.

Meeting the Challenge

Open Platform 3.0 will be crucial to establishing openness and interoperability in the new generation of information technologies. This is of prime importance for everyone in the IT industry.

Following the conference, there will be an opportunity for everyone to contribute material and ideas for the definition of the platform. If you want to be part of the community that shapes the definition, to work on it with like-minded people in other companies, and to gain early insight into what it will be, then your company must join the Open Platform 3.0 Forum. (For more information on this, contact Chris Parnell – c.parnell@opengroup.org)

Providing for secure integration of the convergent technologies, and meeting the other requirements for Open Platform 3.0, will be a difficult but exciting challenge. I’m looking forward to continuing to tackle it with the Forum members.

Dr. Chris Harding

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

1 Comment

Filed under Cloud/SOA, Standards, Service Oriented Architecture, Semantic Interoperability, Data management, Conference, Open Platform 3.0, Future Technologies

Leading Business Disruption Strategy with Enterprise Architecture

By Patty Donovan, The Open Group

On Wednesday, October 2nd, The Open Group and Enterprise Architects will host a tweet jam discussing how organisations can lead business disruption with Enterprise Architecture (EA). Today, businesses are being forced to come to terms with their vulnerabilities and opportunities when it comes to disruptive innovation. Enterprise Architecture, by leveraging its emergent business architecture capabilities alongside its traditional technology and innovation focus, has the opportunity to fill a key void, helping businesses win in this new world.

In the recently published Hype Cycle for Enterprise Architecture, 2013, Gartner places disruptive forces at the center of the emerging EA mandate:

“Enterprise Architecture (EA) is a discipline for proactively and holistically leading enterprise responses to disruptive forces by identifying and analyzing the execution of change toward desired business vision and outcomes.”

“EA practitioners have the opportunity to take a quantum leap toward not only becoming integral to the business, but also leading business change.”

Source: Hype Cycle for Enterprise Architecture 2013, Gartner 2013

Please join us on Wednesday, October 2nd at 12:00 noon BST for our upcoming “Leading Disruption Strategy with EA” tweet jam, where leading experts will discuss this evolving topic.

We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of thought leaders, led by Hugh Evans, CEO of Enterprise Architects (@enterprisearchs). To access the discussion, please follow the #ogChat hashtag during the allotted discussion time.

Planned questions include:

  • Q1 What is #Disruption?
  • Q2 What is #Digitaldisruption?
  • Q3 What are good examples of disruptive #Bizmodels?
  • Q4 What is the role of #EntArch in driving and responding to #disruption?
  • Q5 Why is #EntArch well placed to respond to #Disruption?
  • Q6 Who are the key stakeholders #EntArch needs to engage when developing a #Disruption strategy?
  • Q7 What current gaps in #EntArch must be filled to effectively lead #Disruption strategy?

Additional appropriate hashtags:

  • #EntArch – Enterprise Architecture
  • #BizArch – Business Architecture
  • #Disruption – Disruption
  • #DigitalDisruption – Digital Disruption
  • #Bizmodels – Business Models
  • #ogArch – The Open Group Architecture Forum

And for those of you who are unfamiliar with tweet jams, here is some background information:

What Is a Tweet Jam?

A tweet jam is a one hour “discussion” hosted on Twitter. The purpose of this tweet jam is to share knowledge and answer questions on leading business disruption strategy with enterprise architecture. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat hashtag.
    • Sample: “Big Data presents a large business opportunity, but it is not yet being managed effectively internally – who owns the big data function? #ogchat”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rob Checkal (rob.checkal at hotwirepr.com). We anticipate a lively chat and hope you will be able to join!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

2 Comments

Filed under Business Architecture, Enterprise Architecture, Future Technologies, Open Platform 3.0, Platform 3.0, Tweet Jam

IT Technology Trends – a Risky Business?

By Patty Donovan, The Open Group

On Wednesday, September 25, The Open Group will host a tweet jam looking at a multitude of emerging and converging technology trends and the risks they present to organizations that have already adopted them or are looking to adopt them. Most of the technology concepts we’re talking about – Cloud, Big Data, BYOD/BYOS, the Internet of Things, etc. – are not new, but organizations are at differing stages of implementation and do not yet fully understand the longer-term impact of adoption.

This tweet jam will allow us to explore some of these technologies in more detail and look at how organizations may better prepare against potential risks – whether with regard to security, access management, policies, privacy or ROI. As discussed in our previous Open Platform 3.0™ tweet jam, new technology trends present many opportunities but can also present business challenges if not managed effectively.

Please join us on Wednesday, September 25 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST for a tweet jam that will discuss and debate the issues around technology risks. A number of key areas will be addressed during the discussion, including Big Data, Cloud, Consumerization of IT, the Internet of Things, and mobile and social computing, with a focus on understanding the key risk priority areas organizations face and ways to mitigate them.

We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of thought leaders, led by David Lounsbury, CTO, and Jim Hietala, VP of Security, of The Open Group. To access the discussion, please follow the #ogChat hashtag during the allotted discussion time.

Planned questions include:

  • Do you feel prepared for the emergence/convergence of IT trends – Cloud, Big Data, BYOD/BYOS, the Internet of Things?
  • Where do you see risks in these technologies – Cloud, Big Data, BYOD/BYOS, the Internet of Things?
  • How does your organization monitor for, measure and manage risks from these technologies?
  • Which policies are best at dealing with security risks from technologies? Which are less effective?
  • Many new technologies move data out of the enterprise to user devices or cloud services. Can we manage these new risks? How?
  • What role do standards, best practices and regulations play in keeping up with risks from these & future technologies?
  • Aside from risks caused by individual trends, what is the impact of multiple technology trends converging (Platform 3.0)?

And for those of you who are unfamiliar with tweet jams, here is some background information:

What Is a Tweet Jam?

A tweet jam is a one hour “discussion” hosted on Twitter. The purpose of this tweet jam is to share knowledge and answer questions on emerging/converging technology trends and the risks they present. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat hashtag.
    • Sample: “Big Data presents a large business opportunity, but it is not yet being managed effectively internally – who owns the big data function? #ogchat”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rob Checkal (rob.checkal at hotwirepr.com). We anticipate a lively chat and hope you will be able to join!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

1 Comment

Filed under Cloud, Cloud/SOA, Data management, Future Technologies, Open Platform 3.0, Platform 3.0, Tweet Jam