
The Open Group Philadelphia – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

Day 2 at The Open Group conference in the City of Brotherly Love, as Philadelphia is also known, was another busy and remarkable day.

The plenary started with a fascinating presentation, “Managing the Health of the Nation,” by David Nash, MD, MBA, Dean of the Jefferson School of Population Health. Healthcare is the number one industry in Philadelphia, which has the highest number of patients in hospital beds among the top 10 US cities. The key theme of his thought-provoking speech was “boundaryless information sharing” (sound familiar?), which will enable a healthcare system that is “safe, effective, patient-centered, timely, equitable, efficient.”

Following Dr. Nash’s presentation was the Healthcare Transformation Panel, moderated by Allen Brown, CEO of The Open Group. Participants were Gina Uppal (Fulbright-Killam Fellow, American University Program), Mike Lambert (Open Group Fellow, Architecting the Enterprise), Rosemary Kennedy (Associate Professor, Thomas Jefferson University), Blaine Warkentine, MD, MPH, and Fran Charney (Pennsylvania Patient Safety Authority). The group brought different sets of experiences within the healthcare system and provided reactions to Dr. Nash’s speech. All agreed on the need for fundamental change and that technology will be key.

The conference featured a spotlight on The Open Group’s newest forum, Open Platform 3.0™, presented by Dr. Chris Harding, Director of Interoperability. Open Platform 3.0 was formed to advance The Open Group vision of Boundaryless Information Flow™ to help enterprises in the use of Cloud, Social, Mobile Computing and Big Data. For more information, visit http://www.opengroup.org/getinvolved/forums/platform3.0

The Open Group flourishes because of the interaction and collaboration of its people. The accolades continued with several members being recognized for their outstanding contributions to The Open Group Trusted Technology Forum (OTTF) and the Service-Oriented Architecture (SOA) and Cloud Computing Work Groups. To learn more about our Forums and Work Groups and how to get involved, please visit http://www.opengroup.org/getinvolved

Presentations and workshops were also held in the Healthcare, Finance and Government vertical industries. Presenters included Larry Schmidt (Chief Technologist, HP), Rajamanicka Ponmudi (IT Architect, IBM) and Robert Weisman (CEO, Build the Vision, Inc.).



The Open Group Conference to Emphasize Healthcare as Key Sector for Ecosystem-Wide Interactions

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.


I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sectors.

We’re here now with a panel of experts to explore how new IT trends are empowering improvements, specifically in the area of healthcare. We’ll learn how healthcare industry organizations are seeking large-scale transformation and explore some of the paths they’re taking to realize it.

We’ll see how improved cross-organizational collaboration and such trends as big data and cloud computing are helping to make healthcare more responsive and efficient.

With that, please join me in welcoming our panel, Jason Uppal, Chief Architect and Acting CEO at clinicalMessage. Welcome, Jason.

Jason Uppal: Thank you, Dana.


Gardner: And we’re also joined by Larry Schmidt, Chief Technologist at HP for the Health and Life Sciences Industries. Welcome, Larry.

Larry Schmidt: Thank you.

Gardner: And also, Jim Hietala, Vice President of Security at The Open Group. Welcome back, Jim. [Disclosure: The Open Group and HP are sponsors of BriefingsDirect podcasts.]

Jim Hietala: Thanks, Dana. Good to be with you.

Gardner: Let’s take a look at this very interesting and dynamic healthcare sector, Jim. What, in particular, is so special about healthcare and why do things like enterprise architecture and allowing for better interoperability and communication across organizational boundaries seem to be so relevant here?

Hietala: There’s general acknowledgement in the industry that, inside of healthcare and inside the healthcare ecosystem, information either doesn’t flow well or it only flows at a great cost in terms of custom integration projects and things like that.

Fertile ground

From The Open Group’s perspective, it seems that the healthcare industry and the ecosystem really is fertile ground for bringing to bear some of the enterprise architecture concepts that we work with at The Open Group in order to improve, not only how information flows, but ultimately, how patient care occurs.

Gardner: Larry Schmidt, similar question to you. What are some of the unique challenges that are facing the healthcare community as they try to improve on responsiveness, efficiency, and greater capabilities?

Schmidt: There are several things that have not really kept up with what technology is able to do today.

For example, the whole concept of personal observation comes into play in what we would call the “value chains” that exist right now between a patient and a doctor. We look at things like mobile technologies and want to be able to leverage them to provide additional observation of an individual, so that the doctor can make a more complete diagnosis of some sickness, or possibly evaluate some medication that a person is on.

We want to be able to see that observation in real life, as opposed to having to take that in at the office, which typically winds up happening. I don’t know about everybody else, but every time I go see my doctor, oftentimes I get what’s called white coat syndrome. My blood pressure will go up. But that’s not giving the doctor an accurate reading from the standpoint of providing great observations.

Technology has advanced to the point where we can do that in real time using mobile and other technologies, yet that communication flow, that information flow, doesn’t exist today or, at best, isn’t easily shared between doctor and patient.


If you look at the ecosystem, as Jim offered, there are plenty of places that additional collaboration and communication can improve the whole healthcare delivery model.

That’s what we’re about. We want to be able to find the places where the technology has advanced, where standards don’t exist today, and just fuel the idea of building common communication methods between those stakeholders and entities, allowing us to then further the flow of good information across the healthcare delivery model.

Gardner: Jason Uppal, let’s think about what architecture and methodologies, in addition to technology, can bring to bear here. Is there also a lag in terms of process thinking in healthcare, as well as perhaps technology adoption?

Uppal: I’m going to refer to a presentation that I watched by a very well-known surgeon from Harvard, Dr. Atul Gawande. His point was that, in the last 50 years, the medical industry has made great strides in identifying diseases, drugs, procedures, and therapies, but one thing he was alluding to was that medicine forgot about cost.

At what price?

Today, in his view, we can cure a lot of diseases and lot of issues, but at what price? Can anybody actually afford it?


His view is that if healthcare is going to change and improve, it has to come from outside of the medical industry. The tools that we have today are better, like the collaborative tools that are available for us to use, and those are the ones he was recommending we explore further.

That is where enterprise architecture is a powerful methodology to use and say, “Let’s take a look at it from a holistic point of view of all the stakeholders. See what their information needs are. Get that information to them in real time and let them make the right decisions.”

Therefore, there is no reason for the health information to be stuck in organizations. It could go with where the patient and providers are, and let them make the best decision, based on the best practices that are available to them, as opposed to having siloed information.

So enterprise-architecture methods are most suited for developing a very collaborative environment. Dr. Gawande was pointing out that, if healthcare is going to improve, it has to think about it not as medicine, but as healthcare delivery.


Gardner: And it seems that not only are there challenges in terms of technology adoption and even operating more like an efficient business in some ways. We also have very different climates from country to country, jurisdiction to jurisdiction. There are regulations, compliance, and so forth.

Going back to you, Larry, how important of an issue is that? How complex does it get because we have such different approaches to healthcare and insurance from country to country?

Schmidt: There are definitely complexities that occur based on the different insurance models and how healthcare is delivered across and between countries, but some of the basic and fundamental activities in the past that happened as a result of delivering healthcare are consistent across countries.

As Jason has offered, enterprise architecture can provide us the means to explore what the art of the possible might be today. It could allow us the opportunity to see how innovation can occur if we enable better communication flow among the stakeholders that exist within any healthcare delivery model, in order to give us the opportunity to improve the overall health of the population.

After all, that’s what this is all about. We want to be able to enable a collaborative model throughout the stakeholders to improve the overall health of the population. I think that’s pretty consistent across any country that we might work in.

Ongoing work

Gardner: Jim Hietala, maybe you could help us better understand what’s going on within The Open Group and, even more specifically, at the conference in Philadelphia. There is the Population Health Working Group and there is work towards a vision of enabling the boundaryless information flow between the stakeholders. Any other information and detail you could offer would be great. [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Hietala: On Tuesday of the conference, we have a healthcare focus day. The keynote that morning will be given by Dr. David Nash, Dean of the Jefferson School of Population Health. He’ll give what’s sure to be a pretty interesting presentation, followed by a reactors’ panel, where we’ve invited folks from different stakeholder constituencies.


We are going to have clinicians there. We’re going to have some IT folks and some actual patients to give their reaction to Dr. Nash’s presentation. We think that will be an interesting and entertaining panel discussion.

For the balance of the day, in terms of the healthcare content, we have a workshop. Larry Schmidt is giving one of the presentations there, and Jason, I, and some other folks from our working group are involved in helping to facilitate and carry out the workshop.

The goal of it is to look into healthcare challenges, desired outcomes, the extended healthcare enterprise, and the extended healthcare IT enterprise; to gather the pain points that are out there around things like interoperability; to surface those; and to develop a work program coming out of this.


So we expect it to be an interesting day. If you’re in the healthcare IT field, or just the healthcare field generally, it would definitely be a day well spent to check it out.

Gardner: Larry, you’re going to be talking on Tuesday. Without giving too much away, maybe you can help us understand the emphasis that you’re taking, the area that you’re going to be exploring.

Schmidt: I’ve titled the presentation “Remixing Healthcare through Enterprise Architecture.” Jason offered some thoughts as to why we want to leverage enterprise architecture to bring discipline to healthcare. My thoughts are that we want to be able to make sure we understand how the collaborative model would work in healthcare, taking into consideration all the constituents and stakeholders that exist within the complete ecosystem of healthcare.

This is not just collaboration across the doctors, patients, and maybe the payers in a healthcare delivery model. This could be out as far as the drug companies and being able to get drug companies to a point where they can reorder their raw materials to produce new drugs in the case of an epidemic that might be occurring.

Real-time model

It would be a real-time model that allows us the opportunity to understand what’s truly happening, both to an individual from a healthcare standpoint, as well as to a country or a region within a country, and so on. This remixing of healthcare through enterprise architecture is the introduction to that concept of leveraging enterprise architecture in this collaborative model.

Then, I would like to talk about some of the technologies that I’ve had the opportunity to explore around what is available today in technology. I believe we need to have some type of standardized messaging or collaboration models to allow us to further facilitate the ability of that technology to provide the value of healthcare delivery or betterment of healthcare to individuals. I’ll talk about that a little bit within my presentation and give some good examples.

It’s really interesting. I just traveled from my company’s home base back to my home base and I thought about something like a body scanner that you get into in the airport. I know we’re in the process of eliminating some of those scanners now within the security model from the airports, but could that possibly be something that becomes an element within healthcare delivery? Every time your body is scanned, there’s a possibility you can gather information about that, and allow that to become a part of your electronic medical record.


Hopefully, that was forward thinking, but that kind of thinking is going to play into the art of the possible, with what we are going to be doing, both in this presentation and talking about that as part of the workshop.

Gardner: Larry, we’ve been having some other discussions with The Open Group around what they call Open Platform 3.0™, which is the confluence of big data, mobile, cloud computing, and social.

One of the big issues today is this avalanche of data, the Internet of things, but also the Internet of people. It seems that the more work that’s done to bring Open Platform 3.0 benefits to bear on business decisions, the more impactful it could be for sensors and other data that flow from patients, regardless of where they are, to a medical establishment, regardless of where it is.

So do you think we’re really on the cusp of a significant shift in how medicine is actually conducted?

Schmidt: I absolutely believe that. There is a lot of information available today that could be used in helping our population to be healthier. And it really isn’t only the challenge of the communication model that we’ve been speaking about so far. It’s also understanding the information that’s available to us to take that and make that into knowledge to be applied in order to help improve the health of the population.

As we explore this from an as-is model in enterprise architecture to something that we believe we can first enable through a great collaboration model, through standardized messaging and things like that, I believe we’re going to get into even deeper detail around how information can truly provide empowered decisions to physicians and individuals around their healthcare.

So it will carry forward into the big data and analytics challenges that we have talked about and currently are talking about with The Open Group.

Healthcare framework

Gardner: Jason Uppal, we’ve also seen how in other business sectors, industries have faced transformation and have needed to rely on something like enterprise architecture and a framework like TOGAF® in order to manage that process and make it something that’s standardized, understood, and repeatable.

It seems to me that healthcare can certainly use that, given the pace of change, but that the impact on healthcare could be quite a bit larger in terms of actual dollars. This is such a large part of the economy that even small incremental improvements can have dramatic effects when it comes to dollars and cents.

So is there a benefit to bringing enterprise architecture to healthcare that is larger and greater than in other sectors, because of these economics and issues of scale?

Uppal: That’s a great way to think about this thing. In other industries, the benefit of applying enterprise architecture to banking and insurance may be easily measured in terms of dollars and cents, but healthcare is a fundamentally different economy and industry.

It’s not about dollars and cents. It’s about people’s lives, and loved ones who are sick, who could very easily be treated, if they’re caught in time and the right people are around the table at the right time. So this is more about human cost than dollars and cents. Dollars and cents are critical, but human cost is the larger play here.


Secondly, when we think about applying enterprise architecture to healthcare, we’re not talking about just the U.S. population. We’re talking about global population here. So whatever systems and methods are developed, they have to work for everybody in the world. If the U.S. economy can afford an expensive healthcare delivery, what about the countries that don’t have the same kind of resources? Whatever methods and delivery mechanisms you develop have to work for everybody globally.

That’s one of the things that a methodology like TOGAF brings out: look at it from every stakeholder’s point of view, and unless you have dealt with every stakeholder’s concerns, you don’t have an architecture; you have a system designed for one specific audience.

The cost is not the 18 percent of the gross domestic product that healthcare represents in the U.S. It’s the human cost, which is many multiples of that. That is one of the areas where we could really start to think about how we affect that part of the economy, not the 18 percent of it, but the larger part of the economy, to improve the health of the population, not only in North America, but globally.

If that’s the case, then the real impact on our greater world economy will come from improving population health, and population health is probably becoming the biggest problem in our economy.

We’ll be testing these methods at a greater, international level, as opposed to just at an organization and industry level. This is a much larger challenge. A methodology like TOGAF is proven, and it could be stressed and tested to that level. This is a great opportunity for us to apply our tools and science to a problem that is larger than just dollars. It’s about humans.

All “experts”

Gardner: Jim Hietala, in some ways, we’re all experts on healthcare. When we’re sick, we go for help and interact with a variety of different services to maintain our health and to improve our lifestyle. But in being experts, I guess that also means we are witnesses to some of the downside of an unconnected ecosystem of healthcare providers and payers.

One of the things I’ve noticed in that vein is that I have to deal with different organizations that don’t seem to communicate well. If there’s no central process organizer, it’s really up to me as the patient to pull the lines together between the different services — tests, clinical observations, diagnosis, back for results from tests, sharing the information, and so forth.

Have you done any studies, or do you have anecdotal information, about how that boundaryless information flow would be relevant here, even with more of a centralized repository that all the players could draw on, sort of a collaborative team resource? I know that’s worked in other industries. Isn’t this a perfect opportunity for that boundarylessness to be managed?

Hietala: I would say it is. We all have experiences with going to see a primary physician, maybe getting sent to a specialist, getting some tests done, and the information that flows tends to be on paper, delivered by us as patients, in all of those cases.

So the opportunity to improve that situation is pretty obvious to anybody who’s been in the healthcare system as a patient. I think it’s a great place to be doing work. There’s a lot of money flowing to try and address this problem, at least here in the U.S. with the HITECH Act and some of the government spending around trying to improve healthcare.


You’ve got healthcare information exchanges that are starting to develop, and you’ve got lots of pain points for organizations in terms of trying to share information without standards that enable them to do it. It seems like a really great opportunity area for bringing lots of improvement.

Gardner: Let’s look for some examples of where this has been attempted and what the success brings about. I’ll throw this out to anyone on the panel. Do you have any examples that you can point to, either named organizations or anecdotal use-case scenarios, of a better-organized, architectural approach, leveraging IT efficiently and effectively, allowing data to flow, and putting in processes that are repeatable, centralized, organized, and understood? How does that work out?

Uppal: I’ll give you an example. One of the things that happens when a patient is admitted to a hospital is that they get what’s called high-voltage care. There is staff around them 24×7. There are lots of people around, and every specialty that you can think of is available to them. So the patient, in about two or three days, starts to feel much better.

When that patient gets discharged, most of the time they get discharged to home. They go from very high-voltage care to next to no care. This is one of the areas where one of the organizations we work with is making a difference: instead of discharging the patient to the primary care doc, who may not receive any records from the hospital for several days, they discharge them into a virtual team. So if the patient is at home, the virtual team is available to them through their mobile phone 24×7.

Connect with provider

If, at 3 o’clock in the morning, the patient doesn’t feel right, instead of having to call an ambulance to go to hospital once again and get readmitted, they have a chance to connect with their care provider at that time and say, “This is what the issue is. What do you want me to do next? Is this normal for the medication that I am on, or this is something abnormal that is happening?”

When that information is available to that care provider who may not necessarily have been part of the care team when the patient was in the hospital, that quick readily available information is key for keeping that person at home, as opposed to being readmitted to the hospital.

We all know that the cost of being in a hospital is 10 times more than it is being at home. But there’s also inconvenience and human suffering associated with being in a hospital, as opposed to being at home.

Those are some of the examples that we have, but they are very limited, because our current health ecosystem is very organization-specific, not patient- and provider-specific. This is an area where there is huge room for opportunity in healthcare delivery: thinking about health information not in the context of the organization where the patient is, but in a cloud, where it’s an association among the patient, the provider, and the health information that’s there.


In the past, we used to have email that was within our four walls. All of a sudden, with Gmail and Yahoo Mail, we have email available to us anywhere. A similar thing could happen for the healthcare record. It could live somewhere in a cloud setting, where it’s securely protected and used only by people who have been granted access to it.

Those are some of the examples where extending that model will bring infinite value to not only reducing the cost, but improving the cost and quality of care.

Schmidt: Jason touched upon the home healthcare scenario and being able to provide touch points at home. Another place that we see evolving right now in the industry is the whole concept of mobile office space. Both developing countries and rural areas within developed countries are actually getting rural hospitals and rural healthcare offices dropped in by helicopter, to allow the people who live in those communities the opportunity to talk to a doctor via satellite technologies and so on.

The whole concept of an architecture around, and being able to deal with, an extension of what ends up being telemedicine is something that we’re seeing today. It would be wonderful if we could point to standards that allow us to facilitate both the communication protocols and the information flows in that type of setting.

Many corporations can jump on the bandwagon to help the rural communities get the healthcare information and capabilities that they need via the whole concept of telemedicine.

That’s another area where enterprise architecture comes into play. Now that we see examples of that working in the industry today, I’m hoping that, as part of this working group, we’ll get to the point where we’re able to facilitate it much better, enabling innovation to occur for multiple companies via some of the architecture work we are planning to produce.

Single view

Gardner: It seems that, in many industries, we’ve come a long way on the business side toward getting a single view of the customer, as it’s called, through customer relationship management, big data, and spreading the analysis around among different data sources and types. This sounds like a perfect fit for a single view of the patient across their life and across their care spectrum, and of course it involves many different types of organizations. But the government also needs to have a role here.

Jim Hietala, at The Open Group Conference in Philadelphia, you’re focusing on not only healthcare, but finance and government. Regarding the government and some of the agencies that you all have as members on some of your panels, how well do they perceive this need for enterprise architecture level abilities to be brought to this healthcare issue?

Hietala: We’ve seen encouraging signs from folks in government about bringing this work to the forefront. There is a recognition that better data needs to flow throughout the extended healthcare IT ecosystem, and I think generally they are supportive of initiatives like this to make that happen.

Gardner: Of course, having conferences like this, where there is cross-pollination between vertical industries, will perhaps allow some of the technical people to talk with some of the government people, and also to have a conversation with some of the healthcare people. That’s where some of these ideas and some of the collaboration could also be very powerful.


I’m afraid we’re almost out of time. We’ve been talking about an interesting healthcare transition, moving into a new phase or even era of healthcare.

Our panel of experts has been looking at some of the trends in IT and how they are empowering improvements in how healthcare can be more responsive and efficient. And we’ve seen how healthcare industry organizations can undertake large-scale transformation using cross-organizational collaboration, for example, and other such tools as big data, analytics, and cloud computing to help solve some of these issues.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference this July in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL, and you will hear more about healthcare, Open Platform 3.0, and enterprise transformation in the finance, government, and healthcare sectors.

With that, I’d like to thank our panel. We’ve been joined today by Jason Uppal, Chief Architect and Acting CEO at clinicalMessage. Thank you so much, Jason.

Uppal: Thank you, Dana.

Gardner: And also Larry Schmidt, Chief Technologist at HP for the Health and Life Sciences Industries. Thanks, Larry.

Schmidt: You bet, appreciate the time to share my thoughts. Thank you.

Gardner: And then also Jim Hietala, Vice President of Security at The Open Group. Thanks so much.

Hietala: Thank you, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these thought leader interviews. Thanks again for listening and come back next time.



As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we’re here now to learn more about how to leverage Platform 3.0 as more than an IT shift — and as a business game-changer. It will be a big topic at next week’s conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are lot of IT changes or technical changes going on that are bringing together a lot of factors. They’re turning into this sort of super-saturated solution of ideas and possibilities and this emerging idea that this represents a new platform. I think it’s a pretty fundamental change.

Lounsbury

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and presenting the information in a way such that people can make decisions on it. Given all that we’re starting to realize, we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing, trying to bring Platform 3.0 together, is to try to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt and evolve what business processes they need to put in place in order to take maximum advantage of this to see change in the way that we look at the information.

Harding: Enterprises have to keep up with the way that things are moving in order to keep their positions in their industries. Enterprises can’t afford to be working with yesterday’s technology. It’s a case of being able to understand the information that they’re presented, and make the best decisions.

Harding

We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.
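Harding’s modern input-process-output loop can be sketched in a few lines. This is a minimal illustration, not any real product API: the sensor feed, the cloud-side function, and the mobile payload are all hypothetical names invented here.

```python
# Sketch of Harding's input-process-output loop: input from sensors and social
# feeds, processing "in the cloud" (here just a plain function), and output
# shaped for a mobile device. All names and values are illustrative.
from statistics import mean

def collect_inputs():
    """Stand-in for the sensor and social-media input streams."""
    return {"temperature_c": [21.0, 21.4, 22.1], "mentions": 42}

def process_in_cloud(inputs):
    """Stand-in for cloud-side processing: reduce raw data to a decision aid."""
    return {
        "avg_temperature_c": round(mean(inputs["temperature_c"]), 1),
        "buzz": "high" if inputs["mentions"] > 20 else "low",
    }

def push_to_mobile(summary):
    """Stand-in for output: a short, decision-ready message for a mobile device."""
    return f"Avg temp {summary['avg_temperature_c']} C, social buzz {summary['buzz']}"

print(push_to_mobile(process_in_cloud(collect_inputs())))
```

The point is the shape of the pipeline, not the functions themselves: the same three stages held for the teletype era, only the endpoints have changed.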

Gardner: Mark Skilton, the ability to manage data at greater speed and scale, the whole three Vs — velocity, volume, and value — on its own could perhaps be a game changing shift in the market. The drive of mobile devices into lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

Skilton

It has to do with not just one solution or one subscription model, because we’re now into this subscription-model era, the subscription economy, as one group tends to describe it. Now, we’re looking at not just providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 addresses this point by bringing this together. Just look at the numbers. Look at the scale that we’re dealing with: 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion subscriptions according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We have had massive growth in the scale of mobile data traffic and internet data expansion. Mobile data traffic is predicted to increase 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data center traffic, combining network and internet-based storage, will reach 6.6 zettabytes annually, and nearly two thirds of this will be cloud-based by 2016. This is only going to grow as social networking reaches nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a 2.55 billion global audience, by 2017, another extraordinary figure from an eMarketing.com study.

It is not surprising that many industry analysts are seeing growth in the converging technologies of mobility, social computing, big data and cloud at 30 to 40 percent, and the shift to B2C commerce, passing $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases that have emerged in the open-source world and things of that nature, but there are also the analytics and the data stewardship aspects of it.

We need to bring in mechanisms so the data is valid and kept up to date, and we need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of enterprise IT or where end users drive the selection, to provide the analytic and recommendation tools that take the data and turn it into information. One of the things you can’t do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally — how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.
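Lounsbury’s point about validity and freshness can be made concrete with a small sketch: stamp each record with its collection time and surface only records fresh enough to act on. The record fields and the 15-minute threshold are assumptions chosen for illustration, not anything from the discussion.

```python
# Sketch: attach a freshness stamp to each data record and filter out stale
# ones before they reach a decision maker. The field names and the 15-minute
# freshness limit are illustrative assumptions.
from datetime import datetime, timedelta, timezone

FRESHNESS_LIMIT = timedelta(minutes=15)

def is_fresh(record, now=None):
    """A record is actionable only if collected within the freshness limit."""
    now = now or datetime.now(timezone.utc)
    return now - record["collected_at"] <= FRESHNESS_LIMIT

now = datetime.now(timezone.utc)
records = [
    {"value": 98.6, "collected_at": now - timedelta(minutes=2)},   # fresh
    {"value": 97.1, "collected_at": now - timedelta(hours=3)},     # stale
]
actionable = [r for r in records if is_fresh(r, now)]
print(len(actionable))  # only the two-minute-old record survives
```

In a real system the threshold would vary by data type and decision, but the mechanic of carrying freshness metadata alongside the value is the same.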

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points about how the shape of data has changed, and also about why IT should get involved. We’re seeing that there’s a shift in the constituency of who is using this data.

We have the Chief Marketing Officer and the Chief Procurement Officer and other key line of business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We’ve got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be sorted into new types of mobile devices, new types of data intelligence, and ways of delivering this kind of service.

I read an article recently in MIT Sloan Management Review that asked what the role of the CIO is. There is still the critical role of managing the security, compliance, and performance of these systems. But there’s also a socialization of IT, and this is where positioning architectures that are cross-platform is key to delivering real value to the business users in the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it’s not only that a whole set of different people are getting involved with different kinds of information, but there’s also a step change in the speed with which all this is delivered. It’s no longer the case that you can say, “Oh well, we need some kind of information system to manage this information. We’ll procure it and get a program written,” and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It’s really a case of having the standard technology platform, and also the systems for using it, the business processes, understood and in place.

Then, you can do all these things quickly and build on learning from what people have done in the past, and not go off into all sorts of new experimental things that might not lead anywhere. It’s a case of building up the standard platform on industry best practice. This is where The Open Group can really help things along by being a recipient and a reflector of best practice and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. It’s the platform scalability of understanding what you can do with your current legacy systems in introducing cloud computing or big data, and the infrastructure that gives you this, what we call multiplexing of resources. We’re very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

Going into networking and network scalability, a lot of the customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it’s all about connectivity in the field. I meet a number of clients who are saying, “We’ve got this cloud service,” or “This service is in a certain area of my country. If I move to another part of the country or I’m traveling, I can’t get connectivity.” That’s the big issue of scaling.

Another one is application programming interfaces (APIs). What we’re seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities of what Chris Anderson of Wired used to call the “long tail effect.” It is now a reality in terms of building that kind of social connectivity and data exchange that Dave was talking about.
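The API-driven “long tail” Skilton describes rests on a simple mechanic: independent services expose data, and anyone can combine them into a new service with no coordination between the providers. The sketch below uses in-process stubs in place of real HTTP endpoints; the service names and payloads are invented for illustration.

```python
# Sketch: API-style composition. Two stand-in "services" return JSON, and a
# third combines them into a new offering, the basic mechanic behind API
# mashups. In practice these would be HTTP endpoints; here they are stubs.
import json

def weather_service(city):
    """Hypothetical weather API response."""
    return json.dumps({"city": city, "temp_c": 19})

def events_service(city):
    """Hypothetical local-events API response."""
    return json.dumps({"city": city, "events": ["street fair", "concert"]})

def mashup(city):
    """Combine two independent services into a third, with no coordination
    between their providers beyond the published data format."""
    weather = json.loads(weather_service(city))
    events = json.loads(events_service(city))
    return {"city": city, "temp_c": weather["temp_c"], "events": events["events"]}

print(mashup("Philadelphia"))
```

Because the only contract is the data format, thousands of such niche combinations can exist at once, which is exactly the long-tail effect.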

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are where they need to start thinking in their IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formational stages of the “third platform,” or Platform 3.0, for The Open Group and as an industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 Forum. We’re leveraging a lot of the components that have been done previously by the work of the members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of Things.

First step

Our first step is to bring those things together to make sure that we’ve got a foundation to depart from. The next thing is that, through our Platform 3.0 Forum and its Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from things like the technological aspects of it and what standards are needed, but also to take a clue from our previous cloud working group. What are the best business practices in order to understand and then adopt some of these Platform 3.0 concepts to get your business using them?

What we’re really working toward in Philadelphia is to set up an exchange of ideas among the people who can, from the buy side, bring in their use cases from the supply side, bring in their ideas about what the technology possibilities are, and bring those together and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We’ve heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, when we think about the ability of the data to be so powerful when processed properly, when recommendations can be delivered to the right place at the right time, but we also recognize that there are limits to a manual or even human level approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, it seems there are already some early examples where bringing cloud, data, social, mobile, and granular interactions together has shown how a recommendation engine can be brought to bear. I’m thinking about the Siri capability from Apple and even some of the examples of the Watson technology at IBM.

So to our panel, are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a supercomputer or data center of supercomputers, brought to bear on almost any problem instantly, and then the result delivered directly to a center, a smartphone, any number of endpoints?

It seems that the potential here is mind boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet.  The advent of IPv6 and the explosion in multimedia services, will start to drive the next generation of the Internet.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, insight into what’s happening, and the predictive nature of these services, it becomes something much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing, data, traffic, devices and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it is something that’s really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but just for businesses, that alone to me would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to have significant advantage and first mover benefits?

Harding: Businesses always are taking stock. They understand their environments. They understand how the world that they live in is changing and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, “So now this is where we could make a change to our business.” It’s the vision moment where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and I’m very happy to say this, it’s something that computers aren’t going to be able to do for a very long time yet. It’s going to really be down to business people to do this as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling the gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years but in the next couple of technology cycles, that we’ll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data and all of the medical textbooks ever written are data. Pick your industry, and there is a huge knowledge base that humans must currently keep on top of.

This approach and these advances in recommendation engines, driven by the availability of big data, are going to produce profound changes in the way knowledge workers do their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.


Filed under ArchiMate®, Business Architecture, Cloud, Cloud/SOA, Conference, Data management, Enterprise Architecture, Platform 3.0, Professional Development, TOGAF®

Three laws of the next Internet of Things – the new platforming evolution in computing

By Mark Skilton, Global Director at Capgemini

There is a wave of new devices and services growing in strength, extending the boundary of what is possible in today’s internet-driven economy and lifestyle. A striking feature is the link between apps on smartphones and tablets and the ability to connect not just to websites but also to data-collection sensors and intelligent analysis of that information. A key driver of this has been the improvement in the cost-performance curve of information technology, not just in CPU and storage: the easy availability and affordability of highly powerful computing and mass storage in mobile devices, coupled with access to complex sensors, advanced optics and screen displays, results in a potentially truly immersive experience. This is a long way from the early days of radio-frequency identity tags, which are the forerunner of this evolution. Digitization of information and interpretation of its meaning is everywhere, moving into a range of industries and augmented services that create new possibilities and value. A key challenge is how to understand this growth of devices, sensors, content and services across the myriad of platforms and permutations it can bring.

·         Energy conservation

o   Through home and building energy management

·         Lifestyle activity

o   Motion sensors: accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors.

·          Lifestyle health

o   Heart rate, blood oxygen levels, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential measurements that connected devices offer.

·         Medical Health

o   Biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Examples include devices that can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively. These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping improve quality of life and address rising medical costs and the societal impact of an aging population.

·         Transport

o   Precision global positioning, local real-time image perception and interpretation sensing, dynamic electromechanical control systems.

·         Materials science engineering and manufacturing

o   Strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing. 3D printing has the potential to revolutionize automated manufacturing, and through distributed services over the internet, manufacturing can potentially be accessed by anyone.

·         Physical Safety and security

o   Examples include parental-protection applications that control children’s access to their mobile phones via a PC, using web-based applications to monitor and control mobile and computing access, and keyless entry using your phone: Wi-Fi, Bluetooth and internet network apps and devices that automate locking of physical doors and entry remotely or in proximity.

·         Remote activity and swarming robotics

o   The development of autonomous robotics to respond and support exploration and services in harsh or inaccessible environments; disabled support through robotic prosthetics and communication synthesis; and swarming robots that fly or that mimic group behavior, nature and decision making.

These are just the tip of what is possible: the early commercial ventures that are starting to drive new ways to think about information technology and application services.

A key feature I noticed in all these devices is that they augment previous layers of technology by sitting on top of them and adding extra value. While the long shadow of the first-generation giants of the public internet (Apple, Google, Amazon) often gives the impression that success requires a controlled platform and an investment of millions, these new technologies use existing infrastructure and operate across a federated, distributed architecture that represents a new kind of platforming paradigm of multiple systems.

Perhaps a paradigm of new technology cycles is that as the new technology arrives it will cannibalize older technologies. Clearly nothing is immune to this trend, not even the cloud. I’ll call it the evolution of a kind of technology law (a feature I saw in Charles Fine’s book Clockspeed, http://www.businessforum.com/clockspeed.html, but adapted here as a function of compound cannibalization and augmentation). I think Big Data is an example of such a shift, as augmented informatics enables the next generation of power plays for added-value services.

These devices and sensors can work with existing infrastructure services and resources, but they also create a new kind of computing architecture that involves many technologies, standards and systems, what was in earlier times called “system of systems” integration (examples are seen in the defence sector, http://www.bctmod.army.mil/SoSI/sosi.html, and in digital ecosystems in the government sector, http://www.eurativ.com/specialreport-skills/kroes-europe-needs-digital-ecosy-interview-517996).

While a sensor device can replace the existing thermostat in your house, the lighting, or the access locks on your doors, these devices offer a new kind of augmented experience that provides information and insight, enabling better control of the wider environment or of the actions and decisions within a context.

This leads to a second feature of these devices: the ability to learn and adapt from their inputs and environment. This probably has an even larger impact than the first feature, the use of existing infrastructure, in that the ability to change outcomes is a revolution in information. The previous idea of static information, with humans making sense of the data, is being replaced by the active pursuit of automated intelligence from the machines we build. Earlier design paradigms that needed to define declarative services, what IT calls CRUD (Create, Read, Update, Delete), as predefined and managed transactions are being replaced by machine-learning algorithms that seek to build a second generation of intelligent services that alter their results with the passage of time and usage characteristics.
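The contrast can be made concrete in a few lines: a CRUD-style store always returns exactly what was written, while a learning service’s answer drifts as it observes usage. The incremental-mean “model” below is a deliberately tiny stand-in for a real learning algorithm, invented purely for illustration.

```python
# Sketch: a predefined CRUD read versus a service whose answer adapts with
# usage. The "learning" here is just an incremental mean; purely illustrative.
class CrudStore:
    """Predefined, managed transactions: read returns exactly what was stored."""
    def __init__(self):
        self._data = {}
    def create(self, key, value):
        self._data[key] = value
    def read(self, key):
        return self._data[key]

class AdaptiveScore:
    """An answer that changes as observations arrive (incremental mean)."""
    def __init__(self):
        self.n, self.mean = 0, 0.0
    def observe(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n   # update in place, no re-store
    def read(self):
        return self.mean                        # drifts with usage over time

store = CrudStore()
store.create("rating", 4.0)
model = AdaptiveScore()
for x in (4.0, 2.0, 3.0):
    model.observe(x)
print(store.read("rating"), model.read())  # 4.0 stays fixed; 3.0 reflects usage
```

The CRUD read is fully determined at write time; the adaptive read is determined by the history of interactions, which is the shift the paragraph describes.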

This leads me to a third effect, which became apparent in the discussion of lifestyle services versus medical and active device management. In the case of lifestyle devices, a key feature is the ability to blend in with personal activity to enable new insight into behavior and lifestyle choices, to passively and actively monitor or take action, not always to affect the behavior itself; that is, to provide unobtrusive, ubiquitous presence. Taking this idea further, it is also about the way the devices merge in and become integrated within the context of the user or environmental setting. Biomedical devices that augment patient care and wellbeing are one such example that can have a real and substantive impact on quality of life, as well as on the cost efficiency of care programs with an aging population to support.

An interesting side effect of these trends is the cultural dilemma these devices and sensors bring in the intrusion on personal data and privacy. Yet once the meaning and value of this telemetry on safety, health or material factors is perceived as being for the good of the individual and community, the adoption of such services may become more pronounced and reinforced: a virtuous circle of accelerated adoption, seen as a key characteristic of successful growth, and a kind of conditioning feedback that creates positive reinforcement. The key feature underpinning all of this, the ability of the device and sensor to have an unobtrusive, ubiquitous presence, is central to the idea of effective system-of-systems integration and Boundaryless Information Flow™ (The Open Group).

I see these trends as three laws of the next Internet of Things, describing a next-generation platforming strategy and evolution.

It’s clear that sensors and devices are merging together in a way that will see crosscutting from one industry to another. Motion and temperature sensors in one industry will see application in another. Services from one industry may connect with other industries as combinations of these services, lifestyles and effects.


Formal and informal communities, both physical and virtual, will be connected through sensors and devices that pervade the social, technological and commercial environments. This will drive further growth in the mass of data and digitized information, with the gradual semantic representation of this information into meaningful context. App services will develop increasing intelligence and awareness of the multiplicity of data, its content and metadata, adding new insight and services to the infrastructure fabric. This is a new platforming paradigm that may be constructed from one or many systems and architectures, from macro- to micro- and nano-level systems technologies.

The three laws as I describe them may be recast in a lighter, tongue-in-cheek way by comparing them to Isaac Asimov’s famous three laws of robotics. This is just an illustration, but it implies that the sequence of laws is in some fashion protecting the users, resources and environment through some altruistic motive. This may be the case in some system feedback loops that seek this goal, but often commercial microeconomic considerations are more the driver. However, I can’t help thinking that this hints at what may be the first stepping stone to the eventuality of such laws.

Three laws of the next generation of The Internet of Things – a new platforming architecture

Law 1. A device, sensor or service may operate in an environment if it can augment infrastructure.

Law 2. A device, sensor or service must be able to learn and adapt its response to the environment, as long as it is not in conflict with the First Law.

Law 3. A device, sensor or service must have unobtrusive, ubiquitous presence such that it does not conflict with the First or Second Laws.
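In the same tongue-in-cheek spirit, the three laws can be phrased as ordered admission checks for a device: a later law only applies once the earlier ones hold. The predicate names and the `admit` function below are invented for this sketch, not part of any real specification.

```python
# Sketch: the three laws as ordered admission predicates for a device.
# Each law is checked in sequence, so a later law is subject to the earlier
# ones. Field names and the admit() function are invented for illustration.
def admit(device):
    if not device.get("augments_infrastructure"):   # Law 1
        return "rejected: does not augment infrastructure"
    if not device.get("learns_and_adapts"):         # Law 2, subject to Law 1
        return "rejected: cannot learn and adapt"
    if not device.get("unobtrusive"):               # Law 3, subject to 1 and 2
        return "rejected: not unobtrusively present"
    return "admitted"

thermostat = {"augments_infrastructure": True, "learns_and_adapts": True,
              "unobtrusive": True}
print(admit(thermostat))  # admitted
```

The ordering mirrors the text: augmentation of existing infrastructure is the entry condition, and adaptation and unobtrusive presence are layered on top of it.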

References

 ·       Energy conservation

o   The example of Nest, http://www.nest.com, a learning thermostat founded by Tony Fadell, ex-iPod hardware designer and head of the iPod and iPhone division at Apple. The device monitors and learns about energy usage in a building and adapts and controls the use of energy for improved carbon and cost efficiency.

·         Lifestyle activity

o   Motion sensors: accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors. Examples include UP by Jawbone, https://jawbone/up, and Fitbit, http://www.fitbit.com.

·          Lifestyle health

o   Heart rate, blood oxygen levels, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential measurements of connecting devices such as Zensorium, http://www.zensorium.com.

·         Medical Health

o   Biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Examples include devices that can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively, http://www.nytimes.com/2010/07/29/garden/29parents.html?pagewanted-all. These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping improve quality of life and address rising medical costs and the societal impact of an aging population.

·         Transport

o   Precision global positioning, local real time image perception interpretation  sensing, dynamic electromechanical control systems. Examples include Toyota  advanced IT systems that will help drivers avoid road accidents.  Http://www.toyota.com/safety/ Google driverless car  http://www.forbes.com/sites/chenkamul/2013/01/22/fasten-your-seatbelts-googles-driverless-car-is-worth-trillions/

·         Materials science engineering and manufacturing

o   Strain gauges, stress sensors, precision lasers, micro and nanoparticle engineering,  cellular manipulation, gene splicing,
3D printing has the potential to revolutionize automated manufacturing but through distributed services over the internet, manufacturing can potentially be accessed by anyone.

·         Physical Safety and security

o   Alpha Blue http://www.alphablue.co.uk Controlling children’s access to their mobile phone via your pc is an example of parental protection of children using web based applications to monitory and control mobile and computing access.

o   Keyless entry using your  phone.  Wiki, Bluetooth and internet network app and device to automate locking of physical; door and entry remotely or in proximity. Examples such as Lockitron  https://www.lockitron.com.

·         Remote activity and swarming robotics

o   The developing of autonomous robotics to respond and support exploration and services in  harsh or inaccessible environments. Examples include the NASA Mars curiosity rover that has active control programs to determine remote actions on the red planet that has a signal delay time round trip (13 minutes, 48 seconds EDL) approximately 30 minutes to detect perhaps react to an event remotely from Earth.  http://blogs.eas.int/mex/2012/08/05/time-delay-betrween-mars-and-earth/  http://www.nasa.gov/mission_pages/mars/main/imdex.html .  Disabled support through robotic prosthetics and communication synthesis.     http://disabilitynews.com/technology/prosthetic-robotic-arm-can-feel/.  Swarming robotc that fly or mimic group behavior.    University of Pennsylvania, http://www.reuters.com/video/2012/03/20/flying-robot-swarms-the-future-of-search?videoId-232001151 Swarming robots ,   Natural Robotics Lab , The University of Sheffield , UK   http://www.sheffield.ac.uk/news/nr/sheffield-centre-robotic-gross-natural-robotics-lab-1.265434
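As a toy illustration of the "learning" behavior in the Nest example above — this is not Nest's actual algorithm, just an invented sketch — a thermostat can nudge its target toward the user's repeated manual adjustments with an exponential moving average:

```python
class LearningThermostat:
    """Toy sketch of a 'learning' thermostat: it pulls its target toward the
    user's manual setpoint changes with an exponential moving average.
    Illustrative only -- not Nest's actual algorithm."""

    def __init__(self, target=20.0, rate=0.3):
        self.target = target  # learned target temperature, degrees C
        self.rate = rate      # how strongly each override shifts the target

    def user_adjusts(self, setpoint):
        # Each manual override pulls the learned target toward the new value.
        self.target += self.rate * (setpoint - self.target)

    def heating_on(self, room_temp):
        # Heat whenever the room is below the learned target.
        return room_temp < self.target

t = LearningThermostat()
for s in [18.0, 18.0, 18.0]:  # the user repeatedly turns it down in the evening
    t.user_adjusts(s)
print(round(t.target, 2))     # → 18.69: the target has drifted toward 18
print(t.heating_on(17.0))     # → True
```

After enough consistent overrides the learned target converges on the user's preference, which is the essence of the adapt-to-usage behavior described above.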

 Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

1 Comment

Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0

The era of “Internet aware systems and services” – the multiple-data, multi-platform and multi-device and sensors world

By Mark Skilton, Global Director at Capgemini

Communications + Data protocols and the Next Internet of Things Multi-Platform solutions

Much of the discussion on the "Internet of Things" has been around industry-sector examples of device and sensor services; I have listed some of these at the end of this paper. What is central to this emerging trend is not just sector point solutions but three key technical issues driving a new industry-sector digital services strategy that brings these together into a coherent whole.

  1. How combinations of system technology platforms are converging, enabling composite business processes that are mobile, content- and transaction-rich, with near-real-time persistence and interactivity
  2. The development of "non-web-browser" protocols for new sensor-driven machine data, extending new types of data into internet-connected business and social integration
  3. The development of "connected systems" that move solutions into new digital services composed of multiple services across platforms, creating new business and technology services

I want to illustrate this by focusing on three topics:  multi-platforming strategies, communication protocols and examples of connected systems.

I want to show that this is not the simple "three or four step model" I often see, in which mobile + applications + cloud equals a solution but results in silos of data and platform integration challenges. New processing methods for big data platforms, distributed stream computing and in-memory database services, for example, are changing the nature of business analytics, in particular marketing and sales strategic planning and insight. New feedback systems collecting social and machine-learning data are creating new types of business growth opportunities in context-aware services that augment skills and services.

The major solutions in the digital ecosystem today incorporate an ever-growing mix of devices and platforms that offer new user experiences and organization. This can be seen across almost all industry sectors, and horizontally between them. The following diagram is a deliberately simple view I use to illustrate the fundamental structures that are forming.

[Diagram: devices, sensors, communities, infrastructure and application services in the emerging digital ecosystem (Iofthings1.jpg)]

Multiple devices that offer simple-to-complex visualization and on-board application services.

Multiple sensors that can economically detect, measure and monitor most physical phenomena: light, heat, energy, chemical and radiological, in both non-biological and biological systems.

Physical and virtual communities of formal and informal relationships: human- and/or machine-based associations, in the sense that search and discovery of data and resources can now work autonomously across an internet of many different types of data.

Physical and virtual infrastructure that represents servers, storage, databases, networks and other resources constituting one or more platforms and environments. This infrastructure is now more complex, in that it is both distributed and federated across multiple domains: mobile platforms, cloud computing platforms, social network platforms, big data platforms and embedded sensor platforms. The sense of a single infrastructure is both correct and incorrect, in that it is a combined state and set of resources that may or may not be within the span of control of an individual or organization.

Single- and multi-tenanted application services that operate in transactional, semi-deterministic or non-deterministic ways, driving logical processing, formatting, interpretation, computation and other processing of data and results from one-to-many, many-to-one or many-to-many platforms and endpoints.

The key to thinking in multiple platforms is to establish the context of how these fundamental platform services drive interactions for many industries, businesses and social networks and services. This is changing because the platforms are interconnected, altering the very basis of what defines a platform: from a single-platform to a multiple-platform concept.

This diagram illustrates some of these relationships and arrangements. It is just one example of a digital ecosystem pattern; there can be other arrangements of these system use cases to meet different needs and outcomes.

I use this model to illustrate some of the key digital strategies to consider in empowering communities, driving value-for-money strategies or establishing a joined-up device and sensor strategy for new mobile knowledge workers. This is particularly relevant to the decision-making processes of key business stakeholders today, in sales, marketing, procurement, design, sourcing, supply and operations up to board level, as well as to IT-related strategy, service integration and engineering.

Taking one key stakeholder example, the Chief Marketing Officer (CMO) is interested in, and central to, strategic channel and product development and brand management. The CMO typically seeks to develop customer zones, supplier zones, marketplace trading communities, social networking communities and behavioral insight leadership. These are critical drivers for a successful company presence, product and service brand, and market growth, as well as for managing and aligning IT cost and spend to what business performance requires. This creates a new kind of digital marketing infrastructure to drive new customer and marketing value. The following diagram illustrates types of marketing services that raise questions over the types of platforms needed for single and multiple data sources, data quality and fidelity.

These interconnected issues affect the efficacy and relevance of marketing services working at the speed, timeliness and point of contact necessary to add and create customer and stakeholder value.

What all these new converged technologies have in common is communications. Not just HTTP protocols, but a wider bandwidth of frequencies that is blurring together what can now be connected.

These protocols include Wi-Fi and other wireless systems and standards, not just in the voice band but also in the collection and use of other types of telemetry relating to other senses and detectors.

All of these share common issues of device and sensor compatibility, discovery and pairing, and security compatibility and controls.

Examples of communication standards for multiple services:

  • Wireless: WLAN, Bluetooth, ZigBee, Z-Wave, Wireless USB
  • Proximity: smartcard, passive, active, vicinity card
  • Infrared: IrDA
  • Satellite: GPS
  • Mobile: 3G, 4G LTE, cell, femtocell, GSM, CDMA, WiMAX
  • RFID: RF, LF, HF bands
  • Encryption: WEP, WPA, WPA2, WPS, other
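The compatibility, discovery and pairing issues these standards raise can be pictured as a simple intersection test: a device and a gateway can work together only if they share at least one transport and one security standard. A minimal sketch (the matching rule and all names are invented for illustration; the protocol labels follow the list above):

```python
# Toy compatibility check between a sensor and a gateway/hub: pairing is
# possible only when both sides share a transport and an encryption standard.
def can_pair(device, gateway):
    shared_transports = device["transports"] & gateway["transports"]
    shared_ciphers = device["encryption"] & gateway["encryption"]
    return bool(shared_transports) and bool(shared_ciphers)

sensor = {"transports": {"ZigBee", "Bluetooth"}, "encryption": {"WPA2"}}
hub = {"transports": {"WLAN", "ZigBee"}, "encryption": {"WPA2", "WPA"}}

print(can_pair(sensor, hub))  # → True: ZigBee transport and WPA2 in common
```

Real pairing protocols are far richer than a set intersection, of course, but the sketch shows why a multi-protocol device strategy matters: every missing shared standard is a service that simply cannot connect.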

These communication protocols impact the design and connectivity of system-to-system services. The standards relate to the operability of services used in the context of a platform, and to how those services are delivered and used by consumers and providers. How does the data and service connect with the platform? How does the service content get collected, formatted, processed and transmitted between the source and target platform? How do these devices and sensors work to support extended and remote mobile and platform services? Which distributed workloads work best on a mobile platform or a sensor platform, and which are best distributed to a dedicated or shared platform, which may be cloud-computing- or appliance-based?

Answering these questions is key to providing a consistent and powerful digital service strategy that is flexible and capable of exploiting, scaling and operating with these new system and inter-system capabilities.

This becomes central to a new generation of internet-aware data and services representing the digital ecosystem, delivering new business and consumer experiences on and across platforms.

The result is a new kind of user experience and presence strategy, one that moves the "single voice of the customer" and the "customer's single voice" to a new level, working across mobiles, tablets and other devices and sensors that translate and create new forms of information and experience for consumers and providers. Combining this with new sensors that capture, for example, positional, physical and biomedical data makes new content a reality in this new generation of digital services. Smartphones today have a price point that includes many built-in sensors: precision technologies measuring physical and biological data sources. Building these into new feedback and decision analytics creates a whole new set of possibilities in real-time and near-real-time augmented services, as well as new levels of insight into resource use and behavior.

The scale and range of data types (text, voice, video, image, semi-structured, unstructured, knowledge, metadata, contracts, IP) about social, business and physical environments have moved beyond the early days of RFID tags to encompass new internet-aware sensors, systems, devices and services. This is not just the "tabs and pads" of mobiles and tablets but a growing presence of "boards, places and spaces" that make up physical environments, turning them into part of the interactive experience and sensory input of service interaction. This now extends up to the massive scale of terrestrial communications that connect across the planet and beyond (in the case of NASA, for example), and right down to the micro, nano, pico and quantum levels in the case of molecular and nanotech engineering. All of these are now part of the modern technological landscape that is pushing the barriers of what is possible in today's digital ecosystem.

The conclusion is that strategic planning needs insight into the nature of the new infrastructures and applications that will support these new multisystem workloads and digital infrastructures.
I illustrate this in the following diagram, in what I call the "multi-platforming" framework that represents this emerging new ecosystem of services.

Digital Service = k ∑ Platforms + ∑ Connections

k = a coefficient measuring how open or closed the service is, and its potential value

Digital Ecosystem = e ∑ Digital Services

e = a coefficient measuring how diverse and dynamic the ecosystem and its service participants are
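Read literally, the two formulas can be computed as toy metrics. The sketch below uses invented weights and coefficient values purely to illustrate the shape of the model — a service's value scales the sum of its platforms by its openness coefficient k and adds its connections, and the ecosystem scales the sum of its services by a diversity coefficient e:

```python
# Toy reading of the two formulas above; all numbers are invented for
# illustration, not a calibrated valuation model.

def digital_service(k, platform_values, connection_values):
    """Digital Service = k * sum(Platforms) + sum(Connections)."""
    return k * sum(platform_values) + sum(connection_values)

def digital_ecosystem(e, service_values):
    """Digital Ecosystem = e * sum(Digital Services)."""
    return e * sum(service_values)

# A fairly closed two-platform service vs. a more open single-platform one.
s1 = digital_service(k=0.8, platform_values=[3, 2], connection_values=[1, 1])
s2 = digital_service(k=1.2, platform_values=[4], connection_values=[2])

print(digital_ecosystem(e=1.1, service_values=[s1, s2]))
```

The interesting behavior is the multiplicative role of the coefficients: a more open service (larger k) or a more diverse ecosystem (larger e) amplifies the value of every platform and service it touches, which is the argument the framework is making.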

In future blogs I will explore the impact on enterprise architecture and digital strategy, and the emergence of a new kind of architecture I call Ecosystem Architecture.

Examples of new general industry-sector Internet of Things services

 Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

4 Comments

Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0

Why is Cloud Adoption Taking so Long?

By Chris Harding, The Open Group

At the end of last year, Gartner predicted that cloud computing would become an integral part of IT in 2013 (http://www.gartner.com/DisplayDocument?doc_cd=230929). This looks a pretty safe bet. The real question is, why is it taking so long?

Cloud Computing

Cloud computing is a simple concept. IT resources are made available, within an environment that enables them to be used, via a communications network, as a service. It is used within enterprises to enable IT departments to meet users’ needs more effectively, and by external providers to deliver better IT services to their enterprise customers.
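The "as a service" part of that definition can be sketched with a toy resource pool: consumers provision and release capacity through an interface instead of owning fixed hardware. All names and numbers below are invented for illustration and stand in for what a real provider exposes through its API:

```python
# Minimal sketch of on-demand resource provisioning: a provider pool that
# allocates capacity to tenants when demand arrives and reclaims it on release.
class CloudProvider:
    def __init__(self, capacity):
        self.capacity = capacity  # free units available to all tenants
        self.allocated = {}       # units currently held, per tenant

    def provision(self, tenant, units):
        if units > self.capacity:
            raise RuntimeError("insufficient capacity")
        self.capacity -= units
        self.allocated[tenant] = self.allocated.get(tenant, 0) + units
        return units

    def release(self, tenant):
        # Released capacity goes back in the pool for other tenants.
        self.capacity += self.allocated.pop(tenant, 0)

provider = CloudProvider(capacity=100)
provider.provision("dept-a", 30)  # scale up when the department needs it...
provider.release("dept-a")        # ...and hand it back when it does not
print(provider.capacity)          # → 100
```

The elasticity in this sketch — capacity borrowed and returned rather than owned — is what lets an IT department or external provider meet fluctuating user demand more effectively.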

There are established vendors of products to fit both of these scenarios. The potential business benefits are well documented. There are examples of real businesses gaining those benefits, such as Netflix as a public cloud user (see http://www.zdnet.com/the-biggest-cloud-app-of-all-netflix-7000014298/ ), and Unilever and Lufthansa as implementers of private cloud (see http://www.computerweekly.com/news/2240114043/Unilever-and-Lufthansa-Systems-deploy-Azure-Private-cloud ).

Slow Pace of Adoption

Yet we are still talking of cloud computing becoming an integral part of IT. In the 2012 Open Group Cloud ROI survey, less than half of the respondents’ organizations were using cloud computing, although most of the rest were investigating its use. (See http://www.opengroup.org/sites/default/files/contentimages/Documents/cloud_roi_formal_report_12_19_12-1.pdf ). Clearly, cloud computing is not being used for enterprise IT as a matter of routine.

Cloud computing is now at least seven years old. Amazon’s “Elastic Compute Cloud” was launched in August 2006, and there are services that we now regard as cloud computing, though they may not have been called that, dating from before then. Other IT revolutions – personal computers, for example – have reached the point of being an integral part of IT in half the time. Why has it taken Cloud so long?

The Reasons

One reason is that using Cloud requires a high level of trust. You can lock your PC in your office, but you cannot physically secure your cloud resources. You must trust the cloud service provider. Such trust takes time to earn.

Another reason is that, although it is a simple concept, cloud computing is described in a rather complex way. The widely-accepted NIST definition (see http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf ) has three service models and four deployment models, giving a total of twelve distinct delivery combinations. Each combination has different business drivers, and the three service models are based on very different technical capabilities. Real products, of course, often do not exactly correspond to the definition, and their vendors describe them in product-specific terms. This complexity often leads to misunderstanding and confusion.
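The arithmetic behind that combination count is a simple cross product of the NIST models; enumerating it makes the twelve delivery combinations concrete:

```python
from itertools import product

# The NIST definition's three service models and four deployment models,
# crossed to show the twelve distinct delivery combinations mentioned above.
service_models = ["SaaS", "PaaS", "IaaS"]
deployment_models = ["Private", "Community", "Public", "Hybrid"]

combinations = list(product(deployment_models, service_models))
print(len(combinations))  # → 12
```

Each of those twelve pairs — public IaaS, private PaaS, hybrid SaaS and so on — carries its own business drivers and technical capabilities, which is exactly why the simple concept reads as complex.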

A third reason is that you cannot “mix and match” cloud services from different providers. The market is consolidating, with a few key players emerging as dominant at the infrastructure and platform levels. Each of them has its own proprietary interfaces. There are no real vendor-neutral standards. A recent Information Week article on Netflix (http://www.informationweek.co.uk/cloud-computing/platform/how-netflix-is-ruining-cloud-computing/240151650 ) describes some of the consequences. Customers are beginning to talk of “vendor lock-in” in a way that we haven’t seen since the days of mainframes.

The Portability and Interoperability Guide

The Open Group Cloud Computing Portability and Interoperability Guide addresses this last problem, by providing recommendations to customers on how best to achieve portability and interoperability when working with current cloud products and services. It also makes recommendations to suppliers and standards bodies on how standards and best practice should evolve to enable greater portability and interoperability in the future.

The Guide tackles the complexity of its subject by defining a simple Distributed Computing Reference Model. This model shows how cloud services fit into the mix of products and services used by enterprises in distributed computing solutions today. It identifies the major components of cloud-enabled solutions, and describes their portability and interoperability interfaces.

Platform 3.0

Cloud is not the only new game in town. Enterprises are looking at mobile computing, social computing, big data, sensors, and controls as new technologies that can transform their businesses. Some of these – mobile and social computing, for example – have caught on faster than Cloud.

Portability and interoperability are major concerns for these technologies too. There is a need for a standard platform to enable enterprises to use all of the new technologies, individually and in combination, and “mix and match” different products. This is the vision of the Platform 3.0 Forum, recently formed by The Open Group. The distributed computing reference model is an important input to this work.

The State of the Cloud

It is now at least becoming routine to consider cloud computing when architecting a new IT solution. The chances of it being selected, however, appear to be less than fifty-fifty, in spite of its benefits. The reasons include those mentioned above: lack of trust, complexity, and potential lock-in.

The Guide removes some of the confusion caused by the complexity, and helps enterprises assess their exposure to lock-in, and take what measures they can to prevent it.

The growth of cloud computing is starting to be constrained by lack of standards to enable an open market with free competition. The Guide contains recommendations to help the industry and standards bodies produce the standards that are needed.

Let’s all hope that the standards do appear soon. Cloud is, quite simply, a good idea. It is an important technology paradigm that has the potential to transform businesses, to make commerce and industry more productive, and to benefit society as a whole, just as personal computing did. Its adoption really should not be taking this long.

The Open Group Cloud Computing Portability and Interoperability Guide is available from The Open Group bookstore at https://www2.opengroup.org/ogsys/catalog/G135

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

3 Comments

Filed under Platform 3.0

Questions for the Upcoming Platform 3.0™ Tweet Jam

By Patty Donovan, The Open Group

Last week, we announced our upcoming tweet jam on Thursday, June 6 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST, which will examine how convergent technologies such as Big Data, Social, Mobile and the Internet of Things are impacting today’s business operations. We will also discuss the opportunities available to organizations that keep pace with this rapid change, and the steps they might take to get there.

The discussion will be moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct, and we welcome both members of The Open Group and interested participants alike to join the session.

The discussion will be guided by these five questions:

- Does your organization see a convergence of emerging technologies such as social networking, mobile, cloud and the internet of things?

- How has this convergence affected your business?

- Are these changes causing you to change your IT platform; if so how?

- How is the data created by this convergence affecting business models or how you make business decisions?

- What new IT capabilities are needed to support new business models and decision making?

To join the discussion, please follow the #ogp3 and #ogChat hashtags during the allotted discussion time.

For more information about the tweet jam, guidelines and general background information, please visit our previous blog post.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rob Checkal (rob.checkal at hotwirepr dot com) or leave a comment below. We anticipate a lively chat and hope you will be able to join us!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

Comments Off

Filed under Cloud, Cloud/SOA, Data management, Platform 3.0, Tweet Jam