Monthly Archives: July 2013

Speaking the Language of Business with TOGAF®

By Glenn Evans, Senior Consultant at Enterprise Architects


I remember, as a young child coming from a ‘non-sports-obsessed’ family, not knowing what a yorker was, what ‘LBW’ meant, or why Dennis Lillee and Geoffrey Boycott were such legends. I was ill-equipped to join in on those all-important schoolboy conversations – the Monday-morning autopsy of the weekend’s sporting events. Similarly, 30 years later, enterprise architecture presented me with the same dilemma.

I remember, as a junior IT engineer, hearing that the technology choice made by the customer was for ‘business reasons’, not for what was logical in my technical view of the world. I now see that ‘Architecture’ was influencing the project decisions; it was the source of those ‘business reasons’.

In my early days as an Architect, it was like being back at primary school; I struggled with the conversation. There was a level of assumed knowledge with respect to the conversation and the process that was not readily accessible to me. So, I learnt the long and hard way.

Fast forward a decade or so… As a mandatory requirement of my new role with Enterprise Architects, I recently attended our TOGAF® training. To be honest, I anticipated another dry, idealistic framework that, whilst relevant to the work I do, would probably not be all that practical and would be difficult to apply to a real-world situation. How wrong I was!

Don’t misunderstand! The TOGAF® manual is dry. Yes, it is “another framework”, and yes, you do need to tailor it to the situation you are in, but this is one of its greatest strengths: it is what makes it so flexible, and therefore relevant and applicable to real-world situations. But it’s not the framework itself that has me excited. It’s what it enables.

To me TOGAF®:

  • Is a common language, linking the discovery from each of the domains together and to the business requirements, across different levels of the business in an iterative process.
  • Provides a toolset to articulate the complex, simply. 
  • Provides a backstop, giving traceable, auditable decision support for those difficult conversations.
  • Allows the development of focused visual models of complex and disparate sets of data.

This was clearly demonstrated to me on a recent engagement. I was deep in thought, staring at a collection of printed Architecture Models displayed on a wall. One of the admin staff with no IT or business background asked me what “it all meant”. I spent a few minutes explaining that these were models of the business and the technology used in it. Not only did they immediately understand the overall concept of what they were looking at, they were actually able to start extracting real insights from the models.

In my mind, it doesn’t get any better than that. I wish I had known about TOGAF® a decade ago, I would have been a better architect – and a lot sooner.

Glenn Evans is a Senior Consultant for Enterprise Architects and is based in Melbourne, Australia.

This is an extract from Glenn’s recent blog post on the Enterprise Architects web site which you can view here.


Filed under Certifications, Enterprise Architecture, Enterprise Transformation, Professional Development, TOGAF, TOGAF®

TOGAF® 9 Certification Growth – Number of Individuals Certified Increases in the Last 12 Months – Now Over 24,000

By Andrew Josey, The Open Group

As of July 1st, 2013, the number of individuals certified in the TOGAF® 9 certification program was 23,800, representing 9,000 new certifications over the preceding twelve months. Today (22nd July 2013), the total number of certifications is 24,213.

TOGAF continues to be adopted globally with certified individuals from ninety-six different countries.


The top five countries are the UK, USA, Netherlands, Australia and India.

Individuals certified by Country – TOP 10 Countries – July 2013

Rank   # Individuals   Country        Percentage
1      3629            UK             15.32%
2      3058            USA            12.91%
3      2277            Netherlands    9.61%
4      1648            Australia      6.96%
5      1611            India          6.8%
6      1118            Canada         4.72%
7      949             South Africa   4.01%
8      819             France         3.46%
9      810             China          3.42%
10     796             Finland        3.36%
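The percentage column is simply each country’s share of the overall certification total at the snapshot date. A minimal sketch of that arithmetic (the 23,690 denominator is an assumption inferred back from the table’s own percentages; the post itself quotes totals of 23,800 and 24,213 at slightly different dates):

```python
# Recompute each country's percentage share from the raw counts in the table.
top10 = {
    "UK": 3629, "USA": 3058, "Netherlands": 2277, "Australia": 1648,
    "India": 1611, "Canada": 1118, "South Africa": 949,
    "France": 819, "China": 810, "Finland": 796,
}

ASSUMED_TOTAL = 23690  # assumption: snapshot total implied by the published percentages

def share(count: int, total: int) -> float:
    """A country's share of all certifications, as a percentage rounded to 2 dp."""
    return round(100.0 * count / total, 2)

for country, count in top10.items():
    print(f"{country}: {share(count, ASSUMED_TOTAL)}%")
```

With that assumed denominator, the computed shares reproduce the published table (UK 15.32%, Finland 3.36%).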

Individuals certified by Region – July 2013


There are forty-eight TOGAF 9 training partners worldwide and fifty-six accredited TOGAF 9 courses.  More information on TOGAF 9 Certification, including the directory of Certified People and official accredited training course calendar, can be obtained from The Open Group website at: http://www.opengroup.org/togaf9/cert/.

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.


Filed under Certifications, Enterprise Architecture, TOGAF®

The Open Group Philadelphia – Day Three Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

We are winding down Day 3 and gearing up for the next two days of training and workshops.  Today’s subject areas included TOGAF®, ArchiMate®, Risk Management, Innovation Management, Open Platform 3.0™ and Future Trends.

The objective of the Future Trends session was to discuss “emerging business and technical trends that will shape enterprise IT”, according to Dave Lounsbury, Chief Technical Officer of The Open Group.

This track also featured a presentation by Dr. William Lafontaine, VP High Performance Computing, Analytics & Cognitive Markets, IBM Research, who gave an overview of the “Global Technology Outlook 2013”.  He stated the Mega Trends are:  Growing Scale/Lower Barrier of Entry; Increasing Complexity/Yet More Consumable; Fast Pace; Contextual Overload.  Mike Walker, Strategies & Enterprise Architecture Advisor for HP, noted the key disrupters that will affect our future are the business of IT, technology itself, expectation of consumers and globalization.

The session concluded with an in-depth Q&A with Bill, Dave, Mike and Allen Brown, CEO of The Open Group.

Other sessions included presentations by TJ Virdi (Senior Enterprise Architect, Boeing) on Innovation Management, Jack Jones (President, CXOWARE, Inc.) on Risk Management and Stephen Bennett (Executive Principal, Oracle) on Big Data.

A special thanks goes to our many sponsors during this dynamic conference: Windstream, Architecting the Enterprise, Metaplexity, BIZZdesign, Corso, Avolution, CXOWARE, Penn State – Online Program in Enterprise Architecture, and Association of Enterprise Architects.

Stay tuned for post-conference proceedings to be posted soon!  See you at our conference in London, October 21-24.


Filed under ArchiMate®, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Open Platform 3.0, RISK Management, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

Day 2 at The Open Group conference in the City of Brotherly Love, as Philadelphia is also known, was another busy and remarkable day.

The plenary started with a fascinating presentation, “Managing the Health of the Nation”, by David Nash, MD, MBA, Dean of the Jefferson School of Population Health.  Healthcare is the number one industry in the city of Philadelphia, which has the highest number of patients in beds among the top 10 US cities. The key theme of his thought-provoking speech was “boundaryless information sharing” (sound familiar?), which will enable a healthcare system that is “safe, effective, patient-centered, timely, equitable, efficient”.

Following Dr. Nash’s presentation was the Healthcare Transformation Panel moderated by Allen Brown, CEO of The Open Group.  Participants were:  Gina Uppal (Fulbright-Killam Fellow, American University Program), Mike Lambert (Open Group Fellow, Architecting the Enterprise), Rosemary Kennedy (Associate Professor, Thomas Jefferson University), Blaine Warkentine, MD, MPH and Fran Charney (Pennsylvania Patient Safety Authority). The group brought different sets of experiences within the healthcare system and provided reaction to Dr. Nash’s speech.  All agreed on the need for fundamental change and that technology will be key.

The conference featured a spotlight on The Open Group’s newest forum, Open Platform 3.0™, presented by Dr. Chris Harding, Director of Interoperability.  Open Platform 3.0 was formed to advance The Open Group vision of Boundaryless Information Flow™ to help enterprises in the use of Cloud, Social, Mobile Computing and Big Data.  For more info: http://www.opengroup.org/getinvolved/forums/platform3.0

The Open Group flourishes because of people interaction and collaboration.  The accolades continued with several members being recognized for their outstanding contributions to The Open Group Trusted Technology Forum (OTTF) and the Service-Oriented Architecture (SOA) and Cloud Computing Work Groups.  To learn more about our Forums and Work Groups and how to get involved, please visit http://www.opengroup.org/getinvolved

Presentations and workshops were also held in the Healthcare, Finance and Government vertical industries. Presenters included Larry Schmidt (Chief Technologist, HP), Rajamanicka Ponmudi (IT Architect, IBM) and Robert Weisman (CEO, Build the Vision, Inc.).


Filed under ArchiMate®, Business Architecture, Cloud/SOA, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Open Platform 3.0, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day One Highlights

By Loren K.  Baynes, Director, Global Marketing Communications at The Open Group.

On Monday, July 15th, we kicked off our conference in Philadelphia. As Allen Brown, CEO of The Open Group, commented in his opening remarks, Philadelphia is the birthplace of American democracy.  This is the first time The Open Group has hosted a conference in this historic city.

Today’s plenary sessions featured keynote speakers covering topics ranging from an announcement of a new Open Group standard, appointment of a new Fellow, Enterprise Architecture and Transformation, Big Data and spotlights on The Open Group forums, Real-time Embedded Systems and Open Trusted Technology, as well as a new initiative on Healthcare.

Allen Brown noted that The Open Group has 432 member organizations with headquarters in 32 countries and over 40,000 individual members in 126 countries.

The Open Group Vision is Boundaryless Information Flow™ achieved through global interoperability in a secure, reliable and timely manner.  But as stated by Allen, “Boundaryless does not mean there are no boundaries.  It means that boundaries are permeable to enable business.”

Allen also presented an overview of the new “Dependability Through Assuredness™” Standard.  The Open Group Real-time Embedded Systems Forum is the home of this standard. More news to come!

Allen introduced Dr. Mario Tokoro (CEO of Sony Computer Science Laboratories), who began this project in 2006. Dr. Tokoro stated, “Thank you from the bottom of my heart for understanding the need for this standard.”

Eric Sweden, MSIH, MBA, Program Director, Enterprise Architecture & Governance, National Association of State CIOs (NASCIO), offered a presentation entitled “State of the States – NASCIO on Enterprise Architecture: An Emphasis on Cross-Jurisdictional Collaboration across States”.  Eric noted, “Enterprise Architecture is a blueprint for better government.” Furthermore, “Cybersecurity is a top priority for government”.

Dr. Michael Cavaretta, Technical Lead and Data Scientist with Ford Motor Company, discussed “The Impact of Big Data on the Enterprise”.  The five keys, according to Dr. Cavaretta, are “perform, analyze, assess, track and monitor”.  Please see the following transcript from a Big Data analytics podcast, hosted by The Open Group, in which Dr. Cavaretta participated earlier this year: http://blog.opengroup.org/2013/01/28/the-open-group-conference-plenary-speaker-sees-big-data-analytics-as-a-way-to-bolster-quality-manufacturing-and-business-processes/

The final presentation during Monday morning’s plenary was “Enabling Transformation Through Architecture” by Lori Summers (Director of Technology) and Amit Mayabhate (Business Architect Manager) with Fannie Mae Multifamily.

Lori stated that their organization had adopted Business Architecture and today they have an integrated team who will complete the transformation, realize value delivery and achieve their goals.

Amit noted “Traceability from the business to architecture principles was key to our design.”

In addition to the many interesting and engaging presentations, several awards were presented.  Joe Bergmann, Director, Real-time and Embedded Systems Forum, The Open Group, was appointed Fellow by Allen Brown in recognition of Joe’s major achievements over the past 20+ years with The Open Group.

Other special recognition recipients include members from Oracle, IBM, HP and Red Hat.

In addition to the plenary session, we hosted meetings on the Finance, Government and Healthcare industry verticals. Today is only Day One of The Open Group conference in Philadelphia. Please stay tuned for more exciting conference highlights over the next couple of days.


Filed under ArchiMate®, Business Architecture, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Security Architecture, Standards, TOGAF®

The Open Group Conference to Emphasize Healthcare as Key Sector for Ecosystem-Wide Interactions

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.


I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sector.

We’re here now with a panel of experts to explore how new IT trends are empowering improvements, specifically in the area of healthcare. We’ll learn how healthcare industry organizations are seeking large-scale transformation and what are some of the paths they’re taking to realize that.

We’ll see how improved cross-organizational collaboration and such trends as big data and cloud computing are helping to make healthcare more responsive and efficient.

With that, please join me in welcoming our panel, Jason Uppal, Chief Architect and Acting CEO at clinicalMessage. Welcome, Jason.

Jason Uppal: Thank you, Dana.


Gardner: And we’re also joined by Larry Schmidt, Chief Technologist at HP for the Health and Life Sciences Industries. Welcome, Larry.

Larry Schmidt: Thank you.

Gardner: And also, Jim Hietala, Vice President of Security at The Open Group. Welcome back, Jim. [Disclosure: The Open Group and HP are sponsors of BriefingsDirect podcasts.]

Jim Hietala: Thanks, Dana. Good to be with you.

Gardner: Let’s take a look at this very interesting and dynamic healthcare sector, Jim. What, in particular, is so special about healthcare and why do things like enterprise architecture and allowing for better interoperability and communication across organizational boundaries seem to be so relevant here?

Hietala: There’s general acknowledgement in the industry that, inside of healthcare and inside the healthcare ecosystem, information either doesn’t flow well or it only flows at a great cost in terms of custom integration projects and things like that.

Fertile ground

From The Open Group’s perspective, it seems that the healthcare industry and the ecosystem really is fertile ground for bringing to bear some of the enterprise architecture concepts that we work with at The Open Group in order to improve, not only how information flows, but ultimately, how patient care occurs.

Gardner: Larry Schmidt, similar question to you. What are some of the unique challenges that are facing the healthcare community as they try to improve on responsiveness, efficiency, and greater capabilities?

Schmidt: There are several things that have not really kept up with what technology is able to do today.

For example, the whole concept of personal observation comes into play in what we would call “value chains” that exist right now between a patient and a doctor. We look at things like mobile technologies and want to be able to leverage that to provide additional observation of an individual, so that the doctor can make a more complete diagnosis of some sickness or possibly some medication that a person is on.

We want to be able to see that observation in real life, as opposed to having to take that in at the office, which typically winds up happening. I don’t know about everybody else, but every time I go see my doctor, oftentimes I get what’s called white coat syndrome. My blood pressure will go up. But that’s not giving the doctor an accurate reading from the standpoint of providing great observations.

Technology has advanced to the point where we can do that in real time using mobile and other technologies, yet the communication flow, that information flow, doesn’t exist today, or is at best, not easily communicated between doctor and patient.


If you look at the ecosystem, as Jim offered, there are plenty of places that additional collaboration and communication can improve the whole healthcare delivery model.

That’s what we’re about. We want to be able to find the places where the technology has advanced, where standards don’t exist today, and just fuel the idea of building common communication methods between those stakeholders and entities, allowing us to then further the flow of good information across the healthcare delivery model.

Gardner: Jason Uppal, let’s think about what architecture and methodologies, in addition to technology, can bring to bear here. Is there also a lag in terms of process thinking in healthcare, as well as in technology adoption?

Uppal: I’m going to refer to a presentation that I watched from a very well-known surgeon from Harvard, Dr. Atul Gawande. His point was that, in the last 50 years, the medical industry has made great strides in identifying diseases, drugs, procedures, and therapies, but one thing he was alluding to was that medicine forgot about cost; everything has a cost.

At what price?

Today, in his view, we can cure a lot of diseases and lot of issues, but at what price? Can anybody actually afford it?


His view is that if healthcare is going to change and improve, it has to be outside of the medical industry. The tools that we have are better today, like collaborative tools that are available for us to use, and those are the ones that he was recommending that we need to explore further.

That is where enterprise architecture is a powerful methodology to use and say, “Let’s take a look at it from a holistic point of view of all the stakeholders. See what their information needs are. Get that information to them in real time and let them make the right decisions.”

Therefore, there is no reason for the health information to be stuck in organizations. It could go with where the patient and providers are, and let them make the best decision, based on the best practices that are available to them, as opposed to having siloed information.

So enterprise-architecture methods are most suited for developing a very collaborative environment. Dr. Gawande was pointing out that, if healthcare is going to improve, it has to think about it not as medicine, but as healthcare delivery.


Gardner: And it seems that not only are there challenges in terms of technology adoption and even operating more like an efficient business in some ways. We also have very different climates from country to country, jurisdiction to jurisdiction. There are regulations, compliance, and so forth.

Going back to you, Larry, how important of an issue is that? How complex does it get because we have such different approaches to healthcare and insurance from country to country?

Schmidt: There are definitely complexities that occur based on the different insurance models and how healthcare is delivered across and between countries, but some of the basic and fundamental activities in the past that happened as a result of delivering healthcare are consistent across countries.

As Jason has offered, enterprise architecture can provide us the means to explore what the art of the possible might be today. It could allow us the opportunity to see how innovation can occur if we enable better communication flow between the stakeholders that exist with any healthcare delivery model in order to give us the opportunity to improve the overall population.

After all, that’s what this is all about. We want to be able to enable a collaborative model throughout the stakeholders to improve the overall health of the population. I think that’s pretty consistent across any country that we might work in.

Ongoing work

Gardner: Jim Hietala, maybe you could help us better understand what’s going on within The Open Group and, even more specifically, at the conference in Philadelphia. There is the Population Health Working Group and there is work towards a vision of enabling the boundaryless information flow between the stakeholders. Any other information and detail you could offer would be great.

Hietala: On Tuesday of the conference, we have a healthcare focus day. The keynote that morning will be given by Dr. David Nash, Dean of the Jefferson School of Population Health. He’ll give what’s sure to be a pretty interesting presentation, followed by a reactors’ panel, where we’ve invited folks from different stakeholder constituencies.


We are going to have clinicians there. We’re going to have some IT folks and some actual patients to give their reaction to Dr. Nash’s presentation. We think that will be an interesting and entertaining panel discussion.

The balance of the day, in terms of the healthcare content, we have a workshop. Larry Schmidt is giving one of the presentations there, and Jason and myself and some other folks from our working group are involved in helping to facilitate and carry out the workshop.

The goal is to look into healthcare challenges, desired outcomes, the extended healthcare enterprise, and the extended healthcare IT enterprise; to gather the pain points that are out there around things like interoperability; and to develop a work program coming out of this.


So we expect it to be an interesting day. If you are in the healthcare IT field, or just the healthcare field generally, it would definitely be a day well spent to check it out.

Gardner: Larry, you’re going to be talking on Tuesday. Without giving too much away, maybe you can help us understand the emphasis that you’re taking, the area that you’re going to be exploring.

Schmidt: I’ve titled the presentation “Remixing Healthcare through Enterprise Architecture.” Jason offered some thoughts as to why we want to leverage enterprise architecture to bring discipline to healthcare. My thoughts are that we want to be able to make sure we understand how the collaborative model would work in healthcare, taking into consideration all the constituents and stakeholders that exist within the complete ecosystem of healthcare.

This is not just collaboration across the doctors, patients, and maybe the payers in a healthcare delivery model. It could extend as far as the drug companies: getting them to a point where they can reorder their raw materials to produce new drugs in the case of an epidemic that might be occurring.

Real-time model

It would be a real-time model that allows us the opportunity to understand what’s truly happening, both to an individual from a healthcare standpoint, as well as to a country or a region within a country and so on from healthcare. This remixing of enterprise architecture is the introduction to that concept of leveraging enterprise architecture into this collaborative model.

Then, I would like to talk about some of the technologies that I’ve had the opportunity to explore around what is available today in technology. I believe we need to have some type of standardized messaging or collaboration models to allow us to further facilitate the ability of that technology to provide the value of healthcare delivery or betterment of healthcare to individuals. I’ll talk about that a little bit within my presentation and give some good examples.

It’s really interesting. I just traveled from my company’s home base back to my own and I thought about something like a body scanner that you get into at the airport. I know we’re in the process of eliminating some of those scanners from airport security now, but could that possibly be something that becomes an element within healthcare delivery? Every time your body is scanned, there’s a possibility you can gather information from that, and allow it to become a part of your electronic medical record.


Hopefully, that was forward thinking, but that kind of thinking is going to play into the art of the possible, with what we are going to be doing, both in this presentation and talking about that as part of the workshop.

Gardner: Larry, we’ve been having some other discussions with The Open Group around what they call Open Platform 3.0™, which is the confluence of big data, mobile, cloud computing, and social.

One of the big issues today is this avalanche of data: the Internet of things, but also the Internet of people. It seems that the more work that’s done to bring Open Platform 3.0 benefits to bear on business decisions, the more impactful it could be for sensor readings and other data that come from patients, regardless of where they are, to reach a medical establishment, regardless of where it is.

So do you think we’re really on the cusp of a significant shift in how medicine is actually conducted?

Schmidt: I absolutely believe that. There is a lot of information available today that could be used in helping our population to be healthier. And it really isn’t only the challenge of the communication model that we’ve been speaking about so far. It’s also understanding the information that’s available to us to take that and make that into knowledge to be applied in order to help improve the health of the population.

As we explore this from an as-is model in enterprise architecture to something that we believe we can first enable through a great collaboration model, through standardized messaging and things like that, I believe we’re going to get into even deeper detail around how information can truly provide empowered decisions to physicians and individuals around their healthcare.

So it will carry forward into the big data and analytics challenges that we have talked about and currently are talking about with The Open Group.

Healthcare framework

Gardner: Jason Uppal, we’ve also seen how in other business sectors, industries have faced transformation and have needed to rely on something like enterprise architecture and a framework like TOGAF® in order to manage that process and make it something that’s standardized, understood, and repeatable.

It seems to me that healthcare can certainly use that, given the pace of change, but that the impact on healthcare could be quite a bit larger in terms of actual dollars. This is such a large part of the economy that even small incremental improvements can have dramatic effects when it comes to dollars and cents.

So is there a benefit to bringing enterprise architect to healthcare that is larger and greater than other sectors because of these economics and issues of scale?

Uppal: That’s a great way to think about this. In other industries, the value of applying enterprise architecture to banking and insurance may be easily measured in terms of dollars and cents, but healthcare is a fundamentally different economy and industry.

It’s not about dollars and cents. It’s about people’s lives, and loved ones who are sick, who could very easily be treated, if they’re caught in time and the right people are around the table at the right time. So this is more about human cost than dollars and cents. Dollars and cents are critical, but human cost is the larger play here.


Secondly, when we think about applying enterprise architecture to healthcare, we’re not talking about just the U.S. population. We’re talking about global population here. So whatever systems and methods are developed, they have to work for everybody in the world. If the U.S. economy can afford an expensive healthcare delivery, what about the countries that don’t have the same kind of resources? Whatever methods and delivery mechanisms you develop have to work for everybody globally.

That’s one of the things that a methodology like TOGAF brings out and says to look at it from every stakeholder’s point of view, and unless you have dealt with every stakeholder’s concerns, you don’t have an architecture, you have a system that’s designed for that specific set of audience.

The cost is not the 18 percent of the gross domestic product in the U.S. that healthcare represents. It’s the human cost, which is many multiples of that. That’s one of the areas where we could really start to think about how we affect that part of the economy: not just the 18 percent of it, but the larger part of the economy, to improve the health of the population, not only in North America, but globally.

If that’s the case, then the real impact on the greater world economy will come from improving population health, and population health is probably becoming the biggest problem in our economy.

We’ll be testing these methods at a greater, international level, as opposed to just at an organization and industry level. This is a much larger challenge. A methodology like TOGAF is proven, and it could be stressed and tested to that level. This is a great opportunity for us to apply our tools and science to a problem that is larger than just dollars. It’s about humans.

All “experts”

Gardner: Jim Hietala, in some ways, we’re all experts on healthcare. When we’re sick, we go for help and interact with a variety of different services to maintain our health and to improve our lifestyle. But in being experts, I guess that also means we are witnesses to some of the downside of an unconnected ecosystem of healthcare providers and payers.

One of the things I’ve noticed in that vein is that I have to deal with different organizations that don’t seem to communicate well. If there’s no central process organizer, it’s really up to me as the patient to pull the lines together between the different services — tests, clinical observations, diagnosis, results, sharing the information, and so forth.

Have you done any studies, or do you have anecdotal information, about how that boundaryless information flow would still be relevant, even with more of a centralized repository that all the players could draw on, a sort of collaborative team resource? I know that’s worked in other industries. Isn’t this a perfect opportunity for that boundarylessness to be managed?

Hietala: I would say it is. We all have experience with going to see a primary physician, maybe getting sent to a specialist, getting some tests done, and the boundaryless information that’s flowing tends to be on paper, delivered by us as patients, in almost all cases.

So the opportunity to improve that situation is pretty obvious to anybody who’s been in the healthcare system as a patient. I think it’s a great place to be doing work. There’s a lot of money flowing to try and address this problem, at least here in the U.S. with the HITECH Act and some of the government spending around trying to improve healthcare.

You’ve got healthcare information exchanges that are starting to develop, and you’ve got lots of pain points for organizations trying to share information without standards that enable them to do it. It seems like an area with a great opportunity to bring lots of improvement.

Gardner: Let’s look for some examples of where this has been attempted and what the success brings about. I’ll throw this out to anyone on the panel. Do you have any examples that you can point to, either named organizations or anecdotal use-case scenarios, of a better-organized, architectural approach: leveraging IT efficiently and effectively, allowing data to flow, putting in processes that are repeatable, centralized, organized, and understood? How does that work out?

Uppal: I’ll give you an example. One of the things that happens when a patient is admitted to hospital is that, while in hospital, they get what’s called high-voltage care. There is staff around them 24×7. There are lots of people around, and every specialty that you can think of is available to them. So the patient, in about two or three days, starts to feel much better.

When that patient gets discharged, they get discharged to home most of the time. They go from very high-voltage care to next to no care. This is one of the areas where one of the organizations we work with is able to discharge the patient and, instead of discharging them to the primary care doc, who may not receive any records from the hospital for several days, discharge them into a virtual team. So if the patient is at home, the virtual team is available to them through their mobile phone 24×7.

Connect with provider

If, at 3 o’clock in the morning, the patient doesn’t feel right, instead of having to call an ambulance to go to hospital once again and get readmitted, they have a chance to connect with their care provider at that time and say, “This is what the issue is. What do you want me to do next? Is this normal for the medication that I am on, or this is something abnormal that is happening?”

When that information is available to that care provider who may not necessarily have been part of the care team when the patient was in the hospital, that quick readily available information is key for keeping that person at home, as opposed to being readmitted to the hospital.

We all know that the cost of being in a hospital is 10 times more than it is being at home. But there’s also inconvenience and human suffering associated with being in a hospital, as opposed to being at home.

Those are some of the examples that we have, but they are very limited, because our current health ecosystem is very organization-specific, not patient- and provider-specific. This is an area with huge room for opportunity in healthcare delivery: thinking about health information not in the context of the organization where the patient is, but rather in a cloud, where it’s an association between the patient, the provider, and the health information that’s there.

In the past, we used to have email that was within our four walls. All of a sudden, with Gmail and Yahoo Mail, we have email available to us anywhere. A similar thing could happen for the healthcare record. It could sit somewhere in a cloud ecosystem setting, where it’s securely protected and used only by people who have been granted access to it.

Those are some of the examples where extending that model will bring infinite value, not only reducing the cost of care but improving its quality.

Schmidt: Jason touched upon the home healthcare scenario and being able to provide touch points at home. Another place that we see evolving right now in the industry is the whole concept of the mobile office space. Both developing countries, as well as rural places within developed countries, are actually getting rural hospitals and rural healthcare offices dropped in by helicopter, allowing the people who live in those communities the opportunity to talk to a doctor via satellite technologies and so on.

The whole concept of an architecture around, and being able to deal with, an extension of what truly ends up being telemedicine is something that we’re seeing today. It would be wonderful if we could point to standards that allow us to facilitate both the communication protocols and the information flows in that type of setting.

Many corporations can jump on the bandwagon to help the rural communities get the healthcare information and capabilities that they need via the whole concept of telemedicine.

That’s another area where enterprise architecture comes into play. Now that we see examples of that working in the industry today, I’m hoping that, as part of this working group, we’ll get to the point where we’re able to facilitate that much better, enabling innovation to occur for multiple companies via some of the architecture work we are planning on producing.

Single view

Gardner: It seems that we’ve come a long way on the business side in many industries toward getting a single view of the customer, as it’s called, through customer relationship management, big data, and spreading the analysis around among different data sources and types. This sounds like a perfect fit for a single view of the patient, across their life and across their care spectrum, involving many different types of organizations. But the government also needs to have a role here.

Jim Hietala, at The Open Group Conference in Philadelphia, you’re focusing on not only healthcare, but finance and government. Regarding the government and some of the agencies that you all have as members on some of your panels, how well do they perceive this need for enterprise architecture level abilities to be brought to this healthcare issue?

Hietala: We’ve seen signs from folks in government that are encouraging to us in bringing this work to the forefront. There is a recognition that there needs to be better data flowing throughout the extended healthcare IT ecosystem, and I think generally they are supportive of initiatives like this to make that happen.

Gardner: Of course having conferences like this, where you have a cross pollination between vertical industries, will perhaps allow some of the technical people to talk with some of the government people too and also have a conversation with some of the healthcare people. That’s where some of these ideas and some of the collaboration could also be very powerful.

I’m afraid we’re almost out of time. We’ve been talking about an interesting healthcare transition, moving into a new phase or even era of healthcare.

Our panel of experts has been looking at some of the trends in IT and how they are empowering improvement in how healthcare can be more responsive and efficient. And we’ve seen how healthcare industry organizations can undertake large-scale transformation using cross-organizational collaboration, for example, and other such tools as big data, analytics, and cloud computing to help solve some of these issues.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference this July in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL, and you will hear more about healthcare and Open Platform 3.0, as well as enterprise transformation in the finance, government, and healthcare sectors.

With that, I’d like to thank our panel. We’ve been joined today by Jason Uppal, Chief Architect and Acting CEO at clinicalMessage. Thank you so much, Jason.

Uppal: Thank you, Dana.

Gardner: And also Larry Schmidt, Chief Technologist at HP for the Health and Life Sciences Industries. Thanks, Larry.

Schmidt: You bet, appreciate the time to share my thoughts. Thank you.

Gardner: And then also Jim Hietala, Vice President of Security at The Open Group. Thanks so much.

Hietala: Thank you, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these thought leader interviews. Thanks again for listening and come back next time.


NASCIO Defines State of Enterprise Architecture at The Open Group Conference in Philadelphia

By E.G. Nadhan, HP

I have attended and blogged about many Open Group conferences. The keynotes at these conferences, as at other conferences, provide valuable insight into the key messages and the underlying theme of the event, which for The Open Group Conference in Philadelphia is Enterprise Architecture and Enterprise Transformation. Therefore, it is no surprise that Eric Sweden, Program Director, Enterprise Architecture & Governance, NASCIO, will be delivering one of the keynotes, on “State of the States: NASCIO on Enterprise Architecture”. Sweden asserts that “Enterprise Architecture” provides an operating discipline for the creation, operation, continual re-evaluation and transformation of an “Enterprise.” Not only do I agree with this assertion, but I would add that the proper creation, operation and continuous evaluation of the “Enterprise” systemically drives its transformation. Let’s see how.

Creation. This phase involves the definition of the Enterprise Architecture (EA) in the first place. Most often, this involves the definition of an architecture that factors in what is in place today while taking into account the future direction. TOGAF® (The Open Group Architecture Framework) provides a framework for developing this architecture from a business, application, data, infrastructure and technology standpoint; in alignment with the overall Architecture Vision with associated architectural governance.

Operation. EA is not a done deal once it has been defined. It is vital that the EA defined is sustained on a consistent basis with the advent of new projects, new initiatives, new technologies, and new paradigms. As the abstract states, EA is a comprehensive business discipline that drives business and IT investments. In addition to driving investments, the operation phase also includes making the requisite changes to the EA as a result of these investments.

Continuous Evaluation. We live in a landscape of continuous change with innovative solutions and technologies constantly emerging. Moreover, the business objectives of the enterprise are constantly impacted by market dynamics, mergers and acquisitions. Therefore, the EA defined and in operation must be continuously evaluated against the architectural principles, while exercising architectural governance across the enterprise.

Transformation. EA is an operating discipline for the transformation of an enterprise. Enterprise Transformation is not a destination — it is a journey that needs to be managed — as characterized by Twentieth Century Fox CIO, John Herbert. To Forrester Analyst Phil Murphy, Transformation is like the Little Engine That Could — focusing on the business functions that matter. (Big Data – highlighted in another keynote at this conference by Michael Cavaretta — is a paradigm gaining a lot of ground for enterprises to stay competitive in the future.)

Global organizations are enterprises of enterprises, undergoing transformation while faced with the challenges of systemic architectural governance. NASCIO has valuable insight into the challenges faced by the 50 “enterprises” represented by each of the United States: challenges that contrast the need for healthy co-existence among these states with the desire to retain a degree of autonomy. Therefore, I look forward to this keynote to see how EA done right can drive the transformation of the Enterprise.

By the way, remember when Enterprise Architecture was done wrong close to the venue of another Open Group conference?

How does Enterprise Architecture drive the transformation of your enterprise? Please let me know.

A version of this blog post originally appeared on the HP Journey through Enterprise IT Services Blog.

HP Distinguished Technologist and Cloud Advisor E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for The Open Group Cloud Computing Governance project.


As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into the The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we’re here now to learn more about how to leverage Platform 3.0 as more than an IT shift — and as a business game-changer. It will be a big topic at next week’s conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes, or technical changes, going on that are bringing together a lot of factors. They’re turning into this sort of super-saturated solution of ideas and possibilities, and into this emerging idea that this represents a new platform. I think it’s a pretty fundamental change.

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and presenting the information in a way such that people can make decisions on it. Given all that we’re starting to realize, we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in trying to bring Platform 3.0 together is to try to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt and evolve, and what business processes they need to put in place, in order to take maximum advantage of this change in the way that we look at information.

Harding: Enterprises have to keep up with the way that things are moving in order to keep their positions in their industries. Enterprises can’t afford to be working with yesterday’s technology. It’s a case of being able to understand the information that they’re presented with, and to make the best decisions.

We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale — the whole three Vs: velocity, volume, and value — on its own could perhaps be a game-changing shift in the market. The drive of mobile devices into the lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

It has to do with not just one solution, not one subscription model — because we’re now into this subscription-model era … the subscription economy, as one group tends to describe it. Now, we’re looking not only at providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 is addressing this point by bringing all of this together. Just look at the numbers. Look at the scale that we’re dealing with — 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion subscriptions according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We’ve had massive growth in the scale of mobile data traffic and internet data. Mobile data traffic is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data center traffic, combining network- and internet-based storage, will reach 6.6 zettabytes annually, and nearly two thirds of this will be cloud-based by 2016. This is only going to grow, as social networking reaches nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a 2.55 billion global audience, by 2017 — another extraordinary figure, from an eMarketing.com study.

It is not surprising that many industry analysts are seeing growth in the converging technologies of mobility, social computing, big data and cloud at 30 to 40 percent, and that the shift to B2C commerce, passing $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases that have emerged in the open-source world and things of that nature, but there are also the analytics and the data-stewardship aspects of it.

We need to bring in mechanisms, so the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of the enterprise IT or where end users will drive the selection of what they’re going to do with analytic tools and recommendation tools, to take the data and turn it into information. One of the things you can’t do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally — how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points that the shape of data has changed, but also to say something about why IT should get involved. We’re seeing that there’s a shift in the constituency of who is using this data.

We have the Chief Marketing Officer and the Chief Procurement Officer and other key line of business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We’ve got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be sorted into new types of mobile devices, new types of data intelligence, and ways of delivering this kind of service.

I read recently in MIT Sloan Management Review an article that asked what the role of the CIO is. There is still the critical role of managing the security, compliance, and performance of these systems. But there’s also a socialization of IT, and this is where positioning architectures that are cross-platform is key to delivering real value to the business users in the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it’s not only that a whole set of different people are getting involved with different kinds of information, but there’s also a step change in the speed with which all this is delivered. It’s no longer the case that you can say, “Oh well, we need some kind of information system to manage this information. We’ll procure it and get a program written,” and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It’s really a case of having both the standard technology platform and the systems and business processes for using it understood and in place.

Then, you can do all these things quickly and build on learning from what people have done in the past, rather than going out into all sorts of new experimental things that might not lead anywhere. It’s a case of building up the standard platform on industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practice and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. It’s the platform scalability of understanding what you can do with your current legacy systems in introducing cloud computing or big data, and the infrastructure that gives you this, what we call multiplexing of resources. We’re very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

Going into networking and network scalability, a lot of the customers who have inherited their old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it’s all about connectivity in the field. I meet a number of clients who are saying, “We’ve got this cloud service,” or “This service is in a certain area of my country. If I move to another part of the country, or I’m traveling, I can’t get connectivity.” That’s the big issue of scaling.

Another one is application programming interfaces (APIs). What we’re seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities for what Chris Anderson of Wired used to call the “long tail” effect. It is now a reality in terms of building the kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are what they need to start thinking about for IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formational stages of the “third platform,” or Platform 3.0, for The Open Group and as an industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 Forum. We’re leveraging a lot of the components that have been done previously by the work of the members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of Things.

First step

Our first step is to bring those things together to make sure that we’ve got a foundation to depart from. The next thing is that, through our Platform 3.0 Forum and the Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from the technological aspects of it and what standards are needed, to taking a clue from our previous cloud working group: what are the best business practices in order to understand and then adopt some of these Platform 3.0 concepts to get your business using them?

What we’re really working toward in Philadelphia is to set up an exchange of ideas among the people who can, from the buy side, bring in their use cases and, from the supply side, bring in their ideas about what the technology possibilities are, and to bring those together and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We’ve heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, when we think about the ability of the data to be so powerful when processed properly, and when recommendations can be delivered to the right place at the right time, we also recognize that there are limits to a manual, or even human-level, approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, it seems like there are already some early examples where, by bringing cloud, data, social, mobile, and granular interactions together, we’ve begun to see how a recommendation engine could be brought to bear. I’m thinking about the Siri capability at Apple and even some of the examples of the Watson technology at IBM.

So to our panel: are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a supercomputer or data center of supercomputers, brought to bear on almost any problem instantly, and then the result delivered directly to a center, a smartphone, any number of end points?

It seems that the potential here is mind boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet. The advent of IPv6 and the explosion in multimedia services will start to drive the next generation of the Internet.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, insight into what’s happening, and the predictive nature of these services, it becomes something much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing, data, traffic, devices, and social networking we describe in the scope of Platform 3.0. This will add augmented intelligence, and it’s something that’s really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly, in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but for businesses alone that would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to gain significant advantage and first-mover benefits?

Harding: Businesses are always taking stock. They understand their environments. They understand how the world they live in is changing, and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, “So now this is where we could make a change to our business.” It’s the vision moment, where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and, I’m very happy to say, it’s something that computers aren’t going to be able to do for a very long time yet. It’s going to really be down to business people to do this, as they have been doing for centuries and millennia: to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling that gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years, but in the next couple of technology cycles, that we’ll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data and all of the medical textbooks ever written are data. Pick your industry, and there is a huge amount of knowledge that humans must currently keep on top of.

This approach and these advances in recommendation engines, driven by the availability of big data, are going to produce profound changes in the way knowledge workers perform their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.


Filed under ArchiMate®, Business Architecture, Cloud, Cloud/SOA, Conference, Data management, Enterprise Architecture, Platform 3.0, Professional Development, TOGAF®

The Open Group July Conference Emphasizes Value of Placing Structure and Agility Around Enterprise Risk Reduction Efforts

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15 in Philadelphia.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these discussions on Enterprise Transformation in the finance, government, and healthcare sectors.

We’re here now with a panel of experts to explore new trends and solutions in the area of anticipating risk and how to better manage organizations with that knowledge. We’ll learn how enterprises are better delivering risk assessment and, one hopes, defenses, in the current climate of challenging cybersecurity. And we’ll see how predicting risks and potential losses accurately is an essential ingredient in enterprise transformation.

With that, please join me in welcoming our panel. We’re here with Jack Freund, the Information Security Risk Assessment Manager at TIAA-CREF. Jack has spent over 14 years in enterprise IT, is a visiting professor at DeVry University, and also chairs a risk-management subcommittee for ISACA. Welcome back, Jack.

Jack Freund: Glad to be here, Dana. Thanks for having me.

Gardner: We’re also here with Jack Jones. He is the Principal at CXOWARE, and he has more than nine years of experience as a Chief Information Security Officer (CISO). He is also the inventor of the FAIR risk analysis framework. Welcome, Jack.

Jack Jones: Thank you very much.

Gardner: We’re also here with Jim Hietala. He is the Vice President, Security, at The Open Group. Welcome, Jim.

Jim Hietala: Thanks, Dana, good to be here.

Gardner: Let’s start with you, Jim. It’s been about six months since we spoke about these issues around risk assessment and understanding risk accurately, and it’s hard to imagine things getting any better in the last six months. There’s been a lot of news and interesting developments in the cyber-security landscape.

So has this heightened interest? What are The Open Group and others doing in this field of risk assessment and accuracy, in determining what your losses might be, and how can that be a useful tool?

Hietala: I would say it has. Certainly, in the cyber security world in the past six or nine months, we’ve seen more and more discussion of the threats that are out there. We’ve got nation-state types of threats that are very concerning, very serious, and that organizations have to consider.

With what’s happening, you’ve seen that the US Administration and President Obama direct the National Institute of Standards and Technology (NIST) to develop a new cybersecurity framework. Certainly on the government side of things, there is an increased focus on what can we do to increase the level of cybersecurity throughout the country in critical infrastructure. So my short answer would be yes, there is more interest in coming up with ways to accurately measure and assess risk so that we can then deal with it.

Gardner: Jack Jones, do you also see a maturity going on, or are we just hearing more in the news and therefore there is a perception shift? How do you see things? How have things changed, in your perception, over the last six to nine months?

Jones: I continue to see growth and maturity, especially in areas of understanding the fundamental nature of risk and exploration of quantitative methods for it. A few years ago, that would have seemed unrealistic at best, and outlandish at worst in many people’s eyes. Now, they’re beginning to recognize that it is not only pragmatic, but necessary in order to get a handle on much of what we have to do from a prioritization perspective.

Gardner: Jack Freund, are you seeing an elevation in the attention being paid to risk issues inside companies and larger organizations? Is this something that’s getting the attention of all the people it should?

Freund: We’re entering a phase where there is going to be increased regulatory oversight over very nearly everything. When that happens, all eyes are going to turn to IT and IT risk-management functions to answer the question of whether we’re handling the right things. Without quantifying risk, you’re going to have a very hard time saying to your board of directors that you’re handling the right things the way a reasonable company should.

As those regulators start to see and compare among other companies, they’ll find that these companies over here are doing risk quantification, and you’re not. You’re putting yourself at a competitive disadvantage by not being able to provide those same sorts of services.

Gardner: So you’re saying that the market itself hasn’t been enough to drive this, and that regulation is required?

Freund: It’s probably a stronger driver than market forces at this point. The market is always going to be able to help push that to a more prominent role, but especially in information security. If you’re not experiencing primary losses as a result of these sorts of things, then you have to look to economic externalities, which are largely put in play by regulatory forces here in the United States.

Jones: To support Jack’s statement that regulators are becoming more interested in this too, just in the last 60 days, I’ve spent time training people at two regulatory agencies on FAIR. So they’re becoming more aware of these quantitative methods, and their level of interest is rising.

Gardner: Jack Jones, this is probably a good time for us to explain a little bit more about FAIR. For those listeners who might not be that familiar with it, please take a moment to give us the high-level overview of what FAIR is.

Jones: Sure, just a thumbnail sketch of it. It’s, first and foremost, a model for what risk is and how it works. It’s a decomposition of the factors that make up risk. If you can measure or estimate the value of those factors, you can derive risk quantitatively in dollars and cents.

You see a lot of “risk quantification” based on ordinal scales — 1, 2, 3, 4, 5 scales, that sort of thing. But that’s actually not quantitative. If you dig into it, there’s no way you could defend a mathematical analysis based on those ordinal approaches. So FAIR is this model for risk that enables true quantitative analysis in a very pragmatic way.
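FAIR’s full taxonomy decomposes risk into more factors than a conversation can cover (threat event frequency, vulnerability, primary and secondary loss, and so on), but the core idea Jones describes, deriving annualized loss exposure from estimated loss event frequency and loss magnitude, can be sketched in a few lines. Everything below is an illustrative assumption rather than the standard’s prescribed calculation: the function name, the triangular distributions, and the example figures are invented for demonstration.

```python
import random

def simulate_ale(freq_min, freq_likely, freq_max,
                 mag_min, mag_likely, mag_max, trials=10_000):
    """Monte Carlo sketch of annualized loss exposure (ALE):
    draw loss event frequency and per-event loss magnitude from
    triangular distributions and average the resulting annual loss."""
    total = 0.0
    for _ in range(trials):
        frequency = random.triangular(freq_min, freq_max, freq_likely)
        magnitude = random.triangular(mag_min, mag_max, mag_likely)
        total += frequency * magnitude
    return total / trials

# Hypothetical estimates: 0.5-4 loss events per year (most likely 2),
# $50k-$500k lost per event (most likely $200k)
ale = simulate_ale(0.5, 2, 4, 50_000, 200_000, 500_000)
print(f"Estimated annualized loss exposure: ${ale:,.0f}")
```

Because the inputs are estimated ranges rather than single ordinal scores, the output is a dollar figure that can be compared, summed, and defended mathematically, which is exactly what a 1-to-5 scale cannot support.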

Gardner: FAIR stands for a Factor Analysis of Information Risk. Is that correct?

Jones: That is correct.

Gardner: Jim Hietala, in addition to a very interesting and dynamic cybersecurity landscape, we also have major trends getting traction in big data, cloud computing, and mobile. There’s lots going on in the IT world. Perhaps IT’s very nature, the roles and responsibilities, are shifting. Is doing risk assessment and management becoming part and parcel of the core competency of IT, and is that a fairly big departure from the past?

Hietala: As to the first question, it’s having to become kind of a standard practice within IT. When you look at outsourcing your IT operations to a cloud-service provider, you have to consider the security risks in that environment. What do they look like and how do we measure them?

It’s the same thing for mobile computing. You really have to look at the risks of folks carrying tablets and smartphones, and understand the risks associated with them. The same goes for big data. For any of these large-scale changes to our IT infrastructure, you’ve got to understand what it means from a security and risk standpoint.

Gardner: Jack Freund or Jack Jones, any thoughts about the changing role of IT as a service and service-level agreement brokering aspects of IT aligned with risk assessment?

Freund: I read an interesting article this morning about a school district that is doing something they call bring your own technology (BYOT). For anybody who has been involved in these sorts of efforts in the corporate world, that should sound very familiar. But I want to think culturally about this. When you have students wondering how to do these sorts of things and becoming accustomed to being able to bring current technology, oh my gosh. When they get to the corporate world and start to work, they’re going to expect the same levels of service.

To answer your earlier question, absolutely. We have to find a way to embed risk assessment, which is really just a way to inform decision making, as we adopt all of these technological changes to increase market position and make ourselves more competitive. That’s important.

Whether that’s an embedded function within IT or it’s an overarching function that exists across multiple business units, there are different models that work for different size companies and companies of different cultural types. But it has to be there. It’s absolutely critical.

Gardner: Jack Jones, how do you come down on this shifting role of IT, with risk assessment becoming something that’s their responsibility? Are they embracing that, or maybe wishing it away?

Jones: It depends on whom you talk to. Some of them would certainly like to wish it away. I don’t think IT’s role in this idea for risk assessment and such has really changed. What is changing is the level of visibility and interest within the organization, the business side of the organization, in the IT risk position.

Previously, they were more or less tucked away in a dark corner. People just threw money at it and hoped bad things didn’t happen. Now, you’re getting a lot more board-level interest in IT risk, and with that visibility comes responsibility, but also a certain amount of danger. If they approach risk in an incredibly immature way, they’re going to look pretty foolish in front of the board.

Unfortunately, I’ve seen that play out. It’s never pretty and it’s never good news for the IT folks. They’re realizing that they need to come up to speed a bit from a risk perspective, so that they won’t look like fools when they’re in front of these executives.

They’re used to seeing quantitative measures of opportunities and operational issues of risk of various natures. If IT comes to the table with a red, yellow, green chart, the board is left to wonder, first how to interpret that, and second, whether these guys really get it. I’m not sure the role has changed, but I think the responsibilities and level of expectations are changing.

Gardner: Part of what FAIR does, and risk analysis in general, is to identify potential losses and put some dollars on the potential downside. That provides IT with the tool, the ability, to rationalize investments that are needed. Are you seeing knowledge of potential losses act as an incentive for spending on modernization?

Jones: Absolutely. One organization I worked with recently had certain deficiencies from the security perspective that they were aware of, but that were going to be very problematic to fix. They had identified technology and process solutions that they thought would take them a long way towards a better risk position. But it was a very expensive proposition, and they didn’t have money in the IT or information security budget for it.

So, we did a current-state analysis using FAIR of how much loss exposure they had on an annualized basis. Then, we said, “If you plug this solution into place, given how it affects the frequency and magnitude of loss that you’d expect to experience, here’s what your new annualized loss exposure would be.” It turned out to be a multimillion-dollar reduction in annualized loss exposure for a few hundred thousand dollars in cost.
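The business case Jones describes boils down to comparing annualized loss exposure before and after the control against the control’s cost. The function and figures below are hypothetical, chosen only to echo the shape of his example (a multimillion-dollar reduction for a few hundred thousand dollars), not the actual engagement numbers.

```python
def control_business_case(current_ale, projected_ale, control_cost):
    """Compare annualized loss exposure (ALE) before and after a
    proposed control against the cost of that control."""
    reduction = current_ale - projected_ale
    return {
        "ale_reduction": reduction,
        "net_benefit": reduction - control_cost,
        "benefit_cost_ratio": reduction / control_cost,
    }

# Hypothetical numbers: $4.2M exposure today, $900k after the fix,
# for a $300k control cost
case = control_business_case(current_ale=4_200_000,
                             projected_ale=900_000,
                             control_cost=300_000)
print(case)
```

Presented this way, the decision is the no-brainer Jones describes: the dollars saved dwarf the dollars spent, with no red/yellow/green interpretation required.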

When they took that business case to management, it was a no-brainer, and management signed the check in a hurry. So they ended up being in a much better position.

If they had gone to executive management saying, “Well, we’ve got high risk, and if we buy this set of stuff we’ll have low or medium risk,” it would’ve been a much less convincing and understandable business case for the executives. There’s reason to expect it would have been challenging to get that sort of funding, given how tight their corporate budgets were. So, yeah, it can be incredibly effective in those business cases.

Gardner: Correct me if I am wrong, but you have a book in the works since we last spoke. Jack, maybe you could tell us a bit about that and how it comes to bear on these issues?

Freund: Well, the book is currently being written. Jack Jones and I have entered into a contract with Elsevier, and we’ll be preparing the manuscript over the summer and winter. Probably by the second quarter of next year, we’ll have something we can share with everybody. It’s been a long time coming. Jack, I know, has wanted to write this for a long time.

We wanted to build a conversational book around how to assess risk using FAIR, and that’s an important distinction from other books in the market today, which really dig into a lot of the mathematical stuff. I’m speaking personally here, but I wanted to build a book that gave practitioners the risk tools to handle common challenges and common opposition to what they’re doing every day, and to understand how to apply the concepts in FAIR in a very tangible way.

Gardner: Very good. What about the conference itself? We’re coming up very rapidly on The Open Group Conference. What should we expect in terms of some of your presentations and training activities?

Jones: I think it will be a good time. People will be pleased with the quality of the presentations and some of the new information they’ll get to see and experience. As you said, we’re offering FAIR training as part of the conference. It’s a two-day session, with an opportunity afterwards to take the certification exam.

If history is any indication, people will enjoy going through the training. We get a lot of very positive remarks about a number of different things. One, they never imagined that risk could be interesting. They’re also surprised that it’s not, as one friend of mine calls it, “rocket surgery.” It’s relatively straightforward and intuitive stuff. It’s just that, as a profession, we haven’t had this framework for reference before, or some of the methods we apply to make it practical and defensible.

So we’ve gotten great feedback in the past, and I think people will be pleasantly surprised at what they experience.

Freund: One of the things I always say about FAIR training is it’s a real red pill-blue pill moment — in reference to the old Matrix movies. I took FAIR training several years ago with Jack. I always tease Jack that it’s ruined me for other risk assessment methods. Once you learn how to do it right, it’s very obvious which are the wrong methods and why you can’t use them to assess risk and why it’s problematic.

I’m joking. It’s really great and valuable training, and now I use it every day. It really does open your eyes to the problems in the risk assessment portion of IT today, and gives you very practical and actionable things to do to fix that and provide value to your organization.

Gardner: Jim Hietala, the emphasis in terms of vertical industries at the conference is on finance, government and healthcare. They seem to be the right groups to be factoring more standardization and understanding of risk. Tell me how it comes together. Why is The Open Group looking at vertical industries at this time?

Hietala: Specific to risk, if I can talk about that for a second, the healthcare world, at least here in the US, has new security rules, and one of the first requirements is to perform an annual risk assessment. So it’s currently relevant to that industry.

It’s the same thing with finance. One of the regulations around financial organizations tells them that, in terms of information security, they need to do a risk assessment. In government, clearly there has been a lot of emphasis on understanding risk and mitigating it throughout various government sectors.

In terms of The Open Group and verticals, we’ve done lots of great work in Enterprise Architecture, security, and other areas. In terms of our conferences, we’ve evolved things over the last year or so to start looking at what’s unique in particular verticals.

It started in the mining industry. We set up a mining, metals and exploration forum that looked at IT and architecture issues specific to that sector. We started that work several years ago, and now we’re looking at other industries, starting to assess the unique things in healthcare, for example. We’ve got a one-day workshop in Philadelphia on the Tuesday of the conference, looking at IT and transformation opportunities in the healthcare sector.

That’s how we got to this point, and we’ll see more of that from The Open Group in the future.

Gardner: Are there any updates that we should be aware of in terms of activities within The Open Group and other organizations working on standards, taxonomy, and definitions when it comes to risk?

Hietala: I’ll take that and dive in. We at The Open Group originally published a risk taxonomy standard based on FAIR four years ago. Over time, we’ve seen greater adoption by large companies, and we’ve also seen the need to extend what we’re doing there. So we’re updating the risk taxonomy standard, and the new version should be published by the end of this summer.

We also saw within the industry the need for a certification program for risk analysts, so that they’d be trained in quantitative risk assessment using FAIR. We’re working on that program, and we’ll be talking more about it in Philadelphia.

Along the way, as we were building the certification program, we realized that there was a missing piece in terms of the body of knowledge. So we created a second standard that is a companion to the taxonomy. That will be called the Risk Analysis Standard, and it looks more at some of the process issues and how to do risk analysis using FAIR. That standard will also be available by the end of the summer, and, combined, those two standards will form the body of knowledge that we’ll be testing against in the certification program when it goes live later this year.

Gardner: Jack Freund, it seems that between regulatory developments, the need for maturity in these enterprises, and the standardization being brought to bear by such groups as The Open Group, this is becoming quite a bit more of a science and less of an art.

What does that bring to organizations in terms of a bottom-line effect? I wonder if there is a use case or even an example that you could mention and explain that would help people better understand what they get back when they go through these processes and gain this better maturity around risk?

Freund: I’m not an attorney, but I have had a lot of lawyers tell me — I think Jim mentioned this before in his vertical conversation — that a lot of the regulations start with performing an annual risk assessment and then choosing controls based upon that. They’re not very prescriptive that way.

One of the things that it drives in organizations, more than anything else, is a sense of satisfaction that we’ve got things covered. When the leadership in these organizations understands that you’re managing risk the way a reasonable company should, you have fewer fire drills. Nobody likes to walk into work and have to deal with a hundred different things.

Moving hard drives out of printers and fax machines, what we’re doing around scanning and vulnerabilities, all of those various things can inundate you with worry every single day, as opposed to letting you focus on the things that matter.

I like a folksy saying that sums things up pretty well: a dime holding up a dollar. You have all these little bitty squabbly issues that get in the way of really focusing on reducing risk in your organization in meaningful ways and focusing on the things that matter.

Using approaches like FAIR drives a lot of value into your organization, because you’re freeing up mind share in your executives to focus on the things that really matter.

Gardner: Jack Jones, a similar question, any examples that exemplify the virtues of doing the due diligence and having some of these systems and understanding in place?

Jones: I have an example to Jack Freund’s point about being able to focus and prioritize. One organization I was working with had identified a significant risk issue and they were considering three different options for risk mitigation that had been proposed. One was “best practice,” and the other two were less commonly considered for that particular issue.

An analysis showed with real clarity that option B, one of the non-best-practice options, would reduce risk every bit as effectively as best practice, but at a much lower cost. The organization then got to make an informed decision about whether they were going to be herd followers or whether they were going to be more cost-effective in risk management.

Unfortunately, there’s always danger in not following the herd. If something happens downstream, and you didn’t follow best practice, you’re often asked to explain why you didn’t follow the herd.

That was part of the analysis too, but at the end of the day, management got to make a decision on how they wanted to behave. They chose to not follow best practice and be more cost-effective in using their money. When I asked them why they felt comfortable with that, they said, “Because we’re comfortable with the rigor in your analysis.”

To your question earlier about art versus science: first of all, in most organizations there would have been no question. They would have said, “We must follow best practice.” They wouldn’t even examine the options, and management wouldn’t have had the opportunity to make that decision.

Furthermore, even if they had “examined” those options using a more subjective, artistic approach (somebody’s wet finger in the air), management almost certainly would not have felt comfortable with a non-best-practice approach. So the more scientific, more rigorous approach that something like FAIR provides gives you all kinds of opportunity to make informed decisions and to feel more comfortable about those decisions.

Gardner: It really sounds as if there’s a synergistic relationship between a lot of the big-data and analytics investments that are being made for a variety of reasons, and also this ability to bring more science and discipline to risk analysis.

How do those come together, Jack Jones? Are we seeing the dots being connected in these large organizations, so that they can take more of what they garner from big data and business intelligence (BI) and apply it to these risk assessment activities? Is that happening yet?

Jones: It’s just beginning to. It’s very embryonic, and there are probably only a couple of organizations out there that I would argue are doing it with any sort of effectiveness. Imagine that — they’re both using FAIR.

But when you think about BI or any sort of analytics, there are really two halves to the equation. One is data and the other is models. You can have all the data in the world, but if your models stink, you can’t be effective. And, of course, vice versa: if you’ve got a great model and zero data, you’ve got challenges as well.

Being able to combine the two, good data and effective models, puts you in a much better place. As an industry, we aren’t there yet. We’ve got some really interesting things going on, and there’s a lot of potential there, but people have to leverage that data effectively and make sure they’re using a model that makes sense.

There are some models out there that frankly are just so badly broken that all the data in the world isn’t going to help you. The models will grossly misinform you. So people have to be careful, because data is great, but if you’re applying it to a bad model, then you’re in trouble.

Gardner: We are coming up near the end of our half hour. Jack Freund, for those organizations that are looking to get started, to get more mature, perhaps start leveraging some of their investments in areas like big data, in addition to attending The Open Group Conference or watching some of the plenary sessions online, what tips do you have for getting started? Are there some basic building blocks that should be in place or ways in which to get the ball rolling when it comes to a better risk analysis?

Freund: A strong personality matters in this. You have to have some sort of evangelist in the organization who cares enough about it to drive it through to completion. That puts a stake in the ground to say, “Here is where we’re going to start, and here is the path we’re going to take.”

When you start doing that sort of thing, even if leadership changes and other things happen, you have a strong commitment from the organization to keep moving forward on these sorts of things.

I spend a lot of my time integrating FAIR with other methodologies. One of the messaging points I keep repeating is that what we are doing is implementing a discipline around how we choose our risk rankings. That’s one of the great things about FAIR. It’s universally compatible with other assessment methodologies, programs, standards, and legislation, which allows you to be consistent and precise about how you’re connecting to everything else your organization cares about.

Concerns around operational risk integration are important as well. But driving that through to completion in the organization has a lot to do with finding sponsorship and then building a program to completion. Even absent that high-level sponsorship, because FAIR allows you to build a discipline around how you choose rankings, you can also build it from the bottom up. You can have groups of people who are FAIR-trained build risk analyses, or pick ranges — 1, 2, 3, 4 or high, medium, low. Then, when questioned, you have the ability to say, “We think this is a medium, because it met the frequency and magnitude criteria that we’ve established using FAIR.”
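The bottom-up discipline Freund describes can be as simple as publishing the criteria behind each ranking, so that “we think this is a medium” is backed by a frequency-and-magnitude estimate rather than a gut call. The thresholds below are hypothetical; a real organization would document its own.

```python
# Hypothetical, documented thresholds tying ordinal labels to
# annualized loss exposure (ALE) estimates in dollars
RANKING_CRITERIA = [
    (1_000_000, "high"),   # ALE of $1M or more
    (100_000, "medium"),   # ALE of $100k or more
    (0, "low"),
]

def rank_risk(annualized_loss_exposure):
    """Translate a quantitative loss-exposure estimate into the
    ordinal ranking the rest of the organization expects."""
    for threshold, label in RANKING_CRITERIA:
        if annualized_loss_exposure >= threshold:
            return label
    return "low"

print(rank_risk(2_500_000))  # high
print(rank_risk(250_000))    # medium
print(rank_risk(40_000))     # low
```

When challenged, the answer is no longer “because it felt like a medium” but “because the estimate met these documented criteria.”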

Different organizations are culturally going to have different ways to implement and structure quantitative risk analysis. In the end, it’s an interesting and reasonable path to get to risk utopia.

Gardner: Jack Jones, any thoughts from your perspective on a good way to get started, maybe even through the lens of the verticals that The Open Group has targeted for this conference, finance, government and healthcare? Are there any specific important things to consider on the outset for your risk analysis journey from any of the three verticals?

Jones: A good place to start is with the materials that The Open Group has made available on the risk taxonomy and the soon-to-be-published Risk Analysis Standard.

Another source that I recommend to everybody I talk to about these sorts of things is a book called ‘How to Measure Anything’ by Douglas Hubbard. If someone is even the least bit interested in actually measuring risk in quantitative terms, they owe it to themselves to read that book. It puts into layman’s terms some very important concepts and approaches that are tremendously helpful. That’s an important resource for people to consider, too.

As far as within organizations, some will have a relatively mature enterprise risk-management program at the corporate level, outside of IT. It can be hit-and-miss, but there can be some very good resources in terms of people and processes that the organization has already adopted. You have to be careful there too, though, because some of those enterprise risk-management programs, even though they may have been in place for years, and thus, one would think, matured over time, have only dug a really deep ditch of bad practices and misconceptions.

So it’s worth having the conversation with those folks to gauge how clueful they are, but don’t assume that just because they have been in place for a while, and have some specific title, they really understand risk at that level.

Gardner: Well, very good. I’m afraid we will have to leave it there. We’ve been talking with a panel of experts about the new trends and solutions in the area of anticipating risk and how to better manage organizations with that knowledge. We’ve seen how enterprises are better delivering risk assessments, or beginning to, as they face challenges in cybersecurity as well as the larger undertaking of enterprise transformation.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in July 2013 in Philadelphia. There’s more information on The Open Group website about the conference, whether you want to attend in person, follow the live stream, or access resources through the conference app.

So with that thanks to our panel. We’ve been joined by Jack Freund. He is the Information Security Risk Assessment Manager at TIAA-CREF. Thank you so much, Jack.

Freund: Thank you Dana.

Gardner: And also Jack Jones, the Principal at CXOWARE. Thank you, sir.

Jones: It’s been my pleasure. Thanks.

Gardner: And then also lastly, Jim Hietala, Vice President, Security at The Open Group. Thank you Jim.

Hietala: Thank you, Dana.

Gardner: And this is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator through these thought leader interview series. Thanks again for listening, and come back next time.


Filed under ArchiMate®, Business Architecture, Conference, Enterprise Architecture, Professional Development, TOGAF®

Three laws of the next Internet of Things – the new platforming evolution in computing

By Mark Skilton, Global Director at Capgemini

There is a wave of new devices and services growing in strength, extending the boundary of what is possible in today’s internet-driven economy and lifestyle. A striking feature is the link between apps on smartphones and tablets and the ability to connect not just to websites but also to data-collection sensors and to intelligent analysis of that information. A key driver has been the improvement in the cost-performance curve of information technology, not just in CPU and storage: the easy availability and affordability of highly powerful computing and mass storage in mobile devices, coupled with access to complex sensors, advanced optics and screen displays, results in a potentially truly immersive experience. This is a long way from the early days of the radio-frequency identity (RFID) tags that are the forerunner of this evolution. Digitization of information, and interpretation of its meaning, is everywhere, moving into a range of industries and augmented services that create new possibilities and value. A key challenge is how to understand this growth of devices, sensors, content and services across the myriad platforms and permutations this can bring. Examples include:

·         Energy conservation

o   Through home and building energy management.

·         Lifestyle activity

o   Motion-sensor accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors.

·         Lifestyle health

o   Heart rate, blood oxygen levels, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential measures that connected devices can capture.

·         Medical health

o   Biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Devices can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively. These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping improve quality of life and addressing rising medical costs and the societal impact of an aging population.

·         Transport

o   Precision global positioning, local real-time image-perception sensing, dynamic electromechanical control systems.

·         Materials science, engineering and manufacturing

o   Strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing. 3D printing has the potential to revolutionize automated manufacturing, and through distributed services over the internet, manufacturing can potentially be accessed by anyone.

·         Physical safety and security

o   Examples include controlling children’s access to their mobile phones from your PC (web-based parental-control applications that monitor and restrict mobile and computing access), and keyless entry using your phone: Wi-Fi, Bluetooth and internet apps and devices that automate the locking of physical doors and entry remotely or in proximity.

·         Remote activity and swarming robotics

o   The development of autonomous robotics to respond and support exploration and services in harsh or inaccessible environments; disabled support through robotic prosthetics and communication synthesis; swarming robots that fly or that mimic group behavior, natural systems and collective decision making.

These are just the tip of what is possible: the early commercial ventures that are starting to drive new ways to think about information technology and application services.

A key feature I noticed in all these devices is that they augment previous layers of technology, sitting on top of them and adding extra value. While the long shadow of the first-generation giants of the public internet, Apple, Google and Amazon, often gives the impression that success requires a controlled platform and an investment of millions, these new technologies use existing infrastructure and operate across a federated, distributed architecture that represents a new kind of platforming paradigm of multiple systems.

Perhaps a paradigm of new technology cycles is that as new technology arrives it cannibalizes older technologies. Clearly nothing is immune to this trend, not even the cloud. I’ll call it the evolution of a kind of technology laws (a feature I saw in Charles Fine’s book Clockspeed, http://www.businessforum.com/clockspeed.html, but adapted here as a function of compound cannibalization and augmentation). I think Big Data is an example of such a shift in this direction, as augmented informatics enables major next-generation power plays for added-value services.

These devices and sensors can work with existing infrastructure services and resources, but they also create a new kind of computing architecture that involves many technologies, standards and systems: what was in earlier times called “system of systems” integration (examples are seen in the defence sector, http://www.bctmod.army.mil/SoSI/sosi.html, and in digital ecosystems in the government sector, http://www.eurativ.com/specialreport-skills/kroes-europe-needs-digital-ecosy-interview-517996).

While a sensor device can replace the existing thermostat in your house, the lighting, or the access locks on your doors, these devices offer a new kind of augmented experience, providing information and insight that enable better control of the wider environment or of the actions and decisions within a context.

This leads to a second feature of these devices: the ability to learn and adapt from inputs and the environment. This is probably an even larger impact than the first (the reuse of infrastructure), because the ability to change outcomes is a revolution in information. The earlier idea of static information and human sense-making of that data is being replaced by the active pursuit of automated intelligence from the machines we build. Earlier design paradigms that needed to define declarative services, what IT calls CRUD (Create, Read, Update, Delete), as predefined and managed transactions are being replaced by machine-learning algorithms that seek to build a second generation of intelligent services that alter their results with the passage of time and with usage characteristics.
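The shift from predefined CRUD transactions to services that adapt with usage can be illustrated with a toy "learning" device that nudges its default toward observed user overrides. The class name and learning rule here are illustrative assumptions, not any vendor's algorithm:

```python
class AdaptiveThermostat:
    """Toy device that adapts to usage instead of storing a fixed record.

    A CRUD-style thermostat would only store the setpoint the user last
    wrote; this one shifts its learned default toward the overrides it
    observes, using an exponential moving average.
    """

    def __init__(self, setpoint=20.0, learning_rate=0.2):
        self.setpoint = setpoint
        self.learning_rate = learning_rate

    def observe_override(self, user_choice):
        # Move the learned setpoint a fraction of the way toward
        # what the user actually chose.
        self.setpoint += self.learning_rate * (user_choice - self.setpoint)
        return self.setpoint

t = AdaptiveThermostat()
for _ in range(10):
    t.observe_override(22.0)   # user repeatedly prefers 22 °C
# after repeated overrides the device converges toward 22 °C
```

The point is the change of paradigm: the stored state is an outcome of usage over time, not a record written once by a predefined transaction.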

This leads me to a third effect that became apparent in the discussion of lifestyle services versus medical and active device management. In the case of lifestyle devices, a key feature is the ability to blend in with personal activity to enable new insight into behavior and lifestyle choices, to passively and actively monitor or take action, not always to affect the behavior itself: that is, to provide an unobtrusive, ubiquitous presence. Moving this idea further, it is also about the way devices can merge in and become integrated within the context of the user or environmental setting. Biomedical devices that augment patient care and wellbeing are one such example, with real and substantive impact on quality of life as well as on the cost efficiency of care programs with an aging population to support.

An interesting side effect of these trends is the cultural dilemma these devices and sensors bring in the intrusion into personal data and privacy. Yet once the meaning and value of this telemetry on safety, health or material factors is perceived to be for the good of the individual and community, the adoption of such services may become more pronounced and reinforced: a virtuous circle of accelerated adoption, seen as a key characteristic of successful growth and a kind of conditioning feedback that creates positive reinforcement. The ability of the device and sensor to have an unobtrusive, ubiquitous presence underpins all of this, and the overall effect is central to the idea of effective system-of-systems integration and Boundaryless Information Flow™ (The Open Group).

I see these trends as three laws of the next Internet of Things, describing a next-generation platforming strategy and evolution.

It’s clear that sensors and devices are merging together in a way that will cut across from one industry to another. Motion and temperature sensors in one industry will find application in another, and services from one industry may connect with others as combinations of these services, lifestyles and effects.


Formal and informal communities, both physical and virtual, will be connected through sensors and devices that pervade the social, technological and commercial environments. This will drive further growth in the mass of data and digitized information, with the gradual semantic representation of this information into meaningful context. App services will develop increasing intelligence and awareness of the multiplicity of data, its content and metadata, adding new insight and services to the infrastructure fabric. This is a new platforming paradigm that may be constructed from one or many systems and architectures, from macro down to micro- and nano-level systems technologies.

The three laws I describe may be recast in a lighter, tongue-in-cheek way by comparing them to Isaac Asimov’s famous three laws of robotics. This is just an illustration, but it implies that the sequence of laws in some fashion protects users, resources and the environment out of an altruistic motive. That may be the case in some system feedback loops that seek this goal, but often commercial, microeconomic considerations are more the driver. However, I can’t help thinking that this hints at what may be the first stepping stone toward the eventuality of such laws.

Three laws of the next generation of The Internet of Things – a new platforming architecture

Law 1. A device, sensor or service may operate in an environment if it can augment infrastructure.

Law 2. A device, sensor or service must be able to learn and adapt its response to the environment, as long as it is not in conflict with the First Law.

Law 3. A device, sensor or service must have an unobtrusive, ubiquitous presence, such that it does not conflict with the First or Second Laws.
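Read literally, the three laws form an admission test a platform could apply to a candidate device or service. A minimal sketch follows; the capability flags and the `admit` function are illustrative assumptions of mine, not part of the article:

```python
from dataclasses import dataclass

@dataclass
class Device:
    """Illustrative capability flags for a candidate device or service."""
    augments_infrastructure: bool   # Law 1: adds value on top of what exists
    adapts_to_environment: bool     # Law 2: learns from inputs and context
    unobtrusive: bool               # Law 3: ubiquitous but not intrusive

def admit(device: Device) -> list[str]:
    """Return the laws the device violates; an empty list means admissible."""
    violations = []
    if not device.augments_infrastructure:
        violations.append("Law 1")
    if not device.adapts_to_environment:
        violations.append("Law 2")
    if not device.unobtrusive:
        violations.append("Law 3")
    return violations

# A learning thermostat satisfies all three; a static sensor that merely
# replaces an old one without adapting fails Law 2.
assert admit(Device(True, True, True)) == []
assert admit(Device(True, False, True)) == ["Law 2"]
```

In practice the three properties are matters of degree rather than booleans, but the check captures the ordering the laws impose.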

References

·         Energy conservation

o   The Nest Learning Thermostat (http://www.nest.com), founded by Tony Fadell, ex-iPod hardware designer and head of Apple’s iPod and iPhone division. The device monitors and learns about energy usage in a building, and adapts and controls the use of energy for improved carbon and cost efficiency.

·         Lifestyle activity

o   Motion-sensor accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors. Examples include UP by Jawbone (https://jawbone/up) and Fitbit (http://www.fitbit.com).

·         Lifestyle health

o   Heart rate, blood oxygen levels, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential measures captured by connected devices such as Zensorium (http://www.zensorium.com).

·         Medical health

o   Biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Devices can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively (http://www.nytimes.com/2010/07/29/garden/29parents.html?pagewanted-all). These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping improve quality of life and addressing rising medical costs and the societal impact of an aging population.

·         Transport

o   Precision global positioning, local real-time image-perception sensing, dynamic electromechanical control systems. Examples include Toyota’s advanced IT systems to help drivers avoid road accidents (http://www.toyota.com/safety/) and the Google driverless car (http://www.forbes.com/sites/chenkamul/2013/01/22/fasten-your-seatbelts-googles-driverless-car-is-worth-trillions/).

·         Materials science, engineering and manufacturing

o   Strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing. 3D printing has the potential to revolutionize automated manufacturing, and through distributed services over the internet, manufacturing can potentially be accessed by anyone.

·         Physical safety and security

o   Alpha Blue (http://www.alphablue.co.uk): controlling children’s access to their mobile phones from your PC, an example of parental protection using web-based applications to monitor and control mobile and computing access.

o   Keyless entry using your phone: Wi-Fi, Bluetooth and internet apps and devices that automate the locking of physical doors and entry remotely or in proximity. Examples include Lockitron (https://www.lockitron.com).

·         Remote activity and swarming robotics

o   The development of autonomous robotics to respond and support exploration and services in harsh or inaccessible environments. Examples include the NASA Mars Curiosity rover, with active control programs to determine remote actions on the red planet given a one-way signal delay of 13 minutes 48 seconds at entry, descent and landing, or approximately 30 minutes round trip to detect and react to an event remotely from Earth (http://blogs.eas.int/mex/2012/08/05/time-delay-betrween-mars-and-earth/, http://www.nasa.gov/mission_pages/mars/main/imdex.html). Disabled support through robotic prosthetics and communication synthesis (http://disabilitynews.com/technology/prosthetic-robotic-arm-can-feel/). Swarming robots that fly or mimic group behavior (University of Pennsylvania, http://www.reuters.com/video/2012/03/20/flying-robot-swarms-the-future-of-search?videoId-232001151); swarming robots at the Natural Robotics Lab, The University of Sheffield, UK (http://www.sheffield.ac.uk/news/nr/sheffield-centre-robotic-gross-natural-robotics-lab-1.265434).

 Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.


Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0
