
Q&A with Jim Hietala on Security and Healthcare

By The Open Group

We recently spoke with Jim Hietala, Vice President, Security for The Open Group, at the 2014 San Francisco conference to discuss upcoming activities in The Open Group’s Security and Healthcare Forums.

Jim, can you tell us what the Security Forum’s priorities are going to be for 2014 and what we can expect to see from the Forum?

In terms of our priorities for 2014, we’re continuing to do work in Security Architecture and Information Security Management. In the area of Security Architecture, the big project is adding security to TOGAF®. We’re working on the next version of the TOGAF standard and specification, and there’s an active project involving folks from the Architecture Forum and the Security Forum to integrate security into TOGAF and stripe it throughout the framework. So, on the Security Architecture side, that’s the priority. On the Information Security Management side, we’re continuing to do work in the area of Risk Management. We introduced a certification late last year, the Open FAIR certification, and we’ll continue to work in the areas of Risk Management and Risk Analysis. We’re looking to add a second level to the certification program, and we’re doing some other work around the Risk Analysis standards that we’ve introduced.

The theme of this conference was “Towards Boundaryless Information Flow™,” and many of the tracks focused on convergence – the convergence of Big Data, mobile and Cloud, also known as Open Platform 3.0. How are those things affecting the realm of security right now?

I think they’re just beginning to. With Cloud, obviously, the security issues have been with us for as long as Cloud has, over the past four or five years. But if you look at things like the Internet of Things and some of the other technologies that comprise Open Platform 3.0, the security impacts are really just starting to be felt and considered. So I think information security professionals are really just starting to wrap their heads around what those new security risks that come with those technologies are and, more importantly, what we need to do about them. What do we need to do to mitigate risk around something like the Internet of Things, for example?

What kind of security threats do you think companies need to be most worried about over the next couple of years?

There’s a plethora of things out there right now that organizations need to be concerned about. Certainly advanced persistent threat, the idea that maybe nation states are trying to attack other nations, is a big deal. It’s a very real threat, and it’s something that we have to think about – looking at the risks we’re facing, exactly what is that adversary and what are they capable of? I think profit-motivated criminals continue to be on everyone’s mind with all the credit card hacks that have just come out. We have to be concerned about cyber criminals who are profit motivated and who are very skilled and determined and obviously there’s a lot at stake there. All of those are very real things in the security world and things we have to defend against.

The Security track at the San Francisco conference focused primarily on risk management. How can companies better approach and manage risk?

As I mentioned, we’ve done a lot of work over the last few years in the area of Risk Management, and the FAIR standard that we introduced breaks risk down into the frequency of bad things happening and the impact if they do happen. So I would suggest taking that sort of approach – using something like the Risk Taxonomy Standard and the Risk Analysis Standard that we’ve introduced – and really looking at what the critical assets to protect are, who’s likely to attack them, and what the probable frequency of attacks is. Then, looking at the impact side, what’s the consequence if somebody successfully attacks them? That’s really the key – breaking it down, looking at it that way, and then taking the right mitigation steps to reduce risk on those assets that are really important.
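The frequency-times-impact decomposition described here can be illustrated with a small Monte Carlo sketch. This is an editorial illustration, not part of the FAIR or Open Group standards themselves; the ranges, distributions and dollar figures below are invented for the example:

```python
import random

def simulate_annual_loss(freq_low, freq_high, loss_low, loss_high, trials=20_000):
    """Sketch of the FAIR-style decomposition: risk combines how often
    loss events happen (frequency) with how bad they are (impact).
    Both inputs are calibrated ranges rather than point estimates."""
    random.seed(42)  # reproducible illustration
    losses = []
    for _ in range(trials):
        # Draw this year's expected event rate from the frequency range
        rate = random.uniform(freq_low, freq_high)
        # Approximate the event count with a rounded normal draw
        # (a stand-in for a Poisson count, which the stdlib lacks)
        events = max(0, round(random.gauss(rate, rate ** 0.5)))
        # Sum an independent impact draw for each event
        losses.append(sum(random.uniform(loss_low, loss_high)
                          for _ in range(events)))
    losses.sort()
    return {"median": losses[trials // 2], "p90": losses[int(trials * 0.9)]}

# Hypothetical calibrated estimates: 2-5 loss events/year, $10k-$150k each
result = simulate_annual_loss(2, 5, 10_000, 150_000)
print(result)
```

The output is a range of annualized loss outcomes (summarized here as a median and a 90th percentile) rather than a single number, which is the shape of answer this style of risk analysis aims for.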

You’ve recently become involved in The Open Group’s new Healthcare Forum. Why a healthcare vertical forum for The Open Group?

In the area of healthcare, what we see is a highly fragmented ecosystem. You’ve got healthcare information that’s captured in various places, and the information doesn’t necessarily flow from provider to payer to other providers. In looking at industry verticals, the healthcare industry seemed like an area that really needed a lot of what The Open Group brings – TOGAF and the Enterprise Architecture approaches that we have.

If you take it up to a higher level, it really needs the Boundaryless Information Flow that we talk about in The Open Group. We need to get to the point where our information as patients is readily available in a secure manner to the people who need to give us care, as well as to us, because in a lot of cases the information exists as islands in the healthcare industry. In looking at healthcare, it just seemed like a natural place where, in our economies – and it’s really a global problem – a lot of money is spent on healthcare and there are a lot of opportunities for improvement, both in the economics and in the patient care that’s delivered to individuals through the healthcare system. It just seemed like a great area for us to focus on.

As the new Healthcare Forum kicks off this year, what are the priorities for the Forum?

The Healthcare Forum has just published a whitepaper summarizing the findings of the workshop that we held in Philadelphia last summer. We’re also working on a treatise, which will outline our views about the healthcare ecosystem and where standards and architecture work is most needed. We expect to have that document produced over the next couple of months. Beyond that, we see a lot of opportunities for doing architecture and standards work in the healthcare sector, and our membership is going to determine which of those areas to focus on and which projects to initiate first.

For more on The Open Group Security Forum, please visit http://www.opengroup.org/subjectareas/security. For more on The Open Group Healthcare Forum, see http://www.opengroup.org/getinvolved/industryverticals/healthcare.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cloud/SOA, Conference, Data management, Healthcare, Information security, Open FAIR Certification, Open Platform 3.0, RISK Management, TOGAF®, Uncategorized

Facing the Challenges of the Healthcare Industry – An Interview with Eric Stephens of The Open Group Healthcare Forum

By The Open Group

The Open Group launched its new Healthcare Forum at the Philadelphia conference in July 2013. The forum’s focus is on bringing Boundaryless Information Flow™ to the healthcare industry to enable data to flow more easily throughout the complete healthcare ecosystem through a standardized vocabulary and messaging. Leveraging the discipline and principles of Enterprise Architecture, including TOGAF®, the forum aims to develop standards that will result in higher quality outcomes, streamlined business practices and innovation within the industry.

At the recent San Francisco 2014 conference, Eric Stephens, Enterprise Architect at Oracle, delivered a keynote address entitled “Enabling the Opportunity to Achieve Boundaryless Information Flow” along with Larry Schmidt, HP Fellow at Hewlett-Packard. A veteran of the healthcare industry, Stephens was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield prior to joining Oracle, and he is an active member of the Healthcare Forum.

We sat down after the keynote to speak with Stephens about the challenges of healthcare, how standards can help realign the industry, and the goals of the forum. The opinions expressed here are Stephens’ own, not those of his employer.

What are some of the challenges currently facing the healthcare industry?

There are a number of challenges, and I think when we look at it as a U.S.-centric problem, there’s a disproportionate amount of spending taking place in the U.S. For example, if you look at healthcare expenditures as a percentage of GDP, we’re now probably at 18 percent [in the U.S.], while other developed countries spend a full 5 percent of GDP less than that, and in some cases they’re getting better outcomes outside the U.S.

The mere existence of what we call “medical tourism” means there’s a real wide range of disparity there: if I need a hip replacement, I can get it done for a fraction of the cost in another country, with the same or better quality of care, and have a vacation – a rehab vacation – at the same time and bring along a spouse or significant other.

There’s also a lack of transparency. Having worked at an insurance company, I can tell you that with the advent of high deductible plans, there’s a need for additional cost information. When I go on Amazon or to a local furniture store, I know what the cost is going to be for what I’m about to purchase. In the healthcare system, we don’t get that. With high deductible plans, if I’m going to be responsible for a portion, or a larger portion, of the fee, I want to know what it is. And what happens is, the incentives to drive costs down force the patient to be a consumer. The consumer now asks the tough questions: if my daughter’s going in for a tonsillectomy, show me a bill of materials for what’s going to be done – if you’re charging me $20/pill for Tylenol, I’ll bring my own. Increased transparency is what will in turn drive down overall costs.

I think there’s one more thing, and this gets into the legal side of things. There is an exorbitant amount of legislation and regulation around what needs to be done. And because every time something goes sideways there’s going to be a lawsuit, doctors will prescribe an extra test, an extra X-ray, for a patient whether they need it or not.

The healthcare system is designed around a vicious cycle of diagnose-treat-release. It’s not incentivized to focus on prevention and management. Oregon is promoting coordinated care organizations (CCOs), intermediaries that work with all medical professionals – whether physical, mental, dental, even social workers – to coordinate episodes of care for patients. This drives down inappropriate utilization – for example, using an ER as a primary care facility – and drives the medical system towards prevention and management of health.

Your keynote with Larry Schmidt of HP focused a lot on cultural changes that need to take place within the healthcare industry – what are some of the changes necessary for the healthcare industry to put standards into place?

I would say culturally, it goes back to those incentives, and it goes back to introducing this idea of patient-centricity – and, for the medical community, to really start recognizing that these individuals are consumers and that increased choice is being introduced, just like you see in other industries. There are disruptive business models. For instance, medical tourism is a disruptive business model for United States-based healthcare. So is the idea of pharmacies offering clinical medicine for routine care, such as what you see at a CVS, Wal-Mart or Walgreens. I can get a flu shot, I can get a well-check visit, I can get a vaccine – routine stuff that doesn’t warrant a full-blown medical professional. It’s applying the right amount of medical care to a particular situation.

Why haven’t existing standards been adopted more broadly within the industry? What will help providers be more likely to adopt standards?

I think standards adoption is about “what’s in it for me,” the WIIFM idea. It’s demonstrating to providers that utilizing standards is going to help them get out of the medical administration business and focus on their core business, the same way that any other business would want to standardize its information through integration, processes and components. It reduces your overall maintenance costs going forward, and arguably you don’t need a team of billing folks sitting in a doctor’s office because you have standardized exchanges of information.

Why haven’t they been adopted? It’s still a question in my mind. Why a doctor would not want to do that is perhaps a question we’re going to need to explore as part of the Healthcare Forum.

Is it doctors that need to adopt the standards or technologies, or a combination of different constituents within the ecosystem?

I think it’s a combination. We hear a lot about the Affordable Care Act (ACA) and the health exchanges. What we don’t hear about is the legislation to drive toward standardization to increase interoperability. So unfortunately, it would seem the financial incentives and other things we’ve tried before haven’t worked, and we may simply have to resort to legislation, or at least legislative incentives, to make it happen, because part of the funding does cover information exchanges so you can move health information between providers and other actors in the healthcare system.

You’re advocating putting the individual at the center of the healthcare ecosystem. What changes need to take place within the industry in order to do this?

I think it’s education – a lot of education has to take place. The incentive model around high deductible plans will force some of that for individuals, but it’s also about taking responsibility and understanding the individual’s role in healthcare. It’s a cultural/societal phenomenon as well.

I’m kind of speculating here, and going way beyond what enterprise architecture or IT would deliver, but this is a philosophical thing: if I have an ailment, chances are there’s a pill to fix it. Look at the commercials – for every ailment, say hypertension, it’s easy: you just dial in the medication correctly and you don’t worry as much about diet and exercise. These sorts of things speak to our over-reliance on medication. I’m certainly not going to knock the medications that are needed for folks who absolutely need them, but I think we can become too dependent on pharmacological solutions for our health problems.

What responsibility will individuals then have for their healthcare? Will that also require a cultural and behavioral shift for the individual?

The individual has to start managing his or her own health. We manage our careers and families proactively. Now we need to focus on our health and not just float through the system. It may come to financial incentives for certain “individual KPIs” such as blood pressure, sugar levels, or BMI. Advances in medical technology may facilitate more personal management of one’s health.

One of the Healthcare Forum’s goals is to help establish Boundaryless Information Flow within the healthcare industry, and you’ve said that understanding the healthcare ecosystem will be a key component of that. What does that ecosystem encompass, and why is it important to know that first?

Very simply, we’re talking about the member/patient/consumer; then we get into the payers and the providers; and we have to take into account government agencies and other non-medical agents. But they all have to work in concert, and information needs to flow between those organizations in a very standardized way so that decisions can be made in a very timely fashion.

It can’t be bottled up; it’s got to be provided to the right provider at the right time. Otherwise, best case, it’s going to cost more to manage all the actors in the system. Worst case, somebody dies, or there is a “never event” due to misinformation or lack of information during the course of care. The idea of Boundaryless Information Flow gives us the opportunity to have standardized, easily accessible – and, by the way, secured – information that can really aid in that decision-making process going forward. It’s no different than Wal-Mart knowing what kind of merchandise sells well before and after a hurricane (i.e., beer and toaster pastries, BTW). It’s the same kind of real-time information that’s made available to a Google car so it can steer its way down the road. It’s that kind of velocity that’s needed to make the right decisions at the right time.

Healthcare is a highly regulated industry. How can Boundaryless Information Flow and data collection on individuals be achieved while still protecting patient privacy?

We can talk about standards and the flow and the technical side, but we also need to focus on the security and privacy side. And there’s going to be a legislative side, because we’re going to touch on a real fundamental data governance issue: who owns the patient record? Each actor in the system thinks they own the patient record. If we’re going to require more personal accountability for healthcare, then shouldn’t the consumer have more ownership?

We also need to address privacy disclosure regulations to avoid catastrophic data leaks of protected health information (PHI). We need bright IT talent to pull off the integration we’re talking about here, and we also need folks who are well versed in privacy laws and regulations. I’ve seen project teams of 200 with up to eight folks focusing solely on security and privacy considerations. We can argue about headcount later, but my point is the same: one needs some focused resources around this topic.

What will standards bring to the healthcare industry that is missing now?

I think the standards, and more specifically the harmonization of the standards, are going to bring increased maintainability of solutions, increased interoperability, and increased opportunities too. Look at mobile computing, or even Dropbox, which has API hooks into all sorts of tools and is well integrated – I can move files between devices and between apps because they have hooks that are easy to work with. So it’s building these communities of developers, apps and technical capabilities that makes it easy to move the personal health record, for example, back and forth between providers, so that it’s not a cataclysmic event to integrate a new version of an electronic health record (EHR) system or the next version of an EHR. It’s this idea of standardization, but with some flexibility that goes along with it.

Are you looking just at the U.S. or how do you make a standard that can go across borders and be international?

It is a concern. Much of my thinking, and much of what I’ve conveyed today, is U.S.-centric, based on our problems, but many of these interoperability problems are international. We’re going to need to address that; I couldn’t tell you what the sequence is right now. There are other considerations – for example, single vs. multi-payer – that came up in the keynote. We tend to think that if we stay focused on the consumer/patient, we’re going to get it right for all constituencies. It will take time to go international with a standard, but it wouldn’t be the first time. We have a host of technical standards for the Internet (e.g., TCP/IP, HTTP), and the industry has been able to instill these standards across geographies and vendors. Admittedly, the harmonization of healthcare-related standards will be more difficult. However, as our world shrinks with globalization, an international lens will need to be applied to this challenge.

Eric Stephens (@EricStephens) is a member of Oracle’s executive advisory community, where he focuses on advancing clients’ business initiatives leveraging the practice of Business and Enterprise Architecture. Prior to joining Oracle he was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield, leading the organization with architecture design, innovation, and technology adoption capabilities within the healthcare industry.



Filed under Conference, Data management, Enterprise Architecture, Healthcare, Information security, Standards, TOGAF®

Measuring the Immeasurable: You Have More Data Than You Think You Do

By Jim Hietala, Vice President, Security, The Open Group

According to a recent study by the Ponemon Institute, the average U.S. company experiences more than 100 successful cyber-attacks each year at a cost of $11.6M. By enabling security technologies, those companies can reduce losses by nearly $4M and instituting security governance reduces costs by an average of $1.5M, according to the study.

In light of increasing attacks and security breaches, executives are increasingly asking security and risk professionals to provide analyses of individual company risk and loss estimates. For example, the U.S. healthcare sector has been required by the HIPAA Security Rule to perform annual risk assessments for some time now. The recent HITECH Act also added security breach notification and disclosure requirements, increased enforcement in the form of audits and increased penalties in the form of fines. Despite federal requirements, the prospect of measuring risk and doing risk analyses can be a daunting task that leaves even the best of us with a case of “analysis paralysis.”

Many IT experts agree that we are nearing a time where risk analysis is not only becoming the norm, but when those risk figures may well be used to cast blame (or be used as part of a defense in a lawsuit) if and when there are catastrophic security breaches that cost consumers, investors and companies significant losses.

In the past, many companies have been reluctant to perform risk analyses due to the perception that measuring IT security risk is too difficult because it’s intangible. But if IT departments could soon become accountable for breaches, don’t you want to be able to determine your risk and the threats potentially facing your organization?

In his book How to Measure Anything, Douglas Hubbard, the father of Applied Information Economics, points out that immeasurability is an illusion and that organizations do, in fact, usually have the information they need to create good risk analyses. Part of the misperception of immeasurability stems from a lack of understanding of what measurement is actually meant to be. According to Hubbard, most people, and executives in particular, expect measurement and analysis to produce an “exact” number – as in, “our organization has a 64.5 percent chance of having a denial of service attack next year.”

Hubbard argues that, as risk analysts, we need to look at measurement more like how scientists look at things—measurement is meant to reduce uncertainty—not to produce certainty—about a quantity based on observation.  Proper measurement should not produce an exact number, but rather a range of possibility, as in “our organization has a 30-60 percent chance of having a denial of service attack next year.” Realistic measurement of risk is far more likely when expressed as a probability distribution with a range of outcomes than in terms of one number or one outcome.

The problem that most often produces “analysis paralysis” is not just the question of how to derive those numbers but also how to get to the information that will help produce those numbers. If you’ve been tasked, for instance, with determining the risk of a breach that has never happened to your organization before, perhaps a denial of service attack against your web presence, how can you make an accurate determination about something that hasn’t happened in the past? Where do you get your data to do your analysis? How do you model that analysis?

In an article published in CSO Magazine, Hubbard argues that organizations have far more data than they think they do and they actually need less data than they may believe they do in order to do proper analyses. Hubbard says that IT departments, in particular, have gotten so used to having information stored in databases that they can easily query, they forget there are many other sources to gather data from. Just because something hasn’t happened yet and you haven’t been gathering historical data on it and socking it away in your database doesn’t mean you either don’t have any data or that you can’t find what you need to measure your risk. Even in the age of Big Data, there is plenty of useful data outside of the big database.

You will still need to gather that data, but you just need enough to be able to measure it accurately, not necessarily precisely. In our recently published Open Group Risk Analysis Standard (O-RA), this is called calibration of estimates. Calibration provides a method for making good estimates, which are necessary for deriving a measured range of probability for risk. Section 3 of the O-RA standard provides a comprehensive look at how best to come up with calibrated estimates, as well as how to determine other risk factors using the FAIR (Factor Analysis of Information Risk) model.
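To make the idea of a calibrated estimate concrete, here is a generic sketch (not text from the O-RA standard; the dollar interval is invented, and fitting a lognormal by matching interval endpoints is a common modeling convention rather than a prescribed method) of turning an expert’s 90% confidence interval into a distribution that can be sampled:

```python
import math
import random

def lognormal_from_ci(low, high):
    """Turn a calibrated 90% confidence interval (e.g. "I'm 90% sure the
    per-event loss is between $20k and $500k") into lognormal mu/sigma.
    1.645 is the standard normal quantile for a 90% interval."""
    z = 1.645
    mu = (math.log(low) + math.log(high)) / 2
    sigma = (math.log(high) - math.log(low)) / (2 * z)
    return mu, sigma

random.seed(7)  # reproducible illustration
mu, sigma = lognormal_from_ci(20_000, 500_000)
samples = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))
# The median of the fitted distribution is exp(mu) = sqrt(20k * 500k) = 100k
print(f"sample median ≈ {samples[50_000]:,.0f}")
```

Drawing per-event losses from a distribution like this, instead of from a single “best guess” number, is what lets an analysis report a range of outcomes with probabilities attached rather than one falsely precise figure.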

So where do you get your data if it’s not already stored and easily accessible in a database? There are numerous sources you can turn to, both externally and internally. You just have to do the research to find it. For example, even if your company hasn’t experienced a denial of service attack, many others have – what was their experience when it happened? This information is out there online – you just need to search for it. Industry reports are another source of information. Verizon publishes its annual Verizon Data Breach Investigations Report, for one. DatalossDB publishes an open database of data breach incidents that provides information on data loss worldwide. Many vendors publish annual security reports and issue regular security advisories. Security publications and analyst firms such as CSO, Gartner, Forrester or Securosis all have research reports that data can be gleaned from.

Then there’s your internal information. Chances are your IT department has records you can use—they likely count how many laptops are lost or stolen each year. You should also look to the experts within your company to help. Other people can provide a wealth of valuable information for use in your analysis. You can also look to the data you do have on related or similar attacks as a gauge.

Chances are, you already have the data you need or you can easily find it online. Use it.

With the ever-growing list of threats and risks organizations face today, we are fast reaching a time when failing to measure risk will no longer be acceptable—in the boardroom or even by governments.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cybersecurity, Data management, Information security, Open FAIR Certification, RISK Management, Uncategorized

Evolving Business and Technology Toward an Open Platform 3.0™

By Dave Lounsbury, Chief Technical Officer, The Open Group

The role of IT within the business is one that constantly evolves and changes. If you’ve been in the technology industry long enough, you’ve likely had the privilege of seeing IT grow to become integral to how businesses and organizations function.

In his recent keynote “Just Exactly What Is Going On in Business and Technology?” at The Open Group London Conference in October, Andy Mulholland, former Global Chief Technology Officer at Capgemini, discussed how the role of IT has changed from being traditionally internally focused (inside the firewall, proprietary, a few massive applications, controlled by IT) to one that is increasingly externally focused (outside the firewall, open systems, lots of small applications, increasingly controlled by users). This is due to the rise of a number of disruptive forces currently affecting the industry, such as BYOD, Cloud, social media tools, Big Data, the Internet of Things and cognitive computing. As Mulholland pointed out, IT today is about how people are using technology in the front office. They are bringing their own devices, they are using apps to get outside of the firewall, and they are moving further and further away from traditional “back office” IT.

Due to the rise of the Internet, the client/server model of the 1980s and 1990s that kept everything within the enterprise is no more. That model has been subsumed by a model in which development is fast and iterative and information is constantly being pushed and pulled primarily from outside organizations. The current model is also increasingly mobile, allowing users to get the information they need anytime and anywhere from any device.

At the same time, there is a push from business and management for increasingly rapid turnaround times and smaller scale projects that are, more often than not, being sourced via Cloud services. The focus of these projects is on innovating business models and acting in areas where the competition does not act. These forces are causing polarization within IT departments between internal IT operations based on legacy systems and new external operations serving buyers in business functions that are sourcing their own services through Cloud-based apps.

Just as UNIX® provided a standard platform for applications on single computers and the combination of servers, PCs and the Internet provided a second platform for web apps and services, we now need a new platform to support the apps and services that use cloud, social, mobile, big data and the Internet of Things. Rather than merely aligning with business goals or enabling business, the next platform will be embedded within the business as an integral element bringing together users, activity and data. To work properly, this must be a standard platform so that these things can work together effectively and at low cost, providing vendors a worthwhile market for their products.

Industry pundits have already begun to talk about this layer of technology. Gartner calls it the “Nexus of Forces.” IDC calls it the “third platform.” At The Open Group, we refer to it as Open Platform 3.0™, and earlier this year we announced a new Forum to address how organizations can adopt and support these technologies. Open Platform 3.0 is meant to enable organizations (including standards bodies, users and vendors) to coordinate their approaches to the new business models and IT practices driving the new platform, in order to support a new generation of interoperable business solutions.

As is always the case with technologies, a point is reached where technical innovation must transition to business benefit. Open Platform 3.0 is, in essence, the next evolution of computing. To help the industry sort through these changes and create vendor-neutral standards that foster the cohesive adoption of new technologies, The Open Group must also evolve its focus and standards to respond to where the industry is headed.

The work of the Open Platform 3.0 Forum has already begun. Initial actions for the Forum have been identified and were shared during the London conference. Our recent survey on Convergent Technologies confirmed the need to address these issues: of those surveyed, 95 percent of respondents felt that converged technologies are an opportunity for business, and 84 percent of solution providers are already dealing with two or more of these technologies in combination. Respondents also saw vendor lock-in as a potential hindrance to using these technologies, underscoring the need for an industry standard that will address interoperability. In addition to the survey, the Forum has also produced an initial Business Scenario to begin to address these industry needs and formulate requirements for this new platform.

If you have any questions about Open Platform 3.0 or if you would like to join the new Forum, please contact Chris Harding (c.harding@opengroup.org) for queries regarding the Forum or Chris Parnell (c.parnell@opengroup.org) for queries regarding membership.

Dave Lounsbury is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, Dave leads the delivery of The Open Group’s proven processes for collaboration and certification, both within the organization and in support of third-party consortia. Dave holds a degree in Electrical Engineering from Worcester Polytechnic Institute and is the holder of three U.S. patents.

1 Comment

Filed under Cloud, Data management, Future Technologies, Open Platform 3.0, Standards, Uncategorized, UNIX

Secure Integration of Convergent Technologies – a Challenge for Open Platform 3.0™

By Dr. Chris Harding, The Open Group

The results of The Open Group Convergent Technologies survey point to secure integration of the technologies as a major challenge for Open Platform 3.0. This and other input forms the basis for the definition of the platform, which was discussed at The Open Group conference in London.

Survey Highlights

Here are some of the highlights from The Open Group Convergent Technologies survey.

  • 95% of respondents felt that the convergence of technologies such as social media, mobility, cloud, big data, and the Internet of Things represents an opportunity for business.
  • Mobility currently has the greatest take-up of these technologies, and the Internet of Things has the least.
  • 84% of those from companies creating solutions want to deal with two or more of the technologies in combination.
  • Developing potential customers’ understanding of the technologies is the first problem that solution creators must overcome. This is followed by integrating with products, services and solutions from other suppliers, and using more than one technology in combination.
  • Respondents saw security, vendor lock-in, integration and regulatory compliance as the main problems for users of software that enables use of these convergent technologies for business purposes.
  • When users are considered separately from other respondents, security and vendor lock-in stand out particularly strongly as issues.

The full survey report is available at: https://www2.opengroup.org/ogsys/catalog/R130

Open Platform 3.0

Analysts forecast that the convergence of technical phenomena including mobility, cloud, social media, and big data will drive the growth in use of information technology through 2020. Open Platform 3.0 is an initiative that will advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to use these technologies effectively.

The survey confirms the value of an open platform to protect users of these technologies from vendor lock-in. It also shows that security is a key concern that must be addressed, that the platform must make the technologies easy to use, and that it must enable them to be used in combination.

Understanding the Requirements

The Open Group is conducting other work to develop an understanding of the requirements of Open Platform 3.0. This includes:

  • The Open Platform 3.0 Business Scenario, which was recently published and is available from https://www2.opengroup.org/ogsys/catalog/R130
  • A set of business use cases, currently in development
  • A high-level round-table meeting to gain the perspective of CIOs, who will be key stakeholders.

These requirements inputs were part of the discussion at The Open Group Conference, which took place in London this week. Monday’s keynote presentation by Andy Mulholland, former Global CTO at Capgemini, on “Just Exactly What Is Going on in Business and Technology?” included the conclusions from the round-table meeting. This week’s presentation and panel discussion on the requirements for Open Platform 3.0 covered all of these inputs.

Delivering the Platform

Review of the inputs in the conference was followed by a members meeting of the Open Platform 3.0 Forum, to start developing the architecture of Open Platform 3.0, and to plan the delivery of the platform definition. The aim is to have a snapshot of the definition early in 2014, and to deliver the first version of the standard a year later.

Meeting the Challenge

Open Platform 3.0 will be crucial to establishing openness and interoperability in the new generation of information technologies. This is of primary importance for everyone in the IT industry.

Following the conference, there will be an opportunity for everyone to contribute material and ideas for the definition of the platform. If you want to be part of the community that shapes the definition, to work on it with like-minded people in other companies, and to gain early insight into what it will be, then your company must join the Open Platform 3.0 Forum. (For more information on this, contact Chris Parnell – c.parnell@opengroup.org)

Providing for secure integration of the convergent technologies, and meeting the other requirements for Open Platform 3.0, will be a difficult but exciting challenge. I’m looking forward to continuing to tackle it with the Forum members.

Dr. Chris Harding

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

1 Comment

Filed under Cloud/SOA, Conference, Data management, Future Technologies, Open Platform 3.0, Semantic Interoperability, Service Oriented Architecture, Standards

IT Technology Trends – a Risky Business?

By Patty Donovan, The Open Group

On Wednesday, September 25, The Open Group will host a tweet jam examining a multitude of emerging and converging technology trends and the risks they present to organizations that have already adopted them or are looking to do so. Most of the technology concepts we’re talking about – Cloud, Big Data, BYOD/BYOS, the Internet of Things, etc. – are not new, but organizations are at differing stages of implementation and do not yet fully understand the longer-term impact of adoption.

This tweet jam will allow us to explore some of these technologies in more detail and look at how organizations may better prepare against potential risks – whether in regard to security, access management, policies, privacy or ROI. As discussed in our previous Open Platform 3.0™ tweet jam, new technology trends present many opportunities but can also present business challenges if not managed effectively.

Please join us on Wednesday, September 25 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST for a tweet jam that will discuss and debate the issues around technology risks. A number of key areas will be addressed during the discussion including: Big Data, Cloud, Consumerization of IT, the Internet of Things and mobile and social computing with a focus on understanding the key risk priority areas organizations face and ways to mitigate them.

We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of thought leaders, led by David Lounsbury, CTO, and Jim Hietala, VP of Security, of The Open Group. To access the discussion, please follow the #ogChat hashtag during the allotted discussion time.

  • Q1: Do you feel prepared for the emergence/convergence of IT trends? – Cloud, Big Data, BYOD/BYOS, Internet of Things
  • Q2: Where do you see risks in these technologies? – Cloud, Big Data, BYOD/BYOS, Internet of Things
  • Q3: How does your organization monitor for, measure and manage risks from these technologies?
  • Q4: Which policies are best at dealing with security risks from technologies? Which are less effective?
  • Q5: Many new technologies move data out of the enterprise to user devices or cloud services. Can we manage these new risks? How?
  • Q6: What role do standards, best practices and regulations play in keeping up with risks from these and future technologies?
  • Q7: Aside from risks caused by individual trends, what is the impact of multiple technology trends converging (Platform 3.0)?

And for those of you who are unfamiliar with tweet jams, here is some background information:

What Is a Tweet Jam?

A tweet jam is a one hour “discussion” hosted on Twitter. The purpose of this tweet jam is to share knowledge and answer questions on emerging/converging technology trends and the risks they present. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat hashtag.
    • Sample: “Big Data presents a large business opportunity, but it is not yet being managed effectively internally – who owns the big data function? #ogchat”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rob Checkal (rob.checkal at hotwirepr.com). We anticipate a lively chat and hope you will be able to join!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

1 Comment

Filed under Cloud, Cloud/SOA, Data management, Future Technologies, Open Platform 3.0, Platform 3.0, Tweet Jam

The Open Group Philadelphia – Day Three Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

We are winding down Day 3 and gearing up for the next two days of training and workshops.  Today’s subject areas included TOGAF®, ArchiMate®, Risk Management, Innovation Management, Open Platform 3.0™ and Future Trends.

The objective of the Future Trends session was to discuss “emerging business and technical trends that will shape enterprise IT”, according to Dave Lounsbury, Chief Technical Officer of The Open Group.

This track also featured a presentation by Dr. William Lafontaine, VP High Performance Computing, Analytics & Cognitive Markets, IBM Research, who gave an overview of the “Global Technology Outlook 2013”.  He stated the Mega Trends are:  Growing Scale/Lower Barrier of Entry; Increasing Complexity/Yet More Consumable; Fast Pace; Contextual Overload.  Mike Walker, Strategies & Enterprise Architecture Advisor for HP, noted the key disrupters that will affect our future are the business of IT, technology itself, expectation of consumers and globalization.

The session concluded with an in-depth Q&A with Bill, Dave, Mike and Allen Brown, CEO of The Open Group.

Other sessions included presentations by TJ Virdi (Senior Enterprise Architect, Boeing) on Innovation Management, Jack Jones (President, CXOWARE, Inc.) on Risk Management and Stephen Bennett (Executive Principal, Oracle) on Big Data.

A special thanks goes to our many sponsors during this dynamic conference: Windstream, Architecting the Enterprise, Metaplexity, BIZZdesign, Corso, Avolution, CXOWARE, Penn State – Online Program in Enterprise Architecture, and Association of Enterprise Architects.

Stay tuned for post-conference proceedings to be posted soon!  See you at our conference in London, October 21-24.

Comments Off

Filed under ArchiMate®, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Open Platform 3.0, RISK Management, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

Day 2 at The Open Group conference in the City of Brotherly Love, as Philadelphia is also known, was another busy and remarkable day.

The plenary started with a fascinating presentation, “Managing the Health of the Nation” by David Nash, MD, MBA, Dean of Jefferson School of Population Health.  Healthcare is the number one industry in the city of Philadelphia, with the highest number of patients in beds in the top 10 US cities. The key theme of his thought-provoking speech was “boundaryless information sharing” (sound familiar?), which will enable a healthcare system that is “safe, effective, patient-centered, timely, equitable, efficient”.

Following Dr. Nash’s presentation was the Healthcare Transformation Panel moderated by Allen Brown, CEO of The Open Group. Participants were: Gina Uppal (Fulbright-Killam Fellow, American University Program), Mike Lambert (Open Group Fellow, Architecting the Enterprise), Rosemary Kennedy (Associate Professor, Thomas Jefferson University), Blaine Warkentine, MD, MPH and Fran Charney (Pennsylvania Patient Safety Authority). The group brought different sets of experiences within the healthcare system and provided reaction to Dr. Nash’s speech. All agreed on the need for fundamental change and that technology will be key.

The conference featured a spotlight on The Open Group’s newest forum, Open Platform 3.0™, presented by Dr. Chris Harding, Director of Interoperability. Open Platform 3.0 was formed to advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises in the use of Cloud, Social, Mobile Computing and Big Data. For more info: http://www.opengroup.org/getinvolved/forums/platform3.0

The Open Group flourishes because of the interaction and collaboration of its people. The accolades continued with several members being recognized for their outstanding contributions to The Open Group Trusted Technology Forum (OTTF) and the Service-Oriented Architecture (SOA) and Cloud Computing Work Groups. To learn more about our Forums and Work Groups and how to get involved, please visit http://www.opengroup.org/getinvolved

Presentations and workshops were also held in the Healthcare, Finance and Government vertical industries. Presenters included Larry Schmidt (Chief Technologist, HP), Rajamanicka Ponmudi (IT Architect, IBM) and Robert Weisman (CEO, Build the Vision, Inc.).

2 Comments

Filed under Enterprise Architecture, Cybersecurity, Cloud/SOA, TOGAF®, ArchiMate®, Standards, Security Architecture, Business Architecture, Data management, Enterprise Transformation, Conference, O-TTF, Healthcare, Open Platform 3.0

The Open Group Philadelphia – Day One Highlights

By Loren K.  Baynes, Director, Global Marketing Communications at The Open Group.

On Monday, July 15th, we kicked off our conference in Philadelphia. As Allen Brown, CEO of The Open Group, commented in his opening remarks, Philadelphia is the birthplace of American democracy. This is the first time The Open Group has hosted a conference in this historic city.

Today’s plenary sessions featured keynote speakers covering topics ranging from an announcement of a new Open Group standard, appointment of a new Fellow, Enterprise Architecture and Transformation, Big Data and spotlights on The Open Group forums, Real-time Embedded Systems and Open Trusted Technology, as well as a new initiative on Healthcare.

Allen Brown noted that The Open Group has 432 member organizations with headquarters in 32 countries and over 40,000 individual members in 126 countries.

The Open Group Vision is Boundaryless Information Flow™ achieved through global interoperability in a secure, reliable and timely manner. But as stated by Allen, “Boundaryless does not mean there are no boundaries. It means that boundaries are permeable to enable business.”

Allen also presented an overview of the new “Dependability Through Assuredness™” Standard. The Open Group Real-time Embedded Systems Forum is the home of this standard. More news to come!

Allen introduced Dr. Mario Tokoro, (CEO of Sony Computer Systems Laboratories) who began this project in 2006. Dr. Tokoro stated, “Thank you from the bottom of my heart for understanding the need for this standard.”

Eric Sweden, MSIH MBA, Program Director, Enterprise Architecture & Governance, National Association of State CIOs (NASCIO), offered a presentation entitled “State of the States – NASCIO on Enterprise Architecture: An Emphasis on Cross-Jurisdictional Collaboration across States”. Eric noted, “Enterprise Architecture is a blueprint for better government.” Furthermore, “Cybersecurity is a top priority for government.”

Dr. Michael Cavaretta, Technical Lead and Data Scientist with Ford Motor Company, discussed “The Impact of Big Data on the Enterprise”. The five keys, according to Dr. Cavaretta, are “perform, analyze, assess, track and monitor”. Please see the following transcript of a Big Data analytics podcast, hosted by The Open Group, in which Dr. Cavaretta participated earlier this year: http://blog.opengroup.org/2013/01/28/the-open-group-conference-plenary-speaker-sees-big-data-analytics-as-a-way-to-bolster-quality-manufacturing-and-business-processes/

The final presentation during Monday morning’s plenary was “Enabling Transformation Through Architecture” by Lori Summers (Director of Technology) and Amit Mayabhate (Business Architect Manager) with Fannie Mae Multifamily.

Lori stated that their organization had adopted Business Architecture and today they have an integrated team who will complete the transformation, realize value delivery and achieve their goals.

Amit noted “Traceability from the business to architecture principles was key to our design.”

In addition to the many interesting and engaging presentations, several awards were presented.  Joe Bergmann, Director, Real-time and Embedded Systems Forum, The Open Group, was appointed Fellow by Allen Brown in recognition of Joe’s major achievements over the past 20+ years with The Open Group.

Other special recognition recipients include members from Oracle, IBM, HP and Red Hat.

In addition to the plenary session, we hosted meetings on Finance, Government and Healthcare industry verticals. Today is only Day One of The Open Group conference in Philadelphia. Please stay tuned for more exciting conference highlights over the next couple days.

Comments Off

Filed under ArchiMate®, Business Architecture, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Security Architecture, Standards, TOGAF®

As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into the The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we’re here now to learn more about how to leverage Platform 3.0 as more than an IT shift — and as a business game-changer. It will be a big topic at next week’s conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes, or technical changes, going on that are bringing together a lot of factors. They’re turning into this sort of super-saturated solution of ideas and possibilities, and this emerging idea that this represents a new platform. I think it’s a pretty fundamental change.

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and presenting the information in a way such that people can make decisions on it. Given all that we’re starting to realize, we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in trying to bring Platform 3.0 together is to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt, and what business processes they will need to put in place, in order to take maximum advantage of this change in the way that we look at information.

Harding: Enterprises have to keep up with the way that things are moving in order to keep their positions in their industries. Enterprises can’t afford to be working with yesterday’s technology. It’s a case of being able to understand the information that they’re presented, and make the best decisions.

We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale, the whole three Vs — velocity, volume, and value — on its own could perhaps be a game changing shift in the market. The drive of mobile devices into lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

It has to do with not just one solution, not one subscription model — because we’re now into this subscription-model era … the subscription economy, as one group tends to describe it. Now, we’re looking at not only providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 addresses this point by bringing these together. Just look at the numbers. Look at the scale that we’re dealing with — 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion mobile subscriptions according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We have had massive growth in the scale of mobile data traffic and internet data expansion. Mobile data traffic is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data center traffic, combining network and internet-based storage, will reach 6.6 zettabytes annually, and nearly two thirds of this will be cloud-based by 2016. This is only going to grow, as social networking is reaching nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a 2.55 billion global audience, by 2017, another extraordinary figure from an eMarketing.com study.

It is not surprising that many industry analysts are seeing growth in the technologies of mobility, social computing, big data and cloud convergence at 30 to 40 percent, and the shift to B2C commerce, which passed $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.
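As an aside (not part of the original podcast), the figures Skilton quotes are internally consistent, and a quick back-of-envelope calculation shows it: an 18-fold increase over the five years from 2011 to 2016 implies roughly 78 percent compound annual growth, and 130 exabytes per year works out to about 10.8 exabytes per month. The variable names below are illustrative, not from any source.

```python
# Back-of-envelope check of the growth figures quoted in the discussion.
# Assumptions (mine, not the speakers'): the 18-fold mobile-data figure
# spans the five years 2011-2016, and the growth is compound.

years = 5
growth_multiple = 18     # mobile data traffic multiplier, 2011 -> 2016
annual_exabytes = 130    # quoted mobile data volume in 2016, EB per year

# Implied compound annual growth rate (CAGR): 18^(1/5) - 1
cagr = growth_multiple ** (1 / years) - 1

# 130 EB per year expressed as a monthly run rate
monthly_eb = annual_exabytes / 12

print(f"Implied CAGR: {cagr:.0%}")                  # roughly 78% per year
print(f"Monthly run rate: {monthly_eb:.1f} EB/month")
```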

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases that have emerged in the open-source world and things of that nature, but there are also the analytics and the data stewardship aspects of it.

We need to bring in mechanisms so the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of enterprise IT or where end users drive the selection, to provide analytic tools and recommendation tools that take the data and turn it into information. One of the things you can’t do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally — how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points about how the shape of data has changed, and also about why IT should get involved. We’re seeing a shift in the constituency of who is using this data.

We have the Chief Marketing Officer, the Chief Procurement Officer and other key line-of-business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We’ve got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be served up to new types of mobile devices, through new types of data intelligence and new ways of delivering this kind of service.

I read recently in MIT Sloan Management Review an article that asked what the role of the CIO is. There is still the critical role of managing the security, compliance, and performance of these systems. But there’s also a socialization of IT, and this is where positioning architectures that are cross-platform is key to delivering real value to the business users in the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it’s not only that a whole set of different people are getting involved with different kinds of information, but there’s also a step change in the speed with which all this is delivered. It’s no longer the case that you can say, “Oh well, we need some kind of information system to manage this information. We’ll procure it and get a program written,” and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It’s really a case of having the standard technology platform, and also the systems for using it, the business processes, understood and in place.

Then you can do all these things quickly and build on what people have learned in the past, and not go off into all sorts of new experimental things that might not lead anywhere. It’s a case of building up the standard platform on industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practice and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. First is platform scalability: understanding what you can do with your current legacy systems when introducing cloud computing or big data, and the infrastructure that gives you what we call multiplexing of resources. We’re very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

Moving on to network scalability, a lot of customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it’s all about connectivity in the field. I meet a number of clients who are saying, “We’ve got this cloud service,” or “This service is in a certain area of my country. If I move to another part of the country, or I’m traveling, I can’t get connectivity.” That’s the big issue of scaling.

Another one is application programming interfaces (APIs). What we’re seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities of the kind Chris Anderson of Wired called the “long tail effect.” It is now a reality in terms of building the kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are where they need to start thinking in their IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formational stages of the “third platform,” or Platform 3.0, for The Open Group as an industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 Forum. We’re leveraging a lot of the components that have been done previously through the work of the members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of Things.

First step

Our first step is to bring those things together to make sure that we’ve got a foundation to depart from. The next thing is that, through our Platform 3.0 Forum and the Steering Committee, we can ask people to talk about their scenarios for the adoption of Platform 3.0.

That can range from the technological aspects and what standards are needed, but we also want to take a cue from our previous cloud working group: what are the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them?

What we’re really working toward in Philadelphia is to set up an exchange of ideas among the people who can bring in their use cases from the buy side and their ideas about technology possibilities from the supply side, bring those together, and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We’ve heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, we think about how powerful the data can be when processed properly, with recommendations delivered to the right place at the right time, but we also recognize that there are limits to a manual, or even human-level, approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, it seems there are already some early examples where bringing cloud, data, social, mobile, and granular interactions together has shown how a recommendation engine could be brought to bear. I’m thinking about the Siri capability at Apple and even some of the examples of the Watson technology at IBM.

So to our panel, are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a supercomputer or a data center of supercomputers, brought to bear on almost any problem instantly, with the result delivered directly to a smartphone or any number of endpoints?

It seems that the potential here is mind boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet. The advent of IPv6 and the explosion in multimedia services will start to drive the next generation of the Internet.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, insight into what’s happening, and the predictive nature of these services, it becomes something much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing of data, traffic, devices and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it’s something that’s really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but just for businesses, that alone to me would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to have significant advantage and first mover benefits?

Harding: Businesses always are taking stock. They understand their environments. They understand how the world that they live in is changing and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, “So now this is where we could make a change to our business.” It’s the vision moment where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and, I’m very happy to say, it’s something that computers aren’t going to be able to do for a very long time yet. It’s really going to be down to business people to do this, as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling the gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years, but in the next couple of technology cycles, that we’ll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data, and all of the medical textbooks ever written are data. Pick your industry, and there is a huge knowledge base that humans must currently keep on top of.

This approach, and these advances in recommendation engines driven by the availability of big data, are going to produce profound changes in the way knowledge workers perform their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.


Filed under ArchiMate®, Business Architecture, Cloud, Cloud/SOA, Conference, Data management, Enterprise Architecture, Platform 3.0, Professional Development, TOGAF®

Three laws of the next Internet of Things – the new platforming evolution in computing

By Mark Skilton, Global Director at Capgemini

There is a wave of new devices and services growing in strength, extending the boundary of what is possible in today’s internet-driven economy and lifestyle. A striking feature is the link between apps on smartphones and tablets and the ability to connect not just to websites but also to data-collection sensors and intelligent analysis of that information. A key driver has been the improvement in the cost-performance curve of information technology, not only in CPU and storage but in the ready availability and affordability of highly powerful computing and mass storage in mobile devices; coupled with access to complex sensors, advanced optics and screen displays, this results in a potentially truly immersive experience. This is a long way from the early days of radio frequency identification (RFID) tags, which are the forerunner of this evolution. Digitization of information and the interpretation of its meaning is everywhere, moving into a range of industries and augmented services that create new possibilities and value. A key challenge is how to understand this growth of devices, sensors, content and services across the myriad platforms and permutations it can bring. Examples include:

• Energy conservation

  - Through home and building energy management.

• Lifestyle activity

  - Motion sensors: accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors.

• Lifestyle health

  - Heart rate, blood oxygen level, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential measurements that connected devices can capture.

• Medical health

  - Biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Devices can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively. These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping to improve quality of life and to address rising medical costs and the societal impact of an aging population.

• Transport

  - Precision global positioning, local real-time image perception and interpretation sensing, dynamic electromechanical control systems.

• Materials science, engineering and manufacturing

  - Strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing. 3D printing has the potential to revolutionize automated manufacturing, and through distributed services over the internet, manufacturing can potentially be accessed by anyone.

• Physical safety and security

  - Examples include controlling children’s access to their mobile phones from a PC, using web-based applications to monitor and control mobile and computing access, and keyless entry using a phone: Wi-Fi, Bluetooth and internet-connected apps and devices that automate the locking of physical doors and entry, remotely or in proximity.

• Remote activity and swarming robotics

  - The development of autonomous robotics to respond to and support exploration and services in harsh or inaccessible environments; disabled support through robotic prosthetics and communication synthesis; swarming robots that fly or that mimic group behavior, nature and decision making.

These are just the tip of what is possible: the early commercial ventures that are starting to drive new ways to think about information technology and application services.

A key feature I noticed in all these devices is that they augment previous layers of technology by sitting on top of them and adding extra value. While the long shadow of the first-generation giants of the public internet (Apple, Google, Amazon) often gives the impression that to succeed means a controlled platform and an investment of millions, these new technologies use existing infrastructure and operate across a federated, distributed architecture that represents a new kind of platforming paradigm of multiple systems.

Perhaps a paradigm of new technology cycles is that as new technology arrives, it cannibalizes older technologies. Clearly nothing is immune to this trend, not even the cloud. I’ll call it the evolution of a kind of technology law (a feature I saw in Charles Fine’s book Clockspeed, http://www.businessforum.com/clockspeed.html, but adapted here as a function of compound cannibalization and augmentation). I think Big Data is an example of such a shift, as augmented informatics enables the next generation of added-value services.

These devices and sensors can work with existing infrastructure services and resources, but they also create a new kind of computing architecture that involves many technologies, standards and systems: what was earlier called “system of systems” integration (examples are seen in the defence sector, http://www.bctmod.army.mil/SoSI/sosi.html, and in digital ecosystems in the government sector, http://www.eurativ.com/specialreport-skills/kroes-europe-needs-digital-ecosy-interview-517996).

While a sensor device can replace the existing thermostat in your house, the lighting, or the access locks on your doors, these devices offer a new kind of augmented experience, providing information and insight that enable better control of the wider environment, or of the actions and decisions within a context.

This leads to a second feature of these devices: the ability to learn and adapt from inputs and the environment. This probably has an even larger impact than the first feature (the use of existing infrastructure), because the ability to change outcomes is a revolution in information. The earlier idea of static information, with humans making sense of the data, is being replaced by the active pursuit of automated intelligence from the machines we build. Earlier design paradigms that needed to define declarative services, what IT calls CRUD (Create, Read, Update, Delete), as predefined and managed transactions are being replaced by machine-learning algorithms that seek to build a second generation of intelligent services, altering their results and services with the passage of time and usage characteristics.
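As a rough sketch of this shift (the class names and the moving-average learning rule below are my own illustrative assumptions, not any product’s real API), a CRUD-style thermostat stores a fixed setpoint as a managed record, while a learning one adapts its setpoint to observed usage over time:

```python
# Hypothetical sketch of the shift from static CRUD services to adaptive ones.
# Names and the moving-average rule are illustrative, not from a real device.

class CrudThermostat:
    """First-generation service: the setpoint is a stored record
    (Create/Read/Update/Delete), fixed until a user updates it."""
    def __init__(self, setpoint):
        self.setpoint = setpoint

    def target(self):
        return self.setpoint

class LearningThermostat:
    """Second-generation service: the setpoint adapts to observed usage."""
    def __init__(self, setpoint, rate=0.2):
        self.setpoint = setpoint
        self.rate = rate              # how quickly the service adapts

    def observe(self, chosen_temp):
        # exponential moving average of what the occupants actually choose
        self.setpoint += self.rate * (chosen_temp - self.setpoint)

    def target(self):
        return self.setpoint

crud = CrudThermostat(20.0)
smart = LearningThermostat(20.0)
for temp in [22.0, 22.0, 21.5, 22.5]:   # occupants keep nudging the heat up
    smart.observe(temp)

print(crud.target())                     # stays at the stored record, 20.0
print(round(smart.target(), 2))          # has drifted toward the observed preference
```

The point of the sketch is only the contrast: the second service’s output is a function of usage history rather than a stored transaction.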

This leads me to a third effect, which became apparent in the discussion of lifestyle services versus medical and active device management. In the case of lifestyle devices, a key feature is the ability to blend in with personal activity to enable new insight into behavior and lifestyle choices, to passively and actively monitor or take action, but not always to affect the behavior itself; that is, to provide unobtrusive, ubiquitous presence. Taking this idea further, it is also about the way devices could merge in and become integrated within the context of the user or the environmental setting. Biomedical devices that augment patient care and wellbeing are one example that can have a real and substantive impact on quality of life, as well as on the cost efficiency of care programs with an aging population to support.

An interesting side effect of these trends is the cultural dilemma these devices and sensors bring through intrusion into personal data and privacy. Yet once the meaning and value of this telemetry on safety, health or material factors is perceived to be for the good of the individual and the community, the adoption of such services may become more pronounced and reinforced: a virtuous circle of accelerated adoption, a key characteristic of successful growth and a kind of conditioning feedback that creates positive reinforcement. While a key underpinning feature is the ability of the device and sensor to have an unobtrusive, ubiquitous presence, this overall effect is central to the idea of effective system-of-systems integration and Boundaryless Information Flow™ (The Open Group).

I see these trends as three laws of the next Internet of Things, describing a next-generation platforming strategy and evolution.

It’s clear that sensors and devices are merging together in a way that will cut across industries. Motion and temperature sensors in one industry will see application in another. Services from one industry may connect with other industries as combinations of these services, lifestyles and effects.

[Diagram: Iofthings1.jpg]

Formal and informal communities, both physical and virtual, will be connected through sensors and devices that pervade the social, technological and commercial environments. This will drive further growth in the mass of data and digitized information, with the gradual semantic representation of this information into meaningful context. App services will develop increasing intelligence and awareness of the multiplicity of data, its content and metadata, adding new insight and services to the infrastructure fabric. This is a new platforming paradigm that may be constructed from one or many systems and architectures, from macro- to micro- and nano-level systems technologies.

The three laws as I describe them may be recast in a lighter, tongue-in-cheek way by comparing them to Isaac Asimov’s famous three laws of robotics. This is just an illustration, but it implies that the sequence of laws in some fashion protects the users, resources and environment by some altruistic motive. This may be the case in some system feedback loops that seek this goal, but often commercial microeconomic considerations will be more of a driver. However, I can’t help thinking that this does hint at what may be the first stepping stone to the eventuality of such laws.

Three laws of the next generation of The Internet of Things – a new platforming architecture

Law 1. A device, sensor or service may operate in an environment if it can augment infrastructure.

Law 2. A device, sensor or service must be able to learn and adapt its response to the environment, as long as it is not in conflict with the First Law.

Law 3. A device, sensor or service must have unobtrusive, ubiquitous presence, such that it does not conflict with the First or Second Laws.
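Read as an admission policy, the three laws form an ordered chain of checks, each subordinate to the ones before it. A minimal hypothetical sketch (the predicate names are my own, not from any standard):

```python
# Hypothetical encoding of the three laws as an ordered admission check.
# The three boolean properties are illustrative assumptions about a device profile.

def may_operate(augments_infrastructure: bool,
                learns_and_adapts: bool,
                unobtrusive_presence: bool) -> bool:
    """A device, sensor or service may operate only if each law holds,
    checked in order of precedence (Law 1 before Law 2 before Law 3)."""
    if not augments_infrastructure:   # Law 1: must augment infrastructure
        return False
    if not learns_and_adapts:         # Law 2: must learn and adapt, subject to Law 1
        return False
    if not unobtrusive_presence:      # Law 3: must be unobtrusive and ubiquitous
        return False
    return True

print(may_operate(True, True, True))   # all three laws satisfied
print(may_operate(True, False, True))  # fails Law 2, so may not operate
```

The ordering matters: a device that is unobtrusive but does not augment infrastructure fails at the first check, mirroring the precedence the laws state.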

References

• Energy conservation

  - The example of the Nest learning thermostat (http://www.nest.com), founded by Tony Fadell, ex-iPod hardware designer and head of the iPod and iPhone division at Apple. The device monitors and learns about energy usage in a building, and adapts and controls the use of energy for improved carbon and cost efficiency.

• Lifestyle activity

  - Motion sensors: accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors. Examples include UP by Jawbone (https://jawbone/up) and Fitbit (http://www.fitbit.com).

• Lifestyle health

  - Heart rate, blood oxygen level, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential measurements that connected devices such as Zensorium (http://www.zensorium.com) can capture.

• Medical health

  - Biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Devices can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively (http://www.nytimes.com/2010/07/29/garden/29parents.html?pagewanted-all). These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping to improve quality of life and to address rising medical costs and the societal impact of an aging population.

• Transport

  - Precision global positioning, local real-time image perception and interpretation sensing, dynamic electromechanical control systems. Examples include Toyota’s advanced IT systems to help drivers avoid road accidents (http://www.toyota.com/safety/) and the Google driverless car (http://www.forbes.com/sites/chenkamul/2013/01/22/fasten-your-seatbelts-googles-driverless-car-is-worth-trillions/).

• Materials science, engineering and manufacturing

  - Strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing. 3D printing has the potential to revolutionize automated manufacturing, and through distributed services over the internet, manufacturing can potentially be accessed by anyone.

• Physical safety and security

  - Alpha Blue (http://www.alphablue.co.uk): controlling children’s access to their mobile phones from a PC, an example of parental protection using web-based applications to monitor and control mobile and computing access.

  - Keyless entry using a phone: Wi-Fi, Bluetooth and internet-connected apps and devices that automate the locking of physical doors and entry, remotely or in proximity. Examples include Lockitron (https://www.lockitron.com).

• Remote activity and swarming robotics

  - The development of autonomous robotics to respond to and support exploration and services in harsh or inaccessible environments. Examples include the NASA Mars Curiosity rover, which has active control programs to determine remote actions on the red planet; with a one-way signal delay of 13 minutes, 48 seconds at entry, descent and landing, a round trip of approximately 30 minutes is needed to detect and react to an event remotely from Earth (http://blogs.eas.int/mex/2012/08/05/time-delay-betrween-mars-and-earth/, http://www.nasa.gov/mission_pages/mars/main/imdex.html). Disabled support through robotic prosthetics and communication synthesis (http://disabilitynews.com/technology/prosthetic-robotic-arm-can-feel/). Swarming robots that fly or mimic group behavior (University of Pennsylvania, http://www.reuters.com/video/2012/03/20/flying-robot-swarms-the-future-of-search?videoId-232001151) and swarming robots that mimic nature and decision making (Natural Robotics Lab, The University of Sheffield, UK, http://www.sheffield.ac.uk/news/nr/sheffield-centre-robotic-gross-natural-robotics-lab-1.265434).

 Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.


Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0

The era of “Internet aware systems and services” – the multi-data, multi-platform, multi-device and multi-sensor world

By Mark Skilton, Global Director at Capgemini

Communications + Data protocols and the Next Internet of Things Multi-Platform solutions

Much of the discussion of the “Internet of Things” has been around industry-sector examples of device and sensor services; I have listed examples of these at the end of this paper. What is central to this emerging trend is not just sector point solutions but three key technical issues driving a new industry-sector digital services strategy to bring these together into a coherent whole.

  1. How combinations of system technology platforms are converging, enabling composite business processes that are mobile, content- and transaction-rich, and have near-real-time persistence and interactivity
  2. The development of “non-web-browser” protocols for emerging sensor-driven machine data, which extend new types of data into internet-connected business and social integration
  3. The development of “connected systems” that move solutions into a new digital-services model of multiple services across platforms, creating new business and technology services

I want to illustrate this by focusing on three topics:  multi-platforming strategies, communication protocols and examples of connected systems.

I want to show that this is not the simple “three or four step model” I often see, where mobile + applications + cloud equals a solution, but which results in silos of data and platform-integration challenges. New processing methods for big data platforms, distributed stream computing and in-memory database services, for example, are changing the nature of business analytics, in particular marketing and sales strategic planning and insight. New feedback systems collecting social and machine-learning data are creating new types of business growth opportunity in context-aware services that augment skills and services.

The major solutions in the digital ecosystem today incorporate an ever-growing mix of devices and platforms that offer new user experiences and forms of organization. This can be seen across almost all industry sectors, and horizontally between industry sectors. The following diagram is a simplistic view I want to use to illustrate the fundamental structures that are forming.

[Diagram: Iofthings1.jpg]

Multiple devices that offer simple-to-complex visualization and on-board application services.

Multiple sensors that can economically detect, measure and monitor most physical phenomena: light, heat, energy, chemical and radiological, in both non-biological and biological systems.

Physical and virtual communities of formal and informal relationships. These are human and/or machine-based associations, in the sense that search and discovery of data and resources can now work autonomously across an internet of many different types of data.

Physical and virtual infrastructure representing servers, storage, databases, networks and other resources that can constitute one or more platforms and environments. This infrastructure is now more complex in that it is both distributed and federated across multiple domains: mobile platforms, cloud computing platforms, social network platforms, big data platforms and embedded sensor platforms. The sense of a single infrastructure is both correct and incorrect, in that it is a combined state and set of resources that may or may not be within the span of control of an individual or organization.

Single and multi-tenanted application services that operate in transactional, semi-deterministic or non-deterministic ways, driving logical processing, formatting, interpretation, computation and other processing of data and results from one-to-many, many-to-one or many-to-many platforms and endpoints.

The key to thinking in multiple platforms is to establish the context of how these fundamental forces of platform services are driving interactions for many industries, businesses, social networks and services. This is changing because the platforms are interconnected, which alters the very basis of what defines a platform, from a single-platform to a multiple-platform concept.

This diagram illustrates some of these relationships and arrangements. It is just one example of a digital ecosystem pattern; there can be other arrangements of these system use cases to meet different needs and outcomes.

I use this model to illustrate some of the key digital strategies to consider in empowering communities, driving value-for-money strategies, or establishing a joined-up device and sensor strategy for new mobile knowledge workers. This is particularly relevant to key business stakeholders’ decision-making processes today, from Sales, Marketing, Procurement, Design, Sourcing, Supply and Operations up to board level, as well as to IT-related strategy, service integration and engineering.

Taking one key stakeholder example, the Chief Marketing Officer (CMO) is interested in, and central to, strategic channel and product development and brand management. The CMO typically seeks to develop customer zones, supplier zones, marketplace trading communities, social networking communities and behavior-insight leadership. These are critical drivers for a successful company presence, for product and service brand and market growth, and for managing and aligning IT cost and spend to what the business needs for performance. This creates a new kind of digital marketing infrastructure to drive new customer and marketing value. The following diagram illustrates types of marketing services that raise questions over the types of platforms needed for single and multiple data sources, data quality and fidelity.

These interconnected issues affect the efficacy and relevance of marketing services, which must work at the speed, timeliness and point of contact necessary to add and create customer and stakeholder value.

What all these new converged technologies have in common is communications. And not just HTTP: a wider band of frequencies and protocols is blurring together what it is now possible to connect.

These protocols include Wi-Fi and other wireless systems and standards that operate not just in the voice and speech bands but also in the collection and use of other types of telemetry, relating to other senses and detectors.

All of these share common issues of device and sensor compatibility, discovery and pairing, and security compatibility and controls.

Examples of communication standards for multiple services:

  • Wireless: WLAN, Bluetooth, ZigBee, Z-Wave, Wireless USB
  • Proximity: smartcard (passive, active), vicinity card
  • Infrared: IrDA
  • Satellite: GPS
  • Mobile: 3G, 4G LTE, cellular, femtocell, GSM, CDMA, WiMAX
  • RFID: LF, HF and other RF bands
  • Encryption: WEP, WPA, WPA2, WPS, others

These communication protocols impact the design and connectivity of system-to-system services. The standards determine the operability of the services that can be used in the context of a platform, and how they are delivered to and used by consumers and providers. How does the data and service connect with the platform? How does the service content get collected, formatted, processed and transmitted between the source and target platforms? How do these devices and sensors work to support extended and remote mobile and platform services? Which distributed workloads work best on a mobile platform or a sensor platform, and which are better distributed to a dedicated or shared platform that may be cloud-based or appliance-based, for example?

Answering these questions is key to providing a consistent and powerful digital service strategy that is flexible and capable of exploiting, scaling and operating with these new system and intersystem capabilities.

This becomes central to a new generation of internet-aware data and services that represent the digital ecosystem, delivering new business and consumer experiences on and across platforms.

This results in a new kind of user experience and presence strategy that moves the “single voice of the customer” and the “customer’s single voice” to a new level, one that works across mobiles, tablets and other devices and sensors that translate and create new forms of information and experience for consumers and providers. Combining this with new sensors that can capture, for example, positional, physical and biomedical data makes new content a reality in this new generation of digital services. Smartphones today have a price point that includes many built-in sensors: precision technologies measuring physical and biological data sources. When these are built into new feedback and decision analytics, they create a whole new set of possibilities in real-time and near-real-time augmented services, as well as new levels of insight into resource use and behavior.

The scale and range of data types (text, voice, video, image, semi-structured, unstructured, knowledge, metadata, contracts, IP) about social, business and physical environments have moved beyond the early days of RFID tags to encompass new Internet-aware sensors, systems, devices and services. This is not just the “tabs and pads” of mobiles and tablets but a growing presence in the “boards, places and spaces” that make up physical environments, turning them into part of the interactive experience and sensory input of service interaction. It now extends to the massive scale of terrestrial communications that connect across the planet and beyond (in the case of NASA, for example), but also right down to the micro, nano, pico and quantum levels in the case of molecular and nanotech engineering. All of these are now part of a modern technological landscape that is pushing the barriers of what is possible in today’s digital ecosystem.

The conclusion is that strategic planning needs to have insight into the nature of new infrastructures and applications that will support these new multisystem workloads and digital infrastructures.
I illustrate this in the following diagram, a “multi-platforming” framework that represents this emerging ecosystem of services.

Digital Service = k ∑ Platforms + ∑ Connections

k = a coefficient measuring how open or closed the service is, and its potential value

Digital Ecosystem = e ∑ Digital Services

e = a coefficient measuring how diverse and dynamic the ecosystem and its service participants are
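Read literally, the formulas above can be sketched in code. This is only one possible reading: the coefficient k is taken to scale the platform term alone, matching the expression as written, and every numeric score below is a hypothetical placeholder, since the post does not define units for platforms, connections or coefficients.

```python
def digital_service(k, platforms, connections):
    """Digital Service = k * sum(Platforms) + sum(Connections)."""
    return k * sum(platforms) + sum(connections)

def digital_ecosystem(e, services):
    """Digital Ecosystem = e * sum(Digital Services)."""
    return e * sum(services)

# Two made-up services combined into one ecosystem score.
# k and e values here are invented for illustration only.
s1 = digital_service(k=0.8, platforms=[3, 5], connections=[2, 2])
s2 = digital_service(k=1.2, platforms=[4], connections=[1])
eco = digital_ecosystem(e=0.9, services=[s1, s2])
print(f"services: {s1:.1f}, {s2:.1f}; ecosystem: {eco:.2f}")
```

The useful observation is structural rather than numeric: the service value grows with both the platforms it spans and the connections between them, and the ecosystem value compounds across services.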

In future blogs I will explore the impact on enterprise architecture and digital strategy, and the emergence of a new kind of architecture that I call Ecosystem Architecture.

Examples of new general industry sector services: the Internet of Things

 Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness and creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

4 Comments

Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0

Questions for the Upcoming Platform 3.0™ Tweet Jam

By Patty Donovan, The Open Group

Last week, we announced our upcoming tweet jam on Thursday, June 6 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST, which will examine how convergent technologies such as Big Data, Social, Mobile and the Internet of Things are impacting today’s business operations. We will also discuss the opportunities available to organizations that keep up with this rapid pace of change, and the steps they might take to get there.

The discussion will be moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct, and we welcome both members of The Open Group and interested participants alike to join the session.

The discussion will be guided by these five questions:

- Does your organization see a convergence of emerging technologies such as social networking, mobile, cloud and the internet of things?

- How has this convergence affected your business?

- Are these changes causing you to change your IT platform; if so how?

- How is the data created by this convergence affecting business models or how you make business decisions?

- What new IT capabilities are needed to support new business models and decision making?

To join the discussion, please follow the #ogp3 and #ogChat hashtags during the allotted discussion time.

For more information about the tweet jam, guidelines and general background information, please visit our previous blog post.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rob Checkal (rob.checkal at hotwirepr dot com) or leave a comment below. We anticipate a lively chat and hope you will be able to join us!

patricia donovanPatricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

Comments Off

Filed under Cloud, Cloud/SOA, Data management, Platform 3.0, Tweet Jam

Why should your business care about Platform 3.0™? A Tweet Jam

By Patty Donovan, The Open Group

On Thursday, June 6, The Open Group will host a tweet jam examining Platform 3.0™ and why businesses require it to remain relevant in today’s fast-paced, Internet-enabled business environment. Over recent years a number of convergent technologies have emerged which have the potential to disrupt the way we engage with each other in both our personal and business lives. Many of us are familiar with the buzzwords, including Mobile, Social, Big Data, Cloud Computing, the Internet of Things, Machine-to-Machine (M2M) and Consumerization of IT (CoIT) – but what do they mean for our current business operating environments, and what should businesses be doing to ensure that they keep pace?

Gartner was the first to recognize this convergence of trends representing a number of architectural shifts which it called a ‘Nexus of Forces’. This Nexus was presented as both an opportunity in terms of innovation of new IT products and services and a threat for those who do not keep pace with evolution, rendering current Business Architectures obsolete.

Rather than tackle this challenge solo, The Open Group is working with a number of IT experts, analysts and thought leaders to better understand the opportunities available to businesses and the steps they need to take to get them there.

Please join us on Thursday, June 6 at 9:00 a.m. PT/12:00 p.m. ET/5:00 p.m. BST for a tweet jam, moderated by Dana Gardner (@Dana_Gardner), ZDNet – Briefings Direct, that will discuss and debate the issues around Platform 3.0™. Key areas that will be addressed during the discussion include the specific technical trends (Big Data, Cloud, Consumerization of IT, etc.) and ways businesses can use them – and are already using them – to increase their business opportunity. We welcome Open Group members and interested participants from all backgrounds to join the session and interact with our panel of thought leaders, led by David Lounsbury, CTO, and Chris Harding, Director of Interoperability, from The Open Group. To access the discussion, please follow the #ogp3 and #ogChat hashtags during the allotted discussion time. The discussion will be guided by these five questions:

- Does your organization see a convergence of emerging technologies such as social networking, mobile, cloud and the internet of things?

- How has this convergence affected your business?

- Are these changes causing you to change your IT platform; if so how?

- How is the data created by this convergence affecting business models or how you make business decisions?

- What new IT capabilities are needed to support new business models and decision making?

And for those of you who are unfamiliar with tweet jams, here is some background information:

What Is a Tweet Jam?

A tweet jam is a one hour “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on Platform 3.0™. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

  • Have your first #ogChat or #ogp3 tweet be a self-introduction: name, affiliation, occupation.
  • Start all other tweets with the question number you’re responding to and the #ogChat or #ogp3 hashtag.
    • Sample: “There are already a number of organizations taking advantage of Platform 3.0 technology trends #ogp3”
  • Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
  • While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue!
  • A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rob Checkal (rob.checkal at hotwirepr dot com). We anticipate a lively chat and hope you will be able to join!

patricia donovanPatricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

2 Comments

Filed under Cloud, Cloud/SOA, Data management, Platform 3.0, Tweet Jam

Thinking About Big Data

By Dave Lounsbury, The Open Group

“We can not solve our problems with the same level of thinking that created them.”

- Albert Einstein

The growing consumerization of technology and convergence of technologies such as the “Internet of Things”, social networks and mobile devices are causing big changes for enterprises and the marketplace. They are also generating massive amounts of data related to behavior, environment, location, buying patterns and more.

Having massive amounts of data readily available is invaluable. More data means greater insight, which leads to more informed decision-making. So far, we are keeping ahead of this data through smarter analytics and improvements in the way we handle it. The question is, how long can we keep up? The rate of data production is increasing; as an example, an IDC report[1] predicts that the production of data will increase 50X in the coming decade. Magnifying the problem, there’s an accompanying explosion of data about the data – cataloging information, metadata and the results of analytics are all data in themselves. At the same time, data scientists and engineers who can deal with such data are already a scarce commodity, and the number of such people is expected to grow only 1.5X in the same period.

It isn’t hard to draw the curve. Turning data into actionable insight is going to be a challenge – data flow is accelerating at a faster rate than the available humans can absorb, and our databases and data analytic systems can only help so much.
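The curve is indeed easy to draw. This illustrative sketch uses only the IDC projections quoted above (50x data growth, 1.5x staffing growth over a decade); the per-year figures are derived from those two numbers, nothing more:

```python
# Projections quoted in the post: data volume grows ~50x, while the
# pool of data scientists/engineers grows ~1.5x, over the same decade.
data_growth = 50.0
staff_growth = 1.5

# The data load per professional grows by the ratio of the two:
load_per_person = data_growth / staff_growth
print(f"data per professional: ~{load_per_person:.0f}x over the decade")  # ~33x

# The same projections expressed as compound annual growth rates:
data_cagr = data_growth ** (1 / 10) - 1
staff_cagr = staff_growth ** (1 / 10) - 1
print(f"data: ~{data_cagr:.0%}/yr vs. staffing: ~{staff_cagr:.0%}/yr")
```

In other words, under these projections each data professional must cope with roughly 33 times more data by the end of the decade, which is the gap that better tooling and machine assistance would need to close.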

Markets never leave gaps like this unfilled, and because of this we should expect to see a fundamental shift in the IT tools we use to deal with the growing tide of data. In order to solve the challenges of managing data with the volume, variety and velocities we expect, we will need to teach machines to do more of the analysis for us and help to make the best use of scarce human talents.

The Study of Machine Learning

Machine Learning, sometimes called “cognitive computing”[2] or “intelligent computing”, is the study of building computers with the capability to learn and perform tasks based on experience. Experience in this context includes examining vast data sets, using multiple “senses” or types of media, recognizing patterns from past history or precedent, and extrapolating from this information to reason about the problem at hand. One example of machine learning currently underway in the healthcare sector is medical decision aids that learn to predict therapies or to help with patient management, based on correlating a vast body of medical and drug experience data with information about the patients under treatment.
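As a toy illustration of “learning from experience” — not the actual technique behind any real decision aid — a nearest-neighbour learner simply recommends whatever worked for the most similar past case. All patient features and therapy labels below are invented placeholders:

```python
import math

# "Experience": past cases as (features, therapy) pairs.
# Features are hypothetical normalized measurements, not real clinical data.
history = [
    ([0.9, 0.1], "therapy_A"),
    ([0.8, 0.2], "therapy_A"),
    ([0.2, 0.9], "therapy_B"),
    ([0.1, 0.8], "therapy_B"),
]

def predict(features):
    """1-nearest-neighbour: recommend what worked for the most similar past case."""
    _, therapy = min(history, key=lambda case: math.dist(case[0], features))
    return therapy

print(predict([0.85, 0.15]))  # nearest past cases suggest therapy_A
```

A production system would of course use far richer models and vast datasets, but the principle is the same: past cases are the “experience” from which the machine extrapolates to the problem at hand.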

A well-known example of this is Watson, a machine learning system IBM unveiled a few years ago. While Watson is best known for winning Jeopardy, that was just the beginning. IBM has since built six Watsons to assist with their primary objective: helping healthcare professionals find answers to complex medical questions and manage patient care[3]. Watson’s sophistication is a direct response to the explosion of data described above. Watson of course isn’t the only example in this field; others range from Apple’s Siri intelligent voice-operated assistant to DARPA’s SyNAPSE program[4].

Evolution of the Technological Landscape

As the consumerization of technology continues to grow and converge, the way we construct business models and systems needs to evolve as well. We need to let data drive the business process, and incorporate intelligent machines like Watson into our infrastructure to help us turn data into actionable results.

There is an opportunity for information technology and companies to help drive this forward. However, in order for us to properly teach computers how to learn, we first need to understand the environments in which they will be asked to learn – Cloud, Big Data, etc. Ultimately, though, any full consideration of these problems will require a look at how machine learning can help us make decisions – machine learning systems may be the real platform in these areas.

The Open Group is already laying the foundation to help organizations take advantage of these convergent technologies with its new forum, Platform 3.0. The forum brings together a community of industry thought leaders to analyze the use of Cloud, Social, Mobile computing and Big Data, and describe the business benefits that enterprises can gain from them. We’ll also be looking at trends like these at our Philadelphia conference this summer.  Please join us in the discussion.


2 Comments

Filed under Cloud, Cloud/SOA, Data management, Enterprise Architecture

It Is a Big World for Big Data After All

By E.G. Nadhan, HP

In the Information Week Global CIO blog, Patrick Houston says that big is bad when it comes to data, questioning the appropriateness of the term big data. Houston highlights the risk of the term being taken literally by the not-so-technical folks. Big data will continue to spread with emerging associative terms like big data expert, big data technologies, etc. I also see other reactions to this term, like the one in Allison Watterson’s post, “What do you mean big data, little data is hard enough.” So why has it gained this broad adoption so fast?

Here are my top 5 reasons why the term big data has stuck, and why it may be appropriate, after all:

Foundational. It all started with data processing going decades back. Over the years, we have seen:

  • Big Computer - monolithic behemoths – or in today’s terms, legacy platforms
  • Big Network - local and wide area networks
  • Big Connector - the Internet that facilitated meaningful access with a purpose to consumers across the globe
  • Big Communicator - social media that has fostered communication beyond our imagination

It is all leading up to the generation and consumption of big data driven by presence. It was all about data to start with, and we have come back full circle to data again.

Pervasive. Big Data will pervasively promote a holistic approach across all architectural elements of cloud computing:

  • Compute - complex data processing algorithms
  • Network - timely transmission of high volumes of data
  • Storage - various media to house <choose your prefix> bytes of data

Familiar. Big is always part of compound associations, whether it be a hamburger (Big Mac), Big Brother or the Big Dipper. It is a big deal, shall we say? Data has always been generated and consumed with the continued emergence of evolutionary technologies. You say big data, and pictures of data rapidly growing like a balloon or spreading like water come to mind. It has something to do with data. There is something big about it.

Synthetic. Thomas C. Redman introduces the term “Informationlization” in the Harvard Business Review blog post titled “Integrate data into product, or get left behind.” To me, the term big data is also about synthesis: individual pixels on a display device coming together to present a cohesive, meaningful picture.

Simple. You cannot get simpler than a three-letter word paired up with a four-letter word to mean something by itself. Especially when neither one is a TLA (three-letter acronym) for something very difficult to pronounce! Children in their elementary grades start learning these simple words before moving on to complex spelling bees with an abundance of vowels and y and x and q letters. Big data rolls off the tongue easily with a total of three syllables.

As humans, we tend to gravitate towards simplicity, which is why the whole world chimes in and sways back and forth when Sir Paul McCartney sings Hey Jude! decades after the first performance of this immortal piece. The line that sticks in our mind is the simplest line in the whole song – easy to render – one that we hum along with our hearts. Likewise, big data offers the simplest possible handle on a really complex world out there.

I actually like what Houston proposes – gushing data. However, I am not sure if it would enjoy the attention that big data gets. It represents a domain that needs to be addressed globally across all architectural layers by everyone including the consumers, administrators and orchestrators of data.

Therefore, big data is not just good enough – it is apt.

What about you? Do you have other names in mind? What does big data mean to you?

A version of this blog post originally appeared on the HP Enterprise Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for The Open Group Cloud Computing Governance project. Twitter handle: @NadhanAtHP. Blog: www.hp.com/go/journeyblog

2 Comments

Filed under Data management

The Open Group San Francisco Conference: Day 1 Highlights

By The Open Group Conference Team

With the first day of the conference complete, here are a few key takeaways from Monday’s keynote sessions:

The Enterprise Architect: Architecting Business Success

Jeanne Ross, Director & Principal Research Scientist, MIT Center for Information Systems Research

Ms. Ross began the plenary by discussing the impact of enterprise architecture on the whole enterprise. According to Ross, “we live in a digital economy, and in order to succeed, we need to excel in enterprise architecture.” She went on to say that the current “plan, build, use” model has led to a lot of application silos. Ms. Ross also mentioned that enablement doesn’t work well; while capabilities are being built, they are grossly underutilized within most organizations.

Enterprise architects need to think about what capabilities their firms will exploit – both in the short- and long-terms. Ms. Ross went on to present case studies from Aetna, Protection 1, USAA, Pepsi America and Commonwealth of Australia. In each of these examples, architects provided the following business value:

  • Helped senior executives clarify business goals
  • Identified architectural capability that can be readily exploited
  • Presented options and their implications for business goals
  • Built capabilities incrementally

A well-received quote from Ms. Ross during the Q&A portion of the session was, “Someday, CIOs will report to EA – that’s the way it ought to be!”

How Enterprise Architecture is Helping Nissan IT Transformation

Celso Guiotoko, Corporate Vice President and CIO, Nissan Motor Co., Ltd.

Mr. Guiotoko presented the steps that Nissan took to improve the efficiency of its information systems. The company adopted BEST – an IT mid-term plan that helped lead enterprise transformation within the organization. BEST was comprised of the following components:

  • Business Alignment
  • Enterprise Architecture
  • Selective Sourcing
  • Technology Simplification

Guided by BEST and led by strong Enterprise Architecture, Nissan saw the following results:

  • Reduced cost per user from 1.09 to 0.63
  • 230,000 return with 404 applications reduced
  • Improved solution deployment time
  • Significantly reduced hardware costs

Nissan recently created the next IT mid-term plan called “VITESSE,” which stands for value information, technology, simplification and service excellence. Mr. Guiotoko said that VITESSE will help the company achieve its IT and business goals as it moves toward the production of zero-emissions vehicles.

The Transformed Enterprise

Andy Mulholland, Global CTO, Capgemini

Mr. Mulholland began the presentation by discussing what parts of technology comprise today’s enterprise and asking the question, “What needs to be done to integrate these together?” Enterprise technology is changing rapidly, and the consumerization of IT is only increasing. Mr. Mulholland presented a statistic from Gartner predicting that up to 35 percent of enterprise IT expenditures will be managed outside of the IT department’s budget by 2015. He then referenced the PC revolution, when enterprises were too slow to adapt to employees’ needs and adoption of technology.

There are three core technology clusters and standards emerging today in the form of Cloud, mobility and big data, but there are no business process standards to govern them. In order not to repeat the mistakes of the PC revolution, organizations need to move from an inside-out model – looking at activities and problems from within the enterprise outward – to an outside-in model that looks at those same problems from the outside in. Outside-in, Mulholland argued, will increase productivity and lead to innovative business models, ultimately enabling your enterprise to keep up with current technology trends.

Making Business Drive IT Transformation through Enterprise Architecture

Lauren States, VP & CTO of Cloud Computing and Growth Initiatives, IBM Corp.

Ms. States began her presentation by describing today’s enterprise – flat, transparent and collaborative. In order to empower this emerging type of enterprise, she argued that CEOs need to consider data a strategic initiative.

Giving the example of how changing technologies affect the CMO’s role within the enterprise, she stated, “CMOs are overwhelmingly underprepared for the data explosion and recognize a need to invest in and integrate technology and analytics.” CIOs and architects need to use business goals and strategy to set the expectations of IT. Ms. States also said that organizations need to focus on enabling growth, productivity and cultural change – factors that are all related and that lead to enterprise transformation.

*********

The conference will continue tomorrow with overarching themes that include enterprise transformation, security and SOA. For more information about the conference, please go here: http://www3.opengroup.org/sanfrancisco2012

Comments Off

Filed under Cloud, Cloud/SOA, Data management, Enterprise Architecture, Enterprise Transformation, Semantic Interoperability, Standards

2012 Open Group Predictions, Vol. 2

By The Open Group

Continuing on the theme of predictions, here are a few more, which focus on enterprise architecture, business architecture, general IT and Open Group events in 2012.

Enterprise Architecture – The Industry

By Leonard Fehskens, VP of Skills and Capabilities

Looking back at 2011 and looking forward to 2012, I see growing stress within the EA community as both the demands being placed on it and the diversity of opinions within it increase. While this stress is not likely to fracture the community, it is going to make it much more difficult for both enterprise architects and the communities they serve to make sense of EA in general, and its value proposition in particular.

As I predicted around this time last year, the conventional wisdom about EA continues to spin its wheels.  At the same time, there has been a bit more progress at the leading edge than I had expected or hoped for. The net effect is that the gap between the conventional wisdom and the leading edge has widened. I expect this to continue through the next year as progress at the leading edge is something like the snowball rolling downhill, and newcomers to the discipline will pronounce that it’s obvious the Earth is both flat and the center of the universe.

What I had not expected is the vigor with which the loosely defined concept of business architecture has been adopted as the answer to the vexing challenge of “business/IT alignment.” The big idea seems to be that the enterprise comprises “the business” and IT, and enterprise architecture comprises business architecture and IT architecture. We already know how to do the IT part, so if we can just figure out the business part, we’ll finally have EA down to a science. What’s troubling is how much of the EA community does not see this as an inherently IT-centric perspective that will not win over the “business community.” The key to a truly enterprise-centric concept of EA lies inside that black box labeled “the business” – a black box that accounts for 95% or more of the enterprise.

As if to compensate for this entrenched IT-centric perspective, the EA community has lately adopted the mantra of “enterprise transformation”, a dangerous strategy that risks promising even more when far too many EA efforts have been unable to deliver on the promises they have already made.

At the same time, there is a growing interest in professionalizing the discipline, exemplified by the membership of the Association of Enterprise Architects (AEA) passing 20,000, TOGAF® 9 certifications passing 10,000, and the formation of the Federation of Enterprise Architecture Professional Organizations (FEAPO). The challenge that we face in 2012 and beyond is bringing order to the increasing chaos that characterizes the EA space. The biggest question looming seems to be whether this should be driven by IT. If so, will we be honest about this IT focus and will the potential for EA to become a truly enterprise-wide capability be realized?

Enterprise Architecture – The Profession

By Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects

It’s an exciting time for enterprise architecture, both as an industry and as a profession. There are an abundance of trends in EA, but I wanted to focus on three that have emerged and will continue to evolve in 2012 and beyond.

  • A Defined Career Path for Enterprise Architects: Today, there is no clear career path for the enterprise architect. I’ve heard this from college students, IT and business professionals and current EAs. Up until now, the skills necessary to succeed and the roles within an organization that an EA can and should fill have not been defined. It’s imperative that we determine the skill sets EAs need and the path for EAs to acquire these skills in a linear progression throughout their career. Expect this topic to become top priority in 2012.
  • Continued EA Certification Adoption: Certification will continue to grow as EAs seek ways to differentiate themselves within the industry and to employers. Certifications and memberships through professional bodies such as the Association of Enterprise Architects will offer value to members and employers alike by identifying competent and capable architects. This growth will also be supported by EA certification adoption in emerging markets like India and China, as those countries continue to explore ways to build value and quality for current and prospective clients, and to establish more international credibility.
  • Greater Involvement from the Business: As IT investments become business driven, business executives controlling corporate strategy will need to become more involved in EA and eventually drive the process. Business executive involvement will be especially helpful when outsourcing IT processes, such as Cloud Computing. Expect to see greater interest from executives and business schools that will implement coursework and training to reflect this shift, as well as increased discussion on the value of business architecture.

Business Architecture – Part 2

By Kevin Daley, IBM and Vice-Chair of The Open Group Business Forum

Several key technologies have reached a tipping point in 2011 that will move them out of the investigation and validation by enterprise architects and into the domain of strategy and realization for business architects. Five areas where business architects will be called upon for participation and effort in 2012 are related to:

  • Cloud: This increasingly adopted and disruptive technology will help increase the speed of development and change. The business architect will be called upon to ensure the strategic relevancy of transformation in a repeatable fashion as cycle times and rollouts happen faster.
  • Social Networking / Mobile Computing: Prevalent consumer usage, global user adoption and improvements in hardware and security make this a trend that cannot be ignored. The business architect will help develop new strategies as organizations strive for new markets and broader demographic reach.
  • Internet of Things: This concept from 2000 is reaching critical mass as more and more devices become communicative. The business architect will be called on to facilitate the conversation and design efforts between operational efforts and technologies managing the flood of new and usable information.
  • Big Data and Business Intelligence: Massive amounts of previously untapped data are being exposed, analyzed and made insightful and useful. The business architect will be utilized to help contain the complexity of business possibilities while identifying tactical areas where the new insights can be integrated into existing technologies to optimize automation and business process domains.
  • ERP Resurgence and Smarter Software: Software purchasing looks to continue its 2011 trend towards broader, more intuitive and feature-rich software and applications.  The business architect will be called upon to identify and help drive getting the maximum amount of operational value and output from these platforms to both preserve and extend organizational differentiation.

The State of IT

By Dave Lounsbury, CTO

What will have a profound effect on the IT industry throughout 2012 are the twin horses of mobility and consumerization, both of which are galloping at full tilt within the IT industry right now. Key to these trends are the increased use of personal devices, as well as favorite consumer Cloud services and social networks, which drive a rapidly growing comfort among end users with both data and computational power being everywhere. This comfort brings a level of expectations to end users who will increasingly want to control how they access and use their data, and with what devices. The expectation of control and access will be increasingly brought from home to the workplace.

This has profound implications for core IT organizations. There will be less reliance on core IT services, and with that an increased expectation of “I’ll buy the services, you show me how to knit them in” as the prevalent user approach to IT – thus requiring increased attention to standards conformance. IT departments will change from being the only service providers within organizations to being a guiding force when it comes to core business processes, with IT budgets being impacted. I see a rapid tipping point in this direction in 2012.

What does this mean for corporate data? The matters of scale that have been a part of IT—the overarching need for good architecture, security, standards and governance—will now apply to a wide range of users and their devices and services. Security issues will loom larger. Data, apps and hardware are coming from everywhere, and companies will need to develop criteria for knowing whether systems are robust, secure and trustworthy. Governments worldwide will take a close look at this in 2012, but industry must take the lead to keep up with the pace of technology evolution, such as The Open Group and its members have done with the OTTF standard.

Open Group Events in 2012

By Patty Donovan, VP of Membership and Events

In 2012, we will continue to connect with members globally through all mediums available to us – our quarterly conferences, virtual and regional events, and social media. Through coordination with our local partners in Brazil, China, France, Japan, South Africa, Sweden, Turkey and the United Arab Emirates, we’ve been able to increase our global footprint and connect members and non-members who may not have been able to attend the quarterly conferences with the issues facing today’s IT professionals. These events, in conjunction with our efforts in social media, have led to a rise in member participation and helped further develop The Open Group community, and we hope to have continued growth in the coming year and beyond.

We’re always open to new suggestions, so if you have a creative idea on how to connect members, please let me know! Also, please be sure to attend the upcoming Open Group Conference in San Francisco, which is taking place on January 30 through February 3. The conference will address enterprise transformation as well as other key issues in 2012 and beyond.


Filed under Business Architecture, Cloud, Cloud/SOA, Data management, Enterprise Architecture, Semantic Interoperability, Standards

Save the Date—The Open Group Conference San Francisco!

By Patty Donovan, The Open Group

It’s that time again to start thinking ahead to The Open Group’s first conference of 2012, to be held in San Francisco, January 30 – February 3, 2012. Not only do we have a great venue for the event, the Intercontinental Mark Hopkins (home of the famous “Top of the Mark” sky lounge—with amazing views of all of San Francisco!), but we have a stellar line-up for our winter conference centered on the theme of Enterprise Transformation.

Enterprise Transformation is a theme that is increasingly being used by organizations of all types to represent the change processes they implement in response to internal and external business drivers. Enterprise Architecture (EA) can be a means to Enterprise Transformation, but in most enterprises today EA is still largely limited to the IT department, and transformation must go beyond the IT department to be successful. The San Francisco conference will focus on the role that both IT and EA can play within the Enterprise Transformation process, including the following:

  • The differences between EA and Enterprise Transformation and how they relate to one another
  • The use of EA to facilitate Enterprise Transformation
  • How EA can be used to create a foundation for Enterprise Transformation that the Board and business-line managers can understand and use to their advantage
  • How EA facilitates transformation within IT, and how such transformation supports the transformation of the enterprise as a whole
  • How EA can help the enterprise successfully adapt to “disruptive technologies” such as Cloud Computing and ubiquitous mobile access

In addition, we will be featuring a line-up of keynotes by some of the top industry leaders to discuss Enterprise Transformation, as well as themes around our regular tracks of Enterprise Architecture and Professional Certification, Cloud Computing and Cybersecurity. Keynoting at the conference will be:

  • Joseph Menn, author and cybersecurity correspondent for the Financial Times (Keynote: What You’re Up Against: Mobsters, Nation-States and Blurry Lines)
  • Celso Guiotoko, Corporate Vice President and CIO, Nissan Motor Co., Ltd. (Keynote: How Enterprise Architecture is helping NISSAN IT Transformation)
  • Jeanne W. Ross, Director & Principal Research Scientist, MIT Center for Information Systems Research (Keynote: The Enterprise Architect: Architecting Business Success)
  • Lauren C. States, Vice President & Chief Technology Officer, Cloud Computing and Growth Initiatives, IBM Corp. (Keynote: Making Business Drive IT Transformation Through Enterprise Architecture)
  • Andy Mulholland, Chief Global Technical Officer, Capgemini (Keynote: The Transformed Enterprise)
  • William Rouse, Executive Director, Tennenbaum Institute at Georgia Institute of Technology (Keynote: Enterprise Transformation: An Architecture-Based Approach)

For more on the conference tracks or to register, please visit our conference registration page. And stay tuned throughout the next month for more sneak peeks leading up to The Open Group Conference San Francisco!


Filed under Cloud, Cloud/SOA, Cybersecurity, Data management, Enterprise Architecture, Semantic Interoperability, Standards

PODCAST: Why data and information management remain elusive after decades of deployments; and how to fix it

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-Effective Data Management Remains Elusive Even After Decades of Deployments

The following is the transcript of a sponsored podcast panel discussion on the state of data and information management strategies, in conjunction with The Open Group Conference, Austin 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect. Today, we present a sponsored podcast discussion in conjunction with the latest Open Group Conference in Austin, Texas, the week of July 18, 2011. We’ve assembled a distinguished panel to update us on the state of data and information management strategies. We’ll examine how it remains difficult for businesses to get the information they want in a form they can use, and why this has been a persistent problem. We’ll uncover the latest in the framework approach to information and data and look at how an information architect can make a big difference.

Here to help us better understand the role and impact of the information architect, and also how to implement a successful data and information strategy, is our panel. We’re here with Robert Weisman. He is CEO of Build The Vision Incorporated. Welcome to BriefingsDirect, Robert.

Robert Weisman: Thank you.

Gardner: We’re also here with Eugene Imbamba. He is Information Management Architect in IBM‘s Software Group. Welcome, Eugene.

Eugene Imbamba: Thank you very much.

Gardner: And we’re here also with Mei Selvage. She is the Lead in the IBM Community of Information Architects. Welcome to the show, Mei.

Mei Selvage: Thank you for having us.

Gardner: Tell me, Robert, why it is that it’s so hard for IT to deliver information access in the way that businesses really want.

Weisman: It’s the general insensitivity to information management concerns within the industry itself, which is becoming much more technology- and tool-driven, with the actual information not being taken into consideration. As a consequence, a lot of the solutions might work, but they don’t last, and they don’t, generally speaking, get the right information to the right person at the right time. Within The Open Group, we recognized this split about four years ago, and that’s one reason that in TOGAF® 9 we redefined information technology as “the lifecycle management of information and related technology within an organization.” We didn’t want to see an IM/IT split in organizations. We wanted to make sure that the architecture addressed the needs of the entire community, especially those requiring information and knowledge.

Gardner: Eugene, do you think if we focus more on the lifecycle management of information and the architecture frameworks like TOGAF, that we’ll get more to this requirement that business has that single view of reality?

Imbamba: Definitely, focusing on reference architectures and methodologies is a good way to get going in the right direction. I don’t think it’s the be-all and end-all of getting there. But, in terms of leveraging what’s been done — some of the architectures that have been developed, whether it’s TOGAF or some of the other artifacts out there — organizations, instead of spinning their wheels and reinventing the wheel, can start building some of the foundational capabilities needed to have an enterprise information architecture.

Getting to the finish line

As a result, we’re seeing, year after year, information management projects starting up and then collapsing for various reasons, whether it’s cost or just the process or people in place. Leveraging some of these artifacts, methods, and reference architectures is a way to help get started, and of course employing the other information management disciplines helps get to the finish line.

Gardner: Mei, when it comes to learning from those that have done this well, what do we know about what works when it comes to data and information management? What can we point to and say, “Without question, moving in this direction is allowing us to be inclusive, move beyond just the data and databases, and get that view that the business is really looking for?”

Selvage: Eugene and I had a long debate over how we know that we’ve delivered a successful information architecture. Our conclusion comes out to three plus one. The first piece is just like any strategy roadmap: you need to have a vision and strategy. To have a successful information architecture vision, you really have to understand your business problem and your business vision. Then, you use an applicable, proven reference architecture and methodology to support that.

Once you have the vision, then you come to execution. How do you leverage your existing IT environments, integrate with them, keep good communication, and use best practices? Finally, you have to implement on time and within budget — and the end-user has to be satisfied.

Those are three parts. Then, the plus part is data governance, not just one-time project delivery. You’ll have to make sure that data governance is getting consistently implemented across the projects.

Gardner: How about in the direction of this organizational definition of what works and what doesn’t work? How important is it rather for an information architect role to emerge? Let’s start with you, Robert. Then, I’d like to take this to all of you. What is it about the information architect role that can play an important element here?

Weisman: The information architect will soon be called the knowledge architect, to start realizing some of the promise that was seen in the 1980s and 1990s. The information architect’s role is essentially to harmonize all manner of information and make sure it’s properly managed and accessible to the people who are authorized to see it. It’s not just the information architect. He has to be a team player, working closely with technology, because more and more information will be not just machine-readable, but machine-processable and interpretable. So he has to work with the people not only in technology, but with those developing applications, and especially those dealing with security, because we’re creating more homogenous enterprise information-sharing environments with consolidated information holdings.

The paradigm is going to be changing. It’s going to be much more information-centric. The object-oriented paradigm, from a technical perspective, meant the encapsulation of the information. It’s happened, but at the process level.

When you have a thousand processes in the organization, you’ve got problems. Whereas, now we’d be looking at encapsulation of the information much more at the enterprise level so that information can be reused throughout the organization. It will be put in once and used many times.

Quality of information

The quality of the information will also be addressed through governance, particularly incorporating something called data stewardship, where people would be accountable, not only for the structure of the information but for the actual quality of the informational holdings.

Gardner: Thank you. Eugene, how do you see the role of the information architect as important in solidifying people’s thinking about this at that higher level, and as Robert said, being an advocate for the information across these other disciplines?

Imbamba: It’s inevitable that this role will emerge and take a higher-level position within organizations. Back to my earlier comment about information really becoming an issue: we have lots of information, a variety of information, and varied velocity of information requirements.

We don’t have enough folks today who are really involved in this discipline, and some of our projections are that within the next 20 years, we’re going to have a lot more information that needs to be managed. We need folks who are engaged in this space, folks who understand the space and can really think outside the box, but who also understand what the business users want, what they’re trying to drive to, and can provide solutions that look not only at the business problem at hand but also at what the organization is trying to do.

The role is definitely emerging, and within the next couple of years, as Robert said, the term might change from information architects to knowledge architects, based on where information is and what information provides to business.

Gardner: Mei, how far along are we actually on this definition and even professionalization of the information architect role?

Selvage: I’d like to share a little bit of what IBM is doing internally. We have made a major change to our professional and certification programs. We’ve removed “IT” from the architect title — we just say “architect.” Under architect we have business architecture, IT architecture, and enterprise architecture. Information architecture falls under IT architecture, so we are categorized as one of the sub-components of IT architecture.

Information architects, in my opinion, are more business-friendly than many other professionals. I’m not trying to put others down, but a lot of folks in this field come from data modeling backgrounds. They really have to understand business language, business processes, and business roles.

When we have this advantage, we need to leverage it, and not just keep thinking about how I create database structures and how I make my database perform better. Rather, how do my tasks today contribute to my business? I want to be doing the right thing, rather than doing the wrong things sooner.

IBM reflects an industry shift. The architect is a profession and we all need to change our mindsets to be even broader.

Delivering business value

Weisman: I’d like to add to that. I fully agree. As I said, The Open Group has created TOGAF 9 with a capability-based planning paradigm for business planning. IM and IT are just two dimensions of that overall capability, and everything is pushed toward the delivery of business value.

You don’t have to align IM/IT with the business. IM and IT become an integral part of the business. This came out of the defense world in many cases and it has proven very successful.

IM, IT, and all of the architecture domains are going to have to really understand the business. It’ll be an interesting time in the next couple of years in the organizations that really want to derive competitive advantage from their information holdings, which is certainly becoming a key differentiator among large companies.

Gardner: Robert, perhaps while you’re talking about The Open Group, you could update us a bit on what took place at the Austin Conference, particularly vis-à-vis the workgroups. What was the gist of the development and perhaps any maturation that you can point to?

Weisman: We had some super presentations, in particular the one that Eugene and Mei gave, which addressed information architecture and the various associated processes and different types of sub-architectures and frameworks as well.

The Information Architecture Working Group, which is winding down after two years, has created a series of whitepapers. The first one addressed the concerns of the data management architecture and maps the data management body of knowledge processes to The Open Group Architecture Framework. That whitepaper went through final review in the Information Architecture Working Group in Austin.

We have an Information Architecture Vision paper, which is an overall rethinking of how information within an organization is going to be addressed in a holistic manner, incorporating what we’d like to think of as all of the modern trends and all types of information, and figuring out some sort of holistic way that we can represent that in an architecture. The vision paper is right now in final review. Following that, we’re preparing a consolidated request for change to the TOGAF 9 specification. The whitepapers should be ready and available within the next three months for public consultation. This work should address many significant concerns in the domain of information architecture and management. I’m really confident the work that the working group has done has been very productive.

Gardner: Now, you mentioned that Mei and Eugene delivered a presentation. I wonder if we can get an overview, a quick summary of the main points. Mei, would you care to go first?

Selvage: We’ve already talked a lot about what we have described in our presentation. Essentially, we need to understand what it means to have a successful solution information architecture. We need to leverage all those best practices, which come in a form of either a proven reference architecture or methodology, and use that to achieve alignment within the business. Eugene, do you have anything you want to specifically point out in our presentation?

Three keys

Imbamba: No, just to add to what you said. The three keys that we brought were the alignment of business and IT, using and leveraging reference architectures to successfully implement information architectures, and last was the adoption of proven methodology.

In our presentation, we defined these constructs, or topics, based on our understanding and to make sure that the audience had a common understanding of what these components meant. Then, we gave examples and actually gave some use cases of where we’ve seen this actually happen in organizations, and where there has been some success in developing successful projects through the implementation of these methods. That’s some of what we touched on.

Weisman: Just as a postscript from The Open Group, we’re coming with an Information Architecture and Planning Model. We have a comprehensive definition of data and information and knowledge; we’ve come up with a good generic lifecycle that can be used by all organizations. And, we addressed all the issues associated with them in a holistic way with respect to the information management functions of governance, planning, operations, decision support and business intelligence, records and archiving, and accessibility and privacy.

One of the main contributions these whitepapers will make is a good planning basis for the holistic management of all manner of information, in the form of a complete model.

Gardner: We’ve heard about how the amount of data is going to be growing exponentially, perhaps 44 times in less than 10 years, and we’ve also heard that knowledge, information, and your ability to exploit it could be a huge differentiator in how successful you are in business. I even expect that many businesses will make knowledge and information of data part of their business, part of their major revenue capabilities — a product in itself.

Let’s look into the future. Why will the professionalization of data and information management, this role of the information architect, be more important based on some of the trends that we expect? Let’s start with you, Robert. What’s going to happen in the next few years that’s going to make it even more important to have this holistic framework, this strategic view of data and information?

Weisman: Right now, it’s a matter of competitive advantage, upon which companies may rise and fall. Harvard Business School Press — Davenport in particular — has produced some excellent books on competitive analytics and the like, with good case studies. For example, a factory halfway through construction was stopped because the company didn’t have timely access to its own information, which indicated the factory didn’t even need to be constructed. This speaks to information quality.

In the new service-based rather than industry-based economic paradigm, information will become absolutely key. With respect to the projected increase of information available, I actually see a decrease in information holdings within the enterprise itself.

This will be achieved through: a) information management techniques — you will actually get rid of information; b) consolidation of information; and c) paradigms such as Cloud, where you don’t necessarily have to hold information within the organization itself.

More with less

So you will be dealing with information holdings that are accessible by the enterprise, and not necessarily just those that are held by the enterprise. There will also be further issues, such as knowledge representation and the like, that will become absolutely key, especially with demographics as they stand now. We have to do more with less.

The training and professionalization of information architecture, or knowledge architecture, I anticipate will become key. However, knowledge architects cannot be educated totally in a silo; they also have to have a good understanding of the other architecture domains. A successful enterprise architect must understand all the other architecture domains.

Gardner: Eugene, how about you, in terms of future trends that impact the increased importance of this role in this perspective on information?

Imbamba: From an IBM perspective, we’ve seen over the last 20 years organizations focusing on what I call an “application agenda” — really trying to implement enterprise resource planning (ERP) systems and supply chain management systems. These systems have been very valuable for various reasons: reducing cost and bringing efficiencies to the business.

But, as you know, over the last 20 years a lot of companies have put these systems in place, so the competitive advantage has been lost. What we’re seeing right now is companies focusing on an information agenda, and the reason is that each organization has information about its customers, its products, and its accounts like no other business has.

So, what we’re seeing today is organizations leveraging that information for competitive advantage — trying to optimize the business, gleaning insight from the information they have so that they can understand the relationships between their customers, their partners, and their suppliers, and optimize that to deliver the kinds of services the business wants and the customers need. It’s a shift in focus from an application agenda to an information agenda.

Gardner: Mei, last word to you, future trends and why would they increase the need for the information architecture role?

Selvage: I like to see that from two perspectives. One is the vendor perspective — just taking IBM as an example, the information management brand is the one with the largest portfolio of software products, which reflects market needs and market demands. So there is a need for information architects who are able to look across all those different software offerings from IBM, and from other major vendors too.

From the customer perspective, the trend I see is that many organizations outsource basic database administration — a commodity activity — to a third party, while they keep the information architects in-house. That’s where we can add value. We can talk to the business, we can talk to the other components of IT, and really bring things together. That’s a trend I see more organizations adopting.

Gardner: Very good. We’ve been discussing the role and impact of an information architect and perhaps how to begin to implement a more successful data and information strategy.

This comes to you as a sponsored podcast in conjunction with The Open Group Conference in Austin, Texas in the week of July 18, 2011. I’d like to thank our guests. We’ve been joined by Robert Weisman, CEO of Build The Vision Incorporated. Thanks so much, Robert.

Weisman: You’re very welcome. Thank you for inviting me.

Gardner: And we’ve been here with Eugene Imbamba. He is Information Management Architect in IBM Software Group. Thank you, Eugene.

Imbamba: Thank you for having me.

Gardner: And Mei Selvage, she is Lead of the IBM Community of Information Architects. Thanks to you as well.

Selvage: You’re welcome. Thank you too.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks to our viewers and listeners as well, and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com.

Copyright The Open Group 2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect™ blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.


Filed under Data management